Adventures in Language
How Language Works | How Semantics Works
Words words words. How do they get their meaning? That’s what semantics is all about! In this short episode, led by your friendly neighborhood linguist Emily (PhD), you’ll learn the 5 things you need to know about semantics. Listen and discover how the relationship between words and their meanings is oddly similar to the relationship between money and its value. Enjoy!
Take our Semantics Quiz: https://mangosurvey.typeform.com/to/mqfj7HpC
Looking for more? Check out our related content:
How Morphology Works https://www.buzzsprout.com/1818324/10949576
Morphology Quiz: https://mangosurvey.typeform.com/to/mqfj7HpC
Come join the Mango family by subscribing to the podcast!
Instagram: @mangolanguages
Facebook: facebook.com/MangoLanguages
Website: mangolanguages.com
Contact (app inquiries): send us a message here
#semantics #semantics101 #whatissemantics #wordmeaning #linguistics #mangolanguages #howlanguageworks
Meet your guide/host! Emily Sabo (PhD, University of Michigan) is a linguist at Mango who specializes in the social and cognitive factors that impact bilingual language processing. Emily is also a language teacher, a producer of the We Are What We Speak docuseries, and get this... a storytelling standup comedian!
Dr. Emily Sabo (Host): Hey friends! Welcome back to How Language Works!
Emily here, your friendly neighborhood linguist. In our last episode, we dove into the structure of words and how they get built from individual morphemes. In this episode, we explore how words map onto their meanings, which means we’re entering the field of semantics. By the end, you’ll know what semantics is, why it matters, and the main principles underlying word meaning. Lucky for you - I’ve summed all that up into 5 easy-to-understand points. So let’s dive in!
- Definition time
Semantics is the field of study dedicated to the meaning of words in human languages.
If I were to tell you I’m a semanticist, that could mean a lot of different things. I could be a formal semanticist, a lexical semanticist, or a cognitive semanticist – each of which comes with its own set of methodologies and objects of study. But what they all have in common is that they study how words and meanings map onto each other. In linguistics, we call this form-function mapping. Form = the word (e.g. house). Function = what it means (e.g. a structure for human habitation). Research into word meaning has many applications. For example, some semantic models shape how we teach computers to process human language, and others can help us learn vocabulary more efficiently in a foreign language. Now onto some of the main principles underlying words…
- Words are mostly arbitrary
Just as money only has value because we assign it value, our words only carry meaning because we assign them meaning.
Why did we end up calling trees trees or boxes boxes? The short answer is that we really don’t know, but it seems to be mostly arbitrary. Now, there is some evidence that our brains are wired for certain sounds to indicate certain things to us (e.g. to most people, the sound kiki sounds like it should refer to a spiky object, while the sound bouba sounds like it should refer to a round object). But that phenomenon (called Sound Symbolism) doesn’t explain how we get so many languages with different words for the same concepts.
While we don’t have concrete answers for why certain concepts got the word form assignments they did, we do know something about the how: it all boils down to conventionalization. Conventionalization is the process by which a word form comes to be associated with a particular meaning through consistent use.
We can all agree that the English word form ‘tree’ is conventionally used to refer to the woody perennial plant we all know and love. So I would get a lot of confused stares if I just decided to start using the word ‘tree’ to refer to something completely different, like ‘boxes.’ That would be weird and unnecessary – because that form-function pair is highly conventionalized! Humans typically want to understand and be understood in conversation, so it’s efficient for us to rely on these conventionalized form-function mappings to make meaning. While conventionalization is ever-present in how we make meaning with our words, that DOESN’T mean words and their meanings stay the same over time. That brings us to #3…
- Word meanings evolve as we use them differently
Just as the values of currencies change over time, so do the meanings of our words.
There’s a famous saying in linguistics: “You shall know a word by the company it keeps.” So, a word might start out with one meaning, but if people use it in different contexts over time, it’ll accrue a slightly different meaning. For example, the word ‘grab’ used to mean only ‘to seize suddenly with one’s hands,’ but now we use it in all kinds of contexts. ‘Grab a taxi,’ for instance, just means to ‘get’ a taxi. This change in word meaning from something more specific to something more general is called generalization, and it’s quite common across languages. Shifts in word meaning like this often happen through metaphorical use, because we humans use language creatively and innovatively. And this is also how we get multiple meanings for words. ‘Head’ can refer to the head of a human body, the head of a company, or the head of a table. We call this polysemy, or the coexistence of multiple meanings for a given word.
- Semantics gives us a framework for defining word relationships
I’m sure you’ve heard of synonyms and antonyms - but did you know there’s a whole bunch of other word relationships out there?! As a field, semantics offers us all kinds of helpful terms and taxonomies for mapping out how words and meanings can relate to one another. A hypernym, for example, is the umbrella term for a group of other words (e.g. the word ‘fruit’ is the hypernym of the words ‘banana’ and ‘orange’). Our brains actually track these relationships between words, and when we build AI systems that use human languages, much of the work comes down to constructing matrices of precisely these kinds of word relationships.
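If you’d like to see what those word-relationship taxonomies look like in practice, here’s a minimal sketch (not something covered in the episode) that uses Python’s NLTK library and the WordNet lexical database to look up hypernyms, synonyms, and antonyms. The specific words and senses are just illustrative examples:

# A quick way to explore word relationships (hypernyms, synonyms, antonyms)
# using the WordNet lexical database through Python's NLTK library.
# Setup: pip install nltk, then run nltk.download('wordnet') once.
from nltk.corpus import wordnet as wn

# WordNet groups word senses into "synsets"; take the first noun sense of 'banana'
banana = wn.synsets('banana', pos=wn.NOUN)[0]

# Hypernyms: the more general "umbrella" concepts this sense falls under
print(banana.hypernyms())

# Walk the full chain from this sense up to WordNet's most general root concept
print([s.name() for s in banana.hypernym_paths()[0]])

# Synonyms are simply the other lemmas that share a synset...
happy = wn.synsets('happy', pos=wn.ADJ)[0]
print([lemma.name() for lemma in happy.lemmas()])

# ...and antonyms are linked to individual lemmas
print(happy.lemmas()[0].antonyms())

Running this prints the hypernym chain for one sense of ‘banana’ and the synonym and antonym links for one sense of ‘happy’ – the same kinds of word relationships described above.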
- Different languages slice up meaning into different word categories
If you’ve ever learned another language, you know this is true. For example, English has one word for the definite article ‘the’ – but German has three different forms that differ by gender. Spanish has two verbs to convey different types of ‘being’ (ser, estar), but English has only one. This comes back to the idea that words in any language often have multiple possible meanings, which leaves our everyday interactions full of ambiguity – and room for endless confusion! And yet we manage to understand each other. The reason is that we’re wired to make inferences about ambiguous word meanings based on context – which we’ll get into in our next episode in this series, on a facet of language called pragmatics!
Wrap-up: Well, we’ve reached the end of the episode! As always, if you like the show, let us know by subscribing and leaving us a comment! In today’s episode, we learned that semantics is all about how words map onto their meanings. To recap, words are to meaning as money is to value: that is, mostly arbitrary! And word meanings are notorious shape-shifters that evolve over time as we find creative, often metaphorical, new ways to use our language. We also learned that different languages cut up the semantic space differently, and that semantics saves the day by giving us concrete terms and methodologies for describing those meaning relationships within and between languages.
If you want to understand how meaning-making works in language, semantics isn’t the end of the story. You need to understand pragmatics too, which is all about how word meaning is context-dependent. Semantics is about literal meaning, while pragmatics is about contextual meaning. So, stay tuned for our next episode!
Check out our interactive quiz to test how much you know about semantics! (link in description)