While language does evolve over time, we shouldn’t encourage unnecessary and somewhat negative evolutions of it, and we especially shouldn’t encourage it to change faster.
When two previously distinct words come to have the same meaning, this can be a problem. First, older writing becomes less comprehensible. Few of us today could read and understand Old English because so many words have changed. At least that evolution took a long time to reach that point. But if we encourage this change to accelerate, something which appears to be happening even without encouragement, how long will it be before the same happens to us?
Today, we can still understand most things written 200 years ago; some bits are confusing, but for the most part the meaning is clear. If language change accelerates enough, people in the future may struggle to understand something written only a hundred years earlier, or even less.
The second problem is that if the word for a thing goes away, it becomes more difficult to express that concept. Consider the word ‘literally’, whose meaning has become extremely muddled. To express the original concept, we now need extra words of emphasis. There are other, harder-to-recall examples like that: concepts for which a particular word would have been perfect had that word’s meaning not changed so significantly.
So when a word’s usage is corrected, do not be so quick to defend the misuse with ‘language evolves!’ In most cases it is probably better for people to accept ‘oops, I used that word wrong’ and then check whether a better word already exists for what they were trying to express.
Better still, new words should be used when possible, whether an older word doesn’t quite fit a newly emerging thing or a concept has existed for some time without a word to describe it precisely. One of my favorite examples is ‘cromulent’, which expresses a concept that had no specific word in common use at the time, even though the concept of ‘understandable and linguistically correct’ certainly already existed. Also consider the now-common word ‘emoji’, which was coined specifically to name a new kind of thing. This is an excellent evolution of language because it took nothing away: it arose in response to something that had not existed before, and described that thing with a word created specifically for it.
That said, fighting against an evolution of language that has already happened and is far too entrenched to ever reverse is nonsensical. My father, for instance, insists ‘cool’ should describe temperature only, even though that word possessed its non-temperature meaning before he was even born. Similarly, sometimes a change is resisted for bad reasons, as with the word ‘gay’. In these cases it is best not to fight the change, but to embrace and encourage it.
So ultimately, when a word is used wrong, consider whether the word evolving toward that new use is a positive change. If it does not make things better, it’s probably best not to encourage it.
You say this like it’s a fact that the word “literally” is worse now than it was before its recent evolution. You’re reducing the entire value of a word to a metric of “clarity”/“muddledness”, but natural language has value beyond its ability to be technically precise.
It’s worse in that there is now no common way to say what it used to mean without adding several more words, where previously a single word communicated the meaning clearly.
Any time a language change increases the likelihood of misunderstanding, it has negative effects. It may also have positive effects, but it shouldn’t simply be accepted without weighing the two.
Now, people will obviously disagree on whether a particular change’s negatives outweigh its positives, but it’s important to acknowledge that the bad parts exist.
It’s also important not to make a mistake and then insist it’s fine because language changes, out of pride and a desire not to be wrong - a trend I definitely see a lot. It’s often not ‘I am using this word in a different way and have considered its implications’; it’s ‘I don’t want to be wrong, so I will insist I didn’t make a mistake - language changes!’
A linguist looks at an example like “literally” and says: isn’t language amazing? Words change and evolve, are created and die off, and yet everything works; people don’t stop being able to express ideas because the language got screwed up. Everything takes care of itself. People were making the same complaints about words being used the wrong way 200 years ago, and a thousand years ago, thinking we had lost a critical piece of the language, but it’s always fine. We have languages like French, regulated by an academy, and we have languages that have never been written down or taught in school. All of them are capable of expressing whatever they need to express.
Why is requiring more words inherently worse? Are languages that require more words to express an idea worse than languages that require fewer? For example, English has lots of prepositions whose meaning is sometimes instead encoded by verb conjugation in languages like Spanish (e.g. infinitives requiring “to” in English but not in Spanish). Does that difference make English worse than Spanish?
It’s not necessarily worse, I suppose. I think it is worse in this example, perhaps you don’t, and I think we can acknowledge this as a reasonable difference of opinion.
I primarily object to the seemingly common attitude that it is unreasonable to consider a change in language usage bad and to oppose it at all - the attitude that anyone objecting to a language change is as ignorant as those who never want the language to change from whatever idealized version they hold. Those people are ridiculous, but not everyone who opposes a particular language change is one of them.
Sure, I do think that’s a reasonable difference in opinion and I agree that it’s mostly fine for someone to dislike the way that a language is changing. I think the trouble comes in when that dislike is framed as though it comes from some position of authority or superior fluency, since it’s actually an emotional argument, not a logical argument.
Your feelings about English are valid and meaningful, but only to the exact same degree that my feelings about English are valid and meaningful. Telling someone that you don’t like the way they’re speaking is often rude, but it’s not false, because you are the authority on your own feelings. Telling someone that they’re speaking incorrectly is usually “not even wrong”, because it’s framed as a logical argument but it has no logical basis.
Well, framing it as ‘this is the currently accepted way of doing it, and according to current norms your use is wrong’ seems correct enough to me; someone can certainly be speaking incorrectly according to a certain set of norms.
It also increases the ‘friction’ somewhat, causing those who want to change things to actively push against current norms rather than argue from their own position of faux superiority, citing the changing nature of language to insist no use can ever be wrong.
And in any case it is also likely to slow down the change, which I, at least, think is an almost entirely good thing. I want to still be able to read things from a couple hundred years ago, and I would similarly like those who come after me to understand what I write without translations or aid, at least for a couple hundred years.
The problem is that there is no universal “currently accepted way of doing it”. What you’re describing is a dialect. It’s sometimes reasonable to say that a certain use is wrong in a certain dialect, but insisting that a certain use is universally wrong is just insisting that your dialect is somehow more authoritative than other dialects.
There is no absolute prescriptive authority on the English language. It just doesn’t exist at all. The common English dictionaries don’t claim to be prescriptive authorities, they claim to describe how the language is currently used. If there are any English dictionaries that claim to be prescriptive authorities (I don’t know of any off the top of my head), they’re clearly completely ignored by pretty much the entire world of actual English speakers, so their authority isn’t worth very much.
I strongly disagree that slowing down the change of language is nearly entirely good. I think it’s neutral when it’s a natural slowing caused by cultural shifts, and I think it’s strictly a bad thing when it’s a forced slowing caused by active gatekeeping from self-appointed dialect police. Language is inextricable from culture, so language change is inextricable from cultural change, so language conservatism is a form of cultural conservatism.
If I had a crystal ball and looked a couple hundred years into the future and saw people speaking the same English I speak today, I would be terribly sad about English-speaking culture. I sure hope we have new ways to talk by then! I also think you’re dramatically understating how much English has changed over the past couple hundred years - while English from the 1700s can mostly be deciphered by modern English speakers without a complete re-translation, it certainly doesn’t read fluently to a modern speaker, and it’s missing a whole lot of the words and structures that people use to express their modern concerns. This is all normal and natural for a natural language.
It’s not that the word “literally” is worse now. It’s that it used to represent an idea (the idea of a thing being non-figurative) which it is slowly ceasing to mean.
Words map to meanings. Those mappings can shift and change over time. But if such a shift leaves a particular meaning orphaned, I’d think of that as unfortunate, no?
Maybe instead of changes being “good” or “bad” it’s more like “this shift in language increases (or decreases) the total expressiveness of the language”. Would you be less up in arms at that way of putting it?
Here’s a fantastic example: sentient, sapient, and conscious. These are VERY different words with wildly different meanings, but they’re practically treated as synonyms in colloquial usage. The only way to properly express them now is to spell out their entire definitions, and then people question why you’re being so specific or excluding certain things.
What if you just think of it as our culture putting less emphasis on the concept of “literally” and more on “figuratively”? Then the evolution actually makes the language more expressive, given the things that we’re trying to express (on average).
I don’t follow… By merging a word with its antonym you actually make it harder to express those figurative things, in the same way removing contrast from an image makes it harder to resolve, so it’s less expressive than before.
I don’t agree that it decreased the total expressiveness of English, no. The modern colloquial use of “literally” is not identical to “figuratively”, or to “very”, or any other word I can think of - it’s an intensifier with a unique connotation that doesn’t have any good alternative. At worst, we have lost some expressiveness and gained some expressiveness, and there is no objective metric to decide whether that’s a “net positive” or a “net negative”; it’s just a change.
Let’s not start deciding what counts as positive and negative evolution of a language. We all know who gets discriminated against because of this.
gonna respond only to the first sentence because frankly shove off if you think i’m going to read that wall of text.
I assume you are, of course, the one who gets to decide which language changes are good and which are bad? Or are you going to give some organization the right to decide how we speak?
Honestly such a terrifying way of thinking…
If you are not going to read something, perhaps you should avoid making ignorant comments, considering that for the most part, those topics are already addressed in my posts.
If you didn’t read it, fuck off and don’t reply. I decided not to read your comment beyond that because you’re an inconsiderate cuss.
You believe that others value your time and opinion more than they actually do. If you aren’t going to read it, just scroll past.