I largely agree with you, and I definitely appreciate that you’re being very civil in our discussion.
that’s the primary reason i’m here at all; i enjoy discussions like this, they’re interesting, and sometimes you even learn something.
I think you’re putting too much burden of responsibility on the workers for decisions that the employer makes.
to be clear, i wasn’t framing this as a worker issue or an employer issue; i was describing it as a fundamental limitation of our economic model, of how making money works at the moment.
I agree that some degree of turnover is ok, but I don’t think that’s what we’re talking about.
that probably isn’t, but a significant problem i run into is that people often don’t provide enough specificity or detail around their claims, to the point where the claim becomes either irrelevant or simply too broad. Broad enough, in some cases, that you could write a PhD thesis on it and then work for 20 years in that field before answering the question.
People will hand-wave away the entirety of capitalism in favor of something like socialism, which has no practical implementation as far as i’m aware, outside of the few attempts that haven’t worked out well so far. I just can’t justify reasoning that way; i try to at least pin down what i’m talking about to the point where it’s broadly understandable. Which is challenging, but that’s partly why i’m here lol.
but I believe that the increase in productivity means that they’ll need way fewer workers.
i think this is probably true, but given the accuracy and competence of most existing AI, i expect the impact to be mostly “additional” productivity; it’s essentially creating a new market segment where one didn’t previously exist. An AI alone can’t exactly replace a human. It can replace certain aspects of a person’s work, but never the whole person. So it’s really hard to say how badly it will hit the industries in question.
this isn’t just affecting fine arts and support:
yeah, for sure. i’m just not sure how much of this is going to be A) significant or B) impactful.
And the spread is only gonna get wider as they introduce “agents” who are capable of making “decisions” autonomously, so you don’t need a human to tell the AI what to do, and then do something with the output.
It’s also worth noting that this is a significantly riskier move to make, especially if you put it in charge of anything other than “menial organization” work. Money, for example. I highly doubt you would find anybody willing to let an AI buy things for them.
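to make that concrete, about the only way i could see anyone accepting it is with a hard human-approval gate on anything that actually spends money. a rough sketch of the idea in Python (the function names and the whole setup are hypothetical, not any real agent framework):

```python
# Hypothetical sketch of a human-in-the-loop guardrail for an "agent".
# The model can propose a purchase, but a person has to confirm it.

def propose_purchase(item: str, price_usd: float) -> dict:
    """Pretend the LLM 'agent' decided it wants to buy something."""
    return {"action": "purchase", "item": item, "price_usd": price_usd}

def requires_human_approval(action: dict, limit_usd: float = 0.0) -> bool:
    # With a limit of 0.0, *every* purchase needs a human sign-off.
    return action["action"] == "purchase" and action["price_usd"] > limit_usd

def execute(action: dict) -> None:
    if requires_human_approval(action):
        answer = input(f"Agent wants to buy {action['item']} "
                       f"for ${action['price_usd']:.2f}. Approve? [y/N] ")
        if answer.strip().lower() != "y":
            print("Rejected; nothing was bought.")
            return
    print(f"Purchasing {action['item']}...")  # a real payment call would go here

execute(propose_purchase("printer toner", 42.50))
```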
A lot of this labor is already automated through things like scripting and strictly formatted data entry. This will probably just make the “strict” part less strict.
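for context on what i mean by “strict”: classic automation only works when the input follows one rigid format and the script rejects everything else. a minimal sketch (the format is made up, not from any real system); the LLM angle is just that the parsing step could tolerate free-form text instead of this exact layout:

```python
import re

# Minimal sketch of "strict" data-entry automation: the script only accepts
# lines in one exact made-up format and refuses everything else.
LINE_FORMAT = re.compile(
    r"^(?P<date>\d{4}-\d{2}-\d{2}),(?P<vendor>[^,]+),(?P<amount>\d+\.\d{2})$"
)

def parse_entry(line: str) -> dict:
    match = LINE_FORMAT.match(line.strip())
    if not match:
        # Strictness in action: anything slightly off gets kicked back to a human.
        raise ValueError(f"Rejected, not in the expected format: {line!r}")
    return {
        "date": match["date"],
        "vendor": match["vendor"],
        "amount": float(match["amount"]),
    }

print(parse_entry("2024-05-01,Acme Supplies,129.99"))   # works
# parse_entry("paid Acme about $130 on May 1st")        # raises ValueError;
# the pitch for LLMs is handling messy lines like that one too.
```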
Yes, the Luddites never mechanized; that’s the thing they were fighting against. They couldn’t all move to complex textiles because the market wasn’t there for it; if they lowered their prices enough to generate the demand, they couldn’t recoup their time and material costs.
To be clear, this is kind of the classic case of bringing a knife to a gunfight; it’s your fault if you lose at that point. And while it’s definitely true that mechanization cost the market jobs, the gain in productivity was probably more significant than the loss in textile work. Not to mention the decrease in product prices, which raised everybody’s standard of living.
You could theoretically never mechanize, but then you’re fighting a losing battle by never innovating. Just look at Intel: they got blown out of the water by AMD because they sat on their technology for a decade, and now they’re losing market share. They had a huge stock crash over their recent CPU lineup essentially overcooking and burning itself out. They’re not having a particularly great time right now, but that’s just what happens. And as a market, we’re all doing better for it: the hardware capability of CPUs has improved MASSIVELY since the start of Ryzen, and laptops have seen such a significant boost that Apple had to move to its own silicon to keep any sort of lead on the competition. Really good CPUs are a lot cheaper now, and you can use ECC memory with most Ryzen chips, while you have to pay Intel for that privilege. Single-threaded speed has increased significantly as well, making basically every task that much faster, and the power efficiency of chips has massively improved too.
Generally, in a market like ours, losing existing jobs while increasing productivity is going to be a beneficial tradeoff, as it opens up space for other types of productivity down the road. It’s sort of like the endless optimization of a single product, applied to the whole global economy.
In the end it comes down to what the LLM producers are promising. They’re promising to be able to do all this.
and so far, they’ve lied. Google cheated on the Gemini presentation. Grok can’t even produce real facts. ChatGPT has progressively worsened since launch due to bad data. Image and video generation have improved, but they’re at a point where they can’t improve much more than they already have, at least not that significantly, so we’re quickly approaching a wall. Unless they pivot, and they will, but it will have to be marketable at the end of the day, and that’s the hard part.
I think it’s also worth noting that we should, in some capacity, prepare for the inevitable: never be comfortable, always be ready. You can’t lie down at the sight of a sword and not expect to be killed anyway. Fighting against it might work, but to my knowledge that’s not historically supported in any significant capacity. You can do nothing, which is even worse. Or you can prepare as well as you can. There is always something you can offer over other people, and especially over AI.
It’s worth mentioning two things here.
I’m not inherently against LLMs and AI.
I’m against putting the power of LLMs in the hands of the employers instead of the workers. People self-hosting their own free LLMs to make their job and home life easier? I’m all about that, and I can even forgive the theft and energy usage to an extent.
And also that I’m a developer in this space - I don’t train models or sell them directly, but I make products that use LLMs to increase productivity. I know I’m part of the problem, but I was transferred onto the project and my job is simply too good to quit over it, so I’m a hypocrite to some extent. What people in this space are trying to do is absolutely replace workers so that businesses can save on payroll and increase margins. They don’t say it, but it’s telling how they dance around the topic.
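on the self-hosting point: for what it’s worth, a minimal sketch of what that can look like, assuming a locally running Ollama server on its default port with a model already pulled (the model name is just an example):

```python
import json
import urllib.request

# Rough sketch of the self-hosting workflow, assuming an Ollama server is
# already running locally on its default port (11434) with a model pulled.
def ask_local_llm(prompt: str, model: str = "llama3") -> str:
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    request = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["response"]

# Everything stays on your own machine: no per-token billing, no data leaving home.
print(ask_local_llm("Summarize this week's grocery list into a shopping plan."))
```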