Software engineers may have to develop other skills soon as artificial intelligence takes over many coding tasks.

That’s according to Amazon Web Services’ CEO, Matt Garman, who shared his thoughts on the topic during an internal fireside chat held in June, according to a recording of the meeting obtained by Business Insider.

“If you go forward 24 months from now, or some amount of time — I can’t exactly predict where it is — it’s possible that most developers are not coding,” said Garman, who became AWS’s CEO in June.

“Coding is just kind of like the language that we talk to computers. It’s not necessarily the skill in and of itself,” the executive said. “The skill in and of itself is like, how do I innovate? How do I go build something that’s interesting for my end users to use?”

This means the job of a software developer will change, Garman said.

“It just means that each of us has to get more in tune with what our customers need and what the actual end thing is that we’re going to try to go build, because that’s going to be more and more of what the work is as opposed to sitting down and actually writing code,” he said.

  • pontiffkitchen0@lemmy.world · 81 points · 4 months ago

    I could have missed something, but a quick scan of his job history shows he started as an intern at the beginning of AWS while getting his MBA, and then became a Product Manager. I didn’t see any programming experience or knowledge, so I’m not sure he has the context and foundational understanding to justify making claims like that.

    Seems like most of the people who talk about AI eliminating programming jobs have never had a job writing code, nor a firm grasp of what those kinds of roles actually do.

    The cynic in me thinks all these articles from executives making such bold claims are meant to scare developers into thinking we don’t have as much leverage in the job market as we do; even after all the layoffs it’s still a workers’ market. The realist in me thinks they probably just like hearing themselves talk, and everyone’s guilty of talking about something they know nothing about. According to whoever’s razor it was, it’s probably the latter.

    • OhNoMoreLemmy@lemmy.ml · 26 points · 4 months ago

      Yeah it’s super obvious that he’s a product manager from the quotes.

      “The skill in and of itself is like, how do I innovate? How do I go build something that’s interesting for my end users to use?”

      This is the path to promotion for product managers: create new, interesting products and move on before they fail. And yeah, if you really don’t care about failing, LLMs can maybe help speed up prototyping here.

      However, what AWS actually tries to sell to users is rock-solid reliability and high uptime. If you start asking “how do I go build something that’s even more reliable?”, it’s incredibly clear that LLMs are not the answer.

  • TootSweet@lemmy.worldM · 73 points · 4 months ago

    Even aside from the AI hype BS, it’s like someone told Matt Garman to “tell me you don’t know shit about software engineering without telling me you don’t know shit about software engineering.”

  • raynethackery@lemmy.world · 46 points · 4 months ago

    “It just means that each of us has to get more in tune with what our customers need…”

    You don’t care what your customers need. You only care about what you can extract from your customers.

    • ladicius@lemmy.world · 14 points · 4 months ago

      He was talking about shareholders, and the shareholders need more money. That’s all he and they care about.

      The customers are of no interest to him, just like the employees.

    • Flying Squid@lemmy.world · 2 points · 4 months ago

      I’m thinking what their customers need is for AWS to work right, since they’re usually businesses themselves and having it fuck up costs them money. Possibly everything they have.

  • Disaster · 46 points · 4 months ago

    Why do unqualified idiots always wind up in charge?

    • pdxfed@lemmy.world · 7 points · 4 months ago

      Remember that reasonable takes do not get headlines. In 2024 you have to be making some absurd claim, making or losing some insane amount of money for your company, or be incredibly powerful to get your voice in the press.

  • Mikina@programming.dev · 41 points · 4 months ago

    I think it’s quite the contrary, and AI will actually increase our job security. Because now you have a lot of people learning to code using AI, and I’ve heard from friends who were talking to other CTOs at a conference that they have even discussed whether it’s worth bothering with hiring juniors now, because it turned out that a surprisingly large number of them are in fact just a front-end for ChatGPT.

    Can you eventually get a problem solved by talking to an LLM about it? Sure, but it will take you a lot longer, and you don’t learn much in the way of programming skills. It’s basically a much worse version of copy-pasting code from Stack Overflow, because there you can at least be certain that the code you are copying has been reviewed by at least someone, and the explanation isn’t, in most cases, hallucinated stuff that merely sounds correct. You also can’t keep asking Stack Overflow to edit your code for your use case, and have to figure it out yourself.

    But I’m really looking forward to major companies trying to replace programmers with AI. Google implementing LLMs into search results was my favorite recent trainwreck, and reading articles with the CEO squirming that “we actually have to manually filter the results, because solving the LLM models’ hallucinating turned out to be a really difficult issue”. No shit, it’s almost as if you want factually correct and precise outputs from a statistically biased but still random generator.

    Please, I want to see a company fire most of their programmers to replace them with AI, and watch them burn. Hopefully it will happen soon.

  • RedditWanderer@lemmy.world · 35 points · 4 months ago

    You can easily see the market is now hiring more engineers to put up with the shortcomings of LLMs. The best AI models cannot be left to code and be right more than ~10% of the time. That’s an unacceptable rate, and it’s not likely to improve as they feed LLM data to LLMs.

    If you ask Copilot to write something like quicksort, it will get it wrong most of the time, because most people misunderstand quicksort / have bad implementations of it, and the models don’t know the difference.
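
    (To make that concrete, here’s a rough Python sketch, not from the article or this comment, of the two versions in question: the copy-heavy, first-element-pivot quicksort that gets reproduced all over the web, next to an in-place version with a random pivot.)

    ```python
    import random

    # The "blog-post quicksort" an assistant often reproduces: readable, but it
    # copies the list at every level and always pivots on the first element, so
    # it degrades to O(n^2) on already-sorted input and burns O(n) extra memory
    # per recursion level.
    def naive_quicksort(xs):
        if len(xs) <= 1:
            return xs
        pivot, rest = xs[0], xs[1:]
        left = [x for x in rest if x < pivot]
        right = [x for x in rest if x >= pivot]
        return naive_quicksort(left) + [pivot] + naive_quicksort(right)

    # An in-place version (Lomuto partition, random pivot): sorts the list it is
    # given and avoids the sorted-input worst case in expectation.
    def quicksort(xs, lo=0, hi=None):
        if hi is None:
            hi = len(xs) - 1
        if lo >= hi:
            return
        p = random.randint(lo, hi)
        xs[p], xs[hi] = xs[hi], xs[p]
        pivot = xs[hi]
        i = lo
        for j in range(lo, hi):
            if xs[j] < pivot:
                xs[i], xs[j] = xs[j], xs[i]
                i += 1
        xs[i], xs[hi] = xs[hi], xs[i]
        quicksort(xs, lo, i - 1)
        quicksort(xs, i + 1, hi)

    data = [5, 3, 8, 1, 9, 2]
    quicksort(data)
    print(data)  # [1, 2, 3, 5, 8, 9]
    ```

    Both versions produce sorted output on small examples, which is exactly why a model trained on whatever appears most often can’t tell the good one from the bad one.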

    Anytime I see a CEO or CTO warning on LinkedIn about AI taking SWE jobs, I make a note to never work near those people; they fundamentally misunderstand the problem and follow the hype. The other thing they like to ignore is the cost of these models, which are essentially subsidized by our taxpayer-funded energy infrastructure. That is an equivalently absurd amount of money compared to SWEs.

  • partial_accumen@lemmy.world · 23 points · edited · 4 months ago

    “Coding is just kind of like the language that we talk to computers. It’s not necessarily the skill in and of itself,” the executive said.

    “English-language proficiency isn’t really a skill, it’s just the language we talk to each other in. There’s not necessarily skill in being able to form a cogent thought and articulate it clearly to your audience. This is why JFK’s ‘ask not what your country can do for you’ speech is equal to Gucci Gang song lyrics.”

    – Garman, probably

  • ThePowerOfGeek@lemmy.world · 20 points · edited · 4 months ago

    I was skeptical going into this article. And then I read this:

    “Coding is just kind of like the language that we talk to computers. It’s not necessarily the skill in and of itself,” the executive said.

    Yeah, this guy doesn’t know shit about software development. He thinks he does. But in reality he doesn’t. He’s yet another C-level guy who’s been huffing on his own farts for too long, and is giving off real strong “the Internet is a series of tubes!” vibes.

    First of all, I’ve been hearing this same argument about how the role of software developer is imminently doomed for 10-15 years now. “AI is going to make software developers obsolete within a year!” It’s never happened, and for good reason. While generative AI has evolved at a staggering rate over the last 2-3 years, it’s still not at the point of being able to envision and implement complex, scalable code bases.

    Second, I and others I know have tried using ChatGPT and similar tools to generate working code, even recently, and it’s a lot more ‘miss’ than ‘hit’. It’s good for spinning up unit test templates and some very basic coding examples. It’s decent for getting new perspectives on specific micro code problems. But a lot of nuance is lost in translation, and that means you either end up with as much crap code as good code, or you spend an inordinate amount of time trying to finagle something that’s workable. And often it’s outright detrimental. Too many times I’ve seen or heard about less experienced developers trying to use AI, copy-and-pasting AI-generated code here, there, and everywhere, and the resulting wider code base turning into bloated spaghetti shit. You try to blindly cut corners in software development at your own risk.
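
    (To make the “unit test templates” point concrete, a rough pytest sketch; the slugify function is a made-up example defined inline purely so the snippet is self-contained, not anything from this thread or the article.)

    ```python
    import re
    import unicodedata

    import pytest

    # A small function under test. In a real project this would live in the
    # application code; it is defined here only to keep the example runnable.
    def slugify(raw: str) -> str:
        if not isinstance(raw, str):
            raise TypeError("slugify expects a string")
        text = unicodedata.normalize("NFKD", raw).encode("ascii", "ignore").decode()
        text = re.sub(r"[^a-zA-Z0-9]+", "-", text).strip("-")
        return text.lower()

    # The kind of parametrised scaffold an assistant spins up quickly; deciding
    # which edge cases actually matter is still the developer's job.
    @pytest.mark.parametrize(
        "raw, expected",
        [
            ("Hello World", "hello-world"),
            ("  trims  spaces  ", "trims-spaces"),
            ("Ünïcödé", "unicode"),
            ("", ""),
        ],
    )
    def test_slugify_basic_cases(raw, expected):
        assert slugify(raw) == expected

    def test_slugify_rejects_non_strings():
        with pytest.raises(TypeError):
            slugify(None)
    ```

    The boilerplate is the part the tools get right; working out which cases matter for the real code base is where the ‘miss’ usually happens.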

    And this brings me to my final point: there’s a very big difference between ‘coding’ and software development. Coding is coming up with a few lines of code to do one very specific thing. Software development is a holistic process that requires knowledge, experience, intuition, anticipating problems, and seeing the bigger picture.

    Eventually I could see AI being heavily used in software development. Maybe even in the next 3-5 years. But even at that point there will need to be a knowledgeable human to collaborate with, verify the output, make sure it’s being pieced together in a meaningful and workable way, and keep the AI’s work in check.

    • Nougat@fedia.ioM · 11 points · 4 months ago

      But a lot of nuance is lost in translation, …

      And as black-box machine learning gets better at turning what we say into what we actually mean, we eventually just end up with … a software developer.

      Even at that point, who does he think is going to explain to the AI software developer what is actually needed? Him? He doesn’t know how to do that any better than he already does to human software developers.

      Edit: OMG I just read this part:

      “It just means that each of us has to get more in tune with what our customers need and what the actual end thing is that we’re going to try to go build, because that’s going to be more and more of what the work is as opposed to sitting down and actually writing code,” he said.

      You idiot. Do you think devs want to have to redo their work five times before the client is satisfied? If you’re not already “more in tune with what our customers need and what the actual end thing is that we’re going to try to go build,” fuck man, just do that. You can do that right now, you walnut.

  • Plopp@lemmy.world · 19 points · 4 months ago

    Amazon cloud chief. I thought this article was going to be about some badass indigenous tribesman.

  • kaffiene@lemmy.world · 18 points · 4 months ago

    What a fucking moronic stuffed suit. I’m picking that in 25 months I’ll still be a dev and there won’t be any jobs lost to AI.

  • SeattleRain@lemmy.world · 5 points · 4 months ago

    Haha, this is going to be hilarious. AWS is going to crash and burn so hard. It just may trigger an exodus from cloud computing.

    Other than time to market, what real advantages does it give companies? Most companies are not Silicon Valley startups. I’ve been hearing more and more about companies moving things back in-house, and a major outage would drive that trend even more.

  • Zier@fedia.io · 5 points · 4 months ago

    Developers will now become ER attendants, a.k.a. the people cleaning up all the shit AI messes up. “Oh no, AI deleted another customer database, can you fix that??”