- cross-posted to:
- [email protected]
This is not the flex you think it is.
While introducing bugs is certainly a risky side effect of AI coding, the history of software development has seen controversial transitions before, including the move from assembly language to higher-level languages, which faced resistance from programmers who worried about loss of control and efficiency. Similarly, the adoption of object-oriented programming in the 1990s drew criticism over code complexity and performance overhead. The shift to AI-augmented coding may just be the latest transition to meet resistance from the old guard.
Stepping away from assembly did have that effect, though. The tradeoff was that code became easier to write and easier to optimize, but it's undeniable that it did lead to a loss of control and efficiency.
Similarly, the shift to object-oriented programming also increased performance overhead, but the tradeoff was that you could reuse code more seamlessly, which makes larger projects more manageable.
The article is right that AI coding is probably here to stay, but all the disadvantages people are highlighting are real concerns that won't go away; they'll just be accepted as the new normal.
I don’t know that moving away from assembly made things easier to optimize, but easier to read and maintain, absolutely
Thanks for the info. I was wondering how the hell we still have laggy performance given how much better our hardware is than what we used to have.
That’s a suspiciously made-up sounding number.
In other news, Google CEO uses AI to collect company internal metrics and craft press releases.
I can't see how this won't backfire.
I'd say this explains why everything runs like shit, but that's been going on for a while now.
Edit: not everything, but it feels that way. A lot of things.