The study tracked around 800 developers, comparing their output with and without GitHub’s Copilot coding assistant over three-month periods. Surprisingly, when measuring key metrics like pull request cycle time and throughput, Uplevel found no meaningful improvements for those using Copilot.
It’s a glorified autocorrect. Using it for anything else and expecting magic is an interesting idea. I’m not sure what folks are expecting there.
But I don’t ask it to explain things or generate algorithms willy-nilly. I don’t expect or try to have it do anything more than simple auto-completion.
I honestly like it, even if I strongly dislike the use of AI elsewhere. It’s working in this area for me.
I hadn’t been too keen on Copilot, but then we got it at work, so I tried it. In my previous position, working on an ancient Java project with no rhyme or reason to it, a codebase that belongs in hell’s fires, it was mostly useless.
I switched to a modern web development position where we do a lot of data manipulation, massaging data into common types to visualise in charts and tables, and there it excels. A lot of what we do draws on the same datasets and aggregates them into one of a set of common types, so Copilot often “understands” what I intend and gives great 5-10 line suggestions.
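To make the “common types” pattern concrete, here is a minimal TypeScript sketch of the shape of code Copilot tends to complete well in that setting. The SalesRecord and ChartSeries types and the aggregateByRegion helper are made-up names for illustration, not anything from the commenter’s actual codebase.

```typescript
// Hypothetical raw record shape from one of the shared datasets.
interface SalesRecord {
  region: string;
  month: string; // e.g. "2024-03"
  revenue: number;
}

// One of the "common types" that several charts and tables consume.
interface ChartSeries {
  label: string;
  points: { x: string; y: number }[];
}

// Aggregate raw records into one series per region.
// Once a couple of these variants exist, an assistant has enough
// context to suggest the next one as a short completion.
export function aggregateByRegion(records: SalesRecord[]): ChartSeries[] {
  const byRegion = new Map<string, Map<string, number>>();
  for (const r of records) {
    const months = byRegion.get(r.region) ?? new Map<string, number>();
    months.set(r.month, (months.get(r.month) ?? 0) + r.revenue);
    byRegion.set(r.region, months);
  }
  return [...byRegion.entries()].map(([label, months]) => ({
    label,
    points: [...months.entries()].map(([x, y]) => ({ x, y })),
  }));
}
```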
These last three weeks I’ve had the massive task of separating our data processing into separate files to finally add unit tests. Doing the refactoring was easy with IntelliJ, and Copilot quickly wrote tests with 100% coverage, which let me find a good number of previously undiscovered bugs.
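For a sense of the kind of test that workflow produces, here is a short sketch against the hypothetical aggregateByRegion helper above. The use of Vitest is an assumption for the example, not something the commenter specified.

```typescript
// Assumes a Vitest setup; aggregateByRegion is the illustrative helper above.
import { describe, expect, it } from "vitest";
import { aggregateByRegion } from "./aggregateByRegion";

describe("aggregateByRegion", () => {
  it("sums revenue per region and month", () => {
    const series = aggregateByRegion([
      { region: "EU", month: "2024-01", revenue: 100 },
      { region: "EU", month: "2024-01", revenue: 50 },
      { region: "US", month: "2024-02", revenue: 75 },
    ]);
    expect(series).toEqual([
      { label: "EU", points: [{ x: "2024-01", y: 150 }] },
      { label: "US", points: [{ x: "2024-02", y: 75 }] },
    ]);
  });

  it("returns an empty array for no input", () => {
    expect(aggregateByRegion([])).toEqual([]);
  });
});
```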
I haven’t used the in-IDE stuff, but the best use case I’ve had was asking Bing Chat for help with XML mappings for NHibernate with a wonky table structure when I had to work with some legacy code. The NHibernate documentation is terrible.
I’ve said this in other comments, but it’s easier to change your audience than your content.
What you said is the important bit: at the end of the day, you’ve got a computer working as a tool for a human. That’s what it should be all about. Instead we have so much AI slop that’s hardly trying to do anything for people, but rather trying to get another algorithm’s attention so it can be shown more - whether a person actually wants to see it or not.
If AI is a tool to create a thing, under the close supervision of a human, for other humans, I’m a lot more open to it. Just don’t let it get carried away and forget about the humanity of it all.