What's amusing to me here is it's a dev comparing the "old approach" of learning via StackOverflow to learning via LLMs. My generation of programmers disrespected SO as dumbing down learning, because so many devs were just copy/pasting the top answer instead of understanding it - it was faster, so of course they did. LLMs are the same: worse but faster. So the article is not wrong, but it doesn't realise it's fighting a losing battle, one that we fought & lost already. https://nmn.gl/blog/ai-and-learning
@sinbad I think he has a solid point.
Sure, lots of code-monkeys would just copy-paste from SO, as they did from other forums before it. Or they'd pester a senior for the answer, and transcribe it with the same disinterest in actually learning from it (I've been that senior). But those with the inclination had the opportunity to learn. Not just from reading the existing responses, but from asking their own clarifying questions, and standing a reasonable chance of getting an answer from somebody who understood the topic and the issue.
A large language-mangler doesn't even have a wrong understanding of what it gave you, and it's not even Searle's Chinese Room. Sure, you can ask it a question, but the answer to that question will itself be a word-collage with nothing deterministic about it. It's text-in-a-blender, all the way down.
I remember the days of Wikipedia being useful mainly as a way to discover what questions you needed to ask about the topic at hand.
LLMs will give you material for questions, alright, but there's no guarantee those will even take you in the right direction.
My hope is that enough of the LLM-transcribers will be made to clean up their own mess and, through the debugging process, come to understand both the medium of code and the fact that LLMs don't know anything.
@KatS Yeah - you're both right. My point was mostly that railing against it is kinda futile; trends gonna trend, and someone will just have to deal with the fallout.
@sinbad Absolutely.
In the meantime, it's like the way "baby on board" stickers translate to "the road is probably the last thing this driver is paying attention to."
LLM-generated code can at least serve as a first-pass filter for candidates, and for evaluating forum responses.
I mean, you're right that we're all getting splattered with the fallout one way or another, but at least you can avoid having to mop up that mess on your own shop floor.