What's amusing to me here is it's a dev comparing the "old approach" of learning via StackOverflow to learning via LLMs. My generation of programmers disrespected SO for dumbing down learning, because so many devs were just copy/pasting the top answer instead of understanding it - it was faster, so of course they did. LLMs are the same: worse but faster. So the article isn't wrong, but it doesn't realise it's fighting a losing battle, one we already fought and lost. https://nmn.gl/blog/ai-and-learning
@sinbad My theory is that we grew up with machines that you could actually understand completely. A Speccy fits in a brain. A single person can know everything about it.
That stopped around the Pentium era. A single person, no matter how smart, stopped being able to understand everything about a machine.
Over time as the mountain of stuff you can never sensibly understand piled up, people just embraced it. Don't know, don't care, make it work somehow, ship it. Ah well.
@TomF @sinbad having grown up mostly with machines from the pentium era, the way I learned Computer Stuff was to pick a topic, find books on it, and learn whatever I could about that chunk of the computer. I don't think it's possible to know the whole machine, but it's possible to draw a little box around part of it and learn its contents completely, and then draw another box next to that, and so on. I think I have a pretty good breadth and depth of understanding, but I'll always be learning.
@TomF @sinbad I think the main thing that changed is that things like stack overflow and LLMs are low-friction resources that are just practical enough that curiosity, tenacity, and a lot of free time aren't prerequisites anymore for some areas of Computer Stuff (and other topics). I don't think there are any fewer tenacious, curious people with the time and support to go further than that, but it might seem like it because there are a lot more programmers now.
@TomF @sinbad I like to think that having a lower barrier to entry is a good thing, because people can still be nerd-sniped into becoming great programmers, and may be motivated to do so when they reach the limitations of low-friction learning resources. But I don't have any proof of that, and I've seen research to suggest the opposite may be true with LLMs, so idk, ymmv.
@aeva @TomF @sinbad one nice difference (?) between then and now: tiny LEDs should be a lot cheaper now*, which could let us build circuits that give a lot more feedback about what's happening inside any machine you build for this purpose.
also: you can design a custom PCB and have it fab'd and shipped here in a relatively short time now.
[*] but with the recently self-imposed trade war, who knows
@JamesWidman @aeva @sinbad This looks simple enough to learn: https://hackaday.io/project/189704-the-l7-a-very-simple-8-bit-cpu
Also, if you don't need to learn the internals of the Z80, basically the whole of the rest of the Speccy is covered down to the gate level in this excellent book:
http://www.zxdesign.info/book/