In the good old days, you had to learn assembly/machine language, C, and OS-level programming to get anything done. Even if you mostly worked on applications, you'd drop down to do something useful; at the time, that meant writing machine-language routines to call from BASIC. This is still a practical skill: I mostly work in Scheme, but use the C FFI to hook into native functionality, and debug in lldb.
Computer Science is supposed to be more math than practice, though when I took it we also did low-level graphics (BIOS calls and framebuffers), OS implementation, and other useful skills. These days almost all CS courses are job training, with no theory and no implementation.
Younger programmers typically have no experience below the application language (Java, C#, Python, PHP) they work in, and only those with extensive CS degrees will ever see a C compiler. Even the shell, filesystems, and simple toolchains like Make are lost arts.
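As a reminder of how little there is to the "lost art", here is a minimal Makefile for a one-file C program (the file name `hello.c` is illustrative): one dependency rule, and `make` rebuilds only when the source is newer than the binary.

```make
# Minimal Makefile: 'make' builds hello only when hello.c has changed.
CC     = cc
CFLAGS = -Wall -O2

hello: hello.c
	$(CC) $(CFLAGS) -o hello hello.c

clean:
	rm -f hello

.PHONY: clean
```

That timestamp-driven dependency check is the whole idea; everything else in larger build systems is elaboration on it.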
The MIT Missing Semester covers some of the middle-to-high levels of that stack, but there's no real training at the digital-logic-to-OS levels.
Even Nolan Bushnell's original Atari isn't an unbroken line: it was bought by Warner Brothers, then (mostly) bought by Jack Tramiel after he left Commodore. Infogrames, the French company that currently holds the name, has new management that has quit the NFT nonsense and is making Atari-related stuff that isn't awful.