King Emhyr
2 posts
In US public schools, the standard language for teaching "computer science" is Java, where you spend more time learning OOP than learning about a computer.

AP Computer Science A is apparently a college-level introductory course in computer science, and it's basically all that gets talked about amongst high schoolers interested in computer science. If this is what college-level means, I don't think I want to go to college anymore.

It seems like the primary reason Java was chosen is OOP, and the idea that OOP is easier to learn because it's more "natural", despite OOP being straight-up impractical (at least for teaching purposes). I'm not going to explain why it's impractical, because we've all experienced, or at least understand, its impracticality; that's likely one of the reasons you registered on this website in the first place.

I may be young, but I ain't naive. I went through the whole C++ pipeline: watching a bunch of CppCon talks, thinking template metaprogramming was the shit, being fascinated with "new" C++ features that already existed elsewhere in better forms, and programming in a purely object-oriented style with the whole C++ nine yards, including concepts to constrain polymorphism, only to be rewarded with a hierarchy of esoteric error messages... but after supporting and witnessing hell itself, I've discovered heaven and the angels that reside within it: Mike Acton, Casey Muratori, Abner Coimbre, Andrew Kelley, and many more, and the creed I've succumbed to: data-oriented design.

You can argue that data-oriented design is more unnatural and therefore more difficult to learn, but if you have the mental capacity to understand inheritance, method overriding, parametric polymorphism, and abstract classes, you have the mental capacity to understand, "Data goes in, data changes, then data looks different." And, "Accessing memory like bloom from Fortnite ≈ many cache misses ≈ many bullets not hitting the target." And, "(Transforming large sets of data => SoA instead of AoS) ≈ designing Minecraft farms."
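To make that SoA-versus-AoS point concrete, here's a minimal C sketch. The particle fields and the position-update transform are made up for illustration, not taken from any particular codebase; the point is only what the memory layout does to a loop that touches a few fields out of many.

#include <stddef.h>

/* AoS: every particle's fields are interleaved in memory. Updating only the
   positions still drags velocities, colors, and lifetimes through the cache. */
typedef struct {
    float x, y, z;
    float vx, vy, vz;
    float color[4];
    float lifetime;
} ParticleAoS;

void update_positions_aos(ParticleAoS *p, size_t count, float dt)
{
    for (size_t i = 0; i < count; ++i) {
        p[i].x += p[i].vx * dt;
        p[i].y += p[i].vy * dt;
        p[i].z += p[i].vz * dt;
    }
}

/* SoA: each field gets its own contiguous array, so the same transform only
   touches the bytes it actually needs and streams nicely through the cache. */
typedef struct {
    float *x, *y, *z;
    float *vx, *vy, *vz;
    size_t count;
} ParticlesSoA;

void update_positions_soa(ParticlesSoA *p, float dt)
{
    for (size_t i = 0; i < p->count; ++i) {
        p->x[i] += p->vx[i] * dt;
        p->y[i] += p->vy[i] * dt;
        p->z[i] += p->vz[i] * dt;
    }
}

Same transform, same asymptotic cost; the only difference is how the data is laid out for the loop that consumes it.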

Maybe it turns out data-oriented design is too unnatural. How about plain imperative programming, where you literally just write a list of instructions, like in Scratch?

Furthermore, if you look on Amazon at books that teach Java, they can easily run to 1,000 pages, and there's even one advertised as brain-friendly and beginner-friendly... despite being 752 pages. Just imagine a naive high schooler studying for their computer science exam. For comparison, the official book for C is 272 pages, a book about x64 assembly is 192 pages, and a book about data-oriented design is 307 pages. Holy shit... those three books combined are 771 pages, which is 19 more than the beginner-friendly Java book.

By the way, we literally spend half the entire course studying the language. We could've stopped learning about the language after methods, statements, and expressions and gotten on with understanding various data structures and algorithms and their real performance implications (not just Big O notation) in specific situations, and learned more about how the hardware handles our code and data... but we don't, because the "computer science" course doesn't like to teach about computers.
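Here's the kind of thing I mean by "real performance implications, not just Big O", as a rough C sketch (the struct and function names are made up): two loops with identical asymptotic complexity that behave very differently on real hardware.

#include <stddef.h>

/* Both sums are O(n). The array is read sequentially, which the prefetcher
   loves; the linked list chases pointers that can land anywhere in memory,
   so every step is a dependent load that may miss the cache. */

typedef struct Node {
    int value;
    struct Node *next;
} Node;

long sum_array(const int *values, size_t count)
{
    long total = 0;
    for (size_t i = 0; i < count; ++i)
        total += values[i];        /* contiguous, predictable access */
    return total;
}

long sum_list(const Node *head)
{
    long total = 0;
    for (const Node *n = head; n != NULL; n = n->next)
        total += n->value;         /* pointer chase: each load depends on the last */
    return total;
}

A course that stops at Big O calls these equivalent; a course that talks about the hardware explains why they aren't.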

Apparently, the older generation had it different, and better, than it is now. I've heard y'all studied C and even went down to 8088 assembly. What happened to that?

How did we go from programming computers to programming imaginary objects in a computer science class?

183 posts / 1 project

Object-oriented programming was created in the 1960s to solve specific design problems where data-method coupling made sense, to reduce the number of unused variables in structures. Its creators were very specific about it being a limited solution to a specific problem. People used procedural languages to implement object orientation when needed, by placing function pointers in structures dedicated to describing the behavior of classes (sketched in C below). At this stage it was all mixed paradigm, and object orientation was rarely overused, because the upfront cost of using it matched the long-term maintenance cost.

Then object-oriented languages gained popularity. The new language Java banned global functions, as an over-reaction during a time when people were tired of spaghetti code that used global variables everywhere. Instead of mastering functional programming and good coding practices, the same old spaghetti code could be crammed into objects with pointers to each other, in a cyclic dependency hell called the "visitor pattern". Then scammers realized they could sell books like snake oil to confused beginners who wanted shortcuts, portraying OOP as a silver bullet for any design problem, using trivial examples that look good in books but reach a dead end when forced to have multiple inheritance.

Offices were already full of herd mentality, because working together meant agreeing on a design and going in the same direction. Anyone disagreeing risked being labeled unprofessional, and developers tried to show off with more and more overengineered solutions, so that they could barely keep the design bug-free before they even started solving the actual problem. The majority of people were hired just to maintain the overcomplexity needed to do basic math through complex pipelined graphs of classes and deeply nested inheritance, which later had to be replaced with another design pattern when the old one could not handle new requirements. When multi-threaded programming was added on top of object orientation and everyone got deadlocks from putting mutexes everywhere, its back finally broke, and companies were forced to admit that using one pattern for everything was stupid, or lose to competitors who adopted data-driven programming, fully utilized multiple cores without locks, and had time to develop new features.
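To be concrete about the "function pointers in structures" style mentioned above, here is a minimal C sketch; the Shape/ShapeClass names are invented for illustration, not from any particular codebase.

#include <stddef.h>
#include <stdio.h>

/* A "class" is just a table of function pointers describing behavior. */
typedef struct Shape Shape;

typedef struct {
    const char *name;
    double (*area)(const Shape *self);
} ShapeClass;

struct Shape {
    const ShapeClass *cls;  /* pointer to the behavior table ("vtable") */
    double w, h;            /* instance data */
};

static double rect_area(const Shape *s) { return s->w * s->h; }
static double tri_area(const Shape *s)  { return 0.5 * s->w * s->h; }

static const ShapeClass RectClass     = { "rectangle", rect_area };
static const ShapeClass TriangleClass = { "triangle",  tri_area  };

int main(void)
{
    Shape shapes[] = {
        { &RectClass,     3.0, 4.0 },
        { &TriangleClass, 3.0, 4.0 },
    };
    /* Dynamic dispatch without any language-level OOP: just follow the pointer. */
    for (size_t i = 0; i < sizeof shapes / sizeof shapes[0]; ++i)
        printf("%s area: %.1f\n", shapes[i].cls->name,
               shapes[i].cls->area(&shapes[i]));
    return 0;
}

Used sparingly, this buys polymorphism exactly where it pays for itself, which is the mixed-paradigm situation described above.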

Object orientation at software companies is now what Fortran has long been for banks. Those with old codebases still need people to maintain the existing object-oriented code, so schools teach what companies request the most.

A real university education in computer science is, however, not that shortsighted, because universities know the importance of having the whole toolbox. Most of those students got started with Java, JavaScript, or Python, but you can't have a professor of compiler techniques who doesn't know functional programming.

I used to work at a company that was so obsessed with object orientation that they wrapped the standard math library so you would write ((x + 1).sin()) instead of sin(x + 1), which quickly got really hairy once the actual math formulas became complex on their own. Other companies I worked for only applied object orientation as a last resort, when other design patterns could not be used, because they had learned how expensive object orientation is to maintain.

King Emhyr
2 posts

I remembered that I'd posted this thread a couple of hours ago (hence the late reply) and decided to do a deep dive.

Here is a compilation of resources appertaining to the development, history, and description of AP Computer Science:

I'm very hopeful for Casey Muratori's Star Code Galaxy, but I'm unaware of any recent updates, other than it being "complicated".

Harvard's CS50, in my opinion, is very good, though I've never taken it. You can tell that Prof. David J. Malan put an immense amount of effort into the curriculum, unlike many other courses whose design decisions seem driven by the bandwagon effect.