AP Computer Science A is supposedly a college-level introductory computer science course, and it's basically the course every high schooler interested in computer science talks about. If this is true, I don't think I want to go to college anymore.
It seems like the primary reason Java was chosen is OOP, and the claim that OOP is easier to learn because it's more "natural," despite OOP being straight-up impractical (at least for teaching purposes). I'm not going to explain why it's impractical, because we've all experienced, or at least understand, its impracticality, which is likely one of the reasons you registered on this website in the first place.
I may be young, but I ain't naive. I went through the whole C++ pipeline: watching a bunch of CppCon talks, thinking template metaprogramming was the shit, being fascinated with new C++ features despite better alternatives already existing, and programming in a purely object-oriented style with the whole C++ nine yards, including
concepts to constrain polymorphism just to be greeted by a hierarchy of esoteric error messages... but after supporting and witnessing hell itself, I discovered heaven and the angels that reside within it: Mike Acton, Casey Muratori, Abner Coimbre, Andrew Kelley, and many more, along with the creed I now follow: data-oriented design.
You can argue that data-oriented design is more unnatural and therefore harder to learn, but if you have the mental capacity to understand inheritance, method overriding, parametric polymorphism, and abstract classes, you have the mental capacity to understand "data goes in, data changes, then data looks different." And "accessing memory like bloom in Fortnite ≈ many cache misses ≈ many bullets not hitting the target." And "(transforming large sets of data => SoA instead of AoS) ≈ designing Minecraft farms."
Maybe it turns out data-oriented design is too unnatural. Then how about imperative programming, where you literally just write a list of instructions, like in Scratch?
Furthermore, if you look on Amazon at books that teach Java, many of them run around 1,000 pages, and there's even one advertised as brain-friendly and beginner-friendly... despite having 752 pages. Just imagine a naive high schooler studying for their computer science exam. For comparison, the official book for C has 272 pages, a book about x64 assembly has 192 pages, and a book about data-oriented design has 307 pages. Holy shit... the latter three books have a combined 771 pages, which is 19 more than the beginner-friendly Java book.
By the way, we literally spend half the entire course studying the language. We could have stopped learning the language after methods, statements, and expressions and moved on to understanding various data structures and algorithms and their real performance implications (not just Big O notation) in concrete situations, and to learning how the hardware handles our code and data... but we don't, because the "computer science" course doesn't like to teach about computers.
Apparently, the older generation had it different, and better, than we do now. I've heard y'all studied C and even went down to 8088 assembly. What happened to that?
How did we go from programming computers to programming imaginary objects in a computer science class?