bumbread
I'm going to challenge your point, though not on whether it's true, but on whether it correctly identifies the exact problem.
First, as a user, I don't see a problem in web browsers being big and taking days to build ~ that isn't the user's problem. That is something to keep in mind.
Not only does this software run slow, it is full of security holes, because it is so big that no one, not even its makers, understands it.
https://www.theguardian.com/techn...nds-hack-senate-hearing-microsoft
True scope of the breach, which affected 100 companies and several federal agencies, is still unknown
Tech executives revealed that a historic cybersecurity breach that affected about 100 US companies and nine federal agencies was larger and more sophisticated than previously known.
The revelations came during a hearing of the US Senate’s select committee on intelligence on Tuesday on last year’s hack of SolarWinds, a Texas-based software company. Using SolarWinds and Microsoft programs, hackers believed to be working for Russia were able to infiltrate the companies and government agencies. Servers run by Amazon were also used in the cyber-attack, but that company declined to send representatives to the hearing.
If I understand correctly ~ you're saying that the problem is poor performance and memory usage problems of modern websites that use JavaScript. And your solution is to avoid JavaScript.
So I am sitting on a computer that is using 120 MB of RAM with the operating system and window manager loaded. Then I open Firefox, and not only is it really slow to start, the memory usage shoots up to 1 GB for one page that only displays text and 2D pictures. Then I open up 3 web pages on a machine with 4 GB of RAM, and my OS freezes from overwork until I give up and kill Firefox. Then I have to wait minutes for Firefox to go away so my machine can unfreeze. On Linux, not only can opening a webpage in Firefox lock your computer, I have seen it crash the entire operating system, depending on the webpage I click on.
Do you understand how big 1 GB is? If you think it is reasonable to require all of that just to display some text and 2D pictures, then I really don't think you do!
But in the current world as it is now, this is problematic, and will limit the amount of stuff websites can do. Some websites can be used as portable programs: games, interactive visualizations, mathematical solvers, and much more. Getting rid of JavaScript (or any scripting on the client side) will be problematic for those websites. The modern web is not just a data storage medium where you download data and view it on your computer later. Rather, it is an interactive medium where users interact with the websites, and websites in turn _have_ to change their behaviour dynamically. At least some form of client-side scripting is required for that. Which one ~ doesn't matter.
I don't care how these people want to make money, because they can't even display text and 2D pictures without using gigabytes of memory and cooking CPUs that were perfectly capable of doing this simple job decades ago.
Scripting once and deploying everywhere is a failure. You have to test your software on the machine it will run on, or you will get poor performance and bugs. Computers are not a magical abstraction, much as you wish they were.
Secondly, modern computers are fast, so JavaScript per se doesn't create the performance problem. Many websites are slow for a different reason: most of the time is spent in the huge number of DOM relayouts triggered by the JavaScript code.
It causes performance problems on all machines, including modern ones. You are only saying this because you are young and have never experienced better, and you don't understand how computers work. CPUs have gotten really fast, but memory access speed has not kept pace. If you are using an object-oriented language that scatters its memory all over the place, then you can't pipeline your data to the CPU for it to do work. So your super fast, expensive, power-sucking CPU that is glowing like a kitchen hotplate is just sitting there idling, waiting for data that takes an age in computer terms to arrive.
Watch and learn:
CppCon 2014: Mike Acton "Data-Oriented Design and C++"
https://www.youtube.com/watch?v=rX0ItVEVjHc
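The memory-layout point above can be sketched even in the web's own languages. Here is a small TypeScript comparison of the same data stored as an array of heap-allocated objects (pointer-chasing, cache-unfriendly) versus contiguous typed arrays (sequential access the CPU can prefetch); actual speedups vary by engine and machine, so the example only verifies that both layouts compute the same result:

```typescript
// Same data, two layouts: scattered objects vs contiguous memory.
const N = 1_000_000;

// Array-of-objects: each {x, y} is a separate allocation on the heap.
const objects = Array.from({ length: N }, (_, i) => ({ x: i, y: 2 * i }));

// Struct-of-arrays: x and y each live in one flat block of memory.
const xs = new Float64Array(N);
const ys = new Float64Array(N);
for (let i = 0; i < N; i++) { xs[i] = i; ys[i] = 2 * i; }

function sumObjects(): number {
  let s = 0;
  for (const p of objects) s += p.x + p.y; // follows a pointer per element
  return s;
}

function sumFlat(): number {
  let s = 0;
  for (let i = 0; i < N; i++) s += xs[i] + ys[i]; // streams linearly through memory
  return s;
}

// Both compute the same total; on most machines the flat version is
// noticeably faster because its memory accesses are sequential.
console.log(sumObjects() === sumFlat()); // true
```

This is the core of data-oriented design: arrange data by how it is traversed, not by how it is named.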
Casey also made a great video demoing old software that did the same thing as software today, yet ran much faster on far weaker machines with far less memory than modern software doing the same job needs on modern machines.
https://www.youtube.com/watch?v=GC-0tCy4P1U
The meme I have heard from bad programmers all my life: "My programming sucks? Buy a better machine. Only basement nerds care about performance, brogrammers just want to make money." So I have to buy a better machine, to make no progress, and go backwards? How about I just fire you, and hire someone better?
I have a solution more radical than not using JavaScript and the like: don't use websites that use these lousy languages, and promote websites that don't use them.
When the Internet was still young, people of like mind used to link their websites together in webrings. So if you found one website about a subject you liked, you could find the rest of them with one click. This can be done again. We can build a fast Internet island of our own, while the rest of the Internet slows and dies. Eventually people will notice our Internet is much nicer to use than the one they came from, and they won't be able to return to the bad one. Once you have had better, there is no going back.