Discussion: why is the web broken and about its replacement.
bumbread
Hello~! I've seen people here saying that the web is "fundamentally broken". As far as I understand, the reasons for such claims lie somewhere in the complexity of HTML/CSS/JavaScript, or in the fact that there are thousands of layers of abstraction stacked on top of each other, causing performance issues.

I would like to hear a more concrete answer to the question "what is fundamentally broken about the web?".

And once the problem is established, I'd like to discuss possible solutions to it.
Aphetres
What is wrong? Like all software, it runs slower every year, and no one notices or cares. Web browsers are getting so big that they are larger than some operating systems and take days to build. There is always a new security advisory out for them, and their sole funding model is to follow and spy on their users and to sell that information to shady people for billions of dollars.

If Casey thinks the web is broken, he should come to OpenBSD and learn what true suffering is, because any software that hogs memory runs horribly slowly on it unless you are using a brand-new machine.

bumbread
I'm going to challenge your point: not on whether it's true, but on whether it correctly identifies the exact problem.

First, as a user, I don't see a problem in web browsers being big and taking days to build ~ that isn't the user's problem. That is something to keep in mind.

If I understand correctly ~ you're saying that the problem is the poor performance and memory usage of modern websites that use JavaScript, and your solution is to avoid JavaScript.

But in the world as it is now, this is problematic, and it will limit what websites can do. Some websites are used as portable programs: games, interactive visualizations, mathematical solvers, and much more. Getting rid of JavaScript (or any scripting on the client side) would be a problem for those websites. The modern web is not just a data storage medium from which you download data and view it on your computer later. Rather, it is an interactive medium where users interact with websites, and websites in turn _have_ to change their behaviour dynamically. At least some form of client-side scripting is required for that. Which one ~ doesn't matter.

Secondly, modern computers are fast, so JavaScript per se doesn't create the performance problem. Many websites are slow, but for a different reason: most of the time goes into the huge number of DOM relayouts triggered by the JavaScript code.
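To make the relayout point concrete, here is a minimal sketch (plain DOM APIs; the ".row" class name is made up for illustration) of the classic read/write interleaving that forces a synchronous relayout on every iteration, next to a batched version that pays for layout only once:

```ts
// Assumes a page containing elements with the (hypothetical) class "row".
const rows = Array.from(document.querySelectorAll<HTMLElement>(".row"));

// Slow: writing a style and then reading offsetWidth forces the browser
// to recompute layout on every single iteration ("layout thrashing").
function resizeRowsThrashing(): void {
  for (const row of rows) {
    row.style.width = "50%";        // write: invalidates layout
    console.log(row.offsetWidth);   // read: forces a synchronous relayout
  }
}

// Faster: do all the reads first, then all the writes,
// so the browser only has to recompute layout once afterwards.
function resizeRowsBatched(): void {
  const widths = rows.map(row => row.offsetWidth);   // reads only
  rows.forEach((row, i) => {
    row.style.width = `${widths[i] / 2}px`;          // writes only
  });
}
```

Same language, same DOM, very different cost ~ which is why I'd blame the code before blaming JavaScript itself.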
Aphetres
bumbread
I'm going to challenge your point: not on whether it's true, but on whether it correctly identifies the exact problem.

First, as a user, I don't see a problem in web browsers being big and taking days to build ~ that isn't the user's problem. That is something to keep in mind.


Not only does this software run slowly, it is full of security holes, because it is so big that no one, not even its makers, understands it.

https://www.theguardian.com/techn...nds-hack-senate-hearing-microsoft


True scope of the breach, which affected 100 companies and several federal agencies, is still unknown

Tech executives revealed that a historic cybersecurity breach that affected about 100 US companies and nine federal agencies was larger and more sophisticated than previously known.

The revelations came during a hearing of the US Senate’s select committee on intelligence on Tuesday on last year’s hack of SolarWinds, a Texas-based software company. Using SolarWinds and Microsoft programs, hackers believed to be working for Russia were able to infiltrate the companies and government agencies. Servers run by Amazon were also used in the cyber-attack, but that company declined to send representatives to the hearing.




If I understand correctly ~ you're saying that the problem is the poor performance and memory usage of modern websites that use JavaScript, and your solution is to avoid JavaScript.


So I am sitting on a computer that is using 120 MB of RAM with the operating system and window manager loaded. Then I open Firefox, and not only is it really slow to start, the memory usage shoots up to 1 GB for one page that only displays text and 2D pictures. Then I open up 3 web pages on a machine with 4 GB of RAM, and my OS freezes from overwork until I give up and kill Firefox. Then I have to wait minutes for Firefox to go away before my machine can unfreeze. On Linux, not only can opening a webpage in Firefox lock your computer, I have seen it crash the entire operating system, depending on the webpage I click on.

Do you understand how big 1 GB is? If you think it is reasonable to require all of that just to display some text and 2D pictures, then I really don't think you do!



But in the world as it is now, this is problematic, and it will limit what websites can do. Some websites are used as portable programs: games, interactive visualizations, mathematical solvers, and much more. Getting rid of JavaScript (or any scripting on the client side) would be a problem for those websites. The modern web is not just a data storage medium from which you download data and view it on your computer later. Rather, it is an interactive medium where users interact with websites, and websites in turn _have_ to change their behaviour dynamically. At least some form of client-side scripting is required for that. Which one ~ doesn't matter.


I don't care how these people want to make money, because they can't even display text and 2D pictures without using gigabytes of memory and cooking CPUs that were perfectly capable of doing this simple job decades ago.

Scripting once and deploying everywhere is a failure. You have to test your software on the machine it will run on, or you will get poor performance and bugs. Computers are not a magical abstraction, much as you wish they were.


Secondly, modern computers are fast, so JavaScript per se doesn't create the performance problem. Many websites are slow, but for a different reason: most of the time goes into the huge number of DOM relayouts triggered by the JavaScript code.


It causes performance problems on all machines including modern ones. You are only saying this because you are young and have never experienced better, and you don't understand how computers work. CPUs have gotten really fast, but memory access speed has not kept pace. If you are using an object-oriented language that scatters its memory all over the place, then you can't stream your data to the CPU for it to do work on. So your super-fast, expensive, power-sucking CPU that is glowing like a kitchen hotplate is just sitting there idling, waiting for data that, in computer terms, takes an age to arrive.
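To illustrate the memory-access point, here is a sketch (my own illustration, in TypeScript for concreteness; the exact difference depends on the JavaScript engine) comparing a million values stored as separate heap objects, which the engine can scatter anywhere, with the same values in one contiguous typed array:

```ts
const N = 1_000_000;

// Object-per-value: each element is its own heap allocation,
// so walking the array can mean chasing pointers all over memory.
const objects = Array.from({ length: N }, (_, i) => ({ value: i * 0.5 }));

// Contiguous layout: every value lives in a single flat buffer.
const flat = new Float64Array(N);
for (let i = 0; i < N; i++) flat[i] = i * 0.5;

function sumObjects(): number {
  let total = 0;
  for (const o of objects) total += o.value;
  return total;
}

function sumFlat(): number {
  let total = 0;
  for (let i = 0; i < flat.length; i++) total += flat[i];
  return total;
}
```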

Watch and learn:

CppCon 2014: Mike Acton "Data-Oriented Design and C++"
https://www.youtube.com/watch?v=rX0ItVEVjHc


Casey also did a great video demoing old software that did the same things as software today, running much faster on far weaker machines with far less memory than modern software needs to do the same job on modern machines.

https://www.youtube.com/watch?v=GC-0tCy4P1U

The meme I have heard from bad programmers all my life: "My programming sucks? Buy a better machine. Only basement nerds care about performance, brogrammers just want to make money." So I have to buy a better machine just to make no progress, or even go backwards? How about I just fire you and hire someone better?


I have a solution more radical than avoiding JavaScript and the like: don't use websites that use these lousy languages, and promote websites that don't.

When the Internet was still young, people of like mind used to link their websites together in webrings. So if you found one website about a subject you liked, you could find the rest of them with one click. This can be done again. We can build a fast Internet island of our own, while the rest of the Internet slows and dies. Eventually people will notice our Internet is much nicer to use than the one they came from, and they won't be able to return to the bad one. Once you have had better, there is no going back.

BernFeth
We can build a fast Internet island of our own, while the rest of the Internet slows and dies. Eventually people will notice our Internet is much nicer to use than the one they came from, and they won't be able to return to the bad one. Once you have had better, there is no going back.


Can we actually? I'm also young and inexperienced, but I would be willing to help with that!
bumbread
Not only does this software run slowly, it is full of security holes, because it is so big that no one, not even its makers, understands it.


Yeah, security is an issue. I don't know enough about web security to discuss it in depth, but I believe it's an issue that will not be magically solved by changing the architecture of the web. Pointing out a problem isn't the same as correctly attributing it, let alone solving it. For now, security _is_ an issue that needs to be solved; I'll leave the reasons why aside.

Do you understand how big 1 GB is? If you think it is reasonable to require all of that just to display some text and 2D pictures, then I really don't think you do!


Yes, it is unacceptable. However, this is a browser problem, not a web problem. If you think otherwise, please provide some backup for that point. I wonder whether this is because the most popular browser engines simply eat that much memory, or whether the web actually forces the browsers to do it. My Firefox browser (on Linux) has about 9*10^5 pages allocated; with 4 KiB pages that works out to roughly 3.5 GB of memory (I don't remember the exact figure).

I don't care how these people want to make money, because they can't even display text and 2D pictures without using gigabytes of memory and cooking CPUs that were perfectly capable of doing this simple job decades ago.


At this point I have no idea what you're arguing for or whether this is still on topic. What people? If you're talking about web developers, why are you blaming them? Have you profiled their code? Are you sure it is _their_ fault? Tomorrow I'll try making a simple website that 'displays some text and 2D images', or rather I'll download one website that does, see what's going on with it, and see what is in my power to do about it.

Scripting once and deploying everywhere is a failure. You have to test your software on the machine it will run on, or you will get poor performance and bugs. Computers are not a magical abstraction, much as you wish they were.


No, you have to test your code on all _platforms_ you are going to ship on, otherwise you will get bugs. The web is meant to be displayed compatibly on different browsers (but ironically it somewhat struggles to provide the correct tools to do so). I've seen apps whose layout wasn't good on mobile phones because the text lines were too wide. But the browser actually does provide the tools needed to make a website that displays sanely on different devices. People not caring, or those tools being too hard to use, might be part of the problem. The web is one platform, an abstraction over different platforms. That is a fact. It's not perfect, and it's not performant.

And this is computers 101, but every abstraction has a performance penalty; you can't do anything about that. Even an OS is an abstraction over hardware that programs use to provide users with functionality to solve their problems. Computers by themselves aren't abstractions, so I have no idea what you're trying to say.

It causes performance problems on all machines including modern ones


(1) Indeed, there are websites that are slow. I don't consider my machine anything more than an average laptop, but it is pretty fast, and nevertheless there are websites that fail to deliver the speed they should.
(2) What is "it"?

You are only saying this because you are young and have never experienced better, and you don't understand how computers work


Whoa man, hold your horses. You don't know who I am, you don't even know my name. You have no right to assume who I am or what I may or may not 'understand'. I see enough reason to stop replying to you here.

The rest of the paragraphs weren't worth replying to. You seem to have an issue differentiating between pointing out the problems and correctly attributing them.

Are the problems of 'accessing data on the internet' also 'problems of the web'? Not really. If I make a really slow browser, the poor performance is a problem of that browser, not of the web.
bumbread
Can we actually?


Technically yes. Will it succeed? That depends on a bunch of things. The first is the promised quality. The second is how many people we reach. If only 3 people are connected, that's not much fun.

I would postpone any work on that until the problems are identified and the solutions are discovered and proven. If someone wants to make the new web, that person has to make sure he solves the problems in the real world, instead of making "just another library that he thinks is better but has no way of proving it", which is really easy to do.
Aphetres
BernFeth
We can build a fast Internet island of our own, while the rest of the Internet slows and dies. Eventually people will notice our Internet is much nicer to use than the one they came from, and they won't be able to return to the bad one. Once you have had better, there is no going back.


Can we actually? I'm also young and inexperienced, but I would be willing to help with that!


TBH I don't know anything about web programming beyond HTML, and I don't want to. I just use the web like everyone else, and find it awful. I would be happy to use forums like this to get info, and to read blog posts. You don't need JavaScript for that.

To get my retro text fix, I tried out Gopherspace recently, but didn't find much there. I think with Gopher you just dump your files in a folder and it takes care of the layout, so every site in Gopher looks the same except for the information. I admire that simplicity.
Asaf Gartner
Handmade Network Staff
I would say that there are two fundamental problems with the web:
  1. The browser eats up a lot of the performance headroom you get from your base platform.
  2. It's not really built to do what you want. The APIs are often bad.

Having said that, most of the problems we see on the web are not directly due to the platform. They're more often due to programmers not understanding the platform and using it inefficiently on top of the platform's inefficiencies.
For an example, just compare Figma to Reddit.
This extends even to the server, where you have a lot more freedom. There's slow code everywhere.
I don't think fixing the platform would automagically fix the apparent issues. The same issues crop up in other platforms, like mobile, where they had the opportunity for a fresh start. The main thing that needs to change is the programmers' mindset.
bumbread
AsafG
I would say that there are two fundamental problems with the web:
It's not really built to do what you want. The APIs are often bad.


Well on one hand that assumes what one wants, but on the other hand it opens up another, more general problem:
>> How do you provide an interface that is equally good for most of the things people want to have on the web?

Some people want to post simple articles with text and images. Other people will want more than text and images: they want interactive snippets (like canvases or iframes) and dynamic updates; they might even want to create a web application.

Sometimes you want your API to be simple and abstracted. Other times you're concerned about the details. I don't think there's a single formula that'll make everyone happy.

What do you think about the idea of exposing parts of the GUI rendering API (e.g. when redraw and relayout happen) so that programmers have the ability to explicitly optimize their interface if they want to?
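To make that concrete, here is a purely hypothetical sketch ~ none of these names exist in any browser; the closest real tools today are probably requestAnimationFrame and the CSS `contain` property ~ of what an explicit relayout API might look like:

```ts
// Hypothetical API: every name below is invented for illustration.
interface ExplicitLayoutRegion {
  readonly root: HTMLElement;
  // Queue a style/content change without triggering layout yet.
  queueMutation(apply: () => void): void;
  // Run the queued mutations and relayout just this subtree, once.
  commit(): Promise<void>;
}

declare function createExplicitLayoutRegion(root: HTMLElement): ExplicitLayoutRegion;

// Usage sketch: batch many DOM changes, then pay for a single relayout.
async function appendFeedItems(region: ExplicitLayoutRegion, items: string[]) {
  for (const text of items) {
    region.queueMutation(() => {
      const li = document.createElement("li");
      li.textContent = text;
      region.root.appendChild(li);
    });
  }
  await region.commit(); // one relayout for the whole batch
}
```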

I know that doesn't solve the problem as a whole, but it somewhat mitigates the 'slow code' problem. Although I think that making the API more complicated will just result in more frameworks for dealing with it. Which is why I think that

Having said that, most of the problems we see on the web are not directly due to the platform. They're more often due to programmers not understanding the platform and using it inefficiently on top of the platform's inefficiencies.


is likely to be the actual problem with the web.

The same issues crop up in other platforms, like mobile, where they had the opportunity for a fresh start


I agree. While I can develop for the web and produce fairly responsive code, developing for mobile is developer hell. I cannot do anything in any mobile development SDK without it being ultra slow and/or making my PC very hot. The process itself presupposes that you need a tool in order to generate information about your project.
Asaf Gartner
Handmade Network Staff
bumbread
>> How do you provide an interface that is equally good for most of the things people want to have on the web?


Same way you do it everywhere else. You give programmers as much access to the hardware as possible.


bumbread
Some people want to post simple articles with text and images. Other people will want more than text and images: they want interactive snippets (like canvases or iframes) and dynamic updates; they might even want to create a web application.


If you want to make a game, you use a game engine. If you want to view a PDF, you use a PDF viewer. If you want to display a web page, use a web engine.
If browsers provided a platform layer that you could access directly, the web engine would just be another app implemented on top of that platform layer (and bundled with the browser so you wouldn't have to redownload it for every site).
I'm oversimplifying things, but what you really want is an internet-aware OS, not a web browser.
BernFeth
What does a web browser spend most of its time on, anyway?

Does a browser not care much about the server side, or does that still affect it?
bumbread
You give programmers as much access to the hardware as possible.


You cannot give raw access to hardware. No GPU, no filesystems. There are security concerns: a web developer writing client code should not be able to trivially create a malware application and access user data.

Therefore some API layers are required. Otherwise I understand what you're saying: we should provide a minimal, lowest-level API that people can use to create their websites. But I don't think this answers the question ~ this can already be implemented using WASM, which is supported by web browsers. So why can't we count WASM as the new web, or at least as the path to the new web?
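For reference, loading and running a WASM module in today's browsers already looks roughly like this (the `solver.wasm` file and its exported `solve` function are made up for illustration):

```ts
// Fetch, compile and instantiate a WebAssembly module in the browser.
// "solver.wasm" and its exported "solve" function are hypothetical.
async function runSolver(): Promise<number> {
  const { instance } = await WebAssembly.instantiateStreaming(
    fetch("solver.wasm"),
    { env: {} }              // imports the module expects, if any
  );
  const solve = instance.exports.solve as (x: number) => number;
  return solve(42);
}
```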

I'm oversimplifying things, but what you really want is an internet-aware OS, not a web browser.


I like how this analogy reflects the function of an OS. In the PC world, an OS abstracts hardware away from users so that they can run applications to solve their problems. An OS has to provide safety and compatibility, and provide ways to store information on the hard drive. It's the same with web browsers: they abstract internet protocols and packets away from users, which makes it easier to create web apps, and they also provide safety and some auxiliary features. This analogy is good food for thought.

As we know, Internet communication is layered. At the bottom there is a physical layer that specifies how two physical devices talk to each other, and at the top there's an application layer that has all of that abstracted away, so all you have to do is type a URL into your browser to access an internet resource.

The question is: if the web is going to change, then this layer stack has to change at some point. At which point is it best to cut the stack and start implementing the protocols from scratch? I'm assuming it is either just below HTTP or just above it. Certainly the new protocols will take some inspiration from the current ones; the major differences will still be in the APIs.

BernFeth
What does a web browser spend most of its time on, anyway? Does a browser not care much about the server side, or does that still affect it?


As far as I know, it spends most of its time either rendering and doing DOM work, or processing HTTP requests.

The server side should not matter at all, because every server is a computer over which the developers have full control, so that is rarely the root of the problem.
Asaf Gartner
Handmade Network Staff
bumbread
So why can't we count WASM as the new web, or at least as the path to the new web?


The primary reason is that you're still running inside a web engine, when most of the interesting things you want to make are not websites. Games are not websites, Gmail is not a website, Figma is not a website, the HMH episode guide is not a website.
Right now the best you can do is load a website that runs WASM, but what you want is to load the WASM and run it directly, without the surrounding assumption that it's running inside a website.
Dawoodoz
The problem is that the same protocol is being used for too many things at the same time without the ability to throw out old functionality when it becomes a security problem.

I would make a new protocol inspired by DHTML but with fast partial updates of content and a pre-generated stream of media for each page.

* The servers run the main logic so that the browser can be minimal: just display media and receive input. Changes to a news feed update a part of the page's layout using a message from the server, so the client browser never has to busy-wait (a rough sketch of such an update message follows this list).
* There are different modes for text documents and applications so that applications get a clean protocol for layout trees while text can flow freely on window resize.
* The server sends resources to the client in a continuous stream of progressively loaded resources that are packed together in advance for long-latency satellite internet connections. Image dimensions are automatically packed together with the text during server-side compilation so that the site never jumps around when images load.
* No external resources from other sites allowed. All image previews on links are packed in read order into the stream when the server updates the resource stream. This will make loading pages a lot faster.
* No dynamic script interpretation on either server or client. There is no room for script injection attacks if data and logic are fully separate to begin with. Just protect the server against buffer overflow attacks and the like using memory protection and higher-level compiled languages.
* Client-side scripts can only create optional visual effects for faster visual feedback. They have absolutely no access to the user's file system and cannot implement any main features. Any application wanting to do heavy calculations on the host (like a graphical calculator or grammar checker) has to be downloaded into a secure format where all communication with the computer or the internet needs explicit permission from the user when receiving updates.
* All public websites in the system must be registered for search indexing before they get a domain name. Every new search engine will have access to the same list of public sites for fair competition.
* Every person and company gets one top domain for free so that thieves cannot extort people for money by stealing expired domain names. More domains can be paid for by contributing as a server to the domain-hosting infrastructure.
* Video-on-demand services must use a standard protocol in a special television mode so that you can switch to another service provider without leaving full-screen. No more trying to remember where the pause key is for every streaming service that tries to reinvent the basics. There would be a lot more innovation if one could change the usability of all services at once based on one's own preferences. Markets are usually more competitive when different types of services are separated rather than good ones bundled with bad.
* Online games must also have a special mode where you can install game engines as trusted plugins and then have a store with untrusted games for each engine behind high-level safety abstractions. This part of the browser should be optional to install for the majority who don't play games in browsers.
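Since the bullet about partial updates is the core of the idea, here is a purely hypothetical sketch of what one server-pushed update message in such a protocol might carry; every field and type name is invented for illustration and is not part of any existing protocol:

```ts
// Hypothetical wire format for a server-pushed partial update.

// An image delivered inside the pre-packed stream, with its dimensions
// known up front so the page layout never jumps when it arrives.
interface StreamedImage {
  id: string;
  width: number;
  height: number;
  bytes: Uint8Array;
}

// A partial update replaces one node of the already-delivered layout tree.
interface PartialUpdate {
  page: string;                 // which page the update belongs to
  targetNode: string;           // id of the layout node being replaced
  newText?: string;             // replacement text content, if any
  newImages?: StreamedImage[];  // images packed with the update, in read order
  sequence: number;             // ordering, so stale updates can be discarded
}

// The client just applies updates as they arrive; it never polls the server.
function applyUpdate(
  tree: Map<string, { text?: string; images?: StreamedImage[] }>,
  update: PartialUpdate
): void {
  const node = tree.get(update.targetNode);
  if (!node) return;                                  // unknown node: ignore
  if (update.newText !== undefined) node.text = update.newText;
  if (update.newImages) node.images = update.newImages;
}
```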