Still dumbfounded by bad software - Microsoft Edge

Sometimes I am still completely dumbfounded by bad software. Long story short, I'm not a bandwagon person. Edge came out for Windows and, compared to IE, I really liked it. It was faster, more secure, and available on Windows by default, and since I don't like to install too much software and bloat up my system, I used it. I'm also not really a web developer, so I don't care about the web dev arguments against it.

I'm on an i7 machine with 16GB of RAM and a video card I got in 2016. For the last year or two, I've been using Edge as my primary browser, and I frequent mostly four websites: Twitter, LinkedIn, Slack, and YouTube. When on these sites, my machine would slow down drastically and my fan would RACE. I just accepted that it was probably crappy front-end framework garbage. Finally, it got unbearable and I was starting to think about reformatting or getting a new machine.

Then I installed an updated version of Chrome and it's like I got a new system - I haven't heard my fan since, and the computer performs way faster. I hadn't realized that, since I pretty much always had Edge open, it was Edge that was causing all the problems. And since Microsoft forces updates, it wasn't like I was using an outdated version of Edge either.

So here I find myself, a software professional, dumbfounded and almost buying a new box due to a simple software problem.

My question is - what are some tips for pinpointing this kind of thing? I used Process Explorer and it wasn't super obvious, since it wasn't like Edge was using 99% CPU or anything... In fact it was only using 25%, but this was somehow causing my CPU to race. What are some good clues in Task Manager or Process Monitor that can help me isolate lousy software like this in the future and avoid it? I find it almost unbelievable that an OS manufacturer cannot get process and memory management right on their own system, but an outside vendor can.

One word - JavaScript. 25% of CPU means it is using 100% of one core if you have a quad-core. JavaScript executes mostly in a single thread. That's why this whole Slack, Discord, etc. JavaScript single-page "app" thing is a complete nightmare, browser-based or standalone Electron-based.

But if you have not installed one yet - get an adblocker. The internet becomes 100% faster due to the fact that most ads nowadays run an insane amount of JavaScript.
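
To see that one-pegged-core pattern for yourself instead of guessing from the overall number, here is a minimal sketch in Python (assuming the third-party psutil package; just an illustration, not a proper profiler):

# Minimal sketch: the overall CPU % can hide a single pegged core.
# Assumes the third-party psutil package (pip install psutil).
import psutil

# Overall usage is averaged across all cores, so one fully busy
# core on a quad-core machine shows up as roughly 25%.
print("overall:", psutil.cpu_percent(interval=1.0), "%")

# Per-core usage makes the pegged core obvious.
for i, pct in enumerate(psutil.cpu_percent(interval=1.0, percpu=True)):
    print(f"core {i}: {pct:5.1f}%")

Run it while the offending site is open: one core near 100% while the rest sit idle means a single-threaded workload.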
Some people also have the impression that Google deliberately programs YouTube etc. in a way that trips up other browsers, so that Chrome seems faster. I don't know if that's true, but it wouldn't surprise me, as the YouTube frontend changes constantly for no apparent reason.
That sounds weird, Todd. I haven't used Edge in ages, but I recall it being the lightest on resources compared to the likes of Chrome/Firefox. Especially on very low-end systems (think the latest generation of the cheapest, under-powered laptops) it seemed to be the only browser with acceptable performance, which more than made up for its lack of features.
Also, haven't they switched to the Chromium engine?
mmozeiko
One word - JavaScript. 25% of CPU means it is using 100% of one core if you have a quad-core. JavaScript executes mostly in a single thread. That's why this whole Slack, Discord, etc. JavaScript single-page "app" thing is a complete nightmare, browser-based or standalone Electron-based.

But if you have not installed one yet - get an adblocker. The internet becomes 100% faster due to the fact that most ads nowadays run an insane amount of JavaScript.


It's just crazy to me how JavaScript ended up becoming the foundation of the web and, by extension, a lot of other platforms these days (e.g. mobile through React Native). Particularly confounding is how it became the basis for a lot of cloud server technology through Node. Ugh... I've had to do some Node development for work and it's absolutely awful.

Which reminds me...
Honestly, whatever other scripting language could have come out of that scripting project at the start of the browser war in '95 would still be single-threaded and have the exact same issue of being able to peg only a single core.

And I don't see any other sandboxed scripting language being able to fundamentally prevent what happened with Node.js either.
Hm, Martins, thanks for raising that point. You're right - Python does something similar. I distinctly remember a task where I wanted to use multiple actual threads, but since Python has the global interpreter lock, that exact thing happened: 25% of the CPU used, but the clock revving up.
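
For anyone who wants to see the GIL behaviour concretely, here is a minimal standard-library sketch (the function names are mine, purely for illustration): CPU-bound work split across threads finishes no faster than running it serially, while separate processes actually use all the cores.

# Minimal sketch of the GIL: CPU-bound threads don't scale across
# cores, while separate processes do. Standard library only.
import time
from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

def burn(n):
    # Pure-Python CPU-bound loop; a thread running it holds the GIL.
    total = 0
    for i in range(n):
        total += i * i
    return total

def timed(executor_cls, workers=4, n=5_000_000):
    start = time.perf_counter()
    with executor_cls(max_workers=workers) as ex:
        list(ex.map(burn, [n] * workers))
    return time.perf_counter() - start

if __name__ == "__main__":
    # On a quad-core, the thread version pegs one core (~25% overall
    # usage) and takes roughly 4x as long as the process version.
    print("threads:  ", timed(ThreadPoolExecutor))
    print("processes:", timed(ProcessPoolExecutor))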

On another note, can anyone explain or point me to an explanation of how clock speed is related to CPU usage, or if it even is?

For example, I've seen CPU usage near 100% while the clock isn't as high as when usage is at 25%. So I wonder what causes the clock to go faster? When I google this, I get a bunch of stuff about "Max CPU usage", but that's not what I'm talking about here - CPU usage usually refers to the % of CPU cores being used... but that seems to be independent of the GHz at any given time.
Todd
I've seen CPU usage near 100% while the clock isn't as high as when usage is at 25%.

This is to prevent overheating. If all cores ran at 100% with a high frequency, that would generate too much heat. That's why modern CPUs can run at max frequency only when CPU usage is low, or when only a very few cores are fully loaded.

This is well-documented behaviour: https://en.wikichip.org/wiki/intel/xeon_silver/4116#Frequencies

You can see from the table in that link that this gets even worse when you start using AVX-512 instructions. They use so much power that the CPU needs to drop its frequency a lot. Read more here: https://blog.cloudflare.com/on-th...gers-of-intels-frequency-scaling/
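
If you want to watch the usage/frequency relationship live, here is a minimal sketch (again assuming the psutil package; on some platforms cpu_freq() reports only the base clock or needs elevated privileges):

# Minimal sketch: sample overall CPU usage next to the current clock
# to watch the frequency scale with load. Assumes the psutil package;
# cpu_freq() may be unavailable or imprecise on some platforms.
import psutil

for _ in range(10):
    usage = psutil.cpu_percent(interval=1.0)  # blocks ~1s, % across cores
    freq = psutil.cpu_freq()                  # MHz namedtuple, or None
    if freq is not None:
        print(f"usage {usage:5.1f}%  clock {freq.current:7.1f} MHz")
    else:
        print(f"usage {usage:5.1f}%  clock unavailable")

Kick off something heavy while it runs and you can watch the clock change as the load spreads from one core to many.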