[Julio] has an older computer sitting on a desk, and recorded a quick video with it showing how fast this computer can do seemingly simple things, like open default Windows applications including t…
Interesting take on compatibility vs performance. I gotta imagine capturing user data and sending it to a cloud collector is also a big culprit.
Massive amounts of telemetry data, plus nearly every app these days being a web app, just chew through your hardware. We use Teams at work and it's god awful. Hell, even Steam is a problem: just having your friends list open can cost you FPS in some games.
As someone who has worked in IT for over a decade: yes. We keep making things incredibly fast, and for complex operations that speed gain is realized, but for diverse, simple tasks there's ultimately very little difference between something rather old and something rather new. The most significant uplift in real-world performance has been the SSD. Simply eliminating, or nearly eliminating, the seek-time delay of spinning disks is by far the best thing that's happened for performance. Newer OSes and newer hardware go hand in hand, because with added hardware speed comes software complexity, which is why a late-stage Windows 7 system will typically outperform an early-stage Windows 10 machine. By "stage" I mean the point in the OS's life when it's considered "current": early-stage meaning it has only recently become the newest OS, late-stage meaning it's about to be overshadowed by something newer.
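If you want to see the seek-time effect for yourself, here's a rough Python sketch. "bigfile.bin" is just a placeholder for any large file you already have, and OS caching will flatter the numbers on repeat runs, but the gap between a spinning disk and an SSD on scattered reads is hard to miss:

    # Time a handful of scattered reads in a big file. On a spinning disk,
    # each read pays a seek penalty; on an SSD it mostly doesn't.
    import random
    import time

    PATH = "bigfile.bin"   # placeholder: point at any multi-GB file you have
    READS = 200
    BLOCK = 4096

    with open(PATH, "rb") as f:
        f.seek(0, 2)       # jump to the end to learn the file size
        size = f.tell()
        offsets = [random.randrange(0, size - BLOCK) for _ in range(READS)]
        start = time.perf_counter()
        for off in offsets:
            f.seek(off)
            f.read(BLOCK)
        elapsed = time.perf_counter() - start

    print(f"{READS} random {BLOCK}-byte reads: {elapsed * 1000:.1f} ms total, "
          f"{elapsed / READS * 1000:.2f} ms per read")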
Microsoft made great performance gains with Windows over many years: since migrating everything to the NT kernel around Windows XP, things got faster and faster, right up to around Windows 8. Windows 7 was the last version, IMO, designed to be faster than its predecessors, with more speed improvements than losses from the added complexity of the OS; from then on, we've been adding complexity (i.e., slowing things down) faster than we can optimize and speed things back up. Vista was a huge leap forward in security, adding code signing, specifically for drivers and such; in the process, MS streamlined drivers to run in a more native way, and old-style kernel-mode drivers became more or less a thing of the past. This, however, caused a lot of issues, as XP-era drivers wouldn't work with Vista very well, if at all. Windows 7 further streamlined this, and as far as I know there have been minimal, if any, improvements since.
In all of these cases, being based on the XP codebase (derived from NT4), it's still functionally slower than 9x, since the 9x line was written largely in x86 machine code rather than C, which is what NT is built on, AFAIK. The migration to C brought two things with it. The first, and most pertinent, is slowdowns, because compiled C is only as tight as the compiler's optimizations allow, and those lagged behind hand-tuned machine code. The second is portability: once the codebase is C, the platform can be recompiled fairly easily for different architectures. This was a long-term play by MS to ensure compatibility with whatever architecture comes next; all they'd need to make Windows work on architecture X is a C compiler for X, then they could start compiling and debugging. MS has been ready for this and has even produced builds of Windows for ARM and for MIPS specifically, and could likely migrate to RISC-V anytime they want (if they haven't already). This was the most significant slowdown from 9x to XP.
As time went on, security features started being integrated into the OS at the kernel level: everything from driver and application signing to encryption (full-disk, aka BitLocker, and data in flight, aka AES/HTTPS) and more. The TPM requirement for Windows 11 is the next basic step in this march forward for security, and they're going this way because they have to. To be considered a viable OS for high-security applications, like government use, Windows must have security features that restrict access and protect data both in flight and at rest (on disk), and the TPM is the next big step toward that. The random seed from the TPM is far superior to any pseudo-random software seed, and its secured vault ensures that only authorized access is permitted to the security keys it holds, for things like full-disk encryption. The entire industry has been moving in this direction just under the surface, and if you haven't been watching for it, it's been completely invisible to you. That describes most consumers, and especially gamers, who just want fast games and reliable access to their computers.
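To be clear, the snippet below doesn't touch a TPM at all; it's just a Python illustration of the underlying point: a software PRNG seeded with a known value is completely predictable, while randomness requested from the OS (which modern systems can feed from hardware sources) can't be replayed that way.

    # A seeded software PRNG gives the same "random" output every run...
    import random
    import secrets

    seeded = random.Random(1234)
    print(seeded.getrandbits(128))   # identical on every machine, every run

    # ...whereas the OS-provided CSPRNG (secrets / os.urandom) can't be
    # reproduced from a known seed, which is why keys should come from here.
    print(secrets.token_hex(16))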
Speaking of consumers: at the same time, MS, like almost every software/web/whatever company, has been moving toward turning you, and specifically your data, into a product they can sell. This is the Google approach. At least until fairly recently, Google didn't make money from its customers directly; instead it harvested their data, profiled its users, and sold advertisements based on that information, and it was INCREDIBLY successful at it, making plenty to keep itself afloat. It has since gotten into hardware and all the "as a service" offerings, which has let it grow. Facebook and Amazon have done the same (among others, but there are too many to list), and many other companies, including MS, are wondering "why not us too," because they see dollar signs down the road as long as they can collect enough information about you to sell. So MS, in its unique position, can basically cram down your throat all the data-harvesting malware it wants, provided it never gets flagged as what it really is: MALWARE.
IMO, since Windows 7 they've been doing recon on all their users to gather this information, which is part of why everyone is being pushed into using a Microsoft account for their PC login on any non-Pro, non-Enterprise version of Windows: that way the data they're collecting can be tied to you specifically. As of Windows 11, this has ramped up significantly. More and more malware observing you and your behavior, basically building an advertising profile they can sell. They want more information all the time, and the process of collecting it and shipping it back to MS has become more and more invasive; those processes take computing power away from you, the consumer, to serve MS's end goal of selling you, their paying customer, to their advertisers. They get paid on both sides (by you, for the product, and by advertisers, for your information). The worst part is that they haven't really had any significant push-back on any of it.
If you go back to the Windows 9x or DOS/Windows 3.1 days, none of this was happening, so the performance you got was the performance the hardware could deliver. Now all of your programs have to go through so many layers before they actually hit the hardware that it's slowed things down to the point of being DRAMATICALLY NOTICEABLE. So yeah, if you're doing something intensive, like compression, encryption, or a benchmark, you'll get very close to the real performance of the system; but if you're switching between apps, launching relatively small programs frequently, and generally multitasking, you're going to be hit hard by this. Not only does the OS need to index your actions to build your advertising profile, it also needs to run the antivirus over every file you access to make sure nobody else's malware runs, and observe everything you do so it can report back to the overlords. In this always-on, always-connected world, you're paying for them to spy on you pretty much all the time. It's so DRAMATICALLY WORSE with Windows 11 that it's becoming apparent to everyone that this is happening; as someone who has watched all of this grow from the shadows of IT for a decade, I'm entirely unsurprised. Simply upgrading your computer to a newer OS makes it slower, always. I've never wondered why; I've always known. There are more moving parts in the way. It's not that the PC is slower, it just has SO MUCH MORE TO DO that it doesn't move any faster, and often it's noticeably slowed down by all of these processes.
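You can get a crude feel for that overhead with something like this Python sketch: it just spawns and reaps a trivial process a few dozen times, so most of what you're timing is the launch path itself (loader, filters, scanners, and whatever else sits in the way).

    # Time repeated launches of a do-nothing process; the work is nil, so the
    # measurement is mostly the per-launch overhead of the system.
    import subprocess
    import sys
    import time

    RUNS = 30
    start = time.perf_counter()
    for _ in range(RUNS):
        subprocess.run([sys.executable, "-c", "pass"], check=True)
    elapsed = time.perf_counter() - start

    print(f"{RUNS} launches: {elapsed:.2f} s ({elapsed / RUNS * 1000:.0f} ms each)")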
Without jumping ship to Linux or some other FOSS OS, you're basically SOL.... Your phone is spying on you (whether Android or iOS), your PCs are spying on you (whether Chromebook, Windows, or Mac), your "smart" home everything is spying on you, whether you have Amazon Alexa, Google Home, or Apple's equivalent... Now even your car is starting to spy on you. Whatever it is, if it's more complex than a toaster, it's probably reporting your information to someone. There are very few, if any, software companies not doing this. Your choice then comes down to equally bad options: pick who gets to collect your data and sell it to whoever wants it, or go full tinfoil-hat, expunge everything from your life with a circuit more complicated than a 1980s fridge, and go live in the forest. I'm doomed to sell my data to someone; so far it's mainly been MS and Google. My line of work doesn't really allow me to go "off-grid" and survive in my field, and not everyone is in my position. So make your choice. This isn't going to get better anytime soon, and as far as I can see it will never stop... so choose.
The really sad part for me is the amount of e-waste this produces, especially with devices like laptops.
A clean Linux distro can extend a laptop's life by a decade. I have a laptop from the Core 2 Duo era that I threw an SSD into and put Linux on. Perfectly serviceable as a basic machine.
Hasn't this always been the case? Software development is a balance between efficiency of code execution and efficiency of code creation. 20 years ago people had to code directly in assembly to make games like RollerCoaster Tycoon, but today they can use C++ (or even more abstract systems like Unity).
We hit the point where hardware became fast enough for most users about 15 years ago, and ever since we've been using faster hardware to allow for lazier code creation (which is good, since it means we get more software per man-hour worked).
'Twas always thus. Software development is gaseous: it expands to fill whatever space it's placed in. That's partly the nature of software engineering taking the quickest route to solving any problem, and partly by design, through collusion between operating-system makers (read: Microsoft and Apple) and the hardware platform manufacturers they support and promote. This has been happening since the dawn of personal computing: leapfrogging processors, RAM, hard drives, buses, and networks eventually leads to hitherto improbably extravagant specs bogged down to uselessness. It's the bane, and the very nature, of the computing ecosphere itself.
I hadn't considered the latency of abstraction due to non-native development. I just assumed modern apps are loaded with bloatware, made more sophisticated by design, and perhaps less elegantly programmed on average.
goddammit.
I was watching this going "hey, my system is like that!" Checked, and yes, my 12-core (24-thread) Ryzen 5900X with 32 GB of RAM and an NVMe drive is painfully slow opening things like the calculator, the terminal, etc.
I am running Fedora 38 with the KDE desktop....
What the hell, man.
It's not just applications. I recently "upgraded" two of my PCs from Windows 8.1 to Windows 10. Ever since then, having the mouse polling rate above roughly 125 Hz and moving the cursor would cause frame drops in games.
This happened across two machines with different hardware, the only common denominator being the switch in Windows version. I tried a bunch of troubleshooting; in the end, after the RAM on one of the machines went faulty some time later, I upgraded the CPU and RAM, and that finally resolved the issue.
So yeah, having to upgrade your hardware not because it's showing its age but rather because the software running on it has become more inefficient is a real problem IMO.
So what I want to know is: why do we still have programs that run on a single core when nearly every Windows PC out there has a multi-core processor?
What are we missing to have the OS adapt any program to take advantage of the hardware?
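The short version, as I understand it: the OS schedules whatever threads or processes a program hands it, but it can't split a serial loop across cores on its own; the parallelism has to be written into the program. Here's a rough Python sketch of the difference (the count_primes busywork is just a stand-in for any CPU-bound task):

    # Same work, run two ways: a plain loop uses one core no matter how many
    # you have; a process pool explicitly fans the work out across cores.
    from concurrent.futures import ProcessPoolExecutor
    import time

    def count_primes(limit):
        return sum(all(n % d for d in range(2, int(n ** 0.5) + 1))
                   for n in range(2, limit))

    CHUNKS = [100_000] * 8

    if __name__ == "__main__":
        t = time.perf_counter()
        serial = [count_primes(c) for c in CHUNKS]
        print(f"serial:   {time.perf_counter() - t:.2f} s")

        t = time.perf_counter()
        with ProcessPoolExecutor() as pool:
            parallel = list(pool.map(count_primes, CHUNKS))
        print(f"parallel: {time.perf_counter() - t:.2f} s")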
For applications developed natively, the response times would be expected to be quite good, but fewer applications are developed natively now, including things you might assume would be. Notepad, for example, is now based on UWP.
The telemetry thing is why I almost always turn it off in every program that has the option to disable it. You can really see the difference in a lot of games, especially online games, from just that one thing. It's insane.
Stop buying games that need 220 GB of drive space, an Nvidia GTX 690000, and a 7263641677-core processor, then. Anything more than a 60 GB download means I pirate it, unless it's a really, really damn good game. Games with no DRM that can be run without a $20k computer, I buy.
Time to uninstall Windows 11 and go back to 3.11! Sure, it won't run anything made since the mid 90s at best, but what it does run will surely be lightning fast!!
@jestyr I hope this problem will be solved by developers using newer programming languages like Rust or Go instead of web-based frameworks like Electron. Some libraries still need more polish, but IMO developers will be able to make less-bloated software in the short term.
My computer fell on its side a few months ago. Now when I run video games it stutters. I could fix it for $80 and a couple hours of labor, but then I remembered that nothing I play is optimized and it all runs like shit anyway.