After doing some Google-fu, I’m left even more puzzled as to how the Finnish man did it.
What I mean is, Linux is widely known and praised for being more efficient and lighter on resources than the greasy, obese NT slog that is Windows 10/11.
To the big-brained ones out there: is this because the Linux kernel is more “stripped down” than the Windows kernel? Removing bits of bloated code that could affect speed and operations?
I’m no OS expert or comp sci graduate, but I’m guessing it has a better handle on processes and the CPU tasks it’s given, plus “more refined programming” under the hood?
If I remember rightly, Linux was a server/enterprise OS first, before it started shipping with desktop environments, hence it’s used in a lot of institutions and educational sectors due to how efficient it is as a server OS.
Hell, despite GNOME and Ubuntu getting flak for being chubby RAM hog bois, they’re still snappier than Windows 11.
macOS? I mean, it’s snappy because it’s a descendant of UNIX, which sort of bled over into Linux too.
Maybe that’s why? All of the snappiness and concepts were taken from the UNIX playbook when designing a kernel and OS that isn’t a fat RAM hog gobbling your system resources the minute you wake it up.
I apologise in advance for any techno-gibberish, but I would really like to understand the “Linux is faster than a speeding bullet” phenomenon.
Cheers!
If there is free RAM, is there a reason not to use it?
Yes, caches. Lots of caches.
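To make that concrete: the kernel uses RAM that would otherwise sit idle as a page cache for disk data, and it hands that memory back the moment an application actually needs it. That’s why “free” often looks tiny on Linux while “available” stays large. Here’s a minimal sketch of how to see it yourself, assuming a Linux box with the standard /proc/meminfo fields (the script itself is just an illustration I threw together, not anything official):

```python
# Minimal sketch: parse /proc/meminfo to see how much "free" RAM the Linux
# kernel is actually spending on the page cache and buffers. Linux only.

def read_meminfo(path="/proc/meminfo"):
    """Parse /proc/meminfo into a dict of field name -> kibibytes."""
    info = {}
    with open(path) as f:
        for line in f:
            key, value = line.split(":", 1)
            # Values look like "16384256 kB"; keep just the numeric part.
            info[key.strip()] = int(value.strip().split()[0])
    return info

if __name__ == "__main__":
    mem = read_meminfo()
    total = mem["MemTotal"]
    free = mem["MemFree"]
    cached = mem.get("Cached", 0) + mem.get("Buffers", 0)
    available = mem.get("MemAvailable", free)

    print(f"Total RAM:       {total / 1024:.0f} MiB")
    print(f"Truly free:      {free / 1024:.0f} MiB")
    print(f"Used for caches: {cached / 1024:.0f} MiB")
    print(f"Available:       {available / 1024:.0f} MiB (free + reclaimable cache)")
```

On most machines that have been running a while, the “Used for caches” number dwarfs “Truly free”, which is the whole point: memory that isn’t caching something is memory doing nothing.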