Windows Madness – Why Your Operating System Will Always Be Too Slow

Since the first personal computer, both consumers and business users have complained about slow speed. A large part of the reason processing power has doubled roughly every two years is that the current speed always seems inadequate. It’s not that users’ requirements have grown significantly – it’s that anti-virus software, poorly written drivers, and unoptimized applications ensure the status quo will always remain “too slow.”

Direct evidence of this madness in existing software is available using the ProcMon tool described in the previous post. With ProcMon, it is possible to view and analyze every file and registry read/write operation executed on the PC. The majority of these operations come from software that most users rarely need – multilingual tools, services constantly polling in the background, plug-and-play helpers, update servers, display driver configuration tools, and printer drivers. Individually, these services account for less than 1% of actual usage over the course of a day, if even that. Yet because of their poorly designed communication patterns, each one constantly demands the processor’s attention, producing a steady stream of context switches (a rough way to observe this is sketched below). Thus, with each advance in hardware, the laziness of programmers and self-aggrandizing marketing departments at many large corporations erode much of the gain.
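That wake-up churn can be measured directly. The following is a minimal sketch, assuming the third-party psutil package is installed (pip install psutil); it ranks running processes by accumulated context switches, a rough proxy for how often each background service demands the CPU. The function name and limit parameter are illustrative, not from ProcMon itself.

# A minimal sketch, assuming the third-party psutil package is
# installed (pip install psutil). It ranks running processes by total
# accumulated context switches – a rough proxy for how often
# background services wake up and demand the processor's attention.
import psutil

def top_context_switchers(limit=10):
    """Return (name, total context switches) for the busiest processes."""
    rows = []
    for proc in psutil.process_iter(["name"]):
        try:
            ctx = proc.num_ctx_switches()
            rows.append((proc.info["name"], ctx.voluntary + ctx.involuntary))
        except (psutil.AccessDenied, psutil.NoSuchProcess):
            continue  # skip processes we are not allowed to inspect
    return sorted(rows, key=lambda row: row[1], reverse=True)[:limit]

if __name__ == "__main__":
    for name, switches in top_context_switchers():
        print(f"{switches:>12,}  {name}")

On most machines, the top of that list tends to be dominated by services the user never deliberately launched – which is exactly the point.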

Next, it’s important to factor in anti-virus and anti-spyware technology. Every file that executes on a PC must now be scanned to ensure it does not contain malicious software. As hacking becomes more profitable, more and more viruses are written to exploit PCs. This drives constant growth of the signature database, and an increasing number of checks that must be performed on each file. Thus, on top of the degraded performance of individual applications, the scanning overhead per file keeps climbing; a simplified model of that cost follows.
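To make the scaling concrete, consider a deliberately naive scanner in which every file is checked against every signature, so total work grows with files × signatures. This is an illustration only – real anti-virus engines use far more efficient multi-pattern matching – and every name and signature below is hypothetical.

# Deliberately naive signature scanning: cost grows with the product
# of files and signatures. All signatures and file data are hypothetical.

SIGNATURES = [b"EVIL_PAYLOAD", b"MALWARE_MARK", b"TROJAN_STUB"]

def scan_file(data: bytes) -> bool:
    """Return True if any known signature appears in the file contents."""
    return any(sig in data for sig in SIGNATURES)

def scan_files(files: dict) -> list:
    """Return the names of files matching at least one signature."""
    return [name for name, data in files.items() if scan_file(data)]

if __name__ == "__main__":
    sample = {
        "report.docx": b"quarterly numbers...",
        "update.exe": b"header...EVIL_PAYLOAD...footer",
    }
    print(scan_files(sample))  # ['update.exe']

As the signature list grows from thousands toward millions of entries, that inner loop is the tax paid on every file access.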

Finally, it’s important to factor in the newer trend of cloud-based computing. With software moving off the PC and onto the Internet, application performance is now constrained by the bandwidth of the Internet connection. Software developers prefer cloud-based services for the control they offer over backups, user licenses, versioning, and support; at the same time, that convenience trades away speed, in a constant push against the limits of existing infrastructure. A back-of-envelope comparison below shows the size of the gap.
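The throughput figures in this sketch are assumptions chosen for illustration, not measurements: a local SSD reading at roughly 500 MB/s versus a 100 Mbit/s broadband connection (12.5 MB/s).

# Back-of-envelope: time to load a 50 MB document locally vs. over
# the network. All throughput figures below are assumed, not measured.

FILE_MB = 50
LOCAL_SSD_MB_S = 500          # assumed local SSD read throughput
BROADBAND_MB_S = 100 / 8      # assumed 100 Mbit/s link = 12.5 MB/s

local_s = FILE_MB / LOCAL_SSD_MB_S
cloud_s = FILE_MB / BROADBAND_MB_S

print(f"local SSD: {local_s:.2f} s")            # 0.10 s
print(f"cloud:     {cloud_s:.2f} s")            # 4.00 s
print(f"slowdown:  {cloud_s / local_s:.0f}x")   # 40x

Under these assumptions, the same document takes forty times longer to arrive over the wire than from a local drive, before any server-side latency is counted.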

Thus, at the end of the day, it’s human psychology that limits software performance, not the technology itself. Users have a rational limit to their patience, and most developers will do only the minimum required to stay within it. This is driven largely by the investors funding a project, who are unwilling to spend extra money on performance without a tangible return on that investment. Hardware gains do open exciting opportunities for research and a limited set of technologies; for most consumers, however, perceived capability and speed will, for the most part, remain the same.

Written by Andrew Palczewski

About the Author
Andrew Palczewski is CEO of apHarmony, a Chicago software development company. He holds a Master's degree in Computer Engineering from the University of Illinois at Urbana-Champaign and has over ten years' experience managing software development projects.
