To what extent is software created with the aim of selling new hardware?
When I knew nothing about coding, I often had the impression that software was intentionally created so that it necessitates up-to-date hardware. It was the only explanation I could come up with for why applications ran slowly, sometimes even more slowly than before, even though computers kept getting faster. Since I've been playing around with coding myself, I have encountered other possible explanations, for example:
1.) Higher-level languages might be used which increase the speed and convenience of development and maintenance, but are less performant.
2.) Code bases might be so big or messy that it becomes hard to keep them clean, efficient and bug-free.
3.) Hardware is better, so there's less need to write super-performant code, and performance optimization is only done when absolutely necessary.
Still, I wonder if there's a component of intent there: to push new hardware, to force a paid update or whatever it is. Does anybody know details about if, how, and to what extent this is done?
HonFu · 6/30/2020 9:19:33 AM
70 Answers
devanille Yes, I could believe that in some cases there is planned obsolescence of hardware products, especially through making them not easily fixable. I don't have any evidence that software is *normally* intentionally designed to make a product obsolete, though. Of course there may be some noteworthy individual cases, which could then be considered criminal and prosecuted, I suppose.
Denise Roßberg, I get it with games. They're all about being really impressive, so good performance is actually a measure of quality. I mean, who wants to play a stop-and-go game, right? So the game, if it's any good, will try to be performant and will try to make use of every bit of hardware you throw its way. I'm thinking more about programs that are mere utilities, like Microsoft Word, Adobe Reader and such. I mean, what excuse does that sort of program have to still suck? Why do I have to look at a loading bar while a simple PDF file opens? Stuff like that. I often think: wait, this program is not doing *that* much. Why doesn't it move, goddammit? And then the question: can there be intent behind it? Like with a washing machine that's supposed to break shortly after the warranty runs out? Is software produced in a way that it will irrationally stop cooperating under a certain condition?
And I think there is a difference between games and other programs. I have a laptop that's over 15 years old. Ubuntu runs, Firefox runs, Eclipse runs, and a lot of other stuff. Not the latest versions, but I'd say the performance isn't all bad. But think about the latest games. Most of them want the best hardware, and unless you are a graphic designer or something like that, games are the only reason to buy new hardware. Another "problem" could be that nearly everyone can learn programming, but not everyone knows how to write good code.
Sandra Meyer, interesting discussion with David Carroll. So some programs suck so hard because their makers never did their big O homework? 😂 This would all point to coders not making full use of their resources through lack of skill, money, time or whatever - not to an actual *intent* to run slowly. Denise Roßberg, there may be a lot of open source alternatives for private users, but companies often rely on a tool 'everyone uses', and that's frequently the product with a certain name on it, even if it's objectively lacking. Probably in these cases the good name makes up for the deficits. I mean, how many people use Windows? How many versions ago was the one that didn't suck? So with a washing machine or a printer, the profit is obvious: if they break after two years, you'll buy a new one. With complex stuff like Windows it's sometimes more like extortion: 'We are no longer closing security holes - buy a new OS or get hacked.' But that's not exactly the same as purposely reducing performance.
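To make the "big O homework" point concrete, here is a minimal Python sketch (my own illustration, not taken from any product mentioned in this thread) of how the same task can be quadratic or linear depending only on the data structure chosen - exactly the kind of difference that makes a utility feel sluggish for no visible reason:

```python
def count_duplicates_slow(items):
    # O(n^2): for every element, rescan everything seen so far (a list)
    return sum(1 for i, x in enumerate(items) if x in items[:i])

def count_duplicates_fast(items):
    # O(n): a set makes each membership check constant-time on average
    seen, dups = set(), 0
    for x in items:
        if x in seen:
            dups += 1
        else:
            seen.add(x)
    return dups
```

Both return the same answer; on a million items the first one can take minutes while the second finishes in a fraction of a second.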
About the thought that industrial products are designed to break: this is also true of the software industry. Just think about semantic versioning. By definition, incrementing the major version introduces breaking changes that render the old version obsolete and potentially unusable. Modern software is typically built on chains of dependencies: frameworks, libraries, open source code fragments, which are eventually going to break. Thus we have growing entropy and increasing complexity. If you want to support old and new versions, you have to add more code, more conditions. I am so happy I abandoned web design back in the old days, when different versions of Internet Explorer defied the web standards in DIFFERENT WAYS, so each needed separate CSS hacks to somehow look similar. And it is much worse now with so many platform, OS and hardware variations...
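As a sketch of what "breaking changes at the major version" means in practice, here is a hypothetical caret-style compatibility check in Python (the function name and the simplified three-part version format are my own assumptions, not any particular package manager's rules):

```python
def is_compatible(installed: str, required: str) -> bool:
    """Caret-style check: same major version, and installed >= required.
    A new major version is treated as breaking by definition."""
    inst = tuple(int(p) for p in installed.split("."))
    req = tuple(int(p) for p in required.split("."))
    return inst[0] == req[0] and inst >= req
```

So `is_compatible("2.3.1", "2.1.0")` holds, but `is_compatible("3.0.0", "2.9.9")` does not - the dependency chain "breaks" even though the newer code is, in some sense, better.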
HonFu I am not quite sure, but I think companies sometimes have to buy official products because of maintenance and warranty. So if the "classics" cannot be ousted, there may be no motivation to really improve them.
As long as there's no pain forcing them to invest more resources, it's more economical to prioritize new features over performance, cleanup issues or even bugfixes. Those are all very expensive from a short-term view, since most of the affected professional software consists of millions of lines of code and gigabytes of data, and was created by dozens, hundreds or thousands of developers, many of whom are no longer available. So it would cost additional resources just to get familiar with their old relics. Implementing something new is often cheaper than cleaning up the old stuff. Tools like Visio are not small, handy codebases that could be maintained by a few developers who are enthusiastic and personally invested in the code (normally). And besides all that, there is a growing number of dependencies - another API, plugin, third-party tool integrations, cloud features and so on.
This thread is really exciting. I have a few thoughts on it too, and I want to offer a few tangential topics to reason about. One recent article is about COBOL: basically the whole financial world was built on top of a programming language from 60 years ago that nobody is interested in learning any more, because it is just not fashionable. Yet it is still working flawlessly (kinda). https://www.techradar.com/news/the-programming-language-that-doesnt-want-to-die The other is a talk by the late Joe Armstrong, inventor of the Erlang functional language. It looks at the IT industry from the perspective of a theoretical physicist and explains how entropy and chaos are growing on us. A really interesting talk; I watched it recently and enjoyed it a lot, and it sort of answers the question in this topic too. https://youtu.be/lKXe3HUG2l4
Sandra Meyer I only shared my private view. Of course, I don't know how big companies do it - I am just a hobbyist. And I think clean code should always be learned; it saves a lot of work. Even my small programs quickly become confusing if I don't care about conventions and clean code.
But to be fair, occasionally, when I download 100 gigs from the internet, I realize how far we've come and how fast everything is now. So we can at least hope that even if consumer software is horrible, the smart people at CERN etc. also profit from all this better hardware.
Alexander Thiem, you are not far from the truth when you mention a "very good compiler". Some very exciting innovations in the Java world aim at exactly this: increasing performance and decreasing the size and memory footprint of programs by optimizing away the laziness of programmers and the gigantic bloat of frameworks. https://quarkus.io/ https://www.graalvm.org/
So you would also appreciate a clean code course, a code conventions course, a communication skills course, a language-independent basics course and an architectural-design-and-patterns course? 😉 But games are not too urgent from my point of view. Think of all those business/office users without sufficient knowledge to run a slim system. Even an average office suite today costs more than a complete system did 10 years ago. And they need to run a lot of software for their work that is even worse... And we all have to fight daily with slow, traffic-hungry apps and websites (more or less, depending on the location - but we in Germany suffer quite a lot...).
Apple deliberately slowing down its old iPhones through updates is a good example of forcing the client to buy new hardware.
devanille That's right: https://www.bbc.com/news/technology-51413724 If this is true of Apple, then there are probably other companies doing the same.
It is common, I think. Software will take advantage of new hardware and will have a fallback mode if the new hardware specifications are not met. This is most commonly seen in games and server-type software, but there are many high-end programs that take advantage of specific hardware features like hardware-assisted virtualization - and multicore is becoming common now. High-end CAD/CAM and intensive number-crunching software will take advantage of such hardware features. And in fallback mode the performance can really drop, so customers are eventually forced to buy new hardware. Software can even come in different versions with different capabilities to exploit certain hardware. To extract the maximum from the software, you probably need high-end hardware.
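The fallback idea described above amounts to runtime capability detection. Here is a small, hedged Python sketch of the pattern; the threshold parameter is made up for illustration, a real application would probe for instruction sets or GPU features rather than just the core count, and CPU-bound work would use processes rather than threads:

```python
import os
from concurrent.futures import ThreadPoolExecutor

def crunch(chunk):
    # stand-in for a heavy numeric workload
    return sum(x * x for x in chunk)

def process(data, min_cores_for_parallel=4):
    """Split the work across cores when the hardware offers enough of
    them; otherwise fall back to a plain serial loop. (Threads keep this
    sketch portable; real CPU-bound code would use processes.)"""
    cores = os.cpu_count() or 1
    if cores >= min_cores_for_parallel:
        chunks = [data[i::cores] for i in range(cores)]
        with ThreadPoolExecutor(max_workers=cores) as pool:
            return sum(pool.map(crunch, chunks))
    return crunch(data)  # fallback path: slower, but runs anywhere
```

Both paths produce the same result; on weak hardware only the slow path is available, which is exactly why the software "feels" like it demands an upgrade.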
If you start working on an existing codebase that someone else has written, chances are high that it is neither neatly written nor fully optimized. You cannot rewrite entire systems for the sake of a minor performance improvement if you are only tasked with fixing specific bugs. Even in greenfield projects where you start from scratch, maybe you cannot afford to fully optimize everything, because it would take too long; time-to-market is shorter if you start with half-ready templates. Also, programming by copy-paste (Google/Stack Overflow) can backfire: the same code may not exactly work for your use case, even if it looks fine at first, and may cause unforeseen bugs - especially if you don't fully understand the copied code, with its assumptions and limitations.
Software is usually created to offer a solution to a problem, not just to "sell hardware". The exception I can think of is sample software created to demo how new hardware works, implemented by hardware manufacturers to showcase the capabilities of new products/chips/boards.
As the problems we want to solve with computing become more complex, software complexity increases, existing languages and hardware solutions become insufficient in maintainability and performance, and they are replaced with new paradigms that handle the complexity better. So software is not created to sell more hardware; rather, new solutions require more advanced hardware and software combinations.
I've even seen that the most expensive configuration of a machine was ordered, because they were not able to make the software more performant... So I buy 1, 2 && 3 🙃
Wow, that would be a really bad case, Sandra Meyer! 😂