Puh-leez. I don't for one minute
believe that you need dual core to do HTML e-mail or open up Word attachments.
But I wax off topic.
And that's what Windows Fundamentals for Legacy PCs is for.
That is also the reason why you can still buy computers that aren't dual core.
There are also very good economic and social reasons. Sending photos and videos, again, does not require dual core, but it does require a GUI and a media player. Something like a 500 MHz machine would work well.
DRM - where? I can move my data wherever I please, and Vista has never stopped me.
Stability - seems stable to me. It's not really comparable to Windows Me, which had critical multitasking and memory issues that the NT kernel doesn't suffer from.
Drivers - if you buy a new computer, then your drivers will all work. If they don't, you are entitled to return the computer to the retailer or distributor for a full exchange or refund under consumer law. If they say no at first, bring the law to their attention and they always buckle at the knees.
Linux - depending on the distro, it can use just as many resources. It really depends on the distribution. Linux is like people: many and varied.
The purpose of an operating system is to manage system resources so you can multi-task, and to reduce the burden on programmers, improving the experience of both the programmer and the user. Unfortunately this comes at a cost, and that cost grows as you move to greater and greater hardware specs.
If you want to store 100 GB of movies, then you have to use an HDD formatting strategy that lets you address all the bytes on the drive (usually addressed as blocks). The more bytes that need to be addressed, the more bits are needed for each address, and that overhead eats into HDD space just to make sure you can find things. The same goes for RAM, video cards and operating systems.
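To put rough numbers on that block-addressing point, here's a quick sketch (the 4 KiB block size is just an assumed common filesystem value, not anything mandated):

```python
import math

def address_bits(capacity_bytes, block_size=4096):
    """Bits needed to uniquely address every block on a drive.
    The 4096-byte block size is an assumption -- a common
    filesystem choice, not a fixed standard."""
    blocks = capacity_bytes // block_size
    return math.ceil(math.log2(blocks))

# A 100 GB drive split into 4 KiB blocks needs 25-bit block addresses:
print(address_bits(100 * 10**9))  # 25
```

Bigger drives mean wider addresses, and those wider addresses are exactly the bookkeeping overhead I'm talking about.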
Consumers push these changes on themselves by demanding greater and greater things, and there is a limit to what can be accomplished.
Of course this really comes down to computer scientists, because with efficient algorithms the computational intensity of an application can be greatly reduced. Unfortunately, algorithms can be as flawed as human logic at times, and that is just an imperfection in humans you have to accept.
I'm not making excuses; that's just the way things are. It is engineered for the purpose it was designed for: an internet-connected world that is security conscious and requires music, video, etc... Not to mention the move to 64-bit.
Another example is the move to IPv6. It uses more space to store the address in each packet you send, because more and more people around the world are getting connected to the internet. Just like the block-addressing problem described above, we need larger and larger numbers to label internet-connected computers with. Thankfully, with IPv6 this will not be an issue again until we develop nanobots with an IP stack.
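You can see the scale of that jump with Python's standard ipaddress module (just a quick illustration, nothing OS-specific):

```python
import ipaddress

# IPv4 uses 32-bit addresses: about 4.3 billion in total
ipv4_total = ipaddress.ip_network("0.0.0.0/0").num_addresses

# IPv6 uses 128-bit addresses: 2**128, effectively inexhaustible
ipv6_total = ipaddress.ip_network("::/0").num_addresses

print(ipv4_total)  # 4294967296
print(ipv6_total)
```

The cost is that every packet now carries 16-byte source and destination addresses instead of 4-byte ones -- the same "bigger labels take more space" trade-off as with disk blocks.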
There is a saying famously attributed to Bill Gates that no-one will ever need more than 640 KiB of RAM. You laugh at it now, but my camera takes 6 MB photos. I acknowledge that I need system resources to process them, and I move on with my life knowing I am better off, because the alternative is using film and printing photos, which, if you've ever worked with film, you know is a dirty, water-hungry process that wastes natural resources and pollutes.
E-waste is bad, but it is more manageable than cutting down the number of trees we would need for paper, and then finding storage space for it all. And with initiatives like RoHS, and computer recycling almost a reality, the burden is much less. Not to mention that more efficient computers decrease computational time, further decreasing energy usage. Unfortunately for us, computers just keep creeping into more and more electronics, keeping power usage at approximately the same level.
Architecturally, computers are just one big compromise after another, just like truncating a Fourier or Taylor series. We work with what is possible and feasible, and the results turn out really well most of the time.
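For the curious, here's what that series-truncation compromise looks like in practice: a few terms of the Taylor series for sin(x) already get you very close, and each extra term costs more computation for less gain (plain Python, purely for illustration):

```python
import math

def sin_taylor(x, terms):
    """Approximate sin(x) by truncating its Taylor series after `terms` terms."""
    return sum((-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
               for k in range(terms))

x = 1.0
for n in (1, 2, 4):
    print(n, sin_taylor(x, n))  # converges toward math.sin(1.0)
```

Four terms already agree with the true value to four decimal places, so an engineer stops there: the remaining error no longer matters for the job at hand.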
This post is way too long.
So in conclusion, the market exists from the very low end right up to the top end to allow users to pick a computer that suits their needs. I mean, I haven't bought a Cray supercomputer. But if you could afford a computer that was slightly faster at loading your e-mail, the salesperson may give you a good enough impression to make you part with that money. And in the end the customer is happy with it all, as it works very well for them accessing their e-mail.
Then after a while they get annoyed because the IP stack takes too long to read from the HDD and initialise at boot, etc... And you end up in this forever upgrade cycle. Not to mention the burden anti-virus software places on computers.