The trick with multi-core systems is that the SOFTWARE needs to be written multi-threaded to take advantage of them.
This part is true. An application uses only a single core unless it's designed to run several threads.
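To make that concrete, here's a minimal C++ sketch (my own illustration, not taken from any real application) of what "designed to run several threads" means: the work is explicitly split across one thread per core with std::thread, whereas the same loop written the ordinary way would sit on a single core.

```cpp
// Minimal sketch: split a CPU-bound sum across one thread per core.
// Written as a single loop, the same work would run on just one core.
#include <algorithm>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t n = 100000000;
    std::vector<double> partial(cores, 0.0);
    std::vector<std::thread> workers;

    for (unsigned t = 0; t < cores; ++t) {
        workers.emplace_back([&partial, t, cores, n] {
            // Each thread sums its own interleaved slice of the range.
            for (std::size_t i = t; i < n; i += cores)
                partial[t] += 1.0 / (i + 1);
        });
    }
    for (auto& w : workers) w.join();

    std::cout << "sum = "
              << std::accumulate(partial.begin(), partial.end(), 0.0) << "\n";
}
```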
I should point out, however, that having dual cores does help responsiveness a bit, since the OS can spread separate tasks between the cores. This has diminishing returns as the number of cores increases, though. The real gains still come from applications designed for multiple cores.
Therefore, for most people 2 cores is plenty, as one core is busy with your currently active program while the other handles OS background tasks, widgets, your music player, etc.
Well, it can theoretically work like this, yes, but realistically most OSes do things a bit differently. You CAN do this by setting the affinity yourself, or by using software that sets affinity for you, but most OSes simply try to level out the CPU usage and pay more attention to how much CPU an application is using than to whether it's in the foreground or not.
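For anyone curious what "setting the affinity" actually looks like, here's a hedged, Linux-flavoured sketch using sched_setaffinity; on Windows the rough equivalent is SetProcessAffinityMask, which is what Task Manager's "Set affinity" option drives. It's only an illustration - normally you'd just let the scheduler do its thing.

```cpp
// Linux-specific sketch: pin the current process to logical CPU 0.
// This is roughly what "Set affinity" in Windows Task Manager does
// (there the call would be SetProcessAffinityMask instead).
#include <sched.h>
#include <cstdio>

int main() {
    cpu_set_t mask;
    CPU_ZERO(&mask);    // start with an empty CPU set
    CPU_SET(0, &mask);  // allow only logical CPU 0

    // pid 0 means "the calling process"; from now on the scheduler keeps it on core 0.
    if (sched_setaffinity(0, sizeof(mask), &mask) != 0) {
        std::perror("sched_setaffinity");
        return 1;
    }
    std::puts("Pinned to core 0");
}
```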
BTW, you'll find that 99% of Windows programs are single-threaded.
You'd be surprised how many applications are multithreaded . . .
It might be more accurate to say that 99% of applications are not designed to use cores evenly or take full advantage of them. They mostly throw threads at the CPU hoping the OS will take care of distributing them.
Even without multiple cores, threads have long been used to do things like keeping an application's GUI responsive while other parts of the application wait for resources. This type of threading isn't designed with multiple cores in mind, though, although it does work on them.
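That pattern looks something like this toy C++ sketch, where a print loop stands in for a real GUI event loop: the worker thread blocks on a slow operation while the main loop keeps handling "events". Even a single core benefits, because the worker spends its time waiting rather than computing.

```cpp
// Sketch of the classic "keep the UI responsive" pattern: one worker
// thread blocks on a slow operation while the main loop (standing in
// for a GUI event loop) keeps handling events.
#include <atomic>
#include <chrono>
#include <iostream>
#include <thread>

int main() {
    std::atomic<bool> done{false};

    std::thread worker([&done] {
        // Stand-in for waiting on a file, a network reply, a database, etc.
        std::this_thread::sleep_for(std::chrono::seconds(3));
        done = true;
    });

    // The "event loop" stays responsive while the worker waits.
    while (!done) {
        std::cout << "UI still responding...\n";
        std::this_thread::sleep_for(std::chrono::milliseconds(500));
    }

    worker.join();
    std::cout << "Background work finished.\n";
}
```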
Servers have also long been multi-CPU/multi-core systems, because they have to manage many clients simultaneously. Perhaps we are opening the floodgates for people running their own personal servers? With the big push for cloud computing, I wouldn't be surprised if more people started wanting to run the server side of things as well, and not just the client side.
It's an event that seems to repeat itself throughout computer history: something starts in big businesses, on large servers and mainframes, but then it gets miniaturized and put into PCs, and before we know it everybody is doing it in a small, inexpensive box. It's happened before, and it'll happen again. If you want to see what people will be doing in their homes a few years from now, look at what businesses are doing with large server racks right now. Because a few years from now, a PC will be as powerful as all of those racks. A few years more, and your phone will be that powerful as well.
This whole Software as a Service thing? It'll last until people start realizing they can host their own software. Next thing you know, they'll want to own their own software so they can host it for themselves. Geeze, where have I seen that before?
History will, as always, repeat itself.
Those of us who are using 4- or 8-core machines are doing so because we have high-end 3D or video-editing software, for example, that can actually peg all of our cores to the maximum when rendering.
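Rendering parallelizes so well because an image chops up naturally. Here's a toy C++ sketch of the idea (the shade lambda is just a made-up stand-in for real per-pixel work): split the frame into one band per core, render the bands concurrently, and every core stays pegged until the frame is done.

```cpp
// Toy sketch of why renderers scale: the image is split into horizontal
// bands and each band is rendered by its own task, so all cores stay
// busy until the frame is finished.
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <future>
#include <iostream>
#include <thread>
#include <vector>

int main() {
    const int width = 1920, height = 1080;
    const unsigned bands = std::max(1u, std::thread::hardware_concurrency());
    std::vector<float> image(static_cast<std::size_t>(width) * height);

    auto shade = [](int x, int y) {   // placeholder per-pixel work
        return std::sin(x * 0.01f) * std::cos(y * 0.01f);
    };

    std::vector<std::future<void>> tasks;
    for (unsigned b = 0; b < bands; ++b) {
        const int y0 = height * b / bands;
        const int y1 = height * (b + 1) / bands;
        tasks.push_back(std::async(std::launch::async, [&, y0, y1] {
            // Each task fills in its own rows; no two tasks touch the same pixel.
            for (int y = y0; y < y1; ++y)
                for (int x = 0; x < width; ++x)
                    image[static_cast<std::size_t>(y) * width + x] = shade(x, y);
        }));
    }
    for (auto& t : tasks) t.get();

    std::cout << "Rendered " << width << "x" << height
              << " using " << bands << " bands\n";
}
```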
I think gaming will increasingly take advantage of multiple cores as well. A lot of tasks in games are well suited to parallelism; physics and AI come to mind. Graphics would also work, although that is usually handed off to the GPU.
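One shape this often takes is plain task parallelism within a frame: physics and AI are independent enough to run as separate tasks and be joined before the frame is drawn. A small hedged sketch (update_physics and update_ai are placeholders I made up, not anything from a real engine):

```cpp
// Toy sketch of task parallelism inside a game loop: physics and AI run
// concurrently each frame and are joined before the frame is "rendered".
#include <functional>
#include <future>
#include <iostream>
#include <vector>

struct World {
    std::vector<float> positions{0.f, 1.f, 2.f};
    std::vector<int> goals{0, 0, 0};
};

void update_physics(World& w, float dt) {
    for (auto& p : w.positions) p += dt;   // pretend integration step
}

void update_ai(World& w) {
    for (auto& g : w.goals) g += 1;        // pretend decision making
}

int main() {
    World world;
    const float dt = 1.0f / 60.0f;

    for (int frame = 0; frame < 3; ++frame) {
        // Physics and AI touch different data here, so they can overlap safely.
        auto physics = std::async(std::launch::async, update_physics,
                                  std::ref(world), dt);
        auto ai = std::async(std::launch::async, update_ai, std::ref(world));
        physics.get();
        ai.get();
        std::cout << "frame " << frame << " done\n";   // "render" the frame
    }
}
```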
I also think we'll keep finding more uses for multiple cores - in computer science there is a lot of theory about problems that may be better suited to parallel machines, and some programming languages, such as Smalltalk and Erlang, use paradigms that are well suited to parallelism. It's possible that there's some killer application lurking around somewhere.