Software is Wasting Your Cores

Back in February, Steve Hogan made the case for getting a multi-core system. Rob Cheng’s experience shows that dual-core systems aren’t always faster, though. It’s possible for a multi-core system to outperform a single-core system, but you’re not likely to find a desktop operating system or many applications that can take advantage of it. There are good reasons for that, and they aren’t going away any time soon.

Users like to measure performance by response time, the amount of time that it takes to get a particular thing done. Multi-core systems, on the other hand, are good at throughput; give them a big pile of unrelated things to do and they can take work off the pile in parallel, getting it all done much faster than a single processor possibly could. If a particular piece of work uses just one core, a multi-core system won’t improve response time.

“Wait,” you might say, “Why can’t the computer take the one thing that I want to do and split it into pieces so that each core can do part of the work?” Why indeed. Writing an application that way is probably an order of magnitude harder than writing software the way it’s done today. It’s made more difficult because programming languages and operating systems don’t provide the tools to split work off into parts that can be distributed to multiple cores.
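To make that concrete, here is a minimal sketch of splitting one job into per-core pieces (Python is used purely for illustration; the article doesn’t assume any particular language). Even in this trivial case the programmer has to decide how to partition the work and how to recombine the results:

```python
from multiprocessing import Pool

def square(n):
    # one independent piece of the larger job
    return n * n

if __name__ == "__main__":
    # a pool with one worker process per core (4 assumed here)
    with Pool(4) as pool:
        # split the job, farm out the pieces, recombine the results
        results = pool.map(square, range(8))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

This only works so cleanly because each piece is independent; the moment the pieces need to share data, the simple `map` pattern breaks down, as the next paragraph describes.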

For example, notice that above I said “a pile of unrelated things.” If the pieces of work to be done have shared data, distributing the work to multiple cores is a pain. It’s not just a question of CPUs, either; memory can be a point of contention. Imagine if several cores all want to write to the same area of memory. They may clobber each other’s data if they don’t coordinate efforts, but even if they do cooperate they are unable to cache the memory results locally because other CPUs are changing them. Then, once work is complete on all cores, the program has to recombine those pieces into the single thing that you wanted done in the first place. That takes a lot of synchronization and coordination.
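Here is a hypothetical sketch of that coordination cost (Python threads, with invented names, just to illustrate the idea): four workers increment a shared counter, and the lock is what keeps them from clobbering each other’s updates. Notice that the lock also serializes the workers, which is exactly why shared data erodes the multi-core speedup:

```python
import threading

counter = 0
lock = threading.Lock()

def add_many(n):
    global counter
    for _ in range(n):
        # without the lock, the read-modify-write of "counter += 1"
        # can interleave across threads and silently lose increments
        with lock:
            counter += 1

threads = [threading.Thread(target=add_many, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 400000, but only because every update was serialized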

There are, of course, applications that just love multi-core systems, but they tend to be ones that run on servers and not desktops. Databases, file servers, and web servers are good examples. In those cases the requests do tend to be independent; once a request is done on a core it can be sent off and another one grabbed from the pile. For those situations, you often can get close to having N times the performance out of an N-core system–but only if you’re measuring the throughput of the overall system.
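As a rough sketch of why independent requests scale so well (Python again, with a made-up `handle_request` standing in for real server work): each request touches no shared data, so the workers never wait on each other, and a free worker simply grabs the next request off the pile:

```python
from concurrent.futures import ProcessPoolExecutor

def handle_request(request_id):
    # stand-in for one independent unit of server work
    return request_id, sum(i * i for i in range(10_000))

if __name__ == "__main__":
    # defaults to one worker process per core; requests are pulled
    # off the pile as workers free up
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(handle_request, range(100)))
    print(len(results))  # 100, with no coordination between workers
```

Measured as total requests completed per second, this pattern approaches N-times scaling on N cores; measured as the response time of any single request, it is no faster than one core.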

There are some rumors that Microsoft’s future ground-up rewrite of Windows, code-named Midori, might make it easier for desktop applications to take advantage of multi-core systems. I am not so sure that it will be faster, though, because Midori is supposed to be based on the .NET Framework and will still need to run existing Windows applications in some sort of emulation mode. If it ends up being fast, it may only be because a 16-core system succeeds in hiding Midori’s software bloat.


17 thoughts on “Software is Wasting Your Cores”

  1. It is interesting that Bruce never once mentions what operating system he is talking about. Not that it would matter to me, as I am stuck with XP and Vista. It seems to me that if we didn’t have Windows we wouldn’t have to worry about viruses and antivirus software, so our dual or quad cores would not be necessary to run our virus scans in the background. You guys should consider yourselves lucky. I can remember when I had to run NT with dual processors so that my CD copier wouldn’t crash and waste $20.00 a CD if I so much as moved the mouse while it was copying.
    We have come a long way. I have all three kinds of computers (single-, dual-, and quad-core), and as soon as you add software they all start slowing down. My quad is the fastest on all counts no matter what task or multitask I am doing. So even though it is a cheap machine, it seems to outperform the others.
    And I am a WONK from way back. I paid $1000 for my first single-speed external CD-ROM drive.

  2. Dave,

    You say that “Desktop Operating Systems” likely won’t take advantage of multiple cores. I say that is wrong. It may be true if you limit that to Windows.

    However there are numerous desktop operating systems that do indeed take advantage of multi-core systems, I am typing this reply on one of them now, and many many of my desktop applications and software take advantage of those cores.

    You then commented that most of the apps available don’t take advantage and the ones that do tend to be server apps.

    My desktop video applications take advantage of my cores, all of them; not a server app, just a simple video editor I use to grab and edit files from my camcorder. My image editing apps also take advantage of all my cores. So does my simple little audio multimedia player. Yep, it uses a database; it isn’t a server, but like many people I have a large music collection, and using a database speeds up managing that collection four-fold.

    My operating system itself, takes advantage of all 4 cores on my system, along with the 4 gigs of memory I have, it does so seamlessly, and many of the things I do regularly are taking advantage of all my cores, it makes a huge difference in the desktop tasks I do.

    It is the choices that people make when choosing their operating systems and software that are poor, because most folks don’t know how to take advantage of their systems capabilities, and they choose poor software written by lazy and backwards thinking coders.

    The only things not ready for multi-core systems on the desktop are Windows and programmers who can’t get their heads around the fact that by choosing 20th-century desktop OS technology to run 21st-century hardware, they are holding themselves back LOL.

    It certainly adds more work to code for more processors, but not doing it is like designing a V8 engine that only has 2 pistons, and six empty cylinders, because it is harder to put all the pistons in the engine block.

    That said, I am off to encode the video I shot of my granddaughter this weekend at 4 times the speed of you folks who use the slow and ancient windows movie maker 😛

  3. Having installed a Gadget on my HP Vista Dual-core PC showing the percentage of memory and each core in use, I find both cores are often in full use.

  4. True story, but it is a bit like kicking at an open door. It’s like that with all things: new stuff takes time.

    Personally, I like being able to run a virus scan on the system and still be able to go online (or play a game) without massive delays. (Something I couldn’t do on my old single core, and for that reason I just didn’t scan.)

    That it takes time to change software is normal, but it’s like 64-bit: it’s been around for ages but it’s still not the standard. Changing to multi-core was a big step and is taking off very fast. The software will follow in time, and yes, most software developers will only look to their own program running fast. Correction: make that just “running”.

    Personally, this article could be about any new technology. This one just changes the way of programming, and that’s been the same for so long that changing to something new takes a lot of time.

  5. @Dave Methvin,
    The article states “…Rob Cheng’s experience shows that dual-core systems aren’t always faster though.”

    That statement is just plain wrong and you know it. I think I explained pretty well that the newer CPUs are faster on each core. Call it per-core-per-clock if it makes you feel better, but still no current single core CPU is as fast as the current same size dual core. 😉

  6. Mina,
    Since your system is still Rambus, I think you’d be surprised at how much faster a new system is compared to yours. Our RDRAM boards only support up to 1.7GHz Pentium 4 CPU. If time isn’t important to you, then by all means stick to your old system. I’m baffled as to why someone would spend so much on their memory (like the transmission of the computer) and then skimp on the CPU (like the engine of the computer).

    Great memory btw. I’ve got 2 systems that still run great with 2x256mb PC-800 RDRAM (Rambus) although 1 has already needed the motherboard replaced since they’re 7 years old or so. We use these systems exclusively for recording from the TV since they’re too slow for anything else anymore.

    Any custom built PC, and most manufacturers’ PCs can run a single core CPU as most motherboard chipsets are backwards compatible. Single core CPUs are still widely available and cheap. You can build a PC for around $300-$400USD if all you want is a cheap single core. And that system may seem fine to you until you put it side by side with a modern Core 2 Duo and see how much faster it is.

    The problem is that, as system builders, we charge about the cost of that system just for building it. No one in their right mind would pay $600-$700 for a single-core system when we can sell them a dual core that is twice as fast for $100 more.

    Manufacturers will never stick with old technology as it’s not profitable. Manufacturers buy parts in bulk and build systems in bulk. As a custom builder, I’ve never had someone ask for a single-core CPU. The demand for old/slow systems isn’t enough to make building them worthwhile for any manufacturer, and they’d be paying taxes on the components while the parts sat in stock awaiting a home.

    I mean, you can buy/build whatever you want, but expecting manufacturers to cater to a small minority isn’t very realistic. 🙂

  7. @dark41, most of the new multi-core systems do more per-core-per-clock than older single-core Pentium CPUs, but that’s not really related to whether software can take advantage of multicore systems.

    @Mina, chip makers want to market multicore systems because they bring in more money, but they still sell single-core systems. Intel has its new Atom line which they intend for ultramobile PCs, but the top end of the line can be used in desktop PCs.

    It’s a lot like SUVs vs. cars. Some people really need SUVs, but not as many as have bought them. Some people really need multicore systems, but a lot more are buying them because the ads make them sound great.

  8. To each their own, but I will never go after any computer or laptop that has more than one core. I only need one core, and I almost feel insulted that manufacturers don’t seem to be keeping this option available.

    About the only good thing to come out of this is that custom-built PCs/laptops might become more common, as some of us won’t want more than one core, and many PCs/laptops now ship with ridiculous amounts of RAM and hard disk space.

  9. What the article doesn’t take into consideration is that the new multi-core CPUs are faster on every core than single cores.

    E.g., your view is correct when comparing a Pentium D 2.66GHz CPU with a Pentium 4 2.66GHz CPU, but your view is completely wrong when comparing a Pentium 4 2.66GHz CPU with an E8400 2.66GHz CPU. Not only will the E8400 perform single-core tasks faster, but it will also run cooler and use less electricity.

    Also, who hasn’t been working on a computer or playing a game when the AV scan kicks in? Multi-core CPUs will use the available cores to run simultaneous apps for multi-tasking. A single core will simply slow down as it tries to do everything on the only core available. Dave Methvin can say it’s only this amount and only that amount all he wants, but it’s still slowing down the single core more than the multi-core.

    Thus, in every instance a multi-core CPU is faster than a single core CPU today. The only real question is whether you multi-task enough and/or use apps that will benefit from a quad core over a dual core. 🙂

  10. There is one common type of program that does use multi-core CPUs: games. Almost all games take advantage of multiple cores. There is a big difference even compared to Hyper-Threading technology. There is no difference when on the web or using software like Office XP.

  11. @Wesley, downloads are waiting on the network and disk drive, so extra CPUs don’t help them.

    @Mathieu, P2P and chat don’t use much CPU at all; P2P is more likely to be waiting on network or disk. Since real-time AV blocks the program that needs the file until it is scanned, there is not any parallelism. The AV scanner wants to determine if the file is a virus before it gets to the requesting program.

  12. Interesting post, but there’s something I wanted to point out. You don’t consider applications that run in the background: anti-virus/spyware scans, chat programs, P2P, or anything else. On a single-core system, those things will eat up some CPU cycles and slow down the system, while on dual-core systems the load will be split between the two cores. Also, many programs are now optimized for multi-core; take Photoshop, video converting programs, etc.

  13. I, for one, am glad I have a dual-core 3.2 GHz processor. I am always downloading one or two things at the same time while working on something totally different. Maybe I should go to a quad-core.

  14. Omar, the apps you mentioned are specialty apps and not typical PC desktop apps. Image editing is probably the exception, but for web-size graphics multi-core doesn’t make much of a difference. And Windows itself doesn’t take much advantage of multi-core horsepower.

  15. Dave,

    Great post, in particular helping explain the “response time” vs “throughput” topic.

    One thing I’d add is that multicore systems CAN be great at improving response time, but the application needs to be multithreaded / multicore-enabled for that to happen, via one of several approaches: Intel’s TBB, OpenMP, Cilk++, thread pools, etc. (For what it’s worth, there’s an e-Book outlining some of the issues associated with moving to multicore.)

  16. Hmmm. I dunno. Let me say I am not the type to add negative replies (if I reply at all), but this article struck me wrong. All the info I have read in the past few months says the opposite. It seems everyone now is aware of multi-core systems and is trying to catch up and write software that will harness all of the available horsepower. Most photo editing apps already do. Most NLE digital video apps do. Most 3D modeling/rendering apps do. And now even game developers are quickly catching on. So this may have been true in ’06, but today, not so much… So who is behind the times?
