Thought Leader James Reinders on Parallel Programming
By: Bridget Moore
The explosion of multicore processors means that parallel programming -- writing code that takes full advantage of those multiple cores -- is now a requirement. Here, James Reinders, Intel’s evangelist [please note: Intel is the sponsor of this program] for parallel programming, talks about which applications benefit from parallelism and which tools are best suited to the job. His thoughts may surprise you.
Q: What new tools do you need as you move from serial to parallel programming to get the most out of the process?
Reinders: The biggest change is not the tools, it’s the mindset of the programmer. This is incredibly important. I actually think that human beings [naturally] think about things in parallel, but because early computers were not parallelized, we changed the way we thought. We trained ourselves to work [serially] with PCs, and now parallelism seems a little foreign to us. But people who think of programming as a parallel problem don’t have as much of a problem with it as those who have trained themselves to think serially.
[As for the best tools], if you woke up one day and could think only in parallel you’d be frustrated with the tools we have now, but they are changing. From a very academic standpoint, you could say no [computer] languages are designed for parallelism so let’s toss them out and replace them, [but] that is not going to happen.
The languages we use will get augmented. Intel has done some popular things; Microsoft has some extensions to its toolset; Sun’s got stuff for Java and Apple’s got stuff too.
There are some very good things people can look for but they are still emerging, and programmers need to learn them. I can honestly say that as of the last year or so, trying to do parallel programming in FORTRAN or C or C++ is a pretty reasonable thing to do. Five years ago, it was something I couldn’t have done … without a lot of training and classes. Now these [existing] tools support what we need enough to be successful.
Google has done amazing things in parallelism. Google’s whole approach in building its search engine was all about parallelism. If you asked most people back then to go examine every Web page on the planet, they’d have written a for loop. But Google looked at this in parallel. They said, “Let’s just go look at all of them.” They thought of it as a parallel program. I can’t emphasize enough how important that is.
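The mindset shift Reinders describes can be sketched in a few lines: instead of visiting items one at a time in a for loop, hand the whole collection to a pool of workers at once. This is a minimal illustration, not Google's actual architecture; the page list and `word_count` function are hypothetical stand-ins for real crawling work.

```python
from concurrent.futures import ThreadPoolExecutor

pages = [
    "the quick brown fox",
    "jumps over the lazy dog",
    "parallel programming is a mindset",
]

def word_count(page):
    # Stand-in for whatever per-page analysis a crawler would do.
    return len(page.split())

# Serial mindset: one page after another.
serial_counts = []
for page in pages:
    serial_counts.append(word_count(page))

# Parallel mindset: "let's just go look at all of them."
with ThreadPoolExecutor() as pool:
    parallel_counts = list(pool.map(word_count, pages))

assert parallel_counts == serial_counts  # same answer, different structure
```

The point is that the parallel version expresses the work as "apply this to everything" rather than "do this, then this, then this" -- which is what lets the runtime spread it across cores.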
Q: How about debuggers and the debugging process? How does that change with parallel programming?
Reinders: Debuggers are also getting extended but they don’t seem to move very fast. They still feel a lot like they did 20 years ago.
There are three things happening in debuggers. First, as we add language extensions, it would be nice if the debuggers knew they existed. That’s an obvious fix that is now happening quickly.
Second, I suddenly have multiple things happening at once on a computer. How can you show that to me? The debugger will usually let you ask, “Show me what’s happening on core 2,” but if you’re working with a lot of cores, the debugger needs to show you what’s happening on a lot of cores without requiring you to open a window for each. Today’s debuggers don’t handle this well, although a few [of the more expensive ones] do.
Third, this is very academic, but how do you deal with determinism? When you get a bug in a parallel program, it can be non-deterministic, meaning it can run differently each time you run [the program]. If you run a program and it does something dumb, in a non-deterministic program, just the fact that the debugger is running causes the program to run differently, and the bug may not happen the same way. So you need a way to go back to find where it happened, what the breakpoint was.
In serial programming, typically if I run the program 20 times, it will fail the same way, but if the program runs differently every time, it’s not obvious how to fix that. The ability to rewind or go back when you’re in the debugger and look at something that happened earlier instead of rerunning the program … is very helpful. You tell the debugger to back up. To do that, the debugger has to collect more information while you’re running so you can rewind and deal with that non-determinism.
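The non-determinism Reinders describes can be demonstrated in a few lines. This is a minimal sketch (not his example): two threads append to a shared list, and while the total amount of work is fixed, the interleaving can differ on every run -- which is exactly what defeats the "rerun it and hit the same breakpoint" debugging workflow.

```python
import threading

events = []
lock = threading.Lock()

def worker(name, n=5):
    for _ in range(n):
        with lock:               # the lock keeps each append atomic...
            events.append(name)  # ...but the order across threads still varies

t1 = threading.Thread(target=worker, args=("A",))
t2 = threading.Thread(target=worker, args=("B",))
t1.start(); t2.start()
t1.join(); t2.join()

# Deterministic facts: how much work happened.
assert len(events) == 10
assert events.count("A") == events.count("B") == 5
# The sequence itself, however, may differ from run to run.
```

A record-and-replay ("rewind") debugger handles this by logging the actual interleaving as the program runs, so the one execution that exhibited the bug can be stepped through backward.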
Q: If you’re a programmer, writing custom apps for your company or an ISV, when does it become essential to employ parallel programming? What apps reap the biggest advantages, and are there apps for which parallel programming has little benefit?
Reinders: Any application that handles a lot of data and tries to process it benefits [from parallelism.] We love our data -- look at the hard drive of your home PC. Parallelism can bring benefits to obvious things that we see every day: processing pictures, video and audio. We love getting higher-res cameras, HDTV. We like higher resolution stereo sound. That’s everyday stuff.
Scientific applications are obvious beneficiaries, and business apps that do knowledge mining of sales data to reach conclusions. They all do well in parallel. That’s the obvious stuff. But then people can get very creative with new things. There are some things that you might think won’t benefit [from parallelism], like word processing software and browsers, but you’d be wrong.
Look at Microsoft Word. There are several things Microsoft does there in parallel that we all enjoy. When you hit print, it will go off and lay it out and send it to the printer, but it doesn’t freeze on you. If you go back 10 years, with Microsoft Word, you might as well have gone for coffee after hitting print.
Spelling and grammar checking: when you type in Word, it puts in the squiggles [on questionable spelling or usage]. It’s doing that in parallel. If it weren’t, every time you typed a letter, it would freeze while it looked it up. Word is WYSIWYG; if you’re in print mode, it’s justifying and kerning in parallel with many other things.
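The pattern behind both examples -- printing and spell-checking without freezing -- is offloading slow work to a background thread so the interactive loop keeps responding. Below is a minimal sketch under assumed names (`lay_out_and_print`, `keystrokes_handled` are illustrative, not Word's actual internals):

```python
import threading
import time

print_done = threading.Event()

def lay_out_and_print(document):
    time.sleep(0.2)   # stand-in for layout and spooling work
    print_done.set()  # signal the foreground that printing finished

doc = "my report"
threading.Thread(target=lay_out_and_print, args=(doc,), daemon=True).start()

# The "UI" loop keeps handling keystrokes while the print job runs.
keystrokes_handled = 0
while not print_done.is_set():
    keystrokes_handled += 1
    time.sleep(0.01)

assert print_done.is_set()
assert keystrokes_handled > 0  # we kept working instead of going for coffee
```

Ten years earlier, the same print job would have run on the one and only thread, and the editor would have been frozen until it finished.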
From Our Sponsor:
To learn more about Intel’s software technologies and tools, visit Intel.com/software.