When you get a new perl, you want to use it right away. Why wait for all that pesky compiling? As soon as the new tarball hits CPAN, you want to download it and start playing with it. You can make that process a little faster by running a parallel make.
This week, the Perl 5 Porters released Perl 5.12.3, the latest maintenance release of the language. These releases come out a couple of times a year, as needed. They also released Perl 5.13.9, the next point release building up to Perl 5.14. A new development perl comes out about once a month, with a much longer time between the maintenance versions. It’s ready for you to test drive some of the new features planned for Perl 5.14 (some may disappear, though!). With all of these new perls, you could spend quite a bit of time compiling and installing them, but it doesn’t have to be that way.
In Item 110. Compile and install your own perl and Compile a development version of perl, you saw how to install either a maintenance or a development perl. Neither of those Items cared how long the process took, or took advantage of modern, wide, multi-core architectures; they didn’t go past the basic configure step. You might install perl in the background, or let it run while you go get another Mountain Dew. You can do better, though.
This tip isn’t perl specific, and you can use it with other distributions that use make. Use the -j switch to tell make how many things it can do at the same time. With many cores or CPUs, you can do a lot at once:
$ ./configure ...
...
Updating GNUmakefile...
Now you must run 'make'.
...
$ make -j 16
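If you aren’t sure what number to give -j, ask your machine how many CPUs it has and start from there; a common rule of thumb is one or two jobs per core. These are stock system tools, not part of the perl build:

$ sysctl -n hw.ncpu    # Mac OS X and the BSDs
$ nproc                # most Linux systems
$ make -j `sysctl -n hw.ncpu`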
Poof! Wasn’t that compile really fast? Do you want the tests to run really fast too? You need to do a bit more work to turn on parallel testing in the test harness:
$ ./configure ...
...
Updating GNUmakefile...
Now you must run 'make'.
...
$ TEST_JOBS=16 make -j 16 test_harness
That’s a feature of Perl’s Test::Harness as well as the parallel make.
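It isn’t limited to building perl itself, either. If a distribution’s tests are safe to run in parallel, you can ask the harness for several jobs when you install or test modules too; the 9 here is just an example:

$ HARNESS_OPTIONS=j9 make test
$ prove -j 9 -r t/

Not every test suite tolerates this, so drop back to a serial run if tests start failing mysteriously.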
Depending on your architecture, at some point you hit diminishing returns. David Golden quantified it for his system. He used a few other tricks because he’s working out of a git clone like a developer would: make a change, recompile, retest, and repeat. David mentions that once more CPUs aren’t helping, you may be I/O bound. That is, your system can’t move the bits around fast enough to feed the running processes that are waiting for the data they need.
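If you want to find the sweet spot for your own machine, a rough (and unscientific) way is to time a fresh build at a few different job counts. This sketch assumes you’ve already run the configuration step in the source tree:

$ for j in 2 4 8 16; do
>   make clean > /dev/null
>   echo "jobs: $j"
>   time make -j $j > /dev/null
> done

If adding jobs stops helping well before you run out of cores, that’s a good sign you’re I/O bound rather than CPU bound.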
One way to fix that involves a RAM disk. Since the stuff you need is already in memory, it doesn’t have to go onto physical platters or into solid state devices, just to come back again. It also doesn’t have to compete with anything else that might want the device. How you make that disk depends on your system. On a Mac you can use Esperance DV.
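If you’d rather not install anything, you can roll your own RAM disk with the tools that come with the system. On a Mac, something like this creates a volume of about 1 GB (the ram:// size is in 512-byte blocks); on Linux, a tmpfs mount does the same job. The sizes and mount point here are only examples:

$ diskutil erasevolume HFS+ 'RamDisk' `hdiutil attach -nomount ram://2097152`
$ sudo mkdir -p /mnt/ramdisk && sudo mount -t tmpfs -o size=1g tmpfs /mnt/ramdisk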
perlbrew now supports parallel building too; see its -j option.
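For example, if your perlbrew is recent enough to have the option, something like this passes the job count through to make:

$ perlbrew install perl-5.12.3 -j 16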