NAME

       makepp_speedup -- How to make makepp faster

DESCRIPTION

       So you think makepp is slow?  It has gotten noticeably faster, but granted, it's still
       slow, especially if you come from GNU make.  That is because it conscientiously checks all
       the things where gmake gives you a headache by ignoring lots of dependencies (the "I think
       I need to gmake clean to get rid of a mysterious bug" syndrome).  If you suspect some Perl
       code you added to your makefiles might be at fault, take a look at perl_performance.

       But there are a few things you can do to squeeze out more speed.  Some of them are labeled
       unsafe, in the sense that you are asking makepp not to check or do certain things which you
       believe are not needed.  If those things turn out to have been necessary after all, the
       build may not be correct.  Luckily the problem is only temporary: it gets corrected as soon
       as you let makepp do all its checks again.

       You can combine several of these tips to save even more time.

   Safe Methods
       Use makeppreplay

       The stand-alone utility makeppreplay (mppr for short) repeats things that makepp has
       already done, without the usual overhead.

       Use a Faster Perl

       All releases within version 5.8 are roughly the same speed; only 5.8.7 is a bit faster.
       Tuning your Perl build can also help, e.g. not compiling it for 64 bits, which makepp
       doesn't need.  For example ActiveState's build (<http://www.activestate.com/activeperl>) of
       5.8.7 for Linux is faster than the Perl 5.8.7 that comes with SuSE Linux 10.0.
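
       Since makepp is itself a Perl script, you can usually try a different interpreter without
       reinstalling anything.  This is only a sketch: the path is hypothetical, and whether your
       installation can be started this way depends on how it was packaged.

            $ /opt/perl-5.8.7/bin/perl `which makepp`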

       Include as Little as Possible

       Each additional file you include is doubly penalizing.  On the one hand, the compiler must
       look for and read all those files.  You don't notice this so much, because it's just a
       little extra per compiler call.  On the other hand makepp must look at them too, to find
       dependencies and to figure out whether they require a rebuild.  It can even seem to stall
       while it is digesting a lot of dependencies at once.

       An absolutely deadly variant is the project master include file, which in turn conveniently
       includes anything you might need.  The result is that any header file change leads to a
       full rebuild.  Even without a change, makepp must think about all those headers again, for
       every source you compile.  Each check is only a tiny effort, since the results are cached,
       but with thousands of files it adds up to a staggering amount.

       It may be cumbersome to figure out the minimal set of includes, and to clean up those no
       longer needed, but it really pays off.  If anybody knows a tool that can identify which
       files get included unnecessarily, I'd be glad to mention it here!
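
       As a starting point for such a cleanup, you can at least see what a compilation really
       pulls in.  For example with gcc (the source name is hypothetical, and other compilers have
       similar options; note that this lists what is included, not what is included
       unnecessarily):

            $ gcc -MM foo.c      # make-style list of the non-system headers foo.c depends on
            $ gcc -H -c foo.c    # print every header as it is opened, indented by nesting depth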

       Build as Little as You Need

       If you have a default target which makes several programs, then makepp will have to check
       all their dependencies, right down to the smallest header file.  But maybe you want to
       test your change with only one of those programs.

       In that case, call makepp with an explicit target.  The fewer modules or headers all those
       programs have in common, the greater the benefit of not letting makepp check them all.

       Say your top level Makeppfile has this rule:

           $(phony all): proggie1 proggie2 $(only_phony_targets */**/all)

       Then you would call things like

           $ makepp proggie2
           $ makepp proggie1 dir/subdir/proggie27

       Use preferred makefile names

       Makepp looks for makefiles (unless you specify them explicitly on the command line or with
       "load_makefile") in the order RootMakeppfile, RootMakeppfile.mk, Makeppfile and
       Makeppfile.mk, followed by the classical makefile names.  (The .mk variants are for purely
       suffix-based systems.)

       So, if you use RootMakeppfile at the root of your build tree, and Makeppfile everywhere
       else, the files will be found slightly faster.  Makepp will also consume slightly less
       memory (it caches the fact that the other names don't exist), and less memory management
       again means more speed.

       Likewise if you have a statement

           include standard

       makepp will first attempt to find standard.makepp, so you might as well give the file that
       name.
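
       Adopting the preferred names in an existing tree can be as simple as renaming the files
       (the layout shown is hypothetical; keep the classical names only where other tools still
       rely on them):

            $ mv Makefile RootMakeppfile          # at the root of the build tree
            $ mv src/Makefile src/Makeppfile      # in every other directory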

       Have as few rules as you need

       Makepp keeps track not only of existing files, but also of any files it learns how to
       create.  (That's why it offers reliable wildcards like *.o.)  The price for this power is a
       lot of management.  So, if you tell it how to create a .o from a .c, that's fine, because
       that will happen for most if not all candidates.

       But if you tell it how to link any suffixless executable from a like-named .o, that's
       expensive, because it will probably happen for only a small fraction of them (those that
       contain a main function), yet the groundwork gets laid for all of them.  You have to weigh
       the comfort of a linker pattern rule against the efficiency of individual linker rules.

       If you don't use any of the builtin rules, you should also turn them off with:

           makepp_no_builtin = 1

       If you do use the builtin rules, but, for the reasons explained above, not the builtin
       linker rules, you should turn off just the latter with:

           makepp_no_builtin_linker = 1
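
       To illustrate the trade-off described above, here is a sketch with hypothetical program and
       object names.  The first variant is a pattern rule written in GNU make style, which makepp
       must keep in mind for every single object file; the second lays no groundwork beyond the
       two real programs:

            # comfortable, but considered for every .o
            % : %.o
                $(LD) $(inputs) -o $(output)

            # more typing, but only the two actual programs are tracked
            proggie1: proggie1.o
                $(LD) $(inputs) -o $(output)
            proggie2: proggie2.o util.o
                $(LD) $(inputs) -o $(output)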

       Put makepp extensions into a module

       Makepp can be extended very conveniently through Perl.  But if you write some functions,
       commands or statements in a file and include that file from dozens of makefiles, you will
       get dozens of copies of them all in memory.  They will also be read dozens of times by the
       makepp parser, which is a bit slower than Perl's own.

       In this situation it is better to put your own functions into a module.

       Use Repositories and/or a Build Cache

       If you have several developers working on the same machine, or if you switch back and forth
       between sets of build options, this is for you.  Repositories allow you to offer a central
       reference, so that you only need to build what is locally different.  A build cache simply
       collects all produced files and reuses them as appropriate, with less planning needed.  The
       build cache documentation also describes the differences between the two.
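
       For example, to build against a centrally maintained reference tree, so that only your
       locally modified files get rebuilt (the path is hypothetical):

            $ makepp --repository=/central/reference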

       Use Sandboxes

       If your build is so big that makepp is having a hard time digesting all the information
       and if you can find a way of splitting it up into smaller independent parts, sandboxes
       might give you better parallelism than the "--jobs" option.

       Don't log what you do

       Makepp's logging feature is very powerful for tracking down bugs in the build system, or
       for analyzing your dependencies.  Whenever you are not doing either of these things, you
       can save quite a bit of formatting and I/O with "--no-log --no-scan-log".
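
       For example, for a build whose log you know you will not look at (proggie1 being a target
       as in the examples above):

            $ makepp --no-log --no-scan-log proggie1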

   Almost Safe Methods
       Get a Headstart

       The option "--loop" (or "--stop-before-building" or "--stop-after-loading" or "--stop")
       allows makepp to start its work while you are still editing.  It will repeatedly suspend
       itself when it gets to the point analyzing the dependencies.  You decide when you're ready
       to let it go on.  On our huge project this saves half a minute, and that's only when we
       have a CPU to ourselves.
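
       A possible workflow looks like this; resuming the suspended process is ordinary Shell job
       control, so "fg" (or a CONT signal) wakes it up again:

            $ makepp --stop &    # loads the makefiles, then suspends itself
            $ vi src/a.cpp       # keep editing in the meantime
            $ fg                 # let makepp go on once you are ready to build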

       This method has two potential drawbacks:

        •   Makeppfiles have been read by the time makepp stops.  If, after starting makepp, you
            edit a Makeppfile or something from which one would have to be rebuilt, this will go
            unnoticed until the next run.  But this should rarely be necessary, since makepp
            greatly reduces the need for Makeppfile changes.

        •   If a target depends on a wildcard, and that wildcard would now match more files than
            when the Makeppfile was read, makepp will not notice:

               proggie: *.o
                   $(LD) $(inputs) -o $(output)

            If you add another source file, or a file from which makepp knows how to generate a
            source, then "*.o" should match the object it produces.  But if that file was added
            after starting makepp, it will not, because the wildcard was expanded too early.

       In both of these cases you should kill the prestarted makepp and start it anew.

       Gulliver's Travels

       The option "--gullible" tells makepp to believe that a rule changes what it says it will,
       neither less nor more.  Not performing these checks can save a few percent of makepp's CPU
       time.  And the Disk I/O savings is especially welcome on network file systems.  If you do
       nightly full builds in an empty directory with the "--repository" option, but without the
       "--gullible" option, you can be fairly sure that your rule set is consistent.  Then this
       option shouldn't hurt in your daytime work.
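
       A possible division of labour along these lines, with a hypothetical reference path:

            $ makepp --repository=/central/reference    # nightly full build, all checks on
            $ makepp --gullible proggie1                # daytime build, trusting the rules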

   Potentially Unsafe Methods
       These methods are unsafe if you give makepp the wrong hints.  But everything will be fine
       again as soon as you let makepp do all the checks, by not passing it any limiting options.
       For this reason I suggest using these hints for quick intermediate builds, and using
       lunchtime and nights to let makepp do its job thoroughly.

       Build as Little as Needed

       This is the same tip of using explicit targets as discussed under "Build as Little as You
       Need" above.  But it becomes more dangerous if you do it because you are sure that your
       change will not affect any of the other programs.  If you are wrong, they will not be
       rebuilt, even though it would have been necessary.

       Know Where Not to Build

       The option "--dont-build" is very powerful for speeding makepp up a lot.  If you know one
       or more directories, which you are sure are unaffected by any change you made since the
       last time, you can issue "--dont-build" options for them.  This can save makepp a lot of
       dependency analysis.  But it will not build anything in those directories, even if it
       should have.
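
       For example, if you are certain that nothing under "libs" and "third_party" (hypothetical
       directory names) is affected by your change:

            $ makepp --dont-build=libs --dont-build=third_party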

       Know Where to Build

       This is the same as "Know Where Not to Build", but instead of an exclusion list you supply
       an inclusion list.  The trick is that a "--do-build" option means "build nothing except
       what I tell you to" when it is combined with a "--dont-build=/" option, or when it is given
       under a "RootMakeppfile(.mk)" directory without a "--dont-build" option for a higher level
       directory.  This is what users of traditional makes are looking for when they want to build
       just one directory:

           $ makepp --do-build=dir/subdir

       or, if you don't have a "RootMakeppfile(.mk)":

           $ makepp --dont-build=/ --do-build=dir/subdir

       The difference is that this way any default target in the top level Makeppfile, i.e. the
       link commands, is also executed.  If you don't want that, you must give an explicit target,
       which is then automatically also marked for "--do-build":

           $ makepp --do-build=dir1/subdir dir2/proggie

       Know What to Build

       An extreme variant is asking makepp not to build anything but what you tell it to.  This
       is not so dangerous if you changed no include files, only modules, and you know which
       programs they go into.

       Say you have only changed "src/a.cpp" and "src/b.cpp" and these are linked directly into
       one program.  In the following command the dot stands for the current directory, including
       all its subdirectories.

           $ makepp --dont-build=. src/a.o src/b.o proggie1

       Or equivalently, because a "--do-build" option without a "--dont-build" option on a higher
       level directory implies "--dont-build" for the root of the build tree:

           $ makepp --do-build=src/a.o src/b.o proggie1

       You can do something like the following in your Shell's $ENV file or .profile to save
       typing (csh users replace '=' with ' '):

           alias mppb='makepp --do-build'
           alias mppsb='makepp --stop --do-build'

       Then the last example becomes:

           $ mppb src/a.o src/b.o proggie1

       Build on a RAM disk

       Modern computers, especially servers, typically have a high mean time between failures.  If
       this is the case for you, and you have lots of RAM to spare, you can save the time you
       spend waiting for I/O.  You should edit on a real disk, or replicate your edits there
       quickly.  But the build results are reproducible, so they can reside in RAM.  If you don't
       want to risk rebuilding, you can always replicate them to disk after each build or at
       night.  You should not do this during the build, as you might catch partially written
       files, just as if the machine had crashed.
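
       One way to arrange this on Linux is to keep the sources on a real disk and to build in a
       tmpfs directory via a repository; this is only a sketch, and the mount point and project
       path are hypothetical:

            $ sudo mkdir -p /mnt/ramdisk
            $ sudo mount -t tmpfs -o size=4g tmpfs /mnt/ramdisk
            $ mkdir /mnt/ramdisk/build
            $ cd /mnt/ramdisk/build && makepp --repository=$HOME/project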

       If you have a system and/or storage unit with good caching and RAID, the gain might not be
       so big.

AUTHOR

       Daniel Pfeiffer <occitan@esperanto.org>