
NAME

       Config::Model::Tester - Test framework for Config::Model

VERSION

       version 2.053

SYNOPSIS

        # in t/model_test.t
        use warnings;
        use strict;

        use Config::Model::Tester ;
        use ExtUtils::testlib;

        my $arg = shift || ''; # typically e t l
        my $test_only_app = shift || ''; # only run one set of test
        my $do = shift ; # select subtests to run with a regexp

        run_tests($arg, $test_only_app, $do) ;

DESCRIPTION

       This class provides a way to test configuration models with test files. This class was designed to
       test several models and several test cases per model.

       A specific layout for test files must be followed.

   Simple test file layout
        t
        |-- model_test.t
        \-- model_tests.d           # do not change directory name
            |-- lcdd-test-conf.pl   # test specification for lcdd model
            \-- lcdd-examples
                |-- t0              # subtest t0
                \-- LCDD-0.5.5      # subtest for older LCDproc

       In the example above, there is one model to test, "lcdd", and two test cases. The model name matches
       a file found in a "lib/Config/Model/*.d" directory. In this case, the model name matches
       "lib/Config/Model/system.d/lcdproc".

       The test specification is written in the "lcdd-test-conf.pl" file (i.e. this module looks for files
       named like "<model-name>-test-conf.pl").

       Subtests are specified in files in the "lcdd-examples" directory (i.e. this module looks for subtests
       in the "<model-name>-examples" directory). "lcdd-test-conf.pl" contains instructions so that each of
       these files is used as a "/etc/LCDd.conf" file during a test case.

       "lcdd-test-conf.pl" can contain specifications for more test case. Each test case will require a new file
       in "lcdd-examples" directory.

       See "Examples" for a link to the actual LCDproc model tests

   Test file layout for multi-file configuration
       When a configuration is spread over several files, test examples must be provided in sub-directories:

        t/model_tests.d
        \-- dpkg-test-conf.pl         # test specification
        \-- dpkg-examples
            \-- libversion            # example subdir, used as subtest name
                \-- debian            # directory for one test case
                    |-- changelog
                    |-- compat
                    |-- control
                    |-- copyright
                    |-- rules
                    |-- source
                    |   \-- format
                    \-- watch

       In the example above, the test specification is written in "dpkg-test-conf.pl". The Dpkg layout
       requires several files per test case. "dpkg-test-conf.pl" contains instructions so that each directory
       under "dpkg-examples" is used as a test case.

       See "Examples" for a link to the (many) Dpkg model tests

   Test file layout depending on system
        t/model_tests.d/
        |-- ssh-test-conf.pl
        |-- ssh-examples
            \-- basic
                |-- system_ssh_config
                \-- user_ssh_config

       In this example, the layout of the configuration files depends on the system. For instance, the
       system-wide "ssh_config" is stored in "/etc/ssh" on Linux, and directly in "/etc" on MacOS.

       ssh-test-conf.pl <https://github.com/dod38fr/config-model-openssh/blob/master/t/model_tests.d/ssh-test-conf.pl>
       will specify the target path of each file. I.e.:

        $home_for_test = $^O eq 'darwin' ? '/Users/joe'
                       :                   '/home/joe' ;

        # ...

             setup => {
               'system_ssh_config' => {
                   'darwin'  => '/etc/ssh_config',
                   'default' => '/etc/ssh/ssh_config',
               },
               'user_ssh_config' => "$home_for_test/.ssh/config",
             },

       See the actual Ssh and Sshd model tests
       <https://github.com/dod38fr/config-model-openssh/tree/master/t/model_tests.d>.

   Basic test specification
       Each model test is specified in "<model>-test-conf.pl". This file contains a set of global variables
       (yes, global variables are often a bad idea in programs, but they are handy for tests):

        # config file name (used to copy test case into test wr_root directory)
        $conf_file_name = "fstab" ;
        # config dir where to copy the file (optional)
        #$conf_dir = "etc" ;
        # home directory for this test
        $home_for_test = '/home/joe' ;

       Here, "t0" file will be copied in "wr_root/test-t0/etc/fstab".

        # config model name to test
        $model_to_test = "Fstab" ;

         # list of tests. This module looks for the @tests global variable
        @tests = (
           {
            # test name
            name => 't0',
            # add optional specification here for t0 test
           },
           {
            name => 't1',
            # add optional specification here for t1 test
           },
        );

        1; # to keep Perl happy

       You can suppress warnings by specifying "no_warnings => 1". On the other hand, you may also want to
       check for warnings specific to your model. In this case, avoid specifying "no_warnings" here and
       specify warning tests or warning filters as mentioned below.

       See actual fstab test
       <https://github.com/dod38fr/config-model/blob/master/t/model_tests.d/fstab-test-conf.pl>.

   Internal tests
       Some tests require the creation of a configuration class dedicated to the test. This test class can be
       created directly in the test specification by calling create_config_class on the $model variable. See
       for instance the layer test
       <https://github.com/dod38fr/config-model/blob/master/t/model_tests.d/layer-test-conf.pl> or the test
       for the shellvar backend
       <https://github.com/dod38fr/config-model/blob/master/t/model_tests.d/backend-shellvar-test-conf.pl>.
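
       For instance, a dedicated test class could be sketched directly in the test specification (the class
       name and element below are made up for this example):

         # in <model>-test-conf.pl
         $model->create_config_class(
             name    => 'TestClass',
             element => [
                 foo => {
                     type       => 'leaf',
                     value_type => 'uniline',
                 },
             ],
         );

         $model_to_test = 'TestClass';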

   Test specification with arbitrary file names
       In some models (e.g. "Multistrap"), the config file is chosen by the user. In this case, the file name
       must be specified for each test case:

        $model_to_test = "Multistrap";

        @tests = (
           {
               name        => 'arm',
               config_file => '/home/foo/my_arm.conf',
               check       => {},
           },
        );

       See actual multistrap test
       <https://github.com/dod38fr/config-model/blob/master/t/model_tests.d/multistrap-test-conf.pl>.

   Test scenario
       Each subtest follows the sequence explained below. Each step of this sequence may be altered by adding
       specifications in "<model-to-test>-test-conf.pl":

        •   Setup test in "wr_root/<subtest name>/". If your configuration file layout depends on the target
            system, you will have to specify the path using the "setup" parameter. See "Test file layout
            depending on system".

        •   Create a configuration instance, load the config data and check its validity. Use
            "load_check => 'no'" if your file is not valid.
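
            For instance, in a test case specification (the test name below is only an illustration):

                {
                 name       => 'with-errors',
                 load_check => 'no',   # the example file is knowingly invalid
                },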

        •   Check for config data warnings. You should pass the list of expected warnings. E.g.

               load_warnings => [ qr/Missing/, (qr/deprecated/) x 3 , ],

           Use an empty array_ref to mask load warnings.

       •   Optionally run update command:

               update => { in => 'some-test-data.txt', returns => 'foo' , no_warnings => [ 0 | 1 ] }

           "returns" is the expected return value (optional). All other arguments are passed to "update" method.
           Note that "quiet => 1" may be useful for less verbose test.

        •   Optionally load configuration data. You should design this config data to suppress any error or
            warning mentioned above. E.g.:

               load => 'binary:seaview Synopsis="multiplatform interface for sequence alignment"',

           See Config::Model::Loader for the syntax of the string accepted by "load" parameter.

       •   Optionally, call apply_fixes:

               apply_fix => 1,

       •   Call dump_tree to check the validity of the data. Use "dump_errors" if you expect issues:

               dump_errors =>  [
                   # the issues     the fix that will be applied
                   qr/mandatory/ => 'Files:"*" Copyright:0="(c) foobar"',
                   qr/mandatory/ => ' License:FOO text="foo bar" ! Files:"*" License short_name="FOO" '
               ],

        •   Likewise, specify any expected warnings (note that the list must contain only "qr" patterns):

                   dump_warnings => [ (qr/deprecated/) x 3 ],

           You can tolerate any dump warning this way:

                   dump_warnings => undef ,

       •   Run specific content check to verify that configuration data was retrieved correctly:

                check => {
                    'fs:/proc fs_spec' => "proc",
                    'fs:/proc fs_file' => "/proc",
                    'fs:/home fs_file' => "/home",
                },

            The keys of the hash point to the values to be checked, using the syntax described in "grab(...)"
            in Config::Model::AnyThing.

            You can run checks using different check modes (see "fetch( ... )" in Config::Model::Value) by
            passing a hash ref instead of a scalar:

                check  => {
                    'sections:debian packages:0' => { mode => 'layered', value => 'dpkg-dev' },
                    'sections:base packages:0'   => { mode => 'layered', value => 'gcc-4.2-base' },
                },

            The whole hash content (except "value") is passed to grab and fetch.

           A regexp can also be used to check value:

              check => {
                 "License text" => qr/gnu/i,
                 "License text" => { mode => 'custom', value => qr/gnu/i },
              }

       •   Verify if a hash contains one or more keys (or keys matching a regexp):

            has_key => [
               'sections' => 'debian', # sections must point to a hash element
               'control' => [qw/source binary/],
               'copyright Files' => qr/.c$/,
                'copyright Files' => [qr/\.h$/, qr/\.c$/],
            ],

        •   Verify that a hash does not have a key (or a key matching a regexp):

            has_not_key => [
                'copyright Files' => qr/.virus$/ # silly, isn't it?
            ],

       •   Verify annotation extracted from the configuration file comments:

               verify_annotation => {
                       'source Build-Depends' => "do NOT add libgtk2-perl to build-deps (see bug #554704)",
                       'source Maintainer' => "what a fine\nteam this one is",
                   },

       •   Write back the config data in "wr_root/<subtest name>/".  Note that write  back  is  forced,  so  the
           tested  configuration files are written back even if the configuration values were not changed during
           the test.

            You can skip warnings when writing back with the global:

               no_warnings => 1,

        •   Check the content of the written file(s) with Test::File::Contents. Tests can be grouped in an
            array ref:

              file_contents => {
                       "/home/foo/my_arm.conf" => "really big string" ,
                       "/home/bar/my_arm.conf" => [ "really big string" , "another"], ,
                   }

              file_contents_like => {
                       "/home/foo/my_arm.conf" => [ qr/should be there/, qr/as well/ ] ,
              }

              file_contents_unlike => {
                       "/home/foo/my_arm.conf" => qr/should NOT be there/ ,
              }

       •   Check added or removed configuration files. If you expect changes, specify a subref to alter the file
           list:

               file_check_sub => sub {
                   my $list_ref = shift ;
                   # file added during tests
                   push @$list_ref, "/debian/source/format" ;
               };

        •   Copy all config data from "wr_root/<subtest name>/" to "wr_root/<subtest name>-w/". This step is
            necessary to check that the configuration written back has the same content as the original
            configuration.

        •   Create another configuration instance to read the conf file that was just copied (configuration
            data is checked).

        •   You can skip the load check if the written file still contains errors (e.g. some errors were
            ignored and cannot be fixed) with "load_check2 => 'no'".

        •   Compare the data read from the copied configuration with the original data.

       •   Run specific content check on the written config file to verify that configuration data  was  written
           and retrieved correctly:

               wr_check => {
                   'fs:/proc fs_spec' =>          "proc" ,
                   'fs:/proc fs_file' =>          "/proc",
                   'fs:/home fs_file' =>          "/home",
               },

           Like the "check" item explained above, you can run "wr_check" using different check modes.

   Running the test
       Run all tests with one of these commands:

        prove -l t/model_test.t :: [ t|l|e [ <model_name> [ <regexp> ]]]
        perl -Ilib t/model_test.t  [ t|l|e [ <model_name> [ <regexp> ]]]

       By default, all tests are run on all models.

       You can pass arguments to "t/model_test.t":

        •   A bunch of letters: 't' to get test traces, 'e' to get a stack trace in case of errors, 'l' to
            get logs. All other letters are ignored. E.g.

             # run with log and error traces
             prove -lv t/model_test.t :: el

        •   The model name to test. E.g.:

             # run only fstab tests
             prove -lv t/model_test.t :: x fstab

        •   A regexp to filter subtests. E.g.:

              # run only the fstab subtests matching 'foobar'
             prove -lv t/model_test.t :: x fstab foobar

              # run only the fstab 'foo' subtest
             prove -lv t/model_test.t :: x fstab '^foo$'

Examples

        •   LCDproc <http://lcdproc.org> has a single configuration file: "/etc/LCDd.conf". Here's LCDproc
            test layout <https://github.com/dod38fr/config-model-lcdproc/tree/master/t/model_tests.d> and the
            test specification
            <https://github.com/dod38fr/config-model-lcdproc/blob/master/t/model_tests.d/lcdd-test-conf.pl>.

        •   Dpkg packages are constructed from several files. These files are handled like configuration
            files by Config::Model::Dpkg. The test layout
            <http://anonscm.debian.org/gitweb/?p=pkg-perl/packages/libconfig-model-dpkg-perl.git;a=tree;f=t/model_tests.d;hb=HEAD>
            features tests with multiple files in dpkg-examples
            <http://anonscm.debian.org/gitweb/?p=pkg-perl/packages/libconfig-model-dpkg-perl.git;a=tree;f=t/model_tests.d/dpkg-examples;hb=HEAD>.
            The test is specified in dpkg-test-conf.pl
            <http://anonscm.debian.org/gitweb/?p=pkg-perl/packages/libconfig-model-dpkg-perl.git;a=blob_plain;f=t/model_tests.d/dpkg-test-conf.pl;hb=HEAD>.

        •   multistrap-test-conf.pl
            <https://github.com/dod38fr/config-model/blob/master/t/model_tests.d/multistrap-test-conf.pl> and
            multistrap-examples
            <https://github.com/dod38fr/config-model/tree/master/t/model_tests.d/multistrap-examples> specify
            a test where the configuration file name is not imposed by the application. The file name must
            then be set in the test specification.

        •   backend-shellvar-test-conf.pl
            <https://github.com/dod38fr/config-model/blob/master/t/model_tests.d/backend-shellvar-test-conf.pl>
            is a more complex example showing how to test a backend. The test is done by creating a dummy
            model within the test specification.

SEE ALSO

       •   Config::Model

       •   Test::More

AUTHOR

       Dominique Dumont

COPYRIGHT AND LICENSE

       This software is Copyright (c) 2013-2016 by Dominique Dumont.

       This is free software, licensed under:

         The GNU Lesser General Public License, Version 2.1, February 1999

SUPPORT

   Websites
       The  following websites have more information about this module, and may be of help to you. As always, in
       addition to those websites please use your favorite search engine to discover more resources.

       •   Search CPAN

           The default CPAN search engine, useful to view POD in HTML format.

           <http://search.cpan.org/dist/Config-Model-Tester>

       •   AnnoCPAN

           The AnnoCPAN is a website that allows community annotations of Perl module documentation.

           <http://annocpan.org/dist/Config-Model-Tester>

       •   CPAN Ratings

           The CPAN Ratings is a website that allows community ratings and reviews of Perl modules.

           <http://cpanratings.perl.org/d/Config-Model-Tester>

       •   CPANTS

           The CPANTS is a website that analyzes the Kwalitee ( code metrics ) of a distribution.

           <http://cpants.perl.org/dist/overview/Config-Model-Tester>

       •   CPAN Testers

           The CPAN Testers is a network of smokers who run automated tests on uploaded CPAN distributions.

           <http://www.cpantesters.org/distro/C/Config-Model-Tester>

       •   CPAN Testers Matrix

           The CPAN Testers Matrix is a website that provides a visual  overview  of  the  test  results  for  a
           distribution on various Perls/platforms.

           <http://matrix.cpantesters.org/?dist=Config-Model-Tester>

       •   CPAN Testers Dependencies

           The CPAN Testers Dependencies is a website that shows a chart of the test results of all dependencies
           for a distribution.

           <http://deps.cpantesters.org/?module=Config::Model::Tester>

   Bugs / Feature Requests
       Please  report  any  bugs  or  feature  requests  by  email  to "ddumont at cpan.org", or through the web
       interface at <https://github.com/dod38fr/config-model-tester/issues>. You will be automatically  notified
       of any progress on the request by the system.

   Source Code
       The  code  is open to the world, and available for you to hack on. Please feel free to browse it and play
       with it, or whatever. If you want to contribute patches, please send me a diff or prod me  to  pull  from
       your repository :)

       <http://github.com/dod38fr/config-model-tester.git>

         git clone git://github.com/dod38fr/config-model-tester.git

perl v5.22.1                                       2016-03-29                         Config::Model::Tester(3pm)