Tutorial 4: Using Test Fixtures

New in version 3.9.0.

A fixture in ReFrame is a test that manages a resource of another test. Fixtures can be chained, essentially forming a graph of dependencies. As with test dependencies, the test that uses a fixture will not execute until its fixture has executed. In this tutorial, we will rewrite the OSU benchmarks example presented in Tutorial 3: Using Dependencies in ReFrame Tests using fixtures. We will cover only the basic concepts of fixtures that will allow you to start using them in your tests. For the full documentation of test fixtures, refer to the Test API Reference documentation.

The full example of the OSU benchmarks using test fixtures is shown below with the relevant parts highlighted:

import os
import reframe as rfm
import reframe.utility.sanity as sn


# rfmdocstart: fetch-osu-benchmarks
class fetch_osu_benchmarks(rfm.RunOnlyRegressionTest):
    descr = 'Fetch OSU benchmarks'
    version = variable(str, value='5.6.2')
    executable = 'wget'
    executable_opts = [
        f'http://mvapich.cse.ohio-state.edu/download/mvapich/osu-micro-benchmarks-{version}.tar.gz'  # noqa: E501
    ]
    local = True

    @sanity_function
    def validate_download(self):
        return sn.assert_eq(self.job.exitcode, 0)
# rfmdocend: fetch-osu-benchmarks


# rfmdocstart: build-osu-benchmarks
class build_osu_benchmarks(rfm.CompileOnlyRegressionTest):
    descr = 'Build OSU benchmarks'
    build_system = 'Autotools'
    build_prefix = variable(str)
    # rfmdocstart: osu-benchmarks
    osu_benchmarks = fixture(fetch_osu_benchmarks, scope='session')
    # rfmdocend: osu-benchmarks

    @run_before('compile')
    def prepare_build(self):
        tarball = f'osu-micro-benchmarks-{self.osu_benchmarks.version}.tar.gz'
        self.build_prefix = tarball[:-7]  # remove .tar.gz extension

        fullpath = os.path.join(self.osu_benchmarks.stagedir, tarball)
        self.prebuild_cmds = [
            f'cp {fullpath} {self.stagedir}',
            f'tar xzf {tarball}',
            f'cd {self.build_prefix}'
        ]
        self.build_system.max_concurrency = 8

    @sanity_function
    def validate_build(self):
        # If compilation fails, the test would fail in any case, so nothing to
        # further validate here.
        return True
# rfmdocend: build-osu-benchmarks


class OSUBenchmarkTestBase(rfm.RunOnlyRegressionTest):
    '''Base class of OSU benchmarks runtime tests'''

    valid_systems = ['daint:gpu']
    valid_prog_environs = ['gnu', 'pgi', 'intel']
    num_tasks = 2
    num_tasks_per_node = 1
    # rfmdocstart: osu-binaries
    osu_binaries = fixture(build_osu_benchmarks, scope='environment')
    # rfmdocend: osu-binaries

    @sanity_function
    def validate_test(self):
        return sn.assert_found(r'^8', self.stdout)


@rfm.simple_test
class osu_latency_test(OSUBenchmarkTestBase):
    descr = 'OSU latency test'

    # rfmdocstart: prepare-run
    @run_before('run')
    def prepare_run(self):
        self.executable = os.path.join(
            self.osu_binaries.stagedir,
            self.osu_binaries.build_prefix,
            'mpi', 'pt2pt', 'osu_latency'
        )
        self.executable_opts = ['-x', '100', '-i', '1000']
    # rfmdocend: prepare-run

    @performance_function('us')
    def latency(self):
        return sn.extractsingle(r'^8\s+(\S+)', self.stdout, 1, float)


@rfm.simple_test
class osu_bandwidth_test(OSUBenchmarkTestBase):
    descr = 'OSU bandwidth test'

    @run_before('run')
    def prepare_run(self):
        self.executable = os.path.join(
            self.osu_binaries.stagedir,
            self.osu_binaries.build_prefix,
            'mpi', 'pt2pt', 'osu_bw'
        )
        self.executable_opts = ['-x', '100', '-i', '1000']

    @performance_function('MB/s')
    def bandwidth(self):
        return sn.extractsingle(r'^4194304\s+(\S+)',
                                self.stdout, 1, float)


@rfm.simple_test
class osu_allreduce_test(OSUBenchmarkTestBase):
    mpi_tasks = parameter(1 << i for i in range(1, 5))
    descr = 'OSU Allreduce test'

    @run_before('run')
    def set_executable(self):
        self.num_tasks = self.mpi_tasks
        self.executable = os.path.join(
            self.osu_binaries.stagedir,
            self.osu_binaries.build_prefix,
            'mpi', 'collective', 'osu_allreduce'
        )
        self.executable_opts = ['-m', '8', '-x', '1000', '-i', '20000']

    @performance_function('us')
    def latency(self):
        return sn.extractsingle(r'^8\s+(\S+)', self.stdout, 1, float)

Let’s start with the leaf tests, i.e., the tests that actually execute the benchmarks (osu_latency_test, osu_bandwidth_test and osu_allreduce_test). As in the dependencies example, all these tests derive from OSUBenchmarkTestBase, where we define a fixture that will take care of building the benchmark binaries:

    osu_binaries = fixture(build_osu_benchmarks, scope='environment')

A test defines a fixture using the fixture() builtin and binds it to a name by assigning the return value of the builtin to a test variable, here osu_binaries. This name will be used later to access the resource managed by the fixture.

As stated previously, a fixture is another full-fledged ReFrame test, here build_osu_benchmarks, which will take care of building the OSU benchmarks. Each fixture is associated with a scope, which determines at which level the fixture is shared with other tests. There are four fixture scopes, listed below in decreasing order of generality:

  • session: A fixture with this scope will be executed once per ReFrame run session and will be shared across the whole run.

  • partition: A fixture with this scope will be executed once per partition and will be shared across all tests that run in that partition.

  • environment: A fixture with this scope will be executed once per partition and environment combination and will be shared across all tests that run with this partition and environment combination.

  • test: A fixture with this scope is private to the test and will be executed for each test case.

In this example, we need to build the OSU benchmarks once for each partition and environment combination, so we use the environment scope.
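
To make the effect of the scope argument concrete, here is a minimal, self-contained sketch that is not part of the OSU example; the prepare_dataset and dataset_consumer test names are hypothetical. It declares a fixture at session scope and shows how the fixture instance is accessed after the setup stage; swapping the scope for 'partition', 'environment' or 'test' changes only how many copies of the fixture will run:

import reframe as rfm
import reframe.utility.sanity as sn


class prepare_dataset(rfm.RunOnlyRegressionTest):
    '''Hypothetical fixture test that stages some input data.'''
    executable = 'echo'
    executable_opts = ['dataset ready']
    local = True

    @sanity_function
    def validate_staging(self):
        return sn.assert_found(r'dataset ready', self.stdout)


@rfm.simple_test
class dataset_consumer(rfm.RunOnlyRegressionTest):
    valid_systems = ['*']
    valid_prog_environs = ['*']
    executable = 'ls'

    # One copy of the fixture for the whole run session; 'partition',
    # 'environment' or 'test' would create one copy per partition, per
    # partition/environment combination or per test case, respectively.
    dataset = fixture(prepare_dataset, scope='session')

    @run_before('run')
    def access_fixture(self):
        # After the setup stage, `self.dataset` is the executed fixture test,
        # so all of its attributes (e.g. its stage directory) are accessible.
        self.executable_opts = [self.dataset.stagedir]

    @sanity_function
    def validate_run(self):
        return sn.assert_eq(self.job.exitcode, 0)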

Accessing the fixture is very straightforward. The fixture’s result is accessible after the setup pipeline stage through the corresponding variable in the test that defines it. Since a fixture is a standard ReFrame test, you can access any of its attributes. The individual benchmarks do exactly that:

    @run_before('run')
    def prepare_run(self):
        self.executable = os.path.join(
            self.osu_binaries.stagedir,
            self.osu_binaries.build_prefix,
            'mpi', 'pt2pt', 'osu_latency'
        )
        self.executable_opts = ['-x', '100', '-i', '1000']

Here we construct the final executable path by accessing the standard stagedir attribute of the fixture as well as the custom-defined build_prefix variable of the build_osu_benchmarks fixture. With the default version 5.6.2, build_prefix is osu-micro-benchmarks-5.6.2, so the executable resolves to the fixture’s stage directory followed by osu-micro-benchmarks-5.6.2/mpi/pt2pt/osu_latency.

Let’s inspect now the build_osu_benchmarks fixture:

class build_osu_benchmarks(rfm.CompileOnlyRegressionTest):
    descr = 'Build OSU benchmarks'
    build_system = 'Autotools'
    build_prefix = variable(str)
    # rfmdocstart: osu-benchmarks
    osu_benchmarks = fixture(fetch_osu_benchmarks, scope='session')
    # rfmdocend: osu-benchmarks

    @run_before('compile')
    def prepare_build(self):
        tarball = f'osu-micro-benchmarks-{self.osu_benchmarks.version}.tar.gz'
        self.build_prefix = tarball[:-7]  # remove .tar.gz extension

        fullpath = os.path.join(self.osu_benchmarks.stagedir, tarball)
        self.prebuild_cmds = [
            f'cp {fullpath} {self.stagedir}',
            f'tar xzf {tarball}',
            f'cd {self.build_prefix}'
        ]
        self.build_system.max_concurrency = 8

    @sanity_function
    def validate_build(self):
        # If compilation fails, the test would fail in any case, so nothing to
        # further validate here.
        return True

This is an ordinary ReFrame test, except that it is not decorated with the @rfm.simple_test decorator. This means that the test will only be executed if it is a fixture of another test. If it were decorated, it would be executed both as a standalone test and as a fixture of other tests. Another detail is that this test does not define the valid_systems and valid_prog_environs variables; fixtures inherit those variables from the test that owns them, depending on their scope.
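
If you did want such a build test to also run on its own, you could register a decorated variant. The following sketch is not part of the tutorial; it assumes it lives in the same file as build_osu_benchmarks, and the valid_systems and valid_prog_environs values are placeholders mirroring the runtime tests:

@rfm.simple_test
class build_osu_benchmarks_standalone(build_osu_benchmarks):
    # When registered as a standalone test, the class must declare where it
    # is allowed to run; as a pure fixture, these are inherited from the
    # test that owns it, according to the fixture's scope.
    valid_systems = ['daint:gpu']
    valid_prog_environs = ['gnu', 'pgi', 'intel']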

Similarly to OSUBenchmarkTestBase, this test uses a fixture that fetches the OSU benchmark sources. We could have fetched the sources directly in this test, but we separate the two steps primarily for demonstration purposes; keeping them separate would also make sense in practice if the download were slow.

The osu_benchmarks fixture is defined at session scope, since we only need to download the benchmarks once for the whole session:

    osu_benchmarks = fixture(fetch_osu_benchmarks, scope='session')

The rest of the test is very straightforward.

Let’s inspect the last fixture, the fetch_osu_benchmarks:

class fetch_osu_benchmarks(rfm.RunOnlyRegressionTest):
    descr = 'Fetch OSU benchmarks'
    version = variable(str, value='5.6.2')
    executable = 'wget'
    executable_opts = [
        f'http://mvapich.cse.ohio-state.edu/download/mvapich/osu-micro-benchmarks-{version}.tar.gz'  # noqa: E501
    ]
    local = True

    @sanity_function
    def validate_download(self):
        return sn.assert_eq(self.job.exitcode, 0)

There is nothing special about this test, which is just an ordinary test, except that we force it to execute locally by setting its local variable. The reason is that a fixture with session scope can run with any partition/environment combination, so if ReFrame happened to pick a remote partition, it would have to submit a job simply to download a file. To avoid this, we force the fixture to execute locally regardless of the chosen partition.

It is now time to run the new tests, but let us first list them:

reframe -c tutorials/fixtures/osu_benchmarks.py -l
[ReFrame Setup]
  version:           3.10.0-dev.3+605af31a
  command:           './bin/reframe -c tutorials/fixtures/osu_benchmarks.py -l'
  launched by:       user@host
  working directory: '/home/user/Devel/reframe'
  settings file:     '/home/user/Devel/reframe/tutorials/config/settings.py'
  check search path: '/home/user/Devel/reframe/tutorials/fixtures/osu_benchmarks.py'
  stage directory:   '/home/user/Devel/reframe/stage'
  output directory:  '/home/user/Devel/reframe/output'

[List of matched checks]
- osu_allreduce_test %mpi_tasks=16
    ^build_osu_benchmarks ~daint:gpu+gnu
      ^fetch_osu_benchmarks ~daint
    ^build_osu_benchmarks ~daint:gpu+intel
      ^fetch_osu_benchmarks ~daint
    ^build_osu_benchmarks ~daint:gpu+pgi
      ^fetch_osu_benchmarks ~daint
- osu_allreduce_test %mpi_tasks=8
    ^build_osu_benchmarks ~daint:gpu+gnu
      ^fetch_osu_benchmarks ~daint
    ^build_osu_benchmarks ~daint:gpu+intel
      ^fetch_osu_benchmarks ~daint
    ^build_osu_benchmarks ~daint:gpu+pgi
      ^fetch_osu_benchmarks ~daint
- osu_allreduce_test %mpi_tasks=4
    ^build_osu_benchmarks ~daint:gpu+gnu
      ^fetch_osu_benchmarks ~daint
    ^build_osu_benchmarks ~daint:gpu+intel
      ^fetch_osu_benchmarks ~daint
    ^build_osu_benchmarks ~daint:gpu+pgi
      ^fetch_osu_benchmarks ~daint
- osu_allreduce_test %mpi_tasks=2
    ^build_osu_benchmarks ~daint:gpu+gnu
      ^fetch_osu_benchmarks ~daint
    ^build_osu_benchmarks ~daint:gpu+intel
      ^fetch_osu_benchmarks ~daint
    ^build_osu_benchmarks ~daint:gpu+pgi
      ^fetch_osu_benchmarks ~daint
- osu_bandwidth_test
    ^build_osu_benchmarks ~daint:gpu+gnu
      ^fetch_osu_benchmarks ~daint
    ^build_osu_benchmarks ~daint:gpu+intel
      ^fetch_osu_benchmarks ~daint
    ^build_osu_benchmarks ~daint:gpu+pgi
      ^fetch_osu_benchmarks ~daint
- osu_latency_test
    ^build_osu_benchmarks ~daint:gpu+gnu
      ^fetch_osu_benchmarks ~daint
    ^build_osu_benchmarks ~daint:gpu+intel
      ^fetch_osu_benchmarks ~daint
    ^build_osu_benchmarks ~daint:gpu+pgi
      ^fetch_osu_benchmarks ~daint
Found 6 check(s)

Log file(s) saved in '/tmp/rfm-eopdze64.log'

Notice how the build_osu_benchmarks fixture is populated three times, once for each partition and environment combination, whereas fetch_osu_benchmarks is generated only once. The following figure shows the conceptual dependencies of osu_bandwidth_test.

_images/fixtures-conceptual-deps.svg

Expanded fixtures and dependencies for the OSU benchmarks example.

A scope suffix is appended to the base name of each fixture; in this figure it is shown in red.

Under the hood, fixtures use the test dependency mechanism described in How Test Dependencies Work In ReFrame. The dependencies listed by default and shown in the previous figure are conceptual. Depending on the available partitions and environments, tests and fixtures may be concretized differently. Fixtures, in particular, are more flexible in how they are concretized, depending on their scope. The following listing and figure show the concretization of osu_bandwidth_test:

reframe -c tutorials/fixtures/osu_benchmarks.py -n osu_bandwidth_test -lC
[ReFrame Setup]
  version:           3.10.0-dev.3+605af31a
  command:           './bin/reframe -c tutorials/fixtures/osu_benchmarks.py -n osu_bandwidth_test -lC'
  launched by:       user@host
  working directory: '/home/user/Devel/reframe'
  settings file:     '/home/user/Devel/reframe/tutorials/config/settings.py'
  check search path: '/home/user/Devel/reframe/tutorials/fixtures/osu_benchmarks.py'
  stage directory:   '/home/user/Devel/reframe/stage'
  output directory:  '/home/user/Devel/reframe/output'

[List of matched checks]
- osu_bandwidth_test @daint:gpu+gnu
    ^build_osu_benchmarks ~daint:gpu+gnu @daint:gpu+gnu
      ^fetch_osu_benchmarks ~daint @daint:gpu+gnu
- osu_bandwidth_test @daint:gpu+intel
    ^build_osu_benchmarks ~daint:gpu+intel @daint:gpu+intel
      ^fetch_osu_benchmarks ~daint @daint:gpu+gnu
- osu_bandwidth_test @daint:gpu+pgi
    ^build_osu_benchmarks ~daint:gpu+pgi @daint:gpu+pgi
      ^fetch_osu_benchmarks ~daint @daint:gpu+gnu
Concretized 7 test case(s)

Log file(s) saved in '/tmp/rfm-uza91jj1.log'
_images/fixtures-actual-deps.svg

The actual dependencies for the OSU benchmarks example using fixtures.

The first thing to notice here is how the individual test cases of osu_bandwidth_test depend only on the specific fixtures for their scope: when osu_bandwidth_test runs on the daint:gpu partition with the gnu compiler, it depends only on the build_osu_benchmarks~daint:gpu+gnu fixture. The second thing to notice is where the fetch_osu_benchmarks~daint fixture will run. Since this is a session fixture, ReFrame has arbitrarily chosen to run it on daint:gpu with the gnu environment; a session fixture can run on any valid partition/environment combination. The following listing and figure show how the test dependency DAG is concretized when we restrict the valid programming environments from the command line using -p pgi.

reframe -c tutorials/fixtures/osu_benchmarks.py -n osu_bandwidth_test -lC -p pgi
[ReFrame Setup]
  version:           3.10.0-dev.3+605af31a
  command:           './bin/reframe -c tutorials/fixtures/osu_benchmarks.py -n osu_bandwidth_test -lC -p pgi'
  launched by:       user@host
  working directory: '/home/user/Devel/reframe'
  settings file:     '/home/user/Devel/reframe/tutorials/config/settings.py'
  check search path: '/home/user/Devel/reframe/tutorials/fixtures/osu_benchmarks.py'
  stage directory:   '/home/user/Devel/reframe/stage'
  output directory:  '/home/user/Devel/reframe/output'

[List of matched checks]
- osu_bandwidth_test @daint:gpu+pgi
    ^build_osu_benchmarks ~daint:gpu+pgi @daint:gpu+pgi
      ^fetch_osu_benchmarks ~daint @daint:gpu+pgi
Concretized 3 test case(s)

Log file(s) saved in '/tmp/rfm-dnfdagj8.log'
_images/fixtures-actual-deps-scoped.svg

The dependency graph concretized for the ‘pgi’ environment only.

Notice how the fetch_osu_benchmarks~daint fixture is now selected to run in the only valid partition/environment combination. This is an important difference compared to the same example written using raw dependencies in How Test Dependencies Work In ReFrame, where, in order to avoid unresolved dependencies, we had to pin down the valid programming environments of the test that fetches the sources. Fixtures do not need that, since you can impose less strict constraints simply by choosing their scope accordingly.

Finally, let’s run all the benchmarks at once:

reframe -c tutorials/fixtures/osu_benchmarks.py -r
[ReFrame Setup]
  version:           3.10.0-dev.3+76e02667
  command:           './bin/reframe -c tutorials/fixtures/osu_benchmarks.py -r'
  launched by:       user@host
  working directory: '/home/user/Devel/reframe'
  settings file:     '/home/user/Devel/reframe/tutorials/config/settings.py'
  check search path: '/home/user/Devel/reframe/tutorials/fixtures/osu_benchmarks.py'
  stage directory:   '/home/user/Devel/reframe/stage'
  output directory:  '/home/user/Devel/reframe/output'

[==========] Running 10 check(s)
[==========] Started on Sat Jan 22 23:08:13 2022

[----------] start processing checks
[ RUN      ] fetch_osu_benchmarks ~daint @daint:gpu+gnu
[       OK ] ( 1/22) fetch_osu_benchmarks ~daint @daint:gpu+gnu
[ RUN      ] build_osu_benchmarks ~daint:gpu+gnu @daint:gpu+gnu
[ RUN      ] build_osu_benchmarks ~daint:gpu+intel @daint:gpu+intel
[ RUN      ] build_osu_benchmarks ~daint:gpu+pgi @daint:gpu+pgi
[       OK ] ( 2/22) build_osu_benchmarks ~daint:gpu+gnu @daint:gpu+gnu
[ RUN      ] osu_allreduce_test %mpi_tasks=16 @daint:gpu+gnu
[ RUN      ] osu_allreduce_test %mpi_tasks=8 @daint:gpu+gnu
[ RUN      ] osu_allreduce_test %mpi_tasks=4 @daint:gpu+gnu
[ RUN      ] osu_allreduce_test %mpi_tasks=2 @daint:gpu+gnu
[ RUN      ] osu_bandwidth_test @daint:gpu+gnu
[ RUN      ] osu_latency_test @daint:gpu+gnu
[       OK ] ( 3/22) build_osu_benchmarks ~daint:gpu+intel @daint:gpu+intel
[       OK ] ( 4/22) build_osu_benchmarks ~daint:gpu+pgi @daint:gpu+pgi
[ RUN      ] osu_allreduce_test %mpi_tasks=16 @daint:gpu+intel
[ RUN      ] osu_allreduce_test %mpi_tasks=16 @daint:gpu+pgi
[ RUN      ] osu_allreduce_test %mpi_tasks=8 @daint:gpu+intel
[ RUN      ] osu_allreduce_test %mpi_tasks=8 @daint:gpu+pgi
[ RUN      ] osu_allreduce_test %mpi_tasks=4 @daint:gpu+intel
[ RUN      ] osu_allreduce_test %mpi_tasks=4 @daint:gpu+pgi
[ RUN      ] osu_allreduce_test %mpi_tasks=2 @daint:gpu+intel
[ RUN      ] osu_allreduce_test %mpi_tasks=2 @daint:gpu+pgi
[ RUN      ] osu_bandwidth_test @daint:gpu+intel
[ RUN      ] osu_bandwidth_test @daint:gpu+pgi
[ RUN      ] osu_latency_test @daint:gpu+intel
[ RUN      ] osu_latency_test @daint:gpu+pgi
[       OK ] ( 5/22) osu_allreduce_test %mpi_tasks=16 @daint:gpu+gnu
[       OK ] ( 6/22) osu_allreduce_test %mpi_tasks=4 @daint:gpu+gnu
[       OK ] ( 7/22) osu_allreduce_test %mpi_tasks=2 @daint:gpu+gnu
[       OK ] ( 8/22) osu_allreduce_test %mpi_tasks=8 @daint:gpu+gnu
[       OK ] ( 9/22) osu_bandwidth_test @daint:gpu+gnu
[       OK ] (10/22) osu_allreduce_test %mpi_tasks=4 @daint:gpu+pgi
[       OK ] (11/22) osu_allreduce_test %mpi_tasks=2 @daint:gpu+intel
[       OK ] (12/22) osu_allreduce_test %mpi_tasks=4 @daint:gpu+intel
[       OK ] (13/22) osu_allreduce_test %mpi_tasks=2 @daint:gpu+pgi
[       OK ] (14/22) osu_latency_test @daint:gpu+intel
[       OK ] (15/22) osu_latency_test @daint:gpu+pgi
[       OK ] (16/22) osu_allreduce_test %mpi_tasks=16 @daint:gpu+intel
[       OK ] (17/22) osu_allreduce_test %mpi_tasks=16 @daint:gpu+pgi
[       OK ] (18/22) osu_allreduce_test %mpi_tasks=8 @daint:gpu+pgi
[       OK ] (19/22) osu_latency_test @daint:gpu+gnu
[       OK ] (20/22) osu_bandwidth_test @daint:gpu+intel
[       OK ] (21/22) osu_bandwidth_test @daint:gpu+pgi
[       OK ] (22/22) osu_allreduce_test %mpi_tasks=8 @daint:gpu+intel
[----------] all spawned checks have finished

[  PASSED  ] Ran 22/22 test case(s) from 10 check(s) (0 failure(s), 0 skipped)
[==========] Finished on Sat Jan 22 23:13:40 2022
Run report saved in '/home/user/.reframe/reports/run-report.json'
Log file(s) saved in '/tmp/rfm-6gbw7qzs.log'

Tip

A reasonable question is how to choose between fixtures and dependencies.

The rule of thumb is: use fixtures if your test needs to use any resource of the target test, and use dependencies if you simply want to impose an ordering on the execution of your tests.
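
For contrast, here is a minimal sketch of the pure ordering case, using two hypothetical tests that are not part of the tutorial: the second test only declares that it must run after the first via depends_on() and never touches any of its resources, so a plain dependency is enough and no fixture is needed.

import reframe as rfm
import reframe.utility.sanity as sn


@rfm.simple_test
class cleanup_scratch(rfm.RunOnlyRegressionTest):
    '''Hypothetical test that must complete before the benchmark runs.'''
    valid_systems = ['*']
    valid_prog_environs = ['*']
    executable = 'echo'
    executable_opts = ['scratch cleaned']

    @sanity_function
    def validate_cleanup(self):
        return sn.assert_found(r'scratch cleaned', self.stdout)


@rfm.simple_test
class ordered_benchmark(rfm.RunOnlyRegressionTest):
    valid_systems = ['*']
    valid_prog_environs = ['*']
    executable = 'echo'
    executable_opts = ['running benchmark']

    def __init__(self):
        # Impose an execution order only: run after cleanup_scratch has
        # finished, without accessing any of its resources.
        self.depends_on('cleanup_scratch')

    @sanity_function
    def validate_benchmark(self):
        return sn.assert_found(r'running benchmark', self.stdout)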