Tutorial 3: Using Dependencies in ReFrame Tests

New in version 2.21.

A ReFrame test may define dependencies on other tests. A typical scenario is testing different runtime configurations of a benchmark that you need to compile first, or running a scaling analysis of a code. In such cases, you don’t want to download and rebuild the sources for each runtime configuration. Instead, you could have a single test that fetches the sources and on which all build tests depend; similarly, each runtime test would depend on its corresponding build test. This is the approach we take in the following example, which fetches, builds and runs several OSU benchmarks. We first create a basic run-only test that fetches the benchmarks:

cat tutorials/deps/osu_benchmarks.py
@rfm.simple_test
class OSUDownloadTest(rfm.RunOnlyRegressionTest):
    descr = 'OSU benchmarks download sources'
    valid_systems = ['daint:login']
    valid_prog_environs = ['builtin']
    executable = 'wget'
    executable_opts = [
        'http://mvapich.cse.ohio-state.edu/download/mvapich/osu-micro-benchmarks-5.6.2.tar.gz'  # noqa: E501
    ]
    postrun_cmds = [
        'tar xzf osu-micro-benchmarks-5.6.2.tar.gz'
    ]

    @sanity_function
    def validate_download(self):
        return sn.assert_true(os.path.exists('osu-micro-benchmarks-5.6.2'))

This test doesn’t need any specific programming environment, so we simply pick the builtin environment in the login partition. The build tests would then copy the benchmark code and build it for the different programming environments:

@rfm.simple_test
class OSUBuildTest(rfm.CompileOnlyRegressionTest):
    descr = 'OSU benchmarks build test'
    valid_systems = ['daint:gpu']
    valid_prog_environs = ['gnu', 'nvidia', 'intel']
    build_system = 'Autotools'

    # rfmdocstart: inject_deps
    @run_after('init')
    def inject_dependencies(self):
        self.depends_on('OSUDownloadTest', udeps.fully)
    # rfmdocend: inject_deps

    # rfmdocstart: set_sourcedir
    @require_deps
    def set_sourcedir(self, OSUDownloadTest):
        self.sourcesdir = os.path.join(
            OSUDownloadTest(part='login', environ='builtin').stagedir,
            'osu-micro-benchmarks-5.6.2'
        )
    # rfmdocend: set_sourcedir

    @run_before('compile')
    def set_build_system_attrs(self):
        self.build_system.max_concurrency = 8

    @sanity_function
    def validate_build(self):
        return sn.assert_not_found('error', self.stderr)

The only new element introduced by the OSUBuildTest is the following:

    @run_after('init')
    def inject_dependencies(self):
        self.depends_on('OSUDownloadTest', udeps.fully)

Here we tell ReFrame that this test depends on a test named OSUDownloadTest. This test may or may not be defined in the same test file; all ReFrame needs is the test name. The depends_on() function will create dependencies between the individual test cases of the OSUBuildTest and the OSUDownloadTest, such that all the test cases of the OSUBuildTest will depend on the outcome of the OSUDownloadTest. This behaviour can be changed and is covered in detail in How Test Dependencies Work In ReFrame. You can create arbitrary test dependency graphs, but they must be acyclic. If ReFrame detects cyclic dependencies, it will refuse to execute the set of tests and will issue an error pointing out the cycle.
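To picture the ordering and the cycle detection, here is a small standalone sketch (not ReFrame’s actual implementation) using Python’s standard graphlib module on the dependency chain of this tutorial:

```python
from graphlib import TopologicalSorter, CycleError

# Edges map each test to the tests it depends on, mirroring the
# depends_on() calls of this tutorial.
deps = {
    'OSUBuildTest': {'OSUDownloadTest'},
    'OSULatencyTest': {'OSUBuildTest'},
}

# static_order() yields dependencies before their dependents
order = list(TopologicalSorter(deps).static_order())

# A cyclic graph is rejected, just as ReFrame refuses to run one
cycle_detected = False
try:
    list(TopologicalSorter({'A': {'B'}, 'B': {'A'}}).static_order())
except CycleError:
    cycle_detected = True
```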

A ReFrame test with dependencies will execute, i.e., enter its “setup” stage, only after all of its dependencies have succeeded. If any of its dependencies fails, the current test will be marked as a failure as well.

The next step for the OSUBuildTest is to set its sourcesdir to point to the source code that was fetched by the OSUDownloadTest. This is achieved with the following specially decorated function:

    @require_deps
    def set_sourcedir(self, OSUDownloadTest):
        self.sourcesdir = os.path.join(
            OSUDownloadTest(part='login', environ='builtin').stagedir,
            'osu-micro-benchmarks-5.6.2'
        )

The @require_deps decorator binds each argument of the decorated function to the corresponding target dependency. For the binding to work correctly, the function arguments must be named after the target dependencies. Referring to a dependency by the test’s name alone is not enough, since a test might be associated with multiple partitions and programming environments. For this reason, each dependency argument is actually bound to a function that accepts as arguments the name of the target partition and the target programming environment. If no arguments are passed, the current partition and programming environment are implied, such that OSUDownloadTest() is equivalent to OSUDownloadTest(part=self.current_partition.name, environ=self.current_environ.name). In this case, since both the partition and the environment of the target dependency do not match those of the current test, we need to specify both.

This call returns the actual test case of the dependency that has been executed. This allows you to access any attribute of the target test, as we do in this example by accessing the target test’s stage directory, which we use to construct the sourcesdir of our test.

For the next test we need the OSU benchmark binaries that we just built, in order to run the MPI ping-pong benchmark. Here is the relevant part:

class OSUBenchmarkTestBase(rfm.RunOnlyRegressionTest):
    '''Base class of OSU benchmarks runtime tests'''

    valid_systems = ['daint:gpu']
    valid_prog_environs = ['gnu', 'nvidia', 'intel']
    sourcesdir = None
    num_tasks = 2
    num_tasks_per_node = 1

    # rfmdocstart: set_deps
    @run_after('init')
    def set_dependencies(self):
        self.depends_on('OSUBuildTest', udeps.by_env)
    # rfmdocend: set_deps

    @sanity_function
    def validate_test(self):
        return sn.assert_found(r'^8', self.stdout)


@rfm.simple_test
class OSULatencyTest(OSUBenchmarkTestBase):
    descr = 'OSU latency test'

    # rfmdocstart: set_exec
    @require_deps
    def set_executable(self, OSUBuildTest):
        self.executable = os.path.join(
            OSUBuildTest().stagedir,
            'mpi', 'pt2pt', 'osu_latency'
        )
        self.executable_opts = ['-x', '100', '-i', '1000']
    # rfmdocend: set_exec

    @performance_function('us')
    def latency(self):
        return sn.extractsingle(r'^8\s+(\S+)', self.stdout, 1, float)

First, since we will have multiple similar benchmarks, we move all the common functionality to the OSUBenchmarkTestBase base class. Again, nothing new here: we are going to use two nodes for the benchmark and we set sourcesdir to None, since none of the benchmark tests needs any additional resources. As done previously, we define the dependencies with the following:

    @run_after('init')
    def set_dependencies(self):
        self.depends_on('OSUBuildTest', udeps.by_env)

Here we tell ReFrame that this test depends on a test named OSUBuildTest “by environment.” This means that the test cases of this test will depend only on the test cases of the OSUBuildTest that use the same environment; the partitions may differ.
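The “by environment” pairing can be illustrated with a small standalone sketch; the helper below is hypothetical and only mimics the semantics of udeps.by_env:

```python
# Test cases as (partition, environment) pairs, as in this tutorial
run_cases   = [('daint:gpu', 'gnu'), ('daint:gpu', 'intel'), ('daint:gpu', 'nvidia')]
build_cases = [('daint:gpu', 'gnu'), ('daint:gpu', 'intel'), ('daint:gpu', 'nvidia')]

def by_env_edges(sources, targets):
    '''Connect cases that use the same environment; partitions may differ.'''
    return [(s, t) for s in sources for t in targets if s[1] == t[1]]

# Each run case depends only on the build case with the same environment
edges = by_env_edges(run_cases, build_cases)
```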

The next step for the OSULatencyTest is to set its executable to point to the binary produced by the OSUBuildTest. This is achieved with the following specially decorated function:

    @require_deps
    def set_executable(self, OSUBuildTest):
        self.executable = os.path.join(
            OSUBuildTest().stagedir,
            'mpi', 'pt2pt', 'osu_latency'
        )
        self.executable_opts = ['-x', '100', '-i', '1000']

This concludes the presentation of the OSULatencyTest test. The OSUBandwidthTest is completely analogous.

The OSUAllreduceTest shown below is similar to the other two, except that it is parameterized. It is essentially a scalability test that runs the osu_allreduce executable created by the OSUBuildTest on 2, 4, 8 and 16 nodes.

@rfm.simple_test
class OSUAllreduceTest(OSUBenchmarkTestBase):
    mpi_tasks = parameter(1 << i for i in range(1, 5))
    descr = 'OSU Allreduce test'

    @run_after('init')
    def set_num_tasks(self):
        self.num_tasks = self.mpi_tasks

    @require_deps
    def set_executable(self, OSUBuildTest):
        self.executable = os.path.join(
            OSUBuildTest().stagedir,
            'mpi', 'collective', 'osu_allreduce'
        )
        self.executable_opts = ['-m', '8', '-x', '1000', '-i', '20000']

    @performance_function('us')
    def latency(self):
        return sn.extractsingle(r'^8\s+(\S+)', self.stdout, 1, float)
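The mpi_tasks parameter above uses bit shifts to generate powers of two; evaluated on its own, the expression yields the task counts mentioned in the text:

```python
# 1 << i doubles for each increment of i, so range(1, 5) yields 2, 4, 8, 16
mpi_tasks_values = [1 << i for i in range(1, 5)]
print(mpi_tasks_values)  # [2, 4, 8, 16]
```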

The full set of OSU example tests is shown below:

# Copyright 2016-2022 Swiss National Supercomputing Centre (CSCS/ETH Zurich)
# ReFrame Project Developers. See the top-level LICENSE file for details.
#
# SPDX-License-Identifier: BSD-3-Clause

import os

import reframe as rfm
import reframe.utility.sanity as sn
import reframe.utility.udeps as udeps


# rfmdocstart: osupingpong
class OSUBenchmarkTestBase(rfm.RunOnlyRegressionTest):
    '''Base class of OSU benchmarks runtime tests'''

    valid_systems = ['daint:gpu']
    valid_prog_environs = ['gnu', 'nvidia', 'intel']
    sourcesdir = None
    num_tasks = 2
    num_tasks_per_node = 1

    # rfmdocstart: set_deps
    @run_after('init')
    def set_dependencies(self):
        self.depends_on('OSUBuildTest', udeps.by_env)
    # rfmdocend: set_deps

    @sanity_function
    def validate_test(self):
        return sn.assert_found(r'^8', self.stdout)


@rfm.simple_test
class OSULatencyTest(OSUBenchmarkTestBase):
    descr = 'OSU latency test'

    # rfmdocstart: set_exec
    @require_deps
    def set_executable(self, OSUBuildTest):
        self.executable = os.path.join(
            OSUBuildTest().stagedir,
            'mpi', 'pt2pt', 'osu_latency'
        )
        self.executable_opts = ['-x', '100', '-i', '1000']
    # rfmdocend: set_exec

    @performance_function('us')
    def latency(self):
        return sn.extractsingle(r'^8\s+(\S+)', self.stdout, 1, float)
# rfmdocend: osupingpong


@rfm.simple_test
class OSUBandwidthTest(OSUBenchmarkTestBase):
    descr = 'OSU bandwidth test'

    @require_deps
    def set_executable(self, OSUBuildTest):
        self.executable = os.path.join(
            OSUBuildTest().stagedir,
            'mpi', 'pt2pt', 'osu_bw'
        )
        self.executable_opts = ['-x', '100', '-i', '1000']

    @performance_function('MB/s')
    def bandwidth(self):
        return sn.extractsingle(r'^4194304\s+(\S+)',
                                self.stdout, 1, float)


# rfmdocstart: osuallreduce
@rfm.simple_test
class OSUAllreduceTest(OSUBenchmarkTestBase):
    mpi_tasks = parameter(1 << i for i in range(1, 5))
    descr = 'OSU Allreduce test'

    @run_after('init')
    def set_num_tasks(self):
        self.num_tasks = self.mpi_tasks

    @require_deps
    def set_executable(self, OSUBuildTest):
        self.executable = os.path.join(
            OSUBuildTest().stagedir,
            'mpi', 'collective', 'osu_allreduce'
        )
        self.executable_opts = ['-m', '8', '-x', '1000', '-i', '20000']

    @performance_function('us')
    def latency(self):
        return sn.extractsingle(r'^8\s+(\S+)', self.stdout, 1, float)
# rfmdocend: osuallreduce


# rfmdocstart: osubuild
@rfm.simple_test
class OSUBuildTest(rfm.CompileOnlyRegressionTest):
    descr = 'OSU benchmarks build test'
    valid_systems = ['daint:gpu']
    valid_prog_environs = ['gnu', 'nvidia', 'intel']
    build_system = 'Autotools'

    # rfmdocstart: inject_deps
    @run_after('init')
    def inject_dependencies(self):
        self.depends_on('OSUDownloadTest', udeps.fully)
    # rfmdocend: inject_deps

    # rfmdocstart: set_sourcedir
    @require_deps
    def set_sourcedir(self, OSUDownloadTest):
        self.sourcesdir = os.path.join(
            OSUDownloadTest(part='login', environ='builtin').stagedir,
            'osu-micro-benchmarks-5.6.2'
        )
    # rfmdocend: set_sourcedir

    @run_before('compile')
    def set_build_system_attrs(self):
        self.build_system.max_concurrency = 8

    @sanity_function
    def validate_build(self):
        return sn.assert_not_found('error', self.stderr)
# rfmdocend: osubuild


# rfmdocstart: osudownload
@rfm.simple_test
class OSUDownloadTest(rfm.RunOnlyRegressionTest):
    descr = 'OSU benchmarks download sources'
    valid_systems = ['daint:login']
    valid_prog_environs = ['builtin']
    executable = 'wget'
    executable_opts = [
        'http://mvapich.cse.ohio-state.edu/download/mvapich/osu-micro-benchmarks-5.6.2.tar.gz'  # noqa: E501
    ]
    postrun_cmds = [
        'tar xzf osu-micro-benchmarks-5.6.2.tar.gz'
    ]

    @sanity_function
    def validate_download(self):
        return sn.assert_true(os.path.exists('osu-micro-benchmarks-5.6.2'))
# rfmdocend: osudownload

Notice that the order in which dependencies are defined in a test file is irrelevant; in this case, we define the OSUBuildTest at the end of the file. ReFrame will sort the tests properly and execute them in the right order.

Here is the output when running the OSU tests with the asynchronous execution policy:

./bin/reframe -c tutorials/deps/osu_benchmarks.py -r
[ReFrame Setup]
  version:           3.10.0-dev.3+605af31a
  command:           './bin/reframe -c tutorials/deps/osu_benchmarks.py -r'
  launched by:       user@host
  working directory: '/home/user/Devel/reframe'
  settings file:     '/home/user/Devel/reframe/tutorials/config/settings.py'
  check search path: '/home/user/Devel/reframe/tutorials/deps/osu_benchmarks.py'
  stage directory:   '/home/user/Devel/reframe/stage'
  output directory:  '/home/user/Devel/reframe/output'

[==========] Running 8 check(s)
[==========] Started on Sat Jan 22 22:49:00 2022

[----------] start processing checks
[ RUN      ] OSUDownloadTest @daint:login+builtin
[       OK ] ( 1/22) OSUDownloadTest @daint:login+builtin
[ RUN      ] OSUBuildTest @daint:gpu+gnu
[ RUN      ] OSUBuildTest @daint:gpu+intel
[ RUN      ] OSUBuildTest @daint:gpu+nvidia
[       OK ] ( 2/22) OSUBuildTest @daint:gpu+gnu
[ RUN      ] OSUAllreduceTest %mpi_tasks=16 @daint:gpu+gnu
[ RUN      ] OSUAllreduceTest %mpi_tasks=8 @daint:gpu+gnu
[ RUN      ] OSUAllreduceTest %mpi_tasks=4 @daint:gpu+gnu
[ RUN      ] OSUAllreduceTest %mpi_tasks=2 @daint:gpu+gnu
[ RUN      ] OSUBandwidthTest @daint:gpu+gnu
[ RUN      ] OSULatencyTest @daint:gpu+gnu
[       OK ] ( 3/22) OSUBuildTest @daint:gpu+intel
[       OK ] ( 4/22) OSUBuildTest @daint:gpu+nvidia
[ RUN      ] OSUAllreduceTest %mpi_tasks=16 @daint:gpu+intel
[ RUN      ] OSUAllreduceTest %mpi_tasks=16 @daint:gpu+nvidia
[ RUN      ] OSUAllreduceTest %mpi_tasks=8 @daint:gpu+intel
[ RUN      ] OSUAllreduceTest %mpi_tasks=8 @daint:gpu+nvidia
[ RUN      ] OSUAllreduceTest %mpi_tasks=4 @daint:gpu+intel
[ RUN      ] OSUAllreduceTest %mpi_tasks=4 @daint:gpu+nvidia
[ RUN      ] OSUAllreduceTest %mpi_tasks=2 @daint:gpu+intel
[ RUN      ] OSUAllreduceTest %mpi_tasks=2 @daint:gpu+nvidia
[ RUN      ] OSUBandwidthTest @daint:gpu+intel
[ RUN      ] OSUBandwidthTest @daint:gpu+nvidia
[ RUN      ] OSULatencyTest @daint:gpu+intel
[ RUN      ] OSULatencyTest @daint:gpu+nvidia
[       OK ] ( 5/22) OSUAllreduceTest %mpi_tasks=8 @daint:gpu+gnu
[       OK ] ( 6/22) OSUAllreduceTest %mpi_tasks=16 @daint:gpu+gnu
[       OK ] ( 7/22) OSUAllreduceTest %mpi_tasks=4 @daint:gpu+gnu
[       OK ] ( 8/22) OSULatencyTest @daint:gpu+gnu
[       OK ] ( 9/22) OSUAllreduceTest %mpi_tasks=2 @daint:gpu+gnu
[       OK ] (10/22) OSUAllreduceTest %mpi_tasks=16 @daint:gpu+intel
[       OK ] (11/22) OSUAllreduceTest %mpi_tasks=16 @daint:gpu+nvidia
[       OK ] (12/22) OSUAllreduceTest %mpi_tasks=8 @daint:gpu+intel
[       OK ] (13/22) OSUAllreduceTest %mpi_tasks=8 @daint:gpu+nvidia
[       OK ] (14/22) OSUBandwidthTest @daint:gpu+gnu
[       OK ] (15/22) OSUAllreduceTest %mpi_tasks=4 @daint:gpu+intel
[       OK ] (16/22) OSUAllreduceTest %mpi_tasks=2 @daint:gpu+nvidia
[       OK ] (17/22) OSUAllreduceTest %mpi_tasks=2 @daint:gpu+intel
[       OK ] (18/22) OSULatencyTest @daint:gpu+nvidia
[       OK ] (19/22) OSULatencyTest @daint:gpu+intel
[       OK ] (20/22) OSUAllreduceTest %mpi_tasks=4 @daint:gpu+nvidia
[       OK ] (21/22) OSUBandwidthTest @daint:gpu+intel
[       OK ] (22/22) OSUBandwidthTest @daint:gpu+nvidia
[----------] all spawned checks have finished

[  PASSED  ] Ran 22/22 test case(s) from 8 check(s) (0 failure(s), 0 skipped)
[==========] Finished on Sat Jan 22 22:54:26 2022
Run report saved in '/home/user/.reframe/reports/run-report.json'
Log file(s) saved in '/tmp/rfm-15ghvao1.log'

Before running the tests, ReFrame topologically sorts them based on their dependencies and schedules them using the selected execution policy. With the serial execution policy, ReFrame simply executes the tests to completion as they “arrive,” since they are already topologically sorted. With the asynchronous execution policy, tests are spawned and not waited for; if a test’s dependencies have not yet completed, the test will not start executing immediately.

ReFrame’s runtime takes care of cleaning up the resources of the tests while respecting their dependencies. Normally, when an individual test finishes successfully, its stage directory is cleaned up. However, if other tests depend on it, this would be catastrophic, since the dependent tests most probably need the outcome of this test. ReFrame avoids that by not cleaning up the stage directory of a test until all of its dependent tests have finished successfully.
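This cleanup policy can be pictured as reference counting on the dependency graph. The sketch below is purely illustrative, reusing the test names of this tutorial, and is not ReFrame’s actual implementation:

```python
# Number of not-yet-finished dependents per test (hypothetical bookkeeping)
dependents = {'OSUDownloadTest': 1, 'OSUBuildTest': 1, 'OSULatencyTest': 0}
# Map each test to the tests it depends on
deps = {'OSUBuildTest': ['OSUDownloadTest'], 'OSULatencyTest': ['OSUBuildTest']}

cleaned = []

def finish(test):
    '''Mark `test` as finished; clean up any test with no pending dependents.'''
    if dependents[test] == 0:
        cleaned.append(test)          # nobody needs its stage directory
    for target in deps.get(test, []):
        dependents[target] -= 1
        if dependents[target] == 0:
            cleaned.append(target)    # last dependent finished: safe to clean

finish('OSUDownloadTest')   # the build test still needs it: nothing cleaned
finish('OSUBuildTest')      # the download test can now be cleaned
finish('OSULatencyTest')    # the latency test and, transitively, the build test
```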

When selecting tests using the test filtering options, such as -t, -n, etc., ReFrame will automatically select any dependencies of these tests as well. For example, if we select only the OSULatencyTest for running, ReFrame will also select the OSUBuildTest and the OSUDownloadTest:

./bin/reframe -c tutorials/deps/osu_benchmarks.py -n OSULatencyTest -l
[ReFrame Setup]
  version:           3.10.0-dev.3+605af31a
  command:           './bin/reframe -c tutorials/deps/osu_benchmarks.py -n OSULatencyTest -l'
  launched by:       user@host
  working directory: '/home/user/Devel/reframe'
  settings file:     '/home/user/Devel/reframe/tutorials/config/settings.py'
  check search path: '/home/user/Devel/reframe/tutorials/deps/osu_benchmarks.py'
  stage directory:   '/home/user/Devel/reframe/stage'
  output directory:  '/home/user/Devel/reframe/output'

[List of matched checks]
- OSULatencyTest
    ^OSUBuildTest
      ^OSUDownloadTest
Found 3 check(s)

Log file(s) saved in '/tmp/rfm-zc483csf.log'

Finally, when ReFrame cannot resolve a dependency of a test, it will issue a warning and skip all the test cases that recursively depend on the unresolved one. In the following example, we restrict the run of the OSULatencyTest to the daint:gpu partition. This is problematic, since one of its dependencies, the OSUDownloadTest, cannot run on this partition. As a result, its immediate dependency, the OSUBuildTest, will be skipped, which in turn causes all combinations of the OSULatencyTest to be skipped.

./bin/reframe -c tutorials/deps/osu_benchmarks.py --system=daint:gpu -n OSULatencyTest -l
[ReFrame Setup]
  version:           3.10.0-dev.3+605af31a
  command:           './bin/reframe -c tutorials/deps/osu_benchmarks.py -n OSULatencyTest --system=daint:gpu -l'
  launched by:       user@host
  working directory: '/home/user/Devel/reframe'
  settings file:     '/home/user/Devel/reframe/tutorials/config/settings.py'
  check search path: '/home/user/Devel/reframe/tutorials/deps/osu_benchmarks.py'
  stage directory:   '/home/user/Devel/reframe/stage'
  output directory:  '/home/user/Devel/reframe/output'

./bin/reframe: could not resolve dependency: ('OSUBuildTest', 'daint:gpu', 'gnu') -> 'OSUDownloadTest'
./bin/reframe: could not resolve dependency: ('OSUBuildTest', 'daint:gpu', 'intel') -> 'OSUDownloadTest'
./bin/reframe: could not resolve dependency: ('OSUBuildTest', 'daint:gpu', 'nvidia') -> 'OSUDownloadTest'
./bin/reframe: skipping all dependent test cases
  - ('OSUBuildTest', 'daint:gpu', 'nvidia')
  - ('OSUBuildTest', 'daint:gpu', 'intel')
  - ('OSUAllreduceTest_8', 'daint:gpu', 'nvidia')
  - ('OSUAllreduceTest_16', 'daint:gpu', 'nvidia')
  - ('OSUBuildTest', 'daint:gpu', 'gnu')
  - ('OSUAllreduceTest_4', 'daint:gpu', 'intel')
  - ('OSUAllreduceTest_8', 'daint:gpu', 'intel')
  - ('OSUAllreduceTest_4', 'daint:gpu', 'nvidia')
  - ('OSUAllreduceTest_16', 'daint:gpu', 'intel')
  - ('OSULatencyTest', 'daint:gpu', 'nvidia')
  - ('OSUAllreduceTest_8', 'daint:gpu', 'gnu')
  - ('OSUAllreduceTest_2', 'daint:gpu', 'nvidia')
  - ('OSUBandwidthTest', 'daint:gpu', 'nvidia')
  - ('OSUAllreduceTest_16', 'daint:gpu', 'gnu')
  - ('OSUBandwidthTest', 'daint:gpu', 'intel')
  - ('OSULatencyTest', 'daint:gpu', 'intel')
  - ('OSUAllreduceTest_2', 'daint:gpu', 'intel')
  - ('OSUAllreduceTest_4', 'daint:gpu', 'gnu')
  - ('OSUAllreduceTest_2', 'daint:gpu', 'gnu')
  - ('OSUBandwidthTest', 'daint:gpu', 'gnu')
  - ('OSULatencyTest', 'daint:gpu', 'gnu')

[List of matched checks]
Found 0 check(s)

Log file(s) saved in '/tmp/rfm-k1w20m9z.log'

Listing Dependencies

As shown in the previous listing of the OSULatencyTest, the full dependency chain of a test is listed along with the test. Each target dependency is printed on a new line, prefixed by the ^ character and indented proportionally to its depth. If a target dependency appears in multiple paths, it is listed only once.

The default test listing shows the dependencies at the test level, i.e., the conceptual dependencies. ReFrame generates multiple test cases from each test depending on the target system configuration. We have already seen in Tutorial 1: Getting Started with ReFrame how the STREAM benchmark generated many more test cases when it was run on an HPC system with multiple partitions and programming environments. These test cases form the actual dependencies, i.e., the test case graph that will be executed by the runtime. The mapping of a test to the concrete test cases that will be executed on a system is called test concretization. You can view the exact concretization of the selected tests with --list=concretized or simply -lC. Here is how the OSU benchmarks of this tutorial are concretized on the system daint:

./bin/reframe -c tutorials/deps/osu_benchmarks.py -lC
[ReFrame Setup]
  version:           3.10.0-dev.3+605af31a
  command:           './bin/reframe -c tutorials/deps/osu_benchmarks.py -lC'
  launched by:       user@host
  working directory: '/home/user/Devel/reframe'
  settings file:     '/home/user/Devel/reframe/tutorials/config/settings.py'
  check search path: '/home/user/Devel/reframe/tutorials/deps/osu_benchmarks.py'
  stage directory:   '/home/user/Devel/reframe/stage'
  output directory:  '/home/user/Devel/reframe/output'

[List of matched checks]
- OSUAllreduceTest %mpi_tasks=16 @daint:gpu+gnu
    ^OSUBuildTest @daint:gpu+gnu
      ^OSUDownloadTest @daint:login+builtin
- OSUAllreduceTest %mpi_tasks=16 @daint:gpu+intel
    ^OSUBuildTest @daint:gpu+intel
      ^OSUDownloadTest @daint:login+builtin
- OSUAllreduceTest %mpi_tasks=16 @daint:gpu+nvidia
    ^OSUBuildTest @daint:gpu+nvidia
      ^OSUDownloadTest @daint:login+builtin
- OSUAllreduceTest %mpi_tasks=8 @daint:gpu+gnu
    ^OSUBuildTest @daint:gpu+gnu
      ^OSUDownloadTest @daint:login+builtin
- OSUAllreduceTest %mpi_tasks=8 @daint:gpu+intel
    ^OSUBuildTest @daint:gpu+intel
      ^OSUDownloadTest @daint:login+builtin
- OSUAllreduceTest %mpi_tasks=8 @daint:gpu+nvidia
    ^OSUBuildTest @daint:gpu+nvidia
      ^OSUDownloadTest @daint:login+builtin
- OSUAllreduceTest %mpi_tasks=4 @daint:gpu+gnu
    ^OSUBuildTest @daint:gpu+gnu
      ^OSUDownloadTest @daint:login+builtin
- OSUAllreduceTest %mpi_tasks=4 @daint:gpu+intel
    ^OSUBuildTest @daint:gpu+intel
      ^OSUDownloadTest @daint:login+builtin
- OSUAllreduceTest %mpi_tasks=4 @daint:gpu+nvidia
    ^OSUBuildTest @daint:gpu+nvidia
      ^OSUDownloadTest @daint:login+builtin
- OSUAllreduceTest %mpi_tasks=2 @daint:gpu+gnu
    ^OSUBuildTest @daint:gpu+gnu
      ^OSUDownloadTest @daint:login+builtin
- OSUAllreduceTest %mpi_tasks=2 @daint:gpu+intel
    ^OSUBuildTest @daint:gpu+intel
      ^OSUDownloadTest @daint:login+builtin
- OSUAllreduceTest %mpi_tasks=2 @daint:gpu+nvidia
    ^OSUBuildTest @daint:gpu+nvidia
      ^OSUDownloadTest @daint:login+builtin
- OSUBandwidthTest @daint:gpu+gnu
    ^OSUBuildTest @daint:gpu+gnu
      ^OSUDownloadTest @daint:login+builtin
- OSUBandwidthTest @daint:gpu+intel
    ^OSUBuildTest @daint:gpu+intel
      ^OSUDownloadTest @daint:login+builtin
- OSUBandwidthTest @daint:gpu+nvidia
    ^OSUBuildTest @daint:gpu+nvidia
      ^OSUDownloadTest @daint:login+builtin
- OSULatencyTest @daint:gpu+gnu
    ^OSUBuildTest @daint:gpu+gnu
      ^OSUDownloadTest @daint:login+builtin
- OSULatencyTest @daint:gpu+intel
    ^OSUBuildTest @daint:gpu+intel
      ^OSUDownloadTest @daint:login+builtin
- OSULatencyTest @daint:gpu+nvidia
    ^OSUBuildTest @daint:gpu+nvidia
      ^OSUDownloadTest @daint:login+builtin
Concretized 22 test case(s)

Log file(s) saved in '/tmp/rfm-l3eamaiy.log'

Notice how the various test cases of the run benchmarks depend on the corresponding test cases of the build tests.

The concretization of test cases changes if a specific partition or programming environment is passed on the command line or, of course, if the tests are run on a different system. If we restrict our programming environments to gnu and builtin only, ReFrame will generate only 8 test cases instead of 22:

Note

If we do not select the builtin environment, we will end up with a dangling dependency as in the example above and ReFrame will skip all the dependent test cases.
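The two numbers can be reproduced with a small counting sketch; the partition and environment lists below are transcribed from the tests of this tutorial:

```python
from itertools import product

def count_cases(tests, environs):
    '''Count concretized cases: one per (partition, environ) combination,
    keeping only environments selected on the command line.'''
    total = 0
    for valid_parts, valid_envs in tests:
        total += sum(1 for p, e in product(valid_parts, valid_envs)
                     if e in environs)
    return total

# (valid_systems, valid_prog_environs) per test:
# 1 download test plus 7 GPU tests (1 build + 6 run tests)
tests = [(['daint:login'], ['builtin'])] + \
        [(['daint:gpu'], ['gnu', 'nvidia', 'intel'])] * 7

full_run     = count_cases(tests, {'builtin', 'gnu', 'nvidia', 'intel'})
scoped_run   = count_cases(tests, {'builtin', 'gnu'})   # -p builtin -p gnu
```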

./bin/reframe -c tutorials/deps/osu_benchmarks.py -n OSULatencyTest -L -p builtin -p gnu
[ReFrame Setup]
  version:           3.10.0-dev.3+605af31a
  command:           './bin/reframe -c tutorials/deps/osu_benchmarks.py -n OSULatencyTest -L -p builtin -p gnu'
  launched by:       user@host
  working directory: '/home/user/Devel/reframe'
  settings file:     '/home/user/Devel/reframe/tutorials/config/settings.py'
  check search path: '/home/user/Devel/reframe/tutorials/deps/osu_benchmarks.py'
  stage directory:   '/home/user/Devel/reframe/stage'
  output directory:  '/home/user/Devel/reframe/output'

[List of matched checks]
- OSULatencyTest [id: OSULatencyTest, file: '/home/user/Devel/reframe/tutorials/deps/osu_benchmarks.py']
    ^OSUBuildTest [id: OSUBuildTest, file: '/home/user/Devel/reframe/tutorials/deps/osu_benchmarks.py']
      ^OSUDownloadTest [id: OSUDownloadTest, file: '/home/user/Devel/reframe/tutorials/deps/osu_benchmarks.py']
Found 3 check(s)

Log file(s) saved in '/tmp/rfm-klltwsex.log'

To gain a deeper understanding of how test dependencies work in ReFrame, please refer to How Test Dependencies Work In ReFrame.

Depending on Parameterized Tests

As shown earlier in this section, tests define their dependencies by referencing the target tests by their unique names. This is straightforward for regular tests, where the name matches the class name, but it becomes cumbersome for parameterized tests, since no safe assumption should be made about the number of variants of a test or about how the parameters are encoded in its name. To refer to a parameterized test safely and reliably, you should use the get_variant_nums() and variant_name() class methods, as shown in the following example:

# Copyright 2016-2022 Swiss National Supercomputing Centre (CSCS/ETH Zurich)
# ReFrame Project Developers. See the top-level LICENSE file for details.
#
# SPDX-License-Identifier: BSD-3-Clause

import reframe as rfm
import reframe.utility.sanity as sn


@rfm.simple_test
class TestA(rfm.RunOnlyRegressionTest):
    z = parameter(range(10))
    executable = 'echo'
    valid_systems = ['*']
    valid_prog_environs = ['*']

    @run_after('init')
    def set_exec_opts(self):
        self.executable_opts = [str(self.z)]

    @sanity_function
    def validate(self):
        return sn.assert_eq(
            sn.extractsingle(r'\d+', self.stdout, 0, int), self.z
        )


@rfm.simple_test
class TestB(rfm.RunOnlyRegressionTest):
    executable = 'echo'
    valid_systems = ['*']
    valid_prog_environs = ['*']
    sanity_patterns = sn.assert_true(1)

    @run_after('init')
    def setdeps(self):
        variants = TestA.get_variant_nums(z=lambda x: x > 5)
        for v in variants:
            self.depends_on(TestA.variant_name(v))

In this example, TestB depends only on selected variants of TestA. The get_variant_nums() method accepts a set of key-value pairs representing the target test parameters and their selector functions, and returns the list of variant numbers that correspond to the matching variants. Using variant_name() subsequently, we can obtain the actual name of each variant.
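The effect of the selector function can be emulated with plain Python; the sketch below is hypothetical and simply mirrors the z parameter of TestA and the same lambda passed to get_variant_nums():

```python
# Parameter space of TestA and the selector from the example above
z_values = list(range(10))
selector = lambda x: x > 5

# Variants whose parameter satisfies the selector
selected = [z for z in z_values if selector(z)]

# Names analogous to what variant_name() would produce in the listing below
names = [f'TestA %z={z}' for z in selected]
```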

./bin/reframe -c tutorials/deps/parameterized.py -l
[ReFrame Setup]
  version:           3.10.0-dev.3+605af31a
  command:           './bin/reframe -c tutorials/deps/parameterized.py -l'
  launched by:       user@host
  working directory: '/home/user/Devel/reframe'
  settings file:     '/home/user/Devel/reframe/tutorials/config/settings.py'
  check search path: '/home/user/Devel/reframe/tutorials/deps/parameterized.py'
  stage directory:   '/home/user/Devel/reframe/stage'
  output directory:  '/home/user/Devel/reframe/output'

[List of matched checks]
- TestB
    ^TestA %z=9
    ^TestA %z=8
    ^TestA %z=7
    ^TestA %z=6
- TestA %z=5
- TestA %z=4
- TestA %z=3
- TestA %z=2
- TestA %z=1
- TestA %z=0
Found 11 check(s)

Log file(s) saved in '/tmp/rfm-iey58chw.log'