padv.builtin.task.RunTestsPerModel Class
Namespace: padv.builtin.task
Superclasses: padv.Task
Task for running test cases associated with each model using Simulink Test
Description
The padv.builtin.task.RunTestsPerModel class provides a task that can run the test cases associated with your models using Simulink® Test™.
You can add the task to your process model by using the method addTask. After you add the task to your process model, you can run the task from the Process Advisor app or by using the function runprocess. The task runs the test cases for each model in your project; certain tests can generate code. You can control whether Simulink Test or the MATLAB® Unit Test framework executes the test cases by using the task property UseMATLABUnitTest.
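For example, a minimal sketch of a process model that adds this task. This assumes your project has a processmodel.m file and that pm is the padv.ProcessModel object that the build system passes to it:

function processmodel(pm)
    % Define the process for this project (hedged sketch)
    arguments
        pm padv.ProcessModel
    end
    % Add the built-in task that runs tests on a model-by-model basis
    pm.addTask(padv.builtin.task.RunTestsPerModel);
end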
The Process Advisor app shows the names of the models that have test cases under Run Tests in the Tasks column. If you want to see the names of both the models and the associated test cases, use the padv.builtin.task.RunTestsPerTestCase task instead.
To view the source code for this built-in task, in the MATLAB Command Window, enter:
open padv.builtin.task.RunTestsPerModel
The padv.builtin.task.RunTestsPerModel class is a handle class.
Note
When you run the task, the task runs each test case individually and only executes test-case level callbacks. The task does not execute test-file level callbacks or test-suite level callbacks.
Creation
Description
task = padv.builtin.task.RunTestsPerModel() creates a task for running the test cases associated with your models using Simulink Test.
task = padv.builtin.task.RunTestsPerModel(Name=Value) sets certain properties using one or more name-value arguments. For example, task = padv.builtin.task.RunTestsPerModel(Name = "MyRunTestsTask") creates a task with the specified name.
You can use this syntax to set property values for Name, Title, InputQueries, IterationQuery, InputDependencyQuery, Licenses, LaunchToolAction, and LaunchToolText.
The padv.builtin.task.RunTestsPerModel class also has other properties, but you cannot set those properties during task creation.
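For example, a hedged sketch that sets several of these properties at creation time; the name and query values are illustrative:

task = padv.builtin.task.RunTestsPerModel( ...
    Name = "MyRunTestsTask", ...
    IterationQuery = padv.builtin.query.FindModelsWithTestCases( ...
        ExcludePath = "Control"));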
Properties
The RunTestsPerModel class inherits properties from padv.Task. The properties listed in Specialized Inherited Properties are padv.Task properties that the RunTestsPerModel task overrides. The task also has properties for specifying Test Execution Options.
Specialized Inherited Properties
Unique identifier for task in process, specified as a string.
Example: "TestMyModels"
Data Types: string
Human-readable name that appears in Process Advisor app, specified as a string.
Example: "Run My Tests"
Data Types: string
Task description, specified as a string.
When you point to a task in Process Advisor and click the information icon, the tooltip shows the task description.
Example: "This task uses Simulink Test to run the test cases associated with
your model. The task runs the test cases on a model-by-model basis. Certain tests may
generate code."
Data Types: string
Path to task documentation, specified as a string.
When you point to a task in Process Advisor, click the ellipsis (...), and click Help, Process Advisor opens the task documentation.
Example: fullfile(pwd,"taskHelpFiles","myTaskDocumentation.pdf")
Data Types: string
Type of artifact, specified as one or more of the values listed in this table. To specify multiple values, use an array.

| Category | Artifact Type | Description |
| --- | --- | --- |
| MATLAB | "m_class" | MATLAB class |
| MATLAB | "m_file" | MATLAB file |
| MATLAB | "m_func" | MATLAB function |
| MATLAB | "m_method" | MATLAB class method |
| MATLAB | "m_property" | MATLAB class property |
| Model Advisor | "ma_config_file" | Model Advisor configuration file |
| Model Advisor | "ma_justification_file" | Model Advisor justification file |
| Model Finder | "mf_database" | Model Finder database file |
| Process Advisor | "padv_dep_artifacts" | Related artifacts that current artifact depends on |
| Process Advisor | "padv_output_file" | Process Advisor output file |
| Project | "project" | Current project file |
| Requirements | "mwreq_file" | Requirement file (since R2024b) |
| Requirements | "mwreq_item" | Requirement (since R2024b) |
| Requirements | "sl_req" | Requirement (for R2024a and earlier) |
| Requirements | "sl_req_file" | Requirement file (for R2024a and earlier) |
| Requirements | "sl_req_table" | Requirements Table |
| Stateflow® | "sf_chart" | Stateflow chart |
| Stateflow® | "sf_graphical_fcn" | Stateflow graphical function |
| Stateflow® | "sf_group" | Stateflow group |
| Stateflow® | "sf_state" | Stateflow state |
| Stateflow® | "sf_state_transition_chart" | Stateflow state transition chart |
| Stateflow® | "sf_truth_table" | Stateflow truth table |
| Simulink | "sl_block_diagram" | Block diagram |
| Simulink | "sl_data_dictionary_file" | Data dictionary file |
| Simulink | "sl_embedded_matlab_fcn" | MATLAB function |
| Simulink | "sl_library_file" | Library file |
| Simulink | "sl_model_file" | Simulink model file |
| Simulink | "sl_protected_model_file" | Protected Simulink model file |
| Simulink | "sl_subsystem" | Subsystem |
| Simulink | "sl_subsystem_file" | Subsystem file |
| System Composer™ | "zc_block_diagram" | System Composer architecture |
| System Composer™ | "zc_component" | System Composer architecture component |
| System Composer™ | "zc_file" | System Composer architecture file |
| Tests | "harness_info_file" | Harness info file |
| Tests | "sl_harness_block_diagram" | Harness block diagram |
| Tests | "sl_harness_file" | Test harness file |
| Tests | "sl_test_case" | Simulink Test case |
| Tests | "sl_test_case_result" | Simulink Test case result |
| Tests | "sl_test_file" | Simulink Test file |
| Tests | "sl_test_iteration" | Simulink Test iteration |
| Tests | "sl_test_iteration_result" | Simulink Test iteration result |
| Tests | "sl_test_report_file" | Simulink Test result report |
| Tests | "sl_test_result_file" | Simulink Test result file |
| Tests | "sl_test_resultset" | Simulink Test result set |
| Tests | "sl_test_seq" | Test Sequence |
| Tests | "sl_test_suite" | Simulink Test suite |
| Tests | "sl_test_suite_result" | Simulink Test suite result |
Example: "sl_model_file"
Example: ["sl_model_file "zc_file"]
Query that finds the artifacts that the task iterates over, specified as a padv.Query object or the name of a padv.Query object. When you specify IterationQuery, the task runs one time for each artifact returned by the query. In the Process Advisor app, the artifacts returned by IterationQuery appear under the task title.
For more information about task iterations, see Overview of Process Model.
Example: padv.builtin.query.FindModelsWithTestCases(ExcludePath = "Control")
Query that finds artifact dependencies for task inputs, specified as a padv.Query object or the name of a padv.Query object.
The build system runs the query specified by InputDependencyQuery to find the dependencies for the task inputs, since those dependencies can affect whether task results are up to date. For more information, see Overview of Process Model.
Example: padv.builtin.query.GetDependentArtifacts
List of additional licenses that the task requires, specified as a string.
Data Types: string
Function that launches a tool, specified as a function handle.
When you point to a task in the Process Advisor app, you can click the ellipsis (...) to see more options. For built-in tasks, you have the option to launch a tool associated with the task.
For the RunTestsPerModel task, you can launch Simulink Test Manager.
Data Types: function_handle
Description of the action that the LaunchToolAction property performs, specified as a string.
Data Types: string
Type of CI-compatible result files that the task itself generates when run, specified as either:
- "JUnit" — JUnit-style XML report for task results.
- "" — None. The build system generates a JUnit-style XML report for the task instead.
Inputs to the task, specified as:
- a padv.Query object
- the name of a padv.Query object
- an array of padv.Query objects
- an array of names of padv.Query objects
By default, the task RunTestsPerModel gets the current model by using the built-in query padv.builtin.query.GetIterationArtifact and finds the tests associated with that model by using the built-in query padv.builtin.query.FindTestCasesForModel.
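For example, a hedged sketch that restates these default input queries explicitly on a task instance; you only need something like this if you are customizing the inputs:

% Current model (iteration artifact) plus its associated test cases
task.InputQueries = [ ...
    padv.builtin.query.GetIterationArtifact, ...
    padv.builtin.query.FindTestCasesForModel];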
Location for standard task outputs, specified as a string.
The built-in tasks use tokens, like $DEFAULTOUTPUTDIR$, as placeholders for dynamic path resolution during run-time. For more information, see Dynamically Resolve Paths with Tokens.
Data Types: string
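For example, a hedged sketch that redirects the standard task outputs; the subfolder name is illustrative:

% Token resolves at run time; outputs land in a custom subfolder
task.OutputDirectory = fullfile("$DEFAULTOUTPUTDIR$", "model_test_results");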
Test Execution Options
Name of the report author, specified as a string.
Data Types: string
Include the signal comparison plots in the report, specified as a numeric or logical 1 (true) or 0 (false).
When true, the report includes the signal comparison plots defined under baseline criteria, equivalence criteria, or assessments using the verify operator in the test case.
Example: true
Data Types: logical
Include coverage metrics that the test collects during test execution in the report, specified as a numeric or logical 1 (true) or 0 (false).
Example: false
Data Types: logical
Include error messages from test case simulations in the report, specified as a numeric or logical 1 (true) or 0 (false).
Example: false
Data Types: logical
Include the figures opened from a callback script, custom criteria, or by the model in the report, specified as a numeric or logical 1 (true) or 0 (false).
Example: true
Data Types: logical
Include the version of MATLAB that ran the test cases in the report, specified as a numeric or logical 1 (true) or 0 (false).
Example: false
Data Types: logical
Include simulation metadata for each test case or iteration in the report, specified as a numeric or logical 1 (true) or 0 (false).
Example: true
Data Types: logical
Include the simulation output plots for each signal in the report, specified as a numeric or logical 1 (true) or 0 (false).
Example: true
Data Types: logical
Include the test requirement link, defined under Requirements in the test case, in the report, specified as a numeric or logical 1 (true) or 0 (false).
Example: false
Data Types: logical
Include all or a subset of test results in the report, specified as either:
- 0 — Passed and failed results
- 1 — Only passed results
- 2 — Only failed results
Example: 2
Open the generated report, specified as a numeric or logical 1 (true) or 0 (false).
Example: true
Data Types: logical
Number of columns of plots to include on report pages, specified as an integer 1, 2, 3, or 4.
Example: 4
Number of rows of plots to include on report pages, specified as an integer 1, 2, 3, or 4.
Example: 4
Format for the generated report, specified as either:
- "pdf" — PDF format
- "docx" — Microsoft® Word document format
- "zip" — Zipped file that contains an HTML file, images, style sheet, and JavaScript® files for an HTML report
Example: "zip"
Path to the generated report, specified as a string.
The built-in tasks use tokens, like $DEFAULTOUTPUTDIR$, as placeholders for dynamic path resolution during run-time. For more information, see Dynamically Resolve Paths with Tokens.
Data Types: string
File name for the generated report, specified as a string.
The built-in tasks use tokens, like $ITERATIONARTIFACT$, as placeholders for dynamic path resolution during run-time. For more information, see Dynamically Resolve Paths with Tokens.
Data Types: string
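For example, a sketch that follows the token pattern the built-in task uses; the file name is illustrative:

% One report per model; $ITERATIONARTIFACT$ resolves to the model name
task.ReportName = "$ITERATIONARTIFACT$_TestReport";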
Title of the report, specified as a string.
The built-in tasks use tokens, like $ITERATIONARTIFACT$, as placeholders for dynamic path resolution during run-time. For more information, see Dynamically Resolve Paths with Tokens.
Data Types: string
Name of test result file, specified as a string.
The built-in tasks use tokens, like $ITERATIONARTIFACT$, as placeholders for dynamic path resolution during run-time. For more information, see Dynamically Resolve Paths with Tokens.
Data Types: string
Save the test results to a file after execution, specified as a numeric or logical 1 (true) or 0 (false).
Example: false
Data Types: logical
Since R2023a
Simulation mode for running tests, specified as "Normal", "Accelerator", "Rapid Accelerator", "Software-in-the-Loop", or "Processor-in-the-Loop".
By default, the property is empty (""), which means the built-in task uses the simulation mode that you define in the test itself. If you specify a value other than "", the built-in task overrides the simulation mode set in Simulink Test Manager. You do not need to update the test parameters or settings to run the test in the new mode.
Example: "Software-in-the-Loop"
Use the MATLAB Unit Test framework to execute test cases, specified as either:
- true (1) — The task runs your test cases by using the MATLAB Unit Test framework to create a test runner, create a suite of tests from your test file, and run the tests. If you use the pipeline generator, padv.pipeline.generatePipeline, and your pipeline generator options specify the GenerateJUnitForProcess property as true (1), the task uses the MATLAB Unit Test XML plugin to produce JUnit-style XML test results that integrate into CI platforms.
- false (0) — The task runs your test cases by using Simulink Test. Starting in R2023a, if you specify the task property SimulationMode, the task overrides the test simulation mode without having to change the test definition.
Example: true
Data Types: logical
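For example, a hedged sketch that switches a task instance to the MATLAB Unit Test framework inside a process model; the pipeline generation step is only outlined in comments because the options object depends on your CI setup:

task = pm.addTask(padv.builtin.task.RunTestsPerModel);
task.UseMATLABUnitTest = true;
% When you generate a CI pipeline with padv.pipeline.generatePipeline and
% your pipeline options set GenerateJUnitForProcess to true, the task
% produces JUnit-style XML results that CI platforms can ingest.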
Methods
This class overrides the following inherited methods.

| Method | Description |
| --- | --- |
| run | Run test cases for each model using Simulink Test. Note: You do not need to manually invoke this method. When you run a task using the Process Advisor app or the runprocess function, the build system runs the task for you. Function signature: taskResult = run(obj,input) |
| dryRun | Dry run the task to validate task inputs and generate representative task outputs without actually running the task. Function signature: taskResult = dryRun(obj,input) |
| launchToolAction | Launch Simulink Test Manager. Process Advisor uses this method when you open the tool associated with a task. |
Examples
Add a task to your process that can run test cases for each model using Simulink Test.
Open the process model for your project. If you do not have a process model, open the Process Advisor app to automatically create a process model.
In the process model file, add the RunTestsPerModel task to your process model by using the addTask method.
runTestsPerModelTask = pm.addTask(padv.builtin.task.RunTestsPerModel);
You can reconfigure the task behavior by using the task properties. For example, to generate a zipped HTML report file instead of a PDF:
runTestsPerModelTask.ReportFormat = "zip";
If you want to use the MergeTestResults task to merge the test results, you need to configure the MergeTestResults task to get the outputs from the RunTestsPerModel task instance. By default, the MergeTestResults task is configured to get outputs from a RunTestsPerTestCase task with the task name "padv.builtin.task.RunTestsPerTestCase". Specify the RunTestsPerModel task instance as a predecessor task to the MergeTestResults task.
%% Merge Test Results from Running Tests per Model
mergeTestTask = pm.addTask(padv.builtin.task.MergeTestResults(...
    PredecessorTask=runTestsPerModelTask));
Since the MergeTestResults task depends on outputs from the RunTestsPerModel task, you also need to explicitly specify those dependencies in the process model.
mergeTestTask.dependsOn(runTestsPerModelTask);
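After you save the process model, you can run the task from the MATLAB Command Window. A hedged sketch, assuming the runprocess function accepts a Tasks name-value argument as in recent releases:

% Run only this task for each model in the project
runprocess(Tasks = "padv.builtin.task.RunTestsPerModel")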
Since R2023a
Suppose that you want to have one instance of the RunTestsPerModel task that runs normal mode tests and another instance that runs software-in-the-loop (SIL) tests. You can create multiple instances of the task inside your process model and then use the SimulationMode property to override the simulation mode set in Simulink Test Manager.
Inside your process model, create multiple instances of the RunTestsPerModel task. When you create multiple instances of a task, you must specify a unique name for each task object. For example:
milTask = pm.addTask(padv.builtin.task.RunTestsPerModel(...
    Name = "RunTestsNormalMode"));
silTask = pm.addTask(padv.builtin.task.RunTestsPerModel(...
    Name = "RunTestsSILMode"));
The build system uses the Name property as the unique identifier for the task.
Reconfigure the task instances to run tests in different simulation modes. You can run tests in different simulation modes without having to change the test definition by using the SimulationMode property to override the mode. For example:
milTask.SimulationMode = "Normal";
silTask.SimulationMode = "Software-in-the-Loop";
To prevent task outputs from overwriting each other, reconfigure the names and locations of the task outputs by using the associated task properties. For example:
% Specify normal mode outputs
% (assumes defaultTestResultPath is a folder path that you define
% earlier in your process model)
milTask.OutputDirectory = defaultTestResultPath;
milTask.ReportName = '$ITERATIONARTIFACT$_Normal_Test';
milTask.ResultFileName = '$ITERATIONARTIFACT$_Normal_ResultFile';
% Specify SIL mode outputs
silTask.OutputDirectory = defaultTestResultPath;
silTask.ReportName = '$ITERATIONARTIFACT$_SIL_Test';
silTask.ResultFileName = '$ITERATIONARTIFACT$_SIL_ResultFile';
The built-in tasks use tokens, like $ITERATIONARTIFACT$, as placeholders for dynamic path resolution during run-time. For more information, see Dynamically Resolve Paths with Tokens.
By default, the MergeTestResults task only gets the current model and the outputs from the task padv.builtin.task.RunTestsPerTestCase.
If you want to merge the test results from these two task instances using the MergeTestResults task, you need to configure the MergeTestResults task to get the outputs from those task instances by using the PredecessorTask argument.
%% Merge Test Results (Normal and SIL)
mergeTestTask = pm.addTask(padv.builtin.task.MergeTestResults(...
    PredecessorTask = [milTask, silTask]));
Since the MergeTestResults task depends on outputs from the RunTestsPerModel task instances, you need to explicitly specify those dependencies in the process model.
mergeTestTask.dependsOn(milTask);
mergeTestTask.dependsOn(silTask);