Perform requirements-based testing for an automotive lane-following system.
This example shows how to:
Author high-level testing requirements for a closed-loop model implementing a lane-following algorithm.
Author tests in Simulink® Test™ to verify safe operation for each requirement.
Execute the tests and review the verification status.
This example shows how to verify an automated lane-keep assist algorithm using the Test Manager and blocks from the Model Verification library. This example is similar to the
Lane Following Control with Sensor Fusion and Lane Detection example in Model Predictive Control Toolbox™. For details on the control algorithm and closed-loop system model, see Lane Following Control with Sensor Fusion and Lane Detection (Model Predictive Control Toolbox).
Create and open a working copy of the project files. The project organizes files into several folders. The controller and system model files are in the
Models folder. The high-level requirements for the controller are captured in
LaneFollowingTestRequirements.slreqx within the
Requirements folder. The Test Manager test file is in the Tests folder.
[projectFolder,~] = matlab.internal.project.example.projectDemoSetUp( ...
    fullfile(matlabroot,'toolbox','simulinktest','simulinktestdemos', ...
    'sltestLaneFollowing.zip'), [], []);
proj = simulinkproject(projectFolder);
The lane following controller is implemented by a Model block.
The vehicle dynamics and driving environment are modeled within the
Vehicle and Environment subsystem.
The road, lane, and traffic scenarios use synthetic data generated by the Automated Driving System Toolbox™. The
Scenario Reader block in the
Vehicle and Environment subsystem reads the scenario data during simulation.
mdl = 'LaneFollowingTestBenchExample';
open_system(mdl);
The data includes nine driving scenarios, with high-level testing requirements for each scenario. Open the Requirements Editor from Simulink® Requirements™ to view the requirement set. On the Apps tab, in the Model Verification, Validation, and Test section, click Requirements Manager. Then, on the Requirements tab, click Requirements Editor and choose the
LaneFollowingTestRequirements.slreqx file. You can also enter:
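For example, the slreq.open function opens a requirement set file from the command line (this assumes the file is on the MATLAB path, as it is when the project is open):

```matlab
% Open the requirement set in the Requirements Editor
slreq.open('LaneFollowingTestRequirements.slreqx');
```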
Each requirement represents a driving scenario. The first four requirements test the tracking ability of the control algorithm. The remaining requirements assess lane-following ability under various road conditions:
Scenario 1: ACC_ISO_TargetDiscriminationTest. This is a basic test to ensure the controller can track a lead car in the travel lane.
Scenario 2: ACC_ISO_AutoRetargetTest. Test if the controller can re-target to a new car in the travel lane when the current target switches lanes.
Scenario 3: ACC_ISO_CurveTest. Test if the controller can track a slowing car in the travel lane while navigating the curvature of the road.
Scenario 4: ACC_StopnGo. This test simulates stop-and-go movement in the travel lane due to heavy traffic.
Scenario 5: LFACC_DoubleCurveDecelTarget. Track the decelerating lead car through two S curves.
Scenario 6: LFACC_DoubleCurve_AutoRetarget. Test ability to retarget to a new lead car on a curve.
Scenario 7: LFACC_DoubleCurveStopnGo. This test simulates stop and go movement on a curved highway.
Scenario 8: LFACC_Curve_CutInOut. This test ensures the controller can identify a car cutting into and out of the travel lane.
Scenario 9: LFACC_Curve_CutInOut_TooClose. This test repeats the previous test with shorter separation distance between the ego and lead car.
There are three main assessment criteria used to verify satisfactory operation of the controller:
Collision avoidance: Ensure that the ego car does not collide with the lead car at any point during the driving scenario.
Safe distance: Ensure that the time gap between the ego car and the lead car is above 0.8 s. The time gap between the two cars is defined as the ratio of the calculated headway to the ego car velocity.
Lane following: Ensure that the lateral deviation from the centerline of the lane is within 0.2 m.
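As a numeric illustration of the safe-distance criterion (the variable names and values here are examples, not signals from the model):

```matlab
% Time gap = headway / ego velocity; the assessment requires > 0.8 s
headway     = 20;                     % distance to lead car, m (example value)
egoVelocity = 20;                     % ego car speed, m/s (example value)
timeGap     = headway / egoVelocity;  % 1.0 s, which satisfies the 0.8 s bound
```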
The first two criteria are verified by the
LaneFollowingTestBenchExample/Collision Detection/Test Assessments Test Sequence block. A
verify statement checks if a collision is detected at any point during the simulation.
A second verify statement checks whether the calculated time gap between the two cars falls below 0.8 s. The duration operator allows for transient failures due to sudden changes in road conditions or sensor inputs; for this assessment, failures are allowed for up to 5 s at a time.
The lane-following assessment is also implemented in a Test Sequence block. A verify statement using the duration operator checks that the absolute value of the lateral deviation from the lane centerline does not exceed 0.2 m for more than 5 s at a time.
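In Test Sequence syntax, such an assessment might look like the following sketch; the signal name lateral_deviation is an assumption, not necessarily the name used in the model:

```matlab
% Fail only if |lateral deviation| stays above 0.2 m for more than 5 s;
% duration(cond) returns the time, in seconds, that cond has been true.
verify(duration(abs(lateral_deviation) > 0.2) <= 5, ...
    'Lateral deviation exceeded 0.2 m for more than 5 s')
```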
To configure an interactive simulation, follow these steps:
Select a driving scenario by setting the
scenarioId variable to a value from 1 to 9 in the MATLAB® base workspace.
Run the helperLFSetUp script to load required types and data.
Open the Bird's-Eye Scope using the visualization drop-down in the Simulink® model toolbar and set it up to observe the simulation.
Simulate the model to visualize the chosen driving scenario.
Run the plotLFResults script to assess controller performance.
You can also run these commands to run the simulation and plot results for the first scenario:
scenarioId = 1;
helperLFSetUp;
sim(mdl);
plotLFResults(logsout);
Simulink.sdi.view;
Open the Simulation Data Inspector to view the results of the
verify statements within the Test Sequence blocks.
Assess the controller performance using the MATLAB plot figures. There are two plots: one for assessing spacing performance and another for assessing lateral performance. For details on how to evaluate these plots, see Lane Following Control with Sensor Fusion and Lane Detection (Model Predictive Control Toolbox).
Open the LaneFollowingTestScenarios.mldatx test file in the Test Manager. The test file contains nine test cases, one for each of the nine test scenarios described above.
Each test case uses the
Post-Load callback to set the appropriate scenario ID and run the setup utility. In addition, each test case also links to the corresponding requirement in the Requirements Editor for traceability.
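A Post-Load callback of this form would select the scenario and run the setup utility; the exact callback contents are an assumption here, sketched from the interactive steps above:

```matlab
% Hypothetical Post-Load callback for the first test case:
% select the scenario, then load the required types and data
scenarioId = 1;   % scenario under test (1-9)
helperLFSetUp;
```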
Run the tests by clicking the Play button, or run them programmatically from the command line.
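One way to run the loaded tests programmatically is the sltest.testmanager.run function from the Simulink Test API, which executes all test files open in the Test Manager:

```matlab
% Run all test files currently loaded in the Test Manager
resultObj = sltest.testmanager.run;
```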
The results section of the Test Manager shows aggregated results for all nine test cases. For each test case, the
verify statements are aggregated to compute overall pass/fail results.
You can view the assessment figures in the test result summary pane.
Open the Requirements Editor and select Display > Verification Status to see a verification status summary for each requirement. Green and red bars indicate the test result for each requirement and a summary result for the requirement set.
Close all open windows and models.
close_system(mdl, 0);
clear mdl;
sltest.testmanager.clear;
sltest.testmanager.clearResults;
close(proj);
sltest.testmanager.close;
Simulink.sdi.close;