I am trying to create a signal fault detection system in Simulink (Fixed Step Solver). I have simplified my model down to a single signal to help clarify the issue I am experiencing. The desired function of the system is this:
1. The signal is fed into a function-call subsystem; the subsystem is driven by a function-call generator at sample time 0.01 (iterations: 1).
2. The signal is then compared with a limit (constant), and the boolean result is fed to a Discrete-Time Integrator. The value of the integral (1 or 0 × time) is compared to another constant (a time limit), and the boolean result of that comparison indicates a fault (i.e. if the signal limit is continuously exceeded for a given number of seconds, a fault is detected).
The integrator's external reset is set to falling edge, so when the signal drops below the limit the integral should reset. When I run the simulation, everything works as expected until the integral reaches a value of about 2.6, and then it resets unexpectedly. Also, if I saturate the integrator output at 2.2, it works correctly.
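For reference, here is a plain-code sketch of the behavior I'm expecting from the model (hypothetical names; fixed step of 0.01 s as in the model, with a falling-edge reset on the comparison signal):

```python
def fault_detector(signal, limit=1.0, time_limit=2.0, dt=0.01):
    """Return a per-sample fault flag for the given signal samples.

    Mirrors the intended Simulink logic: integrate the boolean
    (signal > limit) over time, reset the integral on a falling
    edge of that boolean, and flag a fault once the integral
    exceeds time_limit.
    """
    integral = 0.0
    prev_exceeded = False
    faults = []
    for x in signal:
        exceeded = x > limit
        # Falling-edge reset: clear the integral the step the
        # signal drops back below the limit
        if prev_exceeded and not exceeded:
            integral = 0.0
        integral += dt * (1.0 if exceeded else 0.0)
        faults.append(integral > time_limit)
        prev_exceeded = exceeded
    return faults
```

With these (assumed) parameters, a signal held above the limit for 2.5 s should flag a fault, while one that dips below the limit before the time limit should never flag one; in my model the integral instead resets around 2.6 even though the input never drops.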
I'm sure my issue is something simple; does anybody see what I am missing?