Symbol Error Rate
Symbol error rate (SER) is the ratio of the number of erroneous symbols in the recovered symbol pattern to the total number of symbols sent. It is measured at the recovered clock times using the model thresholds.
SER is calculated from the input stimulus symbol pattern and the equalized waveform. To calculate SER:

1. The equalized waveform is latched at the recovered clock times, which converts it to one voltage per UI (unit interval).
2. The model thresholds are applied to determine the symbol level for each UI.
3. The delay from the latched symbol data to the stimulus symbol pattern is calculated.
4. The delay and ignore bits are removed to align the stimulus and latched symbol patterns, which are then compared to find the symbol errors.
5. The erroneous symbols are counted and divided by the total number of symbols to give the symbol error rate.
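The steps above can be sketched in NumPy. This is a minimal illustration, not the tool's actual implementation: the function name `symbol_error_rate`, the threshold array, the clock sample indices, and the brute-force delay search are all assumptions made for the example. Thresholds are applied with `np.searchsorted`, so `N` thresholds map each latched voltage to one of `N + 1` symbol levels (for example, three thresholds for PAM4).

```python
import numpy as np

def symbol_error_rate(stimulus, equalized, clock_times, thresholds, ignore=0):
    """Latch the equalized waveform, slice it to symbols, align, and count errors.

    All names and the alignment strategy are illustrative assumptions.
    """
    # 1. Latch: sample the equalized waveform once per UI at the clock times.
    latched = equalized[clock_times]
    # 2. Slice: apply the model thresholds to map each voltage to a symbol level
    #    (np.searchsorted returns 0..len(thresholds), e.g. PAM4 levels 0..3).
    recovered = np.searchsorted(thresholds, latched)
    # 3. Delay: find the shift that best matches recovered data to the stimulus.
    n = min(len(stimulus), len(recovered))
    delay = max(range(n // 2),
                key=lambda d: np.sum(recovered[d:n] == stimulus[:n - d]))
    stim, rec = stimulus[:n - delay], recovered[delay:n]
    # 4. Align: drop the leading ignore symbols, then compare the patterns.
    stim, rec = stim[ignore:], rec[ignore:]
    # 5. Count: erroneous symbols divided by total symbols compared.
    errors = int(np.sum(stim != rec))
    return errors / len(stim)
```

For example, a PAM4 stimulus replayed through an identity "channel" with a two-UI delay yields an SER of 0; corrupting one latched voltage past a threshold produces exactly one symbol error.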
Once calculated, SER is reported as a metric in the time-domain simulation results. You can click the SER metric cell to open a plot of the stimulus versus recovered data symbols.
The errors are marked in the symbol waveform. You can cross-reference an error at a symbol point with the equalized waveform to see the margin of error for that symbol. In this figure, the errors stop after approximately 4000 UI, which corresponds to the model thresholds stabilizing and the CDR locking in the equalized waveform plot.