Radar simulation: How to apply a precise delay?

6 views (last 30 days)
Nicolai Kern
Nicolai Kern on 13 Mar 2019
Answered: Honglei Chen on 15 Mar 2019
Hi all,
I am trying to create a simple radar baseband channel model. For this, I want to generate a delayed version of my transmit baseband signal, i.e. s_Rx(t) = s_Tx(t - tau), where tau is the round-trip delay to a target.
The signals are sampled with sampling frequency f_s, so applying the delay by shifting the signal by an integer number of samples gives a "delay resolution" of only 1/f_s. As the locations of targets may lead to delays that are not integer multiples of that value, this approach can introduce an error. Are there ways of applying a delay that are not limited by the time discretization (other than upsampling, which is already applied), or is this a limitation that cannot be overcome? In particular:
  • Is it possible to perform a frequency-dependent phase shift according to the shift theorem of the DFT instead of a time-domain shift? And does this give a finer resolution? (A rough sketch of this idea follows below the list.)
  • I saw that there is a FreeSpace object in the Phased Array System Toolbox that also applies a delay to a signal. Does anyone know how this is realized?
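
For illustration, here is a rough sketch of the frequency-domain idea from the first point; the signal, sampling rate, and delay value are just placeholder examples, not my actual model:

% Sketch: apply a non-integer delay tau (in seconds) to a sampled baseband
% signal by multiplying its DFT with a linear phase ramp (DFT shift theorem).
% Note that this corresponds to a circular shift, so the signal must be
% zero-padded if the samples that wrap around matter.
fs  = 1e6;                    % sampling frequency (example value)
t   = (0:1023).'/fs;          % time axis
x   = exp(1j*2*pi*5e4*t);     % example complex baseband signal
tau = 3.4/fs;                 % desired delay: 3.4 samples, not an integer multiple of 1/fs

N = numel(x);
k = (0:N-1).';
k(k >= N/2) = k(k >= N/2) - N;                    % symmetric DFT bin indices
y = ifft(fft(x) .* exp(-1j*2*pi*k*(tau*fs)/N));   % delayed signal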

Accepted Answer

Honglei Chen
Honglei Chen on 15 Mar 2019
The FreeSpace object in Phased Array System Toolbox uses a fractional delay filter to approximate delays that fall between samples.
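
For illustration, here is a minimal sketch of one common fractional delay filter, a windowed-sinc FIR; it only shows the general idea with placeholder values and is not necessarily the exact implementation inside phased.FreeSpace:

% Sketch: delay a signal by D = 3.4 samples using an integer sample shift
% plus a windowed-sinc fractional delay FIR (one common construction).
D     = 3.4;                  % total delay in samples (example value)
dint  = floor(D);             % integer part: plain sample shift
dfrac = D - dint;             % fractional part: handled by the FIR

Ntap = 31;                                   % odd filter length
n    = (0:Ntap-1).';
c    = (Ntap-1)/2;                           % bulk group delay of the FIR
h    = sinc(n - c - dfrac) .* hamming(Ntap); % windowed-sinc taps (Signal Processing Toolbox)
h    = h / sum(h);                           % normalize DC gain

x = randn(1024,1);                           % example input signal
y = filter(h, 1, [zeros(dint,1); x]);        % integer shift + fractional delay filter
y = y(c+1:end);                              % remove the bulk FIR delay so y lags x by D samples

If DSP System Toolbox is available, dsp.VariableFractionalDelay provides a ready-made System object for the same kind of fractional delay.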
HTH

More Answers (0)
