MATLAB Answers


Radar simulation: How to apply a precise delay?

Asked by Nicolai Kern on 13 Mar 2019
Latest activity: Answered by Honglei Chen on 15 Mar 2019
Hi all,
I am trying to create a simple radar baseband channel model. To that end, I want to generate a delayed version of my transmit baseband signal, i.e. s_RX(t) = s_TX(t - tau) for a target delay tau.
The signals are sampled with sampling frequency f_s, so applying the delay by shifting the signal by a whole number of samples gives a "delay resolution" of only 1/f_s. As the location of targets may lead to delays that are not integer multiples of that value, this approach can introduce an error. Are there ways of applying a delay that are not limited by the time discretization (other than upsampling, which is already applied), or is this a fundamental limitation? In particular:
  • Is it possible to apply a frequency-dependent phase shift according to the shift theorem of the DFT instead of a time-domain shift? And does this give a finer resolution? (A sketch of this idea follows below the list.)
  • I saw that there is a FreeSpace object in the Phased Array System Toolbox that also adds a delay to a signal. Does anyone know how this is realized?
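
Regarding the first bullet, here is a minimal sketch of how a fractional delay could be applied via the DFT shift theorem (fs, tau and x are just example values; note that the delay is circular, so the signal should be zero-padded generously so nothing wraps around, and the real() at the end only removes numerical residue for a real input):

fs  = 1e6;                 % example sampling frequency
tau = 3.4/fs;              % desired delay: 3.4 samples, i.e. not an integer
x   = [1; zeros(255,1)];   % example baseband signal (an impulse)

N = numel(x);
f = (0:N-1).'/N*fs;                  % DFT bin frequencies
f(f >= fs/2) = f(f >= fs/2) - fs;    % map upper bins to negative frequencies
X = fft(x);
y = ifft(X .* exp(-1j*2*pi*f*tau));  % linear phase ramp <=> time delay
y = real(y);                         % x is real, discard numerical imaginary part

The delay is then no longer tied to the sample grid, but the result is a band-limited (sinc) interpolation of the original samples, so it only works well for signals that are sampled with sufficient margin.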

1 Answer

Answer by Honglei Chen on 15 Mar 2019
Accepted Answer

The FreeSpace object in Phased Array System Toolbox uses a fractional delay filter to approximate delays that fall between samples.
HTH
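
To illustrate the idea (this is only a sketch of the general fractional-delay-filter approach with made-up values, not how phased.FreeSpace is implemented internally): split the delay into an integer part, which is a plain sample shift, and a fractional part, which is approximated by a short windowed-sinc FIR (sinc and hamming below are from Signal Processing Toolbox):

fs   = 1e6;                   % example sample rate
tau  = 12.37/fs;              % desired delay: 12.37 samples
x    = randn(1000,1);         % example transmit baseband signal

D     = tau*fs;               % delay in samples
Dint  = floor(D);             % integer part -> plain sample shift
Dfrac = D - Dint;             % fractional part -> FIR approximation

Ntaps = 31;                                     % filter length (example choice)
c     = (Ntaps-1)/2;                            % center tap
n     = (0:Ntaps-1).';
h     = sinc(n - c - Dfrac) .* hamming(Ntaps);  % windowed-sinc fractional delay
h     = h/sum(h);                               % unity gain at DC

y = filter(h, 1, [zeros(Dint,1); x]);  % total delay: Dint + c + Dfrac samples

The FIR adds its own group delay of c samples, which has to be accounted for (or compensated) when interpreting the output.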
