Just wondering: is there a hard size limit for the filtfilt function? I am looking at an ECG time series collected at 500 Hz over several hours, an array of ~14e7 entries in double precision. To extract the QRS complex, I first run the signal through a Butterworth bandpass filter, using N = 3 and Wn = [5 15]*2/fs in the butter(N,Wn) function. It runs fine up to 6810691 entries; anything over that gives an all-NaN output. Have I missed something obvious? I am running on a machine with 16 GB of RAM and an i7-4xxx, and the process does not hit the memory/CPU limits. Thank you!
Regards,
Alan
Edit: I have included the data file (~55 MB).
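For reference, the setup described above looks roughly like the following sketch (the signal variable name is a placeholder, not from the original post):

```matlab
% Minimal sketch of the described pipeline; ecg_signal is a placeholder
% for the ~14e7-sample double vector mentioned in the question.
fs = 500;                          % sampling rate, Hz
Wn = [5 15]*2/fs;                  % normalized passband edges (5-15 Hz)
[b, a] = butter(3, Wn);            % 3rd-order Butterworth bandpass
ecg_filtered = filtfilt(b, a, ecg_signal);   % zero-phase filtering
```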

Accepted Answer

Star Strider
Star Strider on 5 Mar 2019

0 votes

Use the isnan function to check to be certain that none of the data are NaN:
NaNsum = nnz(isnan(your_signal))
The first NaN will propagate through the rest of the filtered signal. If this result is 0, I have no other suggestions or explanation.
If there are any, use:
NaNidx = find(isnan(your_signal));
to discover their locations.
As a side note, the normal EKG (without arrhythmias) has a spectrum of 0 to 50 Hz, so the passband you’re using will eliminate a significant amount of detail.

4 Comments

actinium
actinium on 5 Mar 2019
Thank you for the very quick reply! None of the signal had a NaN value, so it is a bit weird that it suddenly fails without my doing much differently other than changing the input size.
Currently I am just extracting the RR interval, so I am not too worried about the higher-frequency content, but that is definitely good to keep in the back of my head!
Star Strider
Star Strider on 6 Mar 2019
Other than that, NaN results come from 0/0 or Inf/Inf operations. However, I have never encountered that with any filter, including filtfilt. My only other hypothesis is that there is a non-numeric (e.g. character) value in your vector that is causing the problem. Without your data, I can't go further.
I will delete my Answer in a few hours.
actinium
actinium on 6 Mar 2019
When I split the data into smaller arrays, it works fine. That's why I'm wondering if there is a hard size limit.
Star Strider
Star Strider on 6 Mar 2019
I have never encountered the problem you're reporting. The documentation for filtfilt doesn't mention support for Tall Arrays (link), assuming your data meet those criteria, and filtfilt is not listed among the Functions That Support Tall Arrays (A - Z) (link) either. The documentation also doesn't mention any sort of length restriction.
If filtfilt works with shorter vectors, consider concatenating the segments you successfully filter, provided your code does not generate significant transients at either end of any segment. There may also be other options, such as designing your own overlap-and-add approach.
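The segment-and-concatenate idea above could be sketched as follows. This is a hedged illustration, not code from the thread: the segment length and overlap values are arbitrary choices, and `b`, `a`, and `x` are assumed to be the filter coefficients and signal from the question.

```matlab
% Filter a long vector x in overlapping segments, discarding the padded
% overlap at each internal boundary so that filtfilt edge transients do
% not leak into the concatenated result. segLen/overlap are arbitrary.
segLen  = 1e6;     % samples kept per segment
overlap = 5000;    % extra samples filtered on each side, then discarded
n = numel(x);
y = zeros(size(x));
startIdx = 1;
while startIdx <= n
    stopIdx = min(startIdx + segLen - 1, n);
    loPad   = max(startIdx - overlap, 1);   % pad left where possible
    hiPad   = min(stopIdx  + overlap, n);   % pad right where possible
    seg = filtfilt(b, a, x(loPad:hiPad));   % zero-phase filter the padded chunk
    % keep only the un-padded middle of the filtered chunk
    y(startIdx:stopIdx) = seg(startIdx-loPad+1 : end-(hiPad-stopIdx));
    startIdx = stopIdx + 1;
end
```

The overlap should be long enough for the filter's transient response to die out; for a low-order Butterworth filter a few thousand samples is usually generous.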
I have no idea what the problem could be.


More Answers (1)

Walter Roberson
Walter Roberson on 6 Mar 2019

0 votes

Yes, there is a hard size threshold in filtfilt: data with fewer than 10000 points are put through a different routine that handles the endpoints differently. I suspect the routine for larger arrays exists to be more space-efficient, but I am not certain.
Other than that: no, there is no built-in limit beyond what your memory can hold.
Note that several temporary arrays are needed during filtering, so do not expect to be able to process anything larger than roughly 1/5th of your memory limit.
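As a rough back-of-the-envelope check for the signal in the question (the factor of 5 below simply restates the 1/5th rule of thumb above, not a measured figure):

```matlab
% Rough working-set estimate for a 14e7-sample double vector.
n = 14e7;                          % samples in the question's signal
bytesPerDouble = 8;
signalGB = n*bytesPerDouble/2^30;  % ~1.04 GB for the raw vector
peakGB   = 5*signalGB;             % ~5.2 GB if ~5 copies are live at once
```

On a 16 GB machine that leaves plenty of headroom, which is consistent with the poster's observation that memory was not the bottleneck.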

Version: R2018b
Asked: 5 Mar 2019
Edited: 6 Mar 2019
