Unexpected Standard exception from MEX file when reading h5 file

78 views (last 30 days)
Hi all, I am using MATLAB R2022a to read an HDF5 file from my portable disk. The h5disp command returns the following information:
>> h5disp('F:\NREL\western_wind_2006.h5')
HDF5 western_wind_2006.h5
Group '/'
    Dataset 'capacity100m'
        Size: 32043x52560
        MaxSize: 32043x52560
        Datatype: H5T_IEEE_F64LE (double)
        ChunkSize: 101x52560
        Filters: none
        Attributes:
            'capacity100m': 'MW'
    Dataset 'meta'
        Size: 32043
        MaxSize: 32043
        Datatype: H5T_COMPOUND
            Member 'site_id': H5T_STD_I32LE (int32)
            Member 'latitude': 16-bit floating point type
            Member 'longitude': 16-bit floating point type
            Member 'power_density': 16-bit floating point type
            Member 'capacity_factor': 16-bit floating point type
            Member 'wind_speed': 16-bit floating point type
            Member 'state_code': H5T_STRING
                String Length: 8
                Padding: H5T_STR_NULLPAD
                Character Set: H5T_CSET_ASCII
                Character Type: H5T_C_S1
            Member 'elevation': 16-bit floating point type
        ChunkSize: []
        Filters: none
        FillValue: H5T_COMPOUND
        Attributes:
            'wind_speed': 'm/s'
            'elevation': 'm'
            'longitude': 'decimal degree'
            'power_density': 'W/m2'
            'capacity_factor': '%'
            'latitude': 'decimal degree'
    Dataset 'speed100m'
        Size: 32043x52560
        MaxSize: 32043x52560
        Datatype: H5T_IEEE_F64LE (double)
        ChunkSize: 101x52560
        Filters: none
        Attributes:
            'speed100m': 'm/s'
    Dataset 'time_index'
        Size: 52560
        MaxSize: 52560
        Datatype: H5T_STRING
            String Length: 30
            Padding: H5T_STR_NULLPAD
            Character Set: H5T_CSET_ASCII
            Character Type: H5T_C_S1
        ChunkSize: []
        Filters: none
        FillValue: ' '
When I read the dataset 'meta' or 'time_index', MATLAB always returns quickly and successfully. However, when I read 'capacity100m' or 'speed100m', it takes much longer and often produces the following error:
>> d=h5read('F:\NREL\western_wind_2006.h5','/capacity100m');
Unexpected Standard exception from MEX file.
What() is:bad allocation
..
Error in h5read (line 93)
[data,var_class] = h5readc(Filename,Dataset,start,count,stride);
I am wondering what causes this problem. Why can I sometimes read the 'capacity100m' dataset successfully and sometimes not? Any suggestion will be appreciated. Thank you very much.

Accepted Answer

Manoj Mirge on 23 Mar 2023
Hi,
The error you are getting is might be due to a memory issue. The h5read function reads datasets and creates a matrix of that dataset in MATLAB and that matrix is stored in your computer memory. If the underlying MEX file couldn’t get free memory for allocation of data, it will throw the error you are getting.
In your case, the 'capacity100m' and 'speed100m' datasets are far larger than the 'meta' and 'time_index' datasets: each is a 32043x52560 double matrix, which requires roughly 32043 × 52560 × 8 bytes ≈ 13.5 GB of RAM. That is why 'meta' and 'time_index' read quickly and without error while the other two datasets do not.
Sometimes, if other programs are running alongside your MATLAB code, they may occupy the memory the MEX file needs, and at that point the MEX file fails with this error. That is also why the read sometimes succeeds: at those times enough memory happens to be free.
You can check the memory usage of various programs running on your computer in the Task Manager.
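If it is more convenient, you can also check this from inside MATLAB. A minimal sketch (the memory function is available on Windows only; the 13.5 GB figure is just the full-dataset estimate from above):

    % Check how much memory MATLAB can currently use (Windows only).
    user = memory;
    fprintf('Largest possible array: %.1f GB\n', user.MaxPossibleArrayBytes/1e9);
    fprintf('Memory available for all arrays: %.1f GB\n', user.MemAvailableAllArrays/1e9);

    % Memory needed to hold the full 'capacity100m' dataset as doubles:
    needed = 32043*52560*8;
    fprintf('Full dataset needs about %.1f GB\n', needed/1e9);

If the reported available memory is below the estimated requirement, the full read is likely to fail.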
To read more about this error, you can check this MATLAB Answers thread.
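If you do not need the whole matrix in memory at once, one workaround (not part of the original answer, just a sketch using h5read's documented start/count arguments, with the block size chosen to match the 101-row chunk size shown by h5disp) is to read and process the dataset in row blocks:

    filename = 'F:\NREL\western_wind_2006.h5';
    nRows = 32043; nCols = 52560;   % dataset size reported by h5disp
    blockRows = 101;                % matches the HDF5 chunk size 101x52560

    for r = 1:blockRows:nRows
        count = min(blockRows, nRows - r + 1);
        % Read only 'count' rows starting at row r (start and count are 1-based).
        block = h5read(filename, '/capacity100m', [r 1], [count nCols]);
        % ... process 'block' here (e.g. accumulate statistics), then let it go out of scope
    end

Each block is only about 101 × 52560 × 8 bytes ≈ 42 MB, so it fits in memory easily even when the full matrix does not.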
Hope this helps.

More Answers (0)
