Decimal-to-binary conversion of a gpuArray gives an error

My code:

dc = 60;
dc = gpuArray(dc);
s = dec2bin(dc);

When executing this code, the error points at this line inside dec2bin:

[f,e] = log2(max(d)); % How many digits do we need to represent the numbers?

s = de2bi(dc) % also gives an error

2 Comments

Harish - if errors are being generated, then please include them in your question. Copy and paste all of the red text (corresponding to the error) into your question.
[f,e]=log2(max(d)); % How many digits do we need to represent the numbers?


Answers (1)

Edric Ellis on 11 Dec 2014

0 Votes

Unfortunately, as you have discovered, the gpuArray version of log2 doesn't support the second output argument, which dec2bin needs. A further problem is that dec2bin returns char data, which is not supported on the GPU, so I think you are better off simply gathering the GPU data back to the host before calling dec2bin.
Could I ask - are you calling dec2bin with a large amount of data?
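A minimal sketch of that gather-first workaround (assuming Parallel Computing Toolbox is installed; dec2bin then operates on the ordinary CPU copy):

```matlab
dc = gpuArray(60);      % data lives on the GPU
dcHost = gather(dc);    % copy it back to host (CPU) memory
s = dec2bin(dcHost);    % returns the char row vector '111100'
```

If you need the individual bits as numeric values rather than char, bitget may work directly on the gpuArray, but check the list of GPU-supported functions for your MATLAB release before relying on it.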


Asked: 11 Dec 2014
Commented: 11 Dec 2014

