Edit or shadow toolbox function

9 views (last 30 days)
Daniel Greff on 26 Nov 2018
Commented: Daniel Greff on 28 Nov 2018
Hey guys, I tried googling a bit but found nothing on this topic, which is odd, since I suspect I'm not the first person to run into this issue.
I am trying to add a feature to an existing MATLAB function. The function is called "lstmForward.m" and is part of the Neural Network Toolbox. Specifically, I would like to add variational dropout to the existing LSTM functions...
I cannot get it to work, however. I have already tried:
  • Shadowing the original function, which does not seem to work even though my folder is higher on the path. Using which I can only find my own function, not the built-in one.
  • Editing the function directly, which worked in the sense that the file changed, but my added code is never executed. Even trivial statements such as a=1; do nothing; execution simply skips over them.
The problem is that I can't just write my own replacement: lstmForward.m sits several layers deep in the call tree, so I would have to change everything above it as well, and I doubt that would even work...
So, do you have any ideas? Or do I have to give up on this? Thanks.
  4 comments
Jan on 28 Nov 2018
I still do not get it: without admin privileges, you can neither edit nor remove nor replace the original toolbox function. Shadowing should work after Steven's suggestion of
rehash toolboxcache
but as far as I remember, the original function is still preferred if its folder is the current directory, so take care not to cd into MATLAB's root folder.
I have shadowed some toolbox functions myself, with great care. During MATLAB startup I add a folder to the path depending on the MATLAB version, containing improved versions of certain functions, e.g. a much faster ind2rgb and a faster C-MEX for iscellstr. I tested this exhaustively, on different MATLAB versions, operating systems and so on, before I trusted it for production work.
You can check whether P- or MEX-functions with the same name exist by using which -all, as mentioned already.
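Jan's startup approach could look roughly like the following sketch. The folder layout and the patched function names are illustrative assumptions, not part of any documented pattern:

```matlab
% Hypothetical fragment for startup.m: put patched copies of toolbox
% functions on the path, selected per MATLAB release.
patchRoot = fullfile(userpath, 'patches', version('-release'));
if isfolder(patchRoot)                 % isfolder requires R2017b or newer
    addpath(patchRoot, '-begin');      % ahead of the toolbox folders
end

% Afterwards, verify which copy actually wins, including P- and MEX-files:
which -all ind2rgb
```

Keeping one patch folder per release makes it obvious which patched copies were written against which shipped versions.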
Steven Lord on 28 Nov 2018
Shadowing a function with a function of the same name in another directory may not work in this case: I suspect the function Daniel wants to change depends on private functions that are only available in the directory containing the original function.


Accepted Answer

Steven Lord on 27 Nov 2018
I strongly recommend against modifying functions in MathWorks toolboxes for some of the same reasons Jan called out. If you must do so, changes to files under the matlabroot directory will not be recognized until or unless you update the toolbox path cache.
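For completeness, updating the cache after such an edit could look like this (again, editing shipped files is discouraged; the clear step is my assumption about dropping an already-loaded copy):

```matlab
% After changing a file under matlabroot (not recommended), make MATLAB
% pick up the change:
rehash toolboxcache   % rebuild the cached listing of toolbox folders
clear functions       % discard any already-loaded copy of the edited function
```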
If you believe that a toolbox function has a bug that should be fixed or would like to request that it be enhanced with additional capabilities, please contact Technical Support using the Contact Us link in the upper-right corner of this page and file your bug report / enhancement request.
  3 comments
Steven Lord on 28 Nov 2018
Rather than modifying an internal function, I would use the documented approach for defining your own custom layers in Deep Learning Toolbox. I'm not certain, but you might be able to use the existing lstmLayer function somehow to facilitate building your own custom "lstmVariableDropout" layer.
One benefit of building your own layer with the documented technique would be to eliminate Jan's concern: if you modify the toolbox itself, sharing your work with a colleague forces him or her to risk breaking their own MATLAB installation. To share your new layer, all you would have to do is send the class file and have your colleague put it in a directory on their MATLAB path. It would also make moving to future releases easier: copy the class file to your new installation and hopefully you would need minimal (or no) changes to get it working.
I would still suggest filing an enhancement request with Support, either now or once your investigation convinces you that an LSTM + variational dropout layer is useful. If such a layer were added to Deep Learning Toolbox in a future release, you could just use it directly.
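A minimal sketch of what such a custom layer might look like, following the documented custom-layer template. The class name, the property name, and the assumption that the first dimension of the input indexes features are all illustrative; whether a custom layer can sit inside an LSTM network in your release is something to verify against the documentation:

```matlab
% Hypothetical custom layer: variational dropout (one mask over the
% feature dimension, reused rather than resampled elementwise).
classdef variationalDropoutLayer < nnet.layer.Layer
    properties
        Probability   % dropout probability, between 0 and 1
    end
    methods
        function layer = variationalDropoutLayer(p, name)
            layer.Name = name;
            layer.Probability = p;
            layer.Description = sprintf('Variational dropout, p = %g', p);
        end
        function Z = predict(layer, X)
            Z = X;   % no dropout at inference time
        end
        function [Z, memory] = forward(layer, X)
            % Draw one mask over the feature dimension and let implicit
            % expansion broadcast it across the remaining dimensions.
            keep = rand(size(X, 1), 1, 'like', X) > layer.Probability;
            Z = X .* keep ./ (1 - layer.Probability);
            memory = [];
        end
    end
end
```

Dividing by 1 - p (inverted dropout) keeps the expected activation unchanged, so predict can simply pass the input through.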
Daniel Greff on 28 Nov 2018
That is a great option. I don't know why I hadn't thought of approaching the problem from that direction. It is a little more complicated, but it clearly offers several benefits.
First I will check whether variational dropout offers any performance improvement; if it does, I will try creating my own custom layer and also file an enhancement request with Support.
Thanks again for your great support on this issue.


More Answers (0)

Version

R2018a
