I had the same problem in my R2017a (!) installation. I was inspecting the memory usage and run time of a function that creates a large array based on an input size parameter. Alas, for a very large size, instead of throwing an out-of-memory error, my computer froze, so I had to forcefully abort and exit Matlab. Moreover, I had to restart the computer, as everything had become too slow.
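For context, the kind of probe I was running looks roughly like this; the function and the sizes below are stand-ins for illustration, not my actual code:

```matlab
% Hypothetical probe: time a function that allocates an array whose
% size is controlled by the input parameter n.
makeArray = @(n) zeros(n);          % stand-in for the real function (n-by-n)
for n = [1e3 2e3 4e3]               % sizes under test; larger ones froze my machine
    t = timeit(@() makeArray(n));   % timeit returns a robust median run time, in seconds
    fprintf('n = %g: %.4f s\n', n, t);
end
```

With a loop like this, a systemic factor-of-3 slowdown shows up immediately as a shift in every printed timing.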
When I returned to Matlab, I noticed a strange phenomenon: to my horror, everything had become 3 times slower than before. Nothing helped:
- Restarting the computer;
- Restarting Matlab;
- Rehashing toolboxes;
- Removing prefdir;
- Running disk defragmentation;
- Running cleanup programs;
- Even reinstalling Matlab (!).
Before I continue, I need to emphasize an important point. I noticed this degradation in Matlab's speed only because I was probing the run time to begin with. A slowdown by a factor of 3 is not substantial enough to notice with the naked eye, yet it is a significant enough change to render most simulations useless. It could have been weeks before I noticed something was wrong, and by then it would have been too late, as there would have been nothing to compare against. @Sean de Wolski, I believe this is a critical bug in Matlab that must be addressed.
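One lesson from this: keep a recorded baseline timing around, so a systemic slowdown can be detected by comparison rather than luck. A minimal sketch of that idea (the workload and the file name are my own choices, and I use `exist` rather than `isfile` since the latter is not available in R2017a):

```matlab
% Minimal regression check: record a baseline timing once, then
% compare current speed against it in later sessions.
bench = @() sum(rand(2000));          % any stable, representative workload
t = timeit(bench);                    % current run time, in seconds
baselineFile = 'speed_baseline.mat';  % hypothetical file name
if exist(baselineFile, 'file') == 2
    s = load(baselineFile);           % s.t0 holds the healthy baseline
    fprintf('current %.4f s vs baseline %.4f s (x%.1f)\n', ...
            t, s.t0, t / s.t0);       % a ratio near 3 would flag this bug
else
    t0 = t;
    save(baselineFile, 't0');         % record the baseline while Matlab is healthy
end
```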
Finally, here is what worked:
- Draining the battery of my laptop (Dell Vostro).
- Running the Dell System Diagnostic Tool at boot level. I ran it twice, just in case, with all the options on (with emphasis on memory and cores).
- Rebooting the computer.
Matlab is now back to its fast self, yet I have no idea why this procedure succeeded. Such behavior should not exist in a program as ubiquitous as Matlab.