MATLAB 2016a (Linux) memory leak

Dear community,
Several months ago I installed R2016a on our Linux (Ubuntu) workstation. Over time I noticed that this version continuously slows down while in use and eventually becomes unusable; restarting MATLAB helps.
Tracing the issue, I found that this MATLAB version uses more and more memory over time (~100 GB of RAM in under 10 minutes while analyzing data, even after clear all).
This is a very strong hint of a memory leak, especially since R2014b does not have this problem.
I have not tracked down exactly where it happens, but my scripts perform a lot of binary and text file read/write operations (fopen, fwrite, etc.), which may be the cause.
So far, using R2014b helps.
Has anybody else experienced this, and is anyone aware of patches?
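In case it helps anyone reproduce the file-I/O pattern: a minimal sketch (the file name and read below are placeholders, not my real code) that pairs every fopen with a guaranteed fclose via onCleanup, to rule out leaked file handles as the cause:

```matlab
% Sketch: guarantee the handle is closed even if an error occurs mid-read.
fid = fopen('data.bin', 'r');            % 'data.bin' is a placeholder name
if fid == -1
    error('Could not open file');
end
closer = onCleanup(@() fclose(fid));     % fclose runs when closer is cleared
data = fread(fid, Inf, 'uint8');
clear closer                             % closes the file now

% fopen('all') lists the identifiers of all files still open; a growing
% list here would point at leaked handles rather than a MATLAB bug.
leftOpen = fopen('all');
```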

1 Comment

José-Luis
José-Luis on 15 Sep 2016
Edited: José-Luis on 15 Sep 2016
Maybe you could try narrowing it down to a specific function? If it is actually a memory leak, that might be valuable information for TMW.


Answers (4)

Vassilis Lemonidis
Vassilis Lemonidis on 25 Apr 2021

1 vote

Is there any news on this issue? It still seems to affect R2020b parallel processing on Linux: clear all does not release the workers' memory, and MATLAB has to be manually closed and restarted after every code execution to forcibly release it. If you believe it is a problem with the code I am using, could you please point me to a method I could add to the classes I have constructed, which could handle the garbage collection process? Also, is there a tool to identify weak pointers that clear all cannot handle well?
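Not an official answer, just a hedged sketch: for handle classes, deterministic cleanup can live in a delete destructor, which runs when the last reference is cleared; the class and file name below are hypothetical:

```matlab
% Hypothetical handle class (its own file, FileLogger.m) whose destructor
% releases an external resource instead of waiting for clear all.
classdef FileLogger < handle
    properties
        fid = -1
    end
    methods
        function obj = FileLogger(fname)
            obj.fid = fopen(fname, 'w');
        end
        function delete(obj)
            if obj.fid > 2           % skip stdin/stdout/stderr ids
                fclose(obj.fid);
            end
        end
    end
end
```

For worker memory specifically, shutting the pool down with delete(gcp('nocreate')) forces fresh workers on the next parfor, which may release the memory without restarting MATLAB.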

3 Comments

mikhailo
mikhailo on 25 Apr 2021
The way I solved it was to downgrade to MATLAB 2014b, the latest version that didn't have this bug.
You may try that too. I haven't investigated further since then. If MATLAB's engineers don't want to handle it, why should I do their work?
Vassilis Lemonidis
Vassilis Lemonidis on 25 Apr 2021
Unfortunately I am bound to a more recent version of MATLAB, as I have MAT-files from newer versions that would lose information if converted for an older one, and I am also using toolboxes that were previously not available... Thank you for the suggestion nonetheless; it is a bit annoying that nothing has been done to fix this behavior after so many versions and years....
mikhailo
mikhailo on 25 Apr 2021
If you can nail down the problem, you could share the code with support. In my case, that was not an option.
Worst case, MATLAB is not the only option for scientific computing nowadays.


Haiying Tang
Haiying Tang on 7 Nov 2017
Edited: Haiying Tang on 7 Nov 2017

0 votes

We got this error too!
It occurred in both R2015b and R2016b for Linux, but R2014b is OK!
We tested the same code in R2014b, R2015b and R2016b on CentOS 7. Our MATLAB code (exported to a Java package by MCC) uses the Parallel Computing Toolbox and runs under the MCR on servers with 16-core CPUs and 16 GB of RAM. In R2014b each worker stays at about 500 MB and runs normally, while in R2015b and R2016b each worker's memory keeps increasing until it exits (killed by the OS) after running for about 24 hours.
Because of this memory issue, we have to keep using R2014b at present!
Hope someone can help us and thanks a lot!
(Posted by my technical workmate: Pingmin Fenlly Liu)
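A hedged workaround sketch (not a fix for the underlying leak): recycle the pool between batches so the OS reclaims the worker processes before their memory grows too far; batchFun and numBatches below are placeholders for the real workload:

```matlab
% Sketch: cap worker memory growth by recycling parpool between batches.
for b = 1:numBatches                  % numBatches is a placeholder
    pool = gcp('nocreate');
    if isempty(pool)
        pool = parpool(16);           % 16 workers, matching the servers above
    end
    parfor k = 1:100
        batchFun(b, k);               % batchFun is a placeholder workload
    end
    delete(pool);                     % shut down workers; OS reclaims memory
end
```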

3 Comments

Pingmin Fenlly Liu
Pingmin Fenlly Liu on 7 Nov 2017
Yes, I'm here.
Steven Lord
Steven Lord on 7 Nov 2017
Please contact Technical Support using the Contact Us link in the upper-right corner of this page. Provide them a code segment or workflow with which you can reproduce this memory increase and work with them to determine if this is actually a bug and if so how to resolve it.
Pingmin Liu
Pingmin Liu on 8 Nov 2017
This is another account of mine, with my company email.


mikhailo
mikhailo on 7 Nov 2017
Edited: mikhailo on 7 Nov 2017

0 votes

Same for me. I just keep using 2014b. I remember trying to find the problem, but it does not seem to be an easy one.
To the folks from MathWorks: please, as you can see this is not fake; fix it. Two years and no response. After all, that is your responsibility, and the issue is serious.

7 Comments

Steven Lord
Steven Lord on 7 Nov 2017
Please contact Technical Support using the Contact Us link in the upper-right corner of this page. Provide them a code segment or workflow with which you can reproduce this memory increase and work with them to determine if this is actually a bug and if so how to resolve it.
Pingmin Fenlly Liu
Pingmin Fenlly Liu on 8 Nov 2017
Edited: Pingmin Fenlly Liu on 8 Nov 2017
Yes, we also found that it's not easy to locate the issue in the source code (M, C, and Java). We have checked our code very carefully, again and again, over the last three years (2015-2017), hoping to solve this issue and upgrade to R2016b, which is the last version supported by our license. However, it didn't work, so we have simply been using R2014b for a long time now.
Steven Lord
Steven Lord on 8 Nov 2017
If you told your doctor "I hurt", their first question would probably be "Where does it hurt?" Without that information, they're limited in what they can do to reduce or eliminate your pain.
So where does your code hurt (with respect to memory)? If you can give Support a general idea of where the pain is located, they can try to diagnose its cause and how to reduce or eliminate it.
Pingmin Fenlly Liu
Pingmin Fenlly Liu on 9 Nov 2017
@Steven Yes, I understand what you meant. I was just stating what we currently know about this issue.
Pingmin Fenlly Liu
Pingmin Fenlly Liu on 9 Nov 2017
Edited: Pingmin Fenlly Liu on 30 Mar 2018
@Steven I am just trying to describe it in more detail, hoping this helps:
My workmate "X" writes the M functions. I write some C functions and compile them to MEX files. We export the M and MEX files into a Java package with "deploytool". Then my workmates "Y" and "Z" write Java functions and call the interfaces from the generated JAR file.
Under MCR R2014b our program runs normally, and each parallel worker ("ctflauncher", from "parfor") holds about 500-700 MB of RAM. However, under MCR R2015b and R2016b our program always runs out of memory after about 24 hours, because each parallel worker uses more and more memory until the OS kernel kills the workers.
Over the past two years, all of us have checked the memory management in our M, C, and Java code several times (e.g., "MWArray.disposeArray(obj);"), hoping to find the cause of the memory issue.
mikhailo
mikhailo on 9 Nov 2017
I can confirm the issue. I had trouble when using a parfor loop. I never found the reason, as it is not debuggable from within parfor. There was no issue after replacing parfor with a for-loop.
Unfortunately the code cannot be shared, as it is commercial. I remember spending about an hour searching for the cause but did not find it, and ended up sticking with the 2014b version.
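For what it's worth, a hedged debugging sketch along those lines: run the loop serially and log the process's resident memory from /proc on Linux after each iteration, to see which step grows (doWork and nIter are placeholders):

```matlab
% Sketch: serial loop with per-iteration RSS logging (Linux only).
for k = 1:nIter                        % was: parfor k = 1:nIter
    doWork(k);                         % placeholder for the real loop body
    s   = fileread('/proc/self/status');
    tok = regexp(s, 'VmRSS:\s+(\d+) kB', 'tokens', 'once');
    fprintf('iter %d: RSS %s kB\n', k, tok{1});
end
```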
Steven Lord
Steven Lord on 9 Nov 2017
I searched the Bug Reports for any bugs that mention "parfor" and existed in release R2016a, but none seemed to deal with memory leaking.
As I stated above, I strongly recommend that if possible you send a small sample of the code that shows this behavior to Technical Support using the Contact Us link in the upper-right corner of this page so they (and the development team) can investigate.
Even if you can't isolate the problem down to a particular segment of code, you might want to contact them and ask what steps they recommend to use to investigate the problem. They may be able to help you narrow down the location of the problem to the point where they can pinpoint what's going on.


Zhuo Chen
Zhuo Chen on 9 Nov 2017

0 votes

Hi,
I understand that your MATLAB is running slowly. I have a few things for you to try regarding this issue.
First, please navigate to the Java Heap Memory preferences and increase the allocated value to 2048MB. Then restart MATLAB and see if the lag occurs on start-up and throughout your work.
Secondly, please disable the source control integration for MATLAB. You can find this at Preferences > MATLAB > General > Source Control and select "None". Restart MATLAB and then see if there is a change in the performance.
I strongly recommend that if possible you post a small sample of the code that shows this behavior here.
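As a quick check before changing that preference, the Java heap MATLAB is actually running with can be inspected from the command window (a sketch using the standard java.lang.Runtime interface):

```matlab
% Sketch: report the current Java heap limit and usage inside MATLAB.
rt = java.lang.Runtime.getRuntime();
fprintf('Max Java heap:  %.0f MB\n', rt.maxMemory  / 1e6);
fprintf('Used Java heap: %.0f MB\n', (rt.totalMemory - rt.freeMemory) / 1e6);
```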

2 Comments

Pingmin Fenlly Liu
Pingmin Fenlly Liu on 10 Nov 2017
It's not slow response in MATLAB, but (almost certainly) a memory leak in the MCR. Thank you all the same.
Pingmin Fenlly Liu
Pingmin Fenlly Liu on 15 Nov 2017
OK, I'll try it as you said someday and thanks again.


Asked: 15 Sep 2016

Last commented: 25 Apr 2021
