How can I release memory after creating matplotlib figures


I have several matplotlib functions rolled into some django-celery tasks.

Every time the tasks are called more RAM is dedicated to python. Before too long, python is taking up all of the RAM.

QUESTION: How can I release this memory?

UPDATE 2 - A Second Solution:

I asked a similar question specifically about the memory locked up when matplotlib errors, and I got a good answer there: .clf(), .close(), and gc.collect() aren't needed if you use multiprocessing to run the plotting function in a separate process, whose memory is automatically freed once the process ends.

Matplotlib errors result in a memory leak. How can I free up that memory?

UPDATE - The Solution:

These stackoverflow posts suggested that I can release the memory used by matplotlib objects with the following commands:

.clf(): Matplotlib runs out of memory when plotting in a loop

.close(): Python matplotlib: memory not being released when specifying figure size

gc.collect() (after import gc)

Here is the example I used to test the solution:

import matplotlib
import matplotlib.pyplot as plt
import numpy as np
import gc

a = np.arange(1000000)
b = np.random.randn(1000000)

fig = plt.figure(num=1, dpi=100, facecolor='w', edgecolor='w')
ax = fig.add_subplot(111)
ax.plot(a, b)

fig.clf()       # clear the figure
plt.close(fig)  # close the figure
del a, b
gc.collect()    # force garbage collection
5/23/2017 12:25:33 PM

Accepted Answer

Did you try running your task function several times (in a for loop) to make sure it isn't your function that is leaking, independent of celery? Also make sure django.settings.DEBUG is set to False (the connection object holds all queries in memory when DEBUG=True).
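The answer's suggestion can be sketched as a plain loop outside celery that watches the process's memory between calls; if memory grows here too, the leak is in the plotting code itself. make_plot is a hypothetical stand-in for the task function, and resource.getrusage is Unix-only.

```python
import gc
import resource  # Unix-only; reports this process's peak RSS

import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt
import numpy as np

def make_plot():
    # Stand-in for the task body, including the cleanup calls.
    fig = plt.figure()
    fig.add_subplot(111).plot(np.random.randn(10000))
    fig.clf()
    plt.close(fig)

for i in range(20):
    make_plot()
    gc.collect()
    peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    print(f"iteration {i}: peak RSS {peak_kb} kB")
```

If the peak RSS keeps climbing across iterations even with .clf(), .close(), and gc.collect() in place, the problem is not celery, and the separate-process approach above is the safer fix.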

8/18/2011 1:40:25 AM

Licensed under: CC-BY-SA with attribution
Not affiliated with: Stack Overflow