I have several matplotlib functions rolled into some django-celery tasks.
Every time the tasks are called, more RAM is dedicated to Python. Before long, Python is taking up all of the RAM.
QUESTION: How can I release this memory?
UPDATE 2 - A Second Solution:
I asked a similar question, specifically about the memory locked up when matplotlib errors, but the good answer I got to that question also applies here:
Calls to gc.collect() aren't needed if you use multiprocessing to run the plotting function in a separate process, whose memory will be freed automatically once the process ends.
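A minimal sketch of that approach (the plot_worker and render_plot names and the output path are illustrative, not from the answer):

```python
import multiprocessing

import matplotlib
matplotlib.use('Agg')  # non-interactive backend, safe in a worker process
import matplotlib.pyplot as plt
import numpy as np


def plot_worker(path):
    # All figure state is created inside the child process.
    a = np.arange(1000000)
    b = np.random.randn(1000000)
    fig = plt.figure(num=1, dpi=100, facecolor='w', edgecolor='w')
    ax = fig.add_subplot(111)
    ax.plot(a, b)
    fig.savefig(path)
    plt.close(fig)


def render_plot(path):
    # Run the plotting in a short-lived child process; when it exits,
    # the OS reclaims everything it allocated, matplotlib caches included.
    p = multiprocessing.Process(target=plot_worker, args=(path,))
    p.start()
    p.join()


if __name__ == '__main__':
    render_plot('test.png')
```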
UPDATE - The Solution:
These Stack Overflow posts suggested that I can release the memory used by matplotlib objects with the following commands:

```python
import gc
gc.collect()
```
Here is the example I used to test the solution:
```python
import matplotlib
matplotlib.use('Agg')  # non-interactive backend
import matplotlib.pyplot as plt
import numpy as np
import gc

a = np.arange(1000000)
b = np.random.randn(1000000)

fig = plt.figure(num=1, dpi=100, facecolor='w', edgecolor='w')
fig.set_size_inches(10, 7)
ax = fig.add_subplot(111)
ax.plot(a, b)

# Release the figure's memory: clear the figure, close it, drop the
# data arrays, then force a garbage-collection pass.
fig.clf()
plt.close()
del a, b
gc.collect()
```
Did you try running your task function several times (in a for loop) to be sure that it isn't your function itself that is leaking, independently of Celery? Also make sure that django.settings.DEBUG is set to False (the connection object holds all queries in memory when DEBUG=True).
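A rough sketch of that kind of check (the make_plot body below is a self-contained stand-in for your real task, and the kB reading assumes Linux):

```python
import resource  # Unix-only: reads the process's peak RSS

import matplotlib
matplotlib.use('Agg')
import matplotlib.pyplot as plt
import numpy as np


def make_plot():
    # Stand-in for the body of your real plotting task.
    fig = plt.figure()
    fig.add_subplot(111).plot(np.random.randn(100000))
    fig.clf()
    plt.close()


def check_for_leak(iterations=20):
    for i in range(iterations):
        make_plot()
        # ru_maxrss is in kilobytes on Linux (bytes on macOS). If it
        # keeps climbing call after call, the plotting function is
        # leaking on its own, with no Celery involved.
        peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
        print(f"iteration {i}: peak RSS {peak} kB")


if __name__ == '__main__':
    check_for_leak()
```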