How can I defer the execution of Celery tasks?


Question

I have a small script that enqueues tasks for processing. This script makes a large number of database queries to collect the items that should be enqueued. The issue I'm facing is that the Celery workers begin picking up the tasks as soon as they are enqueued by the script. This is correct, and it is how Celery is supposed to work, but it often leads to deadlocks between my script and the Celery workers.

Is there a way I could enqueue all my tasks from the script but delay execution until the script has completed or until a fixed time delay?

I couldn't find anything about this in the Celery or django-celery documentation. Is it possible?

As a quick fix, I've considered adding all the items to be processed to a list and, once my script has finished executing all its queries, iterating over the list and enqueueing the tasks. Maybe this would resolve the issue, but it might be a bad idea when there are thousands of items to enqueue.
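The buffer-then-enqueue workaround described above can be sketched in isolation. Here `fetch_items` and `enqueue` are hypothetical stand-ins for the real database queries and the Celery task's `.delay()`, so the shape of the idea is testable without a broker:

```python
def enqueue_after_queries(fetch_items, enqueue):
    """Run every database query first, then enqueue.

    fetch_items: callable returning the items to process
                 (stand-in for the script's DB queries).
    enqueue:     callable that enqueues a single item
                 (stand-in for task.delay / task.apply_async).
    """
    # Materialize the full list before touching the queue, so no
    # worker can start working while queries are still running.
    pending = list(fetch_items())
    for item in pending:
        enqueue(item)
    return len(pending)
```

The memory cost is one Python object reference per item, which is usually tolerable even for thousands of items.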

1
13
7/24/2017 8:41:03 AM

Accepted Answer

The eta/countdown options let you delay task execution:

http://docs.celeryproject.org/en/master/userguide/calling.html#eta-and-countdown

15
10/22/2012 5:58:29 PM

I think you are trying to avoid a race condition between your own script and the workers, not asking for a way to delay a task run.

In that case, you can create a wrapper task and, inside it, call each of your tasks with .apply(), not .apply_async() or .delay(), so that they run sequentially.


Licensed under: CC-BY-SA with attribution
Not affiliated with: Stack Overflow