FastAPI @repeat_every: how to prevent parallel scheduled_task() instances

Solution 1:

To accomplish this, you need some locking mechanism, and it has to be one that suits your deployment environment.

For example, when running only a single worker with a single async event loop, a simple Lock from the asyncio synchronization primitives is ideal, as sketched below.
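A minimal sketch of that pattern, assuming the repeat_every decorator comes from fastapi_utils (the usual source of that decorator) and a hypothetical do_work() coroutine; an overlapping run is skipped rather than queued:

```python
import asyncio

from fastapi import FastAPI
from fastapi_utils.tasks import repeat_every

app = FastAPI()
task_lock = asyncio.Lock()

@app.on_event("startup")
@repeat_every(seconds=60)
async def scheduled_task() -> None:
    if task_lock.locked():
        return  # a previous run is still in progress; skip this tick
    async with task_lock:
        await do_work()  # hypothetical coroutine doing the actual work
```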

But if you want to introduce more workers, the state of the lock won't be shared between the processes. If your workers are spawned on the same machine, you can use a file-system lock (for example via the fcntl module), but again, that stops working once you introduce more server instances.
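A sketch of the file-lock variant using a non-blocking flock; the lock file path and the do_work() function are assumptions, and this only holds while all workers share one host:

```python
import fcntl

def scheduled_task() -> None:
    with open("/tmp/scheduled_task.lock", "w") as lock_file:
        try:
            # LOCK_NB makes acquisition non-blocking: raise instead of wait
            fcntl.flock(lock_file, fcntl.LOCK_EX | fcntl.LOCK_NB)
        except BlockingIOError:
            return  # another process is already running the task; skip
        do_work()  # hypothetical function doing the actual work
        # the lock is released when the file is closed
```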

The next step might be to introduce a lock at the database level, or in any other external system that can manage a lock or deliver a task to exactly one recipient, but this gets very complicated very quickly.
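As one concrete option, PostgreSQL offers advisory locks that work across machines. A sketch assuming asyncpg, a reachable DSN, an arbitrary application-chosen lock key, and a hypothetical do_work() coroutine:

```python
import asyncpg

TASK_LOCK_ID = 42  # arbitrary application-wide key identifying this task

async def run_if_holder(dsn: str) -> None:
    conn = await asyncpg.connect(dsn)
    try:
        # non-blocking: returns False if another session holds the lock
        got = await conn.fetchval("SELECT pg_try_advisory_lock($1)", TASK_LOCK_ID)
        if not got:
            return  # another instance is running the task; skip
        try:
            await do_work()  # hypothetical coroutine doing the actual work
        finally:
            await conn.execute("SELECT pg_advisory_unlock($1)", TASK_LOCK_ID)
    finally:
        # advisory locks are session-scoped, so closing the connection
        # also releases the lock if the unlock call never ran
        await conn.close()
```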

That's why there are systems like Celery that let you schedule tasks and take care of preventing, where possible, the same task from being executed multiple times. Note that this is not always possible: the executor may, for example, finish the task but never record its completion because of a fatal error or some other interruption, like a power loss. That's why such systems can guarantee either that a task runs at least once or that it runs at most once, but never both; they can only do their best to maximize the chances of exactly-once execution.
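As a rough illustration of what that looks like, here is a minimal Celery beat sketch (the Redis broker URL, the "tasks" module name, and do_work() are assumptions): a single beat process enqueues each run once, and exactly one available worker picks it up.

```python
from celery import Celery

app = Celery("tasks", broker="redis://localhost:6379/0")

@app.task
def scheduled_task():
    do_work()  # hypothetical function doing the actual work

app.conf.beat_schedule = {
    "run-scheduled-task": {
        "task": "tasks.scheduled_task",
        "schedule": 60.0,  # every 60 seconds
    },
}
```

Run one `celery beat` process alongside your workers; because only beat produces the schedule, each tick results in a single task message regardless of how many workers consume the queue.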