Running multiple workers using Celery

I need to read from RabbitMQ and execute tasks in parallel using Celery on a single system.

[2014-12-30 15:54:22,374: INFO/Worker-1] ...   
[2014-12-30 15:54:23,401: INFO/Worker-1] ...
[2014-12-30 15:54:30,878: INFO/Worker-1] ...
[2014-12-30 15:54:32,209: INFO/Worker-1] ...
[2014-12-30 15:54:33,255: INFO/Worker-1] ...
[2014-12-30 15:54:48,445: INFO/Worker-1] ...
[2014-12-30 15:54:49,811: INFO/Worker-1] ...
[2014-12-30 15:54:50,903: INFO/Worker-1] ...
[2014-12-30 15:55:39,674: INFO/Worker-1] ...
[2014-12-30 15:55:41,024: INFO/Worker-1] ...
[2014-12-30 15:55:42,147: INFO/Worker-1] ...

It seems only one worker process is running all the time, i.e. tasks execute one after another in sequential order. How can I configure Celery to run multiple worker processes in parallel?


I have now updated my answer following the comment from MartinP regarding the worker spawning child processes, not threads:

A Celery worker and its worker processes are different things (read this for reference).

When a worker starts, it spawns a certain number of child processes.

By default, the number of those processes equals the number of cores on that machine.

On Linux you can check the number of cores via:

$ nproc --all
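If you would rather check from Python (for example, to decide a `--concurrency` value programmatically), the standard library exposes the same count; a minimal sketch:

```python
import os

# os.cpu_count() returns the number of logical CPUs, or None if it
# cannot be determined, hence the fallback to 1.
cores = os.cpu_count() or 1
print(cores)
```

On Linux this matches the figure reported by `nproc --all`.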

Otherwise you can specify it yourself, e.g.:

$ celery -A proj worker --loglevel=INFO --concurrency=2

In the above example there is one worker, which will be able to spawn 2 child processes. It is normally advised to run a single worker per machine, with the concurrency value defining how many processes run in parallel. But if you do need to run multiple workers, you can start them as shown below:

$ celery -A proj worker -l info --concurrency=4 -n wkr1@hostname
$ celery -A proj worker -l info --concurrency=2 -n wkr2@hostname
$ celery -A proj worker -l info --concurrency=2 -n wkr3@hostname

Refer to the Celery docs for more info.