Yuan Cheng | 22 May 04:33 2015

I am a developer from China and I would like to add a Chinese translation of Celery's documentation. How can I start?

Any thoughts?

Rohit Atri | 21 May 08:57 2015

Qpid support

Hey folks,

Anyone here using Celery with Apache Qpid? If so, could you share some insights on your experience so far?

Also, what is the maturity level of Qpid support in 3.1 (compared to RabbitMQ, of course)?

Kombu docs have a big disclaimer on top - "This transport should be used with caution due to a known potential deadlock. See Issue 2199 for more details."

Thanks,
Rohit

--
You received this message because you are subscribed to the Google Groups "celery-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to celery-users+unsubscribe-/JYPxA39Uh5TLH3MbocFFw@public.gmane.org.
To post to this group, send email to celery-users-/JYPxA39Uh5TLH3MbocFFw@public.gmane.org.
Visit this group at http://groups.google.com/group/celery-users.
For more options, visit https://groups.google.com/d/optout.
Dharmit Shah | 20 May 10:23 2015

Django Celery cannot connect to remote RabbitMQ on EC2

Hi,

I am using Celery with Django in my project and am trying to use RabbitMQ as the message queue. A locally running RabbitMQ works well, but when I try to use a remotely installed RabbitMQ, it fails.

My setup is:

* Django and celery running on local system (laptop)
* RabbitMQ running in cluster mode on EC2
* Both RabbitMQ cluster nodes show proper output when I execute commands like:

$ rabbitmqctl cluster_status
$ rabbitmqctl list_users


For further details, please check this Stack Overflow question that I posted.

I am not sure whether this issue is related to Celery or RabbitMQ, so I am posting to the mailing lists for both.
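
Before digging into Celery itself, it may help to confirm that the laptop can reach the EC2 node's AMQP port at all (security groups commonly block 5672, and since RabbitMQ 3.3 the default guest user may only connect from localhost). A minimal stdlib sketch, with a placeholder hostname:

```python
# Broker-independent connectivity check: can we open a TCP connection
# to the remote AMQP port at all?  The hostname below is a placeholder.
import socket

def broker_reachable(host, port=5672, timeout=5):
    """Return True if a TCP connection to host:port can be opened."""
    try:
        sock = socket.create_connection((host, port), timeout=timeout)
        sock.close()
        return True
    except OSError:
        return False

# broker_reachable('ec2-xx-xx-xx-xx.compute-1.amazonaws.com')
```

If this returns False, the problem is networking (security group, VPC, firewall) rather than Celery configuration.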

Regards,
Dharmit.

陈小生 | 19 May 07:55 2015

Celery runs tasks multiple times

Celery version: 3.0.15

Without CELERY_ACKS_LATE:

    Celery node A receives task T1, but because node A is busy, T1 stays <PENDING>.

    After 3600 seconds, Celery node B receives T1 again and runs it.

    When node A becomes free, it runs T1 as well.

So I changed the settings to:

    CELERY_ACKS_LATE = True
    CELERYD_PREFETCH_MULTIPLIER = 1

This time node A receives T1 and starts running it, but T1 is a long-running task. After T1 has been <STARTED> for more than 3600 seconds, node B receives T1 and runs it too.

How can I disable this behaviour or make the *3600 seconds* longer?
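
If the broker behind these nodes is Redis (the post does not say), the exact 3600-second redelivery interval matches the redis transport's default `visibility_timeout` of one hour: with late acks, any task left unacknowledged for longer than that is redelivered to another worker. A settings sketch under that assumption:

```python
# Assumes a Redis (or SQS) broker: raise the visibility timeout above
# the longest expected task runtime so unacked-but-running tasks are
# not redelivered to another worker.  The value is in seconds.
BROKER_TRANSPORT_OPTIONS = {'visibility_timeout': 43200}  # 12 hours
```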

Thanks.

Mazzaroth M. | 19 May 06:00 2015

schedule celery beat to perform tasks within a time range AND queue tasks at specific scheduled times

Hi, I'd like to do three things with celery beat:

1) Schedule celery beat to perform a task at a specific time or within a range (e.g. run a Celery task every 10 minutes, every day, between 11AM and 8PM)
2) Perform a series of tasks, each at a specific time today (e.g. today at 11AM, today at 12:15PM, today at 7:30PM)
3) Perform a series of tasks, each at a specific time, every day

What API should I look at to do these things? So far I am able to use celery beat to "do something every x minutes".
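
For cases 1 and 3, `celery.schedules.crontab` entries in CELERYBEAT_SCHEDULE cover it; a strictly one-off run like case 2 is usually expressed with `apply_async(eta=...)` instead, since beat schedules recur. A configuration sketch with a hypothetical task name:

```python
# Sketch, assuming a task registered as 'tasks.report' (hypothetical).
from celery.schedules import crontab

CELERYBEAT_SCHEDULE = {
    # 1) every 10 minutes, every day, during the 11:00-19:50 window
    'every-10-min-daytime': {
        'task': 'tasks.report',
        'schedule': crontab(minute='*/10', hour='11-19'),
    },
    # 3) fixed times, repeated every day (one entry per time)
    'daily-12-15': {
        'task': 'tasks.report',
        'schedule': crontab(minute=15, hour=12),
    },
    'daily-19-30': {
        'task': 'tasks.report',
        'schedule': crontab(minute=30, hour=19),
    },
}
```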

regards,
Mazzaroth

Bertrand | 15 May 21:01 2015

tasks blocking on writing results to redis

Dear Celery Users,

I'm having an issue where tasks executed on a remote server sometimes hang once the actual computation is complete and the task is about to write out its result and/or status. Sometimes they stay in this state for many minutes and eventually record their status (success); sometimes they stay like this indefinitely. I'm using Redis as the result backend, and the worker connects to the Redis server through a VPN (which seems to be causing issues, from what I've read in previous posts). I've issued a SIGUSR1 signal to one of the stalling processes; the trace is below.

Any ideas on how to solve or debug this issue further?

Many thanks in advance!

celery==3.1.17
redis client==2.10.3
redis server==2.8.4

INTERNAL ERROR: SoftTimeLimitExceeded()
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 283, in trace_task
    uuid, retval, SUCCESS, request=task_request,
  File "/usr/local/lib/python2.7/dist-packages/celery/backends/base.py", line 256, in store_result
    request=request, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/celery/backends/base.py", line 486, in _store_result
    self.set(self.get_key_for_task(task_id), self.encode(meta))
  File "/usr/local/lib/python2.7/dist-packages/celery/backends/redis.py", line 160, in set
    return self.ensure(self._set, (key, value), **retry_policy)
  File "/usr/local/lib/python2.7/dist-packages/celery/backends/redis.py", line 149, in ensure
    **retry_policy
  File "/usr/local/lib/python2.7/dist-packages/kombu/utils/__init__.py", line 243, in retry_over_time
    return fun(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/celery/backends/redis.py", line 169, in _set
    pipe.execute()
  File "/usr/local/lib/python2.7/dist-packages/redis/client.py", line 2578, in execute
    return execute(conn, stack, raise_on_error)
  File "/usr/local/lib/python2.7/dist-packages/redis/client.py", line 2455, in _execute_transaction
    self.parse_response(connection, '_')
  File "/usr/local/lib/python2.7/dist-packages/redis/client.py", line 2536, in parse_response
    self, connection, command_name, **options)
  File "/usr/local/lib/python2.7/dist-packages/redis/client.py", line 577, in parse_response
    response = connection.read_response()
  File "/usr/local/lib/python2.7/dist-packages/redis/connection.py", line 569, in read_response
    response = self._parser.read_response()
  File "/usr/local/lib/python2.7/dist-packages/redis/connection.py", line 224, in read_response
    response = self._buffer.readline()
  File "/usr/local/lib/python2.7/dist-packages/redis/connection.py", line 162, in readline
    self._read_from_socket()
  File "/usr/local/lib/python2.7/dist-packages/redis/connection.py", line 120, in _read_from_socket
    data = self._sock.recv(socket_read_size)
  File "/usr/local/lib/python2.7/dist-packages/billiard/pool.py", line 229, in soft_timeout_sighandler
    raise SoftTimeLimitExceeded()
SoftTimeLimitExceeded: SoftTimeLimitExceeded()
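
The bottom frames of the trace show the worker was blocked inside `self._sock.recv(...)`, i.e. waiting for a Redis reply on a connection that may have silently died along with the VPN tunnel. A standard-library sketch of why such a read hangs forever without a timeout (redis-py exposes the fix as the client's `socket_timeout` argument):

```python
# Stdlib illustration of the failure mode: a blocking recv() on a dead
# connection never returns unless the socket has a timeout set.
import socket

def read_with_timeout(sock, timeout, size=4096):
    """recv() that raises socket.timeout instead of hanging forever."""
    sock.settimeout(timeout)
    return sock.recv(size)
```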


Antoine Brunel | 15 May 19:13 2015

Python 2.7.8 Celery 3.1.18 + Django 1.8.1: Periodic tasks not firing

Hello,

Since I am a bit of a newbie with Django and Celery, I used almost all the code from the Celery example Django project, except that I enabled the Django admin.
However, I have the following problem: from the Django admin, I create a crontab that fires every minute and associate it with a periodic task that runs debug_task. But nothing fires!

My settings.py is slightly different:
1- I use a RabbitMQ server, so I added to settings.py:
BROKER_URL = 'amqp://guest:guest <at> localhost:5672/'
CELERY_RESULT_BACKEND='djcelery.backends.database:DatabaseBackend'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'

2- In INSTALLED_APPS, I also enabled the Django admin interface and added djcelery, but commented out the kombu Django transport since I use RabbitMQ:
INSTALLED_APPS = (
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.sites',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    #'kombu.transport.django.KombuAppConfig',
    'demoapp',
    'djcelery',
    # Uncomment the next line to enable the admin:
    'django.contrib.admin',
    # Uncomment the next line to enable admin documentation:
    #'django.contrib.admindocs',
)

urls.py was also modified to load the admin:
from django.contrib import admin
urlpatterns = [
  url(r'^admin/', include(admin.site.urls)),
]

I run the RabbitMQ server, then celery beat:
% python manage.py celery beat -l info
/Users/antoinebrunel/seo/lib/python2.7/site-packages/kombu/utils/__init__.py:407: UserWarning: Module argparse was already imported from /usr/local/Cellar/python/2.7.8/Frameworks/Python.framework/Versions/2.7/lib/python2.7/argparse.pyc, but /Users/antoinebrunel/seo/lib/python2.7/site-packages is being added to sys.path
  from pkg_resources import iter_entry_points
/Users/antoinebrunel/seo/lib/python2.7/site-packages/django/core/management/base.py:259: RemovedInDjango19Warning: "requires_model_validation" is deprecated in favor of "requires_system_checks".
  RemovedInDjango19Warning)
celery beat v3.1.18 (Cipater) is starting.
__    -    ... __   -        _
Configuration ->
    . broker -> amqp://guest:** <at> localhost:5672//
    . loader -> celery.loaders.app.AppLoader
    . scheduler -> celery.beat.PersistentScheduler
    . db -> celerybeat-schedule
    . logfile -> [stderr] <at> %INFO
    . maxinterval -> now (0s)
[2015-05-15 17:10:23,318: INFO/MainProcess] beat: Starting...
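
One detail visible in the beat output above: `scheduler -> celery.beat.PersistentScheduler` means beat is reading the local celerybeat-schedule file, while periodic tasks created in the Django admin live in djcelery's database tables, which only the database scheduler reads. A settings sketch for selecting it:

```python
# settings.py sketch: make beat read the schedules defined in the admin.
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'

# Equivalent command-line form:
#   python manage.py celery beat -S djcelery.schedulers.DatabaseScheduler -l info
```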

Then the celery worker
python manage.py celeryd -E -B -l info
/Users/antoinebrunel/seo/lib/python2.7/site-packages/kombu/utils/__init__.py:407: UserWarning: Module argparse was already imported from /usr/local/Cellar/python/2.7.8/Frameworks/Python.framework/Versions/2.7/lib/python2.7/argparse.pyc, but /Users/antoinebrunel/seo/lib/python2.7/site-packages is being added to sys.path
  from pkg_resources import iter_entry_points
/Users/antoinebrunel/seo/lib/python2.7/site-packages/django/core/management/base.py:259: RemovedInDjango19Warning: "requires_model_validation" is deprecated in favor of "requires_system_checks".
  RemovedInDjango19Warning)
 
 -------------- celery-6MxwUkJSksV7e7IE9lEbeQ@public.gmane.org v3.1.18 (Cipater)
---- **** ----- 
--- * ***  * -- Darwin-14.3.0-x86_64-i386-64bit
-- * - **** --- 
- ** ---------- [config]
- ** ---------- .> app:         proj:0x104b60a90
- ** ---------- .> transport:   amqp://guest:** <at> localhost:5672//
- ** ---------- .> results:     djcelery.backends.database:DatabaseBackend
- *** --- * --- .> concurrency: 2 (prefork)
-- ******* ---- 
--- ***** ----- [queues]
 -------------- .> celery           exchange=celery(direct) key=celery
                
[tasks]
  . demoapp.tasks.add
  . demoapp.tasks.mul
  . demoapp.tasks.xsum
  . proj.celery.debug_task
[2015-05-15 17:09:48,872: INFO/Beat] beat: Starting...
[2015-05-15 17:09:48,930: INFO/MainProcess] Connected to amqp://guest:** <at> 127.0.0.1:5672//
[2015-05-15 17:09:48,961: INFO/MainProcess] mingle: searching for neighbors
[2015-05-15 17:09:49,986: INFO/MainProcess] mingle: all alone
/Users/antoinebrunel/seo/lib/python2.7/site-packages/celery/fixups/django.py:265: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn('Using settings.DEBUG leads to a memory leak, never '
[2015-05-15 17:09:50,015: WARNING/MainProcess] /Users/antoinebrunel/seo/lib/python2.7/site-packages/celery/fixups/django.py:265: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn('Using settings.DEBUG leads to a memory leak, never '
[2015-05-15 17:09:50,015: WARNING/MainProcess] celery-6MxwUkJSksV7e7IE9lEbeQ@public.gmane.org ready.

Then django dev webserver
% python manage.py runserver 
/Users/antoinebrunel/seo/lib/python2.7/site-packages/kombu/utils/__init__.py:407: UserWarning: Module argparse was already imported from /usr/local/Cellar/python/2.7.8/Frameworks/Python.framework/Versions/2.7/lib/python2.7/argparse.pyc, but /Users/antoinebrunel/seo/lib/python2.7/site-packages is being added to sys.path
  from pkg_resources import iter_entry_points
/Users/antoinebrunel/seo/lib/python2.7/site-packages/kombu/utils/__init__.py:407: UserWarning: Module argparse was already imported from /usr/local/Cellar/python/2.7.8/Frameworks/Python.framework/Versions/2.7/lib/python2.7/argparse.pyc, but /Users/antoinebrunel/seo/lib/python2.7/site-packages is being added to sys.path
  from pkg_resources import iter_entry_points
Performing system checks...
System check identified no issues (0 silenced).
May 15, 2015 - 17:08:41
Django version 1.8.1, using settings 'proj.settings'
Starting development server at http://127.0.0.1:8000/
Quit the server with CONTROL-C.

Then I run celerycam
% python manage.py celerycam
/Users/antoinebrunel/seo/lib/python2.7/site-packages/kombu/utils/__init__.py:407: UserWarning: Module argparse was already imported from /usr/local/Cellar/python/2.7.8/Frameworks/Python.framework/Versions/2.7/lib/python2.7/argparse.pyc, but /Users/antoinebrunel/seo/lib/python2.7/site-packages is being added to sys.path
  from pkg_resources import iter_entry_points
/Users/antoinebrunel/seo/lib/python2.7/site-packages/django/core/management/base.py:259: RemovedInDjango19Warning: "requires_model_validation" is deprecated in favor of "requires_system_checks".
  RemovedInDjango19Warning)
-> evcam: Taking snapshots with djcelery.snapshot.Camera (every 1.0 secs.)
[2015-05-15 17:06:01,159: INFO/MainProcess] Connected to amqp://guest:** <at> 127.0.0.1:5672//
 

So, what am I doing wrong? Why don't I see the events firing?

Thank you for your help

Dharmit Shah | 15 May 08:52 2015

Celery + Django with Redis cluster

Hello,

I am new to using Celery and Redis, so I might be missing something really simple.

I have set up a Redis cluster and want to use it as the broker for Celery in my Django project. The project's settings.py has the following configuration for Celery:

import djcelery
djcelery.setup_loader()

BROKER_BACKEND = "redis"
BROKER_HOST = "127.0.0.1"
BROKER_PORT = "7000"

CELERY_RESULT_BACKEND = 'redis'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_IMPORTS = ('api.tasks',)

However, whenever I start the celery worker, I get the error below:

python manage.py celery worker --loglevel=INFO
/home/dharmit/rest_api/venv/local/lib/python2.7/site-packages/celery/app/defaults.py:251: CPendingDeprecationWarning: 
    The 'BROKER_HOST' setting is scheduled for deprecation in     version 2.5 and removal in version v4.0.     Use the BROKER_URL setting instead

  alternative='Use the {0.alt} instead'.format(opt))

/home/dharmit/rest_api/venv/local/lib/python2.7/site-packages/celery/app/defaults.py:251: CPendingDeprecationWarning: 
    The 'BROKER_PORT' setting is scheduled for deprecation in     version 2.5 and removal in version v4.0.     Use the BROKER_URL setting instead

  alternative='Use the {0.alt} instead'.format(opt))

 
 -------------- celery <at> ubuntu v3.1.18 (Cipater)
---- **** ----- 
--- * ***  * -- Linux-3.19.0-16-generic-x86_64-with-Ubuntu-15.04-vivid
-- * - **** --- 
- ** ---------- [config]
- ** ---------- .> app:         default:0x7f2ea80aa510 (djcelery.loaders.DjangoLoader)
- ** ---------- .> transport:   redis://127.0.0.1:7000//
- ** ---------- .> results:     redis
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- 
--- ***** ----- [queues]
 -------------- .> celery           exchange=celery(direct) key=celery
                

[tasks]
  . api.tasks.send_verification_mail
  . api.tasks.store_workouts_from_list

[2015-05-15 06:46:42,450: ERROR/MainProcess] Unrecoverable error: ResponseError('MOVED 14663 127.0.0.1:7002',)
Traceback (most recent call last):
  File "/home/dharmit/rest_api/venv/local/lib/python2.7/site-packages/celery/worker/__init__.py", line 206, in start
    self.blueprint.start(self)
  File "/home/dharmit/rest_api/venv/local/lib/python2.7/site-packages/celery/bootsteps.py", line 123, in start
    step.start(parent)
  File "/home/dharmit/rest_api/venv/local/lib/python2.7/site-packages/celery/bootsteps.py", line 374, in start
    return self.obj.start()
  File "/home/dharmit/rest_api/venv/local/lib/python2.7/site-packages/celery/worker/consumer.py", line 278, in start
    blueprint.start(self)
  File "/home/dharmit/rest_api/venv/local/lib/python2.7/site-packages/celery/bootsteps.py", line 123, in start
    step.start(parent)
  File "/home/dharmit/rest_api/venv/local/lib/python2.7/site-packages/celery/worker/consumer.py", line 478, in start
    c.connection = c.connect()
  File "/home/dharmit/rest_api/venv/local/lib/python2.7/site-packages/celery/worker/consumer.py", line 378, in connect
    conn.transport.register_with_event_loop(conn.connection, self.hub)
  File "/home/dharmit/rest_api/venv/local/lib/python2.7/site-packages/kombu/transport/redis.py", line 932, in register_with_event_loop
    cycle.on_poll_init(loop.poller)
  File "/home/dharmit/rest_api/venv/local/lib/python2.7/site-packages/kombu/transport/redis.py", line 308, in on_poll_init
    num=channel.unacked_restore_limit,
  File "/home/dharmit/rest_api/venv/local/lib/python2.7/site-packages/kombu/transport/redis.py", line 183, in restore_visible
    self.unacked_mutex_expire):
  File "/usr/lib/python2.7/contextlib.py", line 17, in __enter__
    return self.gen.next()
  File "/home/dharmit/rest_api/venv/local/lib/python2.7/site-packages/kombu/transport/redis.py", line 107, in Mutex
    i_won = client.setnx(name, lock_id)
  File "/home/dharmit/rest_api/venv/local/lib/python2.7/site-packages/redis/client.py", line 1080, in setnx
    return self.execute_command('SETNX', name, value)
  File "/home/dharmit/rest_api/venv/local/lib/python2.7/site-packages/redis/client.py", line 565, in execute_command
    return self.parse_response(connection, command_name, **options)
  File "/home/dharmit/rest_api/venv/local/lib/python2.7/site-packages/redis/client.py", line 577, in parse_response
    response = connection.read_response()
  File "/home/dharmit/rest_api/venv/local/lib/python2.7/site-packages/redis/connection.py", line 574, in read_response
    raise response
ResponseError: MOVED 14663 127.0.0.1:7002


Can someone please help me get past this error? Or is it not yet possible to use a Redis cluster setup with Celery?
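
For what it's worth, MOVED is the redirection reply a Redis Cluster node sends when a key hashes to a slot served by a different node; kombu's redis transport talks to a single node and, as of kombu 3.x, does not follow these redirections. A workaround sketch with placeholder addresses, pointing Celery at a standalone (non-cluster) Redis:

```python
# Workaround sketch (placeholder host/port): use a standalone, non-cluster
# Redis instance as broker and result backend, since kombu's redis
# transport speaks to one node only and cannot follow MOVED replies.
BROKER_URL = 'redis://127.0.0.1:6379/0'
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/0'
```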

Regards,
Dharmit.

傅左右 | 15 May 06:00 2015

Can celery work with Azure Service Bus?

Hi all,

I heard that Celery can use the AMQP protocol with RabbitMQ, and Azure Service Bus also supports AMQP, so can I use Celery with Azure Service Bus?

I'm new to Celery and couldn't figure this out from Google or the documentation. Has anyone tried it?

PS: ActiveMQ also supports AMQP but is not supported by Celery; does anyone know the reason?

Many thanks.

celery_beginner | 14 May 02:36 2015

Celery: list all tasks, scheduled, active and finished

I know that via inspect one can get a list of active and scheduled tasks. Is there a way to get a list of all tasks, including active, scheduled, and finished ones? I know that Celery Flower can do that, but I need to embed this in a monitoring script and wanted to know if there is a way/command to call to get the full list of tasks.
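
On the inspect side: `active()`, `scheduled()` and `reserved()` each return a mapping of worker name to task list (or None when no worker replies), and finished tasks are not retained by the workers at all — they are only visible via the event stream (which is what Flower consumes) or a result backend. A small sketch of a monitoring helper, assuming a reachable worker:

```python
# Sketch of a monitoring helper; it only covers what workers still know
# about (active/scheduled/reserved).  Finished tasks need events or a
# result backend.
def merge_task_lists(*replies):
    """Flatten {worker: [task, ...]} inspect replies into one list."""
    tasks = []
    for reply in replies:
        for worker_tasks in (reply or {}).values():
            tasks.extend(worker_tasks)
    return tasks

# Usage (requires a running worker; `app` is your Celery instance):
#   i = app.control.inspect()
#   known = merge_task_lists(i.active(), i.scheduled(), i.reserved())
```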

tyler | 13 May 20:33 2015

celery multi broker defaulting to localhost

We are using celery 3.1.17 with Django and django-celery. 

Our workers properly detect the amqp host from settings when running with the worker command:

/path/to/venv/bin/python /path/to/django/project/manage.py celery worker

... <snip>
 -------------- celery <at> <hostname> v3.1.17 (Cipater)
---- **** ----- 
--- * ***  * -- Linux-3.13.0-52-generic-x86_64-with-Ubuntu-14.04-trusty
-- * - **** --- 
- ** ---------- [config]
- ** ---------- .> app:         default:0x7fc3861cbc50 (djcelery.loaders.DjangoLoader)
- ** ---------- .> transport:   amqp://guest:** <at> 10.50.147.180:5672//


However, when we use multi, none of the Django settings are imported and the broker defaults to localhost. The command we are testing with is:

/path/to/venv/bin/python /path/to/django/project/manage.py celery multi start <name>
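
For reference, `celery multi` builds and detaches plain `celery worker` commands, so worker options can be passed through on the multi command line. A hedged workaround sketch, not verified against this setup (`<name>` and the credentials are placeholders; `--broker` is the standard worker option):

```shell
# Pass the broker URL explicitly so the detached workers started by
# multi cannot fall back to the localhost default.
/path/to/venv/bin/python /path/to/django/project/manage.py celery multi start <name> \
    --broker='amqp://user:password@10.50.147.180:5672//'
```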


Any suggestions on what the problem might be and how to fix it would be greatly appreciated. 

Cheers,
Tyler
