noblomov | 4 May 15:42 2011

CELERY_ALWAYS_EAGER setting

Hi,

I've set up celery within my Django project, using djcelery and kombu
as a transport.

I'm able to fire tasks, but they seem to be blocking (like launched
with xx.apply() instead of xx.apply_async()).

Thus, using xx.apply_async(countdown=10) for example, my program hangs
for 10s before going on.

From what I've read I should be able to set CELERY_ALWAYS_EAGER =
False somewhere, but I can't find where.

Any help would be appreciated...

Thanks,

N.

zvikico | 4 May 16:44 2011

Working with a broker (RabbitMQ) cluster

Hi,


To achieve high availability and load balancing, I'm using a RabbitMQ cluster for my Celery configuration. Setting up the cluster is fairly simple. However, when it comes to connecting to the cluster from Celery, I'm missing something. Celery takes a single broker address, but in my case I have several addresses, all interchangeable and equivalent thanks to the cluster setup. If I specify just one server in my Celery configuration, I lose the key features of the cluster.

A possible solution is to set up a load balancer (LB) in front of the RabbitMQ cluster and point Celery at the LB address instead of referencing the servers directly. This takes some effort to set up, especially if one wants to eliminate the LB as a single point of failure.

A much simpler solution would be to provide an array of addresses to Celery. Celery would pick an address from the list (possibly rotating through them round-robin) and use it; on a connection timeout, it would try the next address. A sketch of the idea follows.
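
To illustrate, here is a minimal sketch in plain Python (stdlib only; the host names, and the probe-at-startup approach itself, are just an illustration, not an existing Celery feature) of picking the first reachable node before handing its address to Celery:

    import socket

    # Hypothetical cluster nodes; all equivalent thanks to RabbitMQ clustering.
    BROKER_HOSTS = ["rabbit1.example.com", "rabbit2.example.com", "rabbit3.example.com"]
    BROKER_PORT = 5672  # default AMQP port

    def pick_broker(hosts, port, timeout=2.0):
        """Return the first host that accepts a TCP connection."""
        for host in hosts:
            try:
                sock = socket.create_connection((host, port), timeout)
                sock.close()
                return host
            except socket.error:
                continue  # node unreachable, try the next one
        raise RuntimeError("no RabbitMQ node reachable")

    BROKER_HOST = pick_broker(BROKER_HOSTS, BROKER_PORT)

This only selects a node at startup, of course; true round-robin and failover on timeout would have to live inside Celery's connection handling, which is exactly what I'm proposing.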

If there's a better solution, I'll be happy to know.

Thanks,
Zviki


Antoni Aloy | 4 May 21:04 2011

Re: CELERY_ALWAYS_EAGER setting

2011/5/4 noblomov <nicomech@...>:
> Hi,
>
> I've set up celery within my Django project, using djcelery and kombu
> as a transport.
>
> I'm able to fire tasks, but they seem to be blocking (like launched
> with xx.apply() instead of xx.apply_async()).
>
> Thus, using xx.apply_async(countdown=10) for example, my program hangs
> for 10s before going on.
>
> From what I've read I should be able to set CELERY_ALWAYS_EAGER =
> False somewhere, but I can't find where.
>

Put it in the settings file. But as far as I know it should be False by default.
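
For a Django project using djcelery that means the project's settings.py, for example:

    # settings.py
    CELERY_ALWAYS_EAGER = False  # dispatch tasks to the worker instead of running them inline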

-- 
Antoni Aloy López
Blog: http://trespams.com
Site: http://apsl.net


seanbollin | 5 May 03:07 2011

Celery Concurrency

Hi, I'm running celery with the following two commands:

python /var/www/djvenom/manage.py celeryd -B --loglevel=DEBUG -Q
google_tasks --concurrency=0 -n worker1.leftronic.com

python /var/www/djvenom/manage.py celeryd --loglevel=INFO -Q
google_tasks --concurrency=0 -n worker2.leftronic.com

Why is it that I set concurrency to 0 and still see *7* processes
running:

10230 python /var/www/djvenom/manage.py celeryd -B --loglevel=DEBUG -Q
google_tasks --concurrency=0 -n worker1.leftronic.com
10239 python /var/www/djvenom/manage.py celeryd -B --loglevel=DEBUG -Q
google_tasks --concurrency=0 -n worker1.leftronic.com
10240 python /var/www/djvenom/manage.py celeryd -B --loglevel=DEBUG -Q
google_tasks --concurrency=0 -n worker1.leftronic.com
10247 python /var/www/djvenom/manage.py celeryd -B --loglevel=DEBUG -Q
google_tasks --concurrency=0 -n worker1.leftronic.com
10250 python /var/www/djvenom/manage.py celeryd --loglevel=INFO -Q
google_tasks --concurrency=0 -n worker2.leftronic.com
10258 python /var/www/djvenom/manage.py celeryd --loglevel=INFO -Q
google_tasks --concurrency=0 -n worker2.leftronic.com
10259 python /var/www/djvenom/manage.py celeryd --loglevel=INFO -Q
google_tasks --concurrency=0 -n worker2.leftronic.com

dmitry b | 5 May 23:13 2011

Re: Celery Concurrency

I imagine that's because 0 isn't a valid concurrency value, so celery
just reverts to the default (7).  What you probably want is a
concurrency of 1, i.e. "not concurrent" or serial.
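
As far as I know, when --concurrency is missing or invalid, celeryd's pool size defaults to the number of CPU cores on the machine, which you can check with:

    import multiprocessing
    print(multiprocessing.cpu_count())  # celeryd's default pool size

Also keep in mind that the process list includes the main celeryd process itself (plus the embedded beat when -B is given), so the total process count is usually the pool size plus one or more.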

> Why is it that I set concurrency to 0, and still see *7* processes
> running:

dmitry b | 5 May 23:16 2011

Re: CELERY_ALWAYS_EAGER setting

> Thus, using xx.apply_async(countdown=10) for example, my program hangs
> for 10s before going on.

But is it running tasks asynchronously or not? It could be hanging for
a number of reasons.  For example, it may take 10s to establish a
connection to the broker.
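
One quick way to tell, sketched here with a hypothetical task named mytask: if dispatch really is asynchronous, apply_async() should return an AsyncResult almost immediately, long before the countdown expires:

    import time

    start = time.time()
    result = mytask.apply_async(countdown=10)  # mytask is a placeholder
    print(time.time() - start)  # a small fraction of a second if truly async
    print(result.task_id)       # the id is assigned right away

If the first print shows something close to 10s, the dispatch itself is blocking rather than the task merely being delayed.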

dmitry b | 5 May 23:20 2011

Re: Tasks not refreshed when altered on disk

Though, if you set a limit on how many tasks a worker can process
before being terminated (--maxtasksperchild), a newly spawned worker may
start with the new code.
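
For example (the limit of 10 is arbitrary, and the option needs a Celery version that supports it):

    python manage.py celeryd --maxtasksperchild=10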

On Apr 28, 12:44 am, splatEric <m...@...> wrote:
> You will have to restart celeryd, it doesn't have auto reload.
>
> On Apr 27, 11:58 pm, Thomas Weholt <thomas.weh...@...> wrote:
>
> > A simple question; when a task defined in a file is changed on disk
> > while celeryd is running, will celery still use the old definition of
> > the task? Can I somehow force a reload of the code?
>
> > --
> > Mvh/Best regards,
> > Thomas Weholt
> > http://www.weholt.org


seanbollin | 5 May 23:34 2011

Re: Celery Concurrency

The value of 0 is an odd test value I threw in there.  I still have
similar results with a concurrency of 1.

On May 5, 2:13 pm, dmitry b <dmitry.ma...@...> wrote:
> I imagine that's because 0 isn't a valid concurrency value, so celery
> just reverts to the default (7).  What you probably want is the
> concurrency of 1, i.e. - "not concurrent' or serial.
>
> > Why is it that I set concurrency to 0, and still see *7* processes
> > running:


Ryan | 9 May 21:49 2011

Celery tasks not being picked up

I'm using Celery in a Pyramid app running through Paster. One of the classes has a number of methods that are used as Celery tasks. Supervisor is used to run celeryd. All works fine.


I also have a script I'd like to run from the command line, or from a Paster shell, that calls Celery tasks on instances of the class referred to above. When I do, however, I get back an AsyncResult object for a task that never seems to execute. The task is never received by Celery and never shows up in the logs, even though celeryd and RabbitMQ are running.

From inside the script, when I run...

        from celery.loaders import load_settings

        settings = load_settings()  # avoid rebinding the name "celery", which shadows the module
        print(settings)

... the settings from celeryconfig.py that celeryd is running under are not there. Since celeryd is running, I thought the settings would be present and the tasks would be picked up. Apparently not.

Note: this script is not included in CELERY_IMPORTS, but I don't think that's necessary because the class that has the celery tasks is. 

Any insights into what I am missing? Thanks.
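
One idea I've been meaning to try: as far as I understand, Celery's default loader reads its configuration from the module named by the CELERY_CONFIG_MODULE environment variable (falling back to "celeryconfig"), so pointing the script at the same config celeryd uses, before any tasks are imported, might do it. A rough sketch, where myapp.tasks and mytask stand in for my own module and task:

        import os
        os.environ.setdefault("CELERY_CONFIG_MODULE", "celeryconfig")

        from myapp.tasks import mytask  # import tasks only after the config is set
        result = mytask.delay()
        print(result.task_id)

Would that be the right approach?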

erikcw | 9 May 07:28 2011

PeriodicTasks automatically execute when celery beat is restarted?

Hi,

I'm using the periodic_task decorator as a cron scheduler in my Django
app.  When I restart celerybeat, all of my tasks run immediately, even
if they are not scheduled to do so that day.  For example, I have the
following task:

from celery.schedules import crontab
from celery.task import periodic_task

@periodic_task(run_every=crontab(minute=0, hour=10, day_of_week="mon"))
def myfunction():
    pass

If I restart celery using the init.d script (which runs ./manage.py
celeryd -v 2 -B -s celery -E -l DEBUG -f /var/log/celeryd.log -l
INFO), my periodic task is executed, even if it isn't anywhere near
Monday at 10am.  What am I missing?
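
One thing I notice: -s celery is a relative path, so the schedule file that celerybeat uses to persist last-run times ends up wherever the init script's working directory happens to be. If a restart can't find the old file, beat starts from a fresh schedule, which might explain entries firing early. I could try an absolute path instead, for example:

    ./manage.py celeryd -B -s /var/lib/celery/celerybeat-schedule -E -l INFO -f /var/log/celeryd.log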

Thanks!
Erik

