Hi, I have chained two tasks using the `chain` signature, with the first task's return value as the (only) argument to the second task. However, my syslogs show that only the first task is ever accepted and executed by Celery; the second task never shows up. Not exactly sure what I'm doing wrong.
This is what it looks like: http://pasted.co/3403c431 (pretty simple, two lines of code). I tried the pipe syntax as well, but to no avail.
notanormalnerd
Hello everyone. I have a question regarding task registration.
VaticanCameos
Twitter: crank7
notanormalnerd
Currently, when starting up a worker, all tasks in my project are registered with that worker, even if they live in some required dependency.
Is there a way to control this? I don't want to start all workers with all tasks; I want different workers executing different tasks. I know I can solve this with queues, but I don't want the workers to load the code in the first place.
malinoff
notanormalnerd: if you don't import the modules that contain tasks, or autodiscover them using app.autodiscover_tasks, they won't be loaded
notanormalnerd: understanding what imports what will help you control which tasks are loaded where
alanjds
Hi. I need to control the number of workers per queue, start more workers, etc.
cyme (2014) tries to do that but is badly outdated
1) is it worth updating the thing, or
2) is it better to redo the features I need using control signals?
notanormalnerd
malinoff: Thanks for the quick response
malinoff: I will have a look into that
malinoff
alanjds: use ansible - managing worker processes is orthogonal to celery; it's a job for a sanely written process manager like systemd, in conjunction with a simple ssh wrapper
alanjds: or you can leverage your cloud provider's apis
w/e
alanjds
malinoff: I would prefer something less infra-involved.
The use case is less about controlling machines going up and down and more about controlling how many workers are subscribed to each queue
cyme is supposed to do that, per the docs embedded in the repo
my doubt is: for an app that manages the queue subscriptions, is it better to control this stuff via celery control signals, or to delegate to an external tool like cyme (counting the burden of making cyme work again)?
malinoff
alanjds: definitely external tools
alanjds
Ok... can you elaborate on this, please?
The flaw I saw in cyme is the desync between its internal state DB and the real world
talking to the control bus directly could fix that easily, I guess
but I haven't worked with the control bus myself yet
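For reference, the kind of control-bus call being discussed can be sketched without any local state DB. `add_consumer`, `cancel_consumer`, and `inspect` are real Celery broadcast/control commands; the `app`, queue, and worker names are placeholders, and a running broker and workers are assumed:

```python
def move_worker(app, worker, from_queue, to_queue):
    """Rebalance one worker between queues over Celery's control bus.

    `app` is a configured celery.Celery instance. The commands are
    broadcast to the live workers themselves, so there is no local
    state DB to drift out of sync with the real world.
    """
    dest = [worker]
    # Subscribe the worker to the new queue, then drop the old one.
    app.control.add_consumer(to_queue, destination=dest)
    app.control.cancel_consumer(from_queue, destination=dest)
    # The actual current state can always be re-read from the workers:
    return app.control.inspect(destination=dest).active_queues()
```

An external rebalancer built on these calls queries the workers for ground truth instead of persisting its own picture of the world, which is exactly the desync cyme suffers from.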
malinoff
alanjds: I'm not talking about cyme. Generally, celery itself shouldn't manage its worker processes
that's something that should be managed by external tools
not cyme specifically
brb ~30 minutes
tbarbugli
@malinoff for the strace dump
which process should I trace?
alanjds
malinoff: ok, thanks. Since the cyme repo is in the celery/cyme GitHub org, I would expect it to be the "recommended example" of how to do this stuff
jobelenus
hey, I'm working in Django 1.9. I previously put `import myapp.tasks` in my `myapp/__init__.py` file, but I need to move it to the `myapp/apps.py` Django AppConfig due to application loading. When I do this I get a very odd error whenever I try to start my celery workers: http://pastebin.com/J7qB7Fc6 It says that the celery application cannot import celery, which I don't really understand. What could be my
problem? When is the appropriate time to import my tasks?