I'm having a little trouble wrapping my head around a specific flow
I am trying to build a workflow that is fairly synchronous: A -> B -> C -> group(d) -> E (aggregate), then back to A
Am I thinking right that the best way to achieve that is to use the same queue?
The big problem is with group(d)
In the same queue the order is fine for A -> B -> C, but I would like to find a way to fan out from C to a group of tasks which then aggregate on E. It works fine with a chord; it's just that Celery works through A to C for all messages until those are exhausted, then moves on to d and e
Is there a way to update a running chain for example?
nelsonm
The group I want to fan out to are fairly long-running tasks, around 20-30 seconds. As individual tasks it's not bad, and it allows for easier retrying; however, there can be hundreds generated, and suddenly having one worker do the job isn't so practical
Crovax31_
Hi, [django] is the app.on_after_configure signal triggered after the detected task files are imported? (I would like to register the scheduler in task files; would that be an ugly pattern?)
SteamWells
Hi All, I am reading this guide on setting up async tasks https://realpython.com/blog/python/asynchronous... and the periodic task I have written is just not being picked up. I looked at the Celery docs for the version I have installed and it looks like the decorators have been removed
have they truly been removed, and do I need to do it the way documented?
Crovax31_
SteamWells: are you using celery 4.0+?
if yes, this tutorial is out of date
SteamWells
yes I am
Crovax31_
you have to make a Celery app instance in your project (like a WSGI instance), and use from project import app, then @app.task
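In other words, something like the stock Django/Celery layout; a sketch where 'project' is a placeholder for the actual Django project name:

```python
# project/celery.py
import os
from celery import Celery

# Must be set before the app configures itself from Django settings.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings')

app = Celery('project')

# Celery 4 reads CELERY_-prefixed settings from the Django settings module.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Find tasks.py modules in all installed Django apps.
# (Celery 4 no longer needs the lambda: settings.INSTALLED_APPS argument
# that older tutorials pass here.)
app.autodiscover_tasks()


# project/some_app/tasks.py would then look like:
#
#   from project.celery import app
#
#   @app.task
#   def add(x, y):
#       return x + y
```

This is a configuration sketch and assumes a Django project is on the path; it won't run standalone.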
SteamWells
but then I pip installed v3.1.x from the tutorial and ran it again
no such luck
I have a celery app instance
Dejan
Hell, conda has no Celery in their repo...
Crovax31_
(ho, sorry, I thought it was even more out of date as I didn't read ^^)
SteamWells
no worries
I'm still investigating it, I think I can see what the issue is
I might just scrap my version and use the latest one and follow the docs
Crovax31_
yeah, even the settings names changed
SteamWells
I just wanted to get this working as I have to demo this soon
but still, better do it properly
Crovax31_
what do you mean by "soon"?
SteamWells
a week
eos
but in devops time, that will soon come :D
well, on version 4.0.2 it still seems to work with the decorator
just had to remove the settings.INSTALLED_APPS arg in autodiscover_tasks
simple as that
maxmayer
I am a newbie here and I just started using Celery. I have been struggling with this error and I feel it is easy to fix: basically, whenever I try to start a worker I get an error saying AppRegistryNotReady: Apps aren't loaded yet.
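That AppRegistryNotReady error usually means something imported Django models before Django's app registry was ready, e.g. a model import at the top of a tasks module that gets pulled in while the Celery app is starting. A common fix (a sketch; 'project' and the model usage are placeholders) is to make sure DJANGO_SETTINGS_MODULE is set before the Celery app is created, and to defer model imports into the task body:

```python
# project/some_app/tasks.py (sketch; assumes a Django project named 'project')
from project.celery import app

@app.task
def deactivate_user(user_id):
    # Import models inside the task so the import happens after
    # Django's app registry is ready, not at worker startup.
    from django.contrib.auth.models import User
    User.objects.filter(pk=user_id).update(is_active=False)
```

This is a Django-dependent fragment and won't run standalone; the key points are the os.environ.setdefault ordering in celery.py and keeping model imports out of module scope in tasks files.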