josuebc: ok, well maybe a mod would like to update the site to remove that link too
thanks man!
lullis
I actually found some package on github called "async_signals" (https://github.com/nyergler/async-signals), which seems to be trying to solve this issue. But it looks kind of old and inactive.
The other alternative would be for me to find one old dusty module I have for doing pubsub with Kombu, and have django apps use that instead of the signals... but that would mean yet another process to run/manage.
Looking at the implementation, though, it seems to focus only on making the *sending* async; it wouldn't solve the issue on the *receiving* side.
I guess I will go back to my pubsub approach.
or I could look into django channels.
josuebc
black_mamba: Yes, thanks for that. I'm doing a PR right now to fix that.
lullis: So, what exactly are you trying to do with that signal? Why not just fire up another task once the first one is done?
I mean, a task queue can be viewed as a pub/sub architecture
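(To illustrate that view: here's a minimal in-process pub/sub sketch in plain Python. No Celery, Django, or broker involved; the topic names and helper functions are made up for the example.)

```python
# Toy pub/sub: a dict of topic -> subscriber callbacks.
from collections import defaultdict

subscribers = defaultdict(list)

def subscribe(topic, callback):
    subscribers[topic].append(callback)

def publish(topic, payload):
    # Fire-and-forget: if nobody subscribed to the topic,
    # the event is silently dropped.
    for callback in subscribers[topic]:
        callback(payload)

# The "event aggregator" subscribes; producers just publish.
events = []
subscribe("task.done", events.append)
publish("task.done", {"task": "resize_image", "status": "ok"})
```

A real task queue adds persistence and delivery across processes, but the shape is the same: producers don't know who (if anyone) is listening.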
lullis
Basically, I'd like to have one django app that can keep track of things that are done by all my other apps.
josuebc
lullis: By "all my other apps" you mean celery apps?
lullis
No, sorry... I mean django apps.
josuebc
lullis: Why not post the necessary data to the Django app that keeps track, once a task is done?
That way it's more scalable
lullis
Sort of... like I said, my goal is to keep the apps separate. If the app that generates the events needs to post data to the event aggregator, then every app I have will depend on the event aggregator app.
Ideally I'd like to have the apps firing and forgetting about these things, and only having the event aggregator app worrying about keeping track of everything.
josuebc
hmm... That would be true for signals too. If the receiver is not there then nothing happens.
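(That "no receiver, nothing happens" behavior is roughly this, mimicked here with a toy stand-in class rather than the real `django.dispatch.Signal`:)

```python
class Signal:
    # Toy stand-in for django.dispatch.Signal, not the real implementation.
    def __init__(self):
        self.receivers = []

    def connect(self, receiver):
        self.receivers.append(receiver)

    def send(self, sender, **kwargs):
        # With no receivers connected this returns an empty list:
        # the sender never notices that nobody was listening.
        return [(r, r(sender=sender, **kwargs)) for r in self.receivers]

order_placed = Signal()
assert order_placed.send(sender="shop") == []   # no receivers: no-op
order_placed.connect(lambda sender, **kw: "logged")
receiver, response = order_placed.send(sender="shop")[0]
assert response == "logged"
```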
lullis: Well, another way to do it is to actually monitor celery. Celery implements signals and events.
Yes, but not all of the events I am trying to monitor originate in celery tasks...
I think you are right, the django signal approach is not ideal for this.
josuebc
You'd probably use celery events only to monitor what happens in celery tasks.
Then what happens in Django you do it through Django signals
lullis
Yes, but perhaps this django channel part can be used for both.
(even though I am reading in the docs that there are no guarantees about message delivery)
josuebc
Well django channels are a way to distribute your request load
There are no delivery guarantees in django channels because there's no way to re-send a message.
lullis
Right.
josuebc
Although the idea would still be kind of the same. Right now you can monitor what happens within Django using Django's signals. What you're missing is what happens within tasks.
Within a task you'll be able to send a channel message to your worker pool, which would be similar to doing a POST to a centralized monitoring app.
lullis
My idea would be to set up a channel where either celery tasks or django requests can send messages, with the event aggregator app as the consumer of this channel.
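(Sketching that design in plain Python, with a `queue.Queue` standing in for the channel and the producer/consumer function names invented for the example:)

```python
# One shared channel; producers push, a single aggregator thread drains it.
import queue
import threading

event_channel = queue.Queue()
seen = []

def aggregator():
    while True:
        event = event_channel.get()
        if event is None:  # sentinel to shut the consumer down
            break
        seen.append(event)  # in reality: persist, index, alert, etc.

def from_celery_task():
    event_channel.put({"source": "task", "event": "report_built"})

def from_django_view():
    event_channel.put({"source": "request", "event": "user_signed_up"})

consumer = threading.Thread(target=aggregator)
consumer.start()
from_celery_task()
from_django_view()
event_channel.put(None)
consumer.join()
```

The producers only know the channel, not the aggregator, which is the decoupling being discussed; the remaining coupling is on the channel name itself.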
josuebc
That should work
lullis
That would still break my ideal requirement of keeping things separate, in the sense that all the producers now need to know which channel to send things to... but at least the modules are decoupled.
black_mamba joined the channel
Anyway, thanks for your help, @josuebc!
josuebc
lullis: That's still true with Django signals. Other processes need to know which signals to use.