asksol: yeah, all of our values are JSON compatible. If you're not objecting, that gives me a bit more confidence to proceed.
asksol
bmbouter: we should probably remove the __del__
bmbouter: that's probably why they're not collected in the first place
as defining __del__ means the gc cannot collect cyclic references
I removed one __del__ in py-amqp recently, inherited from amqplib. In my experience you never really need them, they're just used to enable sloppy programming
there's no __del__ in kombu master at all afaict
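asksol's point can be sketched with a minimal, hypothetical `Channel` class (not actual kombu/py-amqp code). Note this was a Python 2 limitation, which matters for the celery 3.1 era being discussed; PEP 442 fixed it in Python 3.4.

```python
import gc

class Channel:
    """Hypothetical stand-in for an object that defines __del__."""
    def __init__(self):
        self.peer = None

    def __del__(self):
        # On Python 2, the mere presence of __del__ made any cycle
        # containing this object uncollectable (it landed in gc.garbage).
        pass

# Build a reference cycle a -> b -> a, then drop the external references.
a, b = Channel(), Channel()
a.peer, b.peer = b, a
del a, b

gc.collect()
# On Python 3.4+ (PEP 442) the cycle is collected and __del__ runs;
# on Python 2 the objects would be stuck in gc.garbage forever.
print(gc.garbage)
```

On Python 2 the leaked objects accumulate in `gc.garbage`, which is one way such a leak would show up in practice.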
ahil
hey is anyone here?
malinoff
asksol: ping
plx
hi
is there a way to create a task that returns a group() from an input list? (like the dmap example) The only solution I have seems to work, but returns an AsyncResult containing the GroupResult, so I cannot pipe it
Any tips on how to start debugging memory usage? Our nodes take 1-2 GB of RAM (per process) on start.
using RabbitMQ as a broker, celery==3.1.23, librabbitmq==1.6.1
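The question goes unanswered in the log. One common stdlib starting point for this kind of investigation (a hedged suggestion, not from the chat; `tracemalloc` exists since Python 3.4, so it would not apply to a Python 2 deployment) is to snapshot allocations after startup and see which source lines hold the memory:

```python
import tracemalloc

# Begin tracking allocations before the suspect code runs.
tracemalloc.start()

# Stand-in for whatever the app allocates at import/startup time.
cache = [bytes(10_000) for _ in range(100)]

# Group live allocations by source line; the top entries show where
# resident memory was allocated.
snapshot = tracemalloc.take_snapshot()
for stat in snapshot.statistics("lineno")[:3]:
    print(stat)
```

Comparing two snapshots (`snapshot2.compare_to(snapshot1, "lineno")`) is the usual next step for finding growth over time rather than startup cost.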
Debnet
Hey guys!
I have a question: do scheduled tasks allocate a worker?
Because I have a service which launches a lot of tasks, and I expect to have 5 workers at the same time, but I have only 3 running.
(I have 2 scheduled tasks but they run at 2 PM)
gulzar
Hi. How can celery read a static file ('/static/') from django?
malinoff
gulzar: how can something read a static file from django?
gulzar
malinoff: there are static files like CSS, or .db files. I have one such sqlite.db which I need to pass to a task in celery
malinoff
gulzar: you didn't answer my question
gulzar
malinoff: Oh sorry. I only know how to read CSS files using 'load static' in HTML
malinoff: urls.static has a static function, I think that can work
malinoff: thank you
malinoff
gulzar: I wouldn't suggest relying on celery being installed on the same server as your django app. Django works via HTTP, and its use case is to serve resources via HTTP. Why not simply use requests and load these static files via HTTP from django?
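malinoff's suggestion, sketched with the stdlib instead of `requests` so it is self-contained and runnable: the served file, host, and port are made up, and in practice the URL would point at Django's `/static/` route instead of the throwaway server started here.

```python
import http.server
import pathlib
import tempfile
import threading
import urllib.request

# Serve a "static" file over HTTP and fetch it from the worker side,
# instead of assuming the worker and the web app share a filesystem.
root = tempfile.mkdtemp()
pathlib.Path(root, "data.db").write_bytes(b"fake sqlite payload")

def make_handler(*args, **kwargs):
    # Serve files out of our temp directory (Python 3.7+ 'directory' kwarg).
    return http.server.SimpleHTTPRequestHandler(*args, directory=root, **kwargs)

server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), make_handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# In a real task this URL would be e.g. http://django-host/static/data.db
url = "http://127.0.0.1:%d/data.db" % server.server_address[1]
payload = urllib.request.urlopen(url).read()
server.shutdown()
```

With `requests` the fetch collapses to `requests.get(url).content`; the point is the same either way: the worker pulls the file over HTTP rather than reading Django's filesystem.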
cecemel
Hi. We're investigating an event-driven architecture of different web services.
gulzar
malinoff: I am a biologist, not a web developer; coding is my hobby and I have a few projects in mind, so I'm working with Python
malinoff
gulzar: well, it would be very difficult for you to write a sane django app if you don't know HTTP, so I'd suggest reading about it first
and making a couple of manual requests via telnet, curl, etc.
cecemel: why HTTP as delivery method/protocol? AMQP is much more robust and easier to use
cecemel: also what you named "celery broker" is actually just "broker", for example, rabbitmq
cecemel: celery can be used as emitters (task.apply_async) and as workers (connection.drain_events), so services 1 and 2 would be custom code calling task.apply_async, and services 3 and 4 would be celery workers
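The emitter/worker split malinoff describes can be sketched as a stdlib producer/consumer analogy. The broker here is an in-process `queue.Queue` standing in for RabbitMQ; in Celery, `apply_async` would publish to the broker and a worker process would drain and execute the tasks.

```python
import queue
import threading

broker = queue.Queue()   # stands in for RabbitMQ
results = []

def worker():
    # "services 3, 4": drain tasks from the broker and execute them
    while True:
        task = broker.get()
        if task is None:  # sentinel: no more work
            break
        results.append(task * 2)  # pretend doubling is the task body

t = threading.Thread(target=worker)
t.start()

# "services 1, 2": emit tasks (the apply_async analogue)
for job in (1, 2, 3):
    broker.put(job)
broker.put(None)
t.join()

print(results)  # [2, 4, 6]
```

The real thing adds durability (messages survive restarts) and distribution (emitters and workers on different hosts), which is exactly what an in-process queue cannot give you.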
cecemel
malinoff: HTTP is for convenience/infrastructural constraints. Are webhooks not OK?
malinoff
cecemel: not when it comes to task queues. Use rabbitmq/amqp
Debnet
I have a question: do scheduled tasks allocate a worker?
Because I have a service which launches a lot of tasks, and I expect to have 5 workers at the same time, but I have only 3 running.
(I have 2 scheduled tasks but they run at 2 PM)
malinoff
Debnet: no, they don't. Workers are usually pre-allocated
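"Pre-allocated" refers to the worker's pool size, which is fixed when the worker starts (a hedged note; `-A proj` is a placeholder app name). Scheduled (eta/countdown) tasks wait inside the worker until due; they do not spawn extra processes.

```shell
# Start a worker with a pool of 5 processes (prefork is the default pool
# in celery 3.1); the pool size is set once, at startup.
celery worker -A proj --concurrency=5
```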
Debnet
OK. Thanks.
cecemel
malinoff: what are typical use cases for webhooks then?