hi, I'd like to pick your collective brains on something. I have a task that should spawn two chained tasks: the first child renders HTML, the second renders a PDF from that HTML. But I need to access the results of both renderers after completion in an asynchronous way (a polling web request)
hm I think my misunderstanding is around the return result of chain
I somehow was under the impression that the first task is returned, but the actual async_result refers to the last element in the chain rather than the first, doesn't it?
malinoff
eltigre, no, it refers to the first task in chain
eltigre
then how do I know the async_result of the final task at the time of starting the chain?
malinoff
you can access the subtasks with the .children attribute, as shown in the example
eltigre
is .children available before task completion?
malinoff
eltigre, of course. Children is a list of AsyncResult instances
eltigre
ah ok
that makes sense
malinoff
list of lists of lists etc
eltigre
thank you
I've got a large HTML text in a task result... is there a way to suppress printing the result in the console, while still keeping all the print messages?
sorry I found a way
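For the record, one way this can be done (assuming the large result was being echoed by the worker's INFO-level "Task ... succeeded" log line, which includes the full return value): start the worker at a higher log level. By default Celery redirects print() output from tasks to the logger at WARNING level, so print messages still appear while the per-task success lines are suppressed. The app name here is a placeholder:

```shell
# INFO-level success lines (which include the full return value)
# are hidden at this level; redirected print() output is logged
# at WARNING by default and still shows up.
celery -A proj worker --loglevel=warning
```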
_OpenREM_
I have a Django project/app that works fine with Celery on Linux. On Windows, if I install using pip either system-wide or in a virtualenv, when I start celery the workers continuously exit with signal -1. If I run celery against a copy of my project just placed in My Documents, it works fine. Windows 7 64-bit.
What might I have done wrong such that Celery won't work if my project is 'installed'? Project is OpenREM (http://openrem.org)
The Celery functionality was only added in the current version, which is in beta (0.4.3b4)
Both the "First steps with Celery" and "First steps with Django" tutorials work fine, as does using OpenREM as long as it isn't 'installed'
If anyone has any ideas about why my django project doesn't work with Celery when pip installed, please shout. If I don't catch the response, I can be found @_OpenREM on twitter. Any help, suggestions appreciated.
alastairp
hi all.
I'm running celery/django under supervisor with a rabbitmq broker
sometimes if I restart it, celery won't connect to the broker. I get the list of tasks, but then not the "connecting to broker, mingling" message
frodopwns
alastairp: when you restart celeryd and don't get the message, do you get errors if you then try to push tasks anyway?
alastairp
frodopwns: it won't receive any tasks. If I ask for all connected workers it won't show up (e.g. with app.control)
normally if I shut it down and start it again it'll come up and find the job that was sent to it
I'm wondering if it might be a supervisor problem, I've had troubles with it in the past not starting/restarting services properly
but it seems weird that celery gets half way there and then stops
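For reference, a supervisor program entry along these lines is the usual shape for a Celery worker (the paths and app module are placeholders; stopwaitsecs gives the worker time for a warm shutdown, and stopasgroup/killasgroup make sure the pool's child processes are signalled too, which helps with workers getting stuck across restarts):

```ini
[program:celery]
; Paths and the -A app module are hypothetical placeholders.
command=/path/to/venv/bin/celery -A proj worker --loglevel=info
directory=/path/to/project
autostart=true
autorestart=true
; Allow time for a warm shutdown before supervisor kills the worker.
stopwaitsecs=600
; Signal the whole process group so pool children exit as well.
stopasgroup=true
killasgroup=true
```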
frodopwns
you could test the init.d script pretty easily
alastairp
alternatively, it could well be a problem connecting to the broker - dns, network timeout
frodopwns
should give you an idea of whether the problem is indeed with supervisor
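As a quick way to separate the two cases (supervisor not starting the worker vs. the worker failing to reach the broker), the celery CLI can query workers over the broker from the same host; `proj` is a placeholder for the actual app module:

```shell
# If these time out, the problem is on the broker/network side
# rather than supervisor failing to start the worker process.
celery -A proj status          # list workers that answered
celery -A proj inspect ping    # ping each worker via the broker
```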