batches is just a class that marks that the tasks we're dealing with should be consumed in batches, i mean if we set the limit to 10, the function will be called after accumulating 10 such tasks, and the function takes the list of all the batched task arguments and executes on those
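For what it's worth, the flush-every-N behaviour described above can be sketched in pure Python. This is only a toy illustration of the semantics (the `Batcher` class here is made up for the example; it is not Celery's `celery.contrib.batches` code):

```python
class Batcher:
    """Toy illustration of batch semantics: individual calls are
    buffered, and the handler runs once per full batch."""

    def __init__(self, func, flush_every):
        self.func = func          # called with a list of buffered args
        self.flush_every = flush_every
        self.buffer = []

    def __call__(self, arg):
        self.buffer.append(arg)
        if len(self.buffer) >= self.flush_every:
            # hand the whole batch to the handler, then start fresh
            batch, self.buffer = self.buffer, []
            self.func(batch)


collected = []
batched = Batcher(collected.append, flush_every=10)
for i in range(25):
    batched(i)
# 25 calls -> two full batches of 10 flushed, 5 calls still buffered
```

The real `Batches` base class also flushes on a timer (`flush_interval`), so a partially filled buffer doesn't wait forever.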
@the_drow, i also included in the comment the module location
@the_drow I've also commented the example usage in the gist
what's the best way to review the results of a periodic task? in particular I'd like to look for something sent to stdout or stderr by the task
(using this through django)
sry. back at this again. do periodic tasks store results?
asksol
v0lksman: not when using the old periodic task decorator, they are ignore_result=True by default
v0lksman
asksol: I went digging for the ignore_result rules but couldn't find anything... so if I set ignore_result=False it will store them?
asksol
I'm pretty sure it will
v0lksman
does that then mean that the task itself will need self passed in? def foo(self): self.results
hrm...thought I saw that somewhere...but it errors...will dig
thanks
asksol
what do you expect to be in there?
v0lksman
well the problem I'm trying to solve is that I have a task that completes without raising an exception, but whose result can indicate a larger problem in my stack. So I want to check for something in the output of the task and take action accordingly
I can't modify the script as it's part of a 3rd party module so I can only really access things through my task wrapper
my experience so far with Celery has always been a fire and forget background task...so this is the first time I want to try to validate the task execution
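One stdlib-only way to inspect a third-party command's output from a task wrapper is to buffer its stdout/stderr and scan them afterwards. `redirect_stdout`/`redirect_stderr` are standard library; `run_and_inspect` and the callable passed in are illustrative names, not anything from Celery or Django:

```python
import io
from contextlib import redirect_stdout, redirect_stderr


def run_and_inspect(command, *args):
    """Run `command` (any callable that prints to stdout/stderr,
    e.g. a wrapped management command) with both streams buffered,
    so the wrapper can scan them for known failure markers."""
    out, err = io.StringIO(), io.StringIO()
    with redirect_stdout(out), redirect_stderr(err):
        command(*args)
    return out.getvalue(), err.getvalue()
```

For Django management commands specifically, `call_command` also accepts a file-like `stdout=` argument, which achieves the same buffering without redirecting the whole process's streams.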
asksol
so you have a task starting another task, trying to monitor the subtask?
or do you want to check the exception in the task? Wrap it in a try: except block I guess. The final result of the task does not exist until after the task function returns
v0lksman
asksol: no...it's a task running a django management script
so I would create a task to look for the results of the periodic_task?
asksol
I'm not sure, if the result is there can you not just look at the worker logs?
as in if the result can be sent, the error should've been logged already
v0lksman
yeah what I'm looking for appears in the celeryd.err log I'm logging all task events to
asksol
I would maybe have a monitor task to do this in general, but if you're debugging a single issue it's probably better to add some logging or similar
v0lksman
but I was hoping for an easier way to find that error as I have other tasks that run but only this one can cause a problem
asksol
management commands could e.g. raise SystemExit
that event should be in the logs
or it could raise an exception that is unpickleable
wrap the task in a try: except: traceback.print_stack()
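Sketched out, that wrapper might look like the following. Note that inside an `except` block `traceback.print_exc()` prints the active exception's traceback, whereas `print_stack()` prints the current call stack; the former is usually what you want here. `safe_call` is an illustrative name:

```python
import traceback


def safe_call(fn, *args, **kwargs):
    """Run the wrapped work; on failure, print the full traceback
    (so it lands in the worker log even if the exception itself is
    unpickleable) and re-raise so the task is still marked failed."""
    try:
        return fn(*args, **kwargs)
    except Exception:
        traceback.print_exc()
        raise
```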
v0lksman
so your suggestion is a log parser for my celery daemon log?
asksol
we have a new log parsing tool in master: celery logtool
v0lksman
but it never raises an exception... well, it does when it really fails, but I'm looking for these partial failures that don't
asksol
if the exception is unpickleable, it could end up being swallowed
it helps if you know the id of the task, so that you can search for it in the logs
v0lksman
I was going to ask if self on the periodic_task would provide that...
so it seems even with bind=True I can't get the task id?
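For reference: with `bind=True`, Celery passes the task instance as the first argument, and the current id is available as `self.request.id` inside the task body. The binding mechanism itself can be sketched in pure Python; `BoundTask` and `Request` below are illustrative stand-ins, not Celery classes:

```python
class Request:
    """Stand-in for the per-invocation request context."""
    def __init__(self, id=None):
        self.id = id


class BoundTask:
    """Toy model of bind=True: when bound, the decorated function
    receives the task object itself as its first argument, so the
    body can read the invocation's id off self.request.id."""

    def __init__(self, run, bind=False):
        self.run = run
        self.bind = bind
        self.request = Request()

    def __call__(self, *args, task_id=None, **kwargs):
        self.request = Request(id=task_id)  # fresh context per call
        if self.bind:
            return self.run(self, *args, **kwargs)
        return self.run(*args, **kwargs)


def report(self):
    # what a bound task body would do to find its own id
    return self.request.id


task = BoundTask(report, bind=True)
```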