6:51 AM
moapa
are there any additional requirements beyond kibana/ES/logstash to increment values from logs? e.g. bytes, latency, etc.
6:52 AM
logstashbot
6:55 AM
moapa
If using statsd as output, as per the example above, is it meant to be shipped from statsd to graphite, not kibana?
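For context, a minimal sketch of that setup (host, port, and metric names are assumptions, not taken from the chat): the statsd output sends counters to a statsd daemon, which flushes to graphite; kibana only sees events indexed by the elasticsearch output.

```
output {
  statsd {
    host => "localhost"
    port => 8125
    # count a parsed bytes field from each event (field name hypothetical)
    count => { "apache.bytes" => "%{bytes}" }
    # bump a simple hit counter per event
    increment => [ "apache.hits" ]
  }
  # kibana reads only what goes into elasticsearch;
  # the metrics above flow statsd -> graphite instead
  elasticsearch { host => "localhost" }
}
```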
7:04 AM
Don_E
7:04 AM
logstashbot
Title: logstash - open source log management (at logstash.net)
7:26 AM
stemid
MONTHDAY stopped working for me as soon as June started. It worked while the day was two digits, but 1 and 2 couldn't be parsed. MONTHDAY (?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9])
7:27 AM
that regex is over my head, maybe I should just use \d+?
7:27 AM
the input data starts like this <166>Jun 2 09:23:29 and my pattern for that is <\d+>%{MONTH} %{MONTHDAY} %{TIME}
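One hedged guess at the cause: syslog pads single-digit days with a space ("Jun  2" vs "Jun 12"), and a literal single space between %{MONTH} and %{MONTHDAY} won't match the padded form. A sketch of a more tolerant pattern:

```
filter {
  grok {
    # \s+ tolerates the extra padding space syslog inserts
    # before single-digit days ("Jun  2" vs "Jun 12")
    match => [ "message", "<\d+>%{MONTH}\s+%{MONTHDAY} %{TIME}" ]
  }
}
```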
7:28 AM
Jarth73
Hello World
7:30 AM
stemid
should your grok match every single line?
7:31 AM
moapa
Don_E: Thanks!
7:31 AM
Jarth73
stemid: hi, i've set match rules on a keyword (certified unique) which adds a tag, then a condition to work on the tagged message
7:33 AM
stemid: you mean i should set a break_on_match => yes where i add the tag ?
7:34 AM
stemid
Jarth73: try it. I just know that a _grokparsefailure tag could mean that some line did not match, or some part did not match.
7:34 AM
while the grok in general still works.
7:35 AM
I'm a newbie myself, started using logstash a few weeks ago.
7:38 AM
Don_E
break_on_match should default to true ?
7:39 AM
Jarth73
stemid: about the same time as me then
7:39 AM
stemid: break_on_match is on by default
7:40 AM
grokdebug never fails outright; not sure if it will show when a _grokparsefailure occurs, i figure it just goes blank
7:40 AM
Don_E: hey, is it not so by default? docs say yes
7:40 AM
stemid
Default value is true
7:41 AM
also use add_field not add_tag. are you getting the new field you want added?
7:41 AM
Jarth73
hmmm, could it be i get a _grokparsefailure tag on ALL messages because the filter fails one time ?
7:44 AM
stemid: i do not want to add a field (which contains values), i want to add a tag for reference
7:45 AM
stemid
ok I thought tags were being replaced by fields.
7:45 AM
you can do conditionals on fields
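A sketch of both conditional styles as they work in logstash configs (field names and outputs are hypothetical): a field comparison uses `[name] == value`, while tag membership uses `in [tags]` — there is no singular `[tag]` field.

```
filter {
  grok {
    match => [ "message", "%{WORD:loglevel} %{GREEDYDATA:msg}" ]
    add_tag => [ "parsed" ]
  }
}
output {
  # conditional on a field
  if [loglevel] == "ERROR" {
    stdout { codec => "json" }
  }
  # conditional on a tag: note "in [tags]", not [tag] == "..."
  if "parsed" in [tags] {
    elasticsearch { host => "localhost" }
  }
}
```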
7:45 AM
Jarth73
by now i'm quite sure this fails continuously because one entry does not match the rule
7:46 AM
stemid: i'm reasonably sure the conditional part works as expected :)
7:46 AM
stemid: but it's a good idea to add a tag when it's processed as well actually, just for testing
7:46 AM
Don_E
Jarth73: add tags on every grok match to see where it fails
7:50 AM
Jarth73
Don_E: that's what i'm doing :)
7:50 AM
Don_E: how are you ?
7:50 AM
stemid: now i get your remark, it seems indeed the conditional on [tag] == "tagname" is not working
7:51 AM
stemid
I only do conditionals on fields because I got the impression that tags were being phased out.
7:52 AM
Jarth73
stemid: uhm, i've not seen any 'deprecated' notification yet
7:53 AM
stemid: if i understand what they're for, that would not be beneficial either
7:56 AM
moapa
Hmm. how to deal with log lines containing different amounts of spaces & tabs between words?
7:57 AM
for example
7:57 AM
%{TIMESTAMP} Hello John, How are you?
7:57 AM
%{TIMESTAMP} Hello John, How are you?
7:57 AM
I'm using %{GREEDYDATA} right now, but i get _grokparsefailure
7:57 AM
for some rows
7:59 AM
stemid
moapa: I've been forced to use \s* or \s*
7:59 AM
sometimes
7:59 AM
ugh, sorry I mean \s?
7:59 AM
depends on how many spaces
7:59 AM
but GREEDYDATA should cover that
8:00 AM
you're probably getting grokparsefailure for other reasons
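A sketch of the \s workaround stemid describes (pattern names and fields are assumptions — %{TIMESTAMP} in the example above is not a stock grok pattern, so TIMESTAMP_ISO8601 stands in here): \s+ absorbs any run of spaces and/or tabs between tokens.

```
filter {
  grok {
    # \s+ matches one or more spaces/tabs, so both sample
    # lines parse regardless of how the gaps are padded
    match => [ "message", "%{TIMESTAMP_ISO8601:ts}\s+Hello\s+%{WORD:name},\s+%{GREEDYDATA:rest}" ]
  }
}
```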
8:00 AM
Jarth73
moapa: same here, apparently my _grokparsefailure tags are due to pattern-to-data match failures
8:12 AM
!argh! it is a space in the input
8:13 AM
stemid: when a date is shown it is not prepended with a zero if it is only a single-digit day
8:14 AM
stemid: so my pattern should read %{MONTH}\s*%{MONTHDAY} instead of having just the two fields separated by a space
8:14 AM
once more it works in grokdebugger but not in logstash grok
8:17 AM
Don_E: i don't get why it keeps failing after i add the tag; the tag is added when there's a match on a keyword in the message
8:17 AM
Don_E: still, the tags are correctly added but always get a _grokparsefailure
8:17 AM
Don_E: i've set remove_tag => [ "_grokparsefailure" ]
8:18 AM
but this seems to just delay adding the _grok... fail tag
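That delay is expected: remove_tag on a grok runs only when that grok matches, so it cannot undo a failure tag added by a different filter later in the chain. A hedged alternative, assuming the grok filter version in use supports the tag_on_failure option, is to stop the failing grok from tagging in the first place:

```
filter {
  grok {
    match => [ "message", "%{GREEDYDATA:msg}" ]
    # emit no tag on failure instead of the default
    # ["_grokparsefailure"], if this option is available
    tag_on_failure => []
  }
}
```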