#logstash


      • moapa
        are there any additional requirements beyond kibana/ES/logstash to increment values from logs? e.g. bytes, latency, etc.
      • In http://logstash.net/docs/1.4.1/tutorials/metric... they are using statsd as output?
      • moapa
        If using statsd as output as per the example above, it's meant to be shipped from statsd to graphite, not kibana?
      • Don_E
        moapa look at the metrics filter if you want them to stay in kibana / ES (http://logstash.net/docs/1.4.1/filters/metrics)
      • logstashbot
        Title: logstash - open source log management (at logstash.net)
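A minimal sketch of the metrics filter Don_E points to, which keeps aggregated numbers inside the normal pipeline so they reach ES/Kibana instead of statsd/graphite. The field name `latency` is illustrative, not taken from moapa's logs:

```
filter {
  metrics {
    # count all events and emit rate statistics on each flush
    meter => [ "events" ]
    # build timing stats from a numeric field ("latency" is illustrative)
    timer => [ "latency", "%{latency}" ]
    # tag the generated metric events so they can be routed separately
    add_tag => [ "metric" ]
  }
}
output {
  if "metric" in [tags] {
    elasticsearch { }   # metric events land in ES, visible in Kibana
  }
}
```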
      • stemid
        MONTHDAY stopped working for me as soon as June started. It worked while the day was two digits, but 1 and 2 couldn't be parsed. MONTHDAY (?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9])
      • that regex is over my head, maybe I should just use \d+?
      • the input data starts like this <166>Jun 2 09:23:29 and my pattern for that is <\d+>%{MONTH} %{MONTHDAY} %{TIME}
      • Jarth73
        Hello World
      • my grok rules work fine in grokdebug.herokuapp.com but i get _grokparsefailure tags
      • stemid
        should your grok match every single line?
      • moapa
        Don_E: Thanks!
      • Jarth73
        stemid: hi, i've set match rules on a keyword (certified unique) which adds a tag, then a condition to work on the tagged message
      • stemid: you mean i should set a break_on_match => yes where i add the tag ?
      • stemid
        Jarth73: try it. I just know that grokparsefailure tags could just mean that some line did not match, or some part did not match.
      • while the grok in general still works.
      • I'm a newbie myself, started using logstash a few weeks ago.
      • Don_E
        break_on_match should default to true ?
      • Jarth73
        stemid: about the same time as me then
      • stemid: break_on_match is on by default
      • grokdebug never fails, not sure if it will display if a grokparsefailure occurs, i figure it blanks then
      • Don_E: hey, isn't it so by default? docs say yes
      • stemid
        Default value is true
      • also use add_field not add_tag. are you getting the new field you want added?
      • Jarth73
        hmmm, could it be i get a _grokparsefailure tag on ALL messages because the filter fails one time ?
      • stemid: i do not want to add a field ( which contain values ) i want to add a tag for referral
      • stemid
        ok I thought tags were being replaced by fields.
      • you can do conditionals on fields
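For reference, `tags` is an array on the event, so a conditional on it needs a membership test rather than equality (which may be why `[tag] == "tagname"` style checks fail). A sketch, where `tagname` and the `program` field are illustrative:

```
filter {
  # equality works for plain fields ("program" is illustrative)
  if [program] == "myapp" {
    mutate { add_field => [ "matched_program", "yes" ] }
  }
  # but tags is an array, so use "in" instead of ==
  if "tagname" in [tags] {
    mutate { add_field => [ "matched_tag", "yes" ] }
  }
}
```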
      • Jarth73
        by now i'm quite sure this fails continuously because one entry is not matching for the rule
      • stemid: i'm reasonably sure the conditional part works as expected :)
      • stemid: but it's a good idea to add a tag when it's processed it as well actually, just for testing
      • Don_E
        Jarth73 add tags on every grok match to see where it fails
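One way to follow Don_E's advice, sketched with two illustrative grok stages; each stage adds its own tag only on a successful match, so the failing stage shows up by its missing tag:

```
filter {
  grok {
    match => [ "message", "%{SYSLOGTIMESTAMP:ts} %{GREEDYDATA:rest}" ]
    add_tag => [ "stage1_ok" ]   # only added when this pattern matches
  }
  grok {
    match => [ "rest", "%{WORD:keyword} %{GREEDYDATA:body}" ]
    add_tag => [ "stage2_ok" ]
  }
}
```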
      • Jarth73
        Don_E: that's what i'm doing :)
      • Don_E: how are you ?
      • stemid: now i get your remark, it seems indeed the conditional on [tag] == "tagname" is not working
      • stemid
        I only do conditionals on fields because I got the impression that tags were being phased out.
      • Jarth73
        stemid: uhm, i've not seen any 'deprecated' notification yet
      • stemid: if i get what they serve that would not be beneficial either
      • moapa
        Hmm. how to deal with log lines containing different amounts of spaces & tabs between words?
      • for example
      • %{TIMESTAMP} Hello John, How are you?
      • %{TIMESTAMP} Hello John, How are you?
      • I'm using %{GREEDYDATA} right now, but i get _grokparsefailure for some rows
      • stemid
        moapa: I've been forced to use \s* or \s*
      • sometimes
      • ugh, sorry I mean \s?
      • depends on how many spaces
      • but GREEDYDATA should cover that
      • you're probably getting grokparsefailure for other reasons
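stemid's whitespace advice can be sketched like this; `%{TIMESTAMP_ISO8601}` and the greeting text stand in for moapa's actual log format:

```
filter {
  grok {
    # \s+ absorbs any run of spaces or tabs between tokens, where a
    # literal space in the pattern would require exactly one space
    match => [ "message", "%{TIMESTAMP_ISO8601:ts}\s+Hello\s+%{WORD:name}," ]
  }
}
```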
      • Jarth73
        moapa: same here, apparently my _grokparsefailure tags are due to pattern-to-data match failures
      • !argh! it is a space in the input
      • stemid: when a date is shown it is not prefixed with a zero if the day is a single digit
      • stemid: so my pattern should read %{MONTH}\s*%{MONTHDAY} instead of having just the two fields separated by a space
      • once more it works in grokdebugger but not in logstash grok
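The fix described above, as a grok sketch. Syslog pads single-digit days with an extra space ("Jun  2" vs "Jun 12"), which is why the stock SYSLOGTIMESTAMP pattern is defined as `%{MONTH} +%{MONTHDAY} %{TIME}`; the field names here are illustrative:

```
filter {
  grok {
    # \s+ (or " +") tolerates the padding space before a single-digit day
    match => [ "message", "<%{NONNEGINT:pri}>%{MONTH:month}\s+%{MONTHDAY:day} %{TIME:time} %{GREEDYDATA:rest}" ]
  }
}
```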
      • Don_E: i don't get it why it keeps failing after i add the tag, the tag is added when there's a match on a keyword in the message
      • Don_E: still, the tags are correctly added but always get a _grokparsefailure
      • Don_E: i've set remove_tag => [ "_grokparsefailure" ]
      • but this seems to just delay adding the _grok... fail tag
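Note that grok's add_tag/remove_tag actions only run when the pattern matches, which would explain why remove_tag seems to have no effect on the failing events. Rather than stripping the tag afterwards, grok's tag_on_failure option (present in the 1.4 docs) stops it being added at all; a sketch:

```
filter {
  grok {
    match => [ "message", "%{GREEDYDATA:msg}" ]
    # suppress the failure tag entirely instead of removing it later;
    # remove_tag runs only on successful matches, so it never fires here
    tag_on_failure => []
  }
}
```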