Title: filter { if [type] == "squid-access" { grok { match => { "message - Pastebin.com (at pastebin.com)
rastro
date{} will match UNIX with an epoch value.
ICantCook
SQUIDACCESS %{NUMBER:timestamp}%{SPACE}%
I've tried that, as per my pastebin
rastro
ICantCook: but you don't have an ISO8601 format field, you have a UNIX format field.
ICantCook
but not seeing it in the logstash stdout
rastro
match => [ "timestamp", "UNIX" ]
ICantCook
oh, I thought that was where I put the desired output
I want to convert it to that ISO format
rastro
ICantCook: no, that's the input format.
ICantCook: date{} converts it into a, um, date.
ICantCook: by default it'll stick it in @timestamp, but you can choose a different destination.
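Putting rastro's advice together, the squid-access filter would look roughly like this (a sketch: the SQUIDACCESS pattern name comes from ICantCook's paste, and the commented-out target field name is illustrative):

```
filter {
  if [type] == "squid-access" {
    grok {
      match => { "message" => "%{SQUIDACCESS}" }
    }
    date {
      # "UNIX" describes the *input* format: an epoch timestamp
      match => [ "timestamp", "UNIX" ]
      # by default the parsed date lands in @timestamp;
      # uncomment to choose a different destination field
      # target => "squid_time"
    }
  }
}
```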
ICantCook
rastro: awesome, that worked :)
rastro
ICantCook: great.
ignarps
I am getting filter worker exceptions in logstash 1.4.2. I am trying to use grok to define a field, check the field value, and set another field depending on the result.
Title: filter { if [type] == "mongos" { grok { match => { "message" => " - Pastebin.com (at pastebin.com)
ignarps
if I comment out lines 25 thru 33 I don't get the exceptions
the exception messages are in the pastebin
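The shape ignarps describes would look something like this (a sketch only: the paste isn't shown, so the pattern, the "operation" field, and the added "op_class" field are all hypothetical):

```
filter {
  if [type] == "mongos" {
    grok {
      # "operation" is an illustrative field name, not from the paste
      match => { "message" => "%{WORD:operation}" }
    }
    # check the grok-extracted field and set another field from it
    if [operation] == "query" {
      mutate { add_field => { "op_class" => "read" } }
    } else {
      mutate { add_field => { "op_class" => "other" } }
    }
  }
}
```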
ICantCook
I'm planning to run curator on everything older than 60 days, but would have a backup of this first. Is there a way to dump all the data >60 days old into a flat json file(s)?
andrew[andrboot]
ICantCook: you'd need to pull it out of elasticsearch
ICantCook
andrew[andrboot]: is there a better solution than what I've suggested?
andrew[andrboot]
ICantCook: not sure tbh, if you could archive all x entries on 1 node or 2 nodes that would probably work?
ICantCook
goal is to retain data forever in tar.gz'd encrypted files in amazon s3
andrew[andrboot]
aaah
ICantCook
yeah, then one day if I'm asked "What happened 3 years ago?" I could restore the appropriate data and answer
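Following andrew[andrboot]'s point that the data has to be pulled out of elasticsearch, one sketch of the first step is to build a range query for everything older than 60 days; the resulting body could then be streamed out via the scroll API (or a tool such as elasticdump) before curator deletes the indices. The `@timestamp` field and the 60-day window come from the conversation; the function name is illustrative:

```python
import json
from datetime import datetime, timedelta, timezone

def older_than_query(days=60, field="@timestamp", now=None):
    """Build an Elasticsearch range query matching documents whose
    timestamp is more than `days` days in the past."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=days)
    return {
        "query": {
            "range": {
                field: {"lt": cutoff.strftime("%Y-%m-%dT%H:%M:%SZ")}
            }
        }
    }

# Feed this body to a scroll over the relevant indices, write each
# hit as a JSON line, then tar.gz + encrypt the files for S3.
print(json.dumps(older_than_query(
    now=datetime(2015, 3, 2, tzinfo=timezone.utc))))
```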