9:22 AM
Tyr-Heimdal
when I entered the date match the config failed
9:22 AM
yybel
paste the config somewhere if you can
9:22 AM
pawnbox joined the channel
9:22 AM
Tyr-Heimdal
I have a pastebin with the config in if you could spare a minute?
9:23 AM
Tyr-Heimdal
I'm updating with the date match now
9:23 AM
yybel
the if statement seems wrong
9:23 AM
just put if [timestamp]
9:24 AM
oh
9:24 AM
Darcidride joined the channel
9:24 AM
didnt read through, i thought that was the condition to get the date
9:25 AM
Tyr-Heimdal
I had the date { match => [ "timestamp", "UNIX_MS" ] }
9:25 AM
under the last mutate statement
9:25 AM
that made the config fail
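For reference, a sketch of the filter placement being described — the date filter sitting under the last mutate, parsing an epoch-milliseconds field. The column/field names here are illustrative assumptions, not taken from the actual paste:

```
# illustrative: the last mutate, followed by the date filter that
# parses an epoch-milliseconds value named "timestamp" into @timestamp
mutate {
  convert => { "timestamp" => "integer" }
}
date {
  match => [ "timestamp", "UNIX_MS" ]
}
```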
9:25 AM
yybel
was there any other error
9:25 AM
did you use the configtest
9:26 AM
Tyr-Heimdal
No. I'm new to logstash/elk, and got it up and running through a docker container I found
9:27 AM
Norrland
Speaking of timestamps. "YYYY-MM-dd HH:mm:ss.SSS", will that match a timestamp with more than 3 digits in the millisecond part?
9:27 AM
Tyr-Heimdal
Norrland: don't hijack my thread :P
9:27 AM
^^
9:28 AM
yybel
there can be only 1000 milliseconds in a second :) i dont know if it takes smaller fractions than milliseconds
9:29 AM
Norrland
yybel: true. Some applications have more fine-grain timestamps with HH:mm:ss.SSSSSS :)
9:30 AM
hugh_jass has quit
9:30 AM
hugh_jass joined the channel
9:31 AM
Tyr-Heimdal: no date {} part in your config?
9:31 AM
brokencycle has quit
9:32 AM
Tyr-Heimdal
This is my current config, that broke
9:32 AM
it has the date part
9:33 AM
gives me _csvparsefailure
9:34 AM
yybel
so it gives the error at runtime?
9:34 AM
Tyr-Heimdal
if I remove the date-part, it parses the logs but ofc gives me the epoch as a number field
9:34 AM
yeah
9:35 AM
yybel
try renaming the timestamp in your csv and date filter to something else like epoch_time
9:35 AM
i dont know if its mad because the name is almost the same as @timestamp
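A sketch of that rename, keeping the epoch column away from the special @timestamp name — the column list is an illustrative assumption, not the actual config:

```
csv {
  # "epoch_time" replaces the original "timestamp" column name,
  # so it can't be confused with the special @timestamp field
  # (the other columns are illustrative)
  columns => ["epoch_time", "src_ip", "request"]
}
date {
  match => [ "epoch_time", "UNIX_MS" ]
}
```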
9:37 AM
its weird that its csvparsefailure
9:37 AM
Norrland
Tyr-Heimdal: ah.
9:37 AM
Tyr-Heimdal
same csvparsefailure
9:37 AM
yybel
on all events?
9:37 AM
hey
9:37 AM
its the conversion to integer maybe
9:37 AM
maybe it must be string
9:38 AM
Tyr-Heimdal
it only leaves 2 events from thousands
9:38 AM
if I remove the date part, I get thousands of events
9:39 AM
yybel
oh, date filter doc says unix_ms parses int value
9:39 AM
Tyr-Heimdal
yeah, read somewhere that it should be int
9:40 AM
yybel
try anyways
9:41 AM
Tyr-Heimdal
same result. gives 2 entries
9:41 AM
yybel
is there anything in logstash logs
9:42 AM
Tyr-Heimdal
i guess I need to have something in the output-part for that..?
9:42 AM
yybel
no, theres something in /var/log/logstash.log and .err
9:43 AM
and .stdout
9:46 AM
Tyr-Heimdal
INFO: [logstash-148599b191b4-8-12176] started
9:46 AM
{:timestamp=>"2016-06-29T09:40:00.629000+0000", :message=>"Trouble parsing csv", :source=>"message", :raw=>"", :exception=>#<NoMethodError: undefined method `each_index' for nil:NilClass>, :level=>:warn}
9:46 AM
{:timestamp=>"2016-06-29T09:40:00.741000+0000", :message=>"Trouble parsing csv", :source=>"message", :raw=>"", :exception=>#<NoMethodError: undefined method `each_index' for nil:NilClass>, :level=>:warn}
9:46 AM
only stdout.log there
9:47 AM
yybel
is your csv string in message field originally?
9:48 AM
Tyr-Heimdal
if I remove the date-part, then it populates the message field with the original lines, yes
9:50 AM
yybel
try putting some trash filter before the csv filter
9:50 AM
if ([message] =~ /^#/ or [message] == "") { drop{} }
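Dropped into context, that guard would sit in the filter block ahead of the csv filter — a sketch, with the surrounding config assumed. Note it targets exactly the empty lines that produced the `raw=>""` NoMethodError warnings seen in the log earlier:

```
filter {
  # drop comment lines and empty lines before the csv filter runs;
  # an empty "message" is what triggers _csvparsefailure on blank input
  if [message] =~ /^#/ or [message] == "" {
    drop { }
  }
}
```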
9:50 AM
pfallenop has quit
9:50 AM
tbaror
Hello, can anyone here help me with design advice? I have a question about scaling logstash/elasticsearch to accommodate a large message volume. I'm building a security event center that will mostly receive syslog from firewalls and switches, plus winlogbeats. For that scenario, at what message volume should I consider some kind of queue management, or are there tweaks to make on the logstash or elasticsearch side?
9:50 AM
yybel
still weird that nothing goes through when date filter is there
9:51 AM
are you checking the output from elasticsearch?
9:51 AM
Tyr-Heimdal
you mean where do I get my errors/conclusions from?
9:51 AM
yybel
yeah
9:51 AM
Tyr-Heimdal
kibana
9:51 AM
so, yeah
9:51 AM
yybel
does elasticsearch have some errors about incoming events
9:51 AM
for logstash debugging its good to use output stdout
9:52 AM
then tail the stdout logfile
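The debug output being suggested might look like this — stdout with the rubydebug codec is the conventional choice for readable events; this is a sketch, not the paste's actual output section:

```
output {
  # print every event, including ones that never reach elasticsearch,
  # in a human-readable form
  stdout { codec => rubydebug }
}
```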
9:53 AM
Tyr-Heimdal
so, with that filter you gave me, it still processes everything like it should
9:53 AM
if I re-add date after csv, it fails
9:54 AM
I have to say it's a good feeling that it wasn't an easy fix ^^
9:54 AM
yybel
:)
9:57 AM
Tyr-Heimdal
on the other hand...if it was a quick fix...this "#%£$€ would be working now :P
9:58 AM
pfallenop joined the channel
9:58 AM
pfallenop has quit
9:58 AM
pfallenop joined the channel
9:58 AM
Xylakant has quit
9:59 AM
yybel
i tried your data and config and it works for me
9:59 AM
maybe the input is somehow messed because i used stdin input
10:00 AM
hugh_jass has quit
10:00 AM
hugh_jass joined the channel
10:00 AM
Tyr-Heimdal
I'll open the file in an editor and double-check if there is anything hidden in it
10:01 AM
Tyr-Heimdal
that's just horribly annoying
10:02 AM
nope, no whitespaces, no special chars or anything else
10:04 AM
and that was just a duplicate of my config file?
10:04 AM
yybel
changed input to stdin and output to stdout
10:04 AM
otherwise the same
10:04 AM
Tyr-Heimdal
that shouldn't be significant, right..?
10:05 AM
yybel
no, unless lumberjack already contains some field information
10:06 AM
but if you get all the same fields as in that paste except for the @timestamp being wrong then it shouldnt be the issue
10:06 AM
Tyr-Heimdal
i do
10:07 AM
yybel
do the output stdout at least for better debugging
10:07 AM
you get the info out of events that dont make it to elasticsearch
10:07 AM
Tyr-Heimdal
I'll look into that
10:07 AM
thanks!
10:08 AM
how about the geo-issue? got any input there?
10:10 AM
yybel
kibana map doesnt work?
10:10 AM
Tyr-Heimdal
right
10:11 AM
yybel
im using geoip filter and it creates array named location with two number values for lat and lng
10:11 AM
Tyr-Heimdal
I get this output: "location": { "lon": 12.80557, "lat": 62.827551 }
10:12 AM
could it be that easy...that it's called lon instead of lng?
10:13 AM
yybel
my data doesnt have those field names
10:13 AM
just two values in an array for location
10:13 AM
fatdragon joined the channel
10:13 AM
yours seems to be valid way too
10:15 AM
maybe you need to also convert location field into "geo_point"
10:15 AM
you have location.lat and location.lon as float, but location itself is some default type
10:15 AM
notebox joined the channel
10:16 AM
Tyr-Heimdal
meaning..?
10:16 AM
yybel
or maybe im totally lost
10:16 AM
i dont think logstash understands that geo_point type
10:17 AM
fatdragon has quit
10:17 AM
so, if you check the list of fields under kibana settings and indices, i guess the location field is not of geo_point type
10:18 AM
Tyr-Heimdal
that's correct
10:19 AM
so how do I make that happen
10:19 AM
?
10:21 AM
yybel
i suppose you have to edit elasticsearch index template for logstash
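For what that template edit might look like — assuming the field is the top-level location object shown earlier and an index pattern of logstash-*, both of which are assumptions about this setup:

```json
{
  "template": "logstash-*",
  "mappings": {
    "_default_": {
      "properties": {
        "location": { "type": "geo_point" }
      }
    }
  }
}
```

With this mapping applied, newly created logstash-* indices treat location as a geo_point, which is what the kibana map visualization requires.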
10:21 AM
Tyr-Heimdal
ok. I'll look into that
10:22 AM
thanks for all your help! Always nice to talk to people willing to help out :D
10:24 AM
yybel
you can ask more at #elasticsearch :)
10:24 AM
Tyr-Heimdal
awesome! :D
10:24 AM