i gtg for now. thanks for the input, although hardly anything of that was new. if anyone knows who to contact regarding this (we are willing to pay for this), please drop me a msg.
SkaveRat
<-
I'll take it from here
electrical
kleind: you could always take Support from ES ;-) hehe
kleind
they told me to buy more hardware.......
justme123 has quit
and never really listened to the problem description
electrical
kleind: really? hmm
SkaveRat
also they don't have any hands-on support on short notice
mikran has quit
electrical
we don't do hands-on support, no (Disclaimer: I work at Elastic)
tchiang has quit
SkaveRat
I've contacted Elastic for some local support/consulting (Germany) and they sent me info about an official partner company that does official Elastic support and consulting
I'll be visiting the ELK workshop next week in Amsterdam. That will have to do for now, probably
jstoiko joined the channel
jigax joined the channel
electrical
SkaveRat: ah okay :-)
Yeah, we use partners for the consulting stuff indeed.
SkaveRat
they'd do some in-house consulting, but not before may, which is a little late for us
koendc joined the channel
electrical
Ah i see. okay
The workshop might give some insight indeed.
mschmitt joined the channel
jeffr76
i do wish there was better documentation for performance tuning the stack
deviantony has quit
electrical
The issue is that it depends heavily on the use case... for example, website search is a completely different pattern (mainly searching) than logging (mainly indexing)
jeffr76
our LS nodes do about 2k docs/s, which is better than they were, and CPU is about 50% used, but I would love to squeeze out more
_habanero has quit
electrical
I hope at some point we can work on that kind of documentation.
SkaveRat
the biggest problem we currently see is that LS can't remotely handle the data we throw at it. We now log about 2k logs/s, which a single machine can easily handle, but we plan on pushing 50k+ logs/s in the future, and it's really annoying to have to set up a giant cluster just for parsing the logs
jeffr76
for us it's almost 50%/50%
we have a cluster now, and are planning on about 60k/s
with peaks much more than that
electrical
SkaveRat: if I read back correctly, you are using Redis as an input, right? Is that queue filled up or empty most of the time?
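(For context, the Redis-as-broker setup being discussed usually looks something like the sketch below. Host name and list key are assumptions for illustration, not taken from this conversation.)

```conf
# Hypothetical Logstash config: Redis list used as a buffer between
# shippers and the indexing pipeline.
input {
  redis {
    host      => "redis.example.com"  # assumed broker host
    data_type => "list"
    key       => "logstash"           # assumed list key the shippers push to
  }
}
```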
jeffr76
like 3x
_habanero joined the channel
olivier__ has quit
SkaveRat
electrical, while we benchmark, we reset the (in-memory) queue to 8 million logs each time, to get reproducible results. In production it's pretty much empty, just a short-term buffer
electrical
SkaveRat: okay. That means LS is processing as fast as it's receiving if the queue is near 0 items.
Did you see the queue size increase at any point?
SkaveRat
well, in production we push only about 2k logs/s into it, while we can handle a lot more than that. That's why it's empty
sidomsa has quit
lucascastro joined the channel
cittatva
hm, so the irc input seems to only connect to the last channel in the channels list
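(The report above concerns the irc input's `channels` list. A config along these lines — server, nick, and channel names are hypothetical — would be expected to join every channel in the array, not just the last one:)

```conf
# Hypothetical Logstash config for the irc input.
input {
  irc {
    host     => "irc.freenode.net"        # assumed server
    nick     => "logstash-bot"            # assumed nick
    channels => ["#channel-a", "#channel-b"]  # expected: join both
  }
}
```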
jigax has quit
kees_ has quit
rojem joined the channel
colinsurprenant joined the channel
olivier__ joined the channel
_Bryan_ joined the channel
Rumbles has quit
jettrocoenradie joined the channel
_maes_ has quit
electrical
cittatva: really? that's a horrible bug.
colinsurprenant has quit
cittatva
it's possible that I'm doing it wrong, I guess? I'll post my config...
will do - i'll poke at it a little first and see if I can figure out what's going on
Rumbles joined the channel
electrical
okay thanks!
dasrecht_off is now known as dasrecht
iamchrisf joined the channel
nicetR joined the channel
iamchrisf_ joined the channel
Rumbles
I'm trying to use gsub to remove the middle part of a log line that was joined together by the multiline filter (I want to remove the end of the first log line and the start of the second log line), but it doesn't seem to work how I would expect it to... As far as I can see the regex should work, but it doesn't replace anything when a line goes through. I have put the config along with a message after
Title: untangle NGFW filter with json parse failure and sucess egs - Pastebin.com (at pastebin.com)
Rumbles
I've put an entry that fails the json parser straight after the filter; after that I have also put an example of one that passed the json parse filter successfully
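(The pastebin config isn't reproduced in this log, but the kind of mutate/gsub filter Rumbles describes might look like the sketch below. The field name and the regex markers are assumptions; the common pitfall with multiline-joined events is that the pattern must be able to match across the embedded newline:)

```conf
# Hypothetical filter: strip the tail of the first joined line and the
# head of the second. (?m) makes '.' match the newline that the
# multiline filter inserted between the two original lines.
filter {
  mutate {
    gsub => [ "message", "(?m)END-OF-LINE-1.*?START-OF-LINE-2", " " ]
  }
}
```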