you could run it per client ( run facter ) or from the puppet master, since all of them report there anyway. the input in this case connects to the puppet master's API and asks for all the facts from all the reported hosts
vali
hmmm
so just run in client mode and connect to the puppet master
thanasisk: Error: Spurious "]". You may want to quote your arguments with double quotes in order to prevent extra brackets from being evaluated as nested commands.
vali
but how do I request the stuff I need?
thanasisk
what the heck does that mean?
mihailv
hi, i am getting this error when trying to install logstash-contrib after installing the core version: logstash >= 1.4.2 is needed by logstash-contrib-1.4.2-1_efd53ef.noarch
this is after i installed 1.2.1-1
on centos
electrical
why install 1.2.1? that's so old
mihailv
i erased the old version and tried to install the contrib version directly
Error: Execution of '/bin/rpm -i /var/lib/logstash/swdl/logstash-contrib-1.4.2-1_efd53ef.noarch.rpm' returned 1: error: Failed dependencies: logstash >= 1.4.2 is needed by logstash-contrib-1.4.2-1_efd53ef.noarch logstash < 1.4.3 is needed by logstash-contrib-1.4.2-1_efd53ef.noarch
this is the package received by param ls_contrib => 'https://download.elasticsearch.org/logstash/logstash/packages/centos/logstash-contrib-1.4.2-1_efd53ef.noarch.rpm',
i've spent 1.5 hours dumping in an 8GB file from lumberjack, and i'm still only halfway .. :p
about 600 events/sec
seems it's only able to send 4000 events per sec, giving me an estimated total of 3.5 hours for a mere 8GB file?
electrical
mihailv: it fails because package_url ( for the core package ) is empty.. you are letting the core package be installed via the repo ( 1.4.2 ) but want to install the 1.4.1 contrib package
that will never work :-)
kjelle: did you increase the number of filter threads? ( -w option )
kjelle
electrical: hmm. on the lumberjack?
mihailv
currently i don't have any logstash instance installed and am trying to install the 1.4.2 version ( this is the param ls_contrib => 'https://download.elasticsearch.org/logstash/logstash/packages/centos/logstash-contrib-1.4.2-1_efd53ef.noarch.rpm' )
electrical
kjelle: logstash instance.. Grok can be fairly slow, yeah
kjelle
electrical: my regex is quite large..
Raeven
Hi, i am still having problems with logstash-forwarder. I have checked the certificates and the time but i still get a message saying "Failed to tls handshake with 192.168.2.210:12345 tls: either ServerName or InsecureSkipVerify must be specified in the tls.Config"
kjelle
electrical: ill try to upp the -w to 8
Raeven
I must be doing something wrong.
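(For context: that Go TLS error usually appears when the forwarder connects to a bare IP that doesn't match any name in the server certificate. A minimal logstash-forwarder config sketch, assuming hypothetical hostname and paths; the point is that the entry in "servers" should match the certificate's CN or a subjectAltName:)

```json
{
  "network": {
    "servers": [ "logstash.example.com:12345" ],
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt",
    "timeout": 15
  },
  "files": [
    { "paths": [ "/var/log/messages" ], "fields": { "type": "syslog" } }
  ]
}
```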
electrical
mihailv: ahh okay, sorry, i was wrong.. you are installing the contrib package, but it requires the core package as well, which you didn't specify in package_url
kjelle
electrical: woho. 527% cpu :D
electrical: it seems faster. perhaps 3x faster with -w 8.. still too slow.
mihailv
i did define the package url to be the same as the contrib_package_url but i commented it out in the tests; you can see it in the gist, it's right before the contrib_package_url variable
kjelle
electrical: i guess this is the filter worker (Grok) and not my network, lumberjack, or elasticsearch. is that correct?
electrical
mihailv: the package_url is for the core package, not the contrib one :-)
kjelle: yeah indeed... you could run top -Hp $LSPID to see which threads are at 100%
kjelle
electrical: i got 1 >output, 1 <lumberjack and 8 worker. output at ~60-70, lumberjack at ~45-50, workers at ~28-30
mihailv
ok, there is something i am missing: the documentation says i should use 2 params, install_contrib and contrib_package_url. i tried that and it fails. what is the solution? first install the core version and then the contrib one? i used package_url there because of a bug that now seems to be fixed
kjelle
(%cpu, almost no ram)
electrical: this is the same box being the master of a 20 ES dataNode cluster. so there are like 30 Java processes, at ~3% cpu
electrical
mihailv: package_url should point to the core package url.. contrib_package_url should point to the contrib package url.. you can run them separately.
kjelle: hmm okay, hard to see where it is then.. most of the time when a certain thread is at/near 100% it means it's working as hard as it can.
kjelle
electrical: i can set -w 1 and see how hard it is..
mihailv
this is my setup https://gist.github.com/anonymous/966ea97f4b55f.... From the docs it seems to be correct. With this setup i am getting the Error: Execution of '/bin/rpm -i /var/lib/logstash/swdl/logstash-contrib-1.4.2-1_efd53ef.noarch.rpm' returned 1: error: Failed dependencies: error
kjelle
electrical: this is a "poc", later i will use LS receivers to push all data to rabbitmq, then have multiple LS parsers pull data, parse, and shuffle to ES.
electrical: it seems this might be the max per LS parser.
electrical
mihailv: that is never possible.. when you set the contrib_package_url you _need_ the package_url in the same class since the contrib package depends on the core package
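(A sketch of what electrical describes, as a single puppet-logstash class declaration; the core package URL below is an assumption patterned on the contrib URL from the conversation, so check the actual download location for your version:)

```puppet
# Sketch: core and contrib package URLs set in the same class, since the
# contrib rpm depends on the matching core rpm. Core URL is an assumption.
class { 'logstash':
  package_url         => 'https://download.elasticsearch.org/logstash/logstash/packages/centos/logstash-1.4.2-1_2c0f5a1.noarch.rpm',
  install_contrib     => true,
  contrib_package_url => 'https://download.elasticsearch.org/logstash/logstash/packages/centos/logstash-contrib-1.4.2-1_efd53ef.noarch.rpm',
}
```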
kjelle: we do know that the grok filter can be fairly slow with large regex patterns. we hired someone who is going to work on that ( also the KV filter )
kjelle
electrical: as you can imagine, it is having a hard day ;) is it possible to make filters better - e.g. does grok support comma-separated delimiters and mapping them to fields? i basically just have a long list of fields with , in between.
electrical: hehe
electrical
kjelle: comma separated sounds like CSV possibly?
kjelle
electrical: yes
so i might check out that instead.
electrical
you could try that filter.. might parse it more effectively
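(A minimal sketch of the csv filter for a plain comma-separated line; the column names are hypothetical stand-ins for kjelle's fields:)

```
filter {
  csv {
    separator => ","
    columns   => [ "stime", "ltime", "src_ip", "dst_ip", "bytes" ]
  }
}
```

The csv filter splits on the separator without per-field regex matching, which is why it can be cheaper than a large grok pattern when no syntax checking is needed.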
kjelle
i really dont need any regex matching (syntax checks).
electrical: yepps, ill do that. thanks ;)
electrical
np :-)
kjelle
electrical: here we go. an 8-worker csv filter instead ;)
electrical: it is equal.
electrical: in speed.
electrical
hmm okay.
kjelle
electrical: so i guess i should push to rabbitmq
then pull with multiple 8-worker LS parsers.
electrical: ill try with a smaller regex footprint (the stime and ltime are a bit expensive)
electrical
could do that as well yeah. there are so many different options :-)
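(The receiver/parser split kjelle describes could be sketched roughly like this, assuming the rabbitmq input/output plugins are available; host, exchange, and queue names are hypothetical:)

```
# Receiver instance: accept lumberjack, forward raw events, no filters.
output {
  rabbitmq {
    host          => "mq.example.com"   # hypothetical broker
    exchange      => "logstash"
    exchange_type => "direct"
    key           => "raw"
  }
}

# Parser instances (run several, each with -w 8): pull, filter, ship to ES.
input {
  rabbitmq {
    host  => "mq.example.com"
    queue => "raw-events"
    key   => "raw"
  }
}
```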
kjelle
but I think it shouldn't do much ..
i'm up at about 10240 events/second.
i have ~50 million events in my 8GB file.. 1 hour and 20 minutes..
question is why my workers (8) are only at 25-30% cpu and not 100%
kees_
PHP Warning: array_key_exists() expects parameter 2 to be array, null given in /mnt/web/tweakers-7/inc/class/members/preferences/SessionPreferencesStore.php << this also comes up really quite often
electrical
kjelle: how many cores do you have on that machine?