I have a question about logstash, can I ask it here?
bjorn__
Go ahead, and we will see if anyone can answer
selinuxium_ has quit
Mounica has quit
yunus has quit
ManelAcacio
I am using Logstash to fetch data from SQL into Elasticsearch. Because Logstash doesn't nest values from rows with the same _id, I am using the aggregate plugin. But the examples I have all use a flag like "task END" to flush the nested field to the output.
serdar
how can I get Logstash 5.0?
ManelAcacio
If you don't have this "task end" flag, is it still possible to do this?
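[A sketch of one possible approach, assuming the logstash-filter-aggregate plugin: its push_previous_map_as_event option flushes the aggregated map whenever a new task_id arrives, so no explicit "task end" event is needed. The column and field names (id, item) are placeholders, not taken from the conversation:]

```
filter {
  aggregate {
    # assumed: "id" is the SQL column shared by the rows to be merged
    task_id => "%{id}"
    code => "
      map['items'] ||= []
      map['items'] << event.get('item')   # 'item' is a placeholder field
      event.cancel                        # drop the raw per-row event
    "
    # emit the previous map as a new event when the id changes,
    # instead of waiting for an end-of-task marker
    push_previous_map_as_event => true
    timeout => 5   # also flush after 5s of silence (covers the final id)
  }
}
```

[Note: the aggregate filter needs a single pipeline worker (-w 1) so that rows with the same id stay in order.]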
kepper has quit
instilled has quit
permalac joined the channel
olegg__
guys, please help, I get this error -> :exception=>#<EOFError: End of file reached> any idea?
and it can't deliver an email
dhollinger joined the channel
MindfulMonk joined the channel
gentunian joined the channel
robotonic joined the channel
BlackCrypt0 has quit
que has quit
Sandcrab has quit
robotonic has quit
kepper joined the channel
t4nk594 joined the channel
ef_ joined the channel
t4nk594
We use tab-delimited CSV for our access logs, and when our vulnerability scanners hit the applications, they tend to toss a bit of garbage into the fields, one piece being a double quote (written to the log as '\"'), which the CSV parser does not like.
I'm trying to replace '\"' with '"""' but it looks like it's getting written to the field as '\"\"\"' when I believe I want '"""'. Any thoughts? mutate { gsub => ["event_message", '\"', '"""'] }
FOCer joined the channel
Is this the correct approach, or should I be changing a value on the csv filter?
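[One thing worth checking, offered as a guess: the first gsub argument is a regex, so the pattern '\"' matches only the quote character and leaves the backslash in place. A sketch that targets the literal backslash-plus-quote sequence instead (assuming escape processing in the Logstash config is off, which is the default):]

```
filter {
  mutate {
    # \\ matches a literal backslash, " a literal quote,
    # so this rewrites \" to """
    gsub => ["event_message", '\\"', '"""']
  }
}
```

[Also, if the result is being inspected as JSON, the quotes in """ will themselves be displayed escaped as \"\"\", which can make a successful substitution look like it failed.]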
ef_ has quit
akp has quit
d_runk joined the channel
erve_ joined the channel
brahama joined the channel
Schwarzbaer_ has quit
ajmartinez
torrancew: trying to get the bug repro done today
instilled joined the channel
skynat2 has quit
colinsurprenant joined the channel
Xylakant has quit
Xylakant joined the channel
haukebruno joined the channel
robotonic joined the channel
bsparrow joined the channel
skynat2 joined the channel
erve joined the channel
sndcrb joined the channel
ManelAcacio
Can I compare a document ID with the previous document ID?
fatdragon joined the channel
squaly has quit
dalvin has quit
BenGatewood joined the channel
BlackCrypt0 joined the channel
squaly joined the channel
ajmartinez
torrancew: hrm... minimal conf can't seem to reproduce the issue
Itkovian joined the channel
_JZ_ joined the channel
GiantEvolving joined the channel
t4nk594 has quit
GiantEvolving
I have some questions about using logstash to ship logs to elasticsearch and I’m not sure if I should ask them here or in the elasticsearch channel. The questions are about the interface between the two products; right on the boundary. As such, please forgive me if I ask a question in here and it should be in the elasticsearch channel.
kepper has quit
GiantEvolving has quit
kepper joined the channel
GiantEvolving joined the channel
The summary is this: I am running Elasticsearch 2.2.1, Logstash 2.2.2, and Kibana 4.4.2. Logs are shipping correctly, but when I view them in Kibana, I note a problem with the mapping. Host name is an analyzed field and is getting split on hyphen. I want it not_analyzed. I understand that I can’t change a current mapping, that I need a new one.
On a related note, I have tried searching for ‘host.raw’, but it’s not being created or I can’t find it; when I try searching for it, kibana returns no results.
Question 1. how do I specify the mapping? Is it a config change to the logstash elasticsearch plugin or a config change to the elasticsearch server? I want to manage the mapping with puppet. Relatedly: is this the most current one that I can copy https://github.com/elastic/logstash/blob/v1.3.1... and make changes to?
My research so far shows that for question 1, I can’t put a config file into place on the ES server, I have to use the API. This is firmly into the ES side of things so I’ll take my question there.
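[For reference, a minimal sketch of such an index template against the Elasticsearch 2.x API; the template name and index pattern below are assumptions, and a template only affects indices created after it is installed:]

```
curl -XPUT 'http://localhost:9200/_template/hostname_fix' -d '{
  "template": "logstash-*",
  "mappings": {
    "_default_": {
      "properties": {
        "host": { "type": "string", "index": "not_analyzed" }
      }
    }
  }
}'
```

[With this in place, the next daily logstash-* index maps host as not_analyzed, so it is no longer split on hyphens.]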
t4nk882 joined the channel
Mounica joined the channel
Xylakant
GiantEvolving: the elasticsearch puppet module can manage index templates for you
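[A minimal sketch of that, assuming the elastic/puppet-elasticsearch module is installed; the resource title and source path are placeholders:]

```
# manage an index template from a JSON file shipped with your Puppet code
elasticsearch::template { 'logstash':
  source => 'puppet:///modules/profile/elasticsearch/logstash-template.json',
}
```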