Logging from Java app to ELK without need for parsing logs


I want to send logs from a Java app to Elasticsearch, and the conventional approach seems to be to set up Logstash on the server running the app, and have Logstash parse the log files (with regex...!) and load them into Elasticsearch.

Is there a reason it's done this way, rather than setting up log4j (or logback) to log things in the desired format directly to a log collector that can then ship them to Elasticsearch asynchronously? It seems crazy to me to have to fiddle with grok filters to deal with multiline stack traces (and burn CPU cycles on log parsing) when the app could just log in the desired format in the first place. A rough sketch of what I have in mind follows below.
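For example, something along these lines (a log4j 1.x sketch; the collector host, port, and class name are placeholders, and it assumes something like a Logstash log4j input listening on the other end):

    import org.apache.log4j.AsyncAppender;
    import org.apache.log4j.Logger;
    import org.apache.log4j.net.SocketAppender;

    public class AsyncShippingSetup {
        public static void main(String[] args) {
            // Send serialized log events to a remote collector without
            // blocking application threads: AsyncAppender hands each event
            // to a background thread that feeds the SocketAppender.
            SocketAppender socket = new SocketAppender("logs.example.com", 4560);

            AsyncAppender async = new AsyncAppender();
            async.setBufferSize(512); // bounded hand-off buffer
            async.addAppender(socket);

            Logger.getRootLogger().addAppender(async);
            Logger.getLogger(AsyncShippingSetup.class).info("hello from the app");
        }
    }

That way the app emits structured events from the start, and nothing downstream ever has to regex its way through a stack trace.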

On a tangentially related note: for apps running in a Docker container, is it best practice to log directly to Elasticsearch, given that the container should run only one process?

I think it's ill-advised to log directly to Elasticsearch from a log4j/logback/whatever appender, but I agree that writing Logstash filters to parse a "normal" human-readable Java log is a bad idea too. I use https://github.com/logstash/log4j-jsonevent-layout everywhere I can, so that log4j's regular file appenders produce JSON logs that don't require any further parsing by Logstash.
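For example, a minimal log4j 1.x setup along these lines (the file path is a placeholder, and it assumes the jsonevent-layout jar is on the classpath; the same layout can just as well be wired up in log4j.properties):

    import java.io.IOException;

    import net.logstash.log4j.JSONEventLayoutV1;
    import org.apache.log4j.Level;
    import org.apache.log4j.Logger;
    import org.apache.log4j.RollingFileAppender;

    public class JsonFileLoggingSetup {
        public static void main(String[] args) throws IOException {
            // A regular rolling file appender, but the layout writes each
            // log event as a single-line JSON object.
            RollingFileAppender appender =
                    new RollingFileAppender(new JSONEventLayoutV1(), "/var/log/myapp/app.json");
            appender.setMaxFileSize("10MB");
            appender.setMaxBackupIndex(5);

            Logger.getRootLogger().addAppender(appender);
            Logger.getRootLogger().setLevel(Level.INFO);

            Logger log = Logger.getLogger(JsonFileLoggingSetup.class);
            log.info("application started");
            try {
                throw new IllegalStateException("boom");
            } catch (IllegalStateException e) {
                // The exception is serialized as a field inside the same
                // JSON event, so multiline grok handling is never needed.
                log.error("something failed", e);
            }
        }
    }

Logstash then only has to tail the file with a json codec: no grok, no multiline filter.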

