csv - Filter/grok method on Logstash


Suppose we have a log file:

jan 1 22:54:17 drop   %logsource% >eth1 rule: 7; rule_uid: {c1336766-9489-4049-9817-50584d83a245}; src: 70.77.116.190; dst: %dstip%; proto: tcp; product: vpn-1 & firewall-1; service: 445; s_port: 2612;
jan 1 22:54:22 drop   %logsource% >eth1 rule: 7; rule_uid: {c1336766-9489-4049-9817-50584d83a245}; src: 61.164.41.144; dst: %dstip%; proto: udp; product: vpn-1 & firewall-1; service: 5060; s_port: 5069;
jan 1 22:54:23 drop   %logsource% >eth1 rule: 7; rule_uid: {c1336766-9489-4049-9817-50584d83a245}; src: 69.55.245.136; dst: %dstip%; proto: tcp; product: vpn-1 & firewall-1; service: 445; s_port: 2970;
jan 1 22:54:41 drop   %logsource% >eth1 rule: 7; rule_uid: {c1336766-9489-4049-9817-50584d83a245}; src: 95.104.65.30; dst: %dstip%; proto: tcp; product: vpn-1 & firewall-1; service: 445; s_port: 2565;
jan 1 22:54:43 drop   %logsource% >eth1 rule: 7; rule_uid: {c1336766-9489-4049-9817-50584d83a245}; src: 222.186.24.11; dst: %dstip%; proto: tcp; product: vpn-1 & firewall-1; service: 2967; s_port: 6000;
jan 1 22:54:54 drop   %logsource% >eth1 rule: 7; rule_uid: {c1336766-9489-4049-9817-50584d83a245}; src: 74.204.108.202; dst: %dstip%; proto: udp; product: vpn-1 & firewall-1; service: 137; s_port: 53038;
jan 1 22:55:10 drop   %logsource% >eth1 rule: 7; rule_uid: {c1336766-9489-4049-9817-50584d83a245}; src: 71.111.186.26; dst: %dstip%; proto: tcp; product: vpn-1 & firewall-1; service: 445; s_port: 38548;
jan 1 23:02:56 accept %logsource% >eth1 inzone: external; outzone: local; rule: 3; rule_uid: {723f81ef-75c9-4cbb-8913-0ebb3686e0f7}; service_id: icmp-proto; icmp: echo request; src: 24.188.22.101; dst: %dstip%; proto: 

What filter/grok method can I use to separate these into different fields? If I use the semicolon as a separator, the last row of data has more semicolons than the other rows. Should I use an if/else statement to separate them?

This looks like a typical use case for grok and the kv filter.

First, use a grok filter to separate the fields, and put the last part (the key-value pairs) into one field. Use the grok debugger to find the correct pattern. This might be an approach:

%{CISCOTIMESTAMP:timestamp} %{WORD:action}%{SPACE}%{DATA:logsource} %{DATA:interface} %{GREEDYDATA:kvpairs}

In your Logstash config:

grok {
    match => [ 'message', '%{CISCOTIMESTAMP:timestamp} %{WORD:action}%{SPACE}%{DATA:logsource} %{DATA:interface} %{GREEDYDATA:kvpairs}' ]
}
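As a quick sanity check outside Logstash, the field layout this pattern captures can be approximated with a plain regular expression. This is only a rough Python equivalent, not grok itself; the character classes are simplified assumptions:

```python
import re

# Approximate Python equivalent of the grok pattern above, just to
# illustrate which part of a line lands in which field.
LINE_RE = re.compile(
    r"(?P<timestamp>\w+ \d+ \d{2}:\d{2}:\d{2}) "  # ~ CISCOTIMESTAMP
    r"(?P<action>\w+)\s+"                         # ~ WORD + SPACE
    r"(?P<logsource>\S+) "                        # ~ DATA
    r"(?P<interface>\S+) "                        # ~ DATA
    r"(?P<kvpairs>.*)"                            # ~ GREEDYDATA
)

# One of the sample lines from the log above, shortened for readability
line = ("jan 1 22:54:17 drop   %logsource% >eth1 rule: 7; "
        "src: 70.77.116.190; proto: tcp; service: 445; s_port: 2612;")
fields = LINE_RE.match(line).groupdict()
```

Here `fields["kvpairs"]` holds the remaining `key: value` pairs for the next step.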

Afterwards, use the kv filter to split the key-value pairs. This might work:

kv {
    source      => "kvpairs"  # the new field generated by grok above
    field_split => ";"        # fields are separated by semicolons
    value_split => ":"        # keys and values are separated by colons
    trim_key    => " "        # strip the surrounding spaces
    trim_value  => " "
}
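The kv step amounts to splitting on semicolons and then splitting each piece once on the first colon, which also answers the question about rows having a different number of semicolons: extra pairs simply become extra fields. A minimal Python sketch of that logic:

```python
def split_kvpairs(kvpairs):
    """Mimic the kv filter: split fields on ';', keys from values on ':'."""
    event = {}
    for pair in kvpairs.split(";"):
        if ":" in pair:  # skip empty trailing pieces
            key, value = pair.split(":", 1)
            event[key.strip()] = value.strip()
    return event

# Key-value part of one of the sample lines above
pairs = ("rule: 7; rule_uid: {c1336766-9489-4049-9817-50584d83a245}; "
         "src: 70.77.116.190; dst: %dstip%; proto: tcp; "
         "product: vpn-1 & firewall-1; service: 445; s_port: 2612;")
event = split_kvpairs(pairs)
```

A row with more pairs (like the `accept` line with `inzone`/`outzone`) just produces a larger dictionary, so no if/else branching is needed.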

Try it, adjust it a little if needed, and you should be able to parse the log lines correctly.

