I wanted to visualize some of the traffic flows from my FortiGate firewall at home. FortiAnalyzer has some of this functionality, but I wanted to learn a bit about the ELK Stack, so I thought this would be a fun little project to get me started!

I started off with a CentOS 7 virtual machine and installed ELK following one of the many online guides. My main goal was to take the traffic logs and visualize the source and destination as geo points on a world map.

Logstash

Edit the Logstash config file to add a UDP listener for syslog from the FortiGate. The filter then sorts the fields in the log and does GeoIP lookups on the source and destination IPs. Restart Logstash once the config changes have been made. You will need to update the output IP to point at your Elasticsearch host.

input {
  udp {
    port => 5000
    type => firewall
  }
}

filter {
  if [type] == "firewall" {
    mutate {
      add_tag => ["fortigate"]
    }
    grok {
      break_on_match => false
      match => [ "message", "%{SYSLOG5424PRI:syslog_index}%{GREEDYDATA:message}" ]
      overwrite => [ "message" ]
      tag_on_failure => [ "failure_grok_fortigate" ]
    }

    kv { }

    if [msg] {
      mutate {
        replace => [ "message", "%{msg}" ]
      }
    }
    mutate {
      convert => { "duration" => "integer" }
      convert => { "rcvdbyte" => "integer" }
      convert => { "rcvdpkt" => "integer" }
      convert => { "sentbyte" => "integer" }
      convert => { "sentpkt" => "integer" }
      convert => { "cpu" => "integer" }
      convert => { "disk" => "integer" }
      convert => { "disklograte" => "integer" }
      convert => { "fazlograte" => "integer" }
      convert => { "mem" => "integer" }
      convert => { "totalsession" => "integer" }
    }
    mutate {
      add_field => [ "fgtdatetime", "%{date} %{time}" ]
      add_field => [ "loglevel", "%{level}" ]
      replace => [ "fortigate_type", "%{type}" ]
      replace => [ "fortigate_subtype", "%{subtype}" ]
      remove_field => [ "msg", "message", "date", "time", "eventtime" ]
    }
    date {
      match => [ "fgtdatetime", "YYYY-MM-dd HH:mm:ss" ]
      locale => "en"
      timezone => "NZ"
      remove_field => [ "fgtdatetime" ]
    }
    geoip {
      source => "srcip"
      target => "geosrcip"
      add_field => [ "[geosrcip][coordinates]", "%{[geosrcip][longitude]}" ]
      add_field => [ "[geosrcip][coordinates]", "%{[geosrcip][latitude]}" ]
    }
    geoip {
      source => "dstip"
      target => "geodstip"
      add_field => [ "[geodstip][coordinates]", "%{[geodstip][longitude]}" ]
      add_field => [ "[geodstip][coordinates]", "%{[geodstip][latitude]}" ]
    }
    # Convert the coordinate arrays built above to floats
    mutate {
      convert => [ "[geosrcip][coordinates]", "float" ]
      convert => [ "[geodstip][coordinates]", "float" ]
    }
  }
}

output {
  if [host] == "10.a.b.c" {
    elasticsearch {
      hosts => "10.x.y.z:9200"
      index => "fortinet-%{+YYYY.MM.dd}"
    }
  }
}
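The kv filter does most of the heavy lifting here: a FortiGate log body is a series of space-separated key=value pairs, and kv turns each pair into its own field. A quick shell sketch of that extraction on a sample line (the field values are illustrative, not a real log):

```shell
# A FortiGate traffic log body is a series of key=value pairs;
# Logstash's kv filter splits each pair into a separate field.
log='type=traffic subtype=forward srcip=203.0.113.5 dstip=192.0.2.10 sentbyte=1520'

# Pull one field out the same way kv would (sample only)
srcip=$(printf '%s' "$log" | tr ' ' '\n' | grep '^srcip=' | cut -d= -f2)
echo "$srcip"
```

This is why no field list is hard-coded in the filter: whatever keys the FortiGate sends become fields automatically.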

Elasticsearch

I needed to create an index template so that Elasticsearch knew how to map the fields; this is done by making an API request to Elasticsearch. This will create a mapping for the geodstip and geosrcip fields so that they can be referenced as geohashes.

curl -X PUT "127.0.0.1:9200/_template/template_forti" -H 'Content-Type: application/json' -d'
{
     "version" : 50002,
     "order" : 1,
     "template" : "fortinet-*",
     "settings" : {
       "index" : {
         "refresh_interval" : "5s"
       }
     },
     "mappings" : {
       "_default_" : {
         "_all" : {
           "enabled" : true,
           "norms" : false
         },
         "dynamic_templates" : [
           {
             "message_field" : {
               "path_match" : "message",
               "match_mapping_type" : "string",
               "mapping" : {
                 "type" : "text",
                 "norms" : false
               }
             }
           },
           {
             "string_fields" : {
               "match" : "*",
               "match_mapping_type" : "string",
               "mapping" : {
                 "type" : "text",
                 "norms" : false,
                 "fields" : {
                   "keyword" : {
                     "type" : "keyword",
                     "ignore_above" : 256
                   }
                 }
               }
             }
           }
         ],
         "properties" : {
           "@timestamp" : {
             "type" : "date",
             "include_in_all" : false
           },
           "@version" : {
             "type" : "keyword",
             "include_in_all" : false
           },
           "geoip" : {
             "dynamic" : true,
             "properties" : {
               "ip" : {
                 "type" : "ip"
               },
               "location" : {
                 "type" : "geo_point"
               },
               "latitude" : {
                 "type" : "half_float"
               },
               "longitude" : {
                 "type" : "half_float"
               }
             }
           },
           "geodstip" : {
             "dynamic" : true,
             "properties" : {
               "ip" : {
                 "type" : "ip"
               },
               "location" : {
                 "type" : "geo_point"
               },
               "latitude" : {
                 "type" : "half_float"
               },
               "longitude" : {
                 "type" : "half_float"
               }
             }
           },
           "geosrcip" : {
             "dynamic" : true,
             "properties" : {
               "ip" : {
                 "type" : "ip"
               },
               "location" : {
                 "type" : "geo_point"
               },
               "latitude" : {
                 "type" : "half_float"
               },
               "longitude" : {
                 "type" : "half_float"
               }
             }
           }
         }
       }
     },
     "aliases" : { }
 }
 '
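To confirm the template was stored, you can query the same endpoint with a GET (this assumes Elasticsearch is listening on 127.0.0.1:9200 as above); it should echo the template back:

```
curl -X GET "127.0.0.1:9200/_template/template_forti?pretty"
```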

FortiGate

Configure syslog to send to your Logstash server:

config log syslogd setting
    set status enable
    set server "10.x.y.z"
    set port 5000
end
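Rather than waiting for real traffic, you can have FortiOS generate sample log messages from the CLI to confirm they arrive in Logstash (the exact output of this command varies by FortiOS version):

```
diagnose log test
```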

In Kibana, create an index pattern using the format fortinet-*.
You should then be able to create a region map based on the geosrcip.country_code2.keyword value.

Below is an example of random scanning traffic hitting the public interface on my FortiGate over the last 24 hours.