
synesis_lite_suricata's Issues

[Re]Some problem inside my Elasticstack + Suricata

I solved my last problem, but now different errors appear in the Elasticsearch and Kibana logs.

**Elasticsearch log error**

```
aina@elasticsearch:~$ sudo head /var/log/elasticsearch/elasticsearch.log
[2021-07-28T13:22:52,898][WARN ][o.e.x.m.e.l.LocalExporter] [eY6v6GM] unexpected error while indexing monitoring document
org.elasticsearch.xpack.monitoring.exporter.ExportException: ClusterBlockException[blocked by: [FORBIDDEN/12/index read-only / allow delete (api)];]
at org.elasticsearch.xpack.monitoring.exporter.local.LocalBulk.lambda$throwExportException$2(LocalBulk.java:125) ~[x-pack-monitoring-6.8.0.jar:6.8.0]
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195) ~[?:?]
at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:177) ~[?:?]
at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948) ~[?:?]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484) ~[?:?]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474) ~[?:?]
at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150) ~[?:?]
at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173) ~[?:?]
```

**Kibana log error**

```
Jul 28 15:44:48 kibana kibana[1530]: {"type":"log","@timestamp":"2021-07-28T12:44:48Z","tags":["error","task_manager"],"pid":1530,"message":"Failed to poll for work: [cluster_block_exception] blocked by: [FORBIDDEN/12/index read-only / allow delete (api)]; :: {"path":"/.kibana_task_manager/_doc/Maps-maps_telemetry/_update","query":{"if_seq_no":11,"if_primary_term":2,"refresh":"true"},"body":"{\"doc\":{\"type\":\"task\",\"task\":{\"taskType\":\"maps_telemetry\",\"state\":\"{\\\"runs\\\":1,\\\"stats\\\":{\\\"mapsTotalCount\\\":0,\\\"timeCaptured\\\":\\\"2021-07-12T10:00:18.993Z\\\",\\\"attributesPerMap\\\":{\\\"dataSourcesCount\\\":{\\\"min\\\":0,\\\"max\\\":0,\\\"avg\\\":0},\\\"layersCount\\\":{\\\"min\\\":0,\\\"max\\\":0,\\\"avg\\\":0},\\\"layerTypesCount\\\":{},\\\"emsVectorLayersCount\\\":{}}}}\",\"params\":\"{}\",\"attempts\":0,\"scheduledAt\":\"2021-07-12T10:00:14.897Z\",\"runAt\":\"2021-07-28T12:45:48.369Z\",\"status\":\"running\"},\"kibana\":{\"uuid\":\"c9ffff37-0cdd-43c4-b95c-ca38ea93aee8\",\"version\":6080399,\"apiVersion\":1}}}","statusCode":403,"response":"{\"error\":{\"root_cause\":[{\"type\":\"cluster_block_exception\",\"reason\":\"blocked by: [FORBIDDEN/12/index read-only / allow delete (api)];\"}],\"type\":\"cluster_block_exception\",\"reason\":\"blocked by: [FORBIDDEN/12/index read-only / allow delete (api)];\"},\"status\":403}"}"}
```
Logstash seems to be working:

```
Jul 28 14:49:20 logstash logstash[2264]: [2021-07-28T14:49:20,449][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/etc/logstash/synlite_suricata/geoipdbs/GeoLite2-City.mmdb"}
Jul 28 14:49:20 logstash logstash[2264]: [2021-07-28T14:49:20,767][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/synlite-suricata_stats-1.0.1
Jul 28 14:49:20 logstash logstash[2264]: [2021-07-28T14:49:20,769][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/synlite-suricata-1.0.1
Jul 28 14:49:20 logstash logstash[2264]: [2021-07-28T14:49:20,882][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/etc/logstash/synlite_suricata/geoipdbs/GeoLite2-ASN.mmdb"}
```

And Suricata is working well with Filebeat.

I am using ELK 6.8.0

Could you help me, please?
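For anyone hitting the same `FORBIDDEN/12/index read-only / allow delete` error: Elasticsearch applies the `index.blocks.read_only_allow_delete` block automatically when the disk flood-stage watermark (95% by default) is exceeded. After freeing disk space, the block can be cleared from all indices. A sketch, assuming Elasticsearch on `localhost:9200` with no authentication (adjust to your setup):

```
# Free disk space first, then clear the read-only block from every index
curl -X PUT "http://localhost:9200/_all/_settings" \
  -H 'Content-Type: application/json' \
  -d '{"index.blocks.read_only_allow_delete": null}'
```

Note that on 6.8 the block is not released automatically when space is freed; it must be cleared by hand (later 7.x releases release it automatically once disk usage drops).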

Kibana Url Format filters appear as raw HTML in saved search visualizations

Hello.

In Elasticsearch/Kibana 7.4.0, all saved-search visualizations render the URL field formatters as raw HTML.
They are rendered correctly when viewing the same saved search in Discover.

This did not happen in 6.x.
I have upgraded to 1.1.0 as required for 7.x support.

Screenshots included.


Is this a known issue?
Is there a workaround?

Thank you.

Src/Dst Always WAN

Inbound traffic always shows my WAN IP instead of the internal IP address the traffic is destined for, and outbound traffic likewise always shows the WAN IP as the source. Is this because Suricata is only set to monitor my WAN interface?
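Most likely, yes: if Suricata only captures on the WAN side, it sees traffic after (or before) NAT, so only the public address appears. A hedged sketch of capturing on the LAN interface instead, assuming an af-packet capture; the interface name `em1` is a placeholder for your actual LAN interface:

```yaml
# suricata.yaml (fragment) -- capture on the LAN-side interface so internal
# addresses are visible in src/dest fields; "em1" is illustrative only
af-packet:
  - interface: em1
    cluster-id: 99
    cluster-type: cluster_flow
    defrag: yes
```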

Error: Failed to install template synlite-suricata_stats-1.0.1

Hi,
I have installed ELK 6.11 and need to integrate the synesis_lite_suricata solution for the dashboards. I have followed all the recommendations listed in the tutorial; however, I am facing the following issue. @robcowart, can you please look into it and advise what might have gone wrong? I am using Suricata version 4.1.2.

```
[2020-08-31T11:07:23,055][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/etc/logstash/synlite_suricata/geoipdbs/GeoLite2-ASN.mmdb"}
[2020-08-31T11:07:23,064][ERROR][logstash.outputs.elasticsearch] Failed to install template. {:message=>"Got response code '400' contacting Elasticsearch at URL 'http://192.x.x.x:9200/_template/synlite-suricata_stats-1.0.1'", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError", :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:80:in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:332:in perform_request_to_url'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:319:in block in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:414:in with_connection'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:318:in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:326:in block in Pool'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/http_client.rb:352:in template_put'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/http_client.rb:86:in template_install'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/template_manager.rb:31:in install'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/template_manager.rb:17:in install_template'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/common.rb:212:in install_template'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/common.rb:49:in block in setup_after_successful_connection'"]}
[2020-08-31T11:07:23,062][ERROR][logstash.outputs.elasticsearch] Failed to install template. {:message=>"Got response code '400' contacting Elasticsearch at URL 'http://192.168.100.107:9200/_template/synlite-suricata-1.0.1'", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError", :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:80:in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:332:in perform_request_to_url'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:319:in block in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:414:in with_connection'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:318:in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:326:in block in Pool'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/http_client.rb:352:in template_put'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/http_client.rb:86:in template_install'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/template_manager.rb:31:in install'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/template_manager.rb:17:in install_template'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/common.rb:212:in install_template'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.6.2-java/lib/logstash/outputs/elasticsearch/common.rb:49:in block in setup_after_successful_connection'"]}
[2020-08-31T11:07:31,642][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/etc/logstash/synlite_suricata/geoipdbs/GeoLite2-City.mmdb"}
[2020-08-31T11:07:31,644][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/etc/logstash/synlite_suricata/geoipdbs/GeoLite2-ASN.mmdb"}
[2020-08-31T11:07:40,765][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"192.168.100.107:5044"}
[2020-08-31T11:07:40,872][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"synlite_suricata", :thread=>"#<Thread:0x3962a0df run>"}
[2020-08-31T11:07:41,134][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:synlite_suricata], :non_running_pipelines=>[]}
[2020-08-31T11:07:41,162][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2020-08-31T11:07:41,757][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
```

Commercial version?

Hello,

If this is the "lite" version, is there a commercial version of this?
I looked at the Koiossian website but didn't find any further information, including prices.

Thanks

Threats tab

Not receiving any values for at-risk services, at-risk servers, or high-risk clients.

Docker

Is there a dockerized version of this?

Unable to capture and display dashboard

Dear all, I am facing this issue: "The request for this panel failed. -- The aggregations key is missing from the response, check your permissions for this request."

Kindly support. Thanks!


Dashboard presents errors

I am using ELK 7.1, and the Kibana dashboards present a lot of errors after installing synesis_lite_suricata (7.1.x). How can I resolve this problem?

failed to parse field [event.host] of type [keyword]

ELK FILEBEAT 6.4.2

```
[WARN ] 2019-04-08 15:37:37.272 [Ruby-0-Thread-19: :1] elasticsearch - Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"suricata-1.0.1-2019.04.08", :_type=>"doc", :routing=>nil}, #LogStash::Event:0x15177b81], :response=>{"index"=>{"_index"=>"suricata-1.0.1-2019.04.08", "_type"=>"doc", "_id"=>"ZhDh-2kB5FeKoMpssAt5", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [event.host] of type [keyword]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:220"}}}}}
```
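The "Can't get text on a START_OBJECT" hint suggests the mapping expects a string but the event carries an object. A common cause (an assumption here, not confirmed for this exact setup) is that Filebeat 6.3+ sends `host` as an object containing `host.name`, which the pipeline then copies into `event.host`. A hedged Logstash filter sketch that flattens the object before the synlite filters run:

```
filter {
  # Assumption: Filebeat >= 6.3 is shipping "host" as an object; collapse it
  # back to a plain hostname string so downstream copies into [event][host]
  # see a keyword-compatible value.
  if [host][name] {
    mutate {
      replace => { "[host]" => "%{[host][name]}" }
    }
  }
}
```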

HTTP Responses and Requests > 32766 Not Able to be Analyzed

Hope you are well!

I ran across a few issues this morning (I'm sure they're not specific to this project) when gathering HTTP traffic, where some of the really large values for http_request_body_printable and http_response_body fail to index because they are larger than 32766 bytes. I saw a few options online for ignoring text fields above a certain limit, for example:

```
{
  "logs_template": {
    "template": "logs*",
    "mappings": {
      "_default_": {
        "_all": {
          "enabled": false
        },
        "dynamic_templates": [
          {
            "notanalyzed": {
              "match": "*",
              "match_mapping_type": "string",
              "mapping": {
                "ignore_above": 512,
                "type": "string",
                "index": "not_analyzed",
                "doc_values": true
              }
            }
          }
        ]
      }
    }
  }
}
```

I was looking at your template and was wondering how this could best fit in without ruining anything else. Thanks!
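An alternative that avoids touching the template at all is to truncate the offending fields in Logstash before they reach Elasticsearch. A sketch using the logstash-filter-truncate plugin; the field names are taken from the report above and the 32766-byte cap matches Lucene's keyword term limit (adjust field paths to how your pipeline actually names them):

```
filter {
  # Cap oversized HTTP bodies so the keyword mapping can index them
  truncate {
    fields       => ["[http][http_request_body_printable]", "[http][http_response_body]"]
    length_bytes => 32766
  }
}
```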

Unable to Index Events

Dear Team,
I am new to open source. I have followed the guide to install Suricata and used Rob's synesis_lite. All seems to be well, but when I import the Kibana dashboards I am unable to see any data, and I get "Could not index event to Elasticsearch. {:status=>400, ...}" errors like:

```
"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [http.content_range] of type [keyword] in document with id 'zwhjAngBHO889NJY5rF1'
"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [dns.flags] of type [long] in document with id 'XAhjAngBHO889NJY77dZ'.
```

I hope that is enough information to point you in the right direction. I appreciate any guidance on resolving this

Server: VM
OS: Ubuntu 20.04
ELK: 7.11.1
Suricata: 6.0.1
Thanks

Failed to install template. {:message=>"Got response code '400' contacting Elasticsearch at URL

Hi there,
Environment: Version: 6.8.9, Build: default/rpm/be2c7bf/2020-05-04T17:00:34.323820Z, JVM: 1.8.0_252

Have synesis_lite_snort working perfectly; now trying to get synesis_lite_suricata working. There is a constant error:

```
Failed to install template. {:message=>"Got response code '400' contacting Elasticsearch at URL http://127.0.0.1:9200
```

Even though I can curl to that URL and navigate Elasticsearch, it won't load the template for synesis_lite_suricata. I have checked the path to the template in the 30_ output config file and it is correct. So it seems to be something in the template itself that won't allow it to install.

@robcowart any ideas?

Suricata Kibana Dashboard no data displayed

Hi all, I am using Kibana 7.1.1 and Filebeat 7.1.1, but no data is captured and collected to visualize on the dashboard.

The eve.json logs for Suricata are located in /var/log/suricata, and the data is being collected and growing, which shows that Suricata monitoring itself is running fine.

Strangely, the dashboard has no data. Is something wrong with my Logstash, i.e. did it not run properly? My other question: can we use Filebeat instead of Logstash, and how would that be done?

Please advise.

Problem Importing Suricata Index Pattern to Kibana

Hello all. I have been following the installation guide, but I am having problems after Step 9 when setting up Kibana: I have not been able to import the Index Patterns correctly.

My system: ELK 6.7.2 with Filebeat and Suricata 4.1.4 on Ubuntu 18.04, installed on a VPS with a public IP address. I can access the Kibana UI just by entering my fully qualified domain name (no port 5601 needed), and it prompts for a username and password to bring up the page.

When I run the curl command with my username and password, the FQDN with the port, and the path to my synlite_suricata.index_pattern.json and synlite_suricata_stats.index_pattern.json files, I get a "failed to connect - connection timed out" error. I have also tried leaving the username and password out of the curl command.

When I removed the port from the URL path in the curl command, I got the following output:

```
<title>301 Moved Permanently</title>
301 Moved Permanently
nginx/1.14.0 (Ubuntu)
```

Then, when I tried to import synlite_suricata.dashboards.json using the Kibana UI (Management > Saved Objects > Import), I got an error message:

"Index Pattern Conflicts
The following saved objects use index patterns that do not exist. Please select the index patterns you'd like re-associated with them. You can create a new index pattern if necessary.

suricata-*"

Here is one of the errors that shows up in /var/log/logstash/logstash-plain.log:

```
Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"suricata-1.0.1-2019.05.14", :_type=>"doc", :routing=>nil}, #LogStash::Event:0x2165f318], :response=>{"index"=>{"_index"=>"suricata-1.0.1-2019.05.14", "_type"=>"doc", "_id"=>"bWqJt2oBfxlZ2agOhqWc", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [event.host] of type [keyword] in document with id 'bWqJt2oBfxlZ2agOhqWc'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:685"}}}}}
```

Is there another way to bring in the Index Patterns for suricata to Kibana?

Can anyone provide me a solution or if I need to provide any other information to help troubleshoot?

Thank you.

Could not index to Elasticsearch

I am getting this error:

```
[2020-09-01T11:56:07,289][WARN ][logstash.outputs.elasticsearch][synlite_suricata][7f0f636925cafdc45ccbf6445a1562dacede6781ba4cf6f1b34e30bf21e877ba] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"suricata-1.1.0-2020.08.29", :routing=>nil, :_type=>"_doc"}, #LogStash::Event:0x768d173c], :response=>{"index"=>{"_index"=>"suricata-1.1.0-2020.08.29", "_type"=>"_doc", "_id"=>"_XhPSXQBd2cgVu2RSa1s", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [snmp.vars] of type [float] in document with id '_XhPSXQBd2cgVu2RSa1s'. Preview of field's value: '1.3.6.1.2.1.25.3.2.1.5.1'", "caused_by"=>{"type"=>"number_format_exception", "reason"=>"multiple points"}}}}}
```
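The template enables numeric_detection, so if the first `snmp.vars` value seen in a new daily index happened to look numeric, the field gets mapped as `float`, and OID strings like `1.3.6.1...` then fail with "multiple points". One hedged workaround (an assumption, not a confirmed project fix) is to pin the field to `keyword` with a dynamic-template entry in the index template, so newly created indices map it correctly:

```
{
  "dynamic_templates": [
    {
      "snmp_vars": {
        "path_match": "snmp.vars",
        "mapping": { "type": "keyword" }
      }
    }
  ]
}
```

Existing indices keep their old mapping; the fix only takes effect for indices created after the template change.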

Can you please help me?

Can Suricata or Snort analyze NetFlow data, like ElastiFlow?

Hi Rob,
Recently I have wanted to integrate an IDS with ELK to analyze network attacks. My question: can Suricata or Snort analyze NetFlow data instead of data from the local host's interface? I want to send NetFlow data to Suricata or Snort and then visualize the attack/alert data via ELK. I don't know if this is feasible; looking forward to your reply. Thanks in advance.

synesis_lite_suricata install validity

Good day Mr. Rob, hope you are doing well. I wanted to find out if this setup is still valid, because I'm new to the Elastic Stack. I tried setting this up but kept getting errors related to Suricata indexing in the Kibana dashboards. This is a really useful tool and I wanted to use it for the statistics in my honours research paper. Your assistance would be highly appreciated.

Best Regards

Logstash excessive memory usage

I installed both synesis_lite v1.1.0 and ElastiFlow 4.0.1 on ELK 7.10 (basic license). I set the Logstash heap to 4GB and the Elasticsearch heap to 32GB on a physical server with 64GB of RAM and swap disabled. Nothing else is installed on either ELK or the server itself.

VIRT usage for Elasticsearch rose to 42GB and Logstash to 22GB, effectively eating all memory and causing OOMs in other processes, even though ELK is currently doing nothing, as I do not route any logs/NetFlow to the server yet! I had to limit Elasticsearch to 24GB, and now it looks like this (again, just sitting quietly):

```
  PID USER     PR NI  VIRT   RES   SHR S %CPU %MEM    TIME+ COMMAND
27060 logstash 20  0 22.4g  5.0g 26360 S 11.9  8.0 26:51.28 java
26502 elastic+ 20  0 34.0g 25.5g 29664 S  0.3 40.5  4:14.21 java
```

I really do not understand Logstash using 20GB of RAM. Do you have any explanation or suggestion? Is there anything I could do to keep Logstash from such excessive memory usage?

Logstash error: "Error interpreting the template of the input - range can't iterate over /.../eve.json"

We at GekkoFyre Networks are receiving the following error and cannot proceed any further as a result:

```
2020-01-07T22:53:57.506+1100    INFO    [monitoring]    log/log.go:154  Uptime: 3.029140895s
2020-01-07T22:53:57.506+1100    INFO    [monitoring]    log/log.go:131  Stopping metrics logging.
2020-01-07T22:53:57.506+1100    INFO    instance/beat.go:435    filebeat stopped.
2020-01-07T22:53:57.506+1100    ERROR   instance/beat.go:916    Exiting: Error getting config for fileset suricata/eve: Error interpreting the template of the input: template: text:3:22: executing "text" at <.paths>: range can't iterate over /var/log/suricata/eve.json
Exiting: Error getting config for fileset suricata/eve: Error interpreting the template of the input: template: text:3:22: executing "text" at <.paths>: range can't iterate over /var/log/suricata/eve.json
[root@barker ~]#
```
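That Go template error ("range can't iterate over ...") usually means `var.paths` was given a bare string where the Filebeat module expects a list it can iterate over. A hedged sketch of the module config (typically `modules.d/suricata.yml`; the path below mirrors the one in the error):

```yaml
- module: suricata
  eve:
    enabled: true
    # var.paths must be a YAML list, not a bare string, because the module's
    # input template iterates ("range") over it
    var.paths: ["/var/log/suricata/eve.json"]
```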

Everything else works fine otherwise, including Filebeat, Metricbeat, Packetbeat, Elasticsearch, Kibana, and Heartbeat, across a cluster of about a dozen servers (a mix of VPSes and dedicated servers).

We're not sure why we are receiving this error, but would appreciate any and all advice on how to proceed from here, thank you.

Unable to index more than 8 GB of Suricata logs

Dear Team,

we are unable to index more than 8 GB of network data; below is the error. Please advise.

```
WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400,
```

```
yellow open suricata_stats-1.1.0-2019.11.19                      E56hJBsWT7SLwwdBe9fAxQ 3 1     54  0   1.3mb   1.3mb
green  open .opendistro-alerting-alerts                          IGu4a-sYR9K5qsSc1MonzQ 1 0      0  0    283b    283b
green  open .kibana_1                                            pNwqmVMpT2auYw4awigqTw 1 0    912 53 425.2kb 425.2kb
green  open .opendistro-alerting-alert-history-2019.11.18-000002 YEElqcjvSgeplAIg3SCPUw 1 0      0  0    283b    283b
yellow open suricata-1.1.0-2019.11.19                            409tfWehQvynPZKOHEpj3Q 3 1 591230  0 156.2mb 156.2mb
yellow open suricata-1.1.0-2019.11.18                            ZY4zVSvhRx2BR8kYl_kCGg 3 1   2945  0   1.3mb   1.3mb
yellow open suricata_stats-1.1.0-2019.11.18                      cExiCSn0TxCeY90qTJZYMA 3 1      4  0 150.3kb 150.3kb
```

Elasticsearch Index Templates incompatible with ES 7.x

The import of the Elasticsearch index templates by Logstash is incompatible with Elasticsearch 7.x, since mapping types were removed in 7.0:

https://www.elastic.co/guide/en/elasticsearch/reference/current/removal-of-types.html

```
curl -X PUT http://127.0.0.1:9200/_template/synlite-suricata-1.0.1 -H "Content-Type: application/json" -H "kbn-xsrf: true" -d @synlite_suricata.template.json
```

The response:

```
{"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"Root mapping definition has unsupported parameters:  [_default_ : {numeric_detection=true,
```
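Until a 7.x-compatible release of the templates is in place, one possible stopgap is to hoist the mapping body out of the removed `_default_` type before installing the template. A minimal, self-contained sketch; the inline template below is a tiny stand-in, not the real synlite template:

```shell
# Write a miniature 6.x-style template, then promote the "_default_" mapping
# body to the top level, as ES 7.x requires.
cat > /tmp/synlite_6x.template.json <<'EOF'
{"index_patterns":"suricata-1.0.1-*","mappings":{"_default_":{"numeric_detection":true,"properties":{"@timestamp":{"type":"date"}}}}}
EOF

python3 - <<'EOF'
import json

with open('/tmp/synlite_6x.template.json') as f:
    template = json.load(f)

# ES 7.x rejects the "_default_" wrapper entirely, so hoist its body.
template['mappings'] = template['mappings'].pop('_default_')

with open('/tmp/synlite_7x.template.json', 'w') as f:
    json.dump(template, f, indent=2)

print('top-level mapping keys:', sorted(template['mappings']))
EOF
```

The rewritten file can then be PUT to the `_template` endpoint as in the curl command shown above.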

Suricata Stats

Hi @robcowart
Just a question: I am sending the Suricata eve file via Filebeat to my ELK server. I see there is a stats log file as well, but when specifying this log in Filebeat on the pfSense server, the entries are not getting ingested into the ELK server. The normal eve events work really well, and thank you so much!

Any advice on what to check for?

Data from pfsense

Hi Rob,
Support for Filebeat on BSD usually lags behind the releases; as such, there is only a 6.7 Filebeat package for BSD (which isn't compatible with ES/LS 7.x).
I use Suricata on pfSense, which has the option to dump JSON format (I think) into syslog and export via barnyard2 (so a syslog input on Logstash).

It also has the option to send the eve.json directly to redis.

I was wondering if you had any thoughts on the best way I could get the data to Logstash without having to edit the filters too much.

The syslog looks like this when the eve.json dump is turned on:

```
Aug 6 17:33:07 pfsense-hostname suricata[82723]: {"timestamp": "2019-08-06T17:33:07.881056+1000", "flow_id": 790425064649987, "in_iface": "igb0", "event_type": "ssh", "src_ip": "x.x.x.x", "src_port": 40889, "dest_ip": "x.x.x.x", "dest_port": 22, "proto": "TCP", "ssh": {"client": {"proto_version": "2.0", "software_version": "OpenSSH_7.5-hpn14v5"}, "server": {"proto_version": "2.0", "software_version": "OpenSSH_7.5-hpn14v5"}}}
```

My guess is I need to set up the syslog input to strip all the data before {"timestamp..., and then add the Filebeat event.type: suricata field; then it should be able to just use the same filter?
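That approach should work. A hedged sketch of a Logstash filter for the syslog input that strips the syslog prefix and parses the embedded EVE JSON; the intermediate field name `eve_json` is illustrative, and whether the synlite filters key on `[event][type]` exactly as set here is an assumption worth verifying against the pipeline:

```
filter {
  # Pull the JSON payload out of the syslog line emitted by pfSense/Suricata
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP} %{SYSLOGHOST:syslog_host} suricata(?:\[%{POSINT:pid}\])?: %{GREEDYDATA:eve_json}" }
  }
  json {
    source       => "eve_json"
    remove_field => ["eve_json"]
  }
  # Mimic the field Filebeat would have set so the synlite filters match
  # (assumption about which field the filters key on)
  mutate { add_field => { "[event][type]" => "suricata" } }
}
```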

Cheers

Logstash stuck while loading GeoIP DB

So, I followed the process as described, and when I try to run Logstash and check the log, it gets stuck at this point. It has been more than 20 minutes, so I thought it might be some kind of issue.

The issue I am talking about is at the end of the log file, but I include everything in case it is needed.

[2018-12-28T23:14:33,717][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"} [2018-12-28T23:14:33,724][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"} [2018-12-28T23:14:34,203][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.4"} [2018-12-28T23:14:34,324][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600} [2018-12-28T23:14:59,212][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"synlite_suricata", "pipeline.workers"=>3, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50} [2018-12-28T23:14:59,577][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://192.168.31.125:9200/]}} [2018-12-28T23:14:59,580][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://192.168.31.125:9200/, :path=>"/"} [2018-12-28T23:14:59,720][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://192.168.31.125:9200/"} [2018-12-28T23:14:59,772][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6} [2018-12-28T23:14:59,772][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the typeevent field won't be used to determine the document _type {:es_version=>6} [2018-12-28T23:14:59,788][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>"/etc/logstash/synlite_suricata/templates/synlite_suricata_stats.template.json"} [2018-12-28T23:14:59,799][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"order"=>0, "version"=>10001, "index_patterns"=>"suricata_stats-1.0.1-*", "settings"=>{"index"=>{"number_of_shards"=>3, "number_of_replicas"=>1, "refresh_interval"=>"10s", 
"codec"=>"best_compression"}}, "mappings"=>{"_default_"=>{"numeric_detection"=>true, "dynamic_templates"=>[{"string_fields"=>{"match_mapping_type"=>"string", "match"=>"*", "mapping"=>{"type"=>"keyword"}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "event"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"host"=>{"type"=>"keyword"}, "subtype"=>{"type"=>"keyword"}, "type"=>{"type"=>"keyword"}}}, "node"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"ipaddr"=>{"type"=>"ip"}, "hostname"=>{"type"=>"keyword"}}}, "stats"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"app_layer"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"flow"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"dcerpc_tcp"=>{"type"=>"long"}, "dcerpc_udp"=>{"type"=>"long"}, "dnp3"=>{"type"=>"long"}, "dns_tcp"=>{"type"=>"long"}, "dns_udp"=>{"type"=>"long"}, "failed_tcp"=>{"type"=>"long"}, "failed_udp"=>{"type"=>"long"}, "ftp"=>{"type"=>"long"}, "http"=>{"type"=>"long"}, "imap"=>{"type"=>"long"}, "msn"=>{"type"=>"long"}, "modbus"=>{"type"=>"long"}, "smb"=>{"type"=>"long"}, "smtp"=>{"type"=>"long"}, "ssh"=>{"type"=>"long"}, "tls"=>{"type"=>"long"}}}, "tx"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"dcerpc_tcp"=>{"type"=>"long"}, "dcerpc_udp"=>{"type"=>"long"}, "dnp3"=>{"type"=>"long"}, "dns_tcp"=>{"type"=>"long"}, "dns_udp"=>{"type"=>"long"}, "ftp"=>{"type"=>"long"}, "http"=>{"type"=>"long"}, "modbus"=>{"type"=>"long"}, "smb"=>{"type"=>"long"}, "smtp"=>{"type"=>"long"}, "ssh"=>{"type"=>"long"}, "tls"=>{"type"=>"long"}}}}}, "capture"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"kernel_drops"=>{"type"=>"long"}, "kernel_packets"=>{"type"=>"long"}}}, "defrag"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"ipv4"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"fragments"=>{"type"=>"long"}, "reassembled"=>{"type"=>"long"}, "timeouts"=>{"type"=>"long"}}}, "ipv6"=>{"dynamic"=>true, "type"=>"object", 
"properties"=>{"fragments"=>{"type"=>"long"}, "reassembled"=>{"type"=>"long"}, "timeouts"=>{"type"=>"long"}}}, "max_frag_hits"=>{"type"=>"long"}}}, "decoder"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"avg_pkt_size"=>{"type"=>"long"}, "bytes"=>{"type"=>"long"}, "dce"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"pkt_too_small"=>{"type"=>"long"}}}, "erspan"=>{"type"=>"long"}, "ethernet"=>{"type"=>"long"}, "gre"=>{"type"=>"long"}, "icmpv4"=>{"type"=>"long"}, "icmpv6"=>{"type"=>"long"}, "ieee8021ah"=>{"type"=>"long"}, "invalid"=>{"type"=>"long"}, "ipraw"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"invalid_ip_version"=>{"type"=>"long"}}}, "ipv4"=>{"type"=>"long"}, "ipv4_in_ipv6"=>{"type"=>"long"}, "ipv6"=>{"type"=>"long"}, "ipv6_in_ipv6"=>{"type"=>"long"}, "ltnull"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"pkt_too_small"=>{"type"=>"long"}, "unsupported_type"=>{"type"=>"long"}}}, "max_pkt_size"=>{"type"=>"long"}, "mpls"=>{"type"=>"long"}, "null"=>{"type"=>"long"}, "pkts"=>{"type"=>"long"}, "ppp"=>{"type"=>"long"}, "pppoe"=>{"type"=>"long"}, "raw"=>{"type"=>"long"}, "sctp"=>{"type"=>"long"}, "sll"=>{"type"=>"long"}, "tcp"=>{"type"=>"long"}, "teredo"=>{"type"=>"long"}, "udp"=>{"type"=>"long"}, "vlan"=>{"type"=>"long"}, "vlan_qinq"=>{"type"=>"long"}}}, "detect"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"alert"=>{"type"=>"long"}}}, "dns"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"memcap_global"=>{"type"=>"long"}, "memcap_state"=>{"type"=>"long"}, "memuse"=>{"type"=>"long"}}}, "file_store"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"open_files"=>{"type"=>"long"}}}, "flow"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"emerg_mode_entered"=>{"type"=>"long"}, "emerg_mode_over"=>{"type"=>"long"}, "icmpv4"=>{"type"=>"long"}, "icmpv6"=>{"type"=>"long"}, "memcap"=>{"type"=>"long"}, "memuse"=>{"type"=>"long"}, "spare"=>{"type"=>"long"}, "tcp"=>{"type"=>"long"}, "tcp_reuse"=>{"type"=>"long"}, 
"udp"=>{"type"=>"long"}}}, "flow_mgr"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"bypassed_pruned"=>{"type"=>"long"}, "closed_pruned"=>{"type"=>"long"}, "est_pruned"=>{"type"=>"long"}, "flows_checked"=>{"type"=>"long"}, "flows_notimeout"=>{"type"=>"long"}, "flows_removed"=>{"type"=>"long"}, "flows_timeout"=>{"type"=>"long"}, "flows_timeout_inuse"=>{"type"=>"long"}, "new_pruned"=>{"type"=>"long"}, "rows_busy"=>{"type"=>"long"}, "rows_checked"=>{"type"=>"long"}, "rows_empty"=>{"type"=>"long"}, "rows_maxlen"=>{"type"=>"long"}, "rows_skipped"=>{"type"=>"long"}}}, "http"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"memcap"=>{"type"=>"long"}, "memuse"=>{"type"=>"long"}}}, "tcp"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"insert_data_normal_fail"=>{"type"=>"long"}, "insert_data_overlap_fail"=>{"type"=>"long"}, "insert_list_fail"=>{"type"=>"long"}, "invalid_checksum"=>{"type"=>"long"}, "no_flow"=>{"type"=>"long"}, "memuse"=>{"type"=>"long"}, "overlap"=>{"type"=>"long"}, "overlap_diff_data"=>{"type"=>"long"}, "pseudo"=>{"type"=>"long"}, "pseudo_failed"=>{"type"=>"long"}, "reassembly_gap"=>{"type"=>"long"}, "reassembly_memuse"=>{"type"=>"long"}, "rst"=>{"type"=>"long"}, "segment_memcap_drop"=>{"type"=>"long"}, "sessions"=>{"type"=>"long"}, "ssn_memcap_drop"=>{"type"=>"long"}, "stream_depth_reached"=>{"type"=>"long"}, "syn"=>{"type"=>"long"}, "synack"=>{"type"=>"long"}}}}}, "uptime"=>{"type"=>"long"}, "tags"=>{"type"=>"keyword"}}}}}} [2018-12-28T23:14:59,828][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/synlite-suricata_stats-1.0.1 [2018-12-28T23:14:59,913][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//192.168.31.125:9200"]} [2018-12-28T23:14:59,919][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://192.168.31.125:9200/]}} [2018-12-28T23:14:59,920][INFO 
][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://192.168.31.125:9200/, :path=>"/"} [2018-12-28T23:14:59,927][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://192.168.31.125:9200/"} [2018-12-28T23:14:59,957][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6} [2018-12-28T23:14:59,957][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: thetype event field won't be used to determine the document _type {:es_version=>6} [2018-12-28T23:14:59,963][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>"/etc/logstash/synlite_suricata/templates/synlite_suricata.template.json"} [2018-12-28T23:14:59,978][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"order"=>0, "version"=>10001, "index_patterns"=>"suricata-1.0.1-*", "settings"=>{"index"=>{"number_of_shards"=>3, "number_of_replicas"=>1, "refresh_interval"=>"10s", "codec"=>"best_compression"}}, "mappings"=>{"_default_"=>{"numeric_detection"=>true, "dynamic_templates"=>[{"tcp.ack"=>{"path_match"=>"tcp.ack", "mapping"=>{"type"=>"boolean"}}}, {"tcp.cwr"=>{"path_match"=>"tcp.cwr", "mapping"=>{"type"=>"boolean"}}}, {"tcp.ece"=>{"path_match"=>"tcp.ece", "mapping"=>{"type"=>"boolean"}}}, {"tcp.fin"=>{"path_match"=>"tcp.fin", "mapping"=>{"type"=>"boolean"}}}, {"tcp.psh"=>{"path_match"=>"tcp.psh", "mapping"=>{"type"=>"boolean"}}}, {"tcp.rst"=>{"path_match"=>"tcp.rst", "mapping"=>{"type"=>"boolean"}}}, {"tcp.syn"=>{"path_match"=>"tcp.syn", "mapping"=>{"type"=>"boolean"}}}, {"tcp.urg"=>{"path_match"=>"tcp.urg", "mapping"=>{"type"=>"boolean"}}}, {"string_fields"=>{"match_mapping_type"=>"string", "match"=>"*", "mapping"=>{"type"=>"keyword"}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "alert"=>{"dynamic"=>true, "type"=>"object", 
"properties"=>{"action"=>{"type"=>"keyword"}, "category"=>{"type"=>"keyword"}, "cve"=>{"type"=>"keyword"}, "gid"=>{"type"=>"long"}, "rev"=>{"type"=>"long"}, "severity"=>{"type"=>"long"}, "signature"=>{"type"=>"keyword"}, "signature_id"=>{"type"=>"long"}}}, "app_proto"=>{"type"=>"keyword"}, "autonomous_system"=>{"type"=>"keyword"}, "city"=>{"type"=>"keyword"}, "client_asn"=>{"type"=>"long"}, "client_autonomous_system"=>{"type"=>"keyword"}, "client_city"=>{"type"=>"keyword"}, "client_country"=>{"type"=>"keyword"}, "client_geo_location"=>{"type"=>"geo_point"}, "client_hostname"=>{"type"=>"keyword"}, "client_ip"=>{"type"=>"ip"}, "country"=>{"type"=>"keyword"}, "dest_asn"=>{"type"=>"long"}, "dest_autonomous_system"=>{"type"=>"keyword"}, "dest_city"=>{"type"=>"keyword"}, "dest_country"=>{"type"=>"keyword"}, "dest_geo_location"=>{"type"=>"geo_point"}, "dest_hostname"=>{"type"=>"keyword"}, "dest_ip"=>{"type"=>"ip"}, "dest_port"=>{"type"=>"long"}, "dest_port_name"=>{"type"=>"keyword"}, "dest_rep_tags"=>{"type"=>"keyword"}, "dns"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"id"=>{"type"=>"long"}, "rcode"=>{"type"=>"keyword"}, "rdata"=>{"type"=>"keyword"}, "rrname"=>{"type"=>"keyword"}, "rrtype"=>{"type"=>"keyword"}, "ttl"=>{"type"=>"long"}, "tx_id"=>{"type"=>"long"}, "type"=>{"type"=>"keyword"}}}, "event"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"host"=>{"type"=>"keyword"}, "subtype"=>{"type"=>"keyword"}, "type"=>{"type"=>"keyword"}}}, "fileinfo"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"filename"=>{"type"=>"keyword"}, "gaps"=>{"type"=>"boolean"}, "size"=>{"type"=>"long"}, "state"=>{"type"=>"keyword"}, "stored"=>{"type"=>"boolean"}, "tx_id"=>{"type"=>"long"}}}, "flow"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"age"=>{"type"=>"long"}, "alerted"=>{"type"=>"boolean"}, "bytes"=>{"type"=>"long"}, "bytes_toclient"=>{"type"=>"long"}, "bytes_toserver"=>{"type"=>"long"}, "end"=>{"type"=>"date"}, "pkts"=>{"type"=>"long"}, 
"pkts_toclient"=>{"type"=>"long"}, "pkts_toserver"=>{"type"=>"long"}, "reason"=>{"type"=>"keyword"}, "start"=>{"type"=>"date"}, "state"=>{"type"=>"keyword"}}}, "flow_id"=>{"type"=>"long"}, "http"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"hostname"=>{"type"=>"keyword"}, "http_content_type"=>{"type"=>"keyword"}, "http_method"=>{"type"=>"keyword"}, "http_refer"=>{"type"=>"keyword"}, "http_user_agent"=>{"type"=>"keyword"}, "length"=>{"type"=>"long"}, "protocol"=>{"type"=>"keyword"}, "redirect"=>{"type"=>"keyword"}, "status"=>{"type"=>"long"}, "url"=>{"type"=>"keyword"}, "useragent_app"=>{"type"=>"keyword"}, "useragent_app_ver"=>{"type"=>"keyword"}, "useragent_device"=>{"type"=>"keyword"}, "useragent_os"=>{"type"=>"keyword"}, "useragent_os_ver"=>{"type"=>"keyword"}, "xff"=>{"type"=>"keyword"}}}, "icmp_code"=>{"type"=>"long"}, "icmp_type"=>{"type"=>"long"}, "in_iface"=>{"type"=>"keyword"}, "ip_version"=>{"type"=>"keyword"}, "log"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"message"=>{"type"=>"keyword"}, "severity"=>{"type"=>"keyword"}, "tags"=>{"type"=>"keyword"}}}, "node"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"ipaddr"=>{"type"=>"ip"}, "hostname"=>{"type"=>"keyword"}}}, "proto"=>{"type"=>"keyword"}, "rep_tags"=>{"type"=>"keyword"}, "server_asn"=>{"type"=>"long"}, "server_autonomous_system"=>{"type"=>"keyword"}, "server_city"=>{"type"=>"keyword"}, "server_country"=>{"type"=>"keyword"}, "server_geo_location"=>{"type"=>"geo_point"}, "server_hostname"=>{"type"=>"keyword"}, "server_ip"=>{"type"=>"ip"}, "service_name"=>{"type"=>"keyword"}, "service_port"=>{"type"=>"long"}, "src_asn"=>{"type"=>"long"}, "src_autonomous_system"=>{"type"=>"keyword"}, "src_city"=>{"type"=>"keyword"}, "src_country"=>{"type"=>"keyword"}, "src_geo_location"=>{"type"=>"geo_point"}, "src_hostname"=>{"type"=>"keyword"}, "src_ip"=>{"type"=>"ip"}, "src_port"=>{"type"=>"long"}, "src_port_name"=>{"type"=>"keyword"}, "src_rep_tags"=>{"type"=>"keyword"}, 
"tcp"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"ecn"=>{"type"=>"boolean"}, "state"=>{"type"=>"keyword"}, "tcp_flags"=>{"type"=>"keyword"}, "tcp_flags_tc"=>{"type"=>"keyword"}, "tcp_flags_ts"=>{"type"=>"keyword"}}}, "tls"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"fingerprint"=>{"type"=>"keyword"}, "issuerdn"=>{"type"=>"keyword"}, "notafter"=>{"type"=>"date"}, "notbefore"=>{"type"=>"date"}, "serial"=>{"type"=>"keyword"}, "session_resumed"=>{"type"=>"boolean"}, "sni"=>{"type"=>"keyword"}, "subject"=>{"type"=>"keyword"}, "version"=>{"type"=>"keyword"}}}, "tx_id"=>{"type"=>"long"}, "vars"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"flowints"=>{"dynamic"=>true, "type"=>"object", "properties"=>{"applayer.anomaly.count"=>{"type"=>"long"}, "http.anomaly.count"=>{"type"=>"long"}, "smtp.anomaly.count"=>{"type"=>"long"}, "tcp.retransmission.count"=>{"type"=>"long"}, "tls.anomaly.count"=>{"type"=>"long"}}}}}, "vlan"=>{"type"=>"long"}, "tags"=>{"type"=>"keyword"}, "traffic_locality"=>{"type"=>"keyword"}}}}}} [2018-12-28T23:14:59,995][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/synlite-suricata-1.0.1 [2018-12-28T23:15:00,055][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//192.168.31.125:9200"]} [2018-12-28T23:15:00,150][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/etc/logstash/synlite_suricata/geoipdbs/GeoLite2-City.mmdb"} [2018-12-28T23:15:00,286][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/etc/logstash/synlite_suricata/geoipdbs/GeoLite2-ASN.mmdb"} [2018-12-28T23:15:22,991][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/etc/logstash/synlite_suricata/geoipdbs/GeoLite2-City.mmdb"} [2018-12-28T23:15:22,993][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/etc/logstash/synlite_suricata/geoipdbs/GeoLite2-ASN.mmdb"}

The thing is, when I remove the filter part from conf.d and leave only one output, I do get logs. But with the full configuration the pipeline seems to be stuck forever.
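One way to isolate whether the filter stage is the bottleneck is to run a stripped-down pipeline that reads the same eve.json input and prints straight to stdout. This is a hypothetical debugging sketch, not part of the synesis_lite_suricata configuration; the file path is an assumption and should be adjusted to your Suricata setup:

```
# minimal test pipeline: input and output only, no filters
# (path below is an assumed default Suricata eve.json location)
input {
  file {
    path => "/var/log/suricata/eve.json"
    codec => "json"
    start_position => "beginning"
  }
}

output {
  stdout {
    codec => "rubydebug"
  }
}
```

If events flow here but not with the full configuration, the hang is in the filter stage (or in the Elasticsearch output back-pressure) rather than in the input.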

failed to parse field [http.content_range] of type [keyword]

Hi,

I have ELK 7.6.2 on Ubuntu 18.04, and I send logs from pfSense using Beats 6.8.7.

[2020-04-23T11:36:46,752][WARN ][logstash.outputs.elasticsearch][synlite_suricata] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"suricata-1.1.0-2020.04.22", :routing=>nil, :_type=>"_doc"}, #LogStash::Event:0x6cdceb6e], :response=>{"index"=>{"_index"=>"suricata-1.1.0-2020.04.22", "_type"=>"_doc", "_id"=>"gDwupnEBHbGBrxJRn8Jk", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [http.content_range] of type [keyword] in document with id 'gDwupnEBHbGBrxJRn8Jk'. Preview of field's value: '{size=127499264, start=3519, raw=bytes 3519-3717/127499264, end=3717}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:404"}}}}}

Could you advise the correct type?
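Judging from the error preview, Suricata emits `http.content_range` as an object (`{size, start, raw, end}`), while the template's dynamic `string_fields` rule coerces any unmapped string-like field to `keyword`, which cannot hold an object. One possible fix (a sketch, not confirmed by the maintainers) is to map the field explicitly as an object inside the `http` properties of `synlite_suricata.template.json`, so the dynamic rule never applies to it:

```json
{
  "content_range": {
    "type": "object",
    "properties": {
      "raw":   { "type": "keyword" },
      "size":  { "type": "long" },
      "start": { "type": "long" },
      "end":   { "type": "long" }
    }
  }
}
```

After editing the template, the existing `_template/synlite-suricata-*` entry would need to be reinstalled and a new daily index created before the change takes effect, since mappings on existing indices cannot be changed in place.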
Thank you.
