
synesis_lite_snort's People

Contributors

robcowart


synesis_lite_snort's Issues

Error: Uncaught TypeError: Cannot read property 'scripted' of undefined

Dear All,

Good day! We are currently running ELK stack version 6.7.1 (build 20266) in production.

I'm trying to build a dashboard for our security monitoring tool, and I encountered the following error after uploading the index patterns.

Deprecation warning: value provided is not in a recognized RFC2822 or ISO format. moment construction falls back to js Date(), which is not reliable across all browsers and versions. Non RFC2822/ISO date formats are discouraged and will be removed in an upcoming major release. Please refer to http://momentjs.com/guides/#/warnings/js-date/ for more info.
Arguments:
[0] _isAMomentObject: true, _isUTC: false, _useUTC: false, _l: undefined, _i: now-15m, _f: undefined, _strict: undefined, _locale: [object Object]
Error
at Function.createFromInputFallback (https://localhost/built_assets/dlls/vendors.bundle.dll.js:82:102025)
at https://localhost/built_assets/dlls/vendors.bundle.dll.js:82:113223
at https://localhost/built_assets/dlls/vendors.bundle.dll.js:82:113274
at ye (https://localhost/built_assets/dlls/vendors.bundle.dll.js:82:113564)
at https://localhost/built_assets/dlls/vendors.bundle.dll.js:82:113954
at ge (https://localhost/built_assets/dlls/vendors.bundle.dll.js:82:114016)
at pe (https://localhost/built_assets/dlls/vendors.bundle.dll.js:82:114048)
at e (https://localhost/built_assets/dlls/vendors.bundle.dll.js:82:99054)
at Function.convertTimeToUTCString (https://localhost/bundles/commons.bundle.js:3:488713)
at DashboardStateManager.handleTimeChange (https://localhost/bundles/commons.bundle.js:4:434379)
kibana#/dashboard/e3c64770-6cab-11e8-acab-b78fd1091474?_g=h@44136fa&_a=h@4f010da:1 Mixed Content: The page at 'https://localhost/app/kibana#/dashboard/e3c64770-6cab-11e8-acab-b78fd1091474?_g=h@44136fa&_a=h@4f010da' was loaded over HTTPS, but requested an insecure image 'http://www.koiossian.com/public/snort.gif'. This content should also be served over HTTPS.
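
The "Cannot read property 'scripted' of undefined" error usually means a saved visualization references a field that is missing from the index pattern's field list, rather than anything to do with the moment warnings above. A minimal diagnostic sketch, assuming the Kibana 6.7 saved objects API at the same https://localhost as the trace (the kibanaadmin user is borrowed from a later issue here and may not match your setup):

curl -s -u kibanaadmin -H "kbn-xsrf: true" \
  "https://localhost/api/saved_objects/_find?type=index-pattern"

If the snort-* pattern comes back with an empty or truncated fields attribute, refreshing the field list under Management > Index Patterns in Kibana usually regenerates it. The mixed-content warning at the end is unrelated: the dashboard simply references the snort.gif logo over plain http.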

Having trouble getting this working

All,

I just need a bit of configuration help. I've followed the installation directions as closely as possible, but I must be doing something wrong. In Kibana I can see the log entries in Discover under snort-*, but when I go to the dashboard everything is zero (except for the total counts), and when I hover over filter fields a pop-up says the field "doesn't exist on any documents in the snort-* index pattern".

I know it must be something easy, something I've overlooked, so if I could be directed where to look, that would be awesome.

Thanks in advance for any help,
Cheers,
Mike
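
A hedged first check: when Discover shows documents but the dashboard's fields "don't exist on any documents", the events were usually indexed without being parsed into the expected fields, most often because the project's index template was never installed or the grok stage isn't matching. Assuming Elasticsearch on localhost:9200 and the template name that appears in a later issue here:

curl -s 'http://localhost:9200/_template/synlite-snort-1.0.0?pretty'

An empty {} response means the Logstash elasticsearch output never installed the template; after fixing that, refresh the index pattern's field list under Management > Index Patterns.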

Could not index event to Elasticsearch.

logstash | [2019-04-29T15:33:41,424][WARN ][logstash.outputs.elasticsearch]
Could not index event to Elasticsearch.
{:status=>400, :action=>["index", {:_id=>nil,
:_index=>"snort-1.0.0-2019.04.29", :_type=>"doc",
:_routing=>nil}, #<LogStash::Event:0x2717df61>],
:response=>{"index"=>{"_index"=>"snort-1.0.0-2019.04.29", "_type"=>"doc",
"_id"=>"eBW6aWoBhbNbdByl8QlP", "status"=>400,
"error"=>{"type"=>"mapper_parsing_exception",
"reason"=>"failed to parse field [event.host] of type [keyword]",
"caused_by"=>{"type"=>"illegal_state_exception",
"reason"=>"Can't get text on a START_OBJECT at 1:115"}}}}}

HTTP Responses and Requests > 32766 Not Able to be Analyzed

Hope you are well!

I ran across a few issues this morning (I'm sure they're not specific to this project) when gathering HTTP traffic: some of the really large values for http_request_body_printable and http_response_body fail to index because they are larger than 32766 bytes. I saw a few options online for ignoring text fields above a certain limit, for example:

{
  "logs_template": {
    "template": "logs*",
    "mappings": {
      "_default_": {
        "_all": {
          "enabled": false
        },
        "dynamic_templates": [
          {
            "notanalyzed": {
              "match": "*",
              "match_mapping_type": "string",
              "mapping": {
                "ignore_above": 512,
                "type": "string",
                "index": "not_analyzed",
                "doc_values": true
              }
            }
          }
        ]
      }
    }
  }
}

I was looking at your template and was wondering how this could best fit in without ruining anything else. Thanks!
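
For reference, the snippet above is pre-5.x mapping syntax (the string type and not_analyzed index mode were removed in Elasticsearch 5.0). A hedged sketch of the same idea in current syntax: Lucene rejects single terms larger than 32766 bytes, and ignore_above counts characters (up to 4 bytes each in UTF-8), which is why the Elasticsearch docs suggest 32766 / 4 = 8191 as a safe ceiling. How to merge such a dynamic template into this project's template without clobbering its explicit field mappings is exactly the open question here.

{
  "dynamic_templates": [
    {
      "strings_as_keyword": {
        "match_mapping_type": "string",
        "mapping": {
          "type": "keyword",
          "ignore_above": 8191
        }
      }
    }
  ]
}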

logstash error

Hi, I copied your project, and when I use it I get this error:

[2020-01-07T15:56:58,402][ERROR][logstash.outputs.elasticsearch][synlite_snort] Failed to install template. {:message=>"Got response code '400' contacting Elasticsearch at URL 'http://127.0.0.1:9200/_template/synlite-snort-1.0.0'", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError", :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.2.3-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:80:in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.2.3-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:332:in perform_request_to_url'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.2.3-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:319:in block in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.2.3-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:414:in with_connection'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.2.3-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:318:in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.2.3-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:326:in block in Pool'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.2.3-java/lib/logstash/outputs/elasticsearch/http_client.rb:352:in template_put'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.2.3-java/lib/logstash/outputs/elasticsearch/http_client.rb:86:in template_install'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.2.3-java/lib/logstash/outputs/elasticsearch/template_manager.rb:28:in install'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.2.3-java/lib/logstash/outputs/elasticsearch/template_manager.rb:16:in install_template'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.2.3-java/lib/logstash/outputs/elasticsearch/common.rb:134:in install_template'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.2.3-java/lib/logstash/outputs/elasticsearch/common.rb:51:in block in setup_after_successful_connection'"]}

and this:

[2020-01-07T15:57:16,357][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][synlite_snort] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.

Can you help me?
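
A 400 on template install usually means this Elasticsearch version rejects something in the template body; for example, 7.x refuses templates that still nest mappings under a type such as "doc" unless include_type_name=true is passed. A hedged way to surface the actual error, assuming the project's template JSON is saved locally (the file name is illustrative):

curl -s -X PUT 'http://127.0.0.1:9200/_template/synlite-snort-1.0.0' \
  -H 'Content-Type: application/json' \
  -d @synlite_snort.template.json

The response body will contain the concrete reason, which the Logstash log above reduces to a bare status code.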

[event][host] problem

Logstash error:

[2018-10-26T20:14:26,568][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"snort-1.0.0-2018.10.26", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x749d4fc9>], :response=>{"index"=>{"_index"=>"snort-1.0.0-2018.10.26", "_type"=>"doc", "_id"=>"lB-VsWYB9Ov0hgQBpt4F", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [event.host]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:608"}}}}}
[2018-10-26T20:14:26,573][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"snort-1.0.0-2018.10.26", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x688bf2c6>], :response=>{"index"=>{"_index"=>"snort-1.0.0-2018.10.26", "_type"=>"doc", "_id"=>"lR-VsWYB9Ov0hgQBpt4H", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [event.host]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:608"}}}}}
[2018-10-26T20:14:41,307][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"snort-1.0.0-2018.10.26", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x5da37f7e>], :response=>{"index"=>{"_index"=>"snort-1.0.0-2018.10.26", "_type"=>"doc", "_id"=>"lh-VsWYB9Ov0hgQB396Y", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [event.host]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:608"}}}}}

CVE Graph

Hello, this is not an issue, but a query.
Great work you have here. I was wondering how you got the CVE data for the triggered signatures. I tried looking at your dashboard, but it refers to a visualization which you have not posted.

I was wondering if you could share the visualization, and how you got the CVE details inside. This is a really useful feature.

Thank you in advance!

Index to elasticsearch not possible

Hello,
I'm encountering the following issue in the Logstash log:

[2019-01-18T11:53:18,482][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"snort-1.0.0-2019.01.18", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x7292b14b>], :response=>{"index"=>{"_index"=>"snort-1.0.0-2019.01.18", "_type"=>"doc", "_id"=>"UYXOYGgB9Zyq9vh65aGM", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [event.host] of type [keyword]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:77"}}}}}

I installed the ELK stack fresh.
Filebeat is shipping logs from the Snort machine to the server where the ELK stack is running. I can also see the index in Elasticsearch's Index Management.

As far as I understand, there must be some problem correctly indexing the data.

I am quite new to this topic and not aware of any way to debug this further.

Thank you all in advance
Greetings
Simon
P.S.: Sorry for posting in the wrong project.

You are using a deprecated config setting "document_type" set in elasticsearch

Hello All,

Fairly new to the ELK stack, I encountered the following warning during my setup.

'[WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch bulk_path=>"/_xpack/monitoring/_bulk?system_id=logstash&system_api_version=6&interval=1s", hosts=>[http://localhost:9200], sniffing=>false, manage_template=>false, id=>"aa9ca250418a0d4ca15e4f43714f32837666af91a6d80d8f12351a90b6be9d81", document_type=>"%{[@metadata][document_type]}", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_993be7a6-e875-499f-93e7-aa84272d4faa", enable_metric=>true, charset=>"UTF-8">, workers=>1, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, ilm_enabled=>false, ilm_rollover_alias=>"logstash", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy", action=>"index", ssl_certificate_verification=>true, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2019-06-06T14:14:01,621][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50}
[2019-06-06T14:14:01,640][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-06-06T14:14:01,648][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-06-06T14:14:01,656][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2019-06-06T14:14:01,656][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2019-06-06T14:14:01,680][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2019-06-06T14:14:01,728][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>".monitoring-logstash", :thread=>"#<Thread:0x1fdf29b6 sleep>"}
[2019-06-06T14:14:01,737][INFO ][logstash.agent ] Pipelines running {:count=>2, :running_pipelines=>[:synlite_snort, :".monitoring-logstash"], :non_running_pipelines=>[]}
[2019-06-06T14:14:02,134][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}'

Please let me know what particular changes I need to make.

Thanks and cheers!
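
Document types were deprecated in Elasticsearch 6.0 and removed in 7.0, so the usual change is simply to delete the document_type setting from the elasticsearch output; on 6.x the deprecated setting keeps working in the meantime. A hedged sketch of the cleaned-up output, with illustrative host and index values modeled on the other issues here:

output {
  elasticsearch {
    hosts => [ "http://localhost:9200" ]
    index => "snort-1.0.0-%{+YYYY.MM.dd}"
    # document_type => "doc"   # deprecated since ES 6.0; remove it and let
                               # Elasticsearch apply its default type
  }
}

Note that the warning in the log above is raised by the internal .monitoring-logstash pipeline (see the bulk_path in the dump), so it can persist even after the project's own pipeline is cleaned up.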

Snort logs from pfSense are tagged "__snort_alert_fast_pattern_mismatch"

Hello,
I am trying to use this nice project to get Snort alerts from pfSense into Elasticsearch.
I wrote a grok filter, since I saw that the current filters don't suit the pfSense Snort format, which is:
"08/15/19-12:37:53.466898 ,1,2012247,2,\"ET P2P BTWebClient UA uTorrent in use\",TCP,213.123.237.19,26730,178.79.242.19,80,0,Potential Corporate Privacy Violation,1",

The grok filter I use is as follows:
%{SNALTM:[snort_timestamp]}%{SPACE},%{NONNEGINT:[gid]},%{NONNEGINT:[sid]},%{NONNEGINT:[rev]},(?:\\\")%{GREEDYDATA:[signature]}(?:\\\"),%{NOTCURLYCLOSE:[proto]},%{IP:[src_ip]},%{INT:[src_port]},%{IP:[dest_ip]},%{INT:[dest_port]},%{NOTSQRCLOSE:[class]},%{NONNEGINT:[priority]},?.*$

with the added pattern SNALTM %{DATE_US}-%{TIME}.
With all of the above I still get the following message in ES. Any idea how to ingest it correctly?
Please advise.
Thanks
{"_id":"fa6MlGwBvXhfIZ9HQqtv","_type":"_doc","_index":"snort-1.0.0-2019.08.15","@timestamp":["2019-08-15T09:11:04.736Z"],"input":{"type":"log"},"@version":"1.0.0","node":{"ipaddr":"212.143.237.1","hostname":"gfn-fw-bsh.gefen.local"},"event":{"message":"08/15/19-12:11:04.639465 ,1,2027397,2,\"ET POLICY Spotify P2P Client\",UDP,213.123.237.147,57621,213.133.237.255,57621,427,Not Suspicious Traffic,3","type":"snort","host":{"name":"gfn-fw-bsh.n.local"}},"log":{"file":{"path":"/var/log/snort/snort_igb15370/alert"}},"tags":["__snort_alert_fast_pattern_mismatch"]}

import synlite_snort.index_pattern.json 400 error Bad Request

Hi,

When I try to import the index pattern:
root@ids:/# curl -X POST -u kibanaadmin http://127.0.0.1:5601/api/saved_objects/index-pattern/snort-* -H "Content-Type: application/json" -H "kbn-xsrf: true" -d @~/ELK/synesis_lite_snort/kibana/synlite_snort.index_pattern.json
Warning: Couldn't read data from file
Warning: "~/ELK/synesis_lite_snort/kibana/synlite_snort.index_pattern.json",
Warning: this makes an empty POST.
Enter host password for user 'kibanaadmin':

I get this error:
{"statusCode":400,"error":"Bad Request","message":"\"value\" must be an object","validation":{"source":"payload","keys":["value"]}}

Do I need to change something in the import value?
I am running version 6.8.3 of Kibana.
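
The two warnings above already show the root cause: the shell does not expand ~ inside -d @~/..., so curl could not read the file and sent an empty POST, which Kibana then rejects with "\"value\" must be an object". A hedged re-run with the path spelled out; everything else is copied unchanged from the command above:

curl -X POST -u kibanaadmin "http://127.0.0.1:5601/api/saved_objects/index-pattern/snort-*" \
  -H "Content-Type: application/json" -H "kbn-xsrf: true" \
  -d @"$HOME/ELK/synesis_lite_snort/kibana/synlite_snort.index_pattern.json"

Quoting the URL also stops the shell from glob-expanding the * in snort-*.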
