etherflows's People

Contributors

enteee


etherflows's Issues

flowgen id

A user should be able to define the id of the flowgen (3 bytes long). The last 3 bytes of the source address of the generated frames (the non-vendor part) should be set accordingly. If no id is specified, the flowgen id should be chosen at random. The flowworker should also be able to extract the flowgen id from the source MAC and append it to the packet as env.flowgen.
There is one problem left to solve: in the cluster configuration we have the following filter in place:

ether src b4:be:b1:6b:00:b5

This ensures that we only process packets coming from a flowgen. I have to find a solution that also works with dynamically generated flowgen ids. A 'filter all packets from a specific vendor' BPF rule would be handy, but I don't think this is possible; investigating.
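
A minimal sketch of the id handling in Python; the vendor prefix b4:be:b1 is assumed from the filter above, and the helper names are hypothetical:

import random

VENDOR_PREFIX = "b4:be:b1"  # assumed vendor (OUI) part, taken from the filter above

def make_flowgen_id(flowgen_id=None):
    """Return a 3-byte flowgen id; pick one at random if none was specified."""
    if flowgen_id is None:
        flowgen_id = random.getrandbits(24)
    return flowgen_id & 0xFFFFFF

def flowgen_src_mac(flowgen_id):
    """Build the source MAC of generated frames: vendor part + 3-byte flowgen id."""
    return "%s:%02x:%02x:%02x" % (
        VENDOR_PREFIX,
        (flowgen_id >> 16) & 0xFF,
        (flowgen_id >> 8) & 0xFF,
        flowgen_id & 0xFF,
    )

def extract_flowgen_id(src_mac):
    """Recover the flowgen id (env.flowgen) from a frame's source MAC."""
    return int("".join(src_mac.split(":")[3:]), 16)

For the vendor-filter question, a raw byte match such as ether[6:4] & 0xffffff00 = 0xb4beb100 (untested) might be worth checking, since pcap filter expressions can address the source MAC bytes directly.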

Pcaps submit

It would be cool if we could submit a pcap file for analysis purposes. The pcap would then be analyzed by the flowworker.
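
A minimal sketch in Python, assuming scapy for pcap reading and a hypothetical per-frame entry point process_frame on the flowworker:

from scapy.all import rdpcap

def submit_pcap(path, worker):
    """Replay a submitted pcap through the normal flowworker processing path."""
    for frame in rdpcap(path):              # load all frames from the pcap file
        worker.process_frame(bytes(frame))  # hypothetical flowworker API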

flowworker: reset delay

If the flowworker does not process any traffic, it never calls write_frame and therefore never updates Worker.delay. This means the last Worker.delay keeps being transmitted to the flowgen, and that value might be quite high. So we have to find a way to reset Worker.delay once the worker is idle.
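
One possible sketch in Python, with hypothetical attribute names and an assumed idle timeout; the delay calculation in write_frame is only a placeholder:

import time

IDLE_TIMEOUT = 1.0  # assumed: seconds without a processed frame before the delay is reset

class Worker:
    def __init__(self):
        self.delay = 0.0
        self._last_frame = time.monotonic()

    def write_frame(self, frame):
        now = time.monotonic()
        self.delay = now - self._last_frame  # placeholder for the real delay calculation
        self._last_frame = now
        # ... forward the frame ...

    def tick(self):
        """Called periodically; clears the reported delay once the worker is idle."""
        if time.monotonic() - self._last_frame > IDLE_TIMEOUT:
            self.delay = 0.0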

Filter alert

The idea is to write a packet filter for Logstash which operates on JSON data.
The following configuration explains what this filter plugin does:

filter {
    jqfilter {
        filter => [
            "JSON query string", "rule uuid"
        ]
        target => "field name"
    }
}

This plugin matches the JSON query string against the JSON representation of the event and, if it matches, adds the rule uuid to the field specified by target.
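
The matching logic could look like the following Python sketch; the real plugin would be a Logstash filter written in Ruby, and the query evaluator here is only a stand-in (a dotted path plus an expected value) rather than a full JSON query language:

def jqfilter(event, query_path, expected, rule_uuid, target):
    """If the event matches the query, append rule_uuid to event[target]."""
    value = event
    for key in query_path.split("."):    # walk the dotted path into the event
        if not isinstance(value, dict) or key not in value:
            return event                 # no match: leave the event untouched
        value = value[key]
    if value == expected:
        event.setdefault(target, []).append(rule_uuid)
    return event

# Example: tag events whose eth.src matches a known flowgen MAC.
event = {"eth": {"src": "b4:be:b1:6b:00:b5"}}
jqfilter(event, "eth.src", "b4:be:b1:6b:00:b5",
         rule_uuid="123e4567-e89b-12d3-a456-426614174000", target="matched_rules")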

Garbage collection

It can currently happen that a packet is processed by multiple flowworkers, because a switch floods frames with an unknown destination MAC to all switch ports. We have to eliminate duplicate packets in the ES database.

Ideas for how we can do this:

  1. Buffer one packet per flow and only send it to the DB when we get another packet from the same flow (flowworker.py ++); see the sketch after this list.
  2. Smart periodic cleanup of the database with a query-and-delete script (cronjob-like) that deletes duplicate packets.
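
A literal sketch of idea 1 in Python; the flow key, the packet representation, and the index_to_db callable are all hypothetical:

_buffered = {}  # flow key -> the one packet currently buffered for that flow

def handle_packet(flow_key, packet, index_to_db):
    """Only ship a buffered packet to the DB once another packet of the same flow arrives."""
    previous = _buffered.get(flow_key)
    if previous is not None:
        index_to_db(previous)       # a second packet of the flow arrived: flush the buffered one
    _buffered[flow_key] = packet    # keep the newest packet buffered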
