bounty_tools's People

Contributors

gradiuscypher


bounty_tools's Issues

Build Docker Image with recon-ng

Build a Docker image with recon-ng to be deployed on DigitalOcean. This will be a first pass to see how running recon tools in a Docker image on DigitalOcean will work. Other tools may be added in the future.

Improve help messages

Currently, if you pass the wrong values to some arguments, no help message is printed and the tool fails silently. We need to help the user figure out why no action was taken.
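
A sketch of explicit validation, reusing switch names that appear elsewhere in this tracker (`--createvm`, `--droplet`, `--workspace`, `--domains`); the `check_args` helper itself is hypothetical:

```python
import argparse

def build_parser():
    # Switch names taken from the existing CLI; everything else is a sketch.
    p = argparse.ArgumentParser(prog="bounty_tools.py")
    p.add_argument("--reconng", action="store_true")
    p.add_argument("--createvm", action="store_true")
    p.add_argument("--droplet")
    p.add_argument("--workspace")
    p.add_argument("--domains")
    return p

def check_args(args):
    """Return an explanatory message instead of silently taking no action."""
    if args.reconng and not ((args.createvm or args.droplet)
                             and args.workspace and args.domains):
        return ("Required arguments not passed. Need either --createvm or "
                "--droplet to execute, along with --workspace and --domains")
    return None
```

The caller can then print the returned message (or `parser.print_help()`) instead of exiting silently.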

Modular Usage - functionality split into modules

The tooling should be split into modules to allow easier editing and integration:

  • Core
  • Connection management/Queuing
  • Data Storage
  • Recon
  • Scanning
  • Enrichment
  • Reporting/Alerting

Under each of these modules should be submodules that contain specific functionality, for example:

  • Core

    • Core plugin interaction, loading
  • Data Storage

    • Local
    • MySQL
  • Recon

    • recon-ng
    • shodan
  • Scanning

    • masscan/nmap
    • burp
  • Enrichment

    • shodan
    • ipinfo
    • other public APIs
  • Reporting/Alerting

    • Google Docs
    • Elastic
    • Console
    • Emails
    • Push notifications
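
Dynamic loading over this layout could look like the sketch below; the dotted module paths are assumptions about the final package structure:

```python
import importlib

# Hypothetical dotted paths following the module layout above; the real
# names depend on the final package structure.
PLUGIN_PATHS = {
    "reconng": "recon.reconng",
    "shodan": "enrichment.shodan",
    "elastic": "data_storage.elastic",
}

def load_plugin(name, paths=PLUGIN_PATHS):
    """Dynamically import the module that implements a named plugin."""
    return importlib.import_module(paths[name])
```

Private plugins could then be supported simply by adding entries to the path map without touching core code.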

Service Information

A plugin that gathers service information based on active ports. It will gather things like:

  • webpage titles
  • redirects
  • etc
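
Grabbing a webpage title, for instance, could use the stdlib parser once the response body has been fetched; `page_title` is a hypothetical helper operating on already-downloaded HTML:

```python
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collect the text inside the first <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def page_title(html):
    parser = TitleParser()
    parser.feed(html)
    return parser.title.strip()
```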

ElasticSearch - Rolling window for data

Have a rolling window for hosts to be checked against. For example:

Limit the host check to the last 7 days; that way, if we filter our Elasticsearch queries to 7 days, we know we have the most up-to-date hosts for that period. Duplicate hosts may exist outside of that window. We can also track historical changes to hosts, or when they are and are not present.
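
The window could be expressed with Elasticsearch date math in a range filter; the `timestamp` field name is an assumption about the host documents:

```python
def rolling_window_query(days=7, field="timestamp"):
    """Bool filter restricting host documents to the last `days` days.

    `field` is assumed to be the indexing timestamp on each host doc.
    """
    return {
        "query": {
            "bool": {
                "filter": [
                    {"range": {field: {"gte": "now-%dd/d" % days}}}
                ]
            }
        }
    }
```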

Project reorganization - Development ease of use and separation of duties

Take the current code and migrate to a more segmented approach. Currently, using a single argparse object causes naming collisions between switches.

Core focus should be around ease of use and development.

Things to address:

  • Name collision between arguments

  • Separation of duties, e.g. plugins should also handle importing their data to DBs

  • Updated documentation, inline and otherwise.

  • Better consideration and plan for private plugins and how to properly import them. Possibly consider dynamic importing of plugins.
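
Subparsers are one way to resolve the argument collisions: each plugin owns its own argument namespace, so two plugins can both define `--workspace` without clashing. Plugin names here are illustrative:

```python
import argparse

def build_parser():
    # Each plugin registers its own subparser, so switch names only need
    # to be unique within a plugin, not across the whole tool.
    parser = argparse.ArgumentParser(prog="bounty_tools.py")
    subparsers = parser.add_subparsers(dest="plugin")

    recon = subparsers.add_parser("reconng")
    recon.add_argument("--workspace")

    shodan = subparsers.add_parser("shodan")
    shodan.add_argument("--workspace")  # no collision with reconng's switch
    return parser
```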

Remove all data persistence modules except for Elastic

There's not really a reason to continue including the complexity of supporting more than Elastic when Elastic provides everything we want.

Clean up all of the code related to anything other than persisting the data in Elastic. This should also cut down on the amount of command switches.

This task includes cleaning up the import-style switches like --dbimport, etc.

Client/Server System: queuing, scheduling, task polling

Rebuild of core system functions. Tasks include (but are not limited to):

  • Queuing system to allow tasks to be load balanced across scan engines.
  • Scheduling to allow execution of tasks on specific intervals
  • Task polling to determine when a process has been completed on a scan engine, to allow multiple tasks to be non-blocking
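
The scheduling piece could be sketched with the stdlib `sched` module; `schedule_every` is a hypothetical helper, not existing code:

```python
import sched
import time

def schedule_every(scheduler, interval, action, runs):
    """Re-enter `action` every `interval` seconds for `runs` iterations."""
    def step(remaining):
        action()
        if remaining > 1:
            # re-schedule ourselves until the run count is exhausted
            scheduler.enter(interval, 1, step, (remaining - 1,))
    scheduler.enter(0, 1, step, (runs,))
```

A real implementation would likely swap `time.sleep` for an event loop so scheduled scans don't block task polling.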

Logging of actions taken

Script should log actions taken to a file which gets downloaded before the droplet is destroyed.

All actions taken should be logged; this will help in debugging the automation.

Tool Integration: recon-ng

This is the task for integrating recon-ng into the Docker image, the Scanner REST API, and the Server's data persistence and interaction.

bounty_tools server features

These features should be included in the centralized server:

  • exporting to Elasticsearch for investigation
  • centralized collection of active hosts from each target, updated as scans complete
  • list of active scanners and their connectivity details
  • alert lists
  • push/webhook notifications
  • expose aquatone screenshots?

Document bounty_tools architecture

The idea behind this new set of bounty tools is as follows: a set of images that can be deployed in the cloud or locally to run various bug bounty tools. Each image runs on a Scanner instance that is controlled by a REST API. This task is to document data flow, data storage, etc.

A central Server controls all scanners and contains all historical data, as well as manages things like scheduling scans, sending alerts, enriching data, etc.

A user interface will control the central server. Ideally this will be a web app, but it will start with CLI tools.

Explore vagrant/chef/docker for box provisioning

This sort of provisioning might be better than the current paramiko+shell-script approach, which is a dirty hack: a ton of setup time is lost waiting for SSH and box configuration.

Maintain an overall map of a target

For example, I want to be able to see all of the things related to a target: hosts, ports, hostnames, etc. We should persist data even if it's no longer live, but mark it as such.

automation.py - work queue system appears to fail during start sometimes

Sometimes when an automation loop is started with distribution, one of the workers immediately falls into a "DONE" state, claiming the work queue is empty. E.g.:

(venv3) gradius@ubuntu:~/github/bounty_tools$ ./bounty_tools.py --bulkrecon --hostjson target_hosts.json --distribute 3 --reconng
Required arguments not passed. Need either --createvm or --droplet to execute, along with --workspace and --domains
DONE
48720710 Grabbing work...{}
48720711 Grabbing work...{}
Done working...
48720710 Grabbing work...{}
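
One plausible cause (an assumption, not confirmed against the code) is a race where workers start before the queue is populated, so the first `get` sees an empty queue and the worker declares itself DONE. Filling the queue before starting any worker avoids that:

```python
import queue
import threading

def run_distributed(tasks, workers=3):
    """Distribute callables across worker threads.

    The queue is fully populated *before* any worker starts, so an empty
    queue genuinely means the work is done rather than not-yet-enqueued.
    """
    work_q = queue.Queue()
    for t in tasks:
        work_q.put(t)

    results = []
    lock = threading.Lock()

    def worker():
        while True:
            try:
                task = work_q.get_nowait()
            except queue.Empty:
                return  # genuinely done: the queue was drained
            outcome = task()
            with lock:
                results.append(outcome)

    threads = [threading.Thread(target=worker) for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```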

Enrichment - Censys.io

Use Censys.io to determine open ports.

Should add to DBs the same way that Shodan would add ports.

E.g.: in Elastic, add a doc_type called port and link it to the host via _id.
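
A minimal sketch of such a port document; the field names (`host_id`, `protocol`, `source`) are illustrative, and `doc_type` mirrors the issue's wording (newer Elasticsearch versions dropped mapping types, so this may need adapting):

```python
def port_doc(host_id, port, proto="tcp", source="censys"):
    """Build a port document linked to its parent host via the host's _id."""
    return {
        "doc_type": "port",
        "host_id": host_id,   # the Elasticsearch _id of the host document
        "port": port,
        "protocol": proto,
        "source": source,     # "censys" or "shodan" should look identical
    }
```

Because Shodan-sourced ports would use the same shape, queries over open ports don't need to care which enrichment plugin found them.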

REST API for interacting with Scanner tools

Part of this design requires a REST API to interact with the tools on the Scanner Docker images. For each tool you should be able to do the following:

  • Launch tool
  • Retrieve status of tool
  • Retrieve result of tool

Other functionality might be:

  • Stop tool
  • Distribute tool run between multiple Scanners
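
The contract above can be sketched as an in-memory task registry that a thin REST layer on each Scanner would wrap with launch/status/result endpoints; all names here are illustrative:

```python
import uuid

TASKS = {}  # task_id -> task record; a real Scanner might persist this

def launch_tool(tool, args=None):
    """POST /tools/<tool> equivalent: start a run, return its task id."""
    task_id = uuid.uuid4().hex
    TASKS[task_id] = {"tool": tool, "args": args or {},
                      "status": "running", "result": None}
    return task_id

def tool_status(task_id):
    """GET /tasks/<id>/status equivalent."""
    return TASKS[task_id]["status"]

def finish_tool(task_id, result):
    """Called by the tool wrapper when a run completes."""
    TASKS[task_id].update(status="done", result=result)

def tool_result(task_id):
    """GET /tasks/<id>/result equivalent."""
    return TASKS[task_id]["result"]
```

Stopping a tool and distributing a run across Scanners would layer on top of the same task-id scheme.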

in modular_try2, the import_to_db introduces duplicates

During a second run against an existing database, the function that imports to a local DB creates new "AltHost" entries for each host, since it's seeing the same IP a second time.

Ideally we shouldn't import a host that has the same hostname as an AltHost.
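
A sketch of the dedup check, using an in-memory dict in place of the real DB layer; the function name and record shape are hypothetical:

```python
def merge_host(db, ip, hostname):
    """Attach a hostname to a host record, skipping exact duplicates.

    Returns True if the hostname was imported, False if it was already
    present as an AltHost for that IP.
    """
    host = db.setdefault(ip, {"ip": ip, "hostnames": set()})
    if hostname in host["hostnames"]:
        return False  # same IP + same hostname: nothing to import
    host["hostnames"].add(hostname)
    return True
```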

Improve spin-up speed of DigitalOcean VMs

The DO VMs take quite a long time to spin up and become usable. Figure out how to improve this speed. This may include setting up regional VM images in DigitalOcean as well.

Identify Host IP

For each IP address we have, we should get as much identifying information on it as possible. Things like whois, location, ownership, etc.

This can then be written to Elasticsearch. First we should attempt an update approach, so that we can work in batches rather than during the recon cycle. If that doesn't work, we can integrate it into the recon cycle before indexing a host.
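
The batch approach could build partial-update actions in the shape Elasticsearch's bulk helpers expect; the enrichment field names are assumptions:

```python
def enrichment_update(host_id, whois=None, location=None, owner=None):
    """Partial-update action for a host document's enrichment fields.

    Only fields that were actually resolved are included, so a batch run
    never clobbers data gathered earlier.
    """
    fields = {"whois": whois, "location": location, "owner": owner}
    return {
        "_op_type": "update",
        "_id": host_id,
        "doc": {k: v for k, v in fields.items() if v is not None},
    }
```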
