anon-exploiter / sitebroker

A cross-platform Python-based utility for information gathering and penetration testing automation!

License: MIT License

Python 98.16% Dockerfile 0.44% Jupyter Notebook 1.40%
python penetration-testing web-application-security cross-platform-python penetration-automation information-gathering docker-image wapt

sitebroker's Introduction

SiteBroker


A cross-platform Python-based utility for information gathering and penetration automation!

I don't know if I was on weed when I made this, or maybe it was just to practice php-cli & Python. Of no real use, tbh! :)

Output

SiteBroker's full output (screenshot)

Easy way (Docker Image)

You can simply build and run the Docker image to test SiteBroker without installing the Python packages system-wide or in a virtualenv.

docker build -t sitebroker .
docker run -it --rm sitebroker

Requirements

  • Python (3.6.* - 3.7.*)
  • Python pip3
  • Python module requests
  • Python module colorama
  • Python module dnspython
  • Python module bs4
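
If you want to confirm these modules are importable before running the script, a quick pre-flight check could look like the sketch below. This is illustrative only and not part of SiteBroker itself.

# Hypothetical pre-flight check: confirm the listed modules import cleanly
# in the same interpreter that will run SiteBroker.py (dnspython installs
# as the "dns" package, hence "dns.resolver" below).
import importlib

for name in ("requests", "colorama", "dns.resolver", "bs4"):
    try:
        importlib.import_module(name)
        print(f"[+] {name} is available")
    except ImportError:
        print(f"[-] {name} is missing -- install it via pip")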

Install modules

pip install -r requirements.txt

Tested on

  • Windows 7/8/8.1
  • Kali Linux (2017.2)

Download SiteBroker

You can download the latest version of SiteBroker by cloning the GitHub repository.

git clone https://github.com/Anon-Exploiter/SiteBroker

Updates

  • Changed The Whole Script Into Python (Previously It Was Written In PHP)
  • Exceptions Covered For Both User Interrupts && Internal Issues!
  • Removed The NetCraft Module As We Would Need Selenium And PhantomJS For It (Ultimately Making The Script Slow!)
  • Fixed The Problem Of A Response Code Of '200' For Most Sites In The Admin Panel Finder && Shell Finder Modules

Change-log

  • Added New Features For Reverse IP (Via HackerTarget && YouGetSignal)
  • Added New Features For Crawling (Via Google, Bing && Manually With My Hands ;)
  • Added New Method For Subdomains Scanning! (Takes Some Time Though :p)

Usage

Initializing Script

python SiteBroker.py

Advanced Usage


Author: Syed Umar Arfeen 

Usage: python SiteBroker.py
A cross-platform python based utility for information gathering and penetration automation!

Options:

 1). Cloudflare Check.
 2). Website Crawler.
 	 |____ Google Based Crawling
 	 |____ Bing Based Crawling
 	 |____ Manually Crawling
 3). Reverse IP.
 	 |____ YouGetSignal Based
 	 |____ HackerTarget's API Based
 4). Information Gathering.
 	 |____ Whois Lookup
 	 |____ BrowserSpy Report
 5). Nameservers.
 6). WebSite Speed.
 7). Subdomains Scanner
 8). Shell Finder.
 9). Admin Panel Finder.
 10). Grab Banner.
 11). Everything.
  
  Example:
	python SiteBroker.py
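
The script is menu-driven: you pick an option by its index number, which is also why the Note further down warns against reordering modules. As a rough illustration only (the function names below are hypothetical, not SiteBroker's actual code), index-based dispatch usually looks like this:

# Hypothetical sketch of index-based menu dispatch; reordering the list
# entries would change which module each menu number runs.
def cloudflare_check(site):
    print(f"[+] Cloudflare check for {site}")

def website_crawler(site):
    print(f"[+] Crawling {site}")

def reverse_ip(site):
    print(f"[+] Reverse IP lookup for {site}")

# ... the remaining options would follow in the same order as the menu
MODULES = [cloudflare_check, website_crawler, reverse_ip]

website = input("Enter the target website: ")
choice = input("Choose an option: ")
if choice.isdigit() and 1 <= int(choice) <= len(MODULES):
    MODULES[int(choice) - 1](website)
else:
    print("[$] Please use an index value from the list")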

Screenshots

Stargazers over time


Note

Do not change the position of any module as given under Usage; doing so may cause the script to fail.
P.S. ~ Don't Change The Colors. They're Butiphul like this.
P.P.S. ~ For Dependencies -> Solution
	~ An0n 3xPloiTeR

sitebroker's People

Contributors

aminvakil, anon-exploiter, dependabot[bot], noraj, yoanndp


sitebroker's Issues

Error while applying choice 11

[#] Err0r: Kindly Report the err0r below to An0n3xPloiTeR :) (If Your Internet's Working ;)
"""
'href'
"""

[$] Thanks For Using :D
~ An0n 3xPloiTeR :)

Cannot build after Bump urllib3 from 1.25.8 to 1.26.5

pip install --no-cache-dir -r /usr/local/src/requirements.txt --upgrade
 ---> Running in e10440a4619e
Collecting beautifulsoup4==4.7.1
  Downloading beautifulsoup4-4.7.1-py3-none-any.whl (94 kB)
Collecting bs4==0.0.1
  Downloading bs4-0.0.1.tar.gz (1.1 kB)
Collecting certifi==2019.6.16
  Downloading certifi-2019.6.16-py2.py3-none-any.whl (157 kB)
Collecting chardet==3.0.4
  Downloading chardet-3.0.4-py2.py3-none-any.whl (133 kB)
Collecting colorama==0.4.1
  Downloading colorama-0.4.1-py2.py3-none-any.whl (15 kB)
Collecting dnspython==1.16.0
  Downloading dnspython-1.16.0-py2.py3-none-any.whl (188 kB)
Collecting idna==2.8
  Downloading idna-2.8-py2.py3-none-any.whl (58 kB)
Collecting requests==2.22.0
  Downloading requests-2.22.0-py2.py3-none-any.whl (57 kB)
Collecting soupsieve==1.9.2
  Downloading soupsieve-1.9.2-py2.py3-none-any.whl (34 kB)
Collecting urllib3==1.26.5
  Downloading urllib3-1.26.5-py2.py3-none-any.whl (138 kB)
INFO: pip is looking at multiple versions of idna to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of dnspython to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of <Python from Requires-Python> to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of colorama to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of chardet to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of certifi to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of bs4 to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of beautifulsoup4 to determine which version is compatible with other requirements. This could take a while.
ERROR: Cannot install -r /usr/local/src/requirements.txt (line 8) and urllib3==1.26.5 because these package versions have conflicting dependencies.

The conflict is caused by:
    The user requested urllib3==1.26.5
    requests 2.22.0 depends on urllib3!=1.25.0, !=1.25.1, <1.26 and >=1.21.1

To fix this you could try to:
1. loosen the range of package versions you've specified
2. remove package versions to allow pip attempt to solve the dependency conflict

ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/user_guide/#fixing-conflicting-dependencies
The command '/bin/sh -c pip install --no-cache-dir -r /usr/local/src/requirements.txt --upgrade' returned a non-zero code: 1

More info https://github.com/aminvakil/SiteBroker/runs/3328092485?check_suite_focus=true

Error in finding nameservers

---------------------------------------------------------------------------------------
Finding Nameservers Of 'http://testphp.vulnweb.com' ...
---------------------------------------------------------------------------------------

Traceback (most recent call last):
  File "SiteBroker.py", line 220, in <module>
    nameServers(website)
  File "/wd/modules/nameservers.py", line 25, in nameServers
    res = Nameservers(website, 'NS')
  File "/usr/local/lib/python2.7/site-packages/dns/resolver.py", line 1132, in query
    raise_on_no_answer, source_port)
  File "/usr/local/lib/python2.7/site-packages/dns/resolver.py", line 1053, in query
    raise_on_no_answer)
  File "/usr/local/lib/python2.7/site-packages/dns/resolver.py", line 234, in __init__
    raise NoAnswer(response=response)
dns.resolver.NoAnswer: The DNS response does not contain an answer to the question: testphp.vulnweb.com. IN NS

Python: 2.7.15
OS: slim-stretch
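
The NoAnswer here is normal DNS behaviour rather than a network fault: testphp.vulnweb.com has no NS records of its own, so the resolver returns an empty answer. A minimal, hedged sketch of guarding the lookup, assuming dnspython 1.x where dns.resolver.query() is the entry point (renamed resolve() in 2.x); the function below is illustrative, not the project's code:

import dns.resolver

def safe_nameservers(domain):
    # Query NS records but treat "no answer" / "no such domain" as an
    # empty result instead of letting the exception crash the script.
    try:
        answers = dns.resolver.query(domain, 'NS')
    except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
        return []
    return [str(rdata.target) for rdata in answers]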

Same Response Code For All URLs

Passing some sites through it, I found that some sites return the same response code of 200 for every URL in the admin panel && web shell finder.
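
Sites that answer 200 for every path are usually serving a custom "not found" page (a soft 404), so status codes alone cannot tell real admin panels or shells apart. A hedged sketch of one common workaround, comparing each hit against a deliberately bogus path; this is illustrative, not how SiteBroker actually does it:

import requests
from uuid import uuid4

def soft_404_baseline(base_url):
    # Fetch a path that almost certainly does not exist and remember how
    # the site answers it; candidate pages are then compared against this.
    resp = requests.get(f"{base_url}/{uuid4().hex}", allow_redirects=True, timeout=10)
    return resp.status_code, len(resp.text)

def looks_like_a_real_page(base_url, path, baseline):
    resp = requests.get(f"{base_url}/{path}", allow_redirects=True, timeout=10)
    if resp.status_code != 200:
        return False
    # A "found" page that is the same size as the known-missing page is
    # most likely the same custom error page being served again.
    return abs(len(resp.text) - baseline[1]) > 50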

Issue with Kali Linux 5.4.0 & Python 3.9

Hi,
After cd'ing into the SiteBroker directory on Kali Linux 2020 (with Python 3.9 and the necessary requirements installed) and running the script on a VM with:

Linux kali 5.4.0-kali3-amd64 #1 SMP Debian 5.4.13-1kali1 (2020-01-20) x86_64 GNU/Linux

the following error message is returned:

root@kali:~/SiteBroker# python SiteBroker.py
  File "SiteBroker.py", line 38
    val_Select = f"\t{r}[$] Please Use The Index Value From The List\n\t\t[+] Not By Your Own :/\n\t\t\t ~ An0n 3xPloiTeR \n"
                  ^
SyntaxError: invalid syntax

:(((
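
An f-string SyntaxError at startup almost always means the script is being parsed by Python 2 (on Kali of that era, plain python still pointed at 2.7); Python 3.6+ and 3.9 handle f-strings fine, so running python3 SiteBroker.py is the usual fix. Because the error happens at parse time, a version guard only helps if it lives in a small launcher file that contains no f-strings itself. A hypothetical sketch of such a launcher (the file name and message are illustrative, not part of the project):

# run.py -- hypothetical launcher: this file avoids f-strings so Python 2
# can at least parse it and print a useful message before exiting.
import sys

if sys.version_info < (3, 6):
    sys.exit("SiteBroker needs Python 3.6 or newer; try: python3 SiteBroker.py")

import SiteBroker  # imported only after the version check passes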

Need Help

λ python SiteBroker.py
Traceback (most recent call last):
  File "SiteBroker.py", line 24, in <module>
    from modules import BingCrawl, browserspyRep
  File "C:\Users\Lucky\Desktop\s\modules\__init__.py", line 28, in <module>
    from nameservers import nameServers
  File "C:\Users\Lucky\Desktop\s\modules\nameservers.py", line 21, in <module>
    from dns.resolver import query as Nameservers
ImportError: No module named dns.resolver
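
"No module named dns.resolver" simply means dnspython is not installed for the interpreter being used (note the Python 2 style error text, so pip and python may point at different installations here). Installing it for that exact interpreter with python -m pip install dnspython is the usual fix. A hedged sketch of a friendlier failure, with illustrative wording rather than the project's actual code:

# Hypothetical guard around the import that fails above.
try:
    from dns.resolver import query as Nameservers
except ImportError:
    raise SystemExit(
        "dnspython is missing for this interpreter; install it with the "
        "same Python you use to run SiteBroker, e.g. 'python -m pip install dnspython'."
    )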

error

Doing Reverse IP OF 'http://host' Via YGS! ...

Traceback (most recent call last):
  File "C:\Python37\lib\site-packages\urllib3\connectionpool.py", line 706, in urlopen
    chunked=chunked,
  File "C:\Python37\lib\site-packages\urllib3\connectionpool.py", line 445, in _make_request
    six.raise_from(e, None)
  File "<string>", line 3, in raise_from
  File "C:\Python37\lib\site-packages\urllib3\connectionpool.py", line 440, in _make_request
    httplib_response = conn.getresponse()
  File "C:\Python37\lib\http\client.py", line 1344, in getresponse
    response.begin()
  File "C:\Python37\lib\http\client.py", line 306, in begin
    version, status, reason = self._read_status()
  File "C:\Python37\lib\http\client.py", line 275, in _read_status
    raise RemoteDisconnected("Remote end closed connection without"
http.client.RemoteDisconnected: Remote end closed connection without response

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Python37\lib\site-packages\requests\adapters.py", line 449, in send
    timeout=timeout
  File "C:\Python37\lib\site-packages\urllib3\connectionpool.py", line 756, in urlopen
    method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
  File "C:\Python37\lib\site-packages\urllib3\util\retry.py", line 532, in increment
    raise six.reraise(type(error), error, _stacktrace)
  File "C:\Python37\lib\site-packages\urllib3\packages\six.py", line 769, in reraise
    raise value.with_traceback(tb)
  File "C:\Python37\lib\site-packages\urllib3\connectionpool.py", line 706, in urlopen
    chunked=chunked,
  File "C:\Python37\lib\site-packages\urllib3\connectionpool.py", line 445, in _make_request
    six.raise_from(e, None)
  File "<string>", line 3, in raise_from
  File "C:\Python37\lib\site-packages\urllib3\connectionpool.py", line 440, in _make_request
    httplib_response = conn.getresponse()
  File "C:\Python37\lib\http\client.py", line 1344, in getresponse
    response.begin()
  File "C:\Python37\lib\http\client.py", line 306, in begin
    version, status, reason = self._read_status()
  File "C:\Python37\lib\http\client.py", line 275, in _read_status
    raise RemoteDisconnected("Remote end closed connection without"
urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "SiteBroker.py", line 205, in <module>
    reverseViaYGS(website)
  File "C:\my_python\SiteBroker\modules\reverseip.py", line 35, in reverseViaYGS
    request = requests.post(url, headers=_headers, data=post)
  File "C:\Python37\lib\site-packages\requests\api.py", line 117, in post
    return request('post', url, data=data, json=json, **kwargs)
  File "C:\Python37\lib\site-packages\requests\api.py", line 61, in request
    return session.request(method=method, url=url, **kwargs)
  File "C:\Python37\lib\site-packages\requests\sessions.py", line 542, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Python37\lib\site-packages\requests\sessions.py", line 655, in send
    r = adapter.send(request, **kwargs)
  File "C:\Python37\lib\site-packages\requests\adapters.py", line 498, in send
    raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
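
RemoteDisconnected / ConnectionError means the remote service (yougetsignal.com in this run) closed the connection before sending a response, which can happen when it rate-limits or blocks automated clients. A minimal, hedged sketch of wrapping the POST with retries so the script degrades gracefully instead of crashing (function and parameter names are illustrative):

import time
import requests

def post_with_retry(url, headers=None, data=None, attempts=3, timeout=15):
    # Retry a few times with a simple backoff, then re-raise so the caller
    # can report the failure cleanly.
    for attempt in range(1, attempts + 1):
        try:
            return requests.post(url, headers=headers, data=data, timeout=timeout)
        except requests.exceptions.ConnectionError:
            if attempt == attempts:
                raise
            time.sleep(2 * attempt)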

No results are returned

No results are coming back. I tried the Cloudflare bypass and it is not working, and I tried the crawler but it does not return any results. Please see the attached screenshot.

Error while running ReverseIP via YGS

Tested on Kali 2018.4 amd64, fully updated, with all requirements installed.

Traceback (most recent call last):
  File "SiteBroker.py", line 211, in <module>
    reverseViaYGS(website)
  File "/root/kitploit/SiteBroker/modules/reverseip.py", line 49, in reverseViaYGS
    grab = json.loads(request)
  File "/usr/lib/python2.7/json/__init__.py", line 339, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 364, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 382, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
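
json.loads() is being handed something that is not valid JSON, typically an empty body or an HTML error page returned when the service rejects the request. A hedged sketch of decoding the response defensively; the endpoint, field name, and function name below are assumptions for illustration, not necessarily what SiteBroker or the service actually uses:

import requests

def reverse_ip_ygs(domain):
    resp = requests.post(
        "https://domains.yougetsignal.com/domains.php",  # assumed endpoint
        data={"remoteAddress": domain},
        timeout=15,
    )
    try:
        payload = resp.json()  # clearer intent than json.loads(resp.text)
    except ValueError:
        # Not JSON: the service refused the request or returned an error page.
        return []
    return payload.get("domainArray", [])  # assumed response field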

Google and manual crawlers don't work

I've installed your package on Alpine using these commands:

apk add --no-cache --virtual .build-deps gcc musl-dev libxml2-dev libxslt-dev \
&& pip install --no-cache-dir -r /usr/local/src/requirements.txt --upgrade \
&& apk del .build-deps

pip install doesn't give me any error (Successfully installed beautifulsoup4-4.7.1 bs4-0.0.1 certifi-2019.6.16 chardet-3.0.4 colorama-0.4.1 dnspython-1.16.0 idna-2.8 lxml-4.3.4 requests-2.22.0 soupsieve-1.9.2 urllib3-1.25.3), and python SiteBroker.py runs successfully except for the Google and manual crawlers, which give me these errors:

Google crawler:

Traceback (most recent call last):
  File "SiteBroker.py", line 97, in <module>
    GoogleCrawl(website)
  File "/usr/local/src/modules/crawler.py", line 17, in googleCrawl
    soup 	= BeautifulSoup(content, 'lxml')
  File "/usr/local/lib/python3.7/site-packages/bs4/__init__.py", line 196, in __init__
    % ",".join(features))
bs4.FeatureNotFound: Couldn't find a tree builder with the features you requested: lxml. Do you need to install a parser library?

Manual crawler:

Traceback (most recent call last):
  File "SiteBroker.py", line 105, in <module>
    ManualCrawl(website)
  File "/usr/local/src/modules/crawler.py", line 54, in manualCrawl
    soup 	= BeautifulSoup(request, 'lxml')
  File "/usr/local/lib/python3.7/site-packages/bs4/__init__.py", line 196, in __init__
    % ",".join(features))
bs4.FeatureNotFound: Couldn't find a tree builder with the features you requested: lxml. Do you need to install a parser library?

I've tested every other command and they all work fine.

I'm not a Python developer, but it seems from the error that I haven't installed something. Can you help me fix it?
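
bs4.FeatureNotFound means BeautifulSoup was asked for the lxml parser but cannot import lxml in that environment; on Alpine this commonly happens when only the -dev build packages were installed and then removed, leaving the libxml2/libxslt runtime libraries missing. Keeping those runtime packages installed usually fixes it; alternatively, the code could fall back to the parser bundled with the standard library. A hedged sketch of that fallback (illustrative, not SiteBroker's actual code):

from bs4 import BeautifulSoup

def make_soup(markup):
    # Prefer lxml when it is importable, otherwise use the pure-Python
    # html.parser that ships with the standard library.
    try:
        import lxml  # noqa: F401
        return BeautifulSoup(markup, "lxml")
    except ImportError:
        return BeautifulSoup(markup, "html.parser")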

Need help

wifiphisher 1.4 requires dbus-python, which is not installed.
wifiphisher 1.4 requires pbkdf2, which is not installed.
wifiphisher 1.4 requires PyRIC, which is not installed.
wifiphisher 1.4 requires roguehostapd, which is not installed.
nyawc 1.8.1 has requirement beautifulsoup4==4.6.0, but you'll have beautifulsoup4 4.6.3 which is incompatible.
nyawc 1.8.1 has requirement lxml==4.0.0, but you'll have lxml 4.2.5 which is incompatible.
nyawc 1.8.1 has requirement requests==2.18.4, but you'll have requests 2.19.1 which is incompatible.

Error while running

Traceback (most recent call last):
  File "SiteBroker.py", line 205, in <module>
    ManualCrawl(website)
  File "/root/Desktop/SiteBroker/modules/crawler.py", line 71, in manualCrawl
    _links.append(links['href'])
  File "/usr/local/lib/python2.7/dist-packages/bs4/element.py", line 1011, in __getitem__
    return self.attrs[key]
KeyError: 'href'
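
The KeyError comes from indexing every <a> tag with ['href'] even though some anchors have no href attribute at all. A hedged sketch of the usual BeautifulSoup guard (illustrative, not the project's exact code):

from bs4 import BeautifulSoup

def extract_links(html):
    soup = BeautifulSoup(html, "html.parser")
    # href=True skips <a> tags that have no href, so indexing is safe;
    # tag.get('href') would be the equivalent per-tag guard.
    return [tag["href"] for tag in soup.find_all("a", href=True)]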

List index out of range

This is what I keep getting:

[#] Err0r: Kindly Report the err0r below to An0n3xPloiTeR :) (If Your Internet's Working ;)
"""
list index out of range
"""

[$] Thanks For Using :D
~ An0n 3xPloiTeR :)

root@kali:~/Desktop/SiteBroker#
