1tayH / noisy
Simple random DNS, HTTP/S internet traffic noise generator
License: GNU General Public License v3.0
reload() was moved into importlib in Python 3 because it was being overused and led to issues that were difficult to debug.
flake8 testing of https://github.com/1tayH/noisy on Python 3.6.3
$ flake8 . --count --select=E901,E999,F821,F822,F823 --show-source --statistics
./noisy.py:13:1: F821 undefined name 'reload'
reload(sys)
^
1 F821 undefined name 'reload'
1
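A version-agnostic import guard sidesteps the undefined name (a sketch; whether to keep the `reload(sys)` call at all is a separate question, since it only matters for Python 2's `setdefaultencoding`):

```python
import sys

# reload() is a builtin on Python 2 but lives in importlib on Python 3,
# so import it conditionally before calling it.
try:
    from importlib import reload  # Python 3.4+
except ImportError:
    pass  # Python 2: reload() is already a builtin

reload(sys)
```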
My interpreter, as seen below, is Python 2.7.
Here are the last URLs visited, along with the traceback of the exception:
INFO:root:Visiting http://www.kopimi.com/index.html
INFO:root:Visiting https://en.wikipedia.org/wiki/Piratbyrån
Traceback (most recent call last):
File "noisy.py", line 258, in <module>
main()
File "noisy.py", line 254, in main
crawler.crawl()
File "noisy.py", line 225, in crawl
self._browse_from_links()
File "noisy.py", line 166, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 166, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 147, in _browse_from_links
sub_page = self._request(random_link).content
File "noisy.py", line 41, in _request
response = requests.get(url, headers=headers, timeout=5)
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 72, in get
return request('get', url, params=params, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 58, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 498, in request
prep = self.prepare_request(req)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 441, in prepare_request
hooks=merge_hooks(request.hooks, self.hooks),
File "/usr/local/lib/python2.7/dist-packages/requests/models.py", line 309, in prepare
self.prepare_url(url, params)
File "/usr/local/lib/python2.7/dist-packages/requests/models.py", line 359, in prepare_url
url = url.decode('utf8')
File "/usr/lib/python2.7/encodings/utf_8.py", line 16, in decode
return codecs.utf_8_decode(input, errors, True)
UnicodeDecodeError: 'utf8' codec can't decode byte 0xe5 in position 38: unexpected end of data
The correct link is https://en.wikipedia.org/wiki/Piratbyr%C3%A5n
I was not able to confirm it, but I think the problem comes from:
Line 19 in ae70264
where latin-1 is mandated. Wouldn't a standard utf-8 approach work better?
Adding
# -*- coding: utf-8 -*-
at the top of noisy.py should do the trick. It tells Python 2 to treat the source as UTF-8 and is transparent to Python 3.
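Alternatively (a hedged sketch, not the script's actual fix, and the function name is hypothetical), percent-encoding non-ASCII characters before the link reaches requests turns the failing URL into the working one quoted above:

```python
from urllib.parse import quote  # Python 3; Python 2 has urllib.quote

def encode_link(link):
    # Percent-encode non-ASCII characters while leaving URL delimiters
    # and existing %-escapes untouched.
    return quote(link, safe=":/?#[]@!$&'()*+,;=%")

print(encode_link('https://en.wikipedia.org/wiki/Piratbyrån'))
# https://en.wikipedia.org/wiki/Piratbyr%C3%A5n
```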
Thank you for a great script.
The only issue is that when the script is running (as a cron job) and the internet connection is down (VPN + killswitch), the CPU temperature reaches critical levels.
Hi!
systemd services can now use sandboxing options, which has a positive effect on security. I would like noisy to use these settings as well. Additionally, consider creating an AppArmor profile.
Thanks for noisy!
to prevent cyclical traffic noise from forming a recognizable pattern.
Thanks
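One way to break such cycles (a hypothetical sketch; `root_urls`, `min_sleep`, and `max_sleep` mirror key names from noisy's config) is to shuffle the visit order and jitter the delays on every run:

```python
import random

# Visit root URLs in a freshly shuffled order with randomized delays so
# repeated runs don't produce a recognizable, periodic traffic pattern.
root_urls = ['https://wikipedia.org', 'https://github.com', 'https://medium.com']
min_sleep, max_sleep = 3, 6

for url in random.sample(root_urls, len(root_urls)):
    delay = random.uniform(min_sleep, max_sleep)
    # time.sleep(delay) would go here in the real crawler
    print(url, round(delay, 2))
```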
root@kali:~/noisy# ./noisy.py --config config.json
./noisy.py: line 1: import: command not found
./noisy.py: line 2: import: command not found
./noisy.py: line 3: import: command not found
./noisy.py: line 4: import: command not found
./noisy.py: line 5: import: command not found
./noisy.py: line 6: import: command not found
./noisy.py: line 7: import: command not found
./noisy.py: line 8: import: command not found
./noisy.py: line 10: import: command not found
from: too many arguments
./noisy.py: line 13: try:: command not found
from: too many arguments
./noisy.py: line 15: except: command not found
from: too many arguments
./noisy.py: line 18: try:: command not found
./noisy.py: line 19: syntax error near unexpected token `sys'
./noisy.py: line 19: `reload(sys)'
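These messages mean the shell, not Python, is interpreting noisy.py, which is usually a missing or CRLF-damaged shebang line. A minimal demonstration of the fix (the /tmp path is illustrative):

```shell
# "import: command not found" means the shell is executing the file.
# Either run the interpreter explicitly:
#   python noisy.py --config config.json
# or give the script a proper shebang plus execute permission:
printf '#!/usr/bin/env python3\nprint("ok")\n' > /tmp/demo_shebang.py
chmod +x /tmp/demo_shebang.py
/tmp/demo_shebang.py
```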
I left noisy running all night long and this morning I got this log:
INFO:root:Visiting https://medium.com/p/a1e0216c8a23
Traceback (most recent call last):
File "noisy.py", line 265, in <module>
main()
File "noisy.py", line 261, in main
crawler.crawl()
File "noisy.py", line 232, in crawl
self._browse_from_links()
File "noisy.py", line 173, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 173, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 173, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 173, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 173, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 173, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 155, in _browse_from_links
sub_links = self._extract_urls(sub_page, random_link)
File "noisy.py", line 120, in _extract_urls
normalize_urls = [self._normalize_link(url, root_url) for url in urls]
File "noisy.py", line 62, in _normalize_link
parsed_url = urlparse(link)
File "/usr/lib/python2.7/urlparse.py", line 143, in urlparse
tuple = urlsplit(url, scheme, allow_fragments)
File "/usr/lib/python2.7/urlparse.py", line 191, in urlsplit
raise ValueError("Invalid IPv6 URL")
ValueError: Invalid IPv6 URL
Only a handler is needed to discard the invalid IPv6 URL, and the crawler should keep working without problems.
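A sketch of the suggested handler (the function name is hypothetical): treat links that urlparse rejects as dead ends instead of letting the ValueError propagate:

```python
from urllib.parse import urlparse  # urlparse.urlparse on Python 2

def parse_or_skip(link):
    # urlsplit raises ValueError ("Invalid IPv6 URL") when brackets in
    # the netloc are mismatched; return None so the crawler can skip it.
    try:
        return urlparse(link)
    except ValueError:
        return None

print(parse_or_skip('http://[::1]/ok') is not None)   # valid IPv6 literal
print(parse_or_skip('http://[broken/path') is None)   # malformed, skipped
```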
Could this have the config file split into separate parts, e.g.
I've noticed using nethogs that FF and Chrome send traffic sky high when loading a new site (500 KB/s, then 359 KB/s, and so on, slowly tapering off), causing traffic spikes that give away the exact moment you're using the broadband connection. My question would be whether Noisy could create the same spikes when loading new pages (Noisy maxes out at 89 KB/s, where FF and Chrome max out at 800 KB/s).
Thanks, again.
How do I make this script crawl a specific URL, for example https://github.com/1tayH/noisy, instead of only crawling https://github.com?
thanks!
Hi, I run noisy fine on my Linux machine, but when I tried to run it on a Mac I ran into the problem "No module named requests".
Note: this is what I did to run noisy on my Mac (Big Sur 11.4):
pip install requests
(feedback from terminal: requirements satisfied)
Then I ran:
python noisy.py --config config.json
and got the following:
Traceback (most recent call last):
File "noisy.py", line 10, in <module>
import requests
ImportError: No module named requests
Does anyone know how to fix this problem? Thanks.
P.S. when I run: python --version, I get: Python 2.7; when I run python3 --version, I get Python 3.9.5
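The version mismatch in the P.S. is the likely culprit: `pip` installed requests for one interpreter while `python` launched another. Invoking pip through the target interpreter removes the ambiguity (general practice, not noisy-specific):

```shell
# pip and python can belong to different interpreters; installing via
# "python3 -m pip" guarantees the package lands where python3 looks.
python3 -m pip --version
# then: python3 -m pip install requests
# and:  python3 noisy.py --config config.json
```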
Just ran this out of curiosity, I don't think NSFW links should be used by default, or a warning should be provided.
python noisy.py -c config.json -t 10
usage: noisy.py [-h] [--log -l] --config -c [--timeout -t]
noisy.py: error: argument --config is required
python noisy.py --config config.json --timeout 10 works.
So -c and -t are not accepted? Thanks.
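The usage string (`--config -c`) suggests the short forms were registered as metavars rather than as option aliases. A sketch of how both spellings can be accepted (flag names assumed from noisy's README):

```python
import argparse

# Registering "-c" and "--config" together makes both forms work; the
# usage line then reads "-c CONFIG" instead of "--config -c".
parser = argparse.ArgumentParser()
parser.add_argument('-l', '--log', default='info', help='logging level')
parser.add_argument('-c', '--config', required=True, help='config file path')
parser.add_argument('-t', '--timeout', type=int, help='run timeout in seconds')

args = parser.parse_args(['-c', 'config.json', '-t', '10'])
print(args.config, args.timeout)
# config.json 10
```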
$ python3 --version; pip3 --version
Python 3.7.3
pip 20.2.1 from /home/elonsatoshi/.local/lib/python3.7/site-packages/pip (python 3.7)
$ python3 ./noisy.py --config config.json --log debug
It outputs nothing until I hit Ctrl-C to close it; then it prints the huge traceback that usually appears when you Ctrl-C a Python script.
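For what it's worth, silence at `--log debug` would be consistent with the level string never being applied to the root logger; a hedged sketch of the mapping (the function name is hypothetical):

```python
import logging

def configure_logging(level_name):
    # Map the command-line string to a logging level; unknown names fall
    # back to INFO. Without basicConfig the root logger stays at WARNING
    # and INFO/DEBUG messages are silently dropped.
    level = getattr(logging, level_name.upper(), logging.INFO)
    logging.basicConfig(level=level)
    return level

print(configure_logging('debug') == logging.DEBUG)
# True
```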
A systemd-analyze security noisy.service analysis shows a scary exposure level of 9.2!
Please consider hardening the systemd unit file.
```
sudo systemd-analyze security noisy.service
NAME DESCRIPTION EXPOSURE
✗ RemoveIPC= Service user may leave SysV IPC objects around 0.1
✗ RootDirectory=/RootImage= Service runs within the host's root directory 0.1
✓ User=/DynamicUser= Service runs under a static non-root user identity
✗ CapabilityBoundingSet=~CAP_SYS_TIME Service processes may change the system clock 0.2
✗ NoNewPrivileges= Service processes may acquire new privileges 0.2
✓ AmbientCapabilities= Service process does not receive ambient capabilities
✗ PrivateDevices= Service potentially has access to hardware devices 0.2
✗ ProtectClock= Service may write to the hardware clock or system clock 0.2
✗ CapabilityBoundingSet=~CAP_SYS_PACCT Service may use acct() 0.1
✗ CapabilityBoundingSet=~CAP_KILL Service may send UNIX signals to arbitrary processes 0.1
✗ ProtectKernelLogs= Service may read from or write to the kernel log ring buffer 0.2
✗ CapabilityBoundingSet=~CAP_WAKE_ALARM Service may program timers that wake up the system 0.1
✗ CapabilityBoundingSet=~CAP_(DAC_|FOWNER|IPC_OWNER) Service may override UNIX file/IPC permission checks 0.2
✗ ProtectControlGroups= Service may modify the control group file system 0.2
✗ CapabilityBoundingSet=~CAP_LINUX_IMMUTABLE Service may mark files immutable 0.1
✗ CapabilityBoundingSet=~CAP_IPC_LOCK Service may lock memory into RAM 0.1
✗ ProtectKernelModules= Service may load or read kernel modules 0.2
✗ CapabilityBoundingSet=~CAP_SYS_MODULE Service may load kernel modules 0.2
✗ CapabilityBoundingSet=~CAP_SYS_TTY_CONFIG Service may issue vhangup() 0.1
✗ CapabilityBoundingSet=~CAP_SYS_BOOT Service may issue reboot() 0.1
✗ CapabilityBoundingSet=~CAP_SYS_CHROOT Service may issue chroot() 0.1
✗ PrivateMounts= Service may install system mounts 0.2
✗ SystemCallArchitectures= Service may execute system calls with all ABIs 0.2
✗ CapabilityBoundingSet=~CAP_BLOCK_SUSPEND Service may establish wake locks 0.1
✗ MemoryDenyWriteExecute= Service may create writable executable memory mappings 0.1
✗ RestrictNamespaces=~user Service may create user namespaces 0.3
✗ RestrictNamespaces=~pid Service may create process namespaces 0.1
✗ RestrictNamespaces=~net Service may create network namespaces 0.1
✗ RestrictNamespaces=~uts Service may create hostname namespaces 0.1
✗ RestrictNamespaces=~mnt Service may create file system namespaces 0.1
✗ CapabilityBoundingSet=~CAP_LEASE Service may create file leases 0.1
✗ CapabilityBoundingSet=~CAP_MKNOD Service may create device nodes 0.1
✗ RestrictNamespaces=~cgroup Service may create cgroup namespaces 0.1
✗ RestrictSUIDSGID= Service may create SUID/SGID files 0.2
✗ RestrictNamespaces=~ipc Service may create IPC namespaces 0.1
✗ ProtectHostname= Service may change system host/domainname 0.1
✗ CapabilityBoundingSet=~CAP_(CHOWN|FSETID|SETFCAP) Service may change file ownership/access mode/capabilities unrestricted 0.2
✗ CapabilityBoundingSet=~CAP_SET(UID|GID|PCAP) Service may change UID/GID identities/capabilities 0.3
✗ LockPersonality= Service may change ABI personality 0.1
✗ ProtectKernelTunables= Service may alter kernel tunables 0.2
✗ RestrictAddressFamilies=~AF_PACKET Service may allocate packet sockets 0.2
✗ RestrictAddressFamilies=~AF_NETLINK Service may allocate netlink sockets 0.1
✗ RestrictAddressFamilies=~AF_UNIX Service may allocate local sockets 0.1
✗ RestrictAddressFamilies=~… Service may allocate exotic sockets 0.3
✗ RestrictAddressFamilies=~AF_(INET|INET6) Service may allocate Internet sockets 0.3
✗ CapabilityBoundingSet=~CAP_MAC_ Service may adjust SMACK MAC 0.1
✗ RestrictRealtime= Service may acquire realtime scheduling 0.1
✗ CapabilityBoundingSet=~CAP_SYS_RAWIO Service has raw I/O access 0.2
✗ CapabilityBoundingSet=~CAP_SYS_PTRACE Service has ptrace() debugging abilities 0.3
✗ CapabilityBoundingSet=~CAP_SYS_(NICE|RESOURCE) Service has privileges to change resource use parameters 0.1
✓ SupplementaryGroups= Service has no supplementary groups
✗ DeviceAllow= Service has no device ACL 0.2
✗ CapabilityBoundingSet=~CAP_NET_ADMIN Service has network configuration privileges 0.2
✗ ProtectSystem= Service has full access to the OS file hierarchy 0.2
✗ ProtectProc= Service has full access to process tree (/proc hidepid=) 0.2
✗ ProcSubset= Service has full access to non-process /proc files (/proc subset=) 0.1
✗ ProtectHome= Service has full access to home directories 0.2
✗ CapabilityBoundingSet=~CAP_NET_(BIND_SERVICE|BROADCAST|RAW) Service has elevated networking privileges 0.1
✗ CapabilityBoundingSet=~CAP_AUDIT_* Service has audit subsystem access 0.1
✗ CapabilityBoundingSet=~CAP_SYS_ADMIN Service has administrator privileges 0.3
✗ PrivateNetwork= Service has access to the host's network 0.5
✗ PrivateUsers= Service has access to other users 0.2
✗ PrivateTmp= Service has access to other software's temporary files 0.2
✗ CapabilityBoundingSet=~CAP_SYSLOG Service has access to kernel logging 0.1
✓ KeyringMode= Service doesn't share key material with other services
✓ Delegate= Service does not maintain its own delegated control group subtree
✗ SystemCallFilter=~@clock Service does not filter system calls 0.2
✗ SystemCallFilter=~@cpu-emulation Service does not filter system calls 0.1
✗ SystemCallFilter=~@debug Service does not filter system calls 0.2
✗ SystemCallFilter=~@module Service does not filter system calls 0.2
✗ SystemCallFilter=~@mount Service does not filter system calls 0.2
✗ SystemCallFilter=~@obsolete Service does not filter system calls 0.1
✗ SystemCallFilter=~@privileged Service does not filter system calls 0.2
✗ SystemCallFilter=~@raw-io Service does not filter system calls 0.2
✗ SystemCallFilter=~@reboot Service does not filter system calls 0.2
✗ SystemCallFilter=~@resources Service does not filter system calls 0.2
✗ SystemCallFilter=~@swap Service does not filter system calls 0.2
✗ IPAddressDeny= Service does not define an IP address allow list 0.2
✓ NotifyAccess= Service child processes cannot alter service state
✗ UMask= Files created by service are world-readable by default 0.1
→ Overall exposure level for noisy.service: 9.2 UNSAFE 😨
```
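A possible starting point for such hardening (a sketch using standard directives from systemd.exec(5); noisy only needs outbound IPv4/IPv6 sockets, so most everything else can be locked down, but these values are untested against the real unit):

```ini
[Service]
DynamicUser=yes
NoNewPrivileges=yes
CapabilityBoundingSet=
RestrictAddressFamilies=AF_INET AF_INET6
RestrictNamespaces=yes
RestrictRealtime=yes
RestrictSUIDSGID=yes
LockPersonality=yes
MemoryDenyWriteExecute=yes
PrivateTmp=yes
PrivateDevices=yes
ProtectSystem=strict
ProtectHome=yes
ProtectProc=invisible
ProcSubset=pid
ProtectClock=yes
ProtectHostname=yes
ProtectKernelTunables=yes
ProtectKernelModules=yes
ProtectKernelLogs=yes
ProtectControlGroups=yes
SystemCallFilter=@system-service
SystemCallArchitectures=native
UMask=0077
```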
https://www.fossil.com/us/en/account-dashboard/registered-products.html is currently not working, with browsers (Chromium, Firefox) returning "ERR_INVALID_REDIRECT".
Noisy crashes when visiting it:
pi@RaspSalon:~/dev/noisy $ python3 noisy.py --config config.json --log debug
INFO:urllib3.connectionpool:Starting new HTTPS connection (1): www.fossil.com
DEBUG:urllib3.connectionpool:"GET /us/en/account-dashboard/registered-products.html HTTP/1.1" 302 20
Traceback (most recent call last):
File "noisy.py", line 274, in <module>
main()
File "noisy.py", line 270, in main
crawler.crawl()
File "noisy.py", line 238, in crawl
body = self._request(url).content
File "noisy.py", line 49, in _request
response = requests.get(url, headers=headers, timeout=5)
File "/usr/lib/python3/dist-packages/requests/api.py", line 60, in get
return request('get', url, **kwargs)
File "/usr/lib/python3/dist-packages/requests/api.py", line 49, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/lib/python3/dist-packages/requests/sessions.py", line 457, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python3/dist-packages/requests/sessions.py", line 595, in send
history = [resp for resp in gen] if allow_redirects else []
File "/usr/lib/python3/dist-packages/requests/sessions.py", line 595, in <listcomp>
history = [resp for resp in gen] if allow_redirects else []
File "/usr/lib/python3/dist-packages/requests/sessions.py", line 189, in resolve_redirects
allow_redirects=False,
File "/usr/lib/python3/dist-packages/requests/sessions.py", line 569, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python3/dist-packages/requests/adapters.py", line 329, in send
conn = self.get_connection(request.url, proxies)
File "/usr/lib/python3/dist-packages/requests/adapters.py", line 243, in get_connection
conn = self.poolmanager.connection_from_url(url)
File "/usr/lib/python3/dist-packages/urllib3/poolmanager.py", line 130, in connection_from_url
u = parse_url(url)
File "/usr/lib/python3/dist-packages/urllib3/util/url.py", line 143, in parse_url
raise LocationParseError(url)
urllib3.exceptions.LocationParseError: Failed to parse: www.fossil.com:-1
My view is that requests should not let an uncaught urllib3 error propagate. I will try to raise the issue with them, but will provide a temporary workaround for noisy in the meantime.
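Pending an upstream fix, a workaround of the kind mentioned (the function name is hypothetical) is to treat the leaked urllib3 exception like any other failed request:

```python
import requests
from urllib3.exceptions import LocationParseError

def safe_get(url, headers=None, timeout=5):
    # Catch both requests' own errors and the LocationParseError that
    # urllib3 leaks through during redirect handling, returning None so
    # the crawler can simply move on to the next link.
    try:
        return requests.get(url, headers=headers, timeout=timeout)
    except (requests.exceptions.RequestException, LocationParseError):
        return None

print(safe_get('not a valid url') is None)
# True
```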
python -V
Python 2.7.9
INFO:urllib3.connectionpool:Starting new HTTPS connection (1): pages.ebay.at
Traceback (most recent call last):
File "noisy.py", line 274, in <module>
main()
File "noisy.py", line 270, in main
crawler.crawl()
File "noisy.py", line 238, in crawl
self._browse_from_links()
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 160, in _browse_from_links
sub_page = self._request(random_link).content
File "noisy.py", line 49, in _request
response = requests.get(url, headers=headers, timeout=5)
File "/usr/lib/python2.7/dist-packages/requests/api.py", line 60, in get
return request('get', url, **kwargs)
File "/usr/lib/python2.7/dist-packages/requests/api.py", line 49, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 457, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 606, in send
r.content
File "/usr/lib/python2.7/dist-packages/requests/models.py", line 724, in content
self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()
File "/usr/lib/python2.7/dist-packages/requests/models.py", line 653, in generate
for chunk in self.raw.stream(chunk_size, decode_content=True):
File "/usr/lib/python2.7/dist-packages/urllib3/response.py", line 256, in stream
data = self.read(amt=amt, decode_content=decode_content)
File "/usr/lib/python2.7/dist-packages/urllib3/response.py", line 186, in read
data = self._fp.read(amt)
File "/usr/lib/python2.7/httplib.py", line 602, in read
s = self.fp.read(amt)
File "/usr/lib/python2.7/socket.py", line 380, in read
data = self._sock.recv(left)
File "/usr/lib/python2.7/dist-packages/urllib3/contrib/pyopenssl.py", line 188, in recv
data = self.connection.recv(*args, **kwargs)
File "/usr/lib/python2.7/dist-packages/OpenSSL/SSL.py", line 995, in recv
self._raise_ssl_error(self._ssl, result)
File "/usr/lib/python2.7/dist-packages/OpenSSL/SSL.py", line 851, in _raise_ssl_error
raise ZeroReturnError()
OpenSSL.SSL.ZeroReturnError
Traceback (most recent call last):
File "/opt/noisy/noisy.py", line 262, in <module>
main()
File "/opt/noisy/noisy.py", line 253, in main
crawler.load_config_file(args.config)
File "/opt/noisy/noisy.py", line 177, in load_config_file
config = json.load(config_file)
File "/usr/local/Cellar/[email protected]/3.9.7/Frameworks/Python.framework/Versions/3.9/lib/python3.9/json/__init__.py", line 293, in load
return loads(fp.read(),
File "/usr/local/Cellar/[email protected]/3.9.7/Frameworks/Python.framework/Versions/3.9/lib/python3.9/json/__init__.py", line 346, in loads
return _default_decoder.decode(s)
File "/usr/local/Cellar/[email protected]/3.9.7/Frameworks/Python.framework/Versions/3.9/lib/python3.9/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/local/Cellar/[email protected]/3.9.7/Frameworks/Python.framework/Versions/3.9/lib/python3.9/json/decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 15 column 5 (char 360)
The only thing I did was delete one or two URLs from the `config.json` file. Here is the relevant content:
```
{
"max_depth": 25,
"min_sleep": 3,
"max_sleep": 6,
"timeout": false,
"root_urls": [
"https://www.reddit.com",
"https://www.yahoo.com",
"http://www.cnn.com",
"http://www.ebay.com",
"https://wikipedia.org",
"https://youtube.com",
"https://github.com",
"https://medium.com",
],
```
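The trailing comma after "https://medium.com" is what the parser chokes on: strict JSON forbids trailing commas, and the reported position (line 15, column 5) points right after it. A minimal reproduction:

```python
import json

# Strict JSON rejects a trailing comma inside an array, producing the
# same "Expecting value" error reported above.
json.loads('["https://medium.com"]')          # fine
try:
    json.loads('["https://medium.com",]')     # trailing comma
except json.JSONDecodeError as err:
    print('JSONDecodeError:', err.msg)
# JSONDecodeError: Expecting value
```

Deleting the comma after the last remaining URL in `root_urls` should make the config load again.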