1tayH / noisy

Simple random DNS, HTTP/S internet traffic noise generator

License: GNU General Public License v3.0

Python 98.34% Dockerfile 1.66%
bot dns http privacy privacy-online raspberrypi traffic-generator traffic-inspection

noisy's People

Contributors

1tayH, ail1020, arduous, rev138


noisy's Issues

reload() was moved into importlib in Python 3

reload() was moved into importlib in Python 3 because it was being overused and led to issues that were difficult to debug.

flake8 testing of https://github.com/1tayH/noisy on Python 3.6.3

$ flake8 . --count --select=E901,E999,F821,F822,F823 --show-source --statistics

./noisy.py:13:1: F821 undefined name 'reload'
reload(sys)
^
1     F821 undefined name 'reload'
1
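For reference, a minimal Python 2/3-compatible guard would look like this (a sketch, assuming the reload(sys) call only exists to enable the Python 2 setdefaultencoding hack):

```python
import sys

# reload() is a builtin on Python 2 only; guarding the call lets the module
# import cleanly on Python 3, where sys.setdefaultencoding no longer exists.
if sys.version_info[0] == 2:
    reload(sys)  # noqa: F821
    sys.setdefaultencoding('latin-1')
```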

Crash with URLs containing non-ASCII characters

My interpreter, as seen below, is Python 2.7.
Here are the last URLs visited and the traceback of the exception:

INFO:root:Visiting http://www.kopimi.com/index.html
INFO:root:Visiting https://en.wikipedia.org/wiki/Piratbyrån
Traceback (most recent call last):
 File "noisy.py", line 258, in <module>
   main()
 File "noisy.py", line 254, in main
   crawler.crawl()
 File "noisy.py", line 225, in crawl
   self._browse_from_links()
 File "noisy.py", line 166, in _browse_from_links
   self._browse_from_links(depth + 1)
 File "noisy.py", line 166, in _browse_from_links
   self._browse_from_links(depth + 1)
 File "noisy.py", line 147, in _browse_from_links
   sub_page = self._request(random_link).content
 File "noisy.py", line 41, in _request
   response = requests.get(url, headers=headers, timeout=5)
 File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 72, in get
   return request('get', url, params=params, **kwargs)
 File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 58, in request
   return session.request(method=method, url=url, **kwargs)
 File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 498, in request
   prep = self.prepare_request(req)
 File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 441, in prepare_request
   hooks=merge_hooks(request.hooks, self.hooks),
 File "/usr/local/lib/python2.7/dist-packages/requests/models.py", line 309, in prepare
   self.prepare_url(url, params)
 File "/usr/local/lib/python2.7/dist-packages/requests/models.py", line 359, in prepare_url
   url = url.decode('utf8')
 File "/usr/lib/python2.7/encodings/utf_8.py", line 16, in decode
   return codecs.utf_8_decode(input, errors, True)
UnicodeDecodeError: 'utf8' codec can't decode byte 0xe5 in position 38: unexpected end of data

The correct link is https://en.wikipedia.org/wiki/Piratbyr%C3%A5n

I was not able to confirm, but I think that the problem is coming from:

noisy/noisy.py

Line 19 in ae70264

sys.setdefaultencoding('latin-1')

where latin-1 is mandated. Wouldn't a standard utf-8 approach work better?

Adding

# -*- coding: utf-8 -*-

at the top of noisy.py should do the trick. It would tell Python 2 to read the file as UTF-8, and it would be transparent to Python 3.
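As a stopgap on the noisy side, percent-encoding extracted links before requesting them would also avoid the crash. A minimal sketch (not the project's actual fix) using requests' own helper:

```python
# -*- coding: utf-8 -*-
import requests
from requests.utils import requote_uri

link = u'https://en.wikipedia.org/wiki/Piratbyrån'
safe_link = requote_uri(link)  # -> https://en.wikipedia.org/wiki/Piratbyr%C3%A5n
response = requests.get(safe_link, timeout=5)
```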

CPU TEMP

Thank you for a great script.

The only issue is that when the script is running (as a cronjob) and the internet connection is down (VPN + killswitch), the CPU temp reaches critical levels.

command not found

root@kali:~/noisy# ./noisy.py --config config.json
./noisy.py: line 1: import: command not found
./noisy.py: line 2: import: command not found
./noisy.py: line 3: import: command not found
./noisy.py: line 4: import: command not found
./noisy.py: line 5: import: command not found
./noisy.py: line 6: import: command not found
./noisy.py: line 7: import: command not found
./noisy.py: line 8: import: command not found
./noisy.py: line 10: import: command not found
from: too many arguments
./noisy.py: line 13: try:: command not found
from: too many arguments
./noisy.py: line 15: except: command not found
from: too many arguments
./noisy.py: line 18: try:: command not found
./noisy.py: line 19: syntax error near unexpected token `sys'
./noisy.py: line 19: `reload(sys)'
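(These errors typically mean the script is being executed by the shell rather than by Python: bash is interpreting the Python source line by line. Invoking it as python noisy.py --config config.json, or making sure the first line of noisy.py is a shebang such as #!/usr/bin/env python, should avoid this.)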

IPv6 URLs make the script stop

I left noisy running all night long and this morning I got this log:

INFO:root:Visiting https://medium.com/p/a1e0216c8a23
Traceback (most recent call last):
  File "noisy.py", line 265, in <module>
    main()
  File "noisy.py", line 261, in main
    crawler.crawl()
  File "noisy.py", line 232, in crawl
    self._browse_from_links()
  File "noisy.py", line 173, in _browse_from_links
    self._browse_from_links(depth + 1)
  File "noisy.py", line 173, in _browse_from_links
    self._browse_from_links(depth + 1)
  File "noisy.py", line 173, in _browse_from_links
    self._browse_from_links(depth + 1)
  File "noisy.py", line 173, in _browse_from_links
    self._browse_from_links(depth + 1)
  File "noisy.py", line 173, in _browse_from_links
    self._browse_from_links(depth + 1)
  File "noisy.py", line 173, in _browse_from_links
    self._browse_from_links(depth + 1)
  File "noisy.py", line 155, in _browse_from_links
    sub_links = self._extract_urls(sub_page, random_link)
  File "noisy.py", line 120, in _extract_urls
    normalize_urls = [self._normalize_link(url, root_url) for url in urls]
  File "noisy.py", line 62, in _normalize_link
    parsed_url = urlparse(link)
  File "/usr/lib/python2.7/urlparse.py", line 143, in urlparse
    tuple = urlsplit(url, scheme, allow_fragments)
  File "/usr/lib/python2.7/urlparse.py", line 191, in urlsplit
    raise ValueError("Invalid IPv6 URL")
ValueError: Invalid IPv6 URL

Only a handler is needed to discard the IPv6 URL, and the script should keep working without problems.
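A minimal sketch of such a handler (a hypothetical helper, not the project's actual fix), assuming it wraps the urlparse call in _normalize_link:

```python
try:
    from urllib.parse import urlparse  # Python 3
except ImportError:
    from urlparse import urlparse      # Python 2

def safe_urlparse(link):
    """Parse a link, returning None for malformed URLs (e.g. invalid IPv6)."""
    try:
        return urlparse(link)
    except ValueError:
        return None  # the caller discards this link and keeps crawling
```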

Feature request: split config file

Could the config file be split into separate parts, e.g.

  1. config.json for the timing parameters
  2. rooturl.json for the root urls
  3. blacklisted.json for blacklisted urls
  4. useragent.json for the user agents
    This would make special cases, e.g. testing scenarios or personal preferences, a little easier to manage; see the sketch after this list.
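A minimal sketch of what loading a split config could look like (the file names are this request's suggestions, not an existing noisy feature):

```python
import json

def load_split_config(paths=('config.json', 'rooturl.json',
                             'blacklisted.json', 'useragent.json')):
    """Merge several JSON fragments into a single config dict."""
    config = {}
    for path in paths:
        with open(path) as f:
            config.update(json.load(f))
    return config
```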

I love this script, it's a great privacy script

I've noticed using nethogs that Firefox and Chrome send traffic sky-high when loading a new site (500 KB/s, then 359 KB/s, and so on, slowly tapering off), causing traffic spikes that give away the exact moment you're using the broadband connection. My question would be: could Noisy create the same spikes when loading new pages? (Noisy maxes out at 89 KB/s, where Firefox and Chrome max out at 800 KB/s.)

Thanks, again.

Crawling a specific URL?

How can I make this script crawl a specific URL, e.g. https://github.com/1tayH/noisy, instead of only crawling https://github.com? Thanks!
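(A partial answer, based on how the config works: adding the full URL to the root_urls list in config.json makes it a crawl seed, so the crawl starts at https://github.com/1tayH/noisy rather than at the site root; the crawler will still follow whatever links that page contains.)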

Problem in recognising `requests` on Mac Big Sur

Hi, I run noisy fine on my Linux machine, but when I tried to run it on my Mac I came across the problem "No module named requests".

Note: this is what I did to run noisy on my Mac Big Sur 11.4:

  • pip install requests - feedback from terminal: requirements satisfied
  • cloned noisy repo
  • cd to noisy
  • run: python noisy.py --config config.json then I got the following:
python noisy.py --config config.json
Traceback (most recent call last):
  File "noisy.py", line 10, in <module>
    import requests
ImportError: No module named requests

Does anyone know how to fix this problem? Thanks.

P.S. when I run: python --version, I get: Python 2.7; when I run python3 --version, I get Python 3.9.5
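(The most likely cause: the bare pip belongs to the Python 3.9 installation, while the bare python is the system 2.7, so requests was installed for a different interpreter than the one running noisy. Running python -m pip install requests, or running the script with python3 noisy.py --config config.json, should line the two up.)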

passing parameters to noisy ...

python noisy.py -c config.json -t 10
usage: noisy.py [-h] [--log -l] --config -c [--timeout -t]
noisy.py: error: argument --config is required

python noisy.py --config config.json --timeout 10 works.

So -c and -t are not accepted? Thanks.
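The usage string ([--log -l] --config -c [--timeout -t]) suggests the short forms are registered as metavars rather than as option strings, which would explain why -c and -t are rejected. A sketch of how both spellings are normally registered with argparse:

```python
import argparse

parser = argparse.ArgumentParser()
# Registering both strings makes -c and --config interchangeable.
parser.add_argument('-c', '--config', required=True, help='config file path')
parser.add_argument('-t', '--timeout', type=int, help='run timeout in seconds')
parser.add_argument('-l', '--log', default='info', help='log level')
args = parser.parse_args()
```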

Running noisy.py doesn't produce any logging output

$ python3 --version; pip3 --version  
Python 3.7.3
pip 20.2.1 from /home/elonsatoshi/.local/lib/python3.7/site-packages/pip (python 3.7)

$ python3 ./noisy.py --config config.json --log debug

It outputs nothing until I hit Ctrl-C to close it; then it prints the huge Python traceback of the kind that usually appears when you Ctrl-C a Python script.
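One possible cause (an assumption, not a confirmed bug): logging.basicConfig() is a no-op when the root logger already has handlers, so if anything configures logging before noisy does, its output silently disappears. A quick check:

```python
import logging

root = logging.getLogger()
print('existing handlers:', root.handlers)  # non-empty => basicConfig() below is a no-op
logging.basicConfig(level=logging.DEBUG)
logging.debug('logging is working')
```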

hardening systemd noisy.service

A systemd-analyze security noisy.service analysis shows a scary exposure level of 9.2!
Please consider hardening the systemd unit file.


sudo systemd-analyze security noisy.service
NAME DESCRIPTION EXPOSURE
✗ RemoveIPC= Service user may leave SysV IPC objects around 0.1
✗ RootDirectory=/RootImage= Service runs within the host's root directory 0.1
✓ User=/DynamicUser= Service runs under a static non-root user identity
✗ CapabilityBoundingSet=~CAP_SYS_TIME Service processes may change the system clock 0.2
✗ NoNewPrivileges= Service processes may acquire new privileges 0.2
✓ AmbientCapabilities= Service process does not receive ambient capabilities
✗ PrivateDevices= Service potentially has access to hardware devices 0.2
✗ ProtectClock= Service may write to the hardware clock or system clock 0.2
✗ CapabilityBoundingSet=~CAP_SYS_PACCT Service may use acct() 0.1
✗ CapabilityBoundingSet=~CAP_KILL Service may send UNIX signals to arbitrary processes 0.1
✗ ProtectKernelLogs= Service may read from or write to the kernel log ring buffer 0.2
✗ CapabilityBoundingSet=~CAP_WAKE_ALARM Service may program timers that wake up the system 0.1
✗ CapabilityBoundingSet=~CAP_(DAC_*|FOWNER|IPC_OWNER) Service may override UNIX file/IPC permission checks 0.2
✗ ProtectControlGroups= Service may modify the control group file system 0.2
✗ CapabilityBoundingSet=~CAP_LINUX_IMMUTABLE Service may mark files immutable 0.1
✗ CapabilityBoundingSet=~CAP_IPC_LOCK Service may lock memory into RAM 0.1
✗ ProtectKernelModules= Service may load or read kernel modules 0.2
✗ CapabilityBoundingSet=~CAP_SYS_MODULE Service may load kernel modules 0.2
✗ CapabilityBoundingSet=~CAP_SYS_TTY_CONFIG Service may issue vhangup() 0.1
✗ CapabilityBoundingSet=~CAP_SYS_BOOT Service may issue reboot() 0.1
✗ CapabilityBoundingSet=~CAP_SYS_CHROOT Service may issue chroot() 0.1
✗ PrivateMounts= Service may install system mounts 0.2
✗ SystemCallArchitectures= Service may execute system calls with all ABIs 0.2
✗ CapabilityBoundingSet=~CAP_BLOCK_SUSPEND Service may establish wake locks 0.1
✗ MemoryDenyWriteExecute= Service may create writable executable memory mappings 0.1
✗ RestrictNamespaces=~user Service may create user namespaces 0.3
✗ RestrictNamespaces=~pid Service may create process namespaces 0.1
✗ RestrictNamespaces=~net Service may create network namespaces 0.1
✗ RestrictNamespaces=~uts Service may create hostname namespaces 0.1
✗ RestrictNamespaces=~mnt Service may create file system namespaces 0.1
✗ CapabilityBoundingSet=~CAP_LEASE Service may create file leases 0.1
✗ CapabilityBoundingSet=~CAP_MKNOD Service may create device nodes 0.1
✗ RestrictNamespaces=~cgroup Service may create cgroup namespaces 0.1
✗ RestrictSUIDSGID= Service may create SUID/SGID files 0.2
✗ RestrictNamespaces=~ipc Service may create IPC namespaces 0.1
✗ ProtectHostname= Service may change system host/domainname 0.1
✗ CapabilityBoundingSet=~CAP_(CHOWN|FSETID|SETFCAP) Service may change file ownership/access mode/capabilities unrestricted 0.2
✗ CapabilityBoundingSet=~CAP_SET(UID|GID|PCAP) Service may change UID/GID identities/capabilities 0.3
✗ LockPersonality= Service may change ABI personality 0.1
✗ ProtectKernelTunables= Service may alter kernel tunables 0.2
✗ RestrictAddressFamilies=~AF_PACKET Service may allocate packet sockets 0.2
✗ RestrictAddressFamilies=~AF_NETLINK Service may allocate netlink sockets 0.1
✗ RestrictAddressFamilies=~AF_UNIX Service may allocate local sockets 0.1
✗ RestrictAddressFamilies=~… Service may allocate exotic sockets 0.3
✗ RestrictAddressFamilies=~AF_(INET|INET6) Service may allocate Internet sockets 0.3
✗ CapabilityBoundingSet=~CAP_MAC_* Service may adjust SMACK MAC 0.1
✗ RestrictRealtime= Service may acquire realtime scheduling 0.1
✗ CapabilityBoundingSet=~CAP_SYS_RAWIO Service has raw I/O access 0.2
✗ CapabilityBoundingSet=~CAP_SYS_PTRACE Service has ptrace() debugging abilities 0.3
✗ CapabilityBoundingSet=~CAP_SYS_(NICE|RESOURCE) Service has privileges to change resource use parameters 0.1
✓ SupplementaryGroups= Service has no supplementary groups
✗ DeviceAllow= Service has no device ACL 0.2
✗ CapabilityBoundingSet=~CAP_NET_ADMIN Service has network configuration privileges 0.2
✗ ProtectSystem= Service has full access to the OS file hierarchy 0.2
✗ ProtectProc= Service has full access to process tree (/proc hidepid=) 0.2
✗ ProcSubset= Service has full access to non-process /proc files (/proc subset=) 0.1
✗ ProtectHome= Service has full access to home directories 0.2
✗ CapabilityBoundingSet=~CAP_NET_(BIND_SERVICE|BROADCAST|RAW) Service has elevated networking privileges 0.1
✗ CapabilityBoundingSet=~CAP_AUDIT_* Service has audit subsystem access 0.1
✗ CapabilityBoundingSet=~CAP_SYS_ADMIN Service has administrator privileges 0.3
✗ PrivateNetwork= Service has access to the host's network 0.5
✗ PrivateUsers= Service has access to other users 0.2
✗ PrivateTmp= Service has access to other software's temporary files 0.2
✗ CapabilityBoundingSet=~CAP_SYSLOG Service has access to kernel logging 0.1
✓ KeyringMode= Service doesn't share key material with other services
✓ Delegate= Service does not maintain its own delegated control group subtree
✗ SystemCallFilter=~@clock Service does not filter system calls 0.2
✗ SystemCallFilter=~@cpu-emulation Service does not filter system calls 0.1
✗ SystemCallFilter=~@debug Service does not filter system calls 0.2
✗ SystemCallFilter=~@module Service does not filter system calls 0.2
✗ SystemCallFilter=~@mount Service does not filter system calls 0.2
✗ SystemCallFilter=~@obsolete Service does not filter system calls 0.1
✗ SystemCallFilter=~@privileged Service does not filter system calls 0.2
✗ SystemCallFilter=~@raw-io Service does not filter system calls 0.2
✗ SystemCallFilter=~@reboot Service does not filter system calls 0.2
✗ SystemCallFilter=~@resources Service does not filter system calls 0.2
✗ SystemCallFilter=~@swap Service does not filter system calls 0.2
✗ IPAddressDeny= Service does not define an IP address allow list 0.2
✓ NotifyAccess= Service child processes cannot alter service state
✗ UMask= Files created by service are world-readable by default 0.1

→ Overall exposure level for noisy.service: 9.2 UNSAFE 😨
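A sketch of hardening directives that could be added to the unit's [Service] section (illustrative only, not a shipped configuration; each directive should be tested, since some sandboxing options may break the crawler):

```
[Service]
DynamicUser=yes
NoNewPrivileges=yes
ProtectSystem=strict
ProtectHome=yes
PrivateTmp=yes
PrivateDevices=yes
ProtectClock=yes
ProtectKernelTunables=yes
ProtectKernelModules=yes
ProtectControlGroups=yes
RestrictAddressFamilies=AF_INET AF_INET6
RestrictNamespaces=yes
RestrictRealtime=yes
RestrictSUIDSGID=yes
LockPersonality=yes
MemoryDenyWriteExecute=yes
SystemCallFilter=@system-service
SystemCallArchitectures=native
CapabilityBoundingSet=
```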

Noisy is failing on websites with redirection errors

https://www.fossil.com/us/en/account-dashboard/registered-products.html is currently not working, with browsers (Chromium, Firefox) returning "ERR_INVALID_REDIRECT".

Noisy crashes when visiting it:

    pi@RaspSalon:~/dev/noisy $ python3 noisy.py --config config.json --log debug
    INFO:urllib3.connectionpool:Starting new HTTPS connection (1): www.fossil.com
    DEBUG:urllib3.connectionpool:"GET /us/en/account-dashboard/registered-products.html HTTP/1.1" 302 20
    Traceback (most recent call last):
      File "noisy.py", line 274, in <module>
        main()
      File "noisy.py", line 270, in main
        crawler.crawl()
      File "noisy.py", line 238, in crawl
        body = self._request(url).content
      File "noisy.py", line 49, in _request
        response = requests.get(url, headers=headers, timeout=5)
      File "/usr/lib/python3/dist-packages/requests/api.py", line 60, in get
        return request('get', url, **kwargs)
      File "/usr/lib/python3/dist-packages/requests/api.py", line 49, in request
        return session.request(method=method, url=url, **kwargs)
      File "/usr/lib/python3/dist-packages/requests/sessions.py", line 457, in request
        resp = self.send(prep, **send_kwargs)
      File "/usr/lib/python3/dist-packages/requests/sessions.py", line 595, in send
        history = [resp for resp in gen] if allow_redirects else []
      File "/usr/lib/python3/dist-packages/requests/sessions.py", line 595, in <listcomp>
        history = [resp for resp in gen] if allow_redirects else []
      File "/usr/lib/python3/dist-packages/requests/sessions.py", line 189, in resolve_redirects
        allow_redirects=False,
      File "/usr/lib/python3/dist-packages/requests/sessions.py", line 569, in send
        r = adapter.send(request, **kwargs)
      File "/usr/lib/python3/dist-packages/requests/adapters.py", line 329, in send
        conn = self.get_connection(request.url, proxies)
      File "/usr/lib/python3/dist-packages/requests/adapters.py", line 243, in get_connection
        conn = self.poolmanager.connection_from_url(url)
      File "/usr/lib/python3/dist-packages/urllib3/poolmanager.py", line 130, in connection_from_url
        u = parse_url(url)
      File "/usr/lib/python3/dist-packages/urllib3/util/url.py", line 143, in parse_url
        raise LocationParseError(url)
    urllib3.exceptions.LocationParseError: Failed to parse: www.fossil.com:-1

My view is that requests should not let an uncaught urllib3 error through. I will try to raise the issue with them, but will provide a temporary workaround for noisy in the meantime.
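The temporary workaround could look something like this (a sketch of the approach, not the actual commit): treat an unparseable redirect target like any other failed request and move on.

```python
import requests
from urllib3.exceptions import LocationParseError

def safe_get(url, headers=None, timeout=5):
    try:
        return requests.get(url, headers=headers, timeout=timeout)
    except (requests.exceptions.RequestException, LocationParseError):
        return None  # the caller skips this URL
```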

Crash: OpenSSL.SSL.ZeroReturnError raised in _raise_ssl_error (OpenSSL/SSL.py, line 851)

python -V
Python 2.7.9

INFO:urllib3.connectionpool:Starting new HTTPS connection (1): pages.ebay.at
Traceback (most recent call last):
File "noisy.py", line 274, in
main()
File "noisy.py", line 270, in main
crawler.crawl()
File "noisy.py", line 238, in crawl
self._browse_from_links()
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 179, in _browse_from_links
self._browse_from_links(depth + 1)
File "noisy.py", line 160, in _browse_from_links
sub_page = self._request(random_link).content
File "noisy.py", line 49, in _request
response = requests.get(url, headers=headers, timeout=5)
File "/usr/lib/python2.7/dist-packages/requests/api.py", line 60, in get
return request('get', url, **kwargs)
File "/usr/lib/python2.7/dist-packages/requests/api.py", line 49, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 457, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 606, in send
r.content
File "/usr/lib/python2.7/dist-packages/requests/models.py", line 724, in content
self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()
File "/usr/lib/python2.7/dist-packages/requests/models.py", line 653, in generate
for chunk in self.raw.stream(chunk_size, decode_content=True):
File "/usr/lib/python2.7/dist-packages/urllib3/response.py", line 256, in stream
data = self.read(amt=amt, decode_content=decode_content)
File "/usr/lib/python2.7/dist-packages/urllib3/response.py", line 186, in read
data = self._fp.read(amt)
File "/usr/lib/python2.7/httplib.py", line 602, in read
s = self.fp.read(amt)
File "/usr/lib/python2.7/socket.py", line 380, in read
data = self._sock.recv(left)
File "/usr/lib/python2.7/dist-packages/urllib3/contrib/pyopenssl.py", line 188, in recv
data = self.connection.recv(*args, **kwargs)
File "/usr/lib/python2.7/dist-packages/OpenSSL/SSL.py", line 995, in recv
self._raise_ssl_error(self._ssl, result)
File "/usr/lib/python2.7/dist-packages/OpenSSL/SSL.py", line 851, in _raise_ssl_error
raise ZeroReturnError()
OpenSSL.SSL.ZeroReturnError
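pyOpenSSL raises ZeroReturnError when a TLS connection is closed cleanly mid-read, and this old requests version lets it propagate. A sketch of catching it so one misbehaving server doesn't kill the crawl (a hypothetical helper, not noisy's actual code):

```python
import requests
from OpenSSL.SSL import ZeroReturnError  # requires pyOpenSSL

def fetch_content(url, headers=None, timeout=5):
    try:
        return requests.get(url, headers=headers, timeout=timeout).content
    except (requests.exceptions.RequestException, ZeroReturnError):
        return b''  # skip this page and keep crawling
```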

JSONDecodeError

  File "/opt/noisy/noisy.py", line 262, in <module>
    main()
  File "/opt/noisy/noisy.py", line 253, in main
    crawler.load_config_file(args.config)
  File "/opt/noisy/noisy.py", line 177, in load_config_file
    config = json.load(config_file)
  File "/usr/local/Cellar/[email protected]/3.9.7/Frameworks/Python.framework/Versions/3.9/lib/python3.9/json/__init__.py", line 293, in load
    return loads(fp.read(),
  File "/usr/local/Cellar/[email protected]/3.9.7/Frameworks/Python.framework/Versions/3.9/lib/python3.9/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "/usr/local/Cellar/[email protected]/3.9.7/Frameworks/Python.framework/Versions/3.9/lib/python3.9/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/local/Cellar/[email protected]/3.9.7/Frameworks/Python.framework/Versions/3.9/lib/python3.9/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 15 column 5 (char 360)```

The only thing I did was delete one or two URLs from the `config.json` file. Here is the relevant content:

```
{
    "max_depth": 25,
    "min_sleep": 3,
    "max_sleep": 6,
    "timeout": false,
    "root_urls": [
        "https://www.reddit.com",
        "https://www.yahoo.com",
        "http://www.cnn.com",
        "http://www.ebay.com",
        "https://wikipedia.org",
        "https://youtube.com",
        "https://github.com",
        "https://medium.com",
    ],
```
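The error location matches the excerpt: "Expecting value: line 15 column 5" points at the first character after the comma that follows "https://medium.com". JSON does not allow a trailing comma after the last element of an array, so removing that comma (left behind when the following URLs were deleted) should make the file parse again.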
