
teal33t / poopak

128 stars · 7 watchers · 32 forks · 17.56 MB

POOPAK - TOR Hidden Service Crawler

Python 50.11% CSS 2.35% JavaScript 12.92% HTML 34.39% Dockerfile 0.23%
osint deepweb crawler redis mongo flask docker tor hidden-services darknet

poopak's Introduction

  • I'm looking for open-source developers to work together on PoopakV2. If you are interested, let's talk! -> [email protected]
  • This repository, Poopak, is no longer usable.

POOPAK | TOR Hidden Service Crawler


Screenshot

This is an experimental application for crawling, scanning, and gathering data from TOR hidden services.

Features

  • Multi-level, in-depth crawling using cURL
  • Link extractor
  • Extracts email, BTC, ETH, and XMR addresses
  • Extracts EXIF metadata
  • Screenshots (using Splash)
  • Subject detection (using spaCy)
  • Port scanner
  • Extracts reports from a hidden service (CSV/PDF)
  • Full-text search through the directory
  • Language detection
  • Web application security scanning (using Arachni) - [under development]
  • Docker-based, with a web UI

License

This software is made available under the GPL v3 license. This means that if you run a modified version of the program on a server and let other users communicate with it there, your server must also allow them to download the source code of the modified version running there.

Dependencies

  • Docker (tested on Docker version 18.03.1)
  • Docker Compose (tested on version 1.21.1)
  • Python 3
  • pipenv

Install

Just run the application with Docker Compose:

docker-compose up -d

then point your browser to localhost.

Discontinued

poopak's People

Contributors

saman-dev-hyp, teal33t


poopak's Issues

Total found 0

It installed successfully, but when searching I only get "total found 0".
Do I need to do anything else to make it run? Please help.

TypeError: a bytes-like object is required, not 'str'

Hello,

could anybody help with this issue?

Thx a lot
Marcus

docker-compose logs -f:

web-app | [2021-01-27 11:25:13 +0000] [10] [DEBUG] POST /new/
crawler-worker | 11:25:13 high: crawler.run('http://p6o7m73ujalhgkiv.onion') (276dba5f-d77f-4352-91de-6fb1b728c2d2)
web-app | [2021-01-27 11:25:13 +0000] [10] [DEBUG] GET /favicon.ico
crawler-worker | 11:25:21 Traceback (most recent call last):
crawler-worker | File "/usr/local/lib/python3.6/site-packages/rq/worker.py", line 975, in perform_job
crawler-worker | rv = job.perform()
crawler-worker | File "/usr/local/lib/python3.6/site-packages/rq/job.py", line 696, in perform
crawler-worker | self._result = self._execute()
crawler-worker | File "/usr/local/lib/python3.6/site-packages/rq/job.py", line 719, in _execute
crawler-worker | return self.func(*self.args, **self.kwargs)
crawler-worker | File "/application/crawler/init.py", line 10, in run
crawler-worker | spider.proccess()
crawler-worker | File "/application/crawler/spider.py", line 73, in proccess
crawler-worker | 'links': data.get_links(),
crawler-worker | File "/application/crawler/html_extractors.py", line 20, in get_links
crawler-worker | is_onion = True if '.onion' in parsed_href.netloc else False
crawler-worker | TypeError: a bytes-like object is required, not 'str'
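
A likely cause: the href handed to urlparse() is bytes, so parsed_href.netloc is bytes and the 'in' test against the str '.onion' raises exactly this TypeError. A minimal hedged sketch of a fix for crawler/html_extractors.py (variable names follow the traceback; the sample href is illustrative):

from urllib.parse import urlparse

href = b'http://p6o7m73ujalhgkiv.onion/page'  # assumption: hrefs may arrive as bytes
if isinstance(href, bytes):
    # Decode before parsing so parsed_href.netloc is a str.
    href = href.decode('utf-8', errors='replace')
parsed_href = urlparse(href)
is_onion = '.onion' in parsed_href.netloc  # the 'True if ... else False' wrapper is redundant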

Adding onion removes ':' from http://somesite.onion and results in 503

Got it running, but when I attempt to add a site for poopak to crawl, it removes the colon (:) from the site address, which then causes poopak to fail to crawl the site, always returning a 503 error code. So instead of retrieving and crawling http://somesite.onion, poopak attempts to retrieve and crawl http//somesite.onion, which is not a valid address at all. A small bug, but it creates a huge problem. I have tried using 'http:/somesite.onion', which did not work.
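
This looks like an over-aggressive input sanitizer stripping punctuation from the submitted address. A hedged sketch of a scheme-preserving normalizer (the function name and validation rules are illustrative, not poopak's actual code):

from urllib.parse import urlparse

def normalize_onion_url(raw):
    # Trim whitespace only; never strip characters out of the scheme separator.
    url = raw.strip()
    if '://' not in url:
        url = 'http://' + url  # assume plain HTTP when no scheme is given
    parsed = urlparse(url)
    if parsed.scheme not in ('http', 'https') or not parsed.netloc.endswith('.onion'):
        raise ValueError('not a valid onion URL: %r' % raw)
    return parsed.geturl()

print(normalize_onion_url('somesite.onion'))  # -> http://somesite.onion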

"ERROR Requested page not found" what ever I'm searching for

Hi there,
knowing that this is a discontinued project, but I'd love to see it working. ;-)
Just to ask if anybody has ever solved the issue with "ERROR Requested page not found" what ever I'm searching for in the web gui.

Thx so much for any help.

Cheers
Marcus

docker-compose logs -f:
web-app | [2021-01-27 11:04:30 +0000] [10] [DEBUG] GET /
web-app | [2021-01-27 11:04:35 +0000] [10] [DEBUG] POST /
web-app | [2021-01-27 11:04:35 +0000] [10] [DEBUG] GET /search/http://ekbgzchl6x2ias37.onion//
web-app | [2021-01-27 11:04:35 +0000] [10] [DEBUG] GET /favicon.ico

Localhost shows 502 Bad Gateway nginx/1.15.2

All services are running OK, but there is no response on localhost.
Please advise.
hk@hk15ISK:~/poopak$ sudo docker-compose restart
Restarting poopak_server_1 ... done
Restarting web-app ... done
Restarting app-worker ... done
Restarting panel-worker ... done
Restarting detector-worker ... done
Restarting crawler-worker ... done
Restarting poopak_mongoclient_1 ... done
Restarting redis ... done
Restarting poopak_spacy_1 ... done
Restarting poopak_torpool_1 ... done
Restarting poopak_splash_1 ... done
Restarting poopak_mongodb_1 ... done

Fix MongoDB start error: FileNotOpen: Failed to open "/opt/bitnami/mongodb/logs/mongodb.log"

Error

The most recent version of MongoDB fails to start on docker-compose up -d with the error:
FileNotOpen: Failed to open "/opt/bitnami/mongodb/logs/mongodb.log"

Fix

After some tinkering, a simple fix was found: create a local volume inside the CWD to hold the log file, and create the log file itself.

  1. In your shell:
    mkdir -p logs && touch logs/mongodb.log

  2. Add the volume to docker-compose.yml below the bitnami volume:

volumes:
  - .:/bitnami
  - ./logs:/opt/bitnami/mongodb/logs

  3. You may now create the containers and start the application:
    docker-compose up -d

Problems with crawler

Hi,

When I add a new domain using the form, I see the following error in the log:

crawler-worker | 23:21:23 high: crawler.run('http://xmh57jrzrnw6insl.onion') (b844c2ef-eeb4-4b8c-9be4-b1550395d45e)
crawler-worker | pbkdf2:sha256:50000$vUgmdFna$5794e060253f569ef325eef9b64ea60aa0413728c8b52eea958751d8b7d61e6f
crawler-worker | User already present in DB.
crawler-worker | Traceback (most recent call last):
crawler-worker | File "manage.py", line 55, in
crawler-worker | manager.run()
crawler-worker | File "/usr/local/lib/python3.6/site-packages/flask_script/init.py", line 417, in run
crawler-worker | result = self.handle(argv[0], argv[1:])
crawler-worker | File "/usr/local/lib/python3.6/site-packages/flask_script/init.py", line 386, in handle
crawler-worker | res = handle(*args, **config)
crawler-worker | File "/usr/local/lib/python3.6/site-packages/flask_script/commands.py", line 216, in call
crawler-worker | return self.run(*args, **kwargs)
crawler-worker | File "manage.py", line 51, in run_crawler_worker
crawler-worker | worker.work()
crawler-worker | File "/usr/local/lib/python3.6/site-packages/rq/worker.py", line 493, in work
crawler-worker | self.execute_job(job, queue)
crawler-worker | File "/usr/local/lib/python3.6/site-packages/rq/worker.py", line 662, in execute_job
crawler-worker | self.fork_work_horse(job, queue)
crawler-worker | File "/usr/local/lib/python3.6/site-packages/rq/worker.py", line 599, in fork_work_horse
crawler-worker | self.main_work_horse(job, queue)
crawler-worker | File "/usr/local/lib/python3.6/site-packages/rq/worker.py", line 677, in main_work_horse
crawler-worker | success = self.perform_job(job, queue)
crawler-worker | File "/usr/local/lib/python3.6/site-packages/rq/worker.py", line 781, in perform_job
crawler-worker | self.prepare_job_execution(job)
crawler-worker | File "/usr/local/lib/python3.6/site-packages/rq/worker.py", line 706, in prepare_job_execution
crawler-worker | registry.add(job, timeout, pipeline=pipeline)
crawler-worker | File "/usr/local/lib/python3.6/site-packages/rq/registry.py", line 47, in add
crawler-worker | return pipeline.zadd(self.key, score, job.id)
crawler-worker | File "/usr/local/lib/python3.6/site-packages/redis/client.py", line 2263, in zadd
crawler-worker | for pair in iteritems(mapping):
crawler-worker | File "/usr/local/lib/python3.6/site-packages/redis/_compat.py", line 123, in iteritems
crawler-worker | return iter(x.items())
crawler-worker | AttributeError: 'int' object has no attribute 'items'

After this, I don't see the domain added and nothing more happens. Can you help me?

Regards
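
This AttributeError is a known version skew between rq and redis-py: older rq calls pipeline.zadd(key, score, job_id) with the score as a positional int, while redis-py 3.0 changed zadd() to expect a mapping, so the int lands where a dict is expected and iteritems() fails. A hedged workaround is to pin redis-py below 3.0 inside the image (the exact pin is an assumption; verify against the project's requirements.txt, then rebuild):

pip install 'redis<3.0'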

issue with mongodb

Hi,

Hope you are all well!

When I start docker-compose up, I see the following errors, and localhost returns a 502 error.

panel-worker       | Traceback (most recent call last):
panel-worker       |   File "manage.py", line 4, in <module>
panel-worker       |     from web import app
panel-worker       |   File "/application/web/__init__.py", line 52, in <module>
panel-worker       |     client.crawler.users.insert({"_id": "admin", "password": pass_hash})
panel-worker       |   File "/usr/local/lib/python3.6/site-packages/pymongo/collection.py", line 3182, in insert
panel-worker       |     check_keys, manipulate, write_concern)
panel-worker       |   File "/usr/local/lib/python3.6/site-packages/pymongo/collection.py", line 612, in _insert
panel-worker       |     bypass_doc_val, session)
panel-worker       |   File "/usr/local/lib/python3.6/site-packages/pymongo/collection.py", line 600, in _insert_one
panel-worker       |     acknowledged, _insert_command, session)
panel-worker       |   File "/usr/local/lib/python3.6/site-packages/pymongo/mongo_client.py", line 1492, in _retryable_write
panel-worker       |     return self._retry_with_session(retryable, func, s, None)
panel-worker       |   File "/usr/local/lib/python3.6/site-packages/pymongo/mongo_client.py", line 1378, in _retry_with_session
panel-worker       |     with self._get_socket(server, session) as sock_info:
panel-worker       |   File "/usr/local/lib/python3.6/contextlib.py", line 82, in __enter__
panel-worker       |     return next(self.gen)
panel-worker       |   File "/usr/local/lib/python3.6/site-packages/pymongo/mongo_client.py", line 1223, in _get_socket
panel-worker       |     self.__all_credentials, checkout=exhaust) as sock_info:
panel-worker       |   File "/usr/local/lib/python3.6/contextlib.py", line 82, in __enter__
panel-worker       |     return next(self.gen)
panel-worker       |   File "/usr/local/lib/python3.6/site-packages/pymongo/pool.py", line 1128, in get_socket
panel-worker       |     sock_info.check_auth(all_credentials)
panel-worker       |   File "/usr/local/lib/python3.6/site-packages/pymongo/pool.py", line 712, in check_auth
panel-worker       |     auth.authenticate(credentials, self)
panel-worker       |   File "/usr/local/lib/python3.6/site-packages/pymongo/auth.py", line 564, in authenticate
panel-worker       |     auth_func(credentials, sock_info)
panel-worker       |   File "/usr/local/lib/python3.6/site-packages/pymongo/auth.py", line 539, in _authenticate_default
panel-worker       |     return _authenticate_scram(credentials, sock_info, 'SCRAM-SHA-1')
panel-worker       |   File "/usr/local/lib/python3.6/site-packages/pymongo/auth.py", line 263, in _authenticate_scram
panel-worker       |     res = sock_info.command(source, cmd)
panel-worker       |   File "/usr/local/lib/python3.6/site-packages/pymongo/pool.py", line 613, in command
panel-worker       |     user_fields=user_fields)
panel-worker       |   File "/usr/local/lib/python3.6/site-packages/pymongo/network.py", line 167, in command
panel-worker       |     parse_write_concern_error=parse_write_concern_error)
panel-worker       |   File "/usr/local/lib/python3.6/site-packages/pymongo/helpers.py", line 159, in _check_command_response
panel-worker       |     raise OperationFailure(msg % errmsg, code, response)
panel-worker       | pymongo.errors.OperationFailure: Authentication failed.
crawler-worker     | [... identical traceback to panel-worker ...]
crawler-worker     | pymongo.errors.OperationFailure: Authentication failed.
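
Authentication failures like this often mean the persisted MongoDB volume still holds credentials from an earlier run that no longer match what the app is sending. A hedged recovery step, assuming the existing crawl data can be discarded, is to recreate the stack together with its volumes:

docker-compose down -v
docker-compose up -d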

crawler-worker is not building

It shows this, and I don't understand how to get rid of it:

root@Prathik:~/poopak# docker-compose up -d
Building crawler-worker
Step 1/8 : FROM python:3.6.1
---> 955d0c3b1bb2
Step 2/8 : COPY . /application
---> Using cache
---> 1faf9b0a0b67
Step 3/8 : ENV HOME=/application
---> Using cache
---> 758175807750
Step 4/8 : WORKDIR /application
---> Using cache
---> eb6f945df5d6
Step 5/8 : RUN pip install -r requirements.txt
---> Running in 0a447b005ded
standard_init_linux.go:207: exec user process caused "exec format error"
ERROR: Service 'crawler-worker' failed to build: The command '/bin/sh -c pip install -r requirements.txt' returned a non-zero code: 1
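
The "exec format error" at the RUN step usually means the pulled base image targets a different CPU architecture than the host (for example, an amd64 python:3.6.1 image on an ARM machine). A hedged sketch, assuming a Compose file version that supports the platform key and that qemu/binfmt emulation is available, is to pin the service to the image's architecture in docker-compose.yml; otherwise, build on a matching amd64 host:

crawler-worker:
  platform: linux/amd64   # service definition abridged; only this key is the suggestion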

Add a new onion service

Hi,

I want to use your tool, but I don't really understand what you mean by "Add a new onion service".
Can you give me an example?

Thanks a lot.

503 on hidden services

Hey, we have an issue with some hidden services.

The Tor service is running.
Installation of poopak went smoothly, without errors; however, when we try to perform a scan on certain onion links, the response is always 503 even though the service is active.
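
To rule out the Tor circuit itself, it can help to fetch the same onion address directly through a SOCKS proxy with curl (the proxy host and port are assumptions; they depend on how the torpool container exposes its SOCKS listener):

curl -x socks5h://127.0.0.1:9050 http://somesite.onion

If that also fails, the problem is upstream of poopak.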

Error: "Host mongodb not yet available"

Hello teal33t, first of all thank you very much for the great work you're doing with this tool.

Could you please help me with some startup issues I'm having?
The logs are filling up with web-app "Host mongodb not yet availabile" messages.

Please find the log files related to mongodb below.

Thanks a lot for your help
Marcus

root@ache:/opt/poopak# docker-compose logs -f | grep mongo
Attaching to poopak_server_1_a46b562987fb, web-app, detector-worker, crawler-worker, poopak_mongoclient_1_74b54af33377, panel-worker, app-worker, poopak_mongodb_1_819e1089f4f6, poopak_spacy_1_ee356f861f15, poopak_torpool_1_d5cfd74dcf12, redis, poopak_splash_1_5ccea6e80a73
web-app | Checking availability of mongodb
web-app | Host mongodb not yet availabile
web-app | [the two lines above repeat dozens of times]
mongoclient_1_74b54af33377 |
mongoclient_1_74b54af33377 | [-] External MONGO_URL not found. Starting local MongoDB...
mongoclient_1_74b54af33377 |
mongoclient_1_74b54af33377 | => Starting app on port 3000...
mongoclient_1_74b54af33377 | 2019-01-13T14:29:22.527+0000 I CONTROL [initandlisten] MongoDB starting : pid=7 port=27017 dbpath=/data/db 64-bit host=71aac4453524
mongoclient_1_74b54af33377 | 2019-01-13T14:29:22.527+0000 I CONTROL [initandlisten] db version v3.4.2
mongoclient_1_74b54af33377 | 2019-01-13T14:29:22.527+0000 I CONTROL [initandlisten] git version: 3f76e40c105fc223b3e5aac3e20dcd026b83b38b
mongoclient_1_74b54af33377 | 2019-01-13T14:29:22.527+0000 I CONTROL [initandlisten] OpenSSL version: OpenSSL 1.0.1t 3 May 2016
mongoclient_1_74b54af33377 | 2019-01-13T14:29:22.527+0000 I CONTROL [initandlisten] allocator: tcmalloc
mongoclient_1_74b54af33377 | 2019-01-13T14:29:22.527+0000 I CONTROL [initandlisten] modules: none
mongoclient_1_74b54af33377 | 2019-01-13T14:29:22.527+0000 I CONTROL [initandlisten] build environment:
mongoclient_1_74b54af33377 | 2019-01-13T14:29:22.527+0000 I CONTROL [initandlisten] distmod: debian81
mongoclient_1_74b54af33377 | 2019-01-13T14:29:22.527+0000 I CONTROL [initandlisten] distarch: x86_64
mongoclient_1_74b54af33377 | 2019-01-13T14:29:22.527+0000 I CONTROL [initandlisten] target_arch: x86_64
mongoclient_1_74b54af33377 | 2019-01-13T14:29:22.527+0000 I CONTROL [initandlisten] options: { storage: { engine: "wiredTiger" } }
mongoclient_1_74b54af33377 | 2019-01-13T14:29:22.533+0000 I STORAGE [initandlisten] wiredtiger_open config: create,cache_size=1448M,session_max=20000,eviction=(threads_max=4),config_base=false,statistics=(fast),log=(enabled=true,archive=true,path=journal,compressor=snappy),file_manager=(close_idle_time=100000),checkpoint=(wait=60,log_size=2GB),statistics_log=(wait=0),
mongoclient_1_74b54af33377 | 2019-01-13T14:29:22.712+0000 I CONTROL [initandlisten]
mongoclient_1_74b54af33377 | 2019-01-13T14:29:22.712+0000 I CONTROL [initandlisten] ** WARNING: Access control is not enabled for the database.
mongoclient_1_74b54af33377 | 2019-01-13T14:29:22.712+0000 I CONTROL [initandlisten] ** Read and write access to data and configuration is unrestricted.
mongoclient_1_74b54af33377 | 2019-01-13T14:29:22.712+0000 I CONTROL [initandlisten]
mongoclient_1_74b54af33377 | 2019-01-13T14:29:22.753+0000 I FTDC [initandlisten] Initializing full-time diagnostic data capture with directory '/data/db/diagnostic.data'
mongoclient_1_74b54af33377 | 2019-01-13T14:29:22.813+0000 I INDEX [initandlisten] build index on: admin.system.version properties: { v: 2, key: { version: 1 }, name: "incompatible_with_version_32", ns: "admin.system.version" }
mongoclient_1_74b54af33377 | 2019-01-13T14:29:22.813+0000 I INDEX [initandlisten] building index using bulk method; build may temporarily use up to 500 megabytes of RAM
mongoclient_1_74b54af33377 | 2019-01-13T14:29:22.813+0000 I INDEX [initandlisten] build index done. scanned 0 total records. 0 secs
mongoclient_1_74b54af33377 | 2019-01-13T14:29:22.814+0000 I COMMAND [initandlisten] setting featureCompatibilityVersion to 3.4
mongoclient_1_74b54af33377 | 2019-01-13T14:29:22.814+0000 I NETWORK [thread1] waiting for connections on port 27017
mongoclient_1_74b54af33377 | 2019-01-13T14:29:23.450+0000 I NETWORK [thread1] connection accepted from 127.0.0.1:44402 #1 (1 connection now open)
mongoclient_1_74b54af33377 | 2019-01-13T14:29:23.455+0000 I NETWORK [conn1] received client metadata from 127.0.0.1:44402 conn1: { driver: { name: "nodejs", version: "3.0.11" }, os: { type: "Linux", name: "linux", architecture: "x64", version: "4.15.0-39-generic" }, platform: "Node.js v8.4.0, LE, mongodb-core: 3.0.11" }
mongoclient_1_74b54af33377 | {"level":"info","message":"[insert-default-settings]"}
mongoclient_1_74b54af33377 | {"level":"info","message":"[migrate-connections]"}
mongodb_1_819e1089f4f6 |
mongodb_1_819e1089f4f6 | Welcome to the Bitnami mongodb container
mongodb_1_819e1089f4f6 | Subscribe to project updates by watching https://github.com/bitnami/bitnami-docker-mongodb
mongodb_1_819e1089f4f6 | Submit issues and feature requests at https://github.com/bitnami/bitnami-docker-mongodb/issues
mongodb_1_819e1089f4f6 |
mongodb_1_819e1089f4f6 | nami INFO Initializing mongodb
mongodb_1_819e1089f4f6 | mongodb INFO ==> Deploying MongoDB with persisted data...
mongodb_1_819e1089f4f6 | mongodb INFO ==> No injected configuration files found. Creating default config files...
mongodb_1_819e1089f4f6 | mongodb INFO
mongodb_1_819e1089f4f6 | mongodb INFO ########################################################################
mongodb_1_819e1089f4f6 | mongodb INFO Installation parameters for mongodb:
mongodb_1_819e1089f4f6 | mongodb INFO Persisted data and properties have been restored.
mongodb_1_819e1089f4f6 | mongodb INFO Any input specified will not take effect.
mongodb_1_819e1089f4f6 | mongodb INFO This installation requires no credentials.
mongodb_1_819e1089f4f6 | mongodb INFO ########################################################################
mongodb_1_819e1089f4f6 | mongodb INFO
mongodb_1_819e1089f4f6 | nami INFO mongodb successfully initialized
mongodb_1_819e1089f4f6 | INFO ==> Starting mongodb...
mongodb_1_819e1089f4f6 | INFO ==> Starting mongod...
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:21.083+0000 I CONTROL [main] Automatically disabling TLS 1.0, to force-enable TLS 1.0 specify --sslDisabledProtocols 'none'
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:21.097+0000 I CONTROL [initandlisten] MongoDB starting : pid=43 port=27017 dbpath=/opt/bitnami/mongodb/data/db 64-bit host=mongodb
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:21.097+0000 I CONTROL [initandlisten] db version v4.0.2
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:21.097+0000 I CONTROL [initandlisten] git version: fc1573ba18aee42f97a3bb13b67af7d837826b47
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:21.097+0000 I CONTROL [initandlisten] OpenSSL version: OpenSSL 1.1.0f 25 May 2017
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:21.097+0000 I CONTROL [initandlisten] allocator: tcmalloc
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:21.097+0000 I CONTROL [initandlisten] modules: none
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:21.097+0000 I CONTROL [initandlisten] build environment:
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:21.097+0000 I CONTROL [initandlisten] distmod: debian92
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:21.097+0000 I CONTROL [initandlisten] distarch: x86_64
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:21.097+0000 I CONTROL [initandlisten] target_arch: x86_64
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:21.097+0000 I CONTROL [initandlisten] options: { config: "/opt/bitnami/mongodb/conf/mongodb.conf", net: { bindIpAll: true, ipv6: true, port: 27017, unixDomainSocket: { enabled: true, pathPrefix: "/opt/bitnami/mongodb/tmp" } }, processManagement: { fork: false, pidFilePath: "/opt/bitnami/mongodb/tmp/mongodb.pid" }, security: { authorization: "disabled" }, setParameter: { enableLocalhostAuthBypass: "true" }, storage: { dbPath: "/opt/bitnami/mongodb/data/db", journal: { enabled: true } }, systemLog: { destination: "file", logAppend: true, logRotate: "reopen", path: true } }
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:21.097+0000 I STORAGE [initandlisten] Detected data files in /opt/bitnami/mongodb/data/db created by the 'wiredTiger' storage engine, so setting the active storage engine to 'wiredTiger'.
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:21.097+0000 I STORAGE [initandlisten]
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:21.097+0000 I STORAGE [initandlisten] ** WARNING: Using the XFS filesystem is strongly recommended with the WiredTiger storage engine
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:21.097+0000 I STORAGE [initandlisten] ** See http://dochub.mongodb.org/core/prodnotes-filesystem
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:21.097+0000 I STORAGE [initandlisten] wiredtiger_open config: create,cache_size=1448M,session_max=20000,eviction=(threads_min=4,threads_max=4),config_base=false,statistics=(fast),log=(enabled=true,archive=true,path=journal,compressor=snappy),file_manager=(close_idle_time=100000),statistics_log=(wait=0),verbose=(recovery_progress),
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:21.755+0000 I STORAGE [initandlisten] WiredTiger message [1547389761:755659][43:0x7fba684ba9c0], txn-recover: Main recovery loop: starting at 6/11008
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:21.887+0000 I STORAGE [initandlisten] WiredTiger message [1547389761:887494][43:0x7fba684ba9c0], txn-recover: Recovering log 6 through 7
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:21.956+0000 I STORAGE [initandlisten] WiredTiger message [1547389761:955986][43:0x7fba684ba9c0], txn-recover: Recovering log 7 through 7
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:22.024+0000 I STORAGE [initandlisten] WiredTiger message [1547389762:24283][43:0x7fba684ba9c0], txn-recover: Set global recovery timestamp: 0
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:22.079+0000 I RECOVERY [initandlisten] WiredTiger recoveryTimestamp. Ts: Timestamp(0, 0)
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:22.099+0000 I CONTROL [initandlisten] ** WARNING: You are running this process as the root user, which is not recommended.
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:22.099+0000 I CONTROL [initandlisten]
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:22.114+0000 I FTDC [initandlisten] Initializing full-time diagnostic data capture with directory '/opt/bitnami/mongodb/data/db/diagnostic.data'
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:22.118+0000 I NETWORK [initandlisten] waiting for connections on port 27017
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:24.901+0000 I NETWORK [listener] connection accepted from 172.21.0.7:56378 #1 (1 connection now open)
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:24.901+0000 I NETWORK [conn1] received client metadata from 172.21.0.7:56378 conn1: { driver: { name: "PyMongo", version: "3.7.2" }, os: { type: "Linux", name: "Linux", architecture: "x86_64", version: "4.15.0-39-generic" }, platform: "CPython 3.6.1.final.0" }
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:24.966+0000 I NETWORK [listener] connection accepted from 172.21.0.7:56380 #2 (2 connections now open)
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:24.967+0000 I NETWORK [conn2] received client metadata from 172.21.0.7:56380 conn2: { driver: { name: "PyMongo", version: "3.7.2" }, os: { type: "Linux", name: "Linux", architecture: "x86_64", version: "4.15.0-39-generic" }, platform: "CPython 3.6.1.final.0" }
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:24.990+0000 I ACCESS [conn2] Successfully authenticated as principal admin on crawler
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:25.418+0000 I NETWORK [listener] connection accepted from 172.21.0.10:44214 #3 (3 connections now open)
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:25.418+0000 I NETWORK [conn3] received client metadata from 172.21.0.10:44214 conn3: { driver: { name: "PyMongo", version: "3.7.2" }, os: { type: "Linux", name: "Linux", architecture: "x86_64", version: "4.15.0-39-generic" }, platform: "CPython 3.6.1.final.0" }
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:25.456+0000 I NETWORK [listener] connection accepted from 172.21.0.10:44216 #4 (4 connections now open)
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:25.456+0000 I NETWORK [conn4] received client metadata from 172.21.0.10:44216 conn4: { driver: { name: "PyMongo", version: "3.7.2" }, os: { type: "Linux", name: "Linux", architecture: "x86_64", version: "4.15.0-39-generic" }, platform: "CPython 3.6.1.final.0" }
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:25.473+0000 I ACCESS [conn4] Successfully authenticated as principal admin on crawler
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:25.940+0000 I NETWORK [listener] connection accepted from 172.21.0.8:40190 #5 (5 connections now open)
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:25.941+0000 I NETWORK [conn5] received client metadata from 172.21.0.8:40190 conn5: { driver: { name: "PyMongo", version: "3.7.2" }, os: { type: "Linux", name: "Linux", architecture: "x86_64", version: "4.15.0-39-generic" }, platform: "CPython 3.6.1.final.0" }
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:25.978+0000 I NETWORK [listener] connection accepted from 172.21.0.8:40192 #6 (6 connections now open)
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:25.978+0000 I NETWORK [conn6] received client metadata from 172.21.0.8:40192 conn6: { driver: { name: "PyMongo", version: "3.7.2" }, os: { type: "Linux", name: "Linux", architecture: "x86_64", version: "4.15.0-39-generic" }, platform: "CPython 3.6.1.final.0" }
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:25.996+0000 I ACCESS [conn6] Successfully authenticated as principal admin on crawler
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:26.525+0000 I NETWORK [listener] connection accepted from 172.21.0.11:60442 #7 (7 connections now open)
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:26.525+0000 I NETWORK [conn7] received client metadata from 172.21.0.11:60442 conn7: { driver: { name: "PyMongo", version: "3.7.2" }, os: { type: "Linux", name: "Linux", architecture: "x86_64", version: "4.15.0-39-generic" }, platform: "CPython 3.6.1.final.0" }
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:26.561+0000 I NETWORK [listener] connection accepted from 172.21.0.11:60444 #8 (8 connections now open)
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:26.562+0000 I NETWORK [conn8] received client metadata from 172.21.0.11:60444 conn8: { driver: { name: "PyMongo", version: "3.7.2" }, os: { type: "Linux", name: "Linux", architecture: "x86_64", version: "4.15.0-39-generic" }, platform: "CPython 3.6.1.final.0" }
mongodb_1_819e1089f4f6 | 2019-01-13T14:29:26.585+0000 I ACCESS [conn8] Successfully authenticated as principal admin on crawler
web-app | Host mongodb not yet availabile
[... this goes on for ever]
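
Notably, the mongodb log above shows mongod accepting connections and even authenticating clients while web-app keeps looping, so the availability probe itself may be at fault. A typical "wait for host" probe just retries a TCP connect; if it loops forever, it may be checking the wrong hostname or port. A minimal hedged sketch of such a probe (poopak's actual check script is not shown here; host and port are assumptions):

import socket
import time

def wait_for(host='mongodb', port=27017, delay=2):
    # Retry a plain TCP connect until the host accepts connections.
    while True:
        try:
            with socket.create_connection((host, port), timeout=2):
                print('Host %s available' % host)
                return
        except OSError:
            print('Host %s not yet available' % host)
            time.sleep(delay)

wait_for()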

Unable to start the web-app docker image

Hello,

Any idea why my web-app is failing?

web-app exited with code 2
web-app | Usage: flask run [OPTIONS]
web-app | Try 'flask run --help' for help.
web-app |
web-app | Error: While importing 'application.web', an ImportError was raised:
web-app |
web-app | Traceback (most recent call last):
web-app | File "/usr/local/lib/python3.8/site-packages/flask/cli.py", line 219, in locate_app
web-app | __import__(module_name)
web-app | File "/application/web/__init__.py", line 2, in <module>
web-app | import flask_login
web-app | File "/usr/local/lib/python3.8/site-packages/flask_login/__init__.py", line 12, in <module>
web-app | from .login_manager import LoginManager
web-app | File "/usr/local/lib/python3.8/site-packages/flask_login/login_manager.py", line 33, in <module>
web-app | from .utils import _create_identifier
web-app | File "/usr/local/lib/python3.8/site-packages/flask_login/utils.py", line 14, in <module>
web-app | from werkzeug.urls import url_decode
web-app | ImportError: cannot import name 'url_decode' from 'werkzeug.urls' (/usr/local/lib/python3.8/site-packages/werkzeug/urls.py)
web-app |
web-app exited with code 2
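
This is a version skew between Flask-Login and Werkzeug: newer Werkzeug releases removed url_decode from werkzeug.urls, while the installed Flask-Login still imports it. A hedged fix is to hold Werkzeug back, or upgrade Flask-Login to a release that no longer imports url_decode (exact pins are assumptions; verify against the project's requirements.txt, then rebuild the image):

pip install 'werkzeug<3.0'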

Getting 502 Bad Gateway

I would like some help setting up poopak V1 on Docker for Windows. I installed it from a PowerShell terminal and everything went fine, but upon opening localhost I get "502 Bad Gateway". Any steps to follow if this happens? I'm fairly new to Docker and can't find any solutions so far.
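
A 502 from the bundled nginx generally means the upstream web-app container exited or is not listening yet. A reasonable first check uses standard Docker Compose commands, nothing poopak-specific:

docker-compose ps
docker-compose logs web-app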
