
Comments (11)

kuatroka commented on August 26, 2024

I have just finished installing the app locally. This time it fared better. I can now see the frontend, but it still seems to need some ironing out (probably on my part), and I'm struggling with these errors.

  1. The containers don't run, and restarting them doesn't help.

  2. Here is the full log since running docker-compose up. (Since it's a lot of lines, please let me know if you'd rather I not post it here in the future.)

My logs:

docker-compose up
Pulling pgadmin (dpage/pgadmin4:)...
latest: Pulling from dpage/pgadmin4
a0d0a0d46f8b: Already exists
2260de48639f: Pull complete
8a7bd2cec0d6: Pull complete
3df70e78fff9: Pull complete
72ce4104debb: Pull complete
2f4a5f004843: Pull complete
850fb29807a6: Pull complete
acbad1ac363a: Pull complete
Digest: sha256:e8d18f941264a82c6fbe81ce60503f2b00823a36e571cd383ca1f462b578f691
9a8d0188e481: Pull complete
8a3f5c4e0176: Pull complete
51a20dbe2f6a: Pull complete
0aacff13b8d7: Pull complete
adc9264cf133: Pull complete
Digest: sha256:68d4030e07912c418332ba6fdab4ac69f0293d9b1daaed4f1f77bdeb0a5eb048
Status: Downloaded newer image for redis:6.0.9-alpine
Building frontend
[+] Building 2.4s (9/9) FINISHED
=> [internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 108B 0.0s
=> [internal] load .dockerignore 0.0s
=> => transferring context: 52B 0.0s
=> [internal] load metadata for docker.io/library/node:10-alpine 1.9s
=> [auth] library/node:pull token for registry-1.docker.io 0.0s
=> [internal] load build context 0.1s
=> => transferring context: 804.21kB 0.1s
=> [1/3] FROM docker.io/library/node:10-alpine@sha256:dc98dac24efd4254f75976c40bce469446 0.0s
=> CACHED [2/3] WORKDIR /app/ 0.0s
=> [3/3] COPY . . 0.2s
=> exporting to image 0.0s
=> => exporting layers 0.0s
=> => writing image sha256:35c0ee9c99a879001f4aca9ef747a57b7dc37f913aa0872506b9f32d3ef2c 0.0s
=> => naming to docker.io/library/sec-filings-app-develop_frontend 0.0s

Use 'docker scan' to run Snyk tests against images to find vulnerabilities and learn how to fix them
WARNING: Image for service frontend was built because it did not already exist. To rebuild this image you must use docker-compose build or docker-compose up --build.
Building backend
[+] Building 123.3s (15/15) FINISHED
=> [internal] load build definition from Dockerfile.dev 0.0s
=> => transferring dockerfile: 437B 0.0s
=> [internal] load .dockerignore 0.0s
=> => transferring context: 123B 0.0s
=> [internal] load metadata for docker.io/library/python:3.9 2.6s
=> [auth] library/python:pull token for registry-1.docker.io 0.0s
=> [1/9] FROM docker.io/library/python:3.9@sha256:5caa9a0f034e56693e096ac4562097d9167a2 49.9s
=> => resolve docker.io/library/python:3.9@sha256:5caa9a0f034e56693e096ac4562097d9167a25 0.0s
=> => sha256:5caa9a0f034e56693e096ac4562097d9167a2509d22fb970cb4a2b5465a 2.60kB / 2.60kB 0.0s
=> => sha256:d32e17419b7ee61bbd89c2f0d2833a99cf45e594257d15cb567e4cf77 10.87MB / 10.87MB 5.1s
=> => sha256:672ecfe307c07d950d0aaad3b6e5ef3c0a179bd40e501cdd027e6f27011 2.22kB / 2.22kB 0.0s
=> => sha256:e2d7fd224b9cd792b4dae4af48bcb91611a24cd580ae1db91fd0a56db1d 8.60kB / 8.60kB 0.0s
=> => sha256:bb7d5a84853b217ac05783963f12b034243070c1c9c8d2e60ada4744 54.92MB / 54.92MB 23.3s
=> => sha256:f02b617c6a8c415a175f44d7e2c5d3b521059f2a6112c5f022e005a44a7 5.15MB / 5.15MB 3.1s
=> => sha256:c9d2d81226a43a97871acd5afb7e8aabfad4d6b62ae1709c870df3ee 54.57MB / 54.57MB 19.4s
=> => sha256:3c24ae8b66041c09dabc913b6f16fb914d5640b53b10747a343ddc 196.50MB / 196.50MB 41.5s
=> => sha256:8a4322d1621dec2def676639847e546c07a1da4ba0c035edb1036bf5fb 6.29MB / 6.29MB 22.2s
=> => sha256:0bde298e076a8f3680a810ea79dc73250029fba8340b2380c138bfac 17.70MB / 17.70MB 29.1s
=> => sha256:e169b6c7c6289494dc183375db05b4297cc0b18565b9929a3c0f8f70c81883 233B / 233B 25.1s
=> => extracting sha256:bb7d5a84853b217ac05783963f12b034243070c1c9c8d2e60ada47444f3cce04 2.4s
=> => sha256:2c7c1ad9ef8452470252c8279fe57473b8bbef909c8e68f91a5b1d6352 2.35MB / 2.35MB 28.3s
=> => extracting sha256:f02b617c6a8c415a175f44d7e2c5d3b521059f2a6112c5f022e005a44a759f2d 0.2s
=> => extracting sha256:d32e17419b7ee61bbd89c2f0d2833a99cf45e594257d15cb567e4cf7771ce34a 0.3s
=> => extracting sha256:c9d2d81226a43a97871acd5afb7e8aabfad4d6b62ae1709c870df3ee230bc3f5 2.7s
=> => extracting sha256:3c24ae8b66041c09dabc913b6f16fb914d5640b53b10747a343ddc5bb5bd6769 6.4s
=> => extracting sha256:8a4322d1621dec2def676639847e546c07a1da4ba0c035edb1036bf5fb45063b 0.3s
=> => extracting sha256:0bde298e076a8f3680a810ea79dc73250029fba8340b2380c138bfacb0101610 0.6s
=> => extracting sha256:e169b6c7c6289494dc183375db05b4297cc0b18565b9929a3c0f8f70c8188379 0.0s
=> => extracting sha256:2c7c1ad9ef8452470252c8279fe57473b8bbef909c8e68f91a5b1d6352e976ec 0.2s
=> [internal] load build context 0.2s
=> => transferring context: 355.61kB 0.1s
=> [2/9] RUN mkdir /code 0.4s
=> [3/9] WORKDIR /code 0.0s
=> [4/9] ADD requirements/base.txt requirements/dev.txt requirements/test.txt 0.0s
=> [5/9] RUN pip3 install -r requirements/base.txt 19.3s
=> [6/9] RUN pip3 install -r requirements/dev.txt 39.0s
=> [7/9] RUN pip3 install -r requirements/test.txt 10.2s
=> [8/9] ADD . /code/ 0.0s
=> [9/9] RUN useradd -m app 0.4s
=> exporting to image 1.1s
=> => exporting layers 1.1s
=> => writing image sha256:e23638ea72436dcff50476271c93f8eea0f0b832570103e3b087d34893045 0.0s
=> => naming to docker.io/library/sec-filings-app-develop_backend 0.0s

Use 'docker scan' to run Snyk tests against images to find vulnerabilities and learn how to fix them
WARNING: Image for service backend was built because it did not already exist. To rebuild this image you must use docker-compose build or docker-compose up --build.
Building nginx
[+] Building 12.0s (8/8) FINISHED
=> [internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 147B 0.0s
=> [internal] load .dockerignore 0.0s
=> => transferring context: 2B 0.0s
=> [internal] load metadata for docker.io/library/nginx:1.13.12-alpine 11.8s
=> [auth] library/nginx:pull token for registry-1.docker.io 0.0s
=> [internal] load build context 0.0s
=> => transferring context: 1.89kB 0.0s
=> CACHED [1/2] FROM docker.io/library/nginx:1.13.12-alpine@sha256:9d46fd628d54ebe1633ee 0.0s
=> [2/2] COPY dev/dev.conf /etc/nginx/nginx.conf 0.0s
=> exporting to image 0.0s
=> => exporting layers 0.0s
=> => writing image sha256:dd6dd69698a821ae1f7a00a57918ad314d987d8e4cf2d315cb3d56bb7bd6b 0.0s
=> => naming to docker.io/library/sec-filings-app-develop_nginx 0.0s

Use 'docker scan' to run Snyk tests against images to find vulnerabilities and learn how to fix them
WARNING: Image for service nginx was built because it did not already exist. To rebuild this image you must use docker-compose build or docker-compose up --build.
Building celery
[+] Building 1.0s (14/14) FINISHED
=> [internal] load build definition from Dockerfile.dev 0.0s
=> => transferring dockerfile: 36B 0.0s
=> [internal] load .dockerignore 0.0s
=> => transferring context: 34B 0.0s
=> [internal] load metadata for docker.io/library/python:3.9 0.8s
=> [1/9] FROM docker.io/library/python:3.9@sha256:5caa9a0f034e56693e096ac4562097d9167a25 0.0s
=> [internal] load build context 0.0s
=> => transferring context: 4.79kB 0.0s
=> CACHED [2/9] RUN mkdir /code 0.0s
=> CACHED [3/9] WORKDIR /code 0.0s
=> CACHED [4/9] ADD requirements/base.txt requirements/dev.txt requirements/test 0.0s
=> CACHED [5/9] RUN pip3 install -r requirements/base.txt 0.0s
=> CACHED [6/9] RUN pip3 install -r requirements/dev.txt 0.0s
=> CACHED [7/9] RUN pip3 install -r requirements/test.txt 0.0s
=> CACHED [8/9] ADD . /code/ 0.0s
=> CACHED [9/9] RUN useradd -m app 0.0s
=> exporting to image 0.0s
=> => exporting layers 0.0s
=> => writing image sha256:e23638ea72436dcff50476271c93f8eea0f0b832570103e3b087d34893045 0.0s
=> => naming to docker.io/library/sec-filings-app-develop_celery 0.0s

Use 'docker scan' to run Snyk tests against images to find vulnerabilities and learn how to fix them
WARNING: Image for service celery was built because it did not already exist. To rebuild this image you must use docker-compose build or docker-compose up --build.
Pulling flower (mher/flower:latest)...
latest: Pulling from mher/flower
5843afab3874: Pull complete
2dfaacf7024e: Pull complete
2dbba127c6aa: Pull complete
1a467efa2204: Pull complete
865369a531d5: Pull complete
549dc674308e: Pull complete
ff90bbedf8b4: Pull complete
9324c24940d2: Pull complete
bd759767eda4: Pull complete
Digest: sha256:bd388a00438d3dbe836196e7d0491527db7ccb5f07fc1b3afc33016311fc283a
Status: Downloaded newer image for mher/flower:latest
Creating mailhog ... done
Creating redis-commander ... done
Creating frontend ... done
Creating redis ... done
Creating postgres ... done
Creating flower ... done
Creating celery ... done
Creating backend ... done
Creating pgadmin ... done
Creating nginx ... done
Attaching to redis, mailhog, postgres, redis-commander, frontend, flower, pgadmin, celery, backend, nginx
backend | Traceback (most recent call last):
backend | File "/code/manage.py", line 24, in
backend | main()
backend | File "/code/manage.py", line 20, in main
backend | execute_from_command_line(sys.argv)
backend | File "/usr/local/lib/python3.9/site-packages/django/core/management/init.py", line 401, in execute_from_command_line
backend | utility.execute()
backend | File "/usr/local/lib/python3.9/site-packages/django/core/management/init.py", line 345, in execute
backend | settings.INSTALLED_APPS
backend | File "/usr/local/lib/python3.9/site-packages/django/conf/init.py", line 83, in getattr
backend | self._setup(name)
backend | File "/usr/local/lib/python3.9/site-packages/django/conf/init.py", line 70, in _setup
flower | [I 211015 21:52:49 command:152] Visit me at http://localhost:5555
flower | [I 211015 21:52:49 command:159] Broker: redis://redis:6379/1
backend | self._wrapped = Settings(settings_module)
backend | File "/usr/local/lib/python3.9/site-packages/django/conf/init.py", line 177, in init
backend | mod = importlib.import_module(self.SETTINGS_MODULE)
backend | File "/usr/local/lib/python3.9/importlib/init.py", line 127, in import_module
celery | Traceback (most recent call last):
celery | File "/code/manage.py", line 24, in
backend | return _bootstrap._gcd_import(name[level:], package, level)
backend | File "", line 1030, in _gcd_import
backend | File "", line 1007, in _find_and_load
backend | File "", line 986, in _find_and_load_unlocked
backend | File "", line 680, in _load_unlocked
backend | File "", line 850, in exec_module
celery | main()
flower | [I 211015 21:52:49 command:160] Registered tasks:
flower | ['celery.accumulate',
flower | 'celery.backend_cleanup',
flower | 'celery.chain',
flower | 'celery.chord',
flower | 'celery.chord_unlock',
flower | 'celery.chunks',
flower | 'celery.group',
flower | 'celery.map',
flower | 'celery.starmap']
flower | [I 211015 21:52:49 mixins:226] Connected to redis://redis:6379/1
celery | File "/code/manage.py", line 20, in main
celery | execute_from_command_line(sys.argv)
celery | File "/usr/local/lib/python3.9/site-packages/django/core/management/init.py", line 401, in execute_from_command_line
celery | utility.execute()
celery | File "/usr/local/lib/python3.9/site-packages/django/core/management/init.py", line 345, in execute
celery | settings.INSTALLED_APPS
celery | File "/usr/local/lib/python3.9/site-packages/django/conf/init.py", line 83, in getattr
celery | self._setup(name)
celery | File "/usr/local/lib/python3.9/site-packages/django/conf/init.py", line 70, in _setup
celery | self._wrapped = Settings(settings_module)
celery | File "/usr/local/lib/python3.9/site-packages/django/conf/init.py", line 177, in init
celery | mod = importlib.import_module(self.SETTINGS_MODULE)
celery | File "/usr/local/lib/python3.9/importlib/init.py", line 127, in import_module
celery | return _bootstrap._gcd_import(name[level:], package, level)
celery | File "", line 1030, in _gcd_import
celery | File "", line 1007, in _find_and_load
celery | File "", line 986, in _find_and_load_unlocked
celery | File "", line 680, in _load_unlocked
mailhog | 2021/10/15 21:52:47 Using in-memory storage
mailhog | 2021/10/15 21:52:47 [SMTP] Binding to address: 0.0.0.0:1025
backend | File "", line 228, in _call_with_frames_removed
backend | File "/code/backend/settings/development.py", line 1, in
celery | File "", line 850, in exec_module
backend | from .base import * # noqa
backend | File "/code/backend/settings/base.py", line 26, in
backend | SECRET_KEY = os.environ["SECRET_KEY"]
backend | File "/usr/local/lib/python3.9/os.py", line 679, in getitem
mailhog | [HTTP] Binding to address: 0.0.0.0:8025
mailhog | 2021/10/15 21:52:47 Serving under http://0.0.0.0:8025/
mailhog | Creating API v1 with WebPath:
mailhog | Creating API v2 with WebPath:
celery | File "", line 228, in _call_with_frames_removed
backend | raise KeyError(key) from None
backend | KeyError: 'SECRET_KEY'
celery | File "/code/backend/settings/development.py", line 1, in
celery | from .base import * # noqa
celery | File "/code/backend/settings/base.py", line 26, in
celery | SECRET_KEY = os.environ["SECRET_KEY"]
celery | File "/usr/local/lib/python3.9/os.py", line 679, in getitem
celery | raise KeyError(key) from None
celery | KeyError: 'SECRET_KEY'
postgres | The files belonging to this database system will be owned by user "postgres".
postgres | This user must also own the server process.
postgres |
postgres | The database cluster will be initialized with locale "en_US.utf8".
postgres | The default database encoding has accordingly been set to "UTF8".
postgres | The default text search configuration will be set to "english".
postgres |
postgres | Data page checksums are disabled.
postgres |
postgres | fixing permissions on existing directory /var/lib/postgresql/data ... ok
postgres | creating subdirectories ... ok
postgres | selecting default max_connections ... 100
postgres | selecting default shared_buffers ... 128MB
postgres | selecting default timezone ... Etc/UTC
postgres | selecting dynamic shared memory implementation ... posix
postgres | creating configuration files ... ok
postgres | running bootstrap script ... ok
redis | 1:C 15 Oct 2021 21:52:47.224 # oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo
redis | 1:C 15 Oct 2021 21:52:47.224 # Redis version=6.0.9, bits=64, commit=00000000, modified=0, pid=1, just started
redis | 1:C 15 Oct 2021 21:52:47.224 # Warning: no config file specified, using the default config. In order to specify a config file use redis-server /path/to/redis.conf
redis | 1:M 15 Oct 2021 21:52:47.225 * Running mode=standalone, port=6379.
redis | 1:M 15 Oct 2021 21:52:47.225 # WARNING: The TCP backlog setting of 511 cannot be enforced because /proc/sys/net/core/somaxconn is set to the lower value of 128.
redis | 1:M 15 Oct 2021 21:52:47.225 # Server initialized
redis | 1:M 15 Oct 2021 21:52:47.225 # WARNING overcommit_memory is set to 0! Background save may fail under low memory condition. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.
redis | 1:M 15 Oct 2021 21:52:47.225 # WARNING you have Transparent Huge Pages (THP) support enabled in your kernel. This will create latency and memory usage issues with Redis. To fix this issue run the command 'echo madvise > /sys/kernel/mm/transparent_hugepage/enabled' as root, and add it to your /etc/rc.local in order to retain the setting after a reboot. Redis must be restarted after THP is disabled (set to 'madvise' or 'never').
redis | 1:M 15 Oct 2021 21:52:47.225 * Ready to accept connections
postgres | performing post-bootstrap initialization ... ok
postgres | syncing data to disk ...
postgres | WARNING: enabling "trust" authentication for local connections
postgres | You can change this by editing pg_hba.conf or using the option -A, or
postgres | --auth-local and --auth-host, the next time you run initdb.
postgres | ok
postgres |
postgres | Success. You can now start the database server using:
postgres |
postgres | pg_ctl -D /var/lib/postgresql/data -l logfile start
postgres |
postgres | waiting for server to start....2021-10-15 21:52:48.737 UTC [45] LOG: listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
postgres | 2021-10-15 21:52:48.764 UTC [46] LOG: database system was shut down at 2021-10-15 21:52:48 UTC
postgres | 2021-10-15 21:52:48.768 UTC [45] LOG: database system is ready to accept connections
postgres | done
postgres | server started
postgres |
postgres | /usr/local/bin/docker-entrypoint.sh: ignoring /docker-entrypoint-initdb.d/*
postgres |
postgres | waiting for server to shut down...2021-10-15 21:52:48.819 UTC [45] LOG: received fast shutdown request
postgres | .2021-10-15 21:52:48.822 UTC [45] LOG: aborting any active transactions
postgres | 2021-10-15 21:52:48.827 UTC [45] LOG: background worker "logical replication launcher" (PID 52) exited with exit code 1
redis-commander | Creating custom redis-commander config '/redis-commander/config/local-production.json'.
redis-commander | node ./bin/redis-commander --redis-host redis
postgres | 2021-10-15 21:52:48.827 UTC [47] LOG: shutting down
postgres | 2021-10-15 21:52:48.849 UTC [45] LOG: database system is shut down
redis-commander | Using scan instead of keys
redis-commander | No Save: false
redis-commander | listening on 0.0.0.0:8081
redis-commander | access with browser at http://127.0.0.1:8081
redis-commander | Redis Connection redis:6379 using Redis DB #0
postgres | done
postgres | server stopped
postgres |
postgres | PostgreSQL init process complete; ready for start up.
postgres |
postgres | 2021-10-15 21:52:48.934 UTC [1] LOG: listening on IPv4 address "0.0.0.0", port 5432
postgres | 2021-10-15 21:52:48.934 UTC [1] LOG: listening on IPv6 address "::", port 5432
postgres | 2021-10-15 21:52:48.938 UTC [1] LOG: listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
postgres | 2021-10-15 21:52:48.958 UTC [54] LOG: database system was shut down at 2021-10-15 21:52:48 UTC
postgres | 2021-10-15 21:52:48.962 UTC [1] LOG: database system is ready to accept connections
celery exited with code 1
backend exited with code 1
flower | [W 211015 21:52:50 inspector:42] Inspect method active_queues failed
flower | [W 211015 21:52:50 inspector:42] Inspect method registered failed
flower | [W 211015 21:52:50 inspector:42] Inspect method reserved failed
flower | [W 211015 21:52:50 inspector:42] Inspect method scheduled failed
flower | [W 211015 21:52:50 inspector:42] Inspect method active failed
flower | [W 211015 21:52:50 inspector:42] Inspect method revoked failed
flower | [W 211015 21:52:50 inspector:42] Inspect method conf failed
flower | [W 211015 21:52:50 inspector:42] Inspect method stats failed
nginx | 2021/10/15 21:52:50 [emerg] 1#1: host not found in upstream "backend:8000" in /etc/nginx/nginx.conf:13
nginx | nginx: [emerg] host not found in upstream "backend:8000" in /etc/nginx/nginx.conf:13
nginx exited with code 1
pgadmin | NOTE: Configuring authentication for SERVER mode.
pgadmin |
pgadmin | [2021-10-15 21:53:04 +0000] [1] [INFO] Starting gunicorn 20.1.0
pgadmin | [2021-10-15 21:53:04 +0000] [1] [INFO] Listening at: http://[::]:80 (1)
pgadmin | [2021-10-15 21:53:04 +0000] [1] [INFO] Using worker: gthread
pgadmin | [2021-10-15 21:53:04 +0000] [88] [INFO] Booting worker with pid: 88
frontend | npm WARN deprecated [email protected]: CoffeeScript on NPM has moved to "coffeescript" (no hyphen)
frontend | /usr/local/bin/quasar -> /usr/local/lib/node_modules/@quasar/cli/bin/quasar
frontend | npm WARN notsup Unsupported engine for @quasar/[email protected]: wanted: {"node":">= 12.0.0","npm":">= 5.6.0","yarn":">= 1.6.0"} (current: {"node":"10.24.1","npm":"6.14.12"})
frontend | npm WARN notsup Not compatible with your version of node/npm: @quasar/[email protected]
frontend | npm WARN notsup Unsupported engine for [email protected]: wanted: {"node":">=12.0.0"} (current: {"node":"10.24.1","npm":"6.14.12"})
frontend | npm WARN notsup Not compatible with your version of node/npm: [email protected]
frontend |
frontend | + @quasar/[email protected]
frontend | added 371 packages from 293 contributors in 46.439s
mailhog | [APIv1] KEEPALIVE /api/v1/events
mailhog | [APIv1] KEEPALIVE /api/v1/events
frontend |
frontend | > [email protected] postinstall /app/node_modules/babel-runtime/node_modules/core-js
frontend | > node -e "try{require('./postinstall')}catch(e){}"
frontend |
frontend | Thank you for using core-js ( https://github.com/zloirock/core-js ) for polyfilling JavaScript standard library!
frontend |
frontend | The project needs your help! Please consider supporting of core-js on Open Collective or Patreon:
frontend | > https://opencollective.com/core-js
frontend | > https://www.patreon.com/zloirock
frontend |
frontend | Also, the author of core-js ( https://github.com/zloirock ) is looking for a good job -)
frontend |
frontend |
frontend | > [email protected] postinstall /app/node_modules/core-js
frontend | > node -e "try{require('./postinstall')}catch(e){}"
frontend |
frontend | Thank you for using core-js ( https://github.com/zloirock/core-js ) for polyfilling JavaScript standard library!
frontend |
frontend | The project needs your help! Please consider supporting of core-js on Open Collective or Patreon:
frontend | > https://opencollective.com/core-js
frontend | > https://www.patreon.com/zloirock
frontend |
frontend | Also, the author of core-js ( https://github.com/zloirock ) is looking for a good job -)
frontend |
frontend |
frontend | > [email protected] postinstall /app/node_modules/ejs
frontend | > node ./postinstall.js
frontend |
frontend | Thank you for installing EJS: built with the Jake JavaScript build tool (https://jakejs.com/)
frontend |
frontend | npm WARN optional SKIPPING OPTIONAL DEPENDENCY: [email protected] (node_modules/webpack-dev-server/node_modules/fsevents):
frontend | npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for [email protected]: wanted {"os":"darwin","arch":"any"} (current: {"os":"linux","arch":"x64"})
frontend | npm WARN optional SKIPPING OPTIONAL DEPENDENCY: [email protected] (node_modules/watchpack-chokidar2/node_modules/fsevents):
frontend | npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for [email protected]: wanted {"os":"darwin","arch":"any"} (current: {"os":"linux","arch":"x64"})
frontend | npm WARN optional SKIPPING OPTIONAL DEPENDENCY: [email protected] (node_modules/fsevents):
frontend | npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for [email protected]: wanted {"os":"darwin","arch":"any"} (current: {"os":"linux","arch":"x64"})
frontend |
frontend | added 1558 packages from 677 contributors and audited 1561 packages in 98.949s
frontend |
frontend | 89 packages are looking for funding
frontend | run npm fund for details
frontend |
frontend | found 84 vulnerabilities (1 low, 38 moderate, 45 high)
frontend | run npm audit fix to fix them, or npm audit for details
frontend |
frontend | Dev mode.......... pwa
frontend | Pkg quasar........ v1.14.5
frontend | Pkg @quasar/app... v2.1.8
frontend | Debugging......... enabled
frontend |
frontend | Browserslist: caniuse-lite is outdated. Please run:
frontend | npx browserslist@latest --update-db
frontend |
frontend | Why you should do it regularly:
frontend | https://github.com/browserslist/browserslist#browsers-data-updating
frontend | Configured browser support (at least 88.57% of global marketshare):
frontend | · Chrome for Android >= 86
frontend | · Firefox for Android >= 82
frontend | · Android >= 81
frontend | · Chrome >= 76
frontend | · Edge >= 83
frontend | · Firefox >= 73
frontend | · iOS >= 10.3
frontend | · Opera >= 68
frontend | · Safari >= 11
frontend |
frontend | App · Reading quasar.conf.js
frontend | App · Checking listening address availability (0.0.0.0:8080)...
frontend | App · Transpiling JS (Babel active)
mailhog | [APIv1] KEEPALIVE /api/v1/events
frontend | App · [GenerateSW] Will generate a service-worker file. Ignoring your custom written one.
frontend | App · Extending PWA Webpack config
frontend | App · Generating Webpack entry point
frontend | App · Booting up...
frontend | App · Compiling PWA...
frontend | Browserslist: caniuse-lite is outdated. Please run:
frontend | npx browserslist@latest --update-db
mailhog | [APIv1] KEEPALIVE /api/v1/events
frontend | App · Compiled PWA done in 36123 ms
frontend | DONE Compiled successfully in 36128ms 9:56:48 PM
frontend |
frontend |
frontend | N App dir........... /app
frontend | App URL........... http://localhost:8080/
frontend | Dev mode.......... pwa
frontend | Pkg quasar........ v1.14.5
frontend | Pkg @quasar/app... v2.1.8
frontend | Transpiled JS..... yes (Babel)
frontend |
frontend | ℹ 「wds」: Project is running at http://0.0.0.0:8080/
frontend | ℹ 「wds」: webpack output is served from /
frontend | ℹ 「wds」: 404s will fallback to /index.html
mailhog | [APIv1] KEEPALIVE /api/v1/events
mailhog | [APIv1] KEEPALIVE /api/v1/events
mailhog | [APIv1] KEEPALIVE /api/v1/events
mailhog | [APIv1] KEEPALIVE /api/v1/events
mailhog | [APIv1] KEEPALIVE /api/v1/events
mailhog | [APIv1] KEEPALIVE /api/v1/events
mailhog | [APIv1] KEEPALIVE /api/v1/events
mailhog | [APIv1] KEEPALIVE /api/v1/events
mailhog | [APIv1] KEEPALIVE /api/v1/events
mailhog | [APIv1] KEEPALIVE /api/v1/events
mailhog | [APIv1] KEEPALIVE /api/v1/events
mailhog | [APIv1] KEEPALIVE /api/v1/events
mailhog | [APIv1] KEEPALIVE /api/v1/events
mailhog | [APIv1] KEEPALIVE /api/v1/events
mailhog | [APIv1] KEEPALIVE /api/v1/events
mailhog | [APIv1] KEEPALIVE /api/v1/events
mailhog | [APIv1] KEEPALIVE /api/v1/events
mailhog | [APIv1] KEEPALIVE /api/v1/events
mailhog | [APIv1] KEEPALIVE /api/v1/events
celery | Traceback (most recent call last):
celery | File "/code/manage.py", line 24, in
celery | main()
celery | File "/code/manage.py", line 20, in main
celery | execute_from_command_line(sys.argv)
celery | File "/usr/local/lib/python3.9/site-packages/django/core/management/init.py", line 401, in execute_from_command_line
celery | utility.execute()
celery | File "/usr/local/lib/python3.9/site-packages/django/core/management/init.py", line 345, in execute
celery | settings.INSTALLED_APPS
celery | File "/usr/local/lib/python3.9/site-packages/django/conf/init.py", line 83, in getattr
celery | self._setup(name)
celery | File "/usr/local/lib/python3.9/site-packages/django/conf/init.py", line 70, in _setup
celery | self._wrapped = Settings(settings_module)
celery | File "/usr/local/lib/python3.9/site-packages/django/conf/init.py", line 177, in init
celery | mod = importlib.import_module(self.SETTINGS_MODULE)
celery | File "/usr/local/lib/python3.9/importlib/init.py", line 127, in import_module
celery | return _bootstrap._gcd_import(name[level:], package, level)
celery | File "", line 1030, in _gcd_import
celery | File "", line 1007, in _find_and_load
celery | File "", line 986, in _find_and_load_unlocked
celery | File "", line 680, in _load_unlocked
celery | File "", line 850, in exec_module
celery | File "", line 228, in _call_with_frames_removed
celery | File "/code/backend/settings/development.py", line 1, in
celery | from .base import * # noqa
celery | File "/code/backend/settings/base.py", line 26, in
celery | SECRET_KEY = os.environ["SECRET_KEY"]
celery | File "/usr/local/lib/python3.9/os.py", line 679, in getitem
celery | raise KeyError(key) from None
celery | KeyError: 'SECRET_KEY'
celery | Traceback (most recent call last):
celery | File "/code/manage.py", line 24, in
celery | main()
celery | File "/code/manage.py", line 20, in main
celery | execute_from_command_line(sys.argv)
celery | File "/usr/local/lib/python3.9/site-packages/django/core/management/init.py", line 401, in execute_from_command_line
celery | utility.execute()
celery | File "/usr/local/lib/python3.9/site-packages/django/core/management/init.py", line 345, in execute
celery | settings.INSTALLED_APPS
celery | File "/usr/local/lib/python3.9/site-packages/django/conf/init.py", line 83, in getattr
celery | self._setup(name)
celery | File "/usr/local/lib/python3.9/site-packages/django/conf/init.py", line 70, in _setup
celery | self._wrapped = Settings(settings_module)
celery | File "/usr/local/lib/python3.9/site-packages/django/conf/init.py", line 177, in init
celery | mod = importlib.import_module(self.SETTINGS_MODULE)
celery | File "/usr/local/lib/python3.9/importlib/init.py", line 127, in import_module
celery | return _bootstrap._gcd_import(name[level:], package, level)
celery | File "", line 1030, in _gcd_import
celery | File "", line 1007, in _find_and_load
celery | File "", line 986, in _find_and_load_unlocked
celery | File "", line 680, in _load_unlocked
celery | File "", line 850, in exec_module
celery | File "", line 228, in _call_with_frames_removed
celery | File "/code/backend/settings/development.py", line 1, in
celery | from .base import * # noqa
celery | File "/code/backend/settings/base.py", line 26, in
celery | SECRET_KEY = os.environ["SECRET_KEY"]
celery | File "/usr/local/lib/python3.9/os.py", line 679, in getitem
celery | raise KeyError(key) from None
celery | KeyError: 'SECRET_KEY'
celery exited with code 1
backend | Traceback (most recent call last):
backend | File "/code/manage.py", line 24, in
backend | main()
backend | File "/code/manage.py", line 20, in main
backend | execute_from_command_line(sys.argv)
backend | File "/usr/local/lib/python3.9/site-packages/django/core/management/init.py", line 401, in execute_from_command_line
backend | utility.execute()
backend | File "/usr/local/lib/python3.9/site-packages/django/core/management/init.py", line 345, in execute
backend | settings.INSTALLED_APPS
backend | File "/usr/local/lib/python3.9/site-packages/django/conf/init.py", line 83, in getattr
backend | self._setup(name)
backend | File "/usr/local/lib/python3.9/site-packages/django/conf/init.py", line 70, in _setup
backend | self._wrapped = Settings(settings_module)
backend | File "/usr/local/lib/python3.9/site-packages/django/conf/init.py", line 177, in init
backend | mod = importlib.import_module(self.SETTINGS_MODULE)
backend | File "/usr/local/lib/python3.9/importlib/init.py", line 127, in import_module
backend | return _bootstrap._gcd_import(name[level:], package, level)
backend | File "", line 1030, in _gcd_import
backend | File "", line 1007, in _find_and_load
backend | File "", line 986, in _find_and_load_unlocked
backend | File "", line 680, in _load_unlocked
backend | File "", line 850, in exec_module
backend | File "", line 228, in _call_with_frames_removed
backend | File "/code/backend/settings/development.py", line 1, in
backend | from .base import * # noqa
backend | File "/code/backend/settings/base.py", line 26, in
backend | SECRET_KEY = os.environ["SECRET_KEY"]
backend | File "/usr/local/lib/python3.9/os.py", line 679, in getitem
backend | raise KeyError(key) from None
backend | KeyError: 'SECRET_KEY'
backend | Traceback (most recent call last):
backend | File "/code/manage.py", line 24, in
backend | main()
backend | File "/code/manage.py", line 20, in main
backend | execute_from_command_line(sys.argv)
backend | File "/usr/local/lib/python3.9/site-packages/django/core/management/init.py", line 401, in execute_from_command_line
backend | utility.execute()
backend | File "/usr/local/lib/python3.9/site-packages/django/core/management/init.py", line 345, in execute
backend | settings.INSTALLED_APPS
backend | File "/usr/local/lib/python3.9/site-packages/django/conf/init.py", line 83, in getattr
backend | self._setup(name)
backend | File "/usr/local/lib/python3.9/site-packages/django/conf/init.py", line 70, in _setup
backend | self._wrapped = Settings(settings_module)
backend | File "/usr/local/lib/python3.9/site-packages/django/conf/init.py", line 177, in init
backend | mod = importlib.import_module(self.SETTINGS_MODULE)
backend | File "/usr/local/lib/python3.9/importlib/init.py", line 127, in import_module
backend | return _bootstrap._gcd_import(name[level:], package, level)
backend | File "", line 1030, in _gcd_import
backend | File "", line 1007, in _find_and_load
backend | File "", line 986, in _find_and_load_unlocked
backend | File "", line 680, in _load_unlocked
backend | File "", line 850, in exec_module
backend | File "", line 228, in _call_with_frames_removed
backend | File "/code/backend/settings/development.py", line 1, in
backend | from .base import * # noqa
backend | File "/code/backend/settings/base.py", line 26, in
backend | SECRET_KEY = os.environ["SECRET_KEY"]
backend | File "/usr/local/lib/python3.9/os.py", line 679, in getitem
backend | raise KeyError(key) from None
backend | KeyError: 'SECRET_KEY'
backend exited with code 1
mailhog | [APIv1] KEEPALIVE /api/v1/events
nginx | 2021/10/15 22:16:43 [emerg] 1#1: host not found in upstream "backend:8000" in /etc/nginx/nginx.conf:13
nginx | nginx: [emerg] host not found in upstream "backend:8000" in /etc/nginx/nginx.conf:13
nginx exited with code 1
mailhog | [APIv1] KEEPALIVE /api/v1/events


briancaffey commented on August 26, 2024

I updated the code and README with instructions on how to generate an admin user and how to create data @kuatroka


briancaffey commented on August 26, 2024

Hey @kuatroka thanks a lot for opening this issue. I recently revisited this project and fixed a bunch of errors, but the changes are only on the develop branch of the GitLab repo: https://gitlab.com/briancaffey/sec-filings-app. I'll be testing those changes and then pushing to GitHub. Sorry for the confusion, and please get in touch if you have any other questions about the project. I'll probably be doing some more work on it this weekend.

For your specific questions:

  1. I'm now using the latest flower image, which supports celery v5
  2. If you are using Windows, then I recommend that you use WSL 2 with Docker Desktop when working with this project (this is what I do)
  3. https://stackoverflow.com/a/33018394/6084948


kuatroka commented on August 26, 2024

Thanks Brian. I'll try it with the GitLab code. I do have WSL 2 with Docker, so I'm preemptively worried about the outcome. Thanks for the note on the Docker logs.

  1. "...and please get in touch if you have any other questions about the project..." I do have some questions and also can offer my help if needed. I'm not a great coder (only basic Python), but I do can test as I'm an insufferable inconsistencies finder.

  2. If you need ideas for new or improved functionality, please let me know. The main reason I found this project is that I was looking for something similar to start my own automated investment-insights helper. I have tons of ideas, and some of them are very much related to what can be done by analysing 13F filings, so if you wish, I'll be happy to share some. Below are also a couple of questions, if I may. I've only seen your application on Heroku, so I'm not sure about all the functionality; please bear with me if my questions are self-evident.

  3. Does the current version of the app have the option to load all or certain filings for all years and companies?

  4. If yes to the above, do you somehow identify whether the .txt filing is in XML format or not, so that you can parse it, or park it in the case of an older, non-XML filing?

  5. Is there any functionality that turns the ingestion of new filings on/off, or maybe a feature that downloads new filings in real time as soon as they appear on the SEC site?

  6. Did you figure out how the amendments work, so the data is corrected for the past filings?

  7. Did you figure out all the different dates in the filings (CONFORMED PERIOD OF REPORT, EFFECTIVENESS DATE, DATE AS OF CHANGE, etc.)?

I'm going to stop here as I might have tons more questions :)


briancaffey commented on August 26, 2024

@kuatroka Nice! Thanks for sharing the logs; it looks like you are close. Did you copy the contents of the .env.template file into .env? It looks like the backend and celery containers are failing because there is no SECRET_KEY environment variable. Setting that should fix the issue you are experiencing, hopefully. Don't worry about posting a lot of logs! If you want to, you can wrap long logs in a <details> tag with a <summary>, like this:

<details>
<summary>My logs</summary>
Logs...

More logs..
</details>
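
On the SECRET_KEY point above: the line that fails in the traceback is SECRET_KEY = os.environ["SECRET_KEY"], which raises a bare KeyError when the variable isn't set. Here is a minimal sketch of a friendlier fail-fast pattern; this is only an illustration, not the project's code, and the require_env helper name is made up:

```python
import os


def require_env(name: str) -> str:
    """Return a required environment variable, or fail with a hint about .env."""
    try:
        return os.environ[name]
    except KeyError:
        raise RuntimeError(
            f"Missing required environment variable {name!r}. "
            "Did you copy .env.template to .env before running docker-compose up?"
        ) from None


# hypothetical usage in a Django settings module
SECRET_KEY = require_env("SECRET_KEY")
```

That way the docker-compose logs point straight at the real fix instead of a bare KeyError.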

If you need ideas for new or improved functionalities, please let me know as the main reason how I found this project is because I was looking for something similar to start my own investment insights automated helper.

Awesome! Yes, please do share. I'm going to open up GitHub Discussions for this repo. That would be a better place for discussing which 13F analysis features would be worth considering for this project. I'm all ears! I have less experience with SEC data and more with programming.

Does the current version of the app have the option to load all or certain filings for all years and companies?

Currently I load data one master.idx at a time using the Django admin. This weekend I'll work on documenting this a little better; it isn't very clear at this point. Basically what I do is add a filing list in the Django admin with a quarter (1, 2, 3 or 4), a date (like 2020-01-01) and a year. Then I have an action for processing the filing list, which goes through each 13F filing in the list, downloads it, and then processes it (using celery). This is what parses the XML and creates the holdings objects that mostly drive the API data.
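
As a very rough sketch of that fan-out pattern (hedged heavily: only the broker/result URLs and the general task names come from the worker logs above; the filing_urls argument, the User-Agent string and the task bodies are my own illustrative assumptions, and the real tasks are wired into Django models rather than plain URLs):

```python
# Illustrative sketch of the filing-list fan-out with Celery (not the project's code).
from celery import Celery
import requests

app = Celery("filing", broker="redis://redis:6379/1", backend="redis://redis:6379/2")


@app.task
def process_filing(filing_url: str) -> str:
    """Download a single 13F filing; XML parsing and holdings creation are elided."""
    response = requests.get(
        filing_url,
        headers={"User-Agent": "example-app admin@example.com"},  # assumed User-Agent
    )
    response.raise_for_status()
    return filing_url


@app.task
def process_filing_list(filing_urls: list[str]) -> int:
    """Queue one process_filing task per 13F filing in the list."""
    for url in filing_urls:
        process_filing.delay(url)
    return len(filing_urls)
```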

If yes to the above, do you somehow identify if the .txt filing is of an XML format or not, so you parse it or park it in case of an older, non-xml filing?

The files that I parse are all of the same format, I think. It seems to be XML wrapped in pseudo XML or something, so I parse out the XML the same way for each 13F filing.
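
For reference, the "pseudo XML" wrapper is the SEC's SGML-style submission format: the .txt file contains <DOCUMENT> sections with embedded <XML>...</XML> blocks. A minimal sketch of pulling those blocks out (my own illustration, not the project's parser), which also lets older filings without XML blocks be parked rather than crash the task:

```python
import re
import xml.etree.ElementTree as ET


def extract_xml_documents(raw_filing_text: str) -> list[ET.Element]:
    """Parse every <XML>...</XML> block embedded in a 13F .txt filing.

    Older filings with no XML blocks simply return an empty list,
    so the caller can park them instead of failing.
    """
    blocks = re.findall(r"<XML>(.*?)</XML>", raw_filing_text, re.DOTALL | re.IGNORECASE)
    roots = []
    for block in blocks:
        try:
            roots.append(ET.fromstring(block.strip()))
        except ET.ParseError:
            continue  # skip blocks that are not well-formed XML
    return roots
```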

Is there any type of a functionality that turns on/off the ingestion of new filings or maybe there is a feature that downloads new filing in real-time as soon as they appear on the SEC site?

This could be implemented, but I don't have this now. There are also other APIs that can help with this, or celery tasks can be scheduled to regularly check for new filings / updates.
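
For example, a periodic check could be wired up with Celery beat. This is only a sketch under assumptions: filing.tasks.check_for_new_filings does not exist in the project today and would need to be written, and it assumes the Celery app reads Django settings with the CELERY_ namespace:

```python
# Hypothetical periodic schedule; the referenced task would still need to be implemented.
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    "check-for-new-13f-filings": {
        "task": "filing.tasks.check_for_new_filings",  # hypothetical task name
        "schedule": crontab(minute=0, hour="*/6"),     # every six hours
    },
}
```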

Did you figure out how the amendments work, so the data is corrected for the past filings?

Not really. I have thought about this a little bit, and it would be an important improvement going forward. Right now I just process all of the files, and I was thinking that I could update the queries to mark which filings come from amendment files, or something like that.

Did you figure out all the different dates in the filings(CONFORMED PERIOD OF REPORT, EFFECTIVENESS DATE, DATE AS OF CHANGE, etc..)

I haven't looked into these different dates yet. I'm using dates that are attached to the Quarter/Year of the filing list files. If these dates are important then it would be good to update the models to use those as well.


kuatroka commented on August 26, 2024

Thanks for the log tip. I amended the previous post.

I missed the .env.template part. I copied it and was able to install the app, so I'm getting closer! Still have issues though.
I can now see the site and all the containers are running, but there is no data, nor do I seem to be able to log in to the admin page, where I guess I need to initiate the data load from. Nor can I access the "API Documentation" part.

See a couple of GIFs below: sec_app_test and sec_app_test2.


kuatroka commented on August 26, 2024

Awesome! Yes, please do share. I'm going to open up GitHub Discussion for this repo. That would be a better place for discussing what 13F analysis features would be worth considering for this project. I'm all ears! I have less experience with SEC data and more with programming.

Will do. I'm not that familiar with the SEC; I just have some ideas about actual financial features that might help automate an investment decision. Basically I want to automate what I currently do manually, and the SEC has a lot of potentially interesting data that needs to be brought into the right format > analysed > connected with other data sources > given an AI/NLP pass > turned into investment leads and insights > presented in an awesome UI..... > many more features...

Anyway, too many ideas to explain. I'll go one by one because a lot of data engineering is needed. So if you like my ideas we can tackle them one by one.


kuatroka commented on August 26, 2024

A log from the last docker-compose up, when all the containers got to run. Maybe it'll help identify why I'm getting the 404 on the Admin page and can't log in?

Log $ docker-compose up Starting mailhog ... done Starting redis-commander ... done Recreating frontend ... done Starting redis ... done Starting postgres ... done Starting flower ... done Recreating celery ... done Starting pgadmin ... done Recreating backend ... done Recreating nginx ... done Attaching to redis, redis-commander, mailhog, postgres, flower, pgadmin, frontend, celery, backend, nginx flower | [I 211015 23:21:19 command:152] Visit me at http://localhost:5555 flower | [I 211015 23:21:19 command:159] Broker: redis://redis:6379/1 flower | [I 211015 23:21:19 command:160] Registered tasks: flower | ['celery.accumulate', flower | 'celery.backend_cleanup', flower | 'celery.chain', flower | 'celery.chord', flower | 'celery.chord_unlock', flower | 'celery.chunks', flower | 'celery.group', flower | 'celery.map', flower | 'celery.starmap'] flower | [I 211015 23:21:19 mixins:226] Connected to redis://redis:6379/1 flower | [W 211015 23:21:20 inspector:42] Inspect method stats failed flower | [W 211015 23:21:20 inspector:42] Inspect method registered failed flower | [W 211015 23:21:20 inspector:42] Inspect method scheduled failed flower | [W 211015 23:21:20 inspector:42] Inspect method active_queues failed flower | [W 211015 23:21:20 inspector:42] Inspect method conf failed flower | [W 211015 23:21:20 inspector:42] Inspect method active failed flower | [W 211015 23:21:20 inspector:42] Inspect method reserved failed flower | [W 211015 23:21:20 inspector:42] Inspect method revoked failed mailhog | 2021/10/15 23:21:18 Using in-memory storage mailhog | 2021/10/15 23:21:18 [SMTP] Binding to address: 0.0.0.0:1025 mailhog | 2021/10/15 23:21:18 Serving under http://0.0.0.0:8025/ mailhog | [HTTP] Binding to address: 0.0.0.0:8025 mailhog | Creating API v1 with WebPath: mailhog | Creating API v2 with WebPath: postgres | 2021-10-15 23:21:18.333 UTC [1] LOG: listening on IPv4 address "0.0.0.0", port 5432 postgres | 2021-10-15 23:21:18.333 UTC [1] LOG: listening on IPv6 address "::", port 5432 postgres | 2021-10-15 23:21:18.338 UTC [1] LOG: listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432" postgres | 2021-10-15 23:21:18.358 UTC [24] LOG: database system was interrupted; last known up at 2021-10-15 21:57:49 UTC postgres | 2021-10-15 23:21:18.594 UTC [24] LOG: database system was not properly shut down; automatic recovery in progress postgres | 2021-10-15 23:21:18.596 UTC [24] LOG: redo starts at 0/1652CF8 postgres | 2021-10-15 23:21:18.596 UTC [24] LOG: invalid record length at 0/1652DD8: wanted 24, got 0 postgres | 2021-10-15 23:21:18.596 UTC [24] LOG: redo done at 0/1652DA0 postgres | 2021-10-15 23:21:18.626 UTC [1] LOG: database system is ready to accept connections redis | 1:C 15 Oct 2021 23:21:17.845 # oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo redis | 1:C 15 Oct 2021 23:21:17.845 # Redis version=6.0.9, bits=64, commit=00000000, modified=0, pid=1, just started redis | 1:C 15 Oct 2021 23:21:17.845 # Warning: no config file specified, using the default config. In order to specify a config file use redis-server /path/to/redis.conf redis | 1:M 15 Oct 2021 23:21:17.847 * Running mode=standalone, port=6379. redis | 1:M 15 Oct 2021 23:21:17.847 # WARNING: The TCP backlog setting of 511 cannot be enforced because /proc/sys/net/core/somaxconn is set to the lower value of 128. redis | 1:M 15 Oct 2021 23:21:17.847 # Server initialized redis | 1:M 15 Oct 2021 23:21:17.847 # WARNING overcommit_memory is set to 0! Background save may fail under low memory condition. 
To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect. redis | 1:M 15 Oct 2021 23:21:17.847 # WARNING you have Transparent Huge Pages (THP) support enabled in your kernel. This will create latency and memory usage issues with Redis. To fix this issue run the command 'echo madvise > /sys/kernel/mm/transparent_hugepage/enabled' as root, and add it to your /etc/rc.local in order to retain the setting after a reboot. Redis must be restarted after THP is disabled (set to 'madvise' or 'never'). redis | 1:M 15 Oct 2021 23:21:17.847 * Loading RDB produced by version 6.0.9 redis | 1:M 15 Oct 2021 23:21:17.847 * RDB age 1709 seconds redis | 1:M 15 Oct 2021 23:21:17.847 * RDB memory usage when created 1.25 Mb redis | 1:M 15 Oct 2021 23:21:17.847 * DB loaded from disk: 0.000 seconds redis | 1:M 15 Oct 2021 23:21:17.847 * Ready to accept connections redis-commander | node ./bin/redis-commander --redis-host redis redis-commander | Using scan instead of keys redis-commander | No Save: false redis-commander | listening on 0.0.0.0:8081 redis-commander | access with browser at http://127.0.0.1:8081 redis-commander | Redis Connection redis:6379 using Redis DB #0 celery | System check identified some issues: celery | celery | WARNINGS: celery | social_django.Partial.data: (fields.W904) django.contrib.postgres.fields.JSONField is deprecated. Support for it (except in historical migrations) will be removed in Django 4.0. celery | HINT: Use django.db.models.JSONField instead. celery | social_django.UserSocialAuth.extra_data: (fields.W904) django.contrib.postgres.fields.JSONField is deprecated. Support for it (except in historical migrations) will be removed in Django 4.0. celery | HINT: Use django.db.models.JSONField instead. backend | Watching for file changes with StatReloader backend | Performing system checks... backend | backend | System check identified some issues: backend | backend | WARNINGS: backend | social_django.Partial.data: (fields.W904) django.contrib.postgres.fields.JSONField is deprecated. Support for it (except in historical migrations) will be removed in Django 4.0. backend | HINT: Use django.db.models.JSONField instead. backend | social_django.UserSocialAuth.extra_data: (fields.W904) django.contrib.postgres.fields.JSONField is deprecated. Support for it (except in historical migrations) will be removed in Django 4.0. backend | HINT: Use django.db.models.JSONField instead. backend | backend | System check identified 2 issues (0 silenced). celery | System check identified some issues: celery | celery | WARNINGS: celery | social_django.Partial.data: (fields.W904) django.contrib.postgres.fields.JSONField is deprecated. Support for it (except in historical migrations) will be removed in Django 4.0. celery | HINT: Use django.db.models.JSONField instead. celery | social_django.UserSocialAuth.extra_data: (fields.W904) django.contrib.postgres.fields.JSONField is deprecated. Support for it (except in historical migrations) will be removed in Django 4.0. celery | HINT: Use django.db.models.JSONField instead. celery | Watching for file changes with StatReloader backend | backend | You have 38 unapplied migration(s). Your project may not work properly until you apply the migrations for app(s): accounts, admin, auth, authtoken, contenttypes, core, filing, sessions, social_django. backend | Run 'python manage.py migrate' to apply them. 
backend | October 15, 2021 - 23:21:24 backend | Django version 3.1, using settings 'backend.settings.development' backend | Starting development server at http://0.0.0.0:8000/ backend | Quit the server with CONTROL-C. pgadmin | [2021-10-15 23:21:24 +0000] [1] [INFO] Starting gunicorn 20.1.0 pgadmin | [2021-10-15 23:21:24 +0000] [1] [INFO] Listening at: http://[::]:80 (1) pgadmin | [2021-10-15 23:21:24 +0000] [1] [INFO] Using worker: gthread pgadmin | [2021-10-15 23:21:24 +0000] [86] [INFO] Booting worker with pid: 86 celery | celery | -------------- celery@6a8682e96f2e v5.0.2 (singularity) celery | --- ***** ----- celery | -- ******* ---- Linux-4.19.128-microsoft-standard-x86_64-with-glibc2.31 2021-10-15 23:21:25 celery | - *** --- * --- celery | - ** ---------- [config] celery | - ** ---------- .> app: backend:0x7fd4072fda30 celery | - ** ---------- .> transport: redis://redis:6379/1 celery | - ** ---------- .> results: redis://redis:6379/2 celery | - *** --- * --- .> concurrency: 1 (prefork) celery | -- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker) celery | --- ***** ----- celery | -------------- [queues] celery | .> default exchange=default(direct) key=default celery | celery | celery | [tasks] celery | . filing.tasks.process_fails_to_deliver_data_file celery | . filing.tasks.process_filing celery | . filing.tasks.process_filing_list celery | celery | [2021-10-15 23:21:25,844: INFO/MainProcess] Connected to redis://redis:6379/1 celery | [2021-10-15 23:21:25,852: INFO/MainProcess] mingle: searching for neighbors celery | [2021-10-15 23:21:26,869: INFO/MainProcess] mingle: all alone celery | [2021-10-15 23:21:26,890: WARNING/MainProcess] /usr/local/lib/python3.9/site-packages/celery/fixups/django.py:203: UserWarning: Using settings.DEBUG leads to a memory celery | leak, never use this setting in production environments! celery | warnings.warn('''Using settings.DEBUG leads to a memory celery | celery | [2021-10-15 23:21:26,891: INFO/MainProcess] celery@6a8682e96f2e ready. celery | [2021-10-15 23:21:29,651: INFO/MainProcess] Events of group {task} enabled by remote. 
frontend | npm WARN deprecated [email protected]: CoffeeScript on NPM has moved to "coffeescript" (no hyphen) frontend | /usr/local/bin/quasar -> /usr/local/lib/node_modules/@quasar/cli/bin/quasar frontend | npm WARN notsup Unsupported engine for @quasar/[email protected]: wanted: {"node":">= 12.0.0","npm":">= 5.6.0","yarn":">= 1.6.0"} (current: {"node":"10.24.1","npm":"6.14.12"}) frontend | npm WARN notsup Not compatible with your version of node/npm: @quasar/[email protected] frontend | npm WARN notsup Unsupported engine for [email protected]: wanted: {"node":">=12.0.0"} (current: {"node":"10.24.1","npm":"6.14.12"}) frontend | npm WARN notsup Not compatible with your version of node/npm: [email protected] frontend | frontend | + @quasar/[email protected] frontend | added 371 packages from 293 contributors in 38.467s frontend | npm WARN optional SKIPPING OPTIONAL DEPENDENCY: [email protected] (node_modules/fsevents): frontend | npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for [email protected]: wanted {"os":"darwin","arch":"any"} (current: {"os":"linux","arch":"x64"}) frontend | npm WARN optional SKIPPING OPTIONAL DEPENDENCY: [email protected] (node_modules/webpack-dev-server/node_modules/fsevents): frontend | npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for [email protected]: wanted {"os":"darwin","arch":"any"} (current: {"os":"linux","arch":"x64"}) frontend | npm WARN optional SKIPPING OPTIONAL DEPENDENCY: [email protected] (node_modules/watchpack-chokidar2/node_modules/fsevents): frontend | npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for [email protected]: wanted {"os":"darwin","arch":"any"} (current: {"os":"linux","arch":"x64"}) frontend | frontend | audited 1561 packages in 12.303s frontend | frontend | 89 packages are looking for funding frontend | run `npm fund` for details frontend | frontend | found 84 vulnerabilities (1 low, 38 moderate, 45 high) frontend | run `npm audit fix` to fix them, or `npm audit` for details frontend | frontend | Dev mode.......... pwa frontend | Pkg quasar........ v1.14.5 frontend | Pkg @quasar/app... v2.1.8 frontend | Debugging......... enabled frontend | frontend | Browserslist: caniuse-lite is outdated. Please run: frontend | npx browserslist@latest --update-db frontend | frontend | Why you should do it regularly: frontend | https://github.com/browserslist/browserslist#browsers-data-updating frontend | Configured browser support (at least 88.57% of global marketshare): frontend | · Chrome for Android >= 86 frontend | · Firefox for Android >= 82 frontend | · Android >= 81 frontend | · Chrome >= 76 frontend | · Edge >= 83 frontend | · Firefox >= 73 frontend | · iOS >= 10.3 frontend | · Opera >= 68 frontend | · Safari >= 11 frontend | mailhog | [APIv1] KEEPALIVE /api/v1/events frontend | App · Reading quasar.conf.js frontend | App · Checking listening address availability (0.0.0.0:8080)... frontend | App · Transpiling JS (Babel active) frontend | App · [GenerateSW] Will generate a service-worker file. Ignoring your custom written one. frontend | App · Extending PWA Webpack config frontend | App · Generating Webpack entry point frontend | App · Booting up... frontend | App · Compiling PWA... frontend | Browserslist: caniuse-lite is outdated. 
Please run: frontend | npx browserslist@latest --update-db mailhog | [APIv1] KEEPALIVE /api/v1/events frontend | App · Compiled PWA done in 34557 ms frontend | DONE Compiled successfully in 34562ms11:23:36 PM frontend | frontend | frontend | N App dir........... /app frontend | App URL........... http://localhost:8080/ frontend | Dev mode.......... pwa frontend | Pkg quasar........ v1.14.5 frontend | Pkg @quasar/app... v2.1.8 frontend | Transpiled JS..... yes (Babel) frontend | frontend | ℹ 「wds」: Project is running at http://0.0.0.0:8080/ frontend | ℹ 「wds」: webpack output is served from / frontend | ℹ 「wds」: 404s will fallback to /index.html

It's a shame the log doesn't copy over in the same clean format I see in my IDE.


kuatroka commented on August 26, 2024

That's great. I would also suggest adding an explicit step to the guide: "copy the contents of .env.template into the newly created .env file before running docker-compose up".

I've completed the step where I create the backend superuser, and it goes well, but in the UI, when I go to the "Admin" tab, I'm still getting a 404 error, so I can't use the new credentials to load the data.



briancaffey commented on August 26, 2024

@kuatroka OK, you are very close now. You are trying to access the /admin route, but you are making the request from the frontend app. The frontend app can be accessed on port 8080, but what you want to do is access the /admin route from http://localhost/admin. Accessing localhost (port 80) will send requests to the nginx container. nginx routes requests either to the backend container (if the request path starts with /api or /admin) or to the frontend dev server (all other requests are sent to the frontend). So you can access the Django admin by going to either http://localhost/admin/ or http://localhost:8000/admin/. http://localhost:8080/admin/ will not work.


kuatroka commented on August 26, 2024

Thanks Brian. The problem got sorted when, as per your suggestion in Discussions, I remapped the nginx container's port from 80:80 to 8089:80.

