tiangolo / full-stack-fastapi-postgresql

Full stack, modern web application template. Using FastAPI, React, SQLModel, PostgreSQL, Docker, GitHub Actions, automatic HTTPS and more.

License: MIT License

Shell 1.03% Python 37.47% Mako 0.27% HTML 8.40% Dockerfile 0.64% JavaScript 0.55% TypeScript 51.62% Jinja 0.02%
python json json-schema docker postgresql frontend backend fastapi traefik letsencrypt swagger jwt openapi chakra-ui react tanstack-query tanstack-router typescript sqlmodel

full-stack-fastapi-postgresql's Introduction

Full Stack FastAPI Template


Technology Stack and Features

  • ⚡ FastAPI for the Python backend API.
    • 🧰 SQLModel for the Python SQL database interactions (ORM).
    • 🔍 Pydantic, used by FastAPI, for the data validation and settings management.
    • 💾 PostgreSQL as the SQL database.
  • 🚀 React for the frontend.
    • 💃 Using TypeScript, hooks, Vite, and other parts of a modern frontend stack.
    • 🎨 Chakra UI for the frontend components.
    • 🤖 An automatically generated frontend client.
    • 🦇 Dark mode support.
  • 🐋 Docker Compose for development and production.
  • 🔒 Secure password hashing by default.
  • 🔑 JWT token authentication.
  • 📫 Email based password recovery.
  • ✅ Tests with Pytest.
  • 📞 Traefik as a reverse proxy / load balancer.
  • 🚢 Deployment instructions using Docker Compose, including how to set up a frontend Traefik proxy to handle automatic HTTPS certificates.
  • 🏭 CI (continuous integration) and CD (continuous deployment) based on GitHub Actions.

Dashboard Login

Dashboard - Admin

Dashboard - Create User

Dashboard - Items

Dashboard - User Settings

Dashboard - Dark Mode

Interactive API Documentation

How To Use It

You can just fork or clone this repository and use it as is.

✨ It just works. ✨

How to Use a Private Repository

If you want to have a private repository, GitHub won't allow you to simply fork it, as it doesn't allow changing the visibility of forks.

But you can do the following:

  • Create a new GitHub repo, for example my-full-stack.
  • Clone this repository manually, setting the directory name to the project name you want to use, for example my-full-stack:
git clone git@github.com:tiangolo/full-stack-fastapi-template.git my-full-stack
  • Enter into the new directory:
cd my-full-stack
  • Set the new origin to your new repository; copy the URL from the GitHub interface, for example:
git remote set-url origin git@github.com:octocat/my-full-stack.git
  • Add this repo as another "remote" to allow you to get updates later:
git remote add upstream git@github.com:tiangolo/full-stack-fastapi-template.git
  • Push the code to your new repository:
git push -u origin master

Update From the Original Template

After cloning the repository and making changes, you might want to pull the latest changes from this original template.

  • Make sure you added the original repository as a remote; you can check it with:
git remote -v

origin    git@github.com:octocat/my-full-stack.git (fetch)
origin    git@github.com:octocat/my-full-stack.git (push)
upstream    git@github.com:tiangolo/full-stack-fastapi-template.git (fetch)
upstream    git@github.com:tiangolo/full-stack-fastapi-template.git (push)
  • Pull the latest changes without merging:
git pull --no-commit upstream master

This will download the latest changes from this template without committing them, so you can check that everything is right before committing.

  • If there are conflicts, solve them in your editor.

  • Once you are done, commit the changes:

git merge --continue

Configure

You can then update the configs in the .env files to customize your setup.

Before deploying it, make sure you change at least the values for:

  • SECRET_KEY
  • FIRST_SUPERUSER_PASSWORD
  • POSTGRES_PASSWORD

You can (and should) pass these as environment variables from secrets.

Read the deployment.md docs for more details.

Generate Secret Keys

Some environment variables in the .env file have a default value of changethis.

You have to replace them with a secret key. To generate secret keys, you can run the following command:

python -c "import secrets; print(secrets.token_urlsafe(32))"

Copy the output and use it as the password / secret key. Run the command again to generate another secure key.
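For example, a minimal sketch of passing these as environment variables when starting the stack (assuming your Docker Compose setup reads them from the environment, as the template's does):

export SECRET_KEY=$(python -c "import secrets; print(secrets.token_urlsafe(32))")
export FIRST_SUPERUSER_PASSWORD=$(python -c "import secrets; print(secrets.token_urlsafe(32))")
export POSTGRES_PASSWORD=$(python -c "import secrets; print(secrets.token_urlsafe(32))")
docker compose up -d

Note that the values are lost when the shell exits, so store them in a secrets manager if you will need them again (the superuser password in particular).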

How To Use It - Alternative With Copier

This repository also supports generating a new project using Copier.

It will copy all the files, ask you configuration questions, and update the .env files with your answers.

Install Copier

You can install Copier with:

pip install copier

Or better, if you have pipx, you can install it with:

pipx install copier

Note: if you have pipx, installing Copier is optional; you could run it directly.

Generate a Project With Copier

Decide a name for your new project's directory; you will use it below. For example, my-awesome-project.

Go to the directory that will be the parent of your project, and run the command with your project's name:

copier copy https://github.com/tiangolo/full-stack-fastapi-template my-awesome-project --trust

If you have pipx and you didn't install copier, you can run it directly:

pipx run copier copy https://github.com/tiangolo/full-stack-fastapi-template my-awesome-project --trust

Note the --trust option is necessary to be able to execute a post-creation script that updates your .env files.

Input Variables

Copier will ask you for some data; you might want to have it at hand before generating the project.

But don't worry, you can just update any of it in the .env files afterwards.

The input variables, with their default values (some auto-generated), are:

  • project_name: (default: "FastAPI Project") The name of the project, shown to API users (in .env).
  • stack_name: (default: "fastapi-project") The name of the stack used for Docker Compose labels and project name (no spaces, no periods) (in .env).
  • secret_key: (default: "changethis") The secret key for the project, used for security, stored in .env, you can generate one with the method above.
  • first_superuser: (default: "admin@example.com") The email of the first superuser (in .env).
  • first_superuser_password: (default: "changethis") The password of the first superuser (in .env).
  • smtp_host: (default: "") The SMTP server host to send emails, you can set it later in .env.
  • smtp_user: (default: "") The SMTP server user to send emails, you can set it later in .env.
  • smtp_password: (default: "") The SMTP server password to send emails, you can set it later in .env.
  • emails_from_email: (default: "info@example.com") The email account to send emails from, you can set it later in .env.
  • postgres_password: (default: "changethis") The password for the PostgreSQL database, stored in .env, you can generate one with the method above.
  • sentry_dsn: (default: "") The DSN for Sentry, if you are using it, you can set it later in .env.
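If you prefer to answer these non-interactively, Copier can also take values on the command line with its --data option; a sketch (the values are examples only):

copier copy https://github.com/tiangolo/full-stack-fastapi-template my-awesome-project --trust \
  --data project_name="My Awesome Project" \
  --data secret_key="$(python -c 'import secrets; print(secrets.token_urlsafe(32))')"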

Backend Development

Backend docs: backend/README.md.

Frontend Development

Frontend docs: frontend/README.md.

Deployment

Deployment docs: deployment.md.

Development

General development docs: development.md.

This includes using Docker Compose, custom local domains, .env configurations, etc.

Release Notes

Check the file release-notes.md.

License

The Full Stack FastAPI Template is licensed under the terms of the MIT license.

full-stack-fastapi-postgresql's People

Contributors

alejsdev, alonme, bechtold, br3ndonland, codesmith-emmy, dependabot[bot], disrupted, dmontagu, dr-neptune, dudil, ebreton, efonte, estebanx64, graue70, gucharbon, leonlowitzki, little7li, maurob, mocsar, mpclarkson, nonameentered, patrick91, qu3vipon, rcheese, rlonka, sanggusti, santigandolfo, stephenbrown2, tiangolo, uepoch


full-stack-fastapi-postgresql's Issues

navigating to /redocs - Bad Gateway

Using this locally: after I run docker-compose up, the containers come up correctly; pgAdmin, Flower, and the frontend all come up. Logging into the frontend wasn't working, so, following the steps in #16, I navigated to http://localhost/redocs. When going there I get the following error

502 Bad Gateway

in the console I get the following error message

time="2019-09-29T16:38:34Z" level=debug msg="vulcand/oxy/forward/http: Round trip: http://172.23.0.6:80, code: 304, Length: 0, duration: 4.3777ms"
proxy_1          | time="2019-09-29T16:38:34Z" level=debug msg="vulcand/oxy/forward/http: completed ServeHttp on request" Request="{\"Method\":\"GET\",\"URL\":{\"Scheme\":\"http\",\"Opaque\":\"\",\"User\":null,\"Host\":\"172.23.0.6:80\",\"Path\":\"\",\"RawPath\":\"\",\"ForceQuery\":false,\"RawQuery\":\"\",\"Fragment\":\"\"},\"Proto\":\"HTTP/1.1\",\"ProtoMajor\":1,\"ProtoMinor\":1,\"Header\":{\"Accept\":[\"*/*\"],\"Accept-Encoding\":[\"gzip, deflate, br\"],\"Accept-Language\":[\"en-US,en;q=0.9\"],\"Cache-Control\":[\"max-age=0\"],\"Connection\":[\"keep-alive\"],\"Cookie\":[\"Pycharm-e4c9983b=c8545d5e-7dc8-4c3b-beeb-18df8be60c30; Pycharm-950af9c=c8545d5e-7dc8-4c3b-beeb-18df8be60c30; _ga=GA1.1.1130344896.1560453393; Pycharm-7f37a41b=c8545d5e-7dc8-4c3b-beeb-18df8be60c30; csrftoken=8cAzuOnjEBIpetH1yAFm1Toeahy3DSS9txNmvXuY0f3YyRU1OfXXd5NRAy0XbaU4; sessionid=dgxl7lny768p48t2e2y7ys3bifkogfh1; PGADMIN_LANGUAGE=en; pga4_session=f57ff23f-8b7c-4320-83f0-f6cccf7c9d7e!ZawYY9R0RWij6jmfZ/xhxcE4Upw=\"],\"Dnt\":[\"1\"],\"If-Modified-Since\":[\"Fri, 27 Sep 2019 02:20:05 GMT\"],\"If-None-Match\":[\"\\\"5d8d71d5-3c5\\\"\"],\"Referer\":[\"http://localhost/service-worker.js\"],\"Sec-Fetch-Mode\":[\"same-origin\"],\"Sec-Fetch-Site\":[\"same-origin\"],\"Service-Worker\":[\"script\"],\"User-Agent\":[\"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/77.0.3865.90 Safari/537.36\"]},\"ContentLength\":0,\"TransferEncoding\":null,\"Host\":\"localhost\",\"Form\":null,\"PostForm\":null,\"MultipartForm\":null,\"Trailer\":null,\"RemoteAddr\":\"172.23.0.1:43180\",\"RequestURI\":\"/service-worker.js\",\"TLS\":null}"
proxy_1          | time="2019-09-29T16:38:34Z" level=debug msg="vulcand/oxy/forward: completed ServeHttp on request" Request="{\"Method\":\"GET\",\"URL\":{\"Scheme\":\"http\",\"Opaque\":\"\",\"User\":null,\"Host\":\"172.23.0.6:80\",\"Path\":\"\",\"RawPath\":\"\",\"ForceQuery\":false,\"RawQuery\":\"\",\"Fragment\":\"\"},\"Proto\":\"HTTP/1.1\",\"ProtoMajor\":1,\"ProtoMinor\":1,\"Header\":{\"Accept\":[\"*/*\"],\"Accept-Encoding\":[\"gzip, deflate, br\"],\"Accept-Language\":[\"en-US,en;q=0.9\"],\"Cache-Control\":[\"max-age=0\"],\"Connection\":[\"keep-alive\"],\"Cookie\":[\"Pycharm-e4c9983b=c8545d5e-7dc8-4c3b-beeb-18df8be60c30; Pycharm-950af9c=c8545d5e-7dc8-4c3b-beeb-18df8be60c30; _ga=GA1.1.1130344896.1560453393; Pycharm-7f37a41b=c8545d5e-7dc8-4c3b-beeb-18df8be60c30; csrftoken=8cAzuOnjEBIpetH1yAFm1Toeahy3DSS9txNmvXuY0f3YyRU1OfXXd5NRAy0XbaU4; sessionid=dgxl7lny768p48t2e2y7ys3bifkogfh1; PGADMIN_LANGUAGE=en; pga4_session=f57ff23f-8b7c-4320-83f0-f6cccf7c9d7e!ZawYY9R0RWij6jmfZ/xhxcE4Upw=\"],\"Dnt\":[\"1\"],\"If-Modified-Since\":[\"Fri, 27 Sep 2019 02:20:05 GMT\"],\"If-None-Match\":[\"\\\"5d8d71d5-3c5\\\"\"],\"Referer\":[\"http://localhost/service-worker.js\"],\"Sec-Fetch-Mode\":[\"same-origin\"],\"Sec-Fetch-Site\":[\"same-origin\"],\"Service-Worker\":[\"script\"],\"User-Agent\":[\"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/77.0.3865.90 Safari/537.36\"]},\"ContentLength\":0,\"TransferEncoding\":null,\"Host\":\"localhost\",\"Form\":null,\"PostForm\":null,\"MultipartForm\":null,\"Trailer\":null,\"RemoteAddr\":\"172.23.0.1:43180\",\"RequestURI\":\"/service-worker.js\",\"TLS\":null}"
proxy_1          | time="2019-09-29T16:38:34Z" level=debug msg="vulcand/oxy/roundrobin/rr: completed ServeHttp on request" Request="{\"Method\":\"GET\",\"URL\":{\"Scheme\":\"\",\"Opaque\":\"\",\"User\":null,\"Host\":\"\",\"Path\":\"/service-worker.js\",\"RawPath\":\"\",\"ForceQuery\":false,\"RawQuery\":\"\",\"Fragment\":\"\"},\"Proto\":\"HTTP/1.1\",\"ProtoMajor\":1,\"ProtoMinor\":1,\"Header\":{\"Accept\":[\"*/*\"],\"Accept-Encoding\":[\"gzip, deflate, br\"],\"Accept-Language\":[\"en-US,en;q=0.9\"],\"Cache-Control\":[\"max-age=0\"],\"Connection\":[\"keep-alive\"],\"Cookie\":[\"Pycharm-e4c9983b=c8545d5e-7dc8-4c3b-beeb-18df8be60c30; Pycharm-950af9c=c8545d5e-7dc8-4c3b-beeb-18df8be60c30; _ga=GA1.1.1130344896.1560453393; Pycharm-7f37a41b=c8545d5e-7dc8-4c3b-beeb-18df8be60c30; csrftoken=8cAzuOnjEBIpetH1yAFm1Toeahy3DSS9txNmvXuY0f3YyRU1OfXXd5NRAy0XbaU4; sessionid=dgxl7lny768p48t2e2y7ys3bifkogfh1; PGADMIN_LANGUAGE=en; pga4_session=f57ff23f-8b7c-4320-83f0-f6cccf7c9d7e!ZawYY9R0RWij6jmfZ/xhxcE4Upw=\"],\"Dnt\":[\"1\"],\"If-Modified-Since\":[\"Fri, 27 Sep 2019 02:20:05 GMT\"],\"If-None-Match\":[\"\\\"5d8d71d5-3c5\\\"\"],\"Referer\":[\"http://localhost/service-worker.js\"],\"Sec-Fetch-Mode\":[\"same-origin\"],\"Sec-Fetch-Site\":[\"same-origin\"],\"Service-Worker\":[\"script\"],\"User-Agent\":[\"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/77.0.3865.90 Safari/537.36\"]},\"ContentLength\":0,\"TransferEncoding\":null,\"Host\":\"localhost\",\"Form\":null,\"PostForm\":null,\"MultipartForm\":null,\"Trailer\":null,\"RemoteAddr\":\"172.23.0.1:43180\",\"RequestURI\":\"/service-worker.js\",\"TLS\":null}"
proxy_1          | 172.23.0.1 - - [29/Sep/2019:16:38:34 +0000] "GET /service-worker.js HTTP/1.1" 304 0 "http://localhost/service-worker.js" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/77.0.3865.90 Safari/537.36" 6 "PathPrefix-1" "http://172.23.0.6:80" 7ms

I was thinking it could possibly be a CORS issue, so I updated my main.py file as described in #25, but after adjusting and restarting Docker the issue persisted.

Back-end debugging in Docker

Thank you for this project, it's very useful.

One of the missing things is a description of how to debug backend services in Docker. I couldn't work out how to debug in VSCode (via ptvsd) or PyCharm (via the remote debug option). Probably some Traefik config should be changed in order to allow remote connections to the interpreter.

This cookiecutter but without the frontend

For a few of my projects, I have used this cookiecutter but removed the frontend part.

Would it be useful to create a cookiecutter that followed this project, but removed the frontend? It would allow the project to focus on purely the API, with the frontend in a separate project.

It would also make it easier to swap out the dashboard part and use something like https://github.com/marmelab/react-admin in a separate project.

How to configure celery workers?

I'm currently playing around with the Celery workers. Even though I have followed the official Celery worker guide, I could increase neither the worker count nor the concurrency.

Below is my change in worker-start.sh

# celery worker -A app.worker -l info -Q main-queue -c 4
celery worker -A app.worker --loglevel=INFO -O fair --concurrency=2 -n worker1@%h 
celery worker -A app.worker --loglevel=INFO -O fair --concurrency=2 -n worker2@%h

I'm expecting to start 2 workers, each able to fork 2 processes; however, after docker-compose restart, I only get a single worker with a single process. I added several tasks, each taking 180 seconds, and they could only be executed one after another, because there is only one worker to run them sequentially.

Are there any other places I should look into to get more workers/processes? Thanks.
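One likely culprit in the script above: the first celery worker command runs in the foreground and never returns, so the second line is never reached. A minimal sketch of one workaround, backgrounding every worker except the last so the container keeps a foreground process:

celery worker -A app.worker --loglevel=INFO -O fair --concurrency=2 -n worker1@%h &
celery worker -A app.worker --loglevel=INFO -O fair --concurrency=2 -n worker2@%h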

Pytest returns JSONDecodeError

Still working through exactly how everything fits together (great project that does everything), but I finally got around to testing the code after noticing that the Swagger UI would post data into the DB (verified in pgAdmin) but would throw a response error.

To start fresh, I pulled a new version of everything and ran the tests from a clean install; same error...

============================= test session starts ==============================
platform linux -- Python 3.7.4, pytest-5.1.2, py-1.8.0, pluggy-0.12.0
rootdir: /app/app/tests
collected 23 items

app/app/tests/api/api_v1/test_celery.py .                                [  4%]
app/app/tests/api/api_v1/test_items.py FF                                [ 13%]
app/app/tests/api/api_v1/test_login.py .F                                [ 21%]
app/app/tests/api/api_v1/test_users.py FFF..F                            [ 47%]
app/app/tests/crud/test_item.py ....                                     [ 65%]
app/app/tests/crud/test_user.py ........                                 [100%]

=================================== FAILURES ===================================
_______________________________ test_create_item _______________________________

superuser_token_headers = {'Authorization': 'Bearer eyJ0eXAiOiJ...o7a_gBBlGdepjpY'}

    def test_create_item(superuser_token_headers):
        server_api = get_server_api()
        data = {"title": "Foo", "description": "Fighters"}
        response = requests.post(
            f"{server_api}{config.API_V1_STR}/items/",
            headers=superuser_token_headers,
            json=data,
        )
>       content = response.json()

app/app/tests/api/api_v1/test_items.py:16: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
usr/local/lib/python3.7/site-packages/requests/models.py:897: in json
    return complexjson.loads(self.text, **kwargs)
usr/local/lib/python3.7/json/__init__.py:348: in loads
    return _default_decoder.decode(s)
usr/local/lib/python3.7/json/decoder.py:337: in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <json.decoder.JSONDecoder object at 0x7f3f7d41d510>
s = 'Internal Server Error', idx = 0

    def raw_decode(self, s, idx=0):
        """Decode a JSON document from ``s`` (a ``str`` beginning with
        a JSON document) and return a 2-tuple of the Python
        representation and the index in ``s`` where the document ended.
    
        This can be used to decode a JSON document from a string that may
        have extraneous data at the end.
    
        """
        try:
            obj, end = self.scan_once(s, idx)
        except StopIteration as err:
>           raise JSONDecodeError("Expecting value", s, err.value) from None
E           json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

usr/local/lib/python3.7/json/decoder.py:355: JSONDecodeError

You can see that the tests all fail on the same JSON decoding. I've tried each of the failed tests and noted the same behavior: the POSTs update the db, but returning information from the db hits the decoder error...

Here is an example of a successful POST of an item to the database where the response fails.

backend_1        | INFO: ('172.21.0.5', 57326) - "POST /api/v1/items/ HTTP/1.1" 500
backend_1        | ERROR: Exception in ASGI application
backend_1        | Traceback (most recent call last):
backend_1        |   File "/usr/local/lib/python3.7/site-packages/uvicorn/protocols/http/httptools_impl.py", line 368, in run_asgi
backend_1        |     result = await app(self.scope, self.receive, self.send)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/applications.py", line 133, in __call__
backend_1        |     await self.error_middleware(scope, receive, send)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/middleware/errors.py", line 177, in __call__
backend_1        |     raise exc from None
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/middleware/errors.py", line 155, in __call__
backend_1        |     await self.app(scope, receive, _send)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/middleware/base.py", line 25, in __call__
backend_1        |     response = await self.dispatch_func(request, self.call_next)
backend_1        |   File "./app/main.py", line 34, in db_session_middleware
backend_1        |     response = await call_next(request)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/middleware/base.py", line 45, in call_next
backend_1        |     task.result()
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/middleware/base.py", line 38, in coro
backend_1        |     await self.app(scope, receive, send)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/middleware/cors.py", line 84, in __call__
backend_1        |     await self.simple_response(scope, receive, send, request_headers=headers)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/middleware/cors.py", line 140, in simple_response
backend_1        |     await self.app(scope, receive, send)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/exceptions.py", line 73, in __call__
backend_1        |     raise exc from None
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/exceptions.py", line 62, in __call__
backend_1        |     await self.app(scope, receive, sender)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/routing.py", line 590, in __call__
backend_1        |     await route(scope, receive, send)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/routing.py", line 208, in __call__
backend_1        |     await self.app(scope, receive, send)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/routing.py", line 41, in app
backend_1        |     response = await func(request)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/fastapi/routing.py", line 124, in app
backend_1        |     skip_defaults=response_model_skip_defaults,
backend_1        |   File "/usr/local/lib/python3.7/site-packages/fastapi/routing.py", line 56, in serialize_response
backend_1        |     raise ValidationError(errors, field.type_)
backend_1        | pydantic.error_wrappers.ValidationError: 1 validation error for Item
backend_1        | response
backend_1        |   value is not a valid dict (type=type_error.dict)

There are no changes to the code... As I'm only slightly better than noob status, it's possible that I've missed something in the install workflow, but if so, I can't find it. Any help or a pointer in the right direction would be greatly appreciated.

backend code layout: maybe group by resources

IMHO, the current layout is not very practical for adding or removing resources, so I would like to suggest a differently pivoted organization of the code.

I think most people will continue from this cookiecutter by adding / removing resources like Items and Users.

Example usecase

Let's say that apart from Users and Items you want to add Ideas to your app.
Currently you need to visit each directory (api, crud, db, db_models, tests/*) and add a file named idea(s).py.

Maybe you start to think that the Ideas are better and you want to get rid of Items now...

Suggested layout

  • /users/

    • __init__.py
    • model.py
    • db.py
    • api.py
    • crud.py
    • utils.py
  • /items/

    • __init__.py
    • model.py
    • db.py
    • api.py
    • crud.py
  • /tokens/

    • ...
  • tests should be structured analogously

I am not yet sure how the other parts should be laid out, but I would try to squash the remaining files in the now almost-empty subpackages into core.

What do you think?

issue deploying vue frontend changes

I am new to Docker containerization. I had the application built and running using Docker. The backend changes I make are deployed correctly (verified through the Swagger docs). However, I am not able to deploy any changes made to the frontend application.

I made a change to the admin view, adding a new column to the table, and followed that up with "docker-compose up -d". I expected to see the change in the UI when ready, but that did not happen. The UI is still the same, without the new column I added.

[BUG] SQLAlchemy InvalidRequestError on relationships if one of the models is not yet imported

Describe the bug

The application crashes at start-up, when initializing data, if:

  1. a relationship is defined ...
  2. ... with a model not already imported at the time of execution.
backend_1        | INFO:__main__:Starting call to '__main__.init', this is the 2nd time calling it.
backend_1        | INFO:__main__:Service finished initializing
backend_1        | INFO  [alembic.runtime.migration] Context impl PostgresqlImpl.
backend_1        | INFO  [alembic.runtime.migration] Will assume transactional DDL.
backend_1        | INFO  [alembic.runtime.migration] Running upgrade  -> d4867f3a4c0a, First revision
backend_1        | INFO  [alembic.runtime.migration] Running upgrade d4867f3a4c0a -> ea9cad5d9292, Added SubItem models
backend_1        | INFO:__main__:Creating initial data
backend_1        | Traceback (most recent call last):
backend_1        |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/ext/declarative/clsregistry.py", line 294, in __call__
backend_1        |     x = eval(self.arg, globals(), self._dict)
backend_1        |   File "<string>", line 1, in <module>
backend_1        | NameError: name 'SubItem' is not defined
...
backend_1        | sqlalchemy.exc.InvalidRequestError: When initializing mapper mapped class Item->item, expression 'SubItem' failed to locate a name ("name 'SubItem' is not defined"). If this is a class name, consider adding this relationship() to the <class 'app.db_models.item.Item'> class after both dependent classes have been defined.
base-project_backend_1 exited with code 1

To Reproduce

create a new db_models/subitem.py (it could be copied from item.py)

It is important that this new model has a relationship to another one, e.g. Item

from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.orm import relationship

from app.db.base_class import Base


class SubItem(Base):
    id = Column(Integer, primary_key=True, index=True)
    title = Column(String, index=True)
    description = Column(String, index=True)
    item_id = Column(Integer, ForeignKey("item.id"))
    item = relationship("Item", back_populates="subitems")

Adapt db_models/item.py with the new relationship

...

class Item(Base):
    ...
    subitems = relationship("SubItem", back_populates="item")

Declare the new SubItem in db/base.py as per the documentation

# Import all the models, so that Base has them before being
# imported by Alembic
from app.db.base_class import Base  # noqa
from app.db_models.user import User  # noqa
from app.db_models.item import Item  # noqa
from app.db_models.subitem import SubItem  # noqa

Re-build and start the application. The full traceback follows

backend_1        | INFO:__main__:Creating initial data
backend_1        | Traceback (most recent call last):
backend_1        |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/ext/declarative/clsregistry.py", line 294, in __call__
backend_1        |     x = eval(self.arg, globals(), self._dict)
backend_1        |   File "<string>", line 1, in <module>
backend_1        | NameError: name 'SubItem' is not defined
backend_1        |
backend_1        | During handling of the above exception, another exception occurred:
backend_1        |
backend_1        | Traceback (most recent call last):
backend_1        |   File "/app/app/initial_data.py", line 21, in <module>
backend_1        |     main()
backend_1        |   File "/app/app/initial_data.py", line 16, in main
backend_1        |     init()
backend_1        |   File "/app/app/initial_data.py", line 11, in init
backend_1        |     init_db(db_session)
backend_1        |   File "/app/app/db/init_db.py", line 12, in init_db
backend_1        |     user = crud.user.get_by_email(db_session, email=config.FIRST_SUPERUSER)
backend_1        |   File "/app/app/crud/user.py", line 16, in get_by_email
backend_1        |     return db_session.query(User).filter(User.email == email).first()
backend_1        |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/scoping.py", line 162, in do
backend_1        |     return getattr(self.registry(), name)(*args, **kwargs)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 1543, in query
backend_1        |     return self._query_cls(entities, self, **kwargs)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 168, in __init__
backend_1        |     self._set_entities(entities)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 200, in _set_entities
backend_1        |     self._set_entity_selectables(self._entities)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 231, in _set_entity_selectables
backend_1        |     ent.setup_entity(*d[entity])
backend_1        |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 4077, in setup_entity
backend_1        |     self._with_polymorphic = ext_info.with_polymorphic_mappers
backend_1        |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/util/langhelpers.py", line 855, in __get__
backend_1        |     obj.__dict__[self.__name__] = result = self.fget(obj)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/mapper.py", line 2135, in _with_polymorphic_mappers
backend_1        |     configure_mappers()
backend_1        |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/mapper.py", line 3229, in configure_mappers
backend_1        |     mapper._post_configure_properties()
backend_1        |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/mapper.py", line 1947, in _post_configure_properties
backend_1        |     prop.init()
backend_1        |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/interfaces.py", line 196, in init
backend_1        |     self.do_init()
backend_1        |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/relationships.py", line 1860, in do_init
backend_1        |     self._process_dependent_arguments()
backend_1        |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/relationships.py", line 1922, in _process_dependent_arguments
backend_1        |     self.target = self.entity.persist_selectable
backend_1        |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/util/langhelpers.py", line 855, in __get__
backend_1        |     obj.__dict__[self.__name__] = result = self.fget(obj)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/relationships.py", line 1827, in entity
backend_1        |     argument = self.argument()
backend_1        |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/ext/declarative/clsregistry.py", line 306, in __call__
backend_1        |     % (self.prop.parent, self.arg, n.args[0], self.cls)
backend_1        | sqlalchemy.exc.InvalidRequestError: When initializing mapper mapped class Item->item, expression 'SubItem' failed to locate a name ("name 'SubItem' is not defined"). If this is a class name, consider adding this relationship() to the <class 'app.db_models.item.Item'> class after both dependent classes have been defined.
base-project_backend_1 exited with code 1

Expected behavior
The application should have started normally.

Additional context
In (most) real use cases, the application would have defined some CRUD operations on the newly defined model, and consequently imported it here and there, thus making it available by the time the initial data is created.

Nevertheless, the error is so annoying and obscure (when it happens) that it deserves a safeguard (see my PR for a suggestion).

Idea: Run tests locally outside docker

related also to #24.

First, the tests in tests/api/* are not using a server_api fixture:

from app.tests.utils.utils import get_server_api

def test_celery_worker_test(superuser_token_headers):
    server_api = get_server_api()

should be

def test_celery_worker_test(server_api, superuser_token_headers):

this would allow improving the pytest tests this way (as an idea; further improvements are possible):

import time
from multiprocessing import Process

import pytest
import uvicorn

def run_server():
    uvicorn.run("app.main:app", port=8123)

@pytest.fixture(scope="module")
def server_api():
    proc = Process(target=run_server, daemon=True)
    proc.start()
    time.sleep(1)  # give the server a moment to start
    yield "http://localhost:8123"
    proc.kill()  # cleanup after the tests

credits to: https://stackoverflow.com/questions/57412825/how-to-start-a-uvicorn-fastapi-in-background-when-testing-with-pytest

[Question] WebSocket config example

Could you please provide an example of how to configure access to WebSockets? For instance, to the following endpoint:

from fastapi import FastAPI, WebSocket

app = FastAPI()

@app.websocket("/ws")
async def websocket_endpoint(websocket: WebSocket):
    await websocket.accept()
    await websocket.send_text("WebSocket client connected")

I couldn't connect to the endpoint from the client app, getting Error during WebSocket handshake: Unexpected response code: 200. I suppose either the Traefik or Nginx configs should be changed.
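To narrow down whether the proxy is at fault, here is a minimal client sketch that connects to the backend directly, bypassing Traefik/Nginx (assuming the websockets package, and that the backend is reachable on localhost:8888; adjust to your setup):

import asyncio

import websockets  # pip install websockets

async def main():
    # connect straight to the backend, not through the proxy
    async with websockets.connect("ws://localhost:8888/ws") as ws:
        print(await ws.recv())

asyncio.run(main())

If this works but the proxied URL does not, the proxy configuration is the place to look.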

Regarding the usage of the 'sub' field in jwt tokens

Hi, thanks for the awesome repo. I'm building a site based on it, and found something that confused me.

This project uses the JWT field 'sub' as a kind of way of explaining the intention of the JWT (first example, second example), instead of just using it to describe the subject (the user, in our case) of the token. I thought the latter was the way the 'sub' field should work, at least from what I've read.

Is there a reason for this? Am I understanding the usage of 'sub' wrong?

I am planning to use the 'sub' field on my site to send the username back to the web client, and this way have an easy way to access it to show on the site. Is this wrong or unsafe?

Thanks!
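For reference, a minimal sketch of the conventional use of sub, carrying the user identifier (assuming python-jose; the values are illustrative):

from jose import jwt  # pip install python-jose

SECRET_KEY = "changethis"  # use your real secret key

# encode the user id as the token's subject
token = jwt.encode({"sub": "42"}, SECRET_KEY, algorithm="HS256")

claims = jwt.decode(token, SECRET_KEY, algorithms=["HS256"])
print(claims["sub"])  # "42"

One caveat for the username idea: JWT payloads are signed, not encrypted, so anything placed in sub is readable by the client. That is fine for a username, but never put secrets there.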

When defining your SQLAlchemy models, you can specify a default parameter:

class Item(Base):
    __tablename__ = "item"
    id = Column(Integer, primary_key=True)
    created_at = Column(DateTime(), default=func.now())
    status = Column(String, default='open')

Say, for example, the column is status and the possible values are (open, closed, unresolved).

Exactly!

I am a little confused here

Originally posted by @AnneNamuli in #60 (comment)

import enum

from sqlalchemy import Column, DateTime, Enum, Integer, func

from app.db.base_class import Base


class myEnum(enum.Enum):
    open = "open"
    closed = "closed"
    unresolved = "unresolved"


class Item(Base):
    __tablename__ = "item"
    id = Column(Integer, primary_key=True)
    created_at = Column(DateTime(), default=func.now())
    status = Column(Enum(myEnum), default=myEnum.open)

[BUG] DB OperationalError

Describe the bug

The application randomly returns 500 errors.

When looking at the logs, we can see some OperationalError occurring: sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) server closed the connection unexpectedly

Following the link at the end of the traceback, we have the following information from the SQLAlchemy web site:

Exception raised for errors that are related to the database's operation and not necessarily under the control of the programmer, e.g. an unexpected disconnect occurs, the data source name is not found, a transaction could not be processed, a memory allocation error occurred during processing, etc.

This error is a DBAPI Error and originates from the database driver (DBAPI), not SQLAlchemy itself.

The OperationalError is the most common (but not the only) error class used by drivers in the context of the database connection being dropped, or not being able to connect to the database. For tips on how to deal with this, see the section Dealing with Disconnects.

Here is the full traceback

[2019-05-20 07:46:24 +0000] [12] [INFO] ('10.0.16.163', 56442) - "GET /api/v1/audit_app/audit_groups/me HTTP/1.1" 500
[2019-05-20 07:46:24 +0000] [12] [ERROR] Exception in ASGI application
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/uvicorn/protocols/http/httptools_impl.py", line 372, in run_asgi
result = await asgi(self.receive, self.send)
File "/usr/local/lib/python3.7/site-packages/starlette/middleware/errors.py", line 125, in asgi
raise exc from None
File "/usr/local/lib/python3.7/site-packages/starlette/middleware/errors.py", line 103, in asgi
await asgi(receive, _send)
File "/usr/local/lib/python3.7/site-packages/starlette/middleware/base.py", line 27, in asgi
response = await self.dispatch_func(request, self.call_next)
File "/app/app/main.py", line 37, in db_session_middleware
response = await call_next(request)
File "/usr/local/lib/python3.7/site-packages/starlette/middleware/base.py", line 44, in call_next
task.result()
File "/usr/local/lib/python3.7/site-packages/starlette/middleware/base.py", line 37, in coro
await inner(request.receive, queue.put)
File "/usr/local/lib/python3.7/site-packages/starlette/exceptions.py", line 74, in app
raise exc from None
File "/usr/local/lib/python3.7/site-packages/starlette/exceptions.py", line 63, in app
await instance(receive, sender)
File "/usr/local/lib/python3.7/site-packages/starlette/routing.py", line 41, in awaitable
response = await func(request)
File "/usr/local/lib/python3.7/site-packages/fastapi/routing.py", line 66, in app
request=request, dependant=dependant, body=body
File "/usr/local/lib/python3.7/site-packages/fastapi/dependencies/utils.py", line 270, in solve_dependencies
background_tasks=background_tasks,
File "/usr/local/lib/python3.7/site-packages/fastapi/dependencies/utils.py", line 279, in solve_dependencies
solved = await run_in_threadpool(sub_dependant.call, **sub_values)
File "/usr/local/lib/python3.7/site-packages/starlette/concurrency.py", line 24, in run_in_threadpool
return await loop.run_in_executor(None, func, *args)
File "/usr/local/lib/python3.7/concurrent/futures/thread.py", line 57, in run
result = self.fn(*self.args, **self.kwargs)
File "/app/app/api/utils/security.py", line 28, in get_current_user
user = crud.user.get(db, user_id=token_data.user_id)
File "/app/app/crud/user.py", line 12, in get
return db_session.query(User).filter(User.id == user_id).first()
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3215, in first
ret = list(self[0:1])
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3007, in __getitem__
return list(res)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3317, in __iter__
return self._execute_and_instances(context)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3342, in _execute_and_instances
result = conn.execute(querycontext.statement, self._params)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 988, in execute
return meth(self, multiparams, params)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/sql/elements.py", line 287, in _execute_on_connection
return connection._execute_clauseelement(self, multiparams, params)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1107, in _execute_clauseelement
distilled_params,
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1248, in _execute_context
e, statement, parameters, cursor, context
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1466, in _handle_dbapi_exception
util.raise_from_cause(sqlalchemy_exception, exc_info)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 383, in raise_from_cause
reraise(type(exception), exception, tb=exc_tb, cause=cause)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 128, in reraise
raise value.with_traceback(tb)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1244, in _execute_context
cursor, statement, parameters, context
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 552, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) server closed the connection unexpectedly
This probably means the server terminated abnormally
before or while processing the request.
[SQL: SELECT users.name AS users_name, users.password AS users_password, users.id AS users_id, users.email AS users_email, users.is_active AS users_is_active, users.is_superuser AS users_is_superuser, users.created_at AS users_created_at, users.updated_at AS users_updated_at, users.city_id AS users_city_id
FROM users
WHERE users.id = %(id_1)s
LIMIT %(param_1)s]
[parameters: {'id_1': 2, 'param_1': 1}]
(Background on this error at: http://sqlalche.me/e/e3q8)

Expected behavior
No 500 returned to the end user. The connection should be refreshed if needed.

Additional context
The application runs within a container, following the setup from https://dockerswarm.rocks

[QUESTION] How to start celery beat together with workers?

I'm trying to use the celery beat service to start some scheduled tasks and publish them to a different queue, so I can have a different type of worker consume them.

My current approach is to start it the same way a normal worker is started in the Dockerfile; however, it seems I can only start the beat service, and none of my workers start.

COPY ./app/worker-start.sh /worker-start.sh

RUN chmod +x /worker-start.sh

CMD ["bash", "/worker-start.sh"]

COPY ./app/checker-start.sh /checker-start.sh

RUN chmod +x /checker-start.sh

CMD ["bash", "/checker-start.sh"]

COPY ./app/beat-start.sh /beat-start.sh

RUN chmod +x /beat-start.sh

CMD ["bash", "/beat-start.sh"]

Is this the correct way of starting a new type of worker along with beat in Docker? Any comments are appreciated. Thanks.
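One thing to note about the snippet above: a Dockerfile only honors the last CMD, so the three CMD lines collapse into a single start command. A common pattern is to build one image and override the command per service in Compose; a sketch (service names and paths assumed):

services:
  worker:
    build: ./backend
    command: bash /worker-start.sh
  checker:
    build: ./backend
    command: bash /checker-start.sh
  beat:
    build: ./backend
    command: bash /beat-start.sh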

Preferred way to move the production database to an external service (such as AWS RDS)?

A short question: we've been using the full-stack template for a while now and are slowly getting the hang of it. We want to host our database on AWS RDS rather than internally in a Docker container. Is there an easy way to configure this so that only the production URL points to RDS, while all the others still use a local Docker container for development, staging, and testing?

Currently one of our devs created a second env file called env-postgres-prod.env and changed docker-compose.shared.env.yml to use it instead, but this also affects the test scripts etc...

[Question] Association tables

@tiangolo, I am having trouble updating data in an association table. The function does not throw any errors, but it doesn't update the field either. I copied the same logic from the update function here: backend/app/app/crud/items.py

[Question] No async endpoints in full-stack-fastapi-postgresql app?

I see that the API endpoints do not have async handlers. Doesn't this mean that these handlers will run synchronously and won't be able to take advantage of the event loop / async capabilities the way NodeJS does?

There might be some gap in my understanding here. If so, kindly point me to a relevant source. Thanks.

Backend tests don't pass

Getting the following error when running the backend tests on a cleanly installed project:

pydantic.error_wrappers.ValidationError: 1 validation error
response
  value is not a valid dict (type=type_error.dict)

Fix is to add:

class Config:
    orm_mode = True

to the pydantic models for Item and User:

# Additional properties to return via API
class User(UserBaseInDB):

    class Config:
        orm_mode = True

/users/me returns 500 (GET/POST) - pydantic issue

issue

Hi, currently something seems to be wrong with the pydantic validation on the user model.
Any GET/POST to /users/me returns a 500.
Logs:

backend_1        | pydantic.error_wrappers.ValidationError: 1 validation error for User
backend_1        | response
backend_1        |   value is not a valid dict (type=type_error.dict)

Request payload looks like a dict :)

how to reproduce

fresh cookiecutter-generated project: log into the frontend as admin and change your own username/email, etc.

[Question] How to deploy from local Ubuntu desktop machine to Digital Ocean machine?

Hello,

First, thanks for providing such an excellent stack.

I'm a little new to Docker and I'm struggling to find the easiest way to deploy from my local Linux machine to my Digital Ocean (DO) droplet.

Here's how I understand the process outlined in the project folder "README.md" and from the https://dockerswarm.rocks/traefik/ docs:

  1. Set up DO Ubuntu machine and follow instructions to set up Traefik Proxy w/ HTTPS (https://dockerswarm.rocks/traefik/). I was successful here.

  2. Use cookiecutter to generate full-stack-fastapi-postgresql project.

  3. Develop/Test locally, also successful

  4. Build/Deploy: here is where I'm a little confused. Should I use docker-machine on my local computer, or do I set up a swarm that I manage from my local machine? I'm confused by Docker's docs. I'm trying to build/deploy with your intended instructions.

I tried to set up a swarm "manager" on my local machine and then join a "worker" (my VPS on Digital Ocean), but I got an error: Error response from daemon: could not find local IP address: dial udp [my IPv6 address of local machine].

I also tried to provision with docker-machine but I'm getting the feeling that I should be using swarm. The Docker docs seem to state that docker-machine has been superseded.

Thanks in advance!

calling a common function that uses a db session from inside fastapi as well as celery

Hi guys,
I have a pretty common use case that I'm not able to figure out: I have a common function that I need to call from FastAPI as well as from a Celery worker.

However, I'm not able to figure out how to create the session in a way that works for both the API and the worker.

The code in this repo does not have an example that I could find, but this is super common in the real world.

Could you comment on how we should write this? If you could just add a function that runs a db query and use that function in both the API and the Celery worker, that would be enough.

We have a pretty hacky way of doing this in Flask (check whether a Flask request context exists, etc.). I'm hoping there's a much cleaner way here.
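As a sketch of one clean approach (names such as SessionLocal, User, and celery_app are assumed from the template): write the shared function against a plain Session argument, let FastAPI inject the session via a dependency, and have the Celery task open and close its own session:

from fastapi import Depends, FastAPI
from sqlalchemy.orm import Session

from app.db.session import SessionLocal  # assumed location of the session factory
from app.db_models.user import User  # assumed model
from app.worker import celery_app  # assumed Celery app

app = FastAPI()

def count_users(db: Session) -> int:
    # the shared function only depends on a Session, not on where it came from
    return db.query(User).count()

def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()

@app.get("/users/count")
def users_count(db: Session = Depends(get_db)) -> int:
    return count_users(db)

@celery_app.task
def users_count_task() -> int:
    db = SessionLocal()
    try:
        return count_users(db)
    finally:
        db.close()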

Exception in ASGI application with newest Fastapi

I've successfully used the full-stack FastAPI template in the past. With the recent version of FastAPI installed through the Dockerfile, I get the following error when I watch docker-compose logs -f backend and open https://localhost/:

backend_1        | INFO: ('172.22.0.2', 39320) - "GET /api/v1/users/me HTTP/1.1" 500
backend_1        | ERROR: Exception in ASGI application
backend_1        | Traceback (most recent call last):
backend_1        |   File "/usr/local/lib/python3.7/site-packages/uvicorn/protocols/http/httptools_impl.py", line 368, in run_asgi
backend_1        |     result = await app(self.scope, self.receive, self.send)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/applications.py", line 133, in __call__
backend_1        |     await self.error_middleware(scope, receive, send)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/middleware/errors.py", line 122, in __call__
backend_1        |     raise exc from None
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/middleware/errors.py", line 100, in __call__
backend_1        |     await self.app(scope, receive, _send)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/middleware/base.py", line 25, in __call__
backend_1        |     response = await self.dispatch_func(request, self.call_next)
backend_1        |   File "./app/main.py", line 34, in db_session_middleware
backend_1        |     response = await call_next(request)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/middleware/base.py", line 45, in call_next
backend_1        |     task.result()
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/middleware/base.py", line 38, in coro
backend_1        |     await self.app(scope, receive, send)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/middleware/cors.py", line 76, in __call__
backend_1        |     await self.app(scope, receive, send)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/exceptions.py", line 73, in __call__
backend_1        |     raise exc from None
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/exceptions.py", line 62, in __call__
backend_1        |     await self.app(scope, receive, sender)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/routing.py", line 585, in __call__
backend_1        |     await route(scope, receive, send)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/routing.py", line 207, in __call__
backend_1        |     await self.app(scope, receive, send)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/starlette/routing.py", line 40, in app
backend_1        |     response = await func(request)
backend_1        |   File "/usr/local/lib/python3.7/site-packages/fastapi/routing.py", line 122, in app
backend_1        |     skip_defaults=response_model_skip_defaults,
backend_1        |   File "/usr/local/lib/python3.7/site-packages/fastapi/routing.py", line 54, in serialize_response
backend_1        |     raise ValidationError(errors)
backend_1        | pydantic.error_wrappers.ValidationError: 1 validation error
backend_1        | response
backend_1        |   value is not a valid dict (type=type_error.dict)

There's a small chance that this error relates to some code edits I have introduced, but I suspect it is related to the recent changes in how pydantic validates models in newer FastAPI versions. Is anyone else experiencing this? I haven't tested it with a fresh full-stack-fastapi-postgresql, but I will check as soon as possible.

Just a note: login still works, so perhaps this is just a backend error with no consequences, but it is still annoying to see this error every time the login screen is requested.

[Question] File downloading

I'd like the user to be able to download, via an API call, a file I have within the /app directory. What is the best way to do this, given that the file can be considerably big?
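For what it's worth, FastAPI's FileResponse streams the file from disk rather than loading it into memory, so large files are fine; a minimal sketch (the path and filename are examples):

from fastapi import FastAPI
from fastapi.responses import FileResponse

app = FastAPI()

@app.get("/download")
def download():
    # the file is sent in chunks; filename sets the Content-Disposition header
    return FileResponse("/app/big_file.zip", filename="big_file.zip")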

How to connect to database locally?

I'm really interested in this project and want to get started with this template.

However, I have a basic question about connecting to the database inside the container.

In the development environment, how do I connect to the database? I'm trying to set up a new server connection in pgAdmin at http://localhost:5050, but I always get rejected:

Unable to connect to server:
could not connect to server: Connection refused
Is the server running on host "127.0.0.1" and accepting
TCP/IP connections on port 5432?

The URL was 127.0.0.1:5432, database postgres, user postgres, with the password I generated earlier. Did I get anything wrong here?
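One thing to check: the pgAdmin served at http://localhost:5050 runs inside a container on the same Docker network as the database, so from its point of view 127.0.0.1 is the pgAdmin container itself. Try the Compose service name of the database (db in the default setup) as the host instead, and verify the database is up with something like:

docker-compose exec db psql -U postgres -c "SELECT 1"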

[Question] Why is there no pipenv / poetry for installing dependencies?

I would assume that one or the other (or even a requirements.txt) would be used for setting up the Python dependencies.

I've seen so MANY nice libraries/abstractions used throughout this cookiecutter that I'm surprised there is no strict way of pinning Python package dependency versions.

Cookiecutter Windows Bug in worker-start.sh: CRLF Linebreaks

First, many thanks for this project! I am looking forward to testing it.

I had some problems getting it to run on Windows and found the source (it was me). Even though it was my fault, I thought a note could be added to the readme for other users:

The backend would not start because the linebreaks in worker-start.sh (and other bash files) were CRLF (Windows), not LF (Linux). In docker-compose logs this shows up as invalid command \r etc.

I used cookiecutter on Windows with cookiecutter https://github.com/tiangolo/full-stack-fastapi-postgresql, which produced CRLF linebreaks. Once I ran cookiecutter from WSL bash in Ubuntu, everything worked fine!

I think it is generally a good recommendation to use docker-compose and docker from WSL bash (connected to the Windows Docker daemon running on localhost), and to set VSCode (if used from Windows) to use LF in the global settings with "files.eol": "\n".
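A repository-level safeguard for this, as a sketch: a .gitattributes rule that forces LF line endings for shell scripts, so Windows checkouts cannot break them:

# .gitattributes
*.sh text eol=lf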

Backend not starting

Hello,
I've tried running the project locally on a Windows server (using cookiecutter), but the backend is not starting.
docker logs shows:
Checking for script in /app/prestart.sh
Running script /app/prestart.sh
: not foundad.sh: 2: /app/prestart.sh:

Default values

How do I create default values in the database (SQLAlchemy)?

buried functionality in docker images

Investigating a reference implementation like this, I see prestart.sh, gunicorn, and other things, and I have no idea how they are called or even how the uvicorn server starts up. I cloned the uvicorn/gunicorn/docker image and saw that there is quite a bit of buried functionality in there that is hidden from this reference implementation. That makes it less useful and harder to follow.

For me, the purpose of an implementation like this is to take the parts that fit into my project structure and modify the parts that don't. For security reasons, I can't use community-contributed docker images, and even if I could, this structure does not work in my project's mono-repo and microservice build structure. I now have to look in two different repositories to see how I could modify this structure to use in my project.

One last note: in my opinion there's way too much code in the docker images that then calls this implementation's code. I like the structure of the project in general, but it creates a lot of difficulties and removes flexibility for testing and other CI work.

[Question] scoped_session versus Session sqlalchemy

According to the SQLAlchemy documentation, the Session is not meant to be used concurrently, and libraries like Flask-SQLAlchemy provide lazy support, etc.
So given that SQLAlchemy creates a connection pool underneath, and that non-async endpoints in FastAPI are run within a threadpool, is the Session implementation in app.db thread-safe?

SQLAlchemy recommends using a scoped_session in the case where you don't have extra library support, since it uses thread-locals; that's what Flask-SQLAlchemy uses:
https://docs.sqlalchemy.org/en/13/orm/contextual.html#unitofwork-contextual

I was wondering about your opinion on this, and whether you think it's safe to use the session this way on a high-throughput site?
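For context, the session-per-request pattern from the FastAPI docs sidesteps sharing one Session across threads: each request gets its own Session from the sessionmaker (the factory itself is thread-safe) and closes it when the request ends. A sketch (the connection URL is an example):

from fastapi import Depends, FastAPI
from sqlalchemy import create_engine
from sqlalchemy.orm import Session, sessionmaker

engine = create_engine("postgresql://postgres:changethis@db/app")  # example URL
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)

app = FastAPI()

def get_db():
    # one Session per request, never shared between threads
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()

@app.get("/ping")
def ping(db: Session = Depends(get_db)) -> str:
    return "ok"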

deployment: curl localhost connection refused

Hi, I don't know a lot about networking. Can someone help? I did the deploy on an Amazon machine, and everything looked like it worked; all the services are running. But when I curl, I get connection refused. I even get connection refused from within the machine itself. Can someone point me in the right direction? Thanks so much.

Here we see that netstat shows nothing is listening on port 80 after deployment:

[screenshot: netstat output]

Question: How to run tests outside of a docker container?

Hi, thanks for the beautiful framework and cookiecutter project!

I'm setting up a project using this cookiecutter and I haven't managed to run the tests without using the script that builds the test docker container. Is there a way to quickly run the tests without recreating a docker container?
I plan to use TDD in my development, and if I have to wait a long time for the tests to run it will discourage me from following that practice...

I tried running the tests by executing docker-compose exec backend bash and then running bash tests-start.sh, and also by simply running pytest, but with the former the tests don't manage to authenticate, and with the latter I get a pytest not found error.

Any hints would be very welcome!

os.getenvb() not supported on Windows (used in config.py)

config.py contains this code:

SECRET_KEY = os.getenvb(b"SECRET_KEY")
if not SECRET_KEY:
    SECRET_KEY = os.urandom(32)

https://docs.python.org/3.6/library/os.html#os.getenvb

getenvb() is only available if supports_bytes_environ is True.

so a possible change (for Windows as a development platform) is:

if os.supports_bytes_environ:
    SECRET_KEY = os.getenvb(b"SECRET_KEY", default=os.urandom(32))
else:
    SECRET_KEY = os.urandom(32)

Note also that a default can be provided in os.getenv(key, default=None) / os.getenvb(key, default=None) calls, so other places can then be improved as well.
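
As another option (a sketch, assuming the surrounding code can accept a str key rather than bytes), the stdlib secrets module avoids os.getenvb entirely:

import os
import secrets

# Fall back to a random hex string when SECRET_KEY is not set in the environment.
SECRET_KEY = os.getenv("SECRET_KEY") or secrets.token_hex(32)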

[QUESTION] Unable to build containers due to memory limit?

I'm trying to run docker-compose up on an Ubuntu Linux server; however, it hangs when building the frontend stage.

Creating network "aapi_default" with the default driver
Building frontend
Step 1/13 : FROM tiangolo/node-frontend:10 as build-stage
 ---> 46be30c070b7
Step 2/13 : WORKDIR /app
 ---> Running in ac6245f9ad3a
Removing intermediate container ac6245f9ad3a
 ---> f6ddb3ab64e4
Step 3/13 : COPY package*.json /app/
 ---> f843ecca92ae
Step 4/13 : RUN npm install
 ---> Running in 039a5520302d

> [email protected] install /app/node_modules/yorkie
> node bin/install.js

setting up Git hooks
can't find .git directory, skipping Git hooks installation
npm WARN optional SKIPPING OPTIONAL DEPENDENCY: [email protected] (node_modules/fsevents):
npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for [email protected]: wanted {"os":"darwin","arch":"any"} (current: {"os":"linux","arch":"x64"})

added 1418 packages from 842 contributors, removed 14 packages, updated 3 packages and audited 38434 packages in 29.045s
found 894 vulnerabilities (63 low, 10 moderate, 821 high)
  run `npm audit fix` to fix them, or `npm audit` for details
Removing intermediate container 039a5520302d
 ---> 2b01d93f74a7
Step 5/13 : COPY ./ /app/
 ---> 3a95f66c6d2c
Step 6/13 : ARG FRONTEND_ENV=production
 ---> Running in 56e44e902622
Removing intermediate container 56e44e902622
 ---> 7abe680a8443
Step 7/13 : ENV VUE_APP_ENV=${FRONTEND_ENV}
 ---> Running in 66d67a827ab8
Removing intermediate container 66d67a827ab8
 ---> affaeaff5171
Step 8/13 : RUN npm run test:unit
 ---> Running in 2d6296538665

> [email protected] test:unit /app
> vue-cli-service test:unit

PASS tests/unit/upload-button.spec.ts
  UploadButton.vue
    โœ“ renders props.title when passed (34ms)

Test Suites: 1 passed, 1 total
Tests:       1 passed, 1 total
Snapshots:   0 total
Time:        4.659s
Ran all test suites.
Removing intermediate container 2d6296538665
 ---> 0781963d4ded
Step 9/13 : RUN npm run build
 ---> Running in 27f3f67ef14e

> [email protected] build /app
> vue-cli-service build


-  Building for production...
Starting type checking and linting service...
Using 1 worker with 2048MB memory limit
Browserslist: caniuse-lite is outdated. Please run next command `npm update caniuse-lite browserslist`
^C^C^CException compose.cli.signals.ShutdownException: ShutdownException() in <generator object _error_catcher at 0x7fb9c385e960> ignored
Gracefully stopping... (press Ctrl+C again to force)

The issue could be due to the "Using 1 worker with 2048MB memory limit" step, since I'm using a server with 1GB of memory.

When it hangs, I can observe the kswapd0 process (which manages virtual memory) gradually climbing to 99.9% CPU.

  PID USER      PR  NI    VIRT    RES    SHR S %CPU %MEM     TIME+ COMMAND
   36 root      20   0       0      0      0 R 99.9  0.0   0:25.77 kswapd0
 1845 ubuntu    20   0   44572    704    292 R  3.2  0.1   0:00.86 top
 2664 root      20   0 1354116 330408      0 D  2.3 32.8   0:16.23 node
  909 root      20   0  768804  20912      0 S  1.2  2.1   0:00.30 containerd

Does this mean that at least 2GB of memory is required to use this project? Is it possible to run it on my 1GB memory server?

Weak support for 'ForwardRef' with python 3.6.8

Hi @tiangolo ,

Thanks a lot for your amazing contributions: fastapi 🎈, dockerswarm.rocks 🌟, and all the project generators 🚀.

I just hit an issue when creating my first one-to-many relationship, as described in tiangolo/fastapi#153.

One straightforward solution is to move the Python version up from 3.6 to 3.7 in the Dockerfile(s).

Do you have any particular advice on how to do this? Or shall I just create a PR with the 3 corrections?

Emmanuel
-- cortexia.ch

Question: How to backup/restore postgres and other data in docker containers.

Hi, first of all, I really like the FastAPI project and the project generator.

I picked up quite a lot on docker and other tech.

I'm just getting into docker containers.

So perhaps I missed something, but I'm wondering how you would back up/restore the postgres data in a production (and/or staging) docker swarm?
The same goes for any other data files (uploaded files, for example) in some form of persistent filestore. (When not using S3 or similar, but storing data locally in a docker volume.)

I've found links about running pg_dump via docker exec in the postgres container(s) and dumping the backup into another mounted backup volume.

Also, what are the options for getting those backups from the docker swarm host onto remote storage?
E.g. crontab/rsync on the docker swarm host.

Or would it be better to create a custom container in each swarm project that takes care of backing up the database(s), stored files, and other persistent data and sends it over to another (remote) location?
And in the latter case, how would it work when I want to fire up a deployment starting from an existing backup of the database and/or a set of stored file data?

Is this something that can be automated?
Or at least prepared in a set of scripts that can be fired up when appropriate?
Some pointers are more than welcome. In fact, if this sounds useful to other people, it could even be integrated into the project generator.
Or at least some documentation on how to go about it.
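
Until something official exists, a rough sketch of a host-side dump script (the container name, user, and database name are assumptions to adapt; under swarm the actual task name will carry a suffix):

import datetime
import subprocess

stamp = datetime.datetime.utcnow().strftime("%Y%m%d-%H%M%S")

# Run pg_dump inside the running postgres container and capture the dump
# on the host; cron/rsync can then ship the file off the swarm node.
with open(f"backup-{stamp}.sql", "wb") as out:
    subprocess.run(
        ["docker", "exec", "mystack_db.1", "pg_dump", "-U", "postgres", "app"],
        stdout=out,
        check=True,
    )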

[Remark] gunicorn_conf can lead to huge number of workers

I used your config file for gunicorn on a project, so I'm not sure this issue applies here as I'm using asyncpg, but I found it interesting enough to post.

web_concurrency can reach levels that are too high; in my case, using the defaults, it was trying to set up 96 workers and I couldn't start my app because of an asyncpg.exceptions.TooManyConnectionsError.

Indeed, the default max_size in asyncpg.pool.create_pool is 10, so should you get more than 10 workers, the DB will complain this way:

[2019-02-26 12:52:14 +0000] [135] [ERROR] Exception in 'lifespan' protocol
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/uvicorn/lifespan/on.py", line 40, in main
    await app_instance(self.receive, self.send)
  File "/usr/local/lib/python3.7/site-packages/starlette/routing.py", line 478, in asgi
    await self.startup()
  File "/usr/local/lib/python3.7/site-packages/starlette/routing.py", line 461, in startup
    await handler()
  File "/usr/src/app/app/app.py", line 27, in startup
    await database.connect()
  File "/usr/local/lib/python3.7/site-packages/databases/core.py", line 59, in connect
    await self._backend.connect()
  File "/usr/local/lib/python3.7/site-packages/databases/backends/postgres.py", line 55, in connect
    self._pool = await asyncpg.create_pool(str(self._database_url), **kwargs)
  File "/usr/local/lib/python3.7/site-packages/asyncpg/pool.py", line 400, in _async__init__
    await self._initialize()
  File "/usr/local/lib/python3.7/site-packages/asyncpg/pool.py", line 417, in _initialize
    await first_ch.connect()
  File "/usr/local/lib/python3.7/site-packages/asyncpg/pool.py", line 125, in connect
    self._con = await self._pool._get_new_connection()
  File "/usr/local/lib/python3.7/site-packages/asyncpg/pool.py", line 463, in _get_new_connection
    **self._connect_kwargs)
  File "/usr/local/lib/python3.7/site-packages/asyncpg/connection.py", line 1688, in connect
    max_cacheable_statement_size=max_cacheable_statement_size)
  File "/usr/local/lib/python3.7/site-packages/asyncpg/connect_utils.py", line 543, in _connect
    connection_class=connection_class)
  File "/usr/local/lib/python3.7/site-packages/asyncpg/connect_utils.py", line 519, in _connect_addr
    await asyncio.wait_for(connected, loop=loop, timeout=timeout)
  File "/usr/local/lib/python3.7/asyncio/tasks.py", line 416, in wait_for
    return fut.result()
asyncpg.exceptions.TooManyConnectionsError: sorry, too many clients already

I added the following, so that the max number of workers matches the max_size of asyncpg's create_pool():

diff --git a/services/backend/gunicorn_conf.py b/services/backend/gunicorn_conf.py
index c031db5..e181880 100644
--- a/services/backend/gunicorn_conf.py
+++ b/services/backend/gunicorn_conf.py
@@ -2,6 +2,7 @@ import json
 import multiprocessing
 import os
 
+max_workers = int(os.getenv("MAX_WORKERS", "10"))
 workers_per_core_str = os.getenv("WORKERS_PER_CORE", "2")
 web_concurrency_str = os.getenv("WEB_CONCURRENCY", None)
 host = os.getenv("HOST", "0.0.0.0")
@@ -20,7 +21,7 @@ if web_concurrency_str:
     web_concurrency = int(web_concurrency_str)
     assert web_concurrency > 0
 else:
-    web_concurrency = int(default_web_concurrency)
+    web_concurrency = min(max_workers, int(default_web_concurrency))
 
 # Gunicorn config variables
 loglevel = use_loglevel
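
For clarity, the capped computation spelled out as standalone code (a sketch reusing the names from gunicorn_conf.py; note the int() cast on the env var so min() compares two numbers):

import multiprocessing
import os

max_workers = int(os.getenv("MAX_WORKERS", "10"))
workers_per_core = float(os.getenv("WORKERS_PER_CORE", "2"))
cores = multiprocessing.cpu_count()
default_web_concurrency = workers_per_core * cores

# Cap the worker count so it never exceeds asyncpg's default pool max_size of 10.
web_concurrency = min(max_workers, int(default_web_concurrency))
assert web_concurrency > 0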

Traefik configuration issue, some URLs being cancelled and then blocked, status code 307.

I have a stack deployed on AWS with Docker Swarm and Traefik. Everything works fine locally.

It appears that any URL with a query parameter is being cancelled, downgraded to http, and then blocked.

Here is a screenshot from Chrome dev tools showing the "/latest" endpoint returning successfully and the "/jobs?" request with query parameters being cancelled:

[screenshot: Chrome dev tools network tab]

If I navigate directly to the URL in question, I get a successful response, but the same endpoint, called via axios in my Vue app, gets cancelled.

The proxy logs are showing status code 307 for these requests. Here is the full log line:

"GET /api/jobs?descending=true&pagination=false HTTP/1.1" 307 0 "https://app.example.com/fb" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/76.0.3809.132 Safari/537.36" 264 "PathPrefix-api-docs-redoc-1" "http://10.0.2.27:80" 1ms

For some reason, these requests are getting redirected to http and then failing.

Could this be a problem with the Traefik proxy configuration? Or axios?

UPDATE:
After digging into the redirect issue more, it seems to be an issue with trailing slashes. My requests are getting redirected to include a trailing slash. So

https://app.example.com/api/companies?type=customer

becomes

http://app.example.com/api/companies/?type=customer

(note the / before the ? and the https --> http). Since it's now requesting an insecure resource, it gets blocked by Chrome.

Why are all requests, even ones with query parameters (?), being redirected to a trailing "/"?

Thanks in advance!
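
One thing worth checking: Starlette answers a trailing-slash mismatch with a 307 redirect by default, and behind a proxy that redirect can come back downgraded to plain http. A sketch of declaring the route to match the exact path the frontend calls, so no redirect is ever issued (the router and handler here are made up):

from fastapi import APIRouter

router = APIRouter()

# Declared as "/jobs" (no trailing slash) to match exactly what axios requests,
# so Starlette never issues its trailing-slash 307 redirect.
@router.get("/jobs")
def read_jobs(descending: bool = False, pagination: bool = True):
    return []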

What about casting CRUD returned value to pydantic models ?

I have implemented

  • a pydantic City model (and some extra models for updates and creation, as suggested in the tutorial).
  • a SQLAlchemy DBCity model

I would like the crud.get method to return the City model and not the DBCity.

More important, I would like the casting from DBCity to City to be done automatically, with something like

def get(db_session, city_id: int) -> Optional[City]:
    return db_session.query(DBCity).get(city_id)

The code above does not perform any casting, and behaves as follows:

def get(db_session, city_id: int) -> Optional[DBCity]:
    return db_session.query(DBCity).get(city_id)

What do you think of this behavior?

Wouldn't it be nice to have auto-casting like the API endpoints have?
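
One explicit way to get that casting today is Pydantic's from_orm (a sketch; it assumes the City model sets orm_mode = True in its Config so it can read attributes off the SQLAlchemy object):

from typing import Optional

def get(db_session, city_id: int) -> Optional[City]:
    db_city = db_session.query(DBCity).get(city_id)
    # from_orm performs the DBCity -> City conversion explicitly.
    return City.from_orm(db_city) if db_city is not None else None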

Use a different path for Vue frontend

Hi,

I wanted to use another path for the Vue frontend, but updating the traefik rule to this:

frontend:
    labels:
      - traefik.frontend.rule=PathPrefix:/admin

gives a blank page and this error in the browser:

[screenshot of the browser console error]

Tests Failing

This might be related to #72, but when I run the tests (by running ./tests.sh) on the project, I'm getting validation errors.

============================= test session starts ==============================
platform linux -- Python 3.7.5, pytest-5.2.1, py-1.8.0, pluggy-0.13.0
rootdir: /app/app/tests
collected 23 items

app/app/tests/api/api_v1/test_celery.py .                                [  4%]
app/app/tests/api/api_v1/test_items.py FF                                [ 13%]
app/app/tests/api/api_v1/test_login.py .F                                [ 21%]
app/app/tests/api/api_v1/test_users.py FFF..F                            [ 47%]
app/app/tests/crud/test_item.py ....                                     [ 65%]
app/app/tests/crud/test_user.py ........                                 [100%]

=================================== FAILURES ===================================
_______________________________ test_create_item _______________________________

superuser_token_headers = {'Authorization': 'Bearer eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJ1c2VyX2lkIjoxLCJleHAiOjE1NzIyMjA1ODQsInN1YiI6ImFjY2VzcyJ9.3L8ww9V0Muleb9PBPdBHCOXZ62HbFauqD9grYeIlAco'}

    def test_create_item(superuser_token_headers):
        server_api = get_server_api()
        data = {"title": "Foo", "description": "Fighters"}
        response = requests.post(
            f"{server_api}{config.API_V1_STR}/items/",
            headers=superuser_token_headers,
            json=data,
        )
>       content = response.json()

app/app/tests/api/api_v1/test_items.py:16: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
usr/local/lib/python3.7/site-packages/requests/models.py:897: in json
    return complexjson.loads(self.text, **kwargs)
usr/local/lib/python3.7/json/__init__.py:348: in loads
    return _default_decoder.decode(s)
usr/local/lib/python3.7/json/decoder.py:337: in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <json.decoder.JSONDecoder object at 0x7f075e2bc910>
s = 'Internal Server Error', idx = 0

    def raw_decode(self, s, idx=0):
        """Decode a JSON document from ``s`` (a ``str`` beginning with
        a JSON document) and return a 2-tuple of the Python
        representation and the index in ``s`` where the document ended.
    
        This can be used to decode a JSON document from a string that may
        have extraneous data at the end.
    
        """
        try:
            obj, end = self.scan_once(s, idx)
        except StopIteration as err:
>           raise JSONDecodeError("Expecting value", s, err.value) from None
E           json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

usr/local/lib/python3.7/json/decoder.py:355: JSONDecodeError
________________________________ test_read_item ________________________________

superuser_token_headers = {'Authorization': 'Bearer eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJ1c2VyX2lkIjoxLCJleHAiOjE1NzIyMjA1ODQsInN1YiI6ImFjY2VzcyJ9.3L8ww9V0Muleb9PBPdBHCOXZ62HbFauqD9grYeIlAco'}

    def test_read_item(superuser_token_headers):
        item = create_random_item()
        server_api = get_server_api()
        response = requests.get(
            f"{server_api}{config.API_V1_STR}/items/{item.id}",
            headers=superuser_token_headers,
        )
>       content = response.json()

app/app/tests/api/api_v1/test_items.py:30: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
usr/local/lib/python3.7/site-packages/requests/models.py:897: in json
    return complexjson.loads(self.text, **kwargs)
usr/local/lib/python3.7/json/__init__.py:348: in loads
    return _default_decoder.decode(s)
usr/local/lib/python3.7/json/decoder.py:337: in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <json.decoder.JSONDecoder object at 0x7f075e2bc910>
s = 'Internal Server Error', idx = 0

    def raw_decode(self, s, idx=0):
        """Decode a JSON document from ``s`` (a ``str`` beginning with
        a JSON document) and return a 2-tuple of the Python
        representation and the index in ``s`` where the document ended.
    
        This can be used to decode a JSON document from a string that may
        have extraneous data at the end.
    
        """
        try:
            obj, end = self.scan_once(s, idx)
        except StopIteration as err:
>           raise JSONDecodeError("Expecting value", s, err.value) from None
E           json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

usr/local/lib/python3.7/json/decoder.py:355: JSONDecodeError
____________________________ test_use_access_token _____________________________

superuser_token_headers = {'Authorization': 'Bearer eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJ1c2VyX2lkIjoxLCJleHAiOjE1NzIyMjA1ODQsInN1YiI6ImFjY2VzcyJ9.3L8ww9V0Muleb9PBPdBHCOXZ62HbFauqD9grYeIlAco'}

    def test_use_access_token(superuser_token_headers):
        server_api = get_server_api()
        r = requests.post(
            f"{server_api}{config.API_V1_STR}/login/test-token",
            headers=superuser_token_headers,
        )
>       result = r.json()

app/app/tests/api/api_v1/test_login.py:28: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
usr/local/lib/python3.7/site-packages/requests/models.py:897: in json
    return complexjson.loads(self.text, **kwargs)
usr/local/lib/python3.7/json/__init__.py:348: in loads
    return _default_decoder.decode(s)
usr/local/lib/python3.7/json/decoder.py:337: in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <json.decoder.JSONDecoder object at 0x7f075e2bc910>
s = 'Internal Server Error', idx = 0

    def raw_decode(self, s, idx=0):
        """Decode a JSON document from ``s`` (a ``str`` beginning with
        a JSON document) and return a 2-tuple of the Python
        representation and the index in ``s`` where the document ended.
    
        This can be used to decode a JSON document from a string that may
        have extraneous data at the end.
    
        """
        try:
            obj, end = self.scan_once(s, idx)
        except StopIteration as err:
>           raise JSONDecodeError("Expecting value", s, err.value) from None
E           json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

usr/local/lib/python3.7/json/decoder.py:355: JSONDecodeError
_________________________ test_get_users_superuser_me __________________________

superuser_token_headers = {'Authorization': 'Bearer eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJ1c2VyX2lkIjoxLCJleHAiOjE1NzIyMjA1ODUsInN1YiI6ImFjY2VzcyJ9.qC_VMci_zK8tnvn_F9XKn9krkuQNijV4mnZuOHJDJ9A'}

    def test_get_users_superuser_me(superuser_token_headers):
        server_api = get_server_api()
        r = requests.get(
            f"{server_api}{config.API_V1_STR}/users/me", headers=superuser_token_headers
        )
>       current_user = r.json()

app/app/tests/api/api_v1/test_users.py:16: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
usr/local/lib/python3.7/site-packages/requests/models.py:897: in json
    return complexjson.loads(self.text, **kwargs)
usr/local/lib/python3.7/json/__init__.py:348: in loads
    return _default_decoder.decode(s)
usr/local/lib/python3.7/json/decoder.py:337: in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <json.decoder.JSONDecoder object at 0x7f075e2bc910>
s = 'Internal Server Error', idx = 0

    def raw_decode(self, s, idx=0):
        """Decode a JSON document from ``s`` (a ``str`` beginning with
        a JSON document) and return a 2-tuple of the Python
        representation and the index in ``s`` where the document ended.
    
        This can be used to decode a JSON document from a string that may
        have extraneous data at the end.
    
        """
        try:
            obj, end = self.scan_once(s, idx)
        except StopIteration as err:
>           raise JSONDecodeError("Expecting value", s, err.value) from None
E           json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

usr/local/lib/python3.7/json/decoder.py:355: JSONDecodeError
__________________________ test_create_user_new_email __________________________

superuser_token_headers = {'Authorization': 'Bearer eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJ1c2VyX2lkIjoxLCJleHAiOjE1NzIyMjA1ODUsInN1YiI6ImFjY2VzcyJ9.qC_VMci_zK8tnvn_F9XKn9krkuQNijV4mnZuOHJDJ9A'}

    def test_create_user_new_email(superuser_token_headers):
        server_api = get_server_api()
        username = random_lower_string()
        password = random_lower_string()
        data = {"email": username, "password": password}
        r = requests.post(
            f"{server_api}{config.API_V1_STR}/users/",
            headers=superuser_token_headers,
            json=data,
        )
>       assert 200 <= r.status_code < 300
E       assert 500 < 300
E        +  where 500 = <Response [500]>.status_code

app/app/tests/api/api_v1/test_users.py:33: AssertionError
____________________________ test_get_existing_user ____________________________

superuser_token_headers = {'Authorization': 'Bearer eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJ1c2VyX2lkIjoxLCJleHAiOjE1NzIyMjA1ODUsInN1YiI6ImFjY2VzcyJ9.qC_VMci_zK8tnvn_F9XKn9krkuQNijV4mnZuOHJDJ9A'}

    def test_get_existing_user(superuser_token_headers):
        server_api = get_server_api()
        username = random_lower_string()
        password = random_lower_string()
        user_in = UserCreate(email=username, password=password)
        user = crud.user.create(db_session, user_in=user_in)
        user_id = user.id
        r = requests.get(
            f"{server_api}{config.API_V1_STR}/users/{user_id}",
            headers=superuser_token_headers,
        )
>       assert 200 <= r.status_code < 300
E       assert 500 < 300
E        +  where 500 = <Response [500]>.status_code

app/app/tests/api/api_v1/test_users.py:50: AssertionError
_____________________________ test_retrieve_users ______________________________

superuser_token_headers = {'Authorization': 'Bearer eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJ1c2VyX2lkIjoxLCJleHAiOjE1NzIyMjA1ODUsInN1YiI6ImFjY2VzcyJ9.qC_VMci_zK8tnvn_F9XKn9krkuQNijV4mnZuOHJDJ9A'}

    def test_retrieve_users(superuser_token_headers):
        server_api = get_server_api()
        username = random_lower_string()
        password = random_lower_string()
        user_in = UserCreate(email=username, password=password)
        user = crud.user.create(db_session, user_in=user_in)
    
        username2 = random_lower_string()
        password2 = random_lower_string()
        user_in2 = UserCreate(email=username2, password=password2)
        user2 = crud.user.create(db_session, user_in=user_in2)
    
        r = requests.get(
            f"{server_api}{config.API_V1_STR}/users/", headers=superuser_token_headers
        )
>       all_users = r.json()

app/app/tests/api/api_v1/test_users.py:103: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
usr/local/lib/python3.7/site-packages/requests/models.py:897: in json
    return complexjson.loads(self.text, **kwargs)
usr/local/lib/python3.7/json/__init__.py:348: in loads
    return _default_decoder.decode(s)
usr/local/lib/python3.7/json/decoder.py:337: in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <json.decoder.JSONDecoder object at 0x7f075e2bc910>
s = 'Internal Server Error', idx = 0

    def raw_decode(self, s, idx=0):
        """Decode a JSON document from ``s`` (a ``str`` beginning with
        a JSON document) and return a 2-tuple of the Python
        representation and the index in ``s`` where the document ended.
    
        This can be used to decode a JSON document from a string that may
        have extraneous data at the end.
    
        """
        try:
            obj, end = self.scan_once(s, idx)
        except StopIteration as err:
>           raise JSONDecodeError("Expecting value", s, err.value) from None
E           json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

usr/local/lib/python3.7/json/decoder.py:355: JSONDecodeError

Validation error for User

Hi

I used the cookiecutter to create this project locally on macOS 10.14.6 and ran it without making any changes.

I ran into the following issue with the API (and consequently the frontend):

backend_1        | pydantic.error_wrappers.ValidationError: 1 validation error for User
backend_1        | response -> 0
backend_1        |   value is not a valid dict (type=type_error.dict)

This example was with /api/v1/users/me but it occurred with all APIs returning the User model.

I was able to successfully return the user in raised exceptions all the way through the backend processing, so it seems that the validation error is being raised from db_session_middleware.

Adding orm_mode to the UserBase model's Config seemed to fix the issue, i.e.:

class UserBase(BaseModel):
    email: Optional[str] = None
    is_active: Optional[bool] = True
    is_superuser: Optional[bool] = False
    full_name: Optional[str] = None

    class Config:
        orm_mode = True

I'm not sure whether this is a bug or something to do with my configuration or OS, but this was a completely vanilla build, so perhaps I'm not the only person to encounter this issue.
