Async configuration for FastAPI and SQLModel
This is a project template that uses FastAPI, Alembic, and async SQLModel as the ORM, already compatible with Pydantic V2 and SQLAlchemy V2.0. It shows a complete async CRUD template with authentication. Our implementation uses the newest version of FastAPI and incorporates type hints that are fully compatible with Python >=3.10. If you're looking to build modern and efficient web applications with Python, this template provides the tools you need to get started quickly. You can read a short article about the motivation for starting this project in Our Journey Using Async FastAPI.
If you are looking to create a new project from scratch, I recommend you use create-fastapi-project.
Do you need assistance, training, or support for your next project using FastAPI? Please don't hesitate to get in touch with our team at info@allient.io or schedule a meeting with us here.
Why Use This Template?
Developing web applications can be a challenging process, especially when dealing with databases, authentication, asynchronous tasks, and other complex components. Our template is designed to simplify this process and offer you a solid starting point. Some of the highlights of this template include:
- FastAPI Integration: FastAPI is a modern and efficient web framework that allows you to quickly and easily create APIs. This template uses the latest features of FastAPI and offers type hints that are compatible with Python 3.10 and later versions.
- Asynchronous Database Management: We use SQLModel, an ORM library with first-class async support, to interact with the database efficiently and securely (see the short sketch after this list).
- Asynchronous Tasks with Celery: This template includes examples of how to execute asynchronous and scheduled tasks using Celery, which is ideal for operations that require significant time or resources.
- Authentication and Authorization: We implement JWT-based authentication and role-based access control to ensure that your APIs are secure and protected.
- Documentation and Automated Testing: The template is configured to automatically generate interactive documentation for your APIs. It also includes automated tests using pytest to ensure code quality.
- Development Best Practices: We apply code formatting, type checking, and static analysis tools to ensure that the code is readable, robust, and reliable.
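As a taste of the asynchronous database layer, here is a minimal sketch of async SQLModel usage. The model, connection URL, and names are illustrative only; the template's actual models and session setup live under backend/app.

```python
import asyncio

from sqlalchemy.ext.asyncio import create_async_engine
from sqlmodel import Field, SQLModel, select
from sqlmodel.ext.asyncio.session import AsyncSession


class Hero(SQLModel, table=True):
    # Illustrative model, not one of the template's tables
    id: int | None = Field(default=None, primary_key=True)
    name: str


# Hypothetical connection URL; the template reads its own from the .env file
engine = create_async_engine(
    "postgresql+asyncpg://postgres:postgres@localhost:5454/app"
)


async def main() -> None:
    async with engine.begin() as conn:
        await conn.run_sync(SQLModel.metadata.create_all)
    async with AsyncSession(engine) as session:
        session.add(Hero(name="Ada"))
        await session.commit()
        heroes = (await session.exec(select(Hero))).all()
        print(heroes)


asyncio.run(main())
```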
Table of Contents
- Prerequisites
- Run the project using Docker containers and forcing build containers
- Run project using Docker containers
- Setup database with initial data
- ERD Database model
- Containers architecture
- Preview
- Static files
- Minio server
- Celery
- Run Alembic migrations (Only if you change the DB model)
- Production Deployment
- Database unique IDs
- Code Style
- SonarQube static analysis
- Testing
- Type checker
- Basic chatbot example with Langchain and OpenAI
- Inspiration and References
- TODO List
- License
Prerequisites
Set environment variables
Create an .env file in the root folder and copy the content from .env.example. Feel free to change it according to your own configuration.
Docker engine
This project uses Docker and Docker Compose, so please ensure that you have installed the latest version compatible with your operating system. If you haven't installed Docker yet, you can find detailed instructions on how to do so here. Docker Desktop is a good option for a development machine.
You can check whether it is installed with this command:
docker --version
Make
"Make" is a build automation tool that is primarily used to manage the compilation and building of software projects. It reads a file called a "Makefile" which specifies a set of rules and dependencies for building a project, and then executes the necessary commands to build the project according to those rules. Depending of your OS you will requiere to install it in different ways.
Mac
xcode-select --install
Ubuntu
sudo apt-get install build-essential
sudo apt-get -y install make
You can check whether it is installed with this command:
make --version
Python ">3.9,<3.12"
If you haven't already installed Python, you can download and install it from here.
You can check your Python version:
python --version
Poetry
Python Poetry is a tool for dependency management and packaging in Python. It provides a modern and efficient approach to managing a Python project's dependencies, virtual environments, and packaging. You can find detailed instructions on how to install it here. Poetry manages packages through the pyproject.toml file; in this project you can find it in the backend/app folder.
You can check whether it is installed with this command:
poetry --version
Dev tip to activate virtual environment
When you open Python files, doing this can help VS Code detect the installed packages.
cd backend/app/
poetry shell
After that, the interpreter path will be shown. Copy that path and set it as the default for the project in VS Code: press Enter interpreter path... and paste the path.
<p align="center"> <img src="static/python_int.png" align="center"/> </p>

Run the project using Docker containers and forcing build containers
Using docker compose command
docker compose -f docker-compose-dev.yml up --build
Using Makefile command
make run-dev-build
Run project using Docker containers
Using docker compose command
docker compose -f docker-compose-dev.yml up
Using Makefile command
make run-dev
Setup database with initial data
This creates sample users in the database.
Using docker compose command
docker compose -f docker-compose-dev.yml exec fastapi_server python app/initial_data.py
Using Makefile command
make init-db
Any of the above commands creates three users with the following credentials:
- Admin credentials -> username: admin@admin.com and password: admin
- Manager credentials -> username: manager@example.com and password: admin
- User credentials -> username: user@example.com and password: admin
You can connect to the database using pgAdmin4 with the credentials from the .env file. The database port on the local machine has been configured to 5454 in the docker-compose-dev.yml file.
(Optional) If you prefer, you can run pgAdmin4 in a Docker container using the following commands; they should be executed in different terminals:
Start pgAdmin:
make run-pgadmin
This starts pgAdmin at http://localhost:15432. When connecting to the database server, enter the password; by default it is postgres if you didn't change it in the .env file.
<p align="center"> <img src="static/tables.png" align="center"/> </p>

ERD Database model
<p align="center"> <img src="static/erd.png" align="center"/> </p>

Containers architecture
<p align="center"> <img src="static/container_architecture.png" align="center"/> </p>

As this project uses Caddy as a reverse proxy with namespace routing, you can access the documentation at the following path: http://fastapi.localhost/docs
Preview
<p align="center"> <img src="static/1.png" align="center"/> </p> <p align="center"> <img src="static/2.png" align="center"/> </p>

Static files
All files in the static folder will be served by the Caddy container as static files. You can check it with this link: http://static.localhost
Minio server
This template allows users to upload their photos. The images are stored using MinIO, an open source Object Storage Service (OSS), which stores images in buckets and serves them securely through presigned URLs.
- Minio credentials -> username: minioadmin and password: minioadmin
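For illustration, here is a minimal sketch of generating a presigned URL with the minio Python client. The endpoint, credentials, and bucket/object names below are local-dev assumptions, not the template's exact storage code.

```python
from datetime import timedelta

from minio import Minio

# Hypothetical local-dev settings; the template reads these from the .env file
client = Minio(
    "localhost:9000",
    access_key="minioadmin",
    secret_key="minioadmin",
    secure=False,
)

bucket = "user-photos"  # illustrative bucket name
if not client.bucket_exists(bucket):
    client.make_bucket(bucket)

# Upload a local file (assumed to exist) into the bucket
client.fput_object(bucket, "avatar.png", "avatar.png")

# Presigned URL granting temporary read access to the object
url = client.presigned_get_object(bucket, "avatar.png", expires=timedelta(hours=1))
print(url)
```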
Celery
Celery is a distributed task queue that allows developers to run asynchronous tasks in their applications. It is particularly useful for tasks that are time-consuming, require heavy computation or access external services, and can be run independently of the main application. It also offers features such as task scheduling, task prioritization, and retries in case of failure.
Celery Beat is an additional component of Celery that allows developers to schedule periodic tasks and intervals for their Celery workers. It provides an easy-to-use interface for defining task schedules and supports several scheduling options such as crontab, interval, and relative.
You can see the architecture used in this project, which uses Redis as the Celery broker and the current Postgres database as the Celery backend. It also uses celery-sqlalchemy-scheduler to store Celery Beat tasks in the database so they can be mutated.
Within the natural_language endpoints, you can access a sample application that demonstrates not only synchronous prediction of machine learning models but also batch prediction. Additionally, there are examples of how to schedule periodic tasks using Celery Beat in the periodic_tasks endpoints.
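As a rough sketch of how a Celery task and a Celery Beat schedule fit together (broker/backend URLs and names are illustrative; the template wires these through its own configuration and scheduler):

```python
from celery import Celery

# Hypothetical broker/backend URLs; the template configures Redis and
# Postgres through environment variables
celery_app = Celery(
    "app",
    broker="redis://localhost:6379/0",
    backend="db+postgresql://postgres:postgres@localhost:5454/app",
)


@celery_app.task
def predict(text: str) -> str:
    # Stand-in for a long-running job such as a model prediction
    return text.upper()


# A Celery Beat entry that runs the task every 30 seconds
celery_app.conf.beat_schedule = {
    "periodic-predict": {
        "task": predict.name,  # registered task name
        "schedule": 30.0,
        "args": ("hello",),
    },
}
```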
<p align="center"> <img src="static/celery_diagram.png" align="center"/> </p>

Run Alembic migrations (Only if you change the DB model)
Using docker compose command
docker compose -f docker-compose-dev.yml exec fastapi_server alembic revision --autogenerate
docker compose -f docker-compose-dev.yml exec fastapi_server alembic upgrade head
Using Makefile command
make add-dev-migration
Production Deployment
Remember to use a persistent PostgreSQL database, update the new credentials in the .env file, and use this command to run the project in a production environment. To test this configuration on localhost, uncomment the database container and the depends_on entry of the fastapi container; otherwise it will not work in a local environment.
Using docker compose command
docker compose up --build
Database unique IDs
Generating and using unique IDs is an important decision when starting a new project; their most common use is as primary keys for database tables. This project uses a custom UUID7 (Draft04) implementation to make it simple to use and to take advantage of the native UUID type in PostgreSQL. UUID7 combines a timestamp with random data so that generated IDs are time-ordered. If you are looking for other alternatives for table IDs, such as Snowflake, ULID, KSUID, pushID, or xid, among others, you can check these references.
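For intuition, here is a minimal sketch of a UUID7-style generator (not the project's exact implementation): the leading 48 bits carry a millisecond Unix timestamp, so freshly generated IDs sort roughly by creation time.

```python
import os
import time
import uuid


def uuid7_sketch() -> uuid.UUID:
    # 48-bit Unix timestamp in milliseconds occupies the high bits
    ts_ms = int(time.time() * 1000) & ((1 << 48) - 1)
    # 80 random bits fill the rest (version/variant bits overwritten below)
    value = (ts_ms << 80) | int.from_bytes(os.urandom(10), "big")
    value = (value & ~(0xF << 76)) | (0x7 << 76)  # set version 7
    value = (value & ~(0x3 << 62)) | (0x2 << 62)  # set RFC 4122 variant
    return uuid.UUID(int=value)


print(uuid7_sketch())  # IDs generated later compare greater
```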
Code Style
To ensure a standardized code style, this project uses black and ruff. If you want to change the config rules, you can edit both the ruff and black rules in the pyproject.toml file.
To reformat files, execute the following command:
make formatter
To run lint, you can run the following command:
make lint
To run lint in watch mode, you can run the following command:
make lint-watch
To run lint and try to fix the errors, you can run the following command:
make lint-fix
SonarQube static analysis
SonarQube is an automatic code review tool that detects bugs, vulnerabilities, and code smells in a project. You can read this post to get a better understanding of what SonarQube can do.
The following steps can help you run a local static code analysis:
- Start SonarQube container
make run-sonarqube
The above command starts SonarQube at localhost:9000. You can log in using these credentials -> username: admin and password: admin; after that, it will require you to change your password.
- Add new project
- Copy the projectKey and the login token and replace them in the backend/sonar-project.properties file.
backend/sonar-project.properties file
# Organization and project keys are displayed in the right sidebar of the project homepage
sonar.organization=my_organization
sonar.projectKey=fastapi-alembic-sqlmodel-async
sonar.host.url=http://host.docker.internal:9000
sonar.login=157cc42f5b2702f470af3466610eebf38551fdd7
# --- optional properties ---
# defaults to project key
sonar.projectName=fastapi-alembic-sqlmodel-async
# defaults to 'not provided'
sonar.projectVersion=1.0
# Path is relative to the sonar-project.properties file. Defaults to .
sonar.sources=app
# Encoding of the source code. Default is default system encoding
sonar.sourceEncoding=UTF-8
- Run the following command to execute a new code scan
make run-sonar-scanner
<p align="center">
<img src="static/sonarqube6.png" align="center"/>
</p>
When the build is successful, you can see the SonarQube screen automatically refreshed with the analysis. If you want to export a report, you can check this post.
Testing
Testing in FastAPI with pytest involves creating test functions that simulate HTTP requests to the API endpoints and verifying the responses. This approach allows us to conduct both unit tests for individual functions and integration tests for the entire application.
To perform tests in this project, we utilize two essential libraries: pytest and pytest-asyncio.
However, when testing FastAPI endpoints that use async connections with the database and a pool strategy, there is a trick to be aware of. The recommended approach is to create an isolated testing environment that connects to the database using the "poolclass": NullPool parameter on the engine. This helps avoid potential issues related to tasks being attached to different loops. For more details, you can refer to the following references: FastAPI testing RuntimeError: Task attached to a different loop and Connection Pooling.
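A minimal sketch of this idea in a test (the app import, URL, and test name are assumptions, not the template's actual test setup):

```python
import pytest
from httpx import ASGITransport, AsyncClient
from sqlalchemy.ext.asyncio import create_async_engine
from sqlalchemy.pool import NullPool

from app.main import app  # assumed application entry point

# NullPool opens a fresh connection per checkout instead of reusing pooled
# ones, avoiding "Task attached to a different loop" errors when each test
# runs in its own event loop; the app's session dependency would be
# overridden to use this engine.
engine = create_async_engine(
    "postgresql+asyncpg://postgres:postgres@localhost:5454/test",
    poolclass=NullPool,
)


@pytest.mark.asyncio
async def test_root_returns_ok() -> None:
    transport = ASGITransport(app=app)
    async with AsyncClient(transport=transport, base_url="http://test") as client:
        response = await client.get("/")
        assert response.status_code == 200
```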
To execute the tests, follow these steps:
- Start the testing environment using the command:
make run-test
- Once the testing environment is up and running, open another terminal and run the tests with the following command:
make pytest
Type checker
Python's type hints, introduced in PEP 484 and fully embraced in later versions of Python, allow you to specify the expected types of variables, function parameters, and return values. The FastAPI documentation actively promotes type hints, so this code base tries to use them as much as possible: type hints make the code more self-documenting by providing clear information about what types of values a function or variable can hold, and they catch type-related errors statically, before the code is executed.
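A small illustration of this style (hypothetical endpoint; FastAPI 0.89+ can infer the response model from the return annotation):

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Health(BaseModel):
    status: str


@app.get("/health")
async def health() -> Health:
    # The static type checker verifies the return type;
    # FastAPI also uses the annotation as the response model
    return Health(status="ok")
```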
This project uses mypy, a popular static type checker for Python. If you want to change the config rules, you can edit them in the pyproject.toml file.
To execute type checking, run this command:
make mypy
Basic chatbot example with Langchain and OpenAI
In addition to its core features, this project template demonstrates how to integrate a basic chatbot powered by Langchain and OpenAI over websockets. You can use PieSocket Websocket Tester to test websockets.
To begin experimenting with the basic chatbot, follow these steps:
- Obtain an OpenAI API Key: You'll need to set the OPENAI_API_KEY environment variable, which you can obtain from OpenAI's platform.
- Test Websocket Connection: You can test the websocket connection using the following URL: ws://fastapi.localhost/chat/<USER_ID>. Replace <USER_ID> with a user identifier of your choice; it should be the ID of your user.
- Sending and Receiving Messages: You can send messages to the chatbot over the websocket connection. To do this, use the following message structure:
{"message":"Hello world"}
Once you send a message, the chatbot will respond with generated responses based on the content of your input.
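For example, here is a minimal client using the third-party websockets package (the URL and payload follow the steps above; the user ID is a placeholder):

```python
import asyncio
import json

import websockets


async def chat(user_id: str) -> None:
    uri = f"ws://fastapi.localhost/chat/{user_id}"
    async with websockets.connect(uri) as ws:
        await ws.send(json.dumps({"message": "Hello world"}))
        reply = await ws.recv()  # chatbot's generated response
        print(reply)


asyncio.run(chat("<USER_ID>"))  # replace with the ID of your user
```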
Inspiration and References
- full-stack-fastapi-postgresql.
- fastapi-sqlmodel-alembic.
- sqlmodel-tutorial.
- asyncer-tutorial.
- fastapi-pagination.
- fastapi-cache.
- fastapi-keycloak.
- fastapi-async-sqlalchemy.
- fastapi-minio.
- fastapi-best-practices.
- pgadmin Makefile.
- Styling and makefiles.
- awesome-fastapi.
- Serving ML Models in Production with FastAPI and Celery.
- Database setup.
- Dispatch.
TODO List:
- Add Custom Response model
- Create sample one to many relationship
- Create sample many to many relationship
- Add JWT authentication
- Add Pagination
- Add User birthday field with timezone
- Add static server
- Add basic RBAC (Role base access control)
- Add sample heroes, teams and groups on init db
- Add cache configuration using fastapi-cache2 and redis
- Create a global database pool of sessions to avoid passing the session as a dependency injection on each handler
- Refactor tablename to Pascal case
- Add one to one relationship sample
- Add sample to upload images and store them using minio
- Invalidate access and refresh tokens when the password is changed using Redis
- Add shortcuts using a Makefile
- Add sample async, sync and concurrent functions using asyncer
- Add Black formatter and flake8 lint (Rasa as reference)
- Add static code analysis using SonarQube
- Function return type annotations to declare the response_model (fastapi > 0.89.0)
- Add export report api in csv/xlsx files using StreamingResponse
- Add Github actions automation for deploy on Elastic Beanstalk - AWS
- Database query optimization. Many-Many use "selectin" and One-One and One-Many use "joined" issue
- Add Enum sample column
- Add docstrings
- Install pg_trgm by code and add a query for smart search of users by name
- Upgrade typing (Compatible just with python > 3.10)
- Add sample transformers NLP models and use them globally
- Add Celery samples for tasks, and schedule tasks
- Migrate from traefik reverse proxy to Caddy reverse proxy for automatic SSL
- Add fastapi limiter to natural language endpoints
- Add websocket connecting with ChatGPT
- Set up testing configuration
- Add sample composition using pydantic
- Add a nextjs sample frontend
- Add testing
- Add jsonb field on table sample
- Make celery-sqlalchemy-scheduler work async
- Add AuthZ using oso
- Add SSL to reverse proxy on prod
- Add instructions on doc for production deployment using github actions and dockerhub (CI/CD)
- Add production deployment orchestation using terraform + Elastic Beanstalk - AWS
- Convert repo into template using cookiecutter
Support and Maintenance
fastapi-alembic-sqlmodel-async is supported by the Allient development team. Our team is composed of experienced professionals specializing in FastAPI projects and NLP. Please don't hesitate to get in touch with our team at info@allient.io or schedule a meeting with us here.
PR are welcome ❤️
License
- This project is licensed under the terms of the MIT license