<p align="center"> <a href="https://flama.dev"><img src="https://raw.githubusercontent.com/vortico/flama/master/.github/logo.png" alt='Flama'></a> </p> <p align="center"> <em>Fire up your models with the flame</em> &#128293; </p> <p align="center"> <a href="https://github.com/vortico/flama/actions"> <img src="https://github.com/vortico/flama/workflows/Test%20And%20Publish/badge.svg" alt="Test And Publish workflow status"> </a> <a href="https://pypi.org/project/flama/"> <img src="https://img.shields.io/pypi/v/flama?logo=PyPI&logoColor=white" alt="Package version"> </a> <a href="https://pypi.org/project/flama/"> <img src="https://img.shields.io/pypi/pyversions/flama?logo=Python&logoColor=white" alt="PyPI - Python Version"> </a> </p>

Flama

Flama is a Python library that establishes a standard framework for the development and deployment of APIs, with a special focus on machine learning (ML). The main aim of the framework is to make the deployment of ML APIs ridiculously simple, reducing the entire process (when possible) to a single line of code.

The library builds on Starlette and provides an easy-to-learn philosophy that speeds up the building of highly performant GraphQL, REST, and ML APIs. It is also an ideal solution for developing asynchronous, production-ready services, offering automatic deployment of ML models.

Installation

Flama is fully compatible with all supported versions of Python. We recommend using the latest version available.

For a detailed explanation of how to install Flama, visit: https://flama.dev/docs/getting-started/installation.
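In most setups the installation is a single pip command; a minimal sketch, assuming the package name as published on PyPI (see the version badge above):

```shell
# Install the latest Flama release from PyPI
pip install flama
```

See the installation docs linked above for optional extras and environment-specific details.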

Getting Started

Visit https://flama.dev/docs/getting-started/quickstart to get started with Flama.

Documentation

Visit https://flama.dev/docs/ to view the full documentation.

Example

```python
from flama import Flama

app = Flama(
    title="Hello-🔥",
    version="1.0",
    description="My first API",
)


@app.route("/")
def home():
    """
    tags:
        - Salute
    summary:
        Returns a warming message
    description:
        This is a more detailed description of the method itself.
        Here we can give all the details required and they will appear
        automatically in the auto-generated docs.
    responses:
        200:
            description: Warming hello message!
    """
    return {"message": "Hello 🔥"}
```

This example builds and runs a Hello 🔥 API. Assuming the code is saved as `examples/hello_flama.py`, run it with:

```sh
flama run examples.hello_flama:app
```
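Once the server is up, the endpoint can be exercised with any HTTP client; a quick check with curl, assuming the server is listening on the common default of `127.0.0.1:8000`:

```shell
# Request the root endpoint defined in the example above;
# the route returns the JSON payload {"message": "Hello 🔥"}
curl http://127.0.0.1:8000/
```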

Authors

Contributing

This project is absolutely open to contributions, so if you have a nice idea, please read our contributing docs before submitting a pull request.

Star History

<a href="https://github.com/vortico/flama"> <picture> <source media="(prefers-color-scheme: dark)" srcset="https://api.star-history.com/svg?repos=vortico/flama&type=Date&theme=dark" /> <source media="(prefers-color-scheme: light)" srcset="https://api.star-history.com/svg?repos=vortico/flama&type=Date" /> <img alt="Star History Chart" src="https://api.star-history.com/svg?repos=vortico/flama&type=Date" /> </picture> </a>