/// Backend · AI-friendly server

FastAPI services for AI-heavy products.

When the back end is half AI pipeline, half API, FastAPI is our default: async Python, Pydantic typing, and OpenAPI docs out of the box.

/// What we ship with Python / FastAPI

How we use it in production.

  • FastAPI + pydantic v2 + async/await
  • LLM integrations: OpenAI, Anthropic, Google
  • Vector stores (pgvector, Pinecone, Weaviate)
  • Background workers with Celery / RQ
  • Postgres / Redis / S3
  • OpenTelemetry, OpenAPI, Sentry
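As one concrete piece of the vector-store item above, here is a sketch of a pgvector top-k similarity query. The `documents` table, its `embedding` column, and the query builder are hypothetical; `<=>` is pgvector's cosine-distance operator, and in production the query would run through an async driver such as asyncpg.

```python
def knn_query(limit: int = 5) -> str:
    """Build a parameterized top-k cosine-similarity query for pgvector.

    Assumes a hypothetical `documents` table with a vector-typed
    `embedding` column; `%(query_vec)s` is bound by the DB driver.
    """
    return (
        "SELECT id, content, embedding <=> %(query_vec)s AS distance "
        "FROM documents "
        "ORDER BY embedding <=> %(query_vec)s "
        f"LIMIT {int(limit)}"
    )
```

Keeping the query vector as a driver-bound parameter (rather than interpolating it) avoids SQL injection and lets Postgres reuse the plan.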