Our containerized AI artifacts are OCI-compatible, so you can use them directly with Docker, Podman, and Kubernetes wherever you need them: in the cloud, in a datacenter, or in your basement. Our artifacts are regularly rebuilt, updated, and scanned for vulnerabilities to provide the smallest, fastest, and most secure runtime possible.
Documentation Index
Fetch the complete documentation index at: https://docs.ramalama.com/llms.txt
Use this file to discover all available pages before exploring further.
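The index is a plain text file, so any HTTP client can retrieve it. A minimal sketch, assuming `curl` is installed and tolerating an offline environment:

```shell
# Fetch the documentation index (llms.txt) from the URL above.
# The fallback echo keeps the command from failing when offline.
curl -fsSL https://docs.ramalama.com/llms.txt -o llms.txt \
  && echo "saved llms.txt" \
  || echo "fetch failed (check network access)"
```

Once saved, the file can be grepped or fed to a tool to discover the available pages.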
Quick start
Install dependencies
Getting started requires either Docker or Podman. We also recommend the RamaLama CLI for a streamlined experience.
- Install Podman or Docker
- (Optional) Install RamaLama CLI
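Installation commands vary by OS, but a quick sanity check after installing is to confirm a container engine is on your PATH. A small sketch (preferring Podman, falling back to Docker):

```shell
# Detect an installed container engine; prefer podman, fall back to docker.
if command -v podman >/dev/null 2>&1; then
  engine=podman
elif command -v docker >/dev/null 2>&1; then
  engine=docker
else
  engine=""
  echo "No container engine found: install podman or docker first" >&2
fi
echo "Using engine: ${engine:-none}"
```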
Run the model
Model images bundle both the runtime and model, providing a single runnable container.
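Because the runtime and model ship together, running a model is a single `docker run` (or `podman run`). The image reference and port below are illustrative placeholders, not published RamaLama specifics; the command is printed rather than executed so the sketch is side-effect free:

```shell
# Illustrative invocation: substitute a real image reference from the registry.
# --rm cleans up the container on exit; -p publishes the serving port.
# (Image name and port here are assumptions for the sketch.)
cmd="docker run --rm -p 8080:8080 example.registry/ramalama/model:latest"
echo "$cmd"
```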
Next steps
Deploy to Production
Deploy with Docker Compose or Kubernetes for production workloads
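For Docker Compose, a minimal service definition might look like the following sketch. The image reference, port, and restart policy are assumptions to adapt, not canonical values:

```yaml
# docker-compose.yaml sketch: one service running a bundled model image.
services:
  model:
    image: example.registry/ramalama/model:latest  # substitute a real image
    ports:
      - "8080:8080"   # assumed serving port
    restart: unless-stopped
```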
Security & Provenance
Review CVEs, SBOMs, and security best practices
Request Custom Images
Need bespoke images for specific hardware or compliance needs?