Architecture & high availability
A clear view of how the SaaS runs and which deployment options are available on each plan.
Service continuity
- Multi-region active/standby (France Central + North Europe).
- Multi-zone distribution with autoscaling.
- Automatic public endpoint failover (illustrated in the sketch below).
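Failover is handled entirely on Argy's side; the snippet below is only a conceptual illustration of the idea (probe the active region, fall back to the standby one). The URLs and health-check path are hypothetical and not part of any documented Argy endpoint.

```python
import requests

# Hypothetical endpoints for illustration only; real failover is automatic
# on the service side and requires no client-side logic.
ENDPOINTS = [
    "https://api.francecentral.example.argy.cloud",  # primary (active)
    "https://api.northeurope.example.argy.cloud",    # standby
]

def resolve_endpoint(timeout: float = 2.0) -> str:
    """Return the first endpoint whose health check answers with HTTP 200."""
    for url in ENDPOINTS:
        try:
            if requests.get(f"{url}/healthz", timeout=timeout).status_code == 200:
                return url
        except requests.RequestException:
            continue  # region unreachable: try the next one
    raise RuntimeError("No healthy endpoint found")
```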
Options by plan
- Managed SaaS: all plans.
- Hybrid (self-hosted agent): from Growth plan.
- On-prem LLM Gateway: Enterprise.
- Full on-prem deployment: Enterprise on request.
Sovereignty and network constraints can influence the recommended plan.
Simple, transparent pricing
Choose the plan that fits your needs. Scale as you grow. No hidden fees.
Free
Discover Argy for free. Perfect for testing the platform.
Included quotas
- 1 product. In Argy, a product groups an application/service and its environments; it is the unit where you apply modules, policies, and automations.
- 3 active modules. An Argy module is a packaged golden path (config schema + implementation + guardrails) that turns your toolchain into governed self-service.
- 10 pipelines/month. A pipeline is an automated execution (build, tests, deploy, validations); Argy orchestrates these runs on top of your existing CI/CD.
- 10,000 AI tokens/month. AI tokens measure LLM consumption; Argy governs them through the LLM Gateway (quotas, audit, filtering) to control costs and risks.
- 0 RAG documents. RAG (Retrieval-Augmented Generation) augments prompts with passages retrieved from your documents to deliver grounded, context-aware responses (see the sketch after this list).
- 0 indexed RAG tokens
- 0 MB RAG storage
- 0 RAG queries/month
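For readers unfamiliar with the pattern, here is a minimal, purely illustrative sketch of what RAG does with indexed documents. The retriever is a toy word-overlap ranker and the function names are hypothetical; none of this is an Argy API.

```python
def retrieve(query: str, documents: list[str], top_k: int = 3) -> list[str]:
    """Toy retriever: rank passages by word overlap with the query."""
    q_words = set(query.lower().split())
    ranked = sorted(documents,
                    key=lambda d: len(q_words & set(d.lower().split())),
                    reverse=True)
    return ranked[:top_k]

def build_rag_prompt(query: str, documents: list[str]) -> str:
    """Augment the user's question with retrieved passages before calling an LLM."""
    context = "\n".join(f"- {p}" for p in retrieve(query, documents))
    return (f"Answer using only the context below.\n\n"
            f"Context:\n{context}\n\nQuestion: {query}")

# Example passages standing in for indexed internal docs (hypothetical content)
docs = [
    "Deployments to production require an approval workflow.",
    "Pipelines are orchestrated on top of the existing CI/CD.",
    "The LLM Gateway enforces quotas and audit logging.",
]
print(build_rag_prompt("Who approves production deployments?", docs))
```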
Features
- Full Argy Console
- Standard module catalog
- Git integrations (GitHub, GitLab)
- Community support
- 99.9% availability SLA (SaaS)
- No Argy Code access
- No LLM Gateway access
- No SSO
Starter
For startups and small projects looking to standardize quickly.
Included quotas
- 5 products
- 20 active modules
- 100 pipelines/month
- 100,000 AI tokens/month
- 20 RAG documents
- 1,000,000 indexed RAG tokens
- 200 MB RAG storage
- 200 RAG queries/month
Features
- Everything in Free +
- LLM Gateway SaaS included
- Argy Code (limited usage)
- Module Studio (partial)
- Email support
- Cloud integrations (AWS, Azure, GCP)
- 99.9% availability SLA (SaaS)
Growth
For scale-ups and critical projects ready to scale.
Included quotas
- 25 products
- 100 active modules
- 1,000 pipelines/month
- 500,000 AI tokens/month
- 100 RAG documents
- 10,000,000 indexed RAG tokens
- 2,000 MB RAG storage
- 1,000 RAG queries/month
Features
- Everything in Starter +
- Unlimited Argy Code
- Full Module Studio
- RAG (Retrieval-Augmented Generation)
- Advanced RBAC + Audit logs
- Approval workflows
- Self-hosted agents (optional)
- Priority support
- 99.9% availability SLA (SaaS)
Enterprise
For large enterprises with compliance and sovereignty requirements.
Included quotas
- Unlimited products
- Unlimited active modules
- Unlimited pipelines
- AI tokens: negotiated
- Unlimited RAG documents
- Unlimited indexed RAG tokens
- Unlimited RAG storage
- RAG queries: negotiated
Features
- Everything in Growth +
- On-premises LLM Gateway
- Private self-hosted agents
- RAG on internal data
- SSO (OIDC/SAML) + SCIM
- ITSM integration
- 99.9% SLA guaranteed
- Dedicated 24/7 support
- Dedicated or on-premises deployment
Enterprise supports SaaS, dedicated, and on-premises deployments. Get a custom quote tailored to your organization’s needs.
Compare plans at a glance
All the details to help you choose the right plan for your team.
| Feature | Free | Starter | Growth | Enterprise |
|---|---|---|---|---|
| Products | 1 | 5 | 25 | Unlimited |
| Active modules | 3 | 20 | 100 | Unlimited |
| Pipelines/month | 10 | 100 | 1,000 | Unlimited |
| AI tokens/month | 10,000 | 100,000 | 500,000 | Negotiated |
| RAG documents | 0 | 20 | 100 | Unlimited |
| Indexed RAG tokens | 0 | 1,000,000 | 10,000,000 | Unlimited |
| RAG storage | 0 MB | 200 MB | 2,000 MB | Unlimited |
| RAG queries/month | 0 | 200 | 1,000 | Negotiated |
| LLM Gateway | — | Included (SaaS) | Included (SaaS) | Included (SaaS or on-premises) |
| Argy Code | — | Limited | Unlimited | Unlimited |
| Module Studio | — | Partial | Full | Full |
| RAG | — | Included | Included | Included |
| Advanced RBAC | — | — | Included | Included |
| SSO + SCIM | — | Add-on (SSO only) | Add-on | Included |
| Self-hosted agents | — | — | Add-on | Included |
| On-premises LLM Gateway | — | — | Add-on | Included |
| SLA | 99.9% | 99.9% | 99.9% | 99.9% (guaranteed) |
| Support | Community | Email | Priority | Dedicated 24/7 |

- LLM Gateway: centralizes and secures AI calls (providers, quotas, audit, filters); it avoids client-side API keys and provides enterprise control.
- Argy Code: a coding agent (CLI + IDE) connected to Argy context; it speeds up delivery while enforcing standards (policies, quality gates).
- Module Studio: the workspace to design and maintain modules (schema, templates, validations, docs) so your golden paths stay consistent and adoptable.
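To make the gateway's role concrete, here is a minimal sketch of routing an AI call through a gateway endpoint instead of calling a provider directly with a client-side key. The URL, header, and payload shape are assumptions for illustration; this page does not document the actual API.

```python
import os
import requests

# Hypothetical gateway endpoint; illustration only, not a documented Argy API.
GATEWAY_URL = "https://gateway.example.argy.cloud/v1/completions"

def ask_via_gateway(prompt: str) -> str:
    """Send a prompt through the gateway; provider API keys never leave the gateway."""
    resp = requests.post(
        GATEWAY_URL,
        headers={"Authorization": f"Bearer {os.environ['ARGY_GATEWAY_TOKEN']}"},
        json={"prompt": prompt, "max_tokens": 256},
        timeout=30,
    )
    resp.raise_for_status()  # quota or policy violations would surface as HTTP errors
    return resp.json()["text"]
```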
Add-ons & Extra capacity
Extend Argy with optional packs—keep pricing predictable while you scale.
+100,000 AI tokens
Increase your token quota for RAG indexing and the LLM Gateway.
Available for: Starter, Growth, Enterprise
+100 pipelines/month
Run more deployment pipelines each month.
Available for: Starter, Growth, Enterprise
Private Argy Code agent
Deploy Argy Code agents in your infrastructure.
Available for: Growth, Enterprise
Self-hosted agent (hybrid)
Enable a self-hosted agent to run sensitive actions inside your network.
Available for: Growth (included in Enterprise)
On-premises LLM Gateway
Deploy the LLM Gateway on your premises to keep AI data internal.
Available for: Growth (included in Enterprise)
SSO (OIDC/SAML)
Single sign-on via your identity provider.
Available for: Starter, Growth (included in Enterprise)
SCIM / Directory provisioning
Automatic user and group synchronization.
Available for: Growth, Enterprise
Onboarding training
Personalized training session for your teams.
Available for: All plans
Dedicated support
Priority access to a dedicated support engineer.
Available for: Starter, Growth (included in Enterprise)
Frequently asked questions
Is there a limit on the number of users?
No! All Argy plans include unlimited users. You pay for capabilities (products, modules, pipelines, AI tokens), not seats.
How does AI token billing work?
1 Argy credit = 1,000,000 tokens. Tokens are consumed when calling the LLM Gateway (Argy Code, AI assistant, RAG), and you can track consumption in real time in the console.
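As a quick worked example of the conversion stated above (1 credit = 1,000,000 tokens):

```python
TOKENS_PER_CREDIT = 1_000_000  # 1 Argy credit = 1,000,000 tokens

def credits_used(tokens: int) -> float:
    return tokens / TOKENS_PER_CREDIT

print(credits_used(500_000))  # 0.5  -> Growth plan's monthly AI token quota
print(credits_used(10_000))   # 0.01 -> Free plan's monthly AI token quota
```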
Can I change plans at any time?
Yes, you can upgrade at any time. The change is effective immediately and billing is prorated. For downgrades, contact our team.
What is the on-premises LLM Gateway?
The on-premises LLM Gateway allows you to deploy the AI gateway in your infrastructure. Your data and LLM API keys stay within your perimeter, ideal for enterprises with sovereignty requirements.
Are self-hosted agents secure?
Yes. Agents only establish outbound connections (HTTPS) to Argy. No inbound ports are exposed. Credentials stay in your infrastructure.
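A minimal sketch of the outbound-only pattern described above: the agent polls Argy over HTTPS and pushes results back, so nothing inside your network has to accept inbound connections. The control-plane URL, paths, and payloads are hypothetical.

```python
import time
import requests

# Hypothetical control-plane URL; the agent only makes outbound HTTPS requests.
ARGY_API = "https://agents.example.argy.cloud"

def execute_locally(task: dict) -> dict:
    """Placeholder: run the requested action with credentials that stay on-site."""
    return {"status": "done", "task": task.get("id")}

def run_agent(agent_token: str) -> None:
    headers = {"Authorization": f"Bearer {agent_token}"}
    while True:
        # Outbound poll for pending work; no inbound port is ever opened.
        task = requests.get(f"{ARGY_API}/v1/tasks/next",
                            headers=headers, timeout=30).json()
        if task:
            result = execute_locally(task)
            requests.post(f"{ARGY_API}/v1/tasks/{task['id']}/result",
                          headers=headers, json=result, timeout=30)
        time.sleep(5)
```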
Do you offer discounts for annual commitments?
Yes, we offer a 15% discount for annual commitments. Contact our sales team to learn more.
Ready to transform your DevSecOps?
Book a personalized demo with our team. We'll show you how Argy can accelerate your platform engineering journey.
No credit card required • 15% discount on annual plans • Cancel anytime
FAQ
Common questions.
Does Argy replace your existing tools?
No. Argy interfaces with your toolchain (Git, CI/CD, cloud, Kubernetes, observability, secrets). Argy's role is to orchestrate and standardize via versioned modules, not to reinvent every brick.
What is an Argy 'module'?
A module encapsulates an operational workflow: configuration schema, templates (IaC/CI), policies/guardrails, documentation, and runbooks. It is reusable, extensible, and applied per environment.
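To illustrate the shape of a module (not an actual Argy schema; every field name here is hypothetical), it could be modeled along these lines:

```python
from dataclasses import dataclass, field

# Hypothetical structure for illustration; not Argy's actual module format.
@dataclass
class Module:
    name: str
    config_schema: dict            # parameters teams can set, validated per environment
    templates: list[str]           # IaC / CI templates rendered from the config
    policies: list[str]            # guardrails enforced before anything is applied
    runbooks: list[str] = field(default_factory=list)
    docs_url: str = ""

microservice = Module(
    name="microservice-golden-path",
    config_schema={"language": ["python", "go"], "replicas": {"min": 1, "max": 10}},
    templates=["deployment.yaml.tpl", "pipeline.yaml.tpl"],
    policies=["require-code-review", "block-public-buckets"],
    runbooks=["rollback.md", "scale-up.md"],
    docs_url="https://docs.example/modules/microservice",
)
```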
How does Argy differ from a 'home-grown' IDP?
Argy provides a SaaS operating layer: catalog, governance, versioning, self-service experience, and steering. You keep your technical choices and tools — Argy accelerates standardization and adoption.
How do we get started in a few weeks?
We start with 1 to 2 priority golden paths (e.g., microservice + ephemeral envs). Then, we expand the catalog and add governance/observability incrementally.
What is the role of AI in Argy?
AI assists platform teams and developers in configuring modules, detecting drift from standards (Golden Paths), and generating operational runbooks automatically.
Is Argy suitable for large enterprises?
Absolutely. Argy was designed for scale, with fine-grained RBAC, SSO, audit logs, and dedicated support. It is the ultimate solution for organizations wanting to industrialize their DevSecOps.
European SaaS
GDPR compliant & hosted in EU
No Lock-in
Built on open standards
API-First
Everything is automatable
Ready to turn AI into an enterprise operating system?
Share your context (toolchain, constraints, org). We’ll propose a pragmatic rollout that makes AI governed, scalable, and sovereign.