In just a few steps, you’ll have OpenStack up and running, ready to handle requests from your AI applications.
1. Sign up for OpenStack

If you haven’t already, sign up for an OpenStack account at https://app.openstack.ai.
2. Connect your LLM provider

In the OpenStack dashboard, connect your preferred LLM provider (e.g., OpenAI, Anthropic) by adding your API key. Learn more about model providers.
3. Get your OpenStack API key

Once your LLM provider is connected, you will get an OpenStack API key. This key authenticates requests from your AI applications to OpenStack. Keep it secure!
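To keep the key out of source control, you can load it from an environment variable instead of hardcoding it. A minimal sketch (the variable name `OPENSTACK_API_KEY` is illustrative, not prescribed by OpenStack):

```typescript
// Read the OpenStack key from the environment instead of hardcoding it.
// (OPENSTACK_API_KEY is an illustrative variable name.)
const apiKey = process.env.OPENSTACK_API_KEY ?? "";
if (!apiKey) {
  console.warn("OPENSTACK_API_KEY is not set");
}
```

You would then pass `apiKey` to the client constructor in the next step rather than a literal string.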
4. Update your application to use OpenStack

Replace your AI application's LLM provider endpoint with OpenStack's endpoint: https://api.openstack.ai/v1. Make sure to include a user id; learn more here.
The example below uses the OpenAI TypeScript SDK; equivalents exist for the AI SDK (TypeScript), the OpenAI Python SDK, cURL, and the OpenStack SDK (coming soon).
import OpenAI from "openai";

const openai = new OpenAI({
  apiKey: "YOUR_OPENSTACK_API_KEY", // your OpenStack key, not the provider's
  baseURL: "https://api.openstack.ai/v1", // route requests through OpenStack
});

const response = await openai.chat.completions.create({
  model: "gpt-5",
  messages: [{ role: "user", content: "Hello!" }],
  user: "user_123", // the end-user id OpenStack asks you to include
});
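The cURL option maps to a plain HTTP POST. As a sketch, assuming OpenStack exposes the standard OpenAI-compatible chat-completions route at its base URL, the same request can be built by hand:

```typescript
// Build the same chat-completions request as a raw HTTP call.
// (Assumes OpenStack mirrors the OpenAI-compatible REST shape.)
const payload = {
  model: "gpt-5",
  messages: [{ role: "user", content: "Hello!" }],
  user: "user_123", // the end-user id OpenStack asks you to include
};

const requestInit = {
  method: "POST",
  headers: {
    Authorization: "Bearer YOUR_OPENSTACK_API_KEY",
    "Content-Type": "application/json",
  },
  body: JSON.stringify(payload),
};

// To send it:
// fetch("https://api.openstack.ai/v1/chat/completions", requestInit)
```

Building the payload this way makes it easy to verify the required fields (notably `user`) before any request leaves your application.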
5. That's it! Your AI application is now set up to use OpenStack. You can start sending requests and enable advanced features and integrations as you need them.