
## Overview

Launch includes a production-ready AI chat feature with streaming responses and chat history persistence. The architecture is modular, making it easy to extend with additional providers and custom system prompts.

## Prerequisites

- Backend running with AI env vars set (copy from `apps/api/example.env` first)
- `EXPO_PUBLIC_API_URL` set in `apps/mobile/.env` (from `apps/mobile/example.env`)
## Key Features

### Provider Abstraction

OpenAI is enabled by default. Anthropic support is wired in the backend and can be enabled in the mobile model list with a single config change.

### Streaming Responses

Real-time streaming for a responsive chat experience, with support for stopping generation mid-response.

### System Prompts

Predefined personas and custom system prompts to shape AI behavior.

### Chat Persistence

Conversation history stored in PostgreSQL via tRPC procedures.
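The provider abstraction described above might be modeled along these lines. This is an illustrative sketch, not the kit's actual API: the `AIProvider` interface, the provider objects, and `modelOptions` are all assumed names.

```typescript
// Illustrative sketch of a unified provider interface (names are assumptions).
interface AIProvider {
  id: string;
  label: string;
  models: string[];
  // Streams completion tokens for a prompt; implementation omitted here.
  stream?: (prompt: string) => AsyncIterable<string>;
}

const openaiProvider: AIProvider = {
  id: "openai",
  label: "OpenAI",
  models: ["gpt-4o", "gpt-4o-mini"],
};

const anthropicProvider: AIProvider = {
  id: "anthropic",
  label: "Anthropic",
  models: ["claude-3-5-sonnet-latest"],
};

// OpenAI is enabled by default; append anthropicProvider to expose Claude models.
export const providers: AIProvider[] = [openaiProvider /*, anthropicProvider */];

// Flatten enabled providers into the model list shown in the mobile picker.
export const modelOptions = providers.flatMap((p) =>
  p.models.map((m) => ({ provider: p.id, model: m }))
);
```

The "single config change" mentioned above corresponds to uncommenting `anthropicProvider` in the `providers` array.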
## Architecture

The AI feature is built with extensibility in mind, separating concerns into distinct layers.

### Core Components

#### AI Provider Abstraction

Location: `lib/ai/providers/`

A unified interface for working with different AI providers. OpenAI is enabled by default; add Anthropic to the providers list to expose Claude models in the UI.

#### Custom Hooks

Location: `features/chat/hooks/`

Business logic extracted into reusable hooks: `useChat` for messaging, `useChatHistory` for history management, and `useChatPersistence` for database sync.

#### Chat Context

Location: `features/chat/context/`

Global state management for chat functionality, reducing prop drilling and simplifying component interactions.

## Steps
- Copy the example env files (if you haven't already).
- Add your provider API keys to the backend `.env`.
- Ensure the AI feature flag is enabled: `apps/mobile/features/feature-registry.tsx` → `featureFlags.ai = true`
- (Optional) Enable Anthropic models in the mobile picker: `apps/mobile/lib/ai/providers/index.ts` → add `anthropicProvider` to the `providers` array.
- Open the chat screen at `/ai-chat`.
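After the first two steps, the backend `.env` might contain entries like these. The variable names here are assumptions; treat `apps/api/example.env` as the source of truth:

```bash
# apps/api/.env — provider keys (names assumed; check example.env)
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
```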
## How It Works

- Mobile streams tokens from the API via `POST /api/ai/stream` (SSE)
- Chat history is persisted via tRPC procedures under `chat.*`
- System prompts are defined in `apps/mobile/lib/ai/prompts/index.ts`
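A minimal sketch of consuming an SSE stream like the one above. The `data:` framing follows the SSE spec, but the kit's exact wire format and request body are assumptions; `parseSSE` and `streamTokens` are illustrative names. Stop generation maps naturally onto `AbortController`.

```typescript
// Split an SSE buffer into complete "data:" payloads plus any leftover partial frame.
function parseSSE(buffer: string): { events: string[]; rest: string } {
  const frames = buffer.split("\n\n"); // SSE frames end with a blank line
  const rest = frames.pop() ?? "";     // last piece may be incomplete
  const events: string[] = [];
  for (const frame of frames) {
    for (const line of frame.split("\n")) {
      if (line.startsWith("data: ")) events.push(line.slice(6));
    }
  }
  return { events, rest };
}

// Stream tokens from an SSE endpoint; abort the signal to stop generation.
async function* streamTokens(
  url: string,
  body: unknown,
  signal?: AbortSignal
): AsyncGenerator<string> {
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
    signal,
  });
  if (!res.ok || !res.body) throw new Error(`stream failed: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const { events, rest } = parseSSE(buffer);
    buffer = rest;
    yield* events;
  }
}
```

Usage would look like `for await (const token of streamTokens("/api/ai/stream", { prompt }, ctrl.signal))`, where calling `ctrl.abort()` on the `AbortController` implements "stop generation".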
## Key Files

- API streaming: `apps/api/src/routes/ai-stream.ts`
- AI providers: `apps/api/src/services/openai.ts` and `apps/api/src/services/anthropic.ts`
- Chat persistence: `apps/api/src/routers/chat.ts`
- Mobile streaming client: `apps/mobile/lib/api/streaming.ts`
- Chat UI and hooks: `apps/mobile/features/chat/`
## Customizing System Prompts

System prompts define the AI's persona and behavior; they live in `apps/mobile/lib/ai/prompts/index.ts`.

## Database Models

Chat history is stored via the `Chat` and `ChatMessage` models in `apps/api/prisma/schema.prisma`.
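A prompt registry for predefined personas might look like this. The shape, persona names, and `getSystemPrompt` helper are assumptions for illustration, not the kit's actual exports:

```typescript
// Illustrative persona registry (names and shape are assumptions).
interface SystemPrompt {
  id: string;
  label: string;
  prompt: string;
}

export const systemPrompts: SystemPrompt[] = [
  {
    id: "default",
    label: "Assistant",
    prompt: "You are a helpful assistant.",
  },
  {
    id: "coach",
    label: "Coach",
    prompt: "You are an encouraging fitness coach. Keep answers short.",
  },
];

// Resolve a persona by id, falling back to the default prompt.
export function getSystemPrompt(id: string): string {
  return systemPrompts.find((p) => p.id === id)?.prompt ?? systemPrompts[0].prompt;
}
```

Adding a custom persona is then a matter of appending an entry to the array.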
## API Endpoints

The chat feature exposes the following tRPC procedures:

| Procedure | Type | Description |
|---|---|---|
| `chat.list` | Query | Get all chats for the current user |
| `chat.get` | Query | Get a specific chat with its messages |
| `chat.create` | Mutation | Create a new chat |
| `chat.delete` | Mutation | Delete a chat |
| `chat.addMessage` | Mutation | Add a message to a chat |
| `chat.updateTitle` | Mutation | Update a chat's title |
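The behavior of these procedures can be approximated with an in-memory sketch that shows the expected shapes. This is an illustration only, not the kit's Prisma-backed implementation; the `Chat` and `ChatMessage` types here are simplified assumptions:

```typescript
// In-memory approximation of the chat.* procedures (illustrative, not the kit's code).
interface ChatMessage { role: "user" | "assistant"; content: string }
interface Chat { id: string; title: string; messages: ChatMessage[] }

const db = new Map<string, Chat>();
let nextId = 1;

const chat = {
  list: (): Chat[] => [...db.values()],
  get: (id: string): Chat | undefined => db.get(id),
  create: (title: string): Chat => {
    const c: Chat = { id: String(nextId++), title, messages: [] };
    db.set(c.id, c);
    return c;
  },
  delete: (id: string): boolean => db.delete(id),
  addMessage: (id: string, msg: ChatMessage): void => {
    db.get(id)?.messages.push(msg);
  },
  updateTitle: (id: string, title: string): void => {
    const c = db.get(id);
    if (c) c.title = title;
  },
};
```

From the mobile app these would be invoked through the generated tRPC client rather than called directly.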
## Test Checklist
- Chat screen loads and streams responses
- New chats are saved and appear in history
- Switching models updates the active provider
## Troubleshooting

If streaming fails, verify your API keys and check the server logs. For general issues, start with Troubleshooting.

## Remove / Disable

To disable AI while you configure providers, set:

`apps/mobile/features/feature-registry.tsx` → `featureFlags.ai = false`

For production removal guidance, see Removing Features.
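The feature registry referenced above might be shaped roughly like this; the `featureFlags` object, the extra `auth` flag, and the `isEnabled` helper are assumptions for illustration, not the kit's actual file:

```typescript
// Illustrative feature-registry shape (the kit's actual registry may differ).
export const featureFlags = {
  ai: false,  // set true to re-enable the chat feature
  auth: true, // hypothetical second flag, for illustration only
};

// Type-safe lookup keyed to the flags defined above.
export function isEnabled(flag: keyof typeof featureFlags): boolean {
  return featureFlags[flag];
}
```

Screens and navigation entries can then check `isEnabled("ai")` instead of reading the flag object directly.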