Forge Integration
Using the AI SDK inside a Forge application via the AI extension
The AI SDK can be used standalone in any Go project. When building a Forge application, the AI extension (github.com/xraph/forge/extensions/ai) provides a thin DI wrapper that registers ai-sdk services into Forge's Vessel container.
How It Works
The Forge AI extension:
- Reads AI configuration from the Forge app config.
- Creates an LLMManager, a StateStore, and a VectorStore.
- Registers them in the Vessel container for dependency injection.
- Creates an AgentFactory and an AgentManager for managing agents.
Services Registered
| Service | Type | Container Key |
|---|---|---|
| LLM Manager | sdk.LLMManager | forge.ai.sdk.llmManager |
| State Store | sdk.StateStore | auto-registered by type |
| Vector Store | sdk.VectorStore | auto-registered by type |
| Agent Factory | *ai.AgentFactory | auto-registered by type |
| Agent Manager | *ai.AgentManager | auto-registered by type |
When config.Training.Enabled is true, the extension also registers:
- ai.ModelTrainer
- ai.DataManager
- ai.PipelineManager
Using the Extension
import (
	"context"
	"log"

	"github.com/xraph/forge"
	"github.com/xraph/forge/extensions/ai"
)

aiExt := ai.NewExtension()

app := forge.New(
	forge.WithAppName("my-ai-app"),
	forge.WithExtensions(aiExt),
)

ctx := context.Background()
if err := app.Start(ctx); err != nil {
	log.Fatal(err)
}
defer app.Stop(ctx)

Resolving AI Services
Use Vessel to resolve ai-sdk services from the container:
import (
"github.com/xraph/vessel"
sdk "github.com/xraph/ai-sdk"
)
// By type (recommended)
llmManager, err := vessel.InjectType[sdk.LLMManager](app.Container())
// By name
llmManager, err := vessel.Resolve[sdk.LLMManager](app.Container(), "forge.ai.sdk.llmManager")

Using in Extensions
func (e *MyExtension) Register(app forge.App) error {
return vessel.ProvideConstructor(app.Container(), func(
llmManager sdk.LLMManager,
logger forge.Logger,
) (*MyAIService, error) {
return &MyAIService{
llm: llmManager,
logger: logger,
}, nil
})
}

Important Notes
- The AI extension falls back to in-memory stores when configured backends fail to connect.
- LLM providers are not auto-registered from config -- they must be registered manually.
- No routes are mounted automatically; register AgentController routes yourself.
- The extension's Health() currently returns nil and does not aggregate component checks.
Standalone vs. Forge
| Scenario | Recommendation |
|---|---|
| Standalone Go app | Use github.com/xraph/ai-sdk directly |
| Forge app needing AI | Use the AI extension for DI wiring |
| Shared library | Import github.com/xraph/ai-sdk directly |
| Custom DI setup | Use github.com/xraph/ai-sdk directly |
Further Reading
- AI Extension docs -- full configuration, DI lifecycle, HTTP API, and troubleshooting
- Vessel docs -- dependency injection container