# OpenAI Chat Feature
| Property | Value |
| --- | --- |
| Feature Name | OpenAI Chat |
| Feature ID | `CrestApps.OrchardCore.OpenAI` |
Provides a way to interact with the OpenAI service provider.
## Overview
The OpenAI AI Chat feature enhances the AI Services functionality by integrating OpenAI-compatible models. It provides a suite of services to interact with these models, enabling advanced AI capabilities.
## Configuration
The OpenAI AI Chat feature allows you to connect to any AI provider that adheres to OpenAI API standards, such as DeepSeek, Google Gemini, Together AI, vLLM, Cloudflare Workers AI, and more.
To configure a connection, add the following settings to the `appsettings.json` file:

```json
{
  "OrchardCore": {
    "CrestApps_AI": {
      "Providers": {
        "OpenAI": {
          "DefaultConnectionName": "openai-cloud",
          "Connections": {
            "openai-cloud": {
              "ApiKey": "<!-- Your API Key Goes Here -->",
              "Deployments": [
                { "Name": "gpt-4o", "Type": "Chat", "IsDefault": true },
                { "Name": "gpt-4o-mini", "Type": "Utility", "IsDefault": true },
                { "Name": "text-embedding-3-large", "Type": "Embedding", "IsDefault": true },
                { "Name": "dall-e-3", "Type": "Image", "IsDefault": true }
              ]
            }
          }
        }
      }
    }
  }
}
```
The older format using `ChatDeploymentName`, `UtilityDeploymentName`, and related properties is still supported but deprecated. Existing configurations are migrated automatically at runtime.
```json
{
  "Connections": {
    "openai-cloud": {
      "ApiKey": "...",
      "ChatDeploymentName": "gpt-4o",
      "UtilityDeploymentName": "gpt-4o-mini",
      "EmbeddingDeploymentName": "text-embedding-3-large",
      "ImagesDeploymentName": "dall-e-3"
    }
  }
}
```
## Using AI Deployments
If the AI Deployments feature is enabled, you can create multiple typed deployments under the same connection. Each deployment has a `Type` (`Chat`, `Utility`, `Embedding`, `Image`, or `SpeechToText`) and an optional `IsDefault` flag. This allows different AI profiles to use different models while sharing the same connection. UI dropdowns display deployments grouped by connection for easy selection.
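As a sketch of what typed defaults look like on a single connection, the fragment below declares a default `Chat` and a default `SpeechToText` deployment side by side. The `whisper-1` model name is illustrative and not taken from the original document:

```json
{
  "Connections": {
    "openai-cloud": {
      "ApiKey": "<!-- Your API Key Goes Here -->",
      "Deployments": [
        { "Name": "gpt-4o", "Type": "Chat", "IsDefault": true },
        { "Name": "whisper-1", "Type": "SpeechToText", "IsDefault": true }
      ]
    }
  }
}
```

Because each `IsDefault` flag applies per `Type`, more than one deployment in the same connection can be marked default without conflict.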
## Supported OpenAI-Compatible Providers

The OpenAI AI Chat feature supports any AI provider that adheres to OpenAI API standards, including:
- DeepSeek (Docs)
- Google Gemini (Docs)
- Together AI (Docs)
- vLLM (Docs)
- Cloudflare Workers AI (Docs)
- LM Studio (Docs)
- KoboldCpp (Docs)
- text-gen-webui (Docs)
- FastChat (Docs)
- LocalAI (Docs)
- llama-cpp-python (Docs)
- TensorRT-LLM (Docs)
- BerriAI/litellm (Docs)
## Configuring a Provider Example: DeepSeek
You can configure the DeepSeek connection either through the configuration provider or via the UI using the AI Connection Management feature.
### Configuration Using `appsettings.json`
To configure DeepSeek, add the following settings:

```json
{
  "OrchardCore": {
    "CrestApps_AI": {
      "Providers": {
        "OpenAI": {
          "DefaultConnectionName": "deepseek",
          "Connections": {
            "deepseek": {
              "Endpoint": "https://api.deepseek.com/v1",
              "ApiKey": "<!-- Your API Key Goes Here -->",
              "Deployments": [
                { "Name": "deepseek-chat", "Type": "Chat", "IsDefault": true }
              ]
            }
          }
        }
      }
    }
  }
}
```
The `DefaultConnectionName` under the `OpenAI` node is required only if you want to make the `deepseek` connection the default OpenAI connection when AI profiles use the default setting.
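To illustrate how `DefaultConnectionName` selects among several connections, the sketch below registers both connections from the earlier examples but keeps `deepseek` as the default; deployment entries are omitted for brevity:

```json
{
  "OrchardCore": {
    "CrestApps_AI": {
      "Providers": {
        "OpenAI": {
          "DefaultConnectionName": "deepseek",
          "Connections": {
            "openai-cloud": {
              "ApiKey": "<!-- Your OpenAI API Key -->"
            },
            "deepseek": {
              "Endpoint": "https://api.deepseek.com/v1",
              "ApiKey": "<!-- Your DeepSeek API Key -->"
            }
          }
        }
      }
    }
  }
}
```

Profiles that do not name a connection explicitly will then resolve to `deepseek`, while `openai-cloud` remains selectable per profile.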
### Configuration via AI Connection Management
If you are using the AI Connection Management feature, you can configure DeepSeek through the UI or by executing the following recipe:
```json
{
  "steps": [
    {
      "name": "AIProviderConnections",
      "connections": [
        {
          "Source": "OpenAI",
          "Name": "deepseek",
          "IsDefault": false,
          "DisplayText": "DeepSeek",
          "Deployments": [
            { "Name": "deepseek-chat", "Type": "Chat", "IsDefault": true }
          ],
          "Properties": {
            "OpenAIConnectionMetadata": {
              "Endpoint": "https://api.deepseek.com/v1",
              "ApiKey": "<!-- DeepSeek API Key -->"
            }
          }
        }
      ]
    }
  ]
}
```
## Configuring Other AI Providers
To connect to Google Gemini, Together AI, vLLM, or any other supported provider, modify the `Endpoint` and `ApiKey` fields accordingly. For example, configuring Google Gemini would look like this:

```json
{
  "OrchardCore": {
    "CrestApps_AI": {
      "Providers": {
        "OpenAI": {
          "DefaultConnectionName": "google-gemini",
          "Connections": {
            "google-gemini": {
              "Endpoint": "https://generativelanguage.googleapis.com/v1",
              "ApiKey": "<!-- Your Google Gemini API Key -->",
              "Deployments": [
                { "Name": "gemini-pro", "Type": "Chat", "IsDefault": true }
              ]
            }
          }
        }
      }
    }
  }
}
```

You can replace `Endpoint` with the appropriate URL for each provider.
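The same pattern applies to self-hosted servers. As a sketch, a local vLLM instance (which by default exposes an OpenAI-compatible API at `http://localhost:8000/v1`) could be wired up as follows; the model name and the placeholder `ApiKey` value are illustrative, since some local servers do not enforce API keys:

```json
{
  "OrchardCore": {
    "CrestApps_AI": {
      "Providers": {
        "OpenAI": {
          "DefaultConnectionName": "local-vllm",
          "Connections": {
            "local-vllm": {
              "Endpoint": "http://localhost:8000/v1",
              "ApiKey": "<!-- Any value if your server does not require a key -->",
              "Deployments": [
                { "Name": "meta-llama/Llama-3.1-8B-Instruct", "Type": "Chat", "IsDefault": true }
              ]
            }
          }
        }
      }
    }
  }
}
```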
## Configuring Multiple Models
If you need access to multiple DeepSeek models, you can execute the following recipe to add typed deployments:

```json
{
  "steps": [
    {
      "name": "AIDeployment",
      "deployments": [
        {
          "Name": "deepseek-chat",
          "Type": "Chat",
          "IsDefault": true,
          "ProviderName": "OpenAI",
          "ConnectionName": "deepseek"
        },
        {
          "Name": "deepseek-reasoner",
          "Type": "Chat",
          "ProviderName": "OpenAI",
          "ConnectionName": "deepseek"
        }
      ]
    }
  ]
}
```
This configuration gives you access to multiple models provided by DeepSeek, such as `deepseek-chat` (set as the default `Chat` deployment) and `deepseek-reasoner`.

By following these steps, you can seamlessly integrate DeepSeek into your AI chat feature, either as a default provider or alongside other AI models.