# OpenAI Chat Feature
The OpenAI Chat feature enhances the AI Services functionality by integrating OpenAI-compatible models. It provides a suite of services for interacting with these models, enabling advanced AI capabilities.
## Configuration
The OpenAI AI Chat feature allows you to connect to any AI provider that adheres to OpenAI API standards, such as DeepSeek, Google Gemini, Together AI, vLLM, Cloudflare Workers AI, and more.
To configure a connection, add the following settings to the `appsettings.json` file:
```json
{
  "OrchardCore": {
    "CrestApps_AI": {
      "Providers": {
        "OpenAI": {
          "DefaultConnectionName": "openai-cloud",
          "DefaultDeploymentName": "gpt-4o-mini",
          "Connections": {
            "openai-cloud": {
              "ApiKey": "<!-- Your API Key Goes Here -->",
              "DefaultDeploymentName": "gpt-4o-mini",
              "DefaultUtilityDeploymentName": "gpt-4o-mini"
            }
          }
        }
      }
    }
  }
}
```
## Using AI Deployments
If the AI Deployments feature is enabled, you can create multiple deployments under the same connection. This allows different AI profiles to utilize different models while sharing the same connection.
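As a sketch, an `AIDeployment` recipe step can register more than one model against the `openai-cloud` connection defined above; the model names here are illustrative:

```json
{
  "steps": [
    {
      "name": "AIDeployment",
      "deployments": [
        {
          "Name": "gpt-4o-mini",
          "ProviderName": "OpenAI",
          "ConnectionName": "openai-cloud"
        },
        {
          "Name": "gpt-4o",
          "ProviderName": "OpenAI",
          "ConnectionName": "openai-cloud"
        }
      ]
    }
  ]
}
```

Each AI profile can then select whichever deployment suits its task while reusing the same API key and endpoint.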
## Configuring Other AI Providers
The OpenAI AI Chat feature supports multiple AI providers that adhere to OpenAI API standards, such as:
- DeepSeek (Docs)
- Google Gemini (Docs)
- Together AI (Docs)
- vLLM (Docs)
- Cloudflare Workers AI (Docs)
- LM Studio (Docs)
- KoboldCpp (Docs)
- text-gen-webui (Docs)
- FastChat (Docs)
- LocalAI (Docs)
- llama-cpp-python (Docs)
- TensorRT-LLM (Docs)
- BerriAI/litellm (Docs)
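What all of these providers share is the OpenAI chat-completions request shape: the same route, headers, and JSON body, with only the endpoint, API key, and model name differing per provider. The following is a minimal stdlib sketch illustrating that surface (the `build_chat_request` helper is hypothetical, not part of this module):

```python
import json
import urllib.request

def build_chat_request(endpoint: str, api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for the /chat/completions route shared by
    OpenAI-compatible providers; only endpoint, key, and model differ."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        url=endpoint.rstrip("/") + "/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# The same builder targets DeepSeek, vLLM, Together AI, etc.
req = build_chat_request("https://api.deepseek.com/v1", "sk-demo", "deepseek-chat", "Hello")
print(req.full_url)  # https://api.deepseek.com/v1/chat/completions
```

Because the request shape is shared, the feature only needs the `Endpoint` and `ApiKey` of a connection to talk to any of these providers.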
## Configuring a Provider Example: DeepSeek
You can configure the DeepSeek connection either through the configuration provider or via the UI using the AI Connection Management feature.
### Configuration Using `appsettings.json`
To configure DeepSeek, add the following settings:
```json
{
  "OrchardCore": {
    "CrestApps_AI": {
      "Providers": {
        "OpenAI": {
          "DefaultConnectionName": "deepseek",
          "DefaultDeploymentName": "deepseek-chat",
          "Connections": {
            "deepseek": {
              "Endpoint": "https://api.deepseek.com/v1",
              "ApiKey": "<!-- Your API Key Goes Here -->",
              "DefaultDeploymentName": "deepseek-chat"
            }
          }
        }
      }
    }
  }
}
```
The `DefaultConnectionName` and `DefaultDeploymentName` values under the `OpenAI` node are required only if you want the `deepseek` connection to act as the default OpenAI connection when AI profiles use the default setting.
### Configuration via AI Connection Management
If you are using the AI Connection Management feature, you can configure DeepSeek through the UI or by executing the following recipe:
```json
{
  "steps": [
    {
      "name": "AIProviderConnections",
      "connections": [
        {
          "Source": "OpenAI",
          "Name": "deepseek",
          "IsDefault": false,
          "DefaultDeploymentName": "deepseek-chat",
          "DisplayText": "DeepSeek",
          "Properties": {
            "OpenAIConnectionMetadata": {
              "Endpoint": "https://api.deepseek.com/v1",
              "ApiKey": "<!-- DeepSeek API Key -->"
            }
          }
        }
      ]
    }
  ]
}
```
## Adapting the Configuration for Other Providers
To connect to Google Gemini, Together AI, vLLM, or any other supported provider, modify the `Endpoint` and `ApiKey` fields accordingly. For example, configuring Google Gemini would look like this:
```json
{
  "OrchardCore": {
    "CrestApps_AI": {
      "Providers": {
        "OpenAI": {
          "DefaultConnectionName": "google-gemini",
          "DefaultDeploymentName": "gemini-pro",
          "Connections": {
            "google-gemini": {
              "Endpoint": "https://generativelanguage.googleapis.com/v1",
              "ApiKey": "<!-- Your Google Gemini API Key -->",
              "DefaultDeploymentName": "gemini-pro"
            }
          }
        }
      }
    }
  }
}
```
You can replace `Endpoint` with the appropriate URL for each provider.
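The same pattern covers self-hosted servers. As an illustrative sketch, a local vLLM instance (which serves an OpenAI-compatible API, by default on port 8000) might be configured as follows; the connection name and model name are assumptions for this example:

```json
{
  "OrchardCore": {
    "CrestApps_AI": {
      "Providers": {
        "OpenAI": {
          "Connections": {
            "vllm-local": {
              "Endpoint": "http://localhost:8000/v1",
              "ApiKey": "<!-- Many local servers accept any placeholder key -->",
              "DefaultDeploymentName": "meta-llama/Llama-3.1-8B-Instruct"
            }
          }
        }
      }
    }
  }
}
```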
## Configuring Multiple Models
If you need access to multiple DeepSeek models, you can execute the following recipe to add standard deployments:
```json
{
  "steps": [
    {
      "name": "AIDeployment",
      "deployments": [
        {
          "Name": "deepseek-chat",
          "ProviderName": "OpenAI",
          "ConnectionName": "deepseek"
        },
        {
          "Name": "deepseek-reasoner",
          "ProviderName": "OpenAI",
          "ConnectionName": "deepseek"
        }
      ]
    }
  ]
}
```
This configuration allows you to access multiple models provided by DeepSeek, such as `deepseek-chat` and `deepseek-reasoner`.
By following these steps, you can seamlessly integrate DeepSeek into your AI chat feature, either as a default provider or alongside other AI models.