feat: support minimax ai model #1033
Conversation
}

// When using the OpenAI API protocol, map the model
if m.config.protocol == protocolOpenAI {
Isn't this check problematic? If the user's configuration doesn't use the OpenAI contract, then the code above can't deserialize the request data according to the OpenAI contract in the first place. Also, this protocol refers to the contract, not the model.
The intent here is that model mapping is only performed when the OpenAI protocol is used. With the provider's native API protocol, the model passed in the request is used directly, with no mapping applied.
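The behavior described in this reply could be sketched roughly as follows. This is only an illustration of the discussed logic, not the PR's actual code: `protocolOriginal`, `resolveModel`, and the `config` fields are assumed names based on the snippet above.

```go
package main

import "fmt"

const (
	protocolOpenAI   = "openai"
	protocolOriginal = "original" // assumed name for the provider-native protocol
)

type config struct {
	protocol     string
	modelMapping map[string]string // e.g. "gpt-3.5-turbo" -> "abab6.5s-chat"
}

// resolveModel returns the model name to send upstream. The mapping is
// applied only when the request arrived over the OpenAI-compatible
// protocol; with the native protocol the caller's model passes through.
func resolveModel(c config, requested string) string {
	if c.protocol == protocolOpenAI {
		if mapped, ok := c.modelMapping[requested]; ok {
			return mapped
		}
	}
	return requested
}

func main() {
	c := config{
		protocol:     protocolOpenAI,
		modelMapping: map[string]string{"gpt-3.5-turbo": "abab6.5s-chat"},
	}
	fmt.Println(resolveModel(c, "gpt-3.5-turbo")) // mapped under the OpenAI protocol
	fmt.Println(resolveModel(config{protocol: protocolOriginal}, "abab5.5-chat")) // passthrough
}
```

This also illustrates the reviewer's concern: nothing in the `modelMapping` field itself signals that it is ignored under the native protocol.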
However, the definition of the modelMapping field doesn't say it only takes effect under the OpenAI protocol, so this logic may confuse users. That said, a user who configures the Original protocol generally wouldn't configure modelMapping anyway.
Okay, I'll adjust this part.
That also works. If the model is abab6.5, abab6.5s, or abab5.5s, ChatCompletion Pro is preferred; abab5.5 prefers ChatCompletion; other models use ChatCompletion v2. I'll adjust the logic along these lines.
ChatCompletion Pro also supports abab5.5. Current implementation: if the model is abab6.5, abab6.5s, abab5.5s, or abab5.5, ChatCompletion Pro is preferred; other models (abab6.5t, abab6.5g) use ChatCompletion v2.
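The routing rule agreed on above could be sketched like this. A sketch only: the function name is invented, and the endpoint paths are assumptions about MiniMax's API layout, not taken from this PR.

```go
package main

import "fmt"

// chooseEndpoint picks the MiniMax API for a given model, per the rule
// discussed above: abab6.5, abab6.5s, abab5.5s, and abab5.5 prefer
// ChatCompletion Pro; all other models (e.g. abab6.5t, abab6.5g) use
// ChatCompletion v2. Paths are illustrative assumptions.
func chooseEndpoint(model string) string {
	proModels := map[string]bool{
		"abab6.5":  true,
		"abab6.5s": true,
		"abab5.5s": true,
		"abab5.5":  true,
	}
	if proModels[model] {
		return "/v1/text/chatcompletion_pro"
	}
	return "/v1/text/chatcompletion_v2"
}

func main() {
	fmt.Println(chooseEndpoint("abab5.5"))  // routed to Pro
	fmt.Println(chooseEndpoint("abab6.5t")) // routed to v2
}
```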
LGTM
Ⅰ. Describe what this PR did
1) Support the MiniMax AI model.
2) Fix the streaming response format when Wenxin Yiyan (ERNIE Bot) is used via the OpenAI protocol (a space was missing after `data:`).
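On fix 2): in the SSE framing used by the OpenAI API, each event payload follows `data: ` with a space, and the bug was emitting `data:{...}` without it. A minimal sketch of the corrected framing (the helper name is hypothetical, not the PR's actual code):

```go
package main

import "fmt"

// formatSSEChunk serializes one streaming chunk in the framing the
// OpenAI API uses: "data: " (note the space) followed by the JSON
// payload and a blank line. The bug fixed in this PR was emitting
// "data:{...}" with no space after the colon.
func formatSSEChunk(payload string) string {
	return "data: " + payload + "\n\n"
}

func main() {
	fmt.Print(formatSSEChunk(`{"choices":[]}`))
}
```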
Ⅱ. Does this pull request fix one issue?
fixes #953
Ⅲ. Why don't you add test cases (unit test/integration test)?
Ⅳ. Describe how to verify it
docker-compose.yaml
Using the OpenAI protocol
envoy.yaml
Non-streaming requests
Example 1: calling the ChatCompletion v2 API
Response:
Example 2: calling the ChatCompletion Pro API
Response:
Streaming requests
Example 1: calling the ChatCompletion v2 API
Response:
Example 2: calling the ChatCompletion Pro API
Response:
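For reference, a non-streaming request through the gateway's OpenAI-compatible endpoint would look roughly like this. The listen address, path, and model name are assumptions for illustration, not values taken from this PR's configs:

```shell
# Hypothetical request against a locally running gateway configured
# with the OpenAI protocol; adjust host/port and model to your setup.
curl http://127.0.0.1:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "abab6.5s-chat",
    "messages": [{"role": "user", "content": "hello"}]
  }'
```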
Using the MiniMax protocol
envoy.yaml
Non-streaming requests
Example 1: calling the ChatCompletion v2 API
Response:
Example 2: calling the ChatCompletion Pro API
Response:
Streaming requests
Example 1: calling the ChatCompletion v2 API
Response:
Example 2: calling the ChatCompletion Pro API
Response:
Fix for the streaming response format when ERNIE Bot is used via the OpenAI protocol
envoy.yaml
Streaming request
Response:
Ⅴ. Special notes for reviews