
How to pass cache prompt to the OpenAI chat completion module, or how to achieve token-level streaming using the POST API? #8271

Closed · Answered by Raul824
Raul824 asked this question in Q&A

In the chat completion call, pass the prompt-cache flag through the extra request body:

extra_body={"cache_prompt": True}

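For the second part of the question, token-level streaming through the raw POST API: llama.cpp's server streams server-sent events from its native /completion endpoint when `"stream": true` is set, and `cache_prompt` can be passed in the same body. A sketch using `requests`, assuming a server at localhost:8080 (the URL, prompt, and `n_predict` value are illustrative):

```python
import json
import requests

url = "http://localhost:8080/completion"  # llama.cpp server's native endpoint
payload = {
    "prompt": "Once upon a time",
    "n_predict": 64,
    "stream": True,        # emit tokens as server-sent events
    "cache_prompt": True,  # reuse the KV cache for the shared prompt prefix
}

with requests.post(url, json=payload, stream=True) as r:
    r.raise_for_status()
    for line in r.iter_lines(decode_unicode=True):
        # SSE frames look like: data: {"content": "...", "stop": false, ...}
        if not line or not line.startswith("data: "):
            continue
        chunk = json.loads(line[len("data: "):])
        print(chunk.get("content", ""), end="", flush=True)
        if chunk.get("stop"):
            break
```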
Replies: 1 comment, 1 reply (from @ThachNgocTran)
Answer selected by Raul824