@@ -37,8 +37,8 @@ Feature: llama.cpp server
 
     Examples: Prompts
       | prompt | n_predict | re_content | n_prompt | n_predicted | truncated |
-      | I believe the meaning of life is | 8 | (read\|going)+ | 18 | 8 | not |
-      | Write a joke about AI from a very long prompt which will not be truncated | 256 | (princesses\|everyone\|kids\|Anna\|forest)+ | 46 | 64 | not |
+      | I believe the meaning of life is | 8 | (read\|going\|pretty)+ | 18 | 8 | not |
+      | Write a joke about AI from a very long prompt which will not be truncated | 256 | (princesses\|everyone\|kids\|Anna\|forest)+ | 45 | 64 | not |
 
   Scenario: Completion prompt truncated
     Given a prompt:
@@ -67,8 +67,8 @@ Feature: llama.cpp server
 
     Examples: Prompts
       | model | system_prompt | user_prompt | max_tokens | re_content | n_prompt | n_predicted | enable_streaming | truncated |
-      | llama-2 | Book | What is the best book | 8 | (Here\|what)+ | 77 | 8 | disabled | not |
-      | codellama70b | You are a coding assistant. | Write the fibonacci function in c++. | 128 | (thanks\|happy\|bird\|Annabyear)+ | -1 | 64 | enabled | |
+      | llama-2 | Book | What is the best book | 8 | (Here\|what)+ | 76 | 8 | disabled | not |
+      | codellama70b | You are a coding assistant. | Write the fibonacci function in c++. | 128 | (thanks\|happy\|bird\|fireplace)+ | -1 | 64 | enabled | |
 
 
   Scenario Outline: OAI Compatibility w/ response format
@@ -84,7 +84,7 @@ Feature: llama.cpp server
       | response_format | n_predicted | re_content |
       | {"type": "json_object", "schema": {"const": "42"}} | 5 | "42" |
       | {"type": "json_object", "schema": {"items": [{"type": "integer"}]}} | 10 | \[ -300 \] |
-      | {"type": "json_object"} | 10 | \{ " Jacky. |
+      | {"type": "json_object"} | 10 | \{ " Saragine. |
 
 
   Scenario: Tokenize / Detokenize
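A note on the `re_content` column in the tables above: literal pipes are escaped as `\|` because `|` is the Gherkin cell delimiter, so a step implementation has to unescape the pattern before compiling it. The helper below is only a sketch of that idea, not the actual step code from the test suite; the function name `assert_content_matches` is invented for illustration.

```python
import re

def assert_content_matches(re_content: str, content: str) -> bool:
    """Check generated text against a table-supplied regex.

    Gherkin tables escape literal pipes as '\\|', so the pattern is
    unescaped before compiling. This is a hypothetical sketch, not the
    suite's real step implementation.
    """
    pattern = re_content.replace("\\|", "|")  # '\|' in the table -> '|' in the regex
    return re.search(pattern, content) is not None

# The updated first example row also accepts "pretty":
print(assert_content_matches(r"(read\|going\|pretty)+", "life is pretty simple"))
```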