Commit ddd5cb9 (1 parent: a79b0d2)

Reorganize and reword discussion of streaming language models (#319)

Alternative to #286

Signed-off-by: Mattt Zmuda <[email protected]>

1 file changed: README.md (+8 −15 lines)
@@ -30,7 +30,8 @@ We recommend not adding the token directly to your source code, because you don'
 
 ## Run a model
 
-Create a new Python file and add the following code, replacing the model identifier and input with your own:
+Create a new Python file and add the following code,
+replacing the model identifier and input with your own:
 
 ```python
 >>> import replicate
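The hunk above is truncated by the diff view, but the "Run a model" pattern it introduces can be sketched offline. `run_model` below is a stand-in for `replicate.run`, returning a canned result so the sketch executes without an API token; the prompt and output URL are illustrative, not from the commit:

```python
def run_model(identifier, input):
    """Stand-in for replicate.run(identifier, input=input).

    The real call sends `input` to the hosted model and returns its
    output; here we return a canned file URL so this runs offline.
    """
    return [f"https://replicate.com/api/models/{identifier}/files/out-0.png"]

output = run_model(
    "stability-ai/stable-diffusion",
    input={"prompt": "an astronaut riding a horse"},
)
print(output[0])
```

The real client returns output in whatever shape the model defines; image models like stable-diffusion return a list of file URLs, as shown in the hunk below.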
@@ -42,18 +43,6 @@ Create a new Python file and add the following code, replacing the model identif
 ['https://replicate.com/api/models/stability-ai/stable-diffusion/files/50fcac81-865d-499e-81ac-49de0cb79264/out-0.png']
 ```
 
-Some models, particularly language models, may not require the version string. Refer to the API documentation for the model for more on the specifics:
-
-```python
-replicate.run(
-    "meta/meta-llama-3-70b-instruct",
-    input={
-        "prompt": "Can you write a poem about open source machine learning?",
-        "system_prompt": "You are a helpful, respectful and honest assistant.",
-    },
-)
-```
-
 > [!TIP]
 > You can also use the Replicate client asynchronously by prepending `async_` to the method name.
 >
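The tip in the hunk above describes the client's async variants (e.g. `replicate.async_run`). A minimal sketch of that calling pattern, using a stub coroutine in place of the real client so it runs offline (the model name and canned response are illustrative):

```python
import asyncio

async def async_run(identifier, input):
    """Stub mirroring the shape of replicate.async_run(identifier, input=input).

    The real coroutine awaits the hosted prediction; this one yields
    control once and returns a canned string so the sketch runs offline.
    """
    await asyncio.sleep(0)
    return f"A response from {identifier}"

result = asyncio.run(
    async_run("meta/meta-llama-3-70b-instruct", input={"prompt": "Hello"})
)
print(result)
```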
@@ -110,13 +99,17 @@ for event in replicate.stream(
     print(str(event), end="")
 ```
 
+> [!TIP]
+> Some models, like [meta/meta-llama-3-70b-instruct](https://replicate.com/meta/meta-llama-3-70b-instruct),
+> don't require a version string.
+> You can always refer to the API documentation on the model page for specifics.
+
 You can also stream the output of a prediction you create.
 This is helpful when you want the ID of the prediction separate from its output.
 
 ```python
-version = "02e509c789964a7ea8736978a43525956ef40397be9033abf9fd2badfe68c9e3"
 prediction = replicate.predictions.create(
-    version=version,
+    model="meta/meta-llama-3-70b-instruct",
     input={"prompt": "Please write a haiku about llamas."},
     stream=True,
 )
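The streaming loop shown in the hunk above consumes output events one at a time and prints each as it arrives. The consumption pattern can be sketched offline with a stand-in generator in place of the client's event stream (the tokens below are illustrative, not real model output):

```python
def stream_events(prompt):
    """Stand-in generator for a streamed prediction's output events.

    The real client yields output incrementally as server-sent events;
    here we yield fixed tokens so the sketch runs offline.
    """
    for token in ["Soft ", "wool ", "in ", "the ", "sun."]:
        yield token

# Accumulate the stream the same way the README loop does:
# each event is stringified and emitted without a trailing newline.
haiku = "".join(str(event) for event in stream_events("Please write a haiku about llamas."))
print(haiku)
```

With the real client, swapping the generator for the prediction's event stream leaves the loop body unchanged, which is the point of the streaming interface.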
