@@ -30,7 +30,8 @@ We recommend not adding the token directly to your source code, because you don'
 
 ## Run a model
 
-Create a new Python file and add the following code, replacing the model identifier and input with your own:
+Create a new Python file and add the following code,
+replacing the model identifier and input with your own:
 
 ```python
 >>> import replicate
@@ -42,18 +43,6 @@ Create a new Python file and add the following code, replacing the model identif
 ['https://replicate.com/api/models/stability-ai/stable-diffusion/files/50fcac81-865d-499e-81ac-49de0cb79264/out-0.png']
 ```
 
-Some models, particularly language models, may not require the version string. Refer to the API documentation for the model for more on the specifics:
-
-```python
-replicate.run(
-    "meta/meta-llama-3-70b-instruct",
-    input={
-        "prompt": "Can you write a poem about open source machine learning?",
-        "system_prompt": "You are a helpful, respectful and honest assistant.",
-    },
-)
-```
-
 > [!TIP]
 > You can also use the Replicate client asynchronously by prepending `async_` to the method name.
 >
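To illustrate the tip above: a minimal sketch of the asynchronous variant, assuming `replicate.async_run` mirrors `run` (per the `async_` naming rule in the tip) and reusing the llama model identifier that appears later in this diff:

```python
# Minimal sketch of the async variant described in the tip above.
# Assumes `replicate.async_run` mirrors `run` but is awaitable, and
# reuses the llama model identifier that appears later in this diff.
import asyncio

import replicate


async def main() -> None:
    output = await replicate.async_run(
        "meta/meta-llama-3-70b-instruct",
        input={"prompt": "Please write a haiku about llamas."},
    )
    print(output)


asyncio.run(main())
```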
@@ -110,13 +99,17 @@ for event in replicate.stream(
     print(str(event), end="")
 ```
 
+> [!TIP]
+> Some models, like [meta/meta-llama-3-70b-instruct](https://replicate.com/meta/meta-llama-3-70b-instruct),
+> don't require a version string.
+> You can always refer to the API documentation on the model page for specifics.
+
 You can also stream the output of a prediction you create.
 This is helpful when you want the ID of the prediction separate from its output.
 
 ```python
-version = "02e509c789964a7ea8736978a43525956ef40397be9033abf9fd2badfe68c9e3"
 prediction = replicate.predictions.create(
-    version=version,
+    model="meta/meta-llama-3-70b-instruct",
     input={"prompt": "Please write a haiku about llamas."},
     stream=True,
 )
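Following on from that snippet, a hedged sketch of how the created prediction might then be used; the `urls` mapping with a `"stream"` entry is an assumption here, not shown in the diff:

```python
import replicate

# Create the prediction as in the snippet above.
prediction = replicate.predictions.create(
    model="meta/meta-llama-3-70b-instruct",
    input={"prompt": "Please write a haiku about llamas."},
    stream=True,
)

# The ID is available right away, separate from the output.
print(prediction.id)

# With stream=True, the prediction is assumed (not shown in the diff)
# to expose a server-sent-events URL in its `urls` mapping, which an
# SSE client can consume.
print(prediction.urls["stream"])
```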