md/02.QuickStart/Ollama_QuickStart.md
+36 −19 (36 additions & 19 deletions)
@@ -4,7 +4,7 @@
## **1. Installation**
- Ollama supports running on Windows, macOS, and Linux. You can install Ollama through this link ([https://ollama.com/download](https://ollama.com/download)). After successful installation, you can directly use Ollama script to call Phi-3 through a terminal window. You can see all the [available libaries in Ollama.](https://ollama.com/library)
+ Ollama supports running on Windows, macOS, and Linux. You can install Ollama from [https://ollama.com/download](https://ollama.com/download). After successful installation, you can call Phi-3 directly from a terminal window using the Ollama script. You can see all the [available libraries in Ollama](https://ollama.com/library). If you open this repository in a Codespace, it will already have Ollama installed.
```bash
@@ -26,7 +26,7 @@ If you want to call the Phi-3 API generated by ollama, you can use this command
ollama serve
```
- ***Note:*** If running MacOS or Linux, please note that you may encounter the following error <b>"Error: listen tcp 127.0.0.1:11434: bind: address already in use"</b> You may get this error when calling running the command. The solution for this problems is:
+ ***Note:*** On macOS or Linux, you may encounter the error <b>"Error: listen tcp 127.0.0.1:11434: bind: address already in use"</b> when running the command. You can either ignore the error, since it typically indicates the server is already running, or you can stop and restart Ollama:
**macOS**
@@ -46,7 +46,7 @@ sudo systemctl stop ollama
```
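Before starting the server, you can check whether the port is already taken. This is an illustrative sketch (not part of the original guide) that probes Ollama's default port 11434 using only Python's standard library; if it reports the port as in use, `ollama serve` or the desktop app is most likely already running:

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        return s.connect_ex((host, port)) == 0

# Ollama's default port
print(port_in_use(11434))
```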
- Ollama supports two API: generate and chat. You can call the model API provided by Ollama according to your needs. Local service port 11434. such as
+ Ollama supports two APIs: generate and chat. You can call the model API provided by Ollama according to your needs, by sending requests to the local service running on port 11434.
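As an illustration of how the two endpoints differ, here is a sketch of their JSON request bodies (the `"phi3"` model tag is assumed; field names follow Ollama's API documentation, linked later in this guide). With `ollama serve` running, these would be POSTed to `http://localhost:11434/api/generate` and `http://localhost:11434/api/chat` respectively:

```python
import json

# generate: single-prompt completion
generate_payload = {
    "model": "phi3",
    "prompt": "Why is the sky blue?",
    "stream": False,  # return one JSON object instead of a token stream
}

# chat: multi-turn conversation via a list of role-tagged messages
chat_payload = {
    "model": "phi3",
    "messages": [{"role": "user", "content": "Why is the sky blue?"}],
    "stream": False,
}

print(json.dumps(generate_payload))
print(json.dumps(chat_payload))
```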

# Additional Resources
- Check the list of available models in Ollama in [this link.](https://ollama.com/library)
+ Check the list of available models in Ollama in [their library](https://ollama.com/library).
Pull your model from the Ollama server using this command
@@ -113,17 +113,45 @@ ollama run phi3
***Note:*** Visit this link [https://github.com/ollama/ollama/blob/main/docs/api.md](https://github.com/ollama/ollama/blob/main/docs/api.md) to learn more
+ ## Calling Ollama from Python
+ 
+ You can use `requests` or `urllib3` to make requests to the local server endpoints used above. However, a popular way to use Ollama in Python is via the [openai](https://pypi.org/project/openai/) SDK, since Ollama provides OpenAI-compatible server endpoints as well.
+ 
+ Here is an example for phi3-mini:
+ 
+ ```python
+ import openai
+ 
+ client = openai.OpenAI(
+     base_url="http://localhost:11434/v1",
+     api_key="nokeyneeded",
+ )
+ 
+ response = client.chat.completions.create(
+     model="phi3",
+     temperature=0.7,
+     n=1,
+     messages=[
+         {"role": "system", "content": "You are a helpful assistant."},
+         {"role": "user", "content": "Write a haiku about a hungry cat"},
+     ],
+ )
+ 
+ print("Response:")
+ print(response.choices[0].message.content)
+ ```
## Calling Ollama from JavaScript
```javascript
- #Example of Summarize a file with Phi-3
+ // Example of summarizing a file with Phi-3
script({
model:"ollama:phi3",
title:"Summarize with Phi-3",
system: ["system"],
})
- #Example of summarize
+ // Example of summarizing
const file = def("FILE", env.files)
$`Summarize ${file} in a single paragraph.`
```
@@ -160,14 +188,3 @@ Run the app with the command: