Add pre-commit with ruff, pyproject.toml, gh lint/test actions (#124)
* Add pre-commit with ruff and related gh action
* Split lint/unit test into separate jobs
* Fix gh action syntax
* Fix action version
* Remove un-needed packages, switch to pyproject.toml
* Tweak pyproject.toml
* Update README, run pre-commit
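The new GitHub Actions split linting and unit tests into separate jobs. Running the equivalent checks locally looks roughly like this (a sketch — the authoritative commands live in the workflow files and dependency manifests, which are not part of this diff):

```sh
# Lint/format checks, mirroring the ruff-based pre-commit hooks
# (assumes ruff is available in the pipenv dev environment)
pipenv run ruff check .
pipenv run ruff format --check .

# Unit tests (assumes the suite is pytest-based)
pipenv run pytest
```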
README.md (+56, -32)
@@ -20,6 +20,11 @@ Each agent is intended to answer questions related to a set of documents known a
- [With Docker Compose](#with-docker-compose)
- [Using huggingface text-embeddings-inference server to host embedding model (deprecated)](#using-huggingface-text-embeddings-inference-server-to-host-embedding-model-deprecated)
- [Deploying to OpenShift](#deploying-to-openshift)
@@ -142,52 +147,32 @@ A development/test environment can be set up with or without docker compose. In
The docker compose file offers an easy way to spin up all components. [ollama](https://ollama.com) is used to host the LLM and embedding model. To make use of your GPU, refer to the comments in the compose file to see which configurations to uncomment on the 'ollama' container. Postgres persists the data, and pgadmin allows you to query the database.

-You will need Docker version 27.5.1 on Fedora 40 and 41 to be able to use docker compose (not docker-compose). For that, you will need to reinstall the latest docker version from the [fedora docker repo](https://docs.docker.com/engine/install/fedora/#install-using-the-repository) or follow the instructions here.
+1. First, install Docker: [Follow the official guide for your OS](https://docs.docker.com/engine/install/)

-Docker 27.5.1 is confirmed working with macOS 15.3.
+- NOTE: Currently, the compose file does not work with `podman`.

-To get the correct version of docker, add the repo:
+2. On Linux, be sure to run through the [postinstall steps](https://docs.docker.com/engine/install/linux-postinstall/)
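With Docker in place, the whole stack can typically be brought up with a single compose invocation (a sketch; the exact service set is defined by the repo's compose file):

```sh
# Build images and start ollama, postgres, pgadmin, and the backend in the background
docker compose up --build -d

# Follow the logs; Ctrl-C stops following without stopping the containers
docker compose logs -f

# Tear everything down when finished (add -v to also remove the postgres volume)
docker compose down
```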
@@ -196,7 +181,7 @@ Run through the postinstall steps https://docs.docker.com/engine/install/linux-p
}
```

-1. (optional) Follow these steps to start the [tangerine-frontend](https://github.com/RedHatInsights/tangerine-frontend#with-docker-compose)
+7. (optional) Follow these steps to start the [tangerine-frontend](https://github.com/RedHatInsights/tangerine-frontend#with-docker-compose)

Note: You can access pgadmin at localhost:5050.
@@ -315,14 +300,53 @@ to use this to test different embedding models that are not supported by ollama,
1. (optional) Follow these steps to start the [tangerine-frontend](https://github.com/RedHatInsights/tangerine-frontend#without-docker-compose)

-## Debugging in VSCode
+## Developer Guide
+
+### Install development packages
+
+If you intend to make contributions, be sure to install the development packages:
+
+```sh
+pipenv install --dev
+```
+
+### Using pre-commit
+
+This project uses pre-commit to handle formatting and linting.
+
+- Before pushing a commit, you can run:
+
+  ```sh
+  pre-commit run --all-files
+  ```
+
+  and if it fails, check for changes the tool has made to your files.
+
+- Alternatively, you can add pre-commit as a git hook with:
+
+  ```sh
+  pre-commit install
+  ```
+
+  and pre-commit will then be invoked automatically every time you create a commit.
+
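The hooks themselves are configured in `.pre-commit-config.yaml`, which is not shown in this diff, so the hook ids may differ; with the standard ruff hooks you can also run a single hook or refresh the pinned versions (a sketch, assuming the hook id is `ruff`):

```sh
# Run only the ruff hook against every file (hook id assumed; check .pre-commit-config.yaml)
pre-commit run ruff --all-files

# Bump the pinned hook revisions in .pre-commit-config.yaml
pre-commit autoupdate
```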
+### Debugging in VSCode

Run postgres and ollama either locally or in containers. Don't run the backend container. Click on "Run & Debug" in the left menu and then run the "Debug Tangerine Backend" debug target. You can now set breakpoints and inspect runtime state.

There's a second debug target for the unit tests if you want to run those in a debugger.
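In practice that means starting only the supporting services before launching the debug target, for example (a sketch; the service names are assumed to match the compose file):

```sh
# Start only the supporting services; the backend itself runs under the VSCode debugger
docker compose up -d postgres pgadmin ollama
```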
## Mac Development Tips

-Ollama running in Docker on Apple Silicon cannot make use of hardware acceleration. That means the LLM will be very slow to respond running in Docker, even on a very capable machine. However, running the model locally does make use of acceleration and is quite fast. If you are working on a Mac, the best setup is to run the model through ollama locally and then run the other deps like the database in Docker. The way the compose file is set up, the networking is all seamless. If you stop the ollama container and then `ollama serve` locally, it will all just work together. You'll have the best local development setup if you combine the model running locally and tangerine-backend running in a debugger in VSCode with postgres and pgadmin running in Docker!

+Ollama running in Docker on Apple Silicon cannot make use of hardware acceleration. That means the LLM will be very slow to respond when running in Docker, even on a very capable machine.
+
+However, running ollama outside of Docker does make use of acceleration and is quite fast. If you are working on a Mac, the best setup is to run the model through ollama locally and continue to run the other components (like the database) in Docker. The way the compose file is set up, the networking should allow this to work without issue.
+
+Comment out `ollama` from the compose file, or stop the ollama container, then invoke `ollama serve` in your shell. For an optimal developer experience:
+
+- run tangerine-backend in a debugger in VSCode
+- run ollama directly on your host
+- run postgres/pgadmin in Docker.
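Concretely, handing the model over to the host might look like this (a sketch; it assumes the compose service is named `ollama` and that ollama is installed natively on the Mac):

```sh
# Stop the containerized ollama so the host install can own the default port (11434)
docker compose stop ollama

# Run ollama natively on macOS so it benefits from Apple Silicon acceleration
ollama serve
```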

## Synchronizing Documents from S3
@@ -350,15 +374,15 @@ To do so you'll need to do the following:
echo 'BUCKET=mybucket' >> .env
```

-5. Create an `s3.yaml` file that describes your agents and the documents they should ingest. See [s3-example.yaml](s3-example.yaml) for an example.
+1. Create an `s3.yaml` file that describes your agents and the documents they should ingest. See [s3-example.yaml](s3-example.yaml) for an example.

If using docker compose, copy this config into your container: