From 69f07d268a41f2c4d2761742113a2c4b20d298d7 Mon Sep 17 00:00:00 2001
From: Derrick Mwiti
Date: Wed, 26 Apr 2023 11:47:34 +0300
Subject: [PATCH 01/13] Docker user guide

---
 docs/user-guide/deepsparse-docker.md | 214 +++++++++++++++++++++++++++
 1 file changed, 214 insertions(+)
 create mode 100644 docs/user-guide/deepsparse-docker.md

diff --git a/docs/user-guide/deepsparse-docker.md b/docs/user-guide/deepsparse-docker.md
new file mode 100644
index 0000000000..ed961e0c05
--- /dev/null
+++ b/docs/user-guide/deepsparse-docker.md
@@ -0,0 +1,214 @@
+
# How to Use DeepSparse With Docker
Apart from installing DeepSparse with `pip` you can also set it up using [Docker](https://www.docker.com/), enabling you to start using DeepSparse immediately without the need to manualy install all the required dependencies.

The first step is to pull the `deepsparse` image from the GitHub Container Registry.

```bash
docker pull ghcr.io/neuralmagic/deepsparse:1.4.2
```

Tag the image to make it easier to reference later:

```bash
docker tag ghcr.io/neuralmagic/deepsparse:1.4.2 deepsparse_docker
```
## DeepSparse Server Example

Built on the popular FastAPI and Uvicorn stack, DeepSparse Server enables you to set-up a REST endpoint for serving inferences over HTTP. Since DeepSparse Server wraps the Pipeline API, it inherits all of the utilities provided by Pipelines.

Start the `deepsparse` container in interactive mode and publish the container's port 5543 to the local machine's port 5543 to expose the port outside the container.
+
Here's the meaning of the flags passed to `docker container run`:
- `-i` keeps STDIN open even if not attached
- `-t` allocates a pseudo-TTY
- `-p` publishes the container's internal port 5543 to the local machine's port 5543
```bash
docker container run -it -p 5543:5543 deepsparse_docker
```
You can also run the container using the [old Docker CLI syntax](https://www.docker.com/blog/whats-new-in-docker-1-13/) but using the new commands is encouraged:
```bash
docker run -it -p 5543:5543 deepsparse_docker
```
Running the following CLI command inside the container launches a sentiment analysis pipeline with a 90% pruned-quantized BERT model identified by its SparseZoo stub:

```bash
deepsparse.server --task sentiment_analysis --model_path "zoo:nlp/sentiment_analysis/distilbert-none/pytorch/huggingface/sst2/pruned90-none"
```
Alternatively, you can run the two commands in a single line:
```bash
docker container run -p 5543:5543 deepsparse_docker deepsparse.server --task sentiment_analysis --model_path "zoo:nlp/sentiment_analysis/distilbert-none/pytorch/huggingface/sst2/pruned90-none"
```
You should see Uvicorn report that it is running on http://0.0.0.0:5543. Once launched, a `/docs` path is created with full endpoint descriptions and support for making sample requests.

Here is an example client request, using the Python `requests` library to format the HTTP request:
```python
import requests

url = "http://localhost:5543/predict"

obj = {
    "sequences": "Who is Mark?",
}

response = requests.post(url, json=obj)
response.content
# b'{"labels":["negative"],"scores":[0.9695534706115723]}'
```
## DeepSparse Engine Example

Engine is the lowest-level API for interacting with DeepSparse. As much as possible, we recommend you use the Pipeline API, but Engine is available if you want to handle pre- or post-processing yourself.

With Engine, we can compile an ONNX file and run inference on raw tensors.
+
Here's an example, using a 90% pruned-quantized BERT trained on SST2 from SparseZoo.

Save this script in a file named `app.py`:
```python
from deepsparse import Engine
from deepsparse.utils import generate_random_inputs, model_to_path

def run():
    # download onnx from sparsezoo and compile with batch size 1
    sparsezoo_stub = "zoo:nlp/sentiment_analysis/obert-base/pytorch/huggingface/sst2/pruned90_quant-none"
    batch_size = 1
    bert_engine = Engine(
        model=sparsezoo_stub,   # sparsezoo stub or path to local ONNX
        batch_size=batch_size   # defaults to batch size 1
    )

    # input is raw numpy tensors, output is raw scores for classes
    inputs = generate_random_inputs(model_to_path(sparsezoo_stub), batch_size)
    output = bert_engine(inputs)
    print(output)


if __name__ == "__main__":
    run()
```
Next create a Dockerfile. The name of the file should be `Dokcerfile`. This file has instructions for:
- Pulling the DeepSparse Docker Image
- Setting up a new user
- Setting the home directory for the user
- Copying the Python script into the container and setting owner to the created user
- Running the Python script
```Dockerfile
# Pull DeepSparse Image
FROM ghcr.io/neuralmagic/deepsparse:1.4.2

# Set up a new user named "user" with user ID 1000
RUN useradd -m -u 1000 user
# Switch to the "user" user
USER user
# Set home to the user's home directory
ENV HOME=/home/user \
    PATH=/home/user/.local/bin:$PATH

# Set the working directory to the user's home directory
WORKDIR $HOME/app
# Copy the current directory contents into the container at $HOME/app setting the owner to the user
COPY --chown=user . $HOME/app

CMD ["python", "app.py"]
```
Build a DeepSparse image in which the Python script will run. The `-t` argument tags the image with the given name.

Run the following command in the directory containing the `Dockerfile` and `app.py`.
```bash
docker build -t engine_deepsparse_docker .
+``` +Run your newly created DeepSparse Container: +```bash +docker container run engine_deepsparse_docker +# [array([[-0.34614536, 0.09025408]], dtype=float32)] +``` + +## DeepSparse Pipeline Example +Pipeline is the default interface for interacting with DeepSparse. + +Similar to Hugging Face Pipelines, DeepSparse Pipelines wrap pre- and post-processing around the inference performed by the Engine. This creates a clean API that allows you to pass raw images and text to DeepSparse and receive the post-processed prediction, making it easy to add DeepSparse to your application. + +Use the `Pipeline.create()` constructor to create an instance of a sentiment analysis Pipeline with a 90% pruned-quantized version of BERT trained on SST2. We can then pass the Pipeline raw text and receive the predictions. All of the pre-processing (such as tokenizing the input) is handled by the Pipeline. + +Save this script in a file called `app.py`: +```python +from deepsparse import Pipeline + +def run(): + # download onnx from sparsezoo and compile with batch size 1 + sparsezoo_stub = "zoo:nlp/sentiment_analysis/obert-base/pytorch/huggingface/sst2/pruned90_quant-none" + batch_size = 1 + sa_pipeline = Pipeline.create( + task="sentiment-analysis", + model_path=sparsezoo_stub, # sparsezoo stub or path to local ONNX + batch_size=1 # default batch size is 1 + ) + + # run inference on image file + prediction = sa_pipeline("The sentiment analysis pipeline is fast and easy to use") + print(prediction) + + +if __name__ == "__main__": + run() +``` +Next create a Dockerfile. 
The file should be named `Dockerfile`:
```Dockerfile
# Pull the DeepSparse Image
FROM ghcr.io/neuralmagic/deepsparse:1.4.2

# Set up a new user named "user" with user ID 1000
RUN useradd -m -u 1000 user
# Switch to the "user" user
USER user
# Set home to the user's home directory
ENV HOME=/home/user \
    PATH=/home/user/.local/bin:$PATH

# Set the working directory to the user's home directory
WORKDIR $HOME/app
# Copy the current directory contents into the container at $HOME/app setting the owner to the user
COPY --chown=user . $HOME/app

CMD ["python", "app.py"]
```

Create a Docker container using the Dockerfile. The `Dockerfile` and `app.py` should be in the same folder. Run the following command in that folder:
```bash
docker build -t pipeline_deepsparse_docker .
```
Run the Docker Container:
```bash
docker container run pipeline_deepsparse_docker
# labels=['positive'] scores=[0.9955807328224182]
```
## DeepSparse Benchmarking

Use the benchmarking utility to check DeepSparse's performance:
```bash
docker container run -it deepsparse_docker deepsparse.benchmark "zoo:nlp/sentiment_analysis/distilbert-none/pytorch/huggingface/sst2/pruned90-none"

> Original Model Path: zoo:nlp/sentiment_analysis/distilbert-none/pytorch/huggingface/sst2/pruned90-none
> Batch Size: 1
> Scenario: sync
> Throughput (items/sec): 1.4351
> Latency Mean (ms/batch): 696.7735
> Latency Median (ms/batch): 687.1720
> Latency Std (ms/batch): 465.9775
> Iterations: 15
```

From dc9484e17dea262a9b1f7b8f6ea65e7d88b8cc02 Mon Sep 17 00:00:00 2001
From: Derrick Mwiti
Date: Wed, 26 Apr 2023 11:52:11 +0300
Subject: [PATCH 02/13] Docker user guide

---
 docs/user-guide/deepsparse-docker.md | 10 ++++++----
 1 file changed, 6 insertions(+), 4 deletions(-)

diff --git a/docs/user-guide/deepsparse-docker.md b/docs/user-guide/deepsparse-docker.md
index ed961e0c05..cee939de52 100644
--- a/docs/user-guide/deepsparse-docker.md
+++
b/docs/user-guide/deepsparse-docker.md
@@ -14,7 +14,7 @@ See the License for the
 specific language governing permissions and
 limitations under the License.
 -->
 # How to Use DeepSparse With Docker
-Apart from installing DeepSparse with `pip` you can also set it up using [Docker](https://www.docker.com/), enabling you to start using DeepSparse immediately without the need to manualy install all the required dependencies.
+Apart from installing DeepSparse with `pip` you can also set it up using [Docker](https://www.docker.com/), enabling you to start using DeepSparse immediately without the need to manually install all the required dependencies.

 The first step is to pull the `deepsparse` image from the GitHub Container Registry.

@@ -29,7 +29,7 @@ docker tag ghcr.io/neuralmagic/deepsparse:1.4.2 deepsparse_docker
 ```
 ## DeepSparse Server Example

-Built on the popular FastAPI and Uvicorn stack, DeepSparse Server enables you to set-up a REST endpoint for serving inferences over HTTP. Since DeepSparse Server wraps the Pipeline API, it inherits all of the utilities provided by Pipelines.
+Built on the popular FastAPI and Uvicorn stack, DeepSparse Server enables you to set up a REST endpoint for serving inferences over HTTP. Since DeepSparse Server wraps the Pipeline API, it inherits all the utilities provided by Pipelines.

 Start the `deepsparse` container in interactive mode and publish the container's port 5543 to the local machine's port 5543 to expose the port outside the container.
@@ -140,9 +140,11 @@ docker container run engine_deepsparse_docker
 ## DeepSparse Pipeline Example
 Pipeline is the default interface for interacting with DeepSparse.

-Similar to Hugging Face Pipelines, DeepSparse Pipelines wrap pre- and post-processing around the inference performed by the Engine. This creates a clean API that allows you to pass raw images and text to DeepSparse and receive the post-processed prediction, making it easy to add DeepSparse to your application.
+Similar to Hugging Face Pipelines, DeepSparse Pipelines wrap pre- and post-processing around the inference performed by the Engine. +This creates a clean API that allows you to pass raw images and text to DeepSparse and receive the post-processed prediction, making it easy to add DeepSparse to your application. -Use the `Pipeline.create()` constructor to create an instance of a sentiment analysis Pipeline with a 90% pruned-quantized version of BERT trained on SST2. We can then pass the Pipeline raw text and receive the predictions. All of the pre-processing (such as tokenizing the input) is handled by the Pipeline. +Use the `Pipeline.create()` constructor to create an instance of a sentiment analysis Pipeline with a 90% pruned-quantized version of BERT trained on SST2. We can then pass the Pipeline raw text and receive the predictions. +All the pre-processing (such as tokenizing the input) is handled by the Pipeline. Save this script in a file called `app.py`: ```python From f5896d4263fadbf7e8720adea87694b99fa75131 Mon Sep 17 00:00:00 2001 From: Michael Goin Date: Wed, 26 Apr 2023 23:32:07 -0400 Subject: [PATCH 03/13] Update docs/user-guide/deepsparse-docker.md --- docs/user-guide/deepsparse-docker.md | 2 ++ 1 file changed, 2 insertions(+) diff --git a/docs/user-guide/deepsparse-docker.md b/docs/user-guide/deepsparse-docker.md index cee939de52..d23922201c 100644 --- a/docs/user-guide/deepsparse-docker.md +++ b/docs/user-guide/deepsparse-docker.md @@ -53,7 +53,9 @@ Alternatively, you can run the two commands in a single line: ```bash docker container run -p 5543:5543 deepsparse_docker deepsparse.server --task sentiment_analysis --model_path "zoo:nlp/sentiment_analysis/distilbert-none/pytorch/huggingface/sst2/pruned90-none" ``` + You should see Uvicorn report that it is running on http://0.0.0.0:5543. Once launched, a `/docs` path is created with full endpoint descriptions and support for making sample requests. 
+
Here is an example client request, using the Python `requests` library to format the HTTP request:
```python

From 7270b25f251ebf0c63f84d79a11398d80bcb3096 Mon Sep 17 00:00:00 2001
From: Derrick Mwiti
Date: Thu, 27 Apr 2023 09:44:32 +0300
Subject: [PATCH 04/13] Update deepsparse-docker.md

---
 docs/user-guide/deepsparse-docker.md | 38 ++++++++--------------------
 1 file changed, 11 insertions(+), 27 deletions(-)

diff --git a/docs/user-guide/deepsparse-docker.md b/docs/user-guide/deepsparse-docker.md
index d23922201c..6a9ee46208 100644
--- a/docs/user-guide/deepsparse-docker.md
+++ b/docs/user-guide/deepsparse-docker.md
@@ -104,27 +104,18 @@ if __name__ == "__main__":
 ```
 Next create a Dockerfile. The name of the file should be `Dokcerfile`. This file has instructions for:
 - Pulling the DeepSparse Docker Image
-- Setting up a new user
-- Setting the home directory for the user
-- Copying the Python script into the container and setting owner to the created user
+- Copying the Python script into the container
 - Running the Python script
 ```Dockerfile
-# Pull DeepSparse Image
 FROM ghcr.io/neuralmagic/deepsparse:1.4.2

-# Set up a new user named "user" with user ID 1000
-RUN useradd -m -u 1000 user
-# Switch to the "user" user
-USER user
-# Set home to the user's home directory
-ENV HOME=/home/user \
-    PATH=/home/user/.local/bin:$PATH
-
 # Set the working directory to the user's home directory
-WORKDIR $HOME/app
-# Copy the current directory contents into the container at $HOME/app setting the owner to the user
-COPY --chown=user . $HOME/app
+WORKDIR /app
+
+# Copy the current directory contents into the container
+COPY . .

+#Run the Python script
 CMD ["python", "app.py"]
 ```
 Build a DeepSparse image in which the Python script will run. The `-t` argument tags the image with the given name.
@@ -172,22 +163,15 @@
 ```
 Next create a Dockerfile.
The file should be named `Dockerfile`: ```Dockerfile -# Pull the DeepSparse Image FROM ghcr.io/neuralmagic/deepsparse:1.4.2 -# Set up a new user named "user" with user ID 1000 -RUN useradd -m -u 1000 user -# Switch to the "user" user -USER user -# Set home to the user's home directory -ENV HOME=/home/user \ - PATH=/home/user/.local/bin:$PATH - # Set the working directory to the user's home directory -WORKDIR $HOME/app -# Copy the current directory contents into the container at $HOME/app setting the owner to the user -COPY --chown=user . $HOME/app +WORKDIR /app + +# Copy the current directory contents into the container +COPY . . +#Run the Python script CMD ["python", "app.py"] ``` From 0fa44a8064cd4cf9bb02b9a70b89ceb63b90b1f7 Mon Sep 17 00:00:00 2001 From: Derrick Mwiti Date: Thu, 27 Apr 2023 10:25:36 +0300 Subject: [PATCH 05/13] Update docs/user-guide/deepsparse-docker.md Co-authored-by: Michael Goin --- docs/user-guide/deepsparse-docker.md | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/docs/user-guide/deepsparse-docker.md b/docs/user-guide/deepsparse-docker.md index 6a9ee46208..5a6e875cba 100644 --- a/docs/user-guide/deepsparse-docker.md +++ b/docs/user-guide/deepsparse-docker.md @@ -14,7 +14,9 @@ See the License for the specific language governing permissions and limitations under the License. --> # How to Use DeepSparse With Docker -Apart from installing DeepSparse with `pip` you can also set it up using [Docker](https://www.docker.com/), enabling you to start using DeepSparse immediately without the need to manually install all the required dependencies. +DeepSparse is an efficient and powerful tool for running inference on sparse and quantized models. Apart from installing DeepSparse with `pip`, it can be easily set up using [Docker](https://www.docker.com/) which allows you to start using DeepSparse without having to manually install all the required dependencies. 
+ +In this guide, you will learn how to use DeepSparse with Docker for various use cases, such as running an HTTP server, working with the `Engine`, using the `Pipeline`, and benchmarking DeepSparse's performance. The first step is to pull the `deepsparse` image from the GitHub Container Registry. From 50a3358fbe4c959e64b822a5633d0d8f87edaed6 Mon Sep 17 00:00:00 2001 From: Derrick Mwiti Date: Thu, 27 Apr 2023 10:26:06 +0300 Subject: [PATCH 06/13] Update docs/user-guide/deepsparse-docker.md Co-authored-by: Michael Goin --- docs/user-guide/deepsparse-docker.md | 8 +++++++- 1 file changed, 7 insertions(+), 1 deletion(-) diff --git a/docs/user-guide/deepsparse-docker.md b/docs/user-guide/deepsparse-docker.md index 5a6e875cba..fb87990561 100644 --- a/docs/user-guide/deepsparse-docker.md +++ b/docs/user-guide/deepsparse-docker.md @@ -18,7 +18,13 @@ DeepSparse is an efficient and powerful tool for running inference on sparse and In this guide, you will learn how to use DeepSparse with Docker for various use cases, such as running an HTTP server, working with the `Engine`, using the `Pipeline`, and benchmarking DeepSparse's performance. -The first step is to pull the `deepsparse` image from the GitHub Container Registry. +## Prerequisites + +Before you begin, make sure you have Docker installed on your machine. You can download and install it from the [official Docker website](https://www.docker.com/products/docker-desktop). 
+
+## Pulling and Tagging the DeepSparse Docker Image
+
+First, pull the `deepsparse` image from the GitHub Container Registry:

 ```bash
 docker pull ghcr.io/neuralmagic/deepsparse:1.4.2

From 20d9cc8306899834559c15f6ab07e64a7f67bfac Mon Sep 17 00:00:00 2001
From: Derrick Mwiti
Date: Thu, 27 Apr 2023 10:26:23 +0300
Subject: [PATCH 07/13] Update docs/user-guide/deepsparse-docker.md

Co-authored-by: Michael Goin
---
 docs/user-guide/deepsparse-docker.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/user-guide/deepsparse-docker.md b/docs/user-guide/deepsparse-docker.md
index fb87990561..99d24b522c 100644
--- a/docs/user-guide/deepsparse-docker.md
+++ b/docs/user-guide/deepsparse-docker.md
@@ -37,7 +37,7 @@ docker tag ghcr.io/neuralmagic/deepsparse:1.4.2 deepsparse_docker
 ```
 ## DeepSparse Server Example
-Built on the popular FastAPI and Uvicorn stack, DeepSparse Server enables you to set up a REST endpoint for serving inferences over HTTP. Since DeepSparse Server wraps the Pipeline API, it inherits all the utilities provided by Pipelines.
+DeepSparse Server, built on the popular FastAPI and Uvicorn stack, allows you to set up a REST endpoint for serving inferences over HTTP. It wraps the Pipeline API, inheriting all the utilities provided by Pipelines.

 Start the `deepsparse` container in interactive mode and publish the container's port 5543 to the local machine's port 5543 to expose the port outside the container.
From a733e62698015ba96370c5fbbb73add022cb13a8 Mon Sep 17 00:00:00 2001 From: Derrick Mwiti Date: Thu, 27 Apr 2023 10:26:37 +0300 Subject: [PATCH 08/13] Update docs/user-guide/deepsparse-docker.md Co-authored-by: Michael Goin --- docs/user-guide/deepsparse-docker.md | 4 ---- 1 file changed, 4 deletions(-) diff --git a/docs/user-guide/deepsparse-docker.md b/docs/user-guide/deepsparse-docker.md index 99d24b522c..85fb84ccda 100644 --- a/docs/user-guide/deepsparse-docker.md +++ b/docs/user-guide/deepsparse-docker.md @@ -48,10 +48,6 @@ Here's the meaning of the commands after `docker container run`: ```bash docker container run -it -p 5543:5543 deepsparse_docker ``` -You can also run the container using the [old Docker CLI syntax](https://www.docker.com/blog/whats-new-in-docker-1-13/) but using the new commands is encouraged: -```bash -docker run -it -p 5543:5543 deepsparse_docker -``` Running the following CLI command inside the container launches a sentiment analysis pipeline with a 90% pruned-quantized BERT model identified by its SparseZoo stub: ```bash From 751ec5a57ea17b7c87891d353cc422bd9a3e0bcd Mon Sep 17 00:00:00 2001 From: Derrick Mwiti Date: Thu, 27 Apr 2023 10:26:46 +0300 Subject: [PATCH 09/13] Update docs/user-guide/deepsparse-docker.md Co-authored-by: Michael Goin --- docs/user-guide/deepsparse-docker.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/user-guide/deepsparse-docker.md b/docs/user-guide/deepsparse-docker.md index 85fb84ccda..1f9fc3540d 100644 --- a/docs/user-guide/deepsparse-docker.md +++ b/docs/user-guide/deepsparse-docker.md @@ -130,7 +130,7 @@ docker build -t engine_deepsparse_docker . 
``` Run your newly created DeepSparse Container: ```bash -docker container run engine_deepsparse_docker +docker container run engine_deepsparse_docker # [array([[-0.34614536, 0.09025408]], dtype=float32)] ``` From 5dc0713e4dcfda6b7cf1ec5413a023b577c43190 Mon Sep 17 00:00:00 2001 From: Derrick Mwiti Date: Thu, 27 Apr 2023 10:27:05 +0300 Subject: [PATCH 10/13] Update docs/user-guide/deepsparse-docker.md Co-authored-by: Michael Goin --- docs/user-guide/deepsparse-docker.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/user-guide/deepsparse-docker.md b/docs/user-guide/deepsparse-docker.md index 1f9fc3540d..9cd278ad02 100644 --- a/docs/user-guide/deepsparse-docker.md +++ b/docs/user-guide/deepsparse-docker.md @@ -106,7 +106,7 @@ def run(): if __name__ == "__main__": run() ``` -Next create a Dockerfile. The name of the file should be `Dokcerfile`. This file has instructions for: +Next create a Dockerfile. The name of the file should be `Dockerfile`. This file has instructions for: - Pulling the DeepSparse Docker Image - Copying the Python script into the container - Running the Python script From 8462de7aa895b53f0b7501df398befb9d98e94d4 Mon Sep 17 00:00:00 2001 From: Derrick Mwiti Date: Thu, 27 Apr 2023 10:27:16 +0300 Subject: [PATCH 11/13] Update docs/user-guide/deepsparse-docker.md Co-authored-by: Michael Goin --- docs/user-guide/deepsparse-docker.md | 1 - 1 file changed, 1 deletion(-) diff --git a/docs/user-guide/deepsparse-docker.md b/docs/user-guide/deepsparse-docker.md index 9cd278ad02..9198c0dbef 100644 --- a/docs/user-guide/deepsparse-docker.md +++ b/docs/user-guide/deepsparse-docker.md @@ -150,7 +150,6 @@ from deepsparse import Pipeline def run(): # download onnx from sparsezoo and compile with batch size 1 sparsezoo_stub = "zoo:nlp/sentiment_analysis/obert-base/pytorch/huggingface/sst2/pruned90_quant-none" - batch_size = 1 sa_pipeline = Pipeline.create( task="sentiment-analysis", model_path=sparsezoo_stub, # sparsezoo stub or path to 
local ONNX

From 74892d1aa6dc3a9e6450b84911dcc5390a55b0d3 Mon Sep 17 00:00:00 2001
From: Derrick Mwiti
Date: Thu, 27 Apr 2023 10:27:24 +0300
Subject: [PATCH 12/13] Update docs/user-guide/deepsparse-docker.md

Co-authored-by: Michael Goin
---
 docs/user-guide/deepsparse-docker.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/user-guide/deepsparse-docker.md b/docs/user-guide/deepsparse-docker.md
index 9198c0dbef..a86e1f5724 100644
--- a/docs/user-guide/deepsparse-docker.md
+++ b/docs/user-guide/deepsparse-docker.md
@@ -184,7 +184,7 @@ docker build -t pipeline_deepsparse_docker .
 ```
 Run the Docker Container:
 ```bash
-docker container run pipeline_deepsparse_docker
+docker container run pipeline_deepsparse_docker
 # labels=['positive'] scores=[0.9955807328224182]
 ```
 ## DeepSparse Benchmarking

From 1b8d0079931b23f4400691177825086731903804 Mon Sep 17 00:00:00 2001
From: Derrick Mwiti
Date: Fri, 28 Apr 2023 10:00:02 +0300
Subject: [PATCH 13/13] Update deepsparse-docker.md

---
 docs/user-guide/deepsparse-docker.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/docs/user-guide/deepsparse-docker.md b/docs/user-guide/deepsparse-docker.md
index a86e1f5724..d010744c80 100644
--- a/docs/user-guide/deepsparse-docker.md
+++ b/docs/user-guide/deepsparse-docker.md
@@ -202,4 +202,6 @@ docker container run -it deepsparse_docker deepsparse.benchmark "zoo:nlp/sentime
 > Latency Std (ms/batch): 465.9775
 > Iterations: 15
 ```
+## How to Make Your Own DeepSparse Docker Image
+To build your own DeepSparse image, [follow these instructions](https://github.com/neuralmagic/deepsparse/tree/main/docker#build).
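
A lighter-weight alternative to building the image from source is to extend the published image with your own application dependencies, following the same pattern used in the guide's earlier Dockerfiles. This is a minimal sketch; the `requirements.txt` and `app.py` names are assumptions about your project layout, not part of the official build instructions:

```Dockerfile
# Start from the published DeepSparse image used throughout this guide
FROM ghcr.io/neuralmagic/deepsparse:1.4.2

# Set the working directory for the application
WORKDIR /app

# Install any extra Python dependencies your application needs
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy your application code into the container and run it
COPY . .
CMD ["python", "app.py"]
```

Build it with `docker build -t my_deepsparse_app .` in the folder containing the Dockerfile, just like the earlier examples.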