## 2. Model Zoo.

<div id="lite.ai.toolkit-Model-Zoo"></div>
*Lite.AI.ToolKit* contains *[70+](https://github.com/DefTruth/lite.ai.toolkit/tree/main/docs/hub/lite.ai.toolkit.hub.onnx.md)* AI models with *[150+](https://github.com/DefTruth/lite.ai.toolkit/tree/main/docs/hub/lite.ai.toolkit.hub.onnx.md)* frozen pretrained *.onnx* files. Note that the models here all come from third-party projects; most of them were converted by *Lite.AI.ToolKit*. Different names for the same algorithm mean that the corresponding models come from different repositories, different implementations, or different training data. ✅ means passed the test and ⚠️ means not implemented yet but coming soon. For classes denoted ✅, you can use them through the *lite::cv::Type::Class* syntax, such as *[lite::cv::detection::YoloV5](#lite.ai.toolkit-object-detection)*. More details can be found at [Examples for Lite.AI.ToolKit](#lite.ai.toolkit-Examples-for-Lite.AI.ToolKit).
<details>
<summary> Expand Details for Namespace and Lite.AI.ToolKit modules.</summary>

### Namespace and Lite.AI.ToolKit modules.
*Lite.AI.ToolKit* contains *[70+](https://github.com/DefTruth/lite.ai.toolkit/tree/main/docs/hub/lite.ai.toolkit.hub.onnx.md)* AI models with *[150+](https://github.com/DefTruth/lite.ai.toolkit/tree/main/docs/hub/lite.ai.toolkit.hub.onnx.md)* frozen pretrained *.onnx* files, covering many different fields of computer vision. Click the Expand ▶️ button for more details.
|*lite::cv::resolution*| Super Resolution. ⚠️ |

### Lite.AI.ToolKit's Classes and Pretrained Files.
Correspondence between the classes in *Lite.AI.ToolKit* and pretrained model files can be found at [lite.ai.toolkit.hub.onnx.md](https://github.com/DefTruth/lite.ai.toolkit/tree/main/docs/hub/lite.ai.toolkit.hub.onnx.md). For example, the pretrained model files for *lite::cv::detection::YoloV5* and *lite::cv::detection::YoloX* are listed as follows.
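Given that correspondence, switching pretrained weights is just a matter of passing a different *.onnx* filename to the class constructor, as in the README's own YoloX example. This is an untested sketch: it needs the toolkit installed, and the `lite/lite.h` umbrella-header path is my assumption.

```cpp
#include "lite/lite.h"  // assumption: the toolkit's umbrella header

int main() {
  // Each class accepts any of its pretrained *.onnx files from the hub table.
  auto *yolov5 = new lite::cv::detection::YoloV5("yolov5s.onnx");
  auto *yolox = new lite::cv::detection::YoloX("yolox_nano.onnx"); // 3.5Mb only!
  // ... run detection here ...
  delete yolov5;
  delete yolox;
  return 0;
}
```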
Note that I cannot upload all the *.onnx files because of the storage limitation of Google Drive (15G).

<div id="lite.ai.toolkit.toolkit-2"></div>

<!---
For example, ArcFace in [insightface](https://github.com/deepinsight/insightface) is different from ArcFace in [face.evoLVe.PyTorch](https://github.com/ZhaoJ9014/face.evoLVe.PyTorch). ArcFace in [insightface](https://github.com/deepinsight/insightface) uses Arc-Loss + Softmax, while ArcFace in [face.evoLVe.PyTorch](https://github.com/ZhaoJ9014/face.evoLVe.PyTorch) uses Arc-Loss + Focal-Loss. Lite.AI uses naming to make the necessary distinctions between models from different sources.
More details of Default Version APIs can be found at [api.default.md](https://github.com/DefTruth/lite.ai.toolkit/blob/main/docs/api/api.default.md). For example, the interface for YoloV5 is:
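The diff view elides the actual code block here. For orientation only, a detect-style interface in the Default Version APIs has roughly the following shape; this is a hedged sketch, not the authoritative signature — the method name, parameter names, and types are assumptions, and the linked api.default.md is the source of truth.

```cpp
// Hypothetical sketch of the YoloV5 interface (see api.default.md for the
// real signature); the detection-box type name here is an assumption.
void detect(const cv::Mat &mat, std::vector<lite::types::Boxf> &detected_boxes,
            float score_threshold, float iou_threshold, unsigned int topk);
```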
<summary> Expand for ONNXRuntime, MNN and NCNN version APIs.</summary>

### 4.2 ONNXRuntime Version APIs.

More details of ONNXRuntime Version APIs can be found at [api.onnxruntime.md](https://github.com/DefTruth/lite.ai.toolkit/blob/main/docs/api/api.onnxruntime.md). For example, the interface for YoloV5 is:
`docs/api/api.default.md`

# Default Version APIs.

More details of basic types for Default Version APIs can be found at [types](https://github.com/DefTruth/lite.ai.toolkit/blob/main/ort/core/ort_types.h). Note that Lite.AI.ToolKit uses `onnxruntime` as the default backend, because onnxruntime supports most of ONNX's operators. `(TODO: Add detailed API documentation)`
`docs/api/api.onnxruntime.md`

# ONNXRuntime Version APIs.

More details of basic types for ONNXRuntime Version APIs can be found at [ort_types](https://github.com/DefTruth/lite.ai.toolkit/blob/main/ort/core/ort_types.h). `(TODO: Add detailed API documentation).`
`docs/hub/lite.ai.toolkit.hub.onnx.md`

# Lite.AI.ToolKit.Hub.ONNX

Correspondence between the classes in *Lite.AI.ToolKit* and pretrained model files can be found in this document. For example, the pretrained model files for *lite::cv::detection::YoloV5* and *lite::cv::detection::YoloX* are listed as follows.

| Class | Pretrained ONNX Files | Rename or Converted From (Repo) | Size |
0 commit comments