Bindings

Precondition

Build the target libchatllm; a quick load check to verify the result is sketched after the platform-specific steps below.

Windows:

Assume MSVC is used.

  1. Build target libchatllm:

    cmake --build build --config Release --target libchatllm
  2. Copy libchatllm.dll, libchatllm.lib and ggml.dll to the bindings directory.

Linux/MacOS:

  1. Build target libchatllm:

    cmake --build build --target libchatllm
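
To sanity-check the result, you can try loading the freshly built library from Python before touching any binding. This is only a sketch: the output paths below are assumptions about where CMake places the artifacts, so adjust them to your build tree.

  import ctypes
  import sys

  # Pick the platform-specific library; the paths are assumptions, so
  # adjust them to wherever your build actually puts the outputs.
  if sys.platform == 'win32':
      name = 'libchatllm.dll'               # e.g. after copying it to bindings
  elif sys.platform == 'darwin':
      name = 'build/lib/libchatllm.dylib'
  else:
      name = 'build/lib/libchatllm.so'

  # Raises OSError if the library or one of its dependencies (e.g. ggml) is missing.
  ctypes.CDLL(name)
  print('loaded:', name)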

Python

Command line

Run chatllm.py with exactly the same command line options.

For example,

  • Linux: python3 chatllm.py -i -m path/to/model

  • Windows: python chatllm.py -i -m path/to/model

If OSError: exception: access violation reading 0x0000000000000000 occurs, try:

Web demo

There is also a chatbot powered by Streamlit.

To start it:

streamlit run chatllm_st.py -- -i -m path/to/model

Note: "STOP" function is not implemented yet.

OpenAI/Ollama Compatible API

Here is a server providing an OpenAI/Ollama-compatible API. Note that most of the request parameters are ignored.

openai_api.py supports loading several types of models for chatting, code completion (FIM), text embedding, etc. For example, load two models, one for chatting and one for FIM:

python openai_api.py ---chat path/to/deepseekcoder-1.3b.bin ---fim /path/to/deepseekcoder-1.3b-base.bin

Additional arguments for each model can be specified too. For example:

python openai_api.py ---chat path/to/chat/model --top_k 2 ---fim /path/to/fim/model --temp 0.8

openai_api.py uses the API path to select between models: when the path ends with /generate, the code completion model is selected; when it ends with /completions, the chatting model is selected.
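
For instance, once the server is running, a chat request can be sent using the standard OpenAI request shape. The sketch below uses only the Python standard library; the host, port (localhost:3000) and the exact mount path are assumptions, so check the script's startup output for the real address.

  import json
  from urllib import request

  payload = {
      'model': 'chat',  # most parameters, including this one, are ignored
      'messages': [{'role': 'user', 'content': 'Hello!'}],
  }
  # A path ending with /completions selects the chatting model.
  req = request.Request(
      'http://localhost:3000/v1/chat/completions',
      data=json.dumps(payload).encode('utf-8'),
      headers={'Content-Type': 'application/json'},
  )
  with request.urlopen(req) as resp:
      reply = json.loads(resp.read())
      print(reply['choices'][0]['message']['content'])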

Some base models that can be used for code completion:

This module provides enough of the Ollama API to be used as an emulated Ollama model provider in Visual Studio Code Copilot. For example, start the server with a model:

python openai_api.py ---chat :qwen2.5

Then select the model from the Ollama provider in Copilot.
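
Before pointing Copilot at the server, you can check that the emulated provider is reachable by listing models the way Ollama clients do, via GET /api/tags. In this sketch the host and port are assumptions (11434 is Ollama's conventional port); use whatever address the server actually reports.

  import json
  from urllib import request

  # Ollama clients (including Copilot's provider) discover models via /api/tags.
  with request.urlopen('http://localhost:11434/api/tags') as resp:
      for model in json.loads(resp.read()).get('models', []):
          print(model.get('name'))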

JavaScript/TypeScript

Command line

Run chatllm.ts with exactly the same command line options using Bun:

bun run chatllm.ts -i -m path/to/model

WARNING: Bun looks buggy on Linux.

Other Languages

libchatllm can be utilized by all languages that can call into dynamic libraries.
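
As a sketch of what such an FFI call can look like, here is Python driving the library through ctypes. The chatllm_* names, callback shapes, and types below are assumptions modeled on the C example in bindings; verify them against libchatllm.h before relying on this.

  import ctypes
  import sys

  lib = ctypes.CDLL('libchatllm.dll' if sys.platform == 'win32' else './libchatllm.so')

  # Assumed callback shapes: one printer for streamed output chunks and
  # one notifier for the end of a generation.
  PRINT_CB = ctypes.CFUNCTYPE(None, ctypes.c_void_p, ctypes.c_int, ctypes.c_char_p)
  END_CB = ctypes.CFUNCTYPE(None, ctypes.c_void_p)

  @PRINT_CB
  def on_print(user_data, print_type, utf8_str):
      sys.stdout.write(utf8_str.decode('utf-8'))
      sys.stdout.flush()

  @END_CB
  def on_end(user_data):
      print()

  # Declare types so the opaque handle is not truncated on 64-bit platforms.
  lib.chatllm_create.restype = ctypes.c_void_p
  lib.chatllm_append_param.argtypes = [ctypes.c_void_p, ctypes.c_char_p]
  lib.chatllm_start.argtypes = [ctypes.c_void_p, PRINT_CB, END_CB, ctypes.c_void_p]
  lib.chatllm_user_input.argtypes = [ctypes.c_void_p, ctypes.c_char_p]

  obj = lib.chatllm_create()
  for arg in ('-m', 'path/to/model'):       # same options as on the command line
      lib.chatllm_append_param(obj, arg.encode('utf-8'))
  lib.chatllm_start(obj, on_print, on_end, None)
  lib.chatllm_user_input(obj, b'Hello!')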

C

  • Linux

    1. Build bindings/main.c:

      export LD_LIBRARY_PATH=.:$LD_LIBRARY_PATH
      gcc main.c libchatllm.so
    2. Test a.out with exactly the same command line options.

  • Windows:

    1. Build bindings\main.c:

      cl main.c libchatllm.lib
    2. Test main.exe with exactly the same command line options.

Pascal (Delphi/FPC)

Pascal binding is also available.

Examples:

Nim

Examples:

  • main.nim, which highlights code snippets.

    Build:

    nim c -d:release -d:ssl main.nim
    

Others