Update API to match latest llama.cpp version #1991

Open · wants to merge 3 commits into main

Conversation

@mamei16 commented Apr 3, 2025

This PR updates llama_cpp.py so that it matches the llama.h API changes introduced in the following commits:
ggml-org/llama.cpp@e0dbec0
ggml-org/llama.cpp@8fcb563
ggml-org/llama.cpp@00d5380
ggml-org/llama.cpp@dd373dd
ggml-org/llama.cpp@b3de7ca
ggml-org/llama.cpp@2c3f8b8
ggml-org/llama.cpp@e0e912f

I couldn't find any example of how to handle deprecated methods in this project, so I added a @deprecated decorator to the methods in question.
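
For reference, here is a minimal sketch of what such a decorator could look like (illustrative only; the actual implementation in the PR may differ):

import functools
import warnings


def deprecated(reason: str):
    """Mark a binding as deprecated, emitting a DeprecationWarning on call."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            warnings.warn(
                f"{func.__name__} is deprecated: {reason}",
                DeprecationWarning,
                stacklevel=2,
            )
            return func(*args, **kwargs)
        return wrapper
    return decorator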

class llama_model_tensor_buft_override(ctypes.Structure):
    _fields_ = [
        ("pattern", ctypes.c_char_p),
        ("buft", ctypes.c_void_p),
    ]
@mamei16 (Author) commented:

I wasn't sure if c_void_p is the correct type to use here, so feel free to change it if there's a better alternative.
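
For context, buft corresponds to ggml_backend_buffer_type_t in llama.h, an opaque pointer type, so c_void_p should at least be ABI-compatible. A hypothetical usage sketch, assuming the override list is terminated by a NULL pattern as in llama.cpp (buft_ptr is a placeholder for a real buffer-type pointer):

# Hypothetical sketch: build a NULL-pattern-terminated array of overrides.
# "buft_ptr" stands in for a real ggml_backend_buffer_type_t obtained from
# a ggml backend API; the pattern is matched against tensor names.
overrides = (llama_model_tensor_buft_override * 2)()
overrides[0].pattern = b"blk\\.\\d+\\.ffn_.*"
overrides[0].buft = buft_ptr
# overrides[1] stays zero-initialized, so its NULL pattern ends the list.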

@jonathanreichhealthscope left a comment

Looks great. Seems like there are a couple of extra changes?

# LLAMA_API struct llama_model * llama_load_model_from_file(
#                          const char * path_model,
#                   struct llama_model_params   params);
@ctypes_function(
    "llama_load_model_from_file",
    [ctypes.c_char_p, llama_model_params],
    llama_model_p_ctypes,
)
def llama_load_model_from_file(
    path_model: bytes, params: llama_model_params, /
) -> Optional[llama_model_p]:
    ...

# LLAMA_API void llama_model_free(struct llama_model * model);
@ctypes_function(
    "llama_model_free",
    [llama_model_p_ctypes],
    None,
)
def llama_model_free(model: llama_model_p, /):
    ...
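
For illustration, these stub-style bindings would be used along these lines (the model path is hypothetical; llama_model_default_params is the existing default-params helper):

params = llama_model_default_params()
model = llama_load_model_from_file(b"/path/to/model.gguf", params)
if model is not None:
    llama_model_free(model)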

@mamei16 commented Apr 8, 2025

@jonathanreichhealthscope Thanks! What do you mean by that?

@jonathanreichhealthscope commented Apr 10, 2025

> @jonathanreichhealthscope Thanks! What do you mean by that?

Looks like there are a couple of extra API changes; that's what I was pointing at in my comment. For example, I think 'llama_load_model_from_file' has been deprecated in favor of the new 'llama_model_load_from_file' naming convention.
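
If so, the old name could stay as a thin deprecated alias, e.g. (a sketch reusing the @deprecated decorator from the PR description):

# Sketch: keep the deprecated name as a wrapper around the new binding.
@deprecated("Use llama_model_load_from_file instead")
def llama_load_model_from_file(
    path_model: bytes, params: llama_model_params, /
) -> Optional[llama_model_p]:
    return llama_model_load_from_file(path_model, params)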
