llama.vim : add classic vim support #9995


Merged
merged 20 commits into ggml-org:master on Oct 23, 2024

Conversation

@m18coppola (Contributor) commented Oct 22, 2024

@m18coppola closed this Oct 22, 2024
@m18coppola changed the title from "added classic vim support" to "llama.vim : add classic vim support [no ci]" Oct 22, 2024
@m18coppola reopened this Oct 22, 2024
@m18coppola mentioned this pull request Oct 22, 2024
@ggerganov (Member)

Testing with Vim 9.1, the first call to prop_add when displaying the ghost text causes the window to shift up a few lines. This happens when editing a file that is larger than the vertical screen size. Can you reproduce?
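
(For context, classic Vim renders this kind of ghost text through the text-properties API rather than Neovim's extmarks. Below is a minimal sketch of the calls involved, assuming Vim 9.0+ with virtual-text support; the property type name, highlight group, and placeholder text are illustrative, not the plugin's actual identifiers.)

```vim
" Register a text property type once; name and highlight are illustrative.
if !exists('s:ghost_prop_defined')
    call prop_type_add('llama_ghost', {'highlight': 'Comment'})
    let s:ghost_prop_defined = 1
endif

" Attach virtual text at the cursor. With the 'text' entry, prop_add()
" displays the string without inserting it into the buffer contents.
call prop_add(line('.'), col('.'), {
        \ 'type': 'llama_ghost',
        \ 'text': ' <suggested completion>',
        \ })
```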

Also the async jobs seem to create many scratch buffers with the following contents:

[screenshot: contents of the stray scratch buffers]

@m18coppola (Contributor, Author)

@ggerganov I fixed the scratch buffer issue; it was caused by a bad option passed to job_start(). I was unable to reproduce the window-shifting problem, though I suspect it may have been a side effect of the buffer spam. Please retest and see if this fixes your issue. If the window-shifting bug is still present after this fix, attach your vimrc and your vim --version output.
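
(For reference, the relevant knob is how job_start() routes the command's output: leaving 'out_io' at its default of 'pipe' and consuming the data in callbacks creates no buffers, whereas 'out_io': 'buffer' spawns a scratch buffer per job. A minimal sketch under that assumption; the callback names, command, and endpoint are placeholders, not the plugin's actual code.)

```vim
" Placeholder handlers for this sketch.
func! s:OnOutput(channel, msg)
    " Accumulate a chunk of the completion server's response.
endfunc

func! s:OnExit(job, status)
    " Parse the accumulated response and display the ghost text.
endfunc

" With 'out_io' set to 'pipe' (the default), output goes to the callbacks
" above; setting it to 'buffer' would create a new scratch buffer per job.
" The curl command and URL are illustrative (request body omitted).
let s:job = job_start(['curl', '--silent', 'http://127.0.0.1:8012/infill'], {
        \ 'out_cb':  function('s:OnOutput'),
        \ 'exit_cb': function('s:OnExit'),
        \ 'out_io':  'pipe',
        \ 'err_io':  'null',
        \ })
```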

@ggerganov (Member)

Both issues are fixed now.

@ggerganov changed the title from "llama.vim : add classic vim support [no ci]" to "llama.vim : add classic vim support" Oct 22, 2024
@ggerganov merged commit ac113a0 into ggml-org:master Oct 23, 2024
7 checks passed
@ggerganov (Member)

@m18coppola Thanks for implementing the classic Vim support!

In case you are using this plugin, I would appreciate any feedback about issues or ideas for improving it. So far, I think the biggest drawback compared to the original GitHub Copilot is that it tends to over-suggest in certain situations (i.e. it repeats lines that are already in the suffix). The indentation logic helped a lot to limit this, but it seems something is still missing. I'm not sure whether the sampler has to be improved in some way or whether it is a model problem. Anyway, if you have any thoughts, let me know.
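
(One client-side idea, purely as a hypothetical sketch and not something the plugin is confirmed to do: before rendering, trim the longest tail of the suggestion that already appears verbatim in the lines right below the cursor, so accepting the ghost text cannot duplicate the suffix. The function name below is made up for illustration.)

```vim
" Hypothetical post-processing step: drop trailing suggestion lines that
" merely repeat the lines immediately below the cursor.
func! s:TrimSuffixRepeats(suggestion) abort
    let l:sugg = copy(a:suggestion)
    let l:n = len(l:sugg)
    " Lines that already follow the cursor in the buffer.
    let l:suffix = getline(line('.') + 1, line('.') + l:n)
    let l:k = min([l:n, len(l:suffix)])
    while l:k > 0
        if l:sugg[l:n - l:k :] ==# l:suffix[: l:k - 1]
            call remove(l:sugg, l:n - l:k, -1)
            break
        endif
        let l:k -= 1
    endwhile
    return l:sugg
endfunc
```

Whether that duplication is better handled on the client like this or in the server's sampling is probably part of the open question above.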

arthw pushed a commit to arthw/llama.cpp that referenced this pull request Nov 15, 2024
* added classic vim support

* fixed ring update, removed blank line

* minor

* minor

* minor doc update

* removed unneeded var

* minor

* minor

* fixed job_start creating new scratch buffers

* fixed job_start creating new scratch buffers

* fixed ghost text indenting when expandtab is on

* removed unused code

* minor

* unified fim_on_exit

* minor

* vim ghost text rendering now uses pos_x and pos_y parameters

* renamed *_hlgroup to hlgroup_*

* renamed *_ghost_text to ghost_text_*, moved nvim/vim detection to llama#init()

* minor

---------

Co-authored-by: Michael Coppola <[email protected]>
arthw pushed a commit to arthw/llama.cpp that referenced this pull request Nov 18, 2024