feat: allow changing the Copilot model used #2 #373


Merged: 6 commits merged into zbirenbaum:master on Mar 21, 2025

Conversation

AntoineGS
Collaborator

@AntoineGS AntoineGS commented Mar 12, 2025

Depends on PR #372
Adds a copilot_model option, used as follows:

return {
  "zbirenbaum/copilot.lua",
  config = function()
    require("copilot").setup {
      copilot_model = "gpt-4o-copilot",
    }
  end,
}

Big thanks to @trixnz for figuring out the required configs!

Fixes #365

language server now expects to be run directly through language-server.js
@AntoineGS
Collaborator Author

I was not sure about the config placement, let me know if you prefer it elsewhere!
Also, passing vscode-chat does not seem to cause problems with the 3.5 model, so I figured I would leave it in regardless of the model.

@HOnatToprak

When I set a wrong model name, I can see in the LSP logs that it falls back to the default one. It would be nice to validate it on setup and print an error message.

@AntoineGS
Collaborator Author

When I set a wrong model name, I can see in the LSP logs that it falls back to the default one. It would be nice to validate it on setup and print an error message.

Hmm, the downside to validating it before the LSP is that we would need to add future models whenever they come out. Though it took some time for 4o to come out, so it might not be much maintenance.

I'll wait for @zbirenbaum to chip in on this one as I am divided on this!
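For illustration, if setup-time validation were ever added, one lightweight approach is an allowlist with a warning fallback. This is only a hypothetical sketch: the `known_models` table and the `validate_model` helper are assumptions for this example, not part of copilot.lua, and keeping the allowlist current is exactly the maintenance cost discussed above.

```lua
-- Hypothetical sketch: validate copilot_model at setup time.
-- The allowlist is an assumption based on this thread, not an official list.
local known_models = {
  ["gpt-4o-copilot"] = true,
}

local function validate_model(model)
  if model == nil or model == "" or known_models[model] then
    return model
  end
  vim.notify(
    string.format(
      "copilot.lua: unknown copilot_model %q; the language server will fall back to its default",
      model
    ),
    vim.log.levels.WARN
  )
  return nil -- let the language server pick its own default
end
```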

@praveenperera

Does anyone know what to set it to for Claude 3.7? And is there a way to check which model is being used?

@AntoineGS
Collaborator Author

Claude 3.7 is only available for chat, not for autocomplete.
For chat you could check out CopilotChat.nvim, which I believe supports Claude 3.7.
The example config above uses the latest model you can use, even in VSCode.
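As for checking which model is actually in use, besides reading the LSP logs (:LspLog), you could inspect the settings the Copilot client was started with. A rough sketch, assuming the model is forwarded somewhere under the client's settings table (the exact path depends on the plugin version):

```lua
-- Hypothetical: dump the settings of the running Copilot LSP client.
-- Uses vim.lsp.get_clients (Neovim 0.10+); on older versions use
-- vim.lsp.get_active_clients instead.
for _, client in ipairs(vim.lsp.get_clients({ name = "copilot" })) do
  print(vim.inspect(client.config.settings))
end
```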

@AntoineGS changed the title from "feat: allow changing the Copilot model used" to "feat: allow changing the Copilot model used #2" on Mar 18, 2025
@luanlouzada

nothing yet?

@grantjayy

grantjayy commented Mar 18, 2025

Can we get this merged @zbirenbaum? I was surprised to learn that I've been using GPT 3.5 this whole time

@AntoineGS
Collaborator Author

Assuming @zbirenbaum is too busy to review and merge the PRs (life happens!), you can always use my repo in your copilot.lua configuration:

return {
  "AntoineGS/copilot.lua",
  config = function()
    require("copilot").setup {
      copilot_model = "gpt-4o-copilot",
    }
  end,
}

It also includes some other changes that are either in open PRs or will be in upcoming PRs:
master...AntoineGS:copilot.lua:master

@grantjayy

Love it! Thank you!

@AntoineGS
Collaborator Author

No problem,
@zbirenbaum I would be happy to help maintain the project if you are ever looking for another maintainer!

@jrock2004

Is this going to add support for models like Claude that are supported in VSCode?

@AntoineGS
Collaborator Author

Claude 3.7 is only available for chat, not for autocomplete. For chat you could check out CopilotChat.nvim, which I believe supports Claude 3.7. The example config above uses the latest model you can use, even in VSCode.

We should sticky this 😂

@jrock2004

Sorry, my bad for bringing up old wounds :)

@AntoineGS AntoineGS merged commit 71eeb95 into zbirenbaum:master Mar 21, 2025
@AntoineGS AntoineGS deleted the gpt4o branch March 21, 2025 00:20
Successfully merging this pull request may close these issues.

Feature request: GPT-4o Copilot code completion model
6 participants