Adding "phi4-mini" in samples, codespaces and docs #91
Conversation
- Updated model references in the devcontainer and multiple Program.cs files to use the new phi4-mini model. Enhanced comments to reflect the changes and added a section in README.md to highlight the new model support.
- Updated documentation and code to replace references to the "llama3.2" model with "phi4-mini" in the setup instructions and example code.
- Included a link to the Phi Cookbook for hands-on examples with Microsoft's Phi models in the README.md file.
👋 Thanks for contributing @elbruno! We will review the pull request and get back to you soon.

Check Country Locale in URLs: We have automatically detected added country locale to URLs in your files. Check the file paths and associated URLs inside them.

Check Broken Paths: We have automatically detected the following broken relative paths in your files. Check the file paths and associated broken paths inside them.
Pull Request Overview
This pull request updates the project's model usage from phi3.5 and llama3.2 to the new phi4-mini model, ensuring consistency across sample apps, configuration files, and documentation.
- Updated model identifiers in multiple source code files for chat clients.
- Adjusted documentation and dev container configurations to support phi4-mini.
- Refined instructions in readmes and getting-started docs to reflect the new model.
Reviewed Changes
Copilot reviewed 9 out of 9 changed files in this pull request and generated 1 comment.
File | Description |
---|---|
README.md | Added a "What's New" section highlighting phi4-mini support and updated the Phi Cookbook link. |
03-CoreGenerativeAITechniques/src/BasicChat-04OllamaSK/Program.cs | Updated model ID from "llama3.2" to "phi4-mini". |
02-SetupDevEnvironment/readme.md | Revised pull commands and instructions to reference phi4-mini instead of phi3.5/llama3.2. |
03-CoreGenerativeAITechniques/src/BasicChat-03Ollama/Program.cs | Updated the client instantiation to use phi4-mini and enhanced the prompt messaging. |
02-SetupDevEnvironment/src/BasicChat-03Ollama/Program.cs | Changed the model used from "llama3.2" to "phi4-mini" but left an inconsistent comment reference. |
03-CoreGenerativeAITechniques/src/RAGSimple-10SKOllama/Program.cs | Updated the chat model to phi4-mini for consistency. |
03-CoreGenerativeAITechniques/src/RAGSimple-15Ollama-DeepSeekR1/Program.cs | Retained usage of "deepseek-r1" for this specific implementation. |
02-SetupDevEnvironment/getting-started-ollama.md | Updated descriptive text to include phi4-mini, though one instruction still references phi3.5. |
Comments suppressed due to low confidence (1)
02-SetupDevEnvironment/getting-started-ollama.md:66
- The instruction text is inconsistent as it mentions trying the 'phi3.5' model while also listing 'phi4-mini' as the new supported model. Update the text to consistently promote 'phi4-mini'.
> One of the cool things about Ollama is that it's easy to change models. The sample apps use models like "phi4-mini" or "llama3.2". Let's switch it up and try the "phi3.5" model instead.
```csharp
// to test other models you can download them with the command "ollama pull <modelId>"
// in example: "ollama pull deepseek-r1" or "ollama pull phi4-mini" (for the phi4-mini model which is still being tested)
// in example: "ollama pull deepseek-r1" or "ollama pull phi3.5"
```
The inline comment references pulling the 'phi3.5' model, but the code instantiates the client with 'phi4-mini'. Please update the comment to ensure consistency with the code.
```diff
- // in example: "ollama pull deepseek-r1" or "ollama pull phi3.5"
+ // in example: "ollama pull deepseek-r1" or "ollama pull phi4-mini"
```
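For context, the comment being corrected sits directly above the chat client construction in the sample. The following is a minimal sketch of what such a Program.cs might look like, assuming the samples use the Microsoft.Extensions.AI abstractions with an Ollama-backed client and a local Ollama server on the default port; the class and method names shown here vary across preview versions of the library and are illustrative, not the repository's exact code:

```csharp
using Microsoft.Extensions.AI;

// to test other models you can download them with the command "ollama pull <modelId>"
// in example: "ollama pull deepseek-r1" or "ollama pull phi4-mini"
IChatClient client = new OllamaChatClient(
    new Uri("http://localhost:11434/"), // default local Ollama endpoint (assumed)
    "phi4-mini");                       // model id must match a model already pulled via "ollama pull"

var response = await client.GetResponseAsync("What is generative AI?");
Console.WriteLine(response);
```

The review point above is exactly this pairing: the model id passed to the client and the model id named in the adjacent comment should agree, otherwise a reader following the comment pulls a model the code never uses.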
Check Broken URLs: We have automatically detected the following broken URLs in your files. Review and fix the paths to resolve this issue. Check the file paths and associated broken URLs inside them.
This pull request focuses on updating the models used in various parts of the project from "phi3.5" to "phi4-mini". The changes span multiple files, including configuration files, documentation, and source code.
Model Updates:

- `.devcontainer/Ollama/devcontainer.json`: Updated the `postCreateCommand` to pull the "phi4-mini" model instead of "phi3.5".
- `02-SetupDevEnvironment/getting-started-ollama.md`: Updated documentation to mention the "phi4-mini" model and its usage in sample apps. [1] [2]
- `02-SetupDevEnvironment/readme.md`: Updated instructions to pull the "phi4-mini" model for specific lessons and projects.
- `02-SetupDevEnvironment/src/BasicChat-03Ollama/Program.cs`: Changed the model used in the example chat client from "llama3.2" to "phi4-mini".
- `03-CoreGenerativeAITechniques/src/BasicChat-03Ollama/Program.cs`: Updated the chat client to use the "phi4-mini" model instead of "llama3.2".
- `03-CoreGenerativeAITechniques/src/BasicChat-04OllamaSK/Program.cs`: Set the model ID to "phi4-mini".
- `03-CoreGenerativeAITechniques/src/RAGSimple-10SKOllama/Program.cs`: Updated the model ID for chat to "phi4-mini".

Documentation Enhancements:

- `README.md`: Added a new section highlighting the support for the "phi4-mini" model and its benefits. Also included a link to the Phi Cookbook for more information. [1] [2]