Copilot CLI now supports BYOK and local models - GitHub Changelog
…This works with remote services like OpenAI and Azure OpenAI, as well as locally hosted models served by runtimes like Ollama, vLLM, and Foundry Local. See Using your own LLM models in GitHub Copilot CLI…
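What ties these providers together is the OpenAI-compatible chat completions API: local runtimes such as Ollama and vLLM expose the same request shape as the remote services, so a BYOK client only needs a different base URL (and, for remote services, an API key). As a rough sketch of that idea, the snippet below builds such a request payload; the base URL and model name are illustrative assumptions, not values from the changelog.

```python
import json

# Illustrative defaults, not from the changelog: Ollama serves an
# OpenAI-compatible API under /v1 on port 11434 by default.
OLLAMA_BASE_URL = "http://localhost:11434/v1"
MODEL_NAME = "llama3"  # example local model tag


def build_chat_request(prompt: str, model: str = MODEL_NAME) -> dict:
    """Build an OpenAI-compatible chat completions payload.

    The same payload shape works against OpenAI, Azure OpenAI, or a
    local server; only the endpoint and credentials change.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


payload = build_chat_request("Explain BYOK in one sentence.")
print(json.dumps(payload, indent=2))
```

This is only a sketch of the wire format these providers share, not how Copilot CLI itself is configured; see the linked docs for the actual setup.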