From ae48099f857e91c74afac57ad9ec7c53c6cc537e Mon Sep 17 00:00:00 2001
From: Yuan Tang
Date: Tue, 10 Dec 2024 20:37:55 -0500
Subject: [PATCH] Change SIG Apps to SIG Network

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index eecb37d2..21bea826 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,6 @@
 # Kubernetes LLM Instance Gateway
 
-The LLM Instance Gateway came out of [wg-serving](https://github.com/kubernetes/community/tree/master/wg-serving) and is sponsored by [SIG Apps](https://github.com/kubernetes/community/blob/master/sig-apps/README.md#llm-instance-gateway). This repo contains: the load balancing algorithm, [ext-proc](https://www.envoyproxy.io/docs/envoy/latest/configuration/http/http_filters/ext_proc_filter) code, CRDs, and controllers to support the LLM Instance Gateway.
+The LLM Instance Gateway came out of [wg-serving](https://github.com/kubernetes/community/tree/master/wg-serving) and is sponsored by [SIG Network](https://github.com/kubernetes/community/blob/master/sig-network/README.md#gateway-api-inference-extension). This repo contains: the load balancing algorithm, [ext-proc](https://www.envoyproxy.io/docs/envoy/latest/configuration/http/http_filters/ext_proc_filter) code, CRDs, and controllers to support the LLM Instance Gateway.
 
 This Gateway is intented to provide value to multiplexed LLM services on a shared pool of compute. See the [proposal](https://github.com/kubernetes-sigs/wg-serving/tree/main/proposals/012-llm-instance-gateway) for more info.
 