
[Enhancement] Request to add "connection: close" response header via http/1.1 connection after graceful shutdown phase started #40802


Closed
SaeGon-Heo opened this issue May 17, 2024 · 1 comment
Labels
for: external-project For an external project and not something we can fix status: invalid An issue that we don't feel is valid

Comments

@SaeGon-Heo

Hello. I have a request about graceful shutdown on HTTP/1.1 connections.
I thought this request fits on the Spring Boot side, but apologies if Spring Boot is not responsible for this behavior.
Apologies as well for any confusing wording in this issue.

While configuring a k8s environment that runs a Spring Boot server behind an Application Load Balancer (a.k.a. ALB),
I heard there is a chance that clients can get a 503 from the ALB.

In my case the ALB communicates with a pod using HTTP/1.1.
When I do a rolling update of pods with a new image in k8s, there are necessarily newly created pods and terminating pods.
In the terminating pods, Spring Boot is in the graceful shutdown phase, so new connections to them get an RST packet.
In addition, responses to requests processed after the graceful shutdown phase has started do not include a Connection response header.

Because of the above:

  1. The ALB may try to open a new connection to a Spring Boot instance that is in the graceful shutdown phase, get an RST, and respond to the client with a 503.
  2. The ALB may get a response with no Connection header from a Spring Boot instance in the graceful shutdown phase.
    So, when the next request is fired, the ALB tries to reuse the current connection, because the previous exchange was recognized as keep-alive, gets an RST, and responds to the client with a 503.

This could be mitigated by making the ALB retry against another instance several times until it gets a successful response rather than an RST.
But that behavior seems fragile, because the ALB may get an RST for other reasons as well.

If I use HTTP/2, the connection asynchronously receives a GOAWAY frame when the graceful shutdown phase begins.
That behavior is implemented by Tomcat, not Spring Boot.
But I can't use HTTP/2 in my environment right now.

It seems Go had a similar issue before, and its server now has code that adds a Connection: close response header during the shutdown phase:

https://cs.opensource.google/go/go/+/refs/tags/go1.22.3:src/net/http/server.go;l=1512
https://cs.opensource.google/go/go/+/refs/tags/go1.22.3:src/net/http/server.go;l=1355
https://cs.opensource.google/go/go/+/refs/tags/go1.22.3:src/net/http/server.go;l=1266
https://cs.opensource.google/go/go/+/refs/tags/go1.22.3:src/net/http/server.go;l=3385

This might need to be done on the Tomcat (or other server) side.
But I thought the graceful shutdown phase itself is controlled by Spring Boot via org.springframework.boot.web.embedded.tomcat.GracefulShutdown in the org.springframework.boot:spring-boot dependency.

Or do I have to take this to the server communities, like Tomcat, Jetty, Netty, and Undertow?

@spring-projects-issues spring-projects-issues added the status: waiting-for-triage An issue we've not yet triaged label May 17, 2024
@wilkinsona
Member

Or do I have to take this to the server communities, like Tomcat, Jetty, Netty, and Undertow?

Yes, I'm afraid you will. Please see #40108 (comment) for a description of each server's current behaviour.

@wilkinsona wilkinsona closed this as not planned Won't fix, can't repro, duplicate, stale May 17, 2024
@wilkinsona wilkinsona added status: invalid An issue that we don't feel is valid for: external-project For an external project and not something we can fix and removed status: waiting-for-triage An issue we've not yet triaged labels May 17, 2024

3 participants