Spring boot application does not get shut down if you are using ScheduledExecutorService to start Kafka server #4008
To reproduce this issue, create a Spring Boot application with an Apache Kafka server. Make sure autoStartup is false ('kafkaMessageDrivenChannelAdapterListenerContainerSpec.autoStartup(false)'). Then create a bean that can start and stop the server, as done in KafkaServerTask.
As I said in the other issue, it's not clear why you think this is a bug in Spring Boot and you haven't answered the questions that I asked. They were:
I'm not sure that I can piece together what you're doing from your code and description above. Please provide a sample project that reproduces the problem so that we can be sure that we're diagnosing the problem that you're seeing.
I might be wrong, but from debugging it looks like the Spring Boot application stops the Kafka server on shutdown, which seems to be creating the issue. It might be related to spring-integration-kafka. KafkaServerTask is defined as a bean in another part of the application and it has 'stopKafkaServer' as a destroy method. Yes, I have used the initMethod and destroyMethod attributes on the KafkaServerTask bean.
Also, if I remove the code 'kafkaMessageDrivenChannelAdapter.start();' from KafkaServerTask, everything works fine.
I was able to reproduce it with a simple Spring Boot application with Kafka integration. Please set up Apache Kafka as described at http://kafka.apache.org. The code files are below:
Application.java:
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.EnableAspectJAutoProxy;
import org.springframework.scheduling.annotation.EnableAsync;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.web.servlet.config.annotation.EnableWebMvc;
@EnableScheduling
@EnableAsync
@Configuration
@ComponentScan("com")
@EnableAutoConfiguration
@EnableAspectJAutoProxy
@SpringBootApplication
@EnableWebMvc
public class Application {
public static void main(String[] args) {
SpringApplication.run(Application.class, args);
}
}

KafkaConfiguration:
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
public class KafkaConfiguration {
/*
* The address for apache kafka server.
*/
@Value("${Kafka.brokerAddress}")
private String bootstrapServer;
@Value("${Kafka.serializerClass}")
private String serializerClass;
@Value("${Kafka.keySerializer}")
private String keySerializer;
@Value("${Kafka.valueSerializer}")
private String valueSerializer;
/*
* KafkaProducer holds the configuration required for sending the message to
* kafka server. This is a default implementation for Kaishi-kafka producer.
*/
@Bean(name = "kafkaProducer")
public KafkaProducer<String, String> producer() {
Properties props = new Properties();
props.put("bootstrap.servers", bootstrapServer);
props.put("serializer.class", serializerClass);
props.put("key.serializer", keySerializer);
props.put("value.serializer", valueSerializer);
return new KafkaProducer<>(props);
}
}

KafkaMessageConsumerConfiguration:
package com.config;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.DependsOn;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.kafka.Kafka;
import org.springframework.integration.dsl.kafka.KafkaMessageDrivenChannelAdapterSpec.KafkaMessageDrivenChannelAdapterListenerContainerSpec;
import org.springframework.integration.kafka.core.ConnectionFactory;
import org.springframework.integration.kafka.core.DefaultConnectionFactory;
import org.springframework.integration.kafka.core.ZookeeperConfiguration;
import org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter;
import org.springframework.integration.kafka.listener.MetadataStoreOffsetManager;
import org.springframework.integration.kafka.listener.OffsetManager;
import org.springframework.integration.kafka.support.ZookeeperConnect;
import org.springframework.integration.scheduling.PollerMetadata;
import org.springframework.scheduling.support.PeriodicTrigger;
import com.test.KafkaServerTask;
import com.test.TestConsumerServiceImpl;
import kafka.api.OffsetRequest;
/**
* KafkaMessageConsumerConfiguration is a helper class for building a
* kafka message driven adapter. A class extending it should call the super
* methods for implementation.
*
*/
@Configuration
public class KafkaMessageConsumerConfiguration {
@Value("mediaConsumerListeningFromKafkaResults")
private String mediaConsumerListeningFromKafkaResults;
/*
* The address for kafka zookeeper.
*/
@Value("${Kafka.zookeeperAddress}")
public String zookeeperAddress;
/*
* The method name of the consumer which will be invoked when a message arrives
* at the kafka queue.
*/
@Value("${Kafka.consumeMessageMethodName}")
private String consumeMessageMethodName;
/*
* Bean for establishing connection to kafka server.
*/
@Bean()
public ConnectionFactory connectionFactory() {
return new DefaultConnectionFactory(new ZookeeperConfiguration(new ZookeeperConnect(zookeeperAddress)));
}
/*
* The poller defines the properties for polling the request from kafka
* server.
*/
@Bean(name = PollerMetadata.DEFAULT_POLLER)
public PollerMetadata defaultPoller() {
PollerMetadata pollerMetadata = new PollerMetadata();
pollerMetadata.setTrigger(new PeriodicTrigger(10));
return pollerMetadata;
}
/*
* OffsetManager keeps track of the messages that have been sent and received
* from kafka server.
*/
@Bean
public OffsetManager offsetManager(ConnectionFactory connectionFactory) {
MetadataStoreOffsetManager offsetManager = new MetadataStoreOffsetManager(connectionFactory);
/*
* A string that uniquely identifies the group of consumer processes to
* which this consumer belongs. By setting the same group id multiple
* processes indicate that they are all part of the same consumer group.
*/
offsetManager.setConsumerId(getConsumerGroupId());
/*
* smallest : automatically reset the offset to the smallest offset
* largest : automatically reset the offset to the largest offset
*/
// offsetManager.setReferenceTimestamp(OffsetRequest.LatestTime());
offsetManager.setReferenceTimestamp(OffsetRequest.LatestTime());
return offsetManager;
}
/*
* Configuration for fetching the messages from kafka server.
*/
@Bean
public IntegrationFlow consumerIntegrationFlow() {
KafkaMessageDrivenChannelAdapterListenerContainerSpec kafkaMessageDrivenChannelAdapterListenerContainerSpec =
    Kafka.messageDriverChannelAdapter(connectionFactory(), getConsumerTopic()).id(getKafkaAdpaterId());
return IntegrationFlows
    .from(kafkaMessageDrivenChannelAdapterListenerContainerSpec
        .autoStartup(false)
        .autoCommitOffset(false)
        .payloadDecoder(String::new)
        .keyDecoder(b -> Integer.valueOf(new String(b)))
        .configureListenerContainer(c -> c
            .offsetManager(offsetManager(connectionFactory()))
            .maxFetch(100)))
    .channel(c -> c.queue(getConsumerListeningFromKafkaResults()))
    .handle(getConsumerService(), consumeMessageMethodName)
    .get();
}
@Bean(name = "consumerService")
@DependsOn({ "offsetManager", "connectionFactory" })
public TestConsumerServiceImpl getConsumerService() {
TestConsumerServiceImpl mediaConsumerService = new TestConsumerServiceImpl();
return mediaConsumerService;
}
/*
* Bean responsible for start/stop/restart of kafka server.
*/
@Bean(destroyMethod="stopKafkaServer")
@DependsOn("consumerIntegrationFlow")
public KafkaServerTask kafkaServerTask(KafkaMessageDrivenChannelAdapter kafkaMessageDrivenChannelAdapter) {
KafkaServerTask kafkaServerTask = new KafkaServerTask();
kafkaServerTask.setKafkaMessageDrivenChannelAdapter(kafkaMessageDrivenChannelAdapter);
return kafkaServerTask;
}
protected String getConsumerGroupId() {
return "testConsumer";
}
protected String getConsumerListeningFromKafkaResults() {
return mediaConsumerListeningFromKafkaResults;
}
protected String getConsumerTopic() {
return "mediaKafkaKaishiTopic";
}
protected String getKafkaAdpaterId() {
return "kafkaMessageDrivenChannelAdapter";
}
protected Integer getPartitionId() {
return new Integer("0");
}
}

KafkaProdcuerConfiguration:
package com.config;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.DependsOn;
import com.test.TestProducerServiceImpl;
@Configuration
public class KafkaProdcuerConfiguration {
@Autowired
private KafkaProducer<String, String> kafkaProducer;
@Bean()
@DependsOn()
public TestProducerServiceImpl kafkaProducerService() {
TestProducerServiceImpl testProducerServiceImpl = new TestProducerServiceImpl();
testProducerServiceImpl.setKafkaProducer(kafkaProducer);
return testProducerServiceImpl;
}
}

KafkaServerTask:
package com.test;
import org.apache.log4j.Logger;
import org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter;
import org.springframework.scheduling.annotation.Async;
import org.springframework.scheduling.annotation.Scheduled;
/*
* KafkaServerTask is responsible for starting and stopping the kafka server.
*/
public class KafkaServerTask {
private static final Logger logger = Logger.getLogger(KafkaServerTask.class.getName());
KafkaMessageDrivenChannelAdapter kafkaMessageDrivenChannelAdapter;
public void setKafkaMessageDrivenChannelAdapter(KafkaMessageDrivenChannelAdapter kafkaMessageDrivenChannelAdapter) {
this.kafkaMessageDrivenChannelAdapter = kafkaMessageDrivenChannelAdapter;
}
/*
* Bean start up method.
*
* Start the kafka server.
*/
@Scheduled(fixedDelay = 600000)
@Async
public void startKafkaServer() {
if (!kafkaMessageDrivenChannelAdapter.isRunning()) {
try {
kafkaMessageDrivenChannelAdapter.start();
logger.info("kafka server started");
} catch (Exception exp) {
logger.info("Failed to start the kafka server. " + exp.getMessage());
}
}
}
/*
* Bean destroy method.
*
* Stop the kafka server.
*/
public void stopKafkaServer() {
try {
kafkaMessageDrivenChannelAdapter.stop();
logger.info("kafka server stoped");
} catch (Exception exp) {
logger.info("Failed to stop the kafka server. " + exp.getMessage());
}
}
}

TestConsumerServiceImpl:
package com.test;
public class TestConsumerServiceImpl {
public void consumeMessage(String message) {
System.out.println("Message received." + message);
}
}

TestController:
package com.test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.ResponseBody;
@Controller
public class TestController {
@Autowired
TestProducerServiceImpl kafkaProducerService;
@RequestMapping(value = "/hello", method = RequestMethod.GET)
public @ResponseBody String getPage() {
kafkaProducerService.sendMessage("mediaKafkaKaishiTopic", "testing....");
return "Test message";
}
}

TestProducerServiceImpl:
package com.test;
import org.apache.kafka.clients.producer.Callback;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;
import org.apache.log4j.Logger;
/**
* The base class for all the kafka producers. The class holds the generic
* implementation for the methods that are used while sending the message to
* Kafka.
*
*/
public class TestProducerServiceImpl {
private static final Logger logger = Logger.getLogger(TestProducerServiceImpl.class.getName());
protected KafkaProducer<String, String> kafkaProducer;
public void setKafkaProducer(KafkaProducer<String, String> kafkaProducer) {
this.kafkaProducer = kafkaProducer;
}
/*
* Send message to kafka server.
*
*/
public void sendMessage(String topic, String message) {
logger.info(String.format("Processing kafka send meesage request for topic '%s' .", topic));
ProducerRecord<String, String> record = new ProducerRecord<String, String>(topic, message);
kafkaProducer.send(record, new Callback() {
public void onCompletion(RecordMetadata metadata, Exception e) {
if (e != null) {
logger.info("Error while sending message to Kafka: " + e.getMessage());
} else {
logger.info("kafka meesage sent. Topic: " + topic + ". Meessage: " + message + ".");
}
}
});
}
}

application.yaml:
pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-parent</artifactId>
<version>Angel.SR3</version>
<relativePath />
</parent>
<name>kafka-service</name>
<groupId>com.bcgdv</groupId>
<artifactId>core-kafka-service</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<properties>
<start-class>com.bcgdv.kaishi.kafka.Application</start-class>
<spring.integration.kafka.version>1.2.0.RELEASE</spring.integration.kafka.version>
</properties>
<repositories>
<repository>
<id>spring-milestones</id>
<url>http://repo.spring.io/libs-milestone/</url>
</repository>
</repositories>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-actuator</artifactId>
<exclusions>
<exclusion>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-logging</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
<exclusions>
<exclusion>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-tomcat</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-jetty</artifactId>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.integration</groupId>
<artifactId>spring-integration-kafka</artifactId>
<version>${spring.integration.kafka.version}</version>
<exclusions>
<exclusion>
<artifactId>jms</artifactId>
<groupId>javax.jms</groupId>
</exclusion>
<exclusion>
<artifactId>jmxri</artifactId>
<groupId>com.sun.jmx</groupId>
</exclusion>
<exclusion>
<artifactId>jmxtools</artifactId>
<groupId>com.sun.jdmk</groupId>
</exclusion>
<exclusion>
<artifactId>log4j</artifactId>
<groupId>log4j</groupId>
</exclusion>
<exclusion>
<artifactId>slf4j-log4j12</artifactId>
<groupId>org.slf4j</groupId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework.integration</groupId>
<artifactId>spring-integration-stream</artifactId>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.springframework.integration</groupId>
<artifactId>spring-integration-java-dsl</artifactId>
<version>1.1.0.M1</version>
</dependency>
<dependency>
<groupId>commons-logging</groupId>
<artifactId>commons-logging</artifactId>
<version>1.1.2</version>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjweaver</artifactId>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk</artifactId>
<version>1.10.11</version>
<exclusions>
<exclusion>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
<version>2.4</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</build>
</project>

Start the server and hit the URL 'http://localhost:8080/hello' to check whether Kafka is working or not. Then stop the server with curl, e.g. 'curl -X POST http://localhost:9012/shutdown'. The Spring context closes, but certain threads remain active.
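Since the report is that certain threads remain active, one quick way to see which non-daemon threads keep the JVM alive after the context closes is to list them. The small utility below is purely illustrative and is not part of the reported project:

import java.util.Map;

public class ThreadDumpUtil {

    /*
     * Print every live non-daemon thread. Apart from main/DestroyJavaVM,
     * any thread listed here is what prevents the JVM from exiting.
     */
    public static void dumpNonDaemonThreads() {
        Map<Thread, StackTraceElement[]> traces = Thread.getAllStackTraces();
        for (Thread thread : traces.keySet()) {
            if (!thread.isDaemon()) {
                System.out.println(thread.getName() + " (" + thread.getState() + ")");
            }
        }
    }
}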
hey @suchitgupta, regarding the following code:
public void startKafkaServer() {
final Runnable startKafkaServerTask = new Runnable() {
public void run() {
if (!kafkaMessageDrivenChannelAdapter.isRunning()) {
try {
kafkaMessageDrivenChannelAdapter.start();
logger.info("kafka server starteds");
} catch (Exception exp) {
logger.info("Failed to start the kafka server. " + exp.getMessage());
}
}
}
};
kafkaserverFutureTask = scheduler.scheduleAtFixedRate(startKafkaServerTask, 0, taskPeriod, TimeUnit.MINUTES);
}

And yes, although you're using Spring with Spring Boot, I don't see any Spring Boot issue either.
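One way to avoid lingering threads with the code quoted above, assuming the scheduler is a plain ScheduledExecutorService obtained from Executors and never shut down, is to declare it as a Spring-managed bean so that closing the context also stops its threads. This is only a sketch; the configuration class and bean name are illustrative and not taken from the project:

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class SchedulerConfiguration {

    /*
     * Because the bean declares a destroy method, closing the Spring context
     * (for example via the actuator /shutdown endpoint) calls shutdownNow(),
     * so the scheduler's non-daemon threads no longer keep the JVM alive.
     */
    @Bean(destroyMethod = "shutdownNow")
    public ScheduledExecutorService kafkaStartupScheduler() {
        return Executors.newScheduledThreadPool(1);
    }
}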
I even tried the Spring scheduler, and even that did not work:
@Scheduled(fixedDelay = 600000)
@Async
public void startKafkaServer() {
if (!kafkaMessageDrivenChannelAdapter.isRunning()) {
try {
kafkaMessageDrivenChannelAdapter.start();
logger.info("kafka server started");
} catch (Exception exp) {
logger.info("Failed to start the kafka server. " + exp.getMessage());
}
}
}
/*
* Bean destroy method.
*
* Stop the kafka server.
*/
public void stopKafkaServer() {
try {
kafkaMessageDrivenChannelAdapter.stop();
logger.info("kafka server stoped");
} catch (Exception exp) {
logger.info("Failed to stop the kafka server. " + exp.getMessage());
}
}
I even removed the init-method and destroy-method configuration from the bean, and even that didn't work.
Your runnable starts a thread outside of the Spring-managed thread executor. Sent from mobile.
With my new code I am not using the Runnable interface; it just uses @Scheduled.
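For the @Scheduled variant, one possible way to keep scheduling threads from outliving the context is to hand them over to a Spring-managed TaskScheduler via SchedulingConfigurer. This is a sketch under the assumption that no other TaskScheduler bean is defined; the class name and thread-name prefix are illustrative:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.scheduling.annotation.SchedulingConfigurer;
import org.springframework.scheduling.concurrent.ThreadPoolTaskScheduler;
import org.springframework.scheduling.config.ScheduledTaskRegistrar;

@Configuration
@EnableScheduling
public class SchedulingConfig implements SchedulingConfigurer {

    /*
     * ThreadPoolTaskScheduler is shut down by Spring when the context closes,
     * and daemon threads ensure the pool cannot keep the JVM alive on its own.
     */
    @Bean
    public ThreadPoolTaskScheduler taskScheduler() {
        ThreadPoolTaskScheduler scheduler = new ThreadPoolTaskScheduler();
        scheduler.setPoolSize(1);
        scheduler.setThreadNamePrefix("kafka-start-");
        scheduler.setDaemon(true);
        return scheduler;
    }

    @Override
    public void configureTasks(ScheduledTaskRegistrar taskRegistrar) {
        taskRegistrar.setTaskScheduler(taskScheduler());
    }
}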
@suchitgupta I don't think a GitHub issue is the right place to get the help that it appears you need. Looking at the code, I think you'd benefit from spending some time reading the reference guides for the projects that you're using as well as the Getting Started Guides. A series of comments on a GitHub issue isn't a good way to share some code. The code that you have shared doesn't appear to be the same as the code that you've described. For example, I can't see any sign of your
@wilkinsona, firstly I am sorry if the code was not easy to understand. I will definitely try to improve it. I built new code to demonstrate that this is an issue with Spring Boot (a Spring Boot application with Kafka integration), and it is reproducible with the code I shared. As mentioned in my comments, I tried different things; I even removed initMethod from the bean and the problem is still reproducible.
Was this issue ever resolved? |
I have a Spring Boot application that sends/receives messages to an Apache Kafka server. The project uses ScheduledExecutorService to start/stop the Apache Kafka server. Everything works well until I try to shut down the server using curl -X POST http://localhost:9012/shutdown. On running this command the Spring container closes, however the ScheduledExecutorService task still exists.
KafkaServerTask:
Thread dump: