
Kafka Input Plugin: Shutdown semantics #284

Open
kaihil opened this issue Jun 19, 2018 · 1 comment
kaihil commented Jun 19, 2018

  • Version: Logstash 6.2.1, Kafka 1.1.0
  • Operating System: macOS 10.13.5 (17F77) / Docker Version 18.03.1-ce-mac65 (24312)
  • Config File:
input {
    kafka {
        bootstrap_servers => "kafka:9092"
        topics => "example-topic"
        codec => "json"
        auto_offset_reset => "latest"
    }
}
  • Steps to Reproduce:

I have a question regarding the shutdown semantics of Logstash's Kafka Input Plugin:

When I run it with default settings, send a batch of 10,000 messages into Kafka, and restart* Logstash while it is processing them, I end up with more than 10,000 (e.g. 10,212) messages in Elasticsearch.

So at first glance it looks like consumption simply resumes from the last offset committed via auto_commit_interval_ms (which may be up to 5 seconds old). I would have assumed that the plugin closes the consumer cleanly and sends a final offset commit on shutdown, to avoid processing messages twice.
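The difference between the two shutdown behaviors can be sketched with a toy model (this is not the plugin's actual code; FakeConsumer, run, and all names here are illustrative stand-ins for a real Kafka client):

```python
class FakeConsumer:
    """Stand-in for a Kafka consumer: tracks processed vs. committed offsets."""
    def __init__(self):
        self.position = 0   # next offset to read
        self.committed = 0  # last committed offset

    def poll(self):
        offset = self.position
        self.position += 1  # "consume" one message
        return offset

    def commit_sync(self):
        self.committed = self.position

def run(consumer, n, commit_on_close):
    for _ in range(n):
        consumer.poll()          # process one message
    # auto-commit runs on a timer, so the committed offset may lag behind;
    # only an explicit final commit flushes the current position on shutdown
    if commit_on_close:
        consumer.commit_sync()
    # after a restart, consumption resumes from consumer.committed, so the
    # gap between position and committed is the number of replayed messages
    return consumer.position - consumer.committed

clean = FakeConsumer()
assert run(clean, 10, commit_on_close=True) == 0    # final commit: no duplicates

dirty = FakeConsumer()
assert run(dirty, 10, commit_on_close=False) == 10  # everything since last commit replays
```

With periodic auto-commit alone, any messages processed after the last timer-driven commit are re-delivered after a restart, which would explain the ~200 extra documents observed.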

Now I wonder: how is this supposed to work?

Thank you in advance!
Kai

*) I restart Logstash by sending a SIGTERM first, wait until it has shut down properly, and then wait a minute before starting it again.
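For reference, the auto-commit behavior in question is controlled by plugin options along these lines (values shown are what I believe to be the defaults; please verify against the docs for your plugin version):

```
input {
    kafka {
        bootstrap_servers => "kafka:9092"
        topics => "example-topic"
        enable_auto_commit => "true"       # commits happen on a timer, not on shutdown
        auto_commit_interval_ms => "5000"  # up to 5 s of processed-but-uncommitted offsets
    }
}
```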


numbnut commented Aug 3, 2018

Hi,
any news regarding this issue?
