Self-signed certs: certificate verification error on connect #252

Closed
gabe-sorensen opened this issue Apr 19, 2019 · 10 comments

Comments

@gabe-sorensen

gabe-sorensen commented Apr 19, 2019

Using self-signed certificates to connect to Kafka doesn't appear to be supported, even when providing the CA certificate in the fluentd config. I'm getting the following error on connect:

2019-04-19 16:30:01 +0000 [info]: parsing config file is succeeded path="/fluentd/etc/fluent.conf"
2019-04-19 16:30:01 +0000 [info]: Will watch for topics infra-logs at brokers kafka-logs.domain:9092 and 'logs-consumer' group
2019-04-19 16:30:01 +0000 [info]: using configuration file: <ROOT>
  <source>
    @type kafka_group
    brokers "kafka-logs.domain:9092"
    ssl_ca_cert "/etc/ipa/ca.crt"
    ssl_client_cert "/etc/ipa/host.crt"
    ssl_client_cert_key "/etc/ipa/host.key"
    consumer_group "logs-consumer"
    topics "infra-logs"
    format "json"
    start_from_beginning false
  </source>
  <match **>
    @type stdout
  </match>
</ROOT>
2019-04-19 16:30:01 +0000 [info]: starting fluentd-1.4.2 pid=6 ruby="2.3.3"
2019-04-19 16:30:01 +0000 [info]: spawn command to main:  cmdline=["/usr/bin/ruby2.3", "-Eascii-8bit:ascii-8bit", "/usr/local/bin/fluentd", "-c", "/fluentd/etc/fluent.conf", "-p", "/fluentd/plugins", "--under-supervisor"]
2019-04-19 16:30:02 +0000 [info]: gem 'fluent-plugin-kafka' version '0.5.5'
2019-04-19 16:30:02 +0000 [info]: gem 'fluent-plugin-zookeeper' version '0.1.2'
2019-04-19 16:30:02 +0000 [info]: gem 'fluentd' version '1.4.2'
2019-04-19 16:30:02 +0000 [info]: adding match pattern="**" type="stdout"
2019-04-19 16:30:02 +0000 [info]: adding source type="kafka_group"
2019-04-19 16:30:02 +0000 [info]: #0 Will watch for topics infra-logs at brokers kafka-logs.domain:9092 and 'logs-consumer' group
2019-04-19 16:30:02 +0000 [info]: #0 starting fluentd worker pid=14 ppid=6 worker=0
2019-04-19 16:30:02 +0000 [error]: #0 unexpected error error_class=OpenSSL::SSL::SSLError error="SSL_connect returned=1 errno=0 state=error: certificate verify failed"
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/ssl_socket_with_timeout.rb:66:in `connect_nonblock'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/ssl_socket_with_timeout.rb:66:in `initialize'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/connection.rb:108:in `new'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/connection.rb:108:in `open'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/connection.rb:87:in `block in send_request'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/instrumenter.rb:21:in `instrument'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/connection.rb:86:in `send_request'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/broker.rb:30:in `fetch_metadata'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/cluster.rb:198:in `block in fetch_cluster_info'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/cluster.rb:193:in `each'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/cluster.rb:193:in `fetch_cluster_info'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/cluster.rb:181:in `cluster_info'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/cluster.rb:67:in `refresh_metadata!'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/cluster.rb:48:in `add_target_topics'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/consumer_group.rb:24:in `subscribe'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/ruby-kafka-0.3.17/lib/kafka/consumer.rb:86:in `subscribe'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluent-plugin-kafka-0.5.5/lib/fluent/plugin/in_kafka_group.rb:149:in `block in setup_consumer'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluent-plugin-kafka-0.5.5/lib/fluent/plugin/in_kafka_group.rb:148:in `each'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluent-plugin-kafka-0.5.5/lib/fluent/plugin/in_kafka_group.rb:148:in `setup_consumer'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluent-plugin-kafka-0.5.5/lib/fluent/plugin/in_kafka_group.rb:129:in `start'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/compat/call_super_mixin.rb:42:in `start'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/root_agent.rb:203:in `block in start'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/root_agent.rb:192:in `block (2 levels) in lifecycle'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/root_agent.rb:191:in `each'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/root_agent.rb:191:in `block in lifecycle'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/root_agent.rb:178:in `each'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/root_agent.rb:178:in `lifecycle'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/root_agent.rb:202:in `start'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/engine.rb:274:in `start'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/engine.rb:219:in `run'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/supervisor.rb:805:in `run_engine'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/supervisor.rb:549:in `block in run_worker'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/supervisor.rb:730:in `main_process'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/supervisor.rb:544:in `run_worker'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/lib/fluent/command/fluentd.rb:316:in `<top (required)>'
  2019-04-19 16:30:02 +0000 [error]: #0 /usr/lib/ruby/2.3.0/rubygems/core_ext/kernel_require.rb:55:in `require'
  2019-04-19 16:30:02 +0000 [error]: #0 /usr/lib/ruby/2.3.0/rubygems/core_ext/kernel_require.rb:55:in `require'
  2019-04-19 16:30:02 +0000 [error]: #0 /var/lib/gems/2.3.0/gems/fluentd-1.4.2/bin/fluentd:8:in `<top (required)>'
  2019-04-19 16:30:02 +0000 [error]: #0 /usr/local/bin/fluentd:22:in `load'
  2019-04-19 16:30:02 +0000 [error]: #0 /usr/local/bin/fluentd:22:in `<main>'
2019-04-19 16:30:02 +0000 [error]: #0 unexpected error error_class=OpenSSL::SSL::SSLError error="SSL_connect returned=1 errno=0 state=error: certificate verify failed"
  2019-04-19 16:30:02 +0000 [error]: #0 suppressed same stacktrace
2019-04-19 16:30:03 +0000 [info]: Worker 0 finished unexpectedly with status 1

I've also tried adding the CA entry as an array (ssl_ca_cert ["/etc/ipa/ca.crt"]), and got the same error.

The same certificates/keys work fine for fluent-bit kafka output:

Copyright (C) Treasure Data
[2019/04/19 16:34:37] [ info] [storage] initializing...
[2019/04/19 16:34:37] [ info] [storage] in-memory
[2019/04/19 16:34:37] [ info] [storage] normal synchronization mode, checksum disabled
[2019/04/19 16:34:37] [ info] [engine] started (pid=1)
[2019/04/19 16:34:37] [ info] [in_systemd] seek_cursor=s=dea1eae7fbde4a93b99fc438eb30e9a1;i=edb... OK
[2019/04/19 16:34:37] [ info] [out_kafka] brokers='kafka-logs.domain:9092' topics='infra-logs'
[2019/04/19 16:34:37] [ info] [http_server] listen iface=127.0.0.1 tcp_port=2020
gabe-sorensen changed the title Self-signed certs: certificate validation error on connect → Self-signed certs: certificate verification error on connect Apr 19, 2019
@edmeister

Any update on this issue?

@haijeploeg

We also have the exact same issue. We are also using IPA as our CA. Kafka itself is successfully using SSL with those certs, but we cannot connect to Kafka using SSL. In the Kafka logs we are also getting the following error:

INFO [SocketServer brokerId=1] Failed authentication with /x.x.x.x (SSL handshake failed) (org.apache.kafka.common.network.Selector)

And on fluentd server the same issue as well:

unexpected error error_class=OpenSSL::SSL::SSLError error="SSL_connect returned=1 errno=0 state=error: certificate verify failed"

@edmeister

I tried my certificates in a small Python app, and the output I got there was more verbose. It turns out I had to add both the root and intermediate CA.
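
For reference, here's a rough Ruby equivalent of that kind of standalone handshake test (a sketch only; the broker host, port, and CA path are placeholders matching the config above). Recent Ruby/OpenSSL versions include the verify reason in the exception message, which is the extra detail the fluentd log omits:

require 'socket'
require 'openssl'

ctx = OpenSSL::SSL::SSLContext.new
ctx.verify_mode = OpenSSL::SSL::VERIFY_PEER
store = OpenSSL::X509::Store.new
store.add_file('/etc/ipa/ca.crt')  # add_file loads every certificate in a PEM bundle
ctx.cert_store = store

tcp = TCPSocket.new('kafka-logs.domain', 9092)
ssl = OpenSSL::SSL::SSLSocket.new(tcp, ctx)
ssl.sync_close = true              # closing the SSL socket also closes the TCP socket
begin
  ssl.connect
  puts 'handshake OK'
rescue OpenSSL::SSL::SSLError => e
  puts e.message                   # e.g. "certificate verify failed (unable to get local issuer certificate)"
ensure
  ssl.close
end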

@jbouter

jbouter commented May 14, 2019

@edmeister Did you cat the intermediate and CA together by any chance? If not, could you provide an example of the configuration you are using?

@edmeister

@kyentei yes, I just combined them in one file. That was a faster solution than figuring out how to configure it.

@jbouter

jbouter commented May 14, 2019

That sadly didn't resolve the matter for us. Testing with openssl shows the certificates are valid, and hooking up other applications to Kafka works flawlessly.

@skbly7

skbly7 commented May 15, 2019

This seems to be a bug (?) in ruby-kafka, which is used by fluent-plugin-kafka.
ruby-kafka behaves wrongly (?) when the CA file contains an array of certificates instead of just the one CA. The issue, along with a request for a clearer error message, is tracked here:
zendesk/ruby-kafka#322
zendesk/ruby-kafka#683

So in your CA file, get rid of the intermediate CA and keep just the root CA, i.e. the last one (or split it into multiple files with one certificate each). It will connect happily after that; at least it did in our case. For example, change:

-----BEGIN CERTIFICATE-----
123
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
456
-----END CERTIFICATE-----

to

-----BEGIN CERTIFICATE-----
456
-----END CERTIFICATE-----
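
For what it's worth, the root cause suggested by those ruby-kafka issues can be reproduced in plain Ruby: OpenSSL::X509::Certificate.new parses only the first PEM block in a string, so if the whole bundle is handed to ruby-kafka as one string and wrapped in a single certificate object, everything after the first cert is silently ignored. A sketch of that behavior (not the library's exact code; the path is a placeholder):

require 'openssl'

bundle = File.read('/etc/ipa/ca.crt')               # intermediate + root in one file

first_only = OpenSSL::X509::Certificate.new(bundle) # parses the FIRST PEM block only
puts first_only.subject                             # prints only the first cert's subject

# Splitting the bundle yourself shows what got dropped:
pems = bundle.scan(/-----BEGIN CERTIFICATE-----.*?-----END CERTIFICATE-----/m)
pems.each { |pem| puts OpenSSL::X509::Certificate.new(pem).subject }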

@gabe-sorensen (Author)

We did have a similar situation @skbly7, but I've tried loading both of our certificates individually and it still fails. For our first certificate I get:

2019-08-22 22:02:47 +0000 [error]: #0 unexpected error error_class=OpenSSL::SSL::SSLError error="SSL_connect returned=1 errno=0 state=error: certificate verify failed (self signed certificate in certificate chain)"

and for the second one:

2019-08-22 22:10:11 +0000 [error]: #0 unexpected error error_class=OpenSSL::SSL::SSLError error="SSL_connect returned=1 errno=0 state=error: certificate verify failed (unspecified certificate verification error)"

@gabe-sorensen (Author)

Never mind. I checked the broker URL and found it was connecting to our round-robin DNS name instead of the server directly. Using only the second certificate, as mentioned previously, seems to fix the issue.

@gabe-sorensen (Author)

I'm closing this as a duplicate of #287 since that more accurately describes the issue. The self-signed part isn't actually the problem; it's that multiple certificates in the same file aren't being handled correctly.
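
For anyone hitting this in the meantime, a quick way to split a bundle into one-certificate files, in the spirit of the workaround above (a sketch; paths are placeholders):

File.read('/etc/ipa/ca.crt')
    .scan(/-----BEGIN CERTIFICATE-----.*?-----END CERTIFICATE-----/m)
    .each_with_index do |pem, i|
      # Writes ca-0.crt, ca-1.crt, ... so each file holds exactly one certificate
      File.write("/etc/ipa/ca-#{i}.crt", pem + "\n")
    end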
