Hi,
Has anybody worked on sending logs from a Logstash server (running as a pod on
OpenShift) to the existing Elasticsearch of the OpenShift EFK stack, which is
secured with SearchGuard?

Please share configuration details on how to get connectivity between them.

I keep getting the same kind of error as below, again and again.



On Wed, May 30, 2018, 3:16 PM Himmat Singh <[email protected]>
wrote:

> Hi Team,
>
> I have deployed RabbitMQ and a Logstash server on OpenShift to build another ELK
> pipeline for logging. It will handle a set of applications whose logs I want to
> forward through this ELK pipeline, while Elasticsearch remains common to both
> the EFK and ELK pipelines.
>
> I have the following keys in the logging-elasticsearch secret on OpenShift
> (type Opaque, created 3 months ago; values redacted):
>
> admin-ca                *****
> admin-cert              *****
> admin-key               *****
> admin.jks               *****
> key                     *****
> searchguard.key         *****
> searchguard.truststore  *****
> truststore              *****
>
> ------------------------------
>
> I grabbed the truststore key using the command below and used
> truststore_password => tspass, taken from elasticsearch.yml:
>
> sudo oc get secret logging-elasticsearch --template='{{index .data
> "truststore"}}' | base64 -d > truststore.jks
>
> Please help me with the procedure I need to follow if I want to connect using
> the truststore keys and the username/password for the truststore.
>
> Below is the logstash.conf file:
>
> input {
>   rabbitmq {
>     host => "rabbitmq-logstash"
>     queue => "logstash"
>     durable => true
>     port => 5672
>     user => "admin"
>     password => "admin"
>   }
> }
> output {
>   elasticsearch {
>     hosts => ["logging-es:9200"]
>     #cacert => '/etc/logstash/conf.d/keys/es-ca.crt'
>     #user => 'fluentd'
>     #password => 'changeme'
>     ssl => true
>     ssl_certificate_verification => false
>     truststore => "/etc/logstash/conf.d/keys/truststore.jks"
>     truststore_password => "tspass"
>     index => "logstash-%{+YYYY.MM.dd}"
>     manage_template => false
>     document_type => "%{[@metadata][type]}"
>   }
>   stdout { codec => rubydebug }
> }
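>
> I am also wondering whether the output needs a client keystore in addition to
> the truststore, since the OpenShift EFK components seem to authenticate to
> Elasticsearch with client certificates. A sketch of the output I am
> considering (admin.jks extracted from the same secret as above; "kspass" is
> only a placeholder, as I do not know the real keystore password):
>
> output {
>   elasticsearch {
>     hosts => ["logging-es:9200"]
>     ssl => true
>     # server trust: truststore extracted from the logging-elasticsearch secret
>     truststore => "/etc/logstash/conf.d/keys/truststore.jks"
>     truststore_password => "tspass"
>     # client identity: admin.jks from the same secret (password is a guess)
>     keystore => "/etc/logstash/conf.d/keys/admin.jks"
>     keystore_password => "kspass"
>     index => "logstash-%{+YYYY.MM.dd}"
>     manage_template => false
>   }
> }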
>
> I am facing the below error:
>
> 10:51:56.154 [Ruby-0-Thread-5:
> /usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.4.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:228]
> WARN logstash.outputs.elasticsearch - Attempted to resurrect connection to
> dead ES instance, but got an error. {:url=>"https://logging-es:9200/",
> :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError,
> :error=>"Got response code '401' contacting Elasticsearch at URL '
> https://logging-es:9200/'"}
> 10:52:01.155 [Ruby-0-Thread-5:
> /usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.4.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:228]
> INFO logstash.outputs.elasticsearch - Running health check to see if an
> Elasticsearch connection is working {:healthcheck_url=>
> https://logging-es:9200/, :path=>"/"}
> 10:52:01.158 [Ruby-0-Thread-5:
> /usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.4.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:228]
> WARN logstash.outputs.elasticsearch - Attempted to resurrect connection to
> dead ES instance, but got an error. {:url=>"https://logging-es:9200/",
> :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError,
> :error=>"Got response code '401' contacting Elasticsearch at URL '
> https://logging-es:9200/'"}
>
> Please help me with the correct configuration: how do I get all the
> parameters (username, password, truststore_password, truststore, and the CA
> certificate)?
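>
> To narrow down the 401 above, I was thinking of first checking whether the
> extracted admin certificate and key authenticate at all, with a curl call
> like the one below (file names from the extraction step earlier; this is
> only a sketch I have not verified):
>
> curl -vk \
>   --cacert admin-ca.crt \
>   --cert admin-cert.crt \
>   --key admin-key.key \
>   https://logging-es:9200/
>
> If that also returns 401, the extracted certificates are probably not the
> ones SearchGuard accepts; if it succeeds, the problem is more likely in my
> Logstash SSL settings.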
>
>
>
> *Thanks and Regards,*
> *Himmat Singh.*
> *Virtusa|Polaris Pvt Ltd*
> *8465009408*
>
>
>
