tb51cx opened a new issue #6558:
URL: https://github.com/apache/apisix/issues/6558


   ### Issue description
   
**The etcd cluster is already installed directly on the hosts; its configuration is as follows:**

```
systemctl cat etcd

# /etc/systemd/system/etcd.service
[Unit]
Description=Etcd Server
After=network.target
After=network-online.target
Wants=network-online.target
Documentation=https://github.com/coreos

[Service]
Type=notify
WorkingDirectory=/var/lib/etcd
ExecStart=/opt/kube/bin/etcd \
  --name=etcd-172.18.188.208 \
  --cert-file=/etc/kubernetes/ssl/etcd.pem \
  --key-file=/etc/kubernetes/ssl/etcd-key.pem \
  --peer-cert-file=/etc/kubernetes/ssl/etcd.pem \
  --peer-key-file=/etc/kubernetes/ssl/etcd-key.pem \
  --trusted-ca-file=/etc/kubernetes/ssl/ca.pem \
  --peer-trusted-ca-file=/etc/kubernetes/ssl/ca.pem \
  --initial-advertise-peer-urls=https://172.18.188.208:2380 \
  --listen-peer-urls=https://172.18.188.208:2380 \
  --listen-client-urls=https://172.18.188.208:2379,http://127.0.0.1:2379 \
  --advertise-client-urls=https://172.18.188.208:2379 \
  --initial-cluster-token=etcd-cluster-0 \
  --initial-cluster=etcd-172.18.188.208=https://172.18.188.208:2380,etcd-172.18.188.205=https://172.18.188.205:2380,etcd-172.18.188.206=https://172.18.188.206:2380 \
  --initial-cluster-state=new \
  --data-dir=/var/lib/etcd \
  --wal-dir= \
  --snapshot-count=50000 \
  --auto-compaction-retention=1 \
  --auto-compaction-mode=periodic \
  --max-request-bytes=10485760 \
  --quota-backend-bytes=8589934592
Restart=always
RestartSec=15
LimitNOFILE=65536
OOMScoreAdjust=-999

[Install]
WantedBy=multi-user.target
```
   
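Since etcd runs with `--trusted-ca-file=/etc/kubernetes/ssl/ca.pem` (and the same CA for peers), any client verifying the servers must trust that CA. A quick sanity check that the server certificate really chains to it (a sketch using the paths above):

```shell
# Expected output if the chain is intact: /etc/kubernetes/ssl/etcd.pem: OK
openssl verify -CAfile /etc/kubernetes/ssl/ca.pem /etc/kubernetes/ssl/etcd.pem
```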
   
**The certificate files are as follows:**

```
ll /etc/kubernetes/ssl

total 40
-rw-r--r-- 1 root root 1679 Mar 2 17:05 aggregator-proxy-key.pem
-rw-r--r-- 1 root root 1383 Mar 2 17:05 aggregator-proxy.pem
-rw-r--r-- 1 root root 1675 Mar 2 17:05 ca-key.pem
-rw-r--r-- 1 root root 1302 Mar 2 17:04 ca.pem
-rw-r--r-- 1 root root 1675 Mar 2 17:04 etcd-key.pem
-rw-r--r-- 1 root root 1428 Mar 2 17:04 etcd.pem
-rw-r--r-- 1 root root 1679 Mar 2 17:06 kubelet-key.pem
-rw-r--r-- 1 root root 1452 Mar 2 17:06 kubelet.pem
-rw-r--r-- 1 root root 1679 Mar 2 17:05 kubernetes-key.pem
-rw-r--r-- 1 root root 1736 Mar 2 17:05 kubernetes.pem
```
   
   
**Creating the Kubernetes TLS secret**

```
cp /etc/kubernetes/ssl/etcd.pem /root/zhengshu/
cp /etc/kubernetes/ssl/etcd-key.pem /root/zhengshu/
```

Convert the formats:

```
openssl rsa -in etcd-key.pem -out etcd.key
openssl x509 -in etcd.pem -out etcd.crt
```
   
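To rule out a certificate/key mismatch after the conversion, the RSA moduli of the two files can be compared (a sketch using the paths above; the two digests must be identical):

```shell
# If these two digests differ, etcd.crt and etcd.key do not belong together.
openssl x509 -noout -modulus -in /root/zhengshu/etcd.crt | openssl md5
openssl rsa  -noout -modulus -in /root/zhengshu/etcd.key | openssl md5
```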
   
```
kubectl create secret tls etcd-ssl -n ingress-apisix \
  --cert=/root/zhengshu/etcd.crt --key=/root/zhengshu/etcd.key \
  --dry-run=client -o yaml > etcd-ssl.yaml
kubectl apply -f etcd-ssl.yaml
```
   
   
   
```
kubectl describe secret etcd-ssl -n ingress-apisix

Name: etcd
Namespace: ingress-apisix
Labels:
Annotations:

Type: kubernetes.io/tls

Data
tls.crt: 1428 bytes
tls.key: 1675 bytes
```
   
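To confirm the secret really carries the intended certificate, its `tls.crt` entry can be decoded and inspected (a sketch; the `\.` escapes the dot in the key name for jsonpath):

```shell
kubectl get secret etcd-ssl -n ingress-apisix \
  -o jsonpath='{.data.tls\.crt}' | base64 -d \
  | openssl x509 -noout -subject -issuer -dates
```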
   
**Testing the certificate and the etcd status:**

```
ETCDCTL_API=3 etcdctl \
  --endpoints=https://172.18.188.208:2379 \
  --cacert=/etc/kubernetes/ssl/ca.pem \
  --cert=/root/zhengshu/etcd.crt \
  --key=/root/zhengshu/etcd.key \
  endpoint health

https://172.18.188.208:2379 is healthy: successfully committed proposal: took = 11.030749ms
```
   
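etcdctl verifies the server successfully against its IP endpoint, while APISIX later reports `certificate verify failed`; one thing worth checking is whether the server certificate lists the node IPs as Subject Alternative Names, since clients connecting by IP validate against the SAN list (a sketch using the path above):

```shell
openssl x509 -in /etc/kubernetes/ssl/etcd.pem -noout -text \
  | grep -A1 'Subject Alternative Name'
```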
   
**The APISIX Helm install command:**

```
helm install apisix apisix/apisix \
  --set gateway.type=NodePort \
  --set ingress-controller.enabled=true \
  --namespace ingress-apisix \
  --set ingress-controller.config.apisix.serviceNamespace=ingress-apisix \
  --set etcd.enabled=false \
  --set etcd.auth.tls.enabled=true \
  --set etcd.host={https://172.18.188.208:2379\,https://172.18.188.205:2379\,https://172.18.188.206:2379} \
  --set etcd.auth.tls.existingSecret=etcd-ssl \
  --set etcd.auth.tls.certFilename=tls.crt \
  --set etcd.auth.tls.certKeyFilename=tls.key
```
   
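For reference, the same settings expressed as a values file (a hypothetical reconstruction from the `--set` flags above, assuming the chart accepts identical keys):

```yaml
# values.yaml — assumed equivalent of the --set flags above
gateway:
  type: NodePort
ingress-controller:
  enabled: true
  config:
    apisix:
      serviceNamespace: ingress-apisix
etcd:
  enabled: false
  host:
    - https://172.18.188.208:2379
    - https://172.18.188.205:2379
    - https://172.18.188.206:2379
  auth:
    tls:
      enabled: true
      existingSecret: etcd-ssl
      certFilename: tls.crt
      certKeyFilename: tls.key
```

It would then be installed with `helm install apisix apisix/apisix -n ingress-apisix -f values.yaml`.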
   
**The pod reports the following errors:**

```
kubectl logs apisix-7c6d459dbd-vlxbb -n ingress-apisix

/usr/local/openresty/luajit/bin/luajit ./apisix/cli/apisix.lua init

WARNING: using fixed Admin API token has security risk.
Please modify "admin_key" in conf/config.yaml .

/usr/local/openresty/luajit/bin/luajit ./apisix/cli/apisix.lua init_etcd
Warning! Request etcd endpoint 'https://172.18.188.208:2379/version' error, certificate verify failed, retry time=1
request etcd endpoint 'https://172.18.188.208:2379/version' error, certificate verify failed
request etcd endpoint 'https://172.18.188.205:2379/version' error, certificate verify failed
request etcd endpoint 'https://172.18.188.206:2379/version' error, certificate verify failed
all etcd nodes are unavailable
Warning! Request etcd endpoint 'https://172.18.188.208:2379/version' error, certificate verify failed, retry time=2
Warning! Request etcd endpoint 'https://172.18.188.205:2379/version' error, certificate verify failed, retry time=1
Warning! Request etcd endpoint 'https://172.18.188.205:2379/version' error, certificate verify failed, retry time=2
Warning! Request etcd endpoint 'https://172.18.188.206:2379/version' error, certificate verify failed, retry time=1
Warning! Request etcd endpoint 'https://172.18.188.206:2379/version' error, certificate verify failed, retry time=2
```
   
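`certificate verify failed` is raised on the client side: `apisix.lua init_etcd` requests `https://<endpoint>/version` and rejects the server certificate, which usually means the signing CA is not in the client's trust store. The same request can be reproduced from inside the APISIX pod (a sketch; the certificate paths inside the container are assumptions):

```shell
# Succeeds only when ca.pem (the CA that signed the etcd server certs) is trusted.
curl --cacert /path/to/ca.pem \
  --cert /path/to/tls.crt --key /path/to/tls.key \
  https://172.18.188.208:2379/version
```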
   
This problem has been troubling me for more than a week and it is driving me crazy. I have gone through many issues, but none of them solved it.
Also, with `--set etcd.auth.tls.verify=false` APISIX does start, but
**etcd verification still fails. The log is as follows:**
```
/usr/local/openresty/luajit/bin/luajit ./apisix/cli/apisix.lua init

WARNING: using fixed Admin API token has security risk.
Please modify "admin_key" in conf/config.yaml .

/usr/local/openresty/luajit/bin/luajit ./apisix/cli/apisix.lua init_etcd
2022/03/09 07:07:23 [warn] 1#1: low address bits of 127.0.0.1/24 are meaningless in /usr/local/apisix/conf/nginx.conf:279
nginx: [warn] low address bits of 127.0.0.1/24 are meaningless in /usr/local/apisix/conf/nginx.conf:279
2022/03/09 07:07:23 [warn] 46#46: *3 [lua] plugin.lua:172: load(): new plugins: {"api-breaker":true,"node-status":true,"request-validation":true,"gzip":true,"udp-logger":true,"jwt-auth":true,"http-logger":true,"key-auth":true,"authz-keycloak":true,"basic-auth":true,"cors":true,"response-rewrite":true,"redirect":true,"limit-count":true,"tcp-logger":true,"sls-logger":true,"request-id":true,"hmac-auth":true,"wolf-rbac":true,"limit-req":true,"consumer-restriction":true,"serverless-post-function":true,"prometheus":true,"authz-casbin":true,"proxy-cache":true,"fault-injection":true,"proxy-mirror":true,"real-ip":true,"traffic-split":true,"zipkin":true,"uri-blocker":true,"syslog":true,"echo":true,"kafka-logger":true,"limit-conn":true,"grpc-transcode":true,"ua-restriction":true,"ip-restriction":true,"serverless-pre-function":true,"batch-requests":true,"referer-restriction":true,"openid-connect":true,"proxy-rewrite":true}, context: init_worker_by_lua*
2022/03/09 07:07:23 [warn] 45#45: *1 [lua] plugin.lua:172: load(): new plugins: {"api-breaker":true,"node-status":true,"request-validation":true,"gzip":true,"udp-logger":true,"jwt-auth":true,"http-logger":true,"key-auth":true,"authz-keycloak":true,"basic-auth":true,"cors":true,"response-rewrite":true,"redirect":true,"limit-count":true,"tcp-logger":true,"sls-logger":true,"request-id":true,"hmac-auth":true,"wolf-rbac":true,"limit-req":true,"consumer-restriction":true,"serverless-post-function":true,"prometheus":true,"authz-casbin":true,"proxy-cache":true,"fault-injection":true,"proxy-mirror":true,"real-ip":true,"traffic-split":true,"zipkin":true,"uri-blocker":true,"syslog":true,"echo":true,"kafka-logger":true,"limit-conn":true,"grpc-transcode":true,"ua-restriction":true,"ip-restriction":true,"serverless-pre-function":true,"batch-requests":true,"referer-restriction":true,"openid-connect":true,"proxy-rewrite":true}, context: init_worker_by_lua*
2022/03/09 07:07:23 [warn] 47#47: *2 [lua] plugin.lua:172: load(): new plugins: {"api-breaker":true,"node-status":true,"request-validation":true,"gzip":true,"udp-logger":true,"jwt-auth":true,"http-logger":true,"key-auth":true,"authz-keycloak":true,"basic-auth":true,"cors":true,"response-rewrite":true,"redirect":true,"limit-count":true,"tcp-logger":true,"sls-logger":true,"request-id":true,"hmac-auth":true,"wolf-rbac":true,"limit-req":true,"consumer-restriction":true,"serverless-post-function":true,"prometheus":true,"authz-casbin":true,"proxy-cache":true,"fault-injection":true,"proxy-mirror":true,"real-ip":true,"traffic-split":true,"zipkin":true,"uri-blocker":true,"syslog":true,"echo":true,"kafka-logger":true,"limit-conn":true,"grpc-transcode":true,"ua-restriction":true,"ip-restriction":true,"serverless-pre-function":true,"batch-requests":true,"referer-restriction":true,"openid-connect":true,"proxy-rewrite":true}, context: init_worker_by_lua*
2022/03/09 07:07:23 [warn] 46#46: *3 [lua] plugin.lua:222: load_stream(): new plugins: {"limit-conn":true,"ip-restriction":true,"mqtt-proxy":true}, context: init_worker_by_lua*
2022/03/09 07:07:23 [warn] 45#45: *1 [lua] plugin.lua:222: load_stream(): new plugins: {"limit-conn":true,"ip-restriction":true,"mqtt-proxy":true}, context: init_worker_by_lua*
2022/03/09 07:07:23 [warn] 47#47: *2 [lua] plugin.lua:222: load_stream(): new plugins: {"limit-conn":true,"ip-restriction":true,"mqtt-proxy":true}, context: init_worker_by_lua*
2022/03/09 07:07:23 [warn] 51#51: *4 [lua] plugin.lua:172: load(): new plugins: {"api-breaker":true,"node-status":true,"request-validation":true,"gzip":true,"udp-logger":true,"jwt-auth":true,"http-logger":true,"key-auth":true,"authz-keycloak":true,"basic-auth":true,"cors":true,"response-rewrite":true,"redirect":true,"limit-count":true,"tcp-logger":true,"sls-logger":true,"request-id":true,"hmac-auth":true,"wolf-rbac":true,"limit-req":true,"consumer-restriction":true,"serverless-post-function":true,"prometheus":true,"authz-casbin":true,"proxy-cache":true,"fault-injection":true,"proxy-mirror":true,"real-ip":true,"traffic-split":true,"zipkin":true,"uri-blocker":true,"syslog":true,"echo":true,"kafka-logger":true,"limit-conn":true,"grpc-transcode":true,"ua-restriction":true,"ip-restriction":true,"serverless-pre-function":true,"batch-requests":true,"referer-restriction":true,"openid-connect":true,"proxy-rewrite":true}, context: init_worker_by_lua*
2022/03/09 07:07:23 [warn] 48#48: *5 [lua] plugin.lua:172: load(): new plugins: {"api-breaker":true,"node-status":true,"request-validation":true,"gzip":true,"udp-logger":true,"jwt-auth":true,"http-logger":true,"key-auth":true,"authz-keycloak":true,"basic-auth":true,"cors":true,"response-rewrite":true,"redirect":true,"limit-count":true,"tcp-logger":true,"sls-logger":true,"request-id":true,"hmac-auth":true,"wolf-rbac":true,"limit-req":true,"consumer-restriction":true,"serverless-post-function":true,"prometheus":true,"authz-casbin":true,"proxy-cache":true,"fault-injection":true,"proxy-mirror":true,"real-ip":true,"traffic-split":true,"zipkin":true,"uri-blocker":true,"syslog":true,"echo":true,"kafka-logger":true,"limit-conn":true,"grpc-transcode":true,"ua-restriction":true,"ip-restriction":true,"serverless-pre-function":true,"batch-requests":true,"referer-restriction":true,"openid-connect":true,"proxy-rewrite":true}, context: init_worker_by_lua*
2022/03/09 07:07:23 [warn] 46#46: *6 stream [lua] plugin.lua:222: load_stream(): new plugins: {"mqtt-proxy":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2022/03/09 07:07:23 [warn] 45#45: *7 stream [lua] plugin.lua:222: load_stream(): new plugins: {"mqtt-proxy":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2022/03/09 07:07:24 [warn] 46#46: *9 stream [lua] v3.lua:647: request_chunk(): https://172.18.188.205:2379: SSL_set_tlsext_host_name failed. Retrying, context: ngx.timer
2022/03/09 07:07:24 [warn] 46#46: *11 stream [lua] v3.lua:647: request_chunk(): https://172.18.188.205:2379: SSL_set_tlsext_host_name failed. Retrying, context: ngx.timer
2022/03/09 07:07:24 [warn] 46#46: *13 stream [lua] health_check.lua:90: report_failure(): update endpoint: https://172.18.188.205:2379 to unhealthy, context: ngx.timer
```
   
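The `SSL_set_tlsext_host_name failed` warnings are consistent with the endpoints being raw IP addresses: SNI (RFC 6066) does not permit IP literals, so setting the TLS server name to an IP can fail. A handshake without SNI can be tested directly (a sketch using the certificate paths from above):

```shell
# s_client sends no SNI unless -servername is given.
openssl s_client -connect 172.18.188.205:2379 \
  -CAfile /etc/kubernetes/ssl/ca.pem \
  -cert /root/zhengshu/etcd.crt -key /root/zhengshu/etcd.key </dev/null
```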
   
The certificates have already been verified to be fine, so why does this still not work?
   
   ### Environment
   
   - apisix version (cmd: `apisix version`):
   - OS (cmd: `uname -a`):
   - OpenResty / Nginx version (cmd: `nginx -V` or `openresty -V`):
- etcd version, if have (cmd: run `curl http://127.0.0.1:9090/v1/server_info` to get the info from server-info API):
- apisix-dashboard version, if have:
- the plugin runner version, if the issue is about a plugin runner (cmd: depended on the kind of runner):
- luarocks version, if the issue is about installation (cmd: `luarocks --version`):
   

