Hi,

Could you run `kubectl describe job dev-oof-music-cassandra-job-config -n onap` to show the status of the job?
Thanks.

Best Regards,
Ruoyu

From: [email protected] [mailto:[email protected]] On 
Behalf Of gulsum atici
Sent: Tuesday, December 25, 2018 7:56 PM
To: Borislav Glozman <[email protected]>; [email protected]
Subject: Re: [onap-discuss] Casablanca oof module pods are waiting on init status #oof

Dear Borislav,

I grabbed some logs from the pod init containers. I have recreated all the pods, including the DBs, several times, but the situation hasn't changed.
dev-oof-cmso-db-0                             1/1   Running                 0    33m   10.42.140.74    kub3   <none>
dev-oof-music-cassandra-0                     1/1   Running                 0    32m   10.42.254.144   kub3   <none>
dev-oof-music-cassandra-1                     1/1   Running                 0    1h    10.42.244.161   kub4   <none>
dev-oof-music-cassandra-2                     1/1   Running                 0    1h    10.42.56.156    kub2   <none>
dev-oof-music-tomcat-685fd777c9-8qmll         0/1   Init:1/3                3    35m   10.42.159.78    kub3   <none>
dev-oof-music-tomcat-685fd777c9-crdf6         0/1   Init:1/3                3    35m   10.42.167.24    kub2   <none>
dev-oof-music-tomcat-84bc66c649-7xf8q         0/1   Init:1/3                6    1h    10.42.19.117    kub1   <none>
dev-oof-music-tomcat-84bc66c649-lzmtj         0/1   Init:1/3                6    1h    10.42.198.179   kub4   <none>
dev-oof-oof-8ff8b46f5-8sbwv                   1/1   Running                 0    35m   10.42.35.56     kub3   <none>
dev-oof-oof-cmso-service-6c485cdff-pbzb6      0/1   Init:CrashLoopBackOff   10   35m   10.42.224.93    kub3   <none>
dev-oof-oof-has-api-74c6695b64-kcr4n          0/1   Init:0/3                2    35m   10.42.70.206    kub1   <none>
dev-oof-oof-has-controller-7cb97bbd4f-n7k9j   0/1   Init:0/3                3    35m   10.42.194.39    kub3   <none>
dev-oof-oof-has-data-5b4f76fc7b-t92r6         0/1   Init:0/4                3    35m   10.42.205.181   kub1   <none>
dev-oof-oof-has-healthcheck-8hqbt             0/1   Init:0/1                3    35m   10.42.131.183   kub3   <none>
dev-oof-oof-has-onboard-mqglv                 0/1   Init:0/2                3    35m   10.42.34.251    kub1   <none>
dev-oof-oof-has-reservation-5b899687db-dgjnh  0/1   Init:0/4                3    35m   10.42.245.175   kub1   <none>
dev-oof-oof-has-solver-65486d5fc7-s84w4       0/1   Init:0/4                3    35m   10.42.35.223    kub3   <none>
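As an aside, the stuck pods can be pulled out of output like the above with a small text filter; the sample data below is a captured stand-in for a live `kubectl get pods -n onap` call (an assumption for illustration, not a cluster query):

```shell
# Filter pods whose STATUS column is not "Running".
# In a live cluster you would pipe `kubectl get pods -n onap` into awk;
# here a short captured sample stands in for the cluster.
sample='NAME                                       READY  STATUS                 RESTARTS
dev-oof-cmso-db-0                          1/1    Running                0
dev-oof-music-tomcat-685fd777c9-8qmll      0/1    Init:1/3               3
dev-oof-oof-cmso-service-6c485cdff-pbzb6   0/1    Init:CrashLoopBackOff  10'
stuck=$(printf '%s\n' "$sample" | awk 'NR > 1 && $3 != "Running" {print $1, $3}')
printf '%s\n' "$stuck"
```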


ubuntu@kub4:~$ kubectl describe pod dev-oof-music-tomcat-685fd777c9-8qmll -n onap
Name:           dev-oof-music-tomcat-685fd777c9-8qmll
Namespace:      onap
Node:           kub3/192.168.13.151
Start Time:     Tue, 25 Dec 2018 11:20:04 +0000
Labels:         app=music-tomcat
                pod-template-hash=2419833375
                release=dev-oof
Annotations:    <none>
Status:         Pending
IP:             10.42.159.78
Controlled By:  ReplicaSet/dev-oof-music-tomcat-685fd777c9
Init Containers:
  music-tomcat-zookeeper-readiness:
    Container ID:  docker://79b0507168a8590b10f0b1eb8c720e04cd173914b6365834d5b6c9c6f86a074d
    Image:         oomk8s/readiness-check:2.0.0
    Image ID:      docker-pullable://oomk8s/readiness-check@sha256:7daa08b81954360a1111d03364febcb3dcfeb723bcc12ce3eb3ed3e53f2323ed
    Port:          <none>
    Host Port:     <none>
    Command:
      /root/ready.py
    Args:
      --container-name
      zookeeper
    State:          Terminated
      Reason:       Completed
      Exit Code:    0
      Started:      Tue, 25 Dec 2018 11:20:57 +0000
      Finished:     Tue, 25 Dec 2018 11:21:32 +0000
    Ready:          True
    Restart Count:  0
    Environment:
      NAMESPACE:  onap (v1:metadata.namespace)
    Mounts:
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-rm7hn (ro)
  music-tomcat-cassandra-readiness:
    Container ID:  docker://36b752b9b2d96d6437992cab6d63d32b80107799b34b0420056656fcc4476213
    Image:         oomk8s/readiness-check:2.0.0
    Image ID:      docker-pullable://oomk8s/readiness-check@sha256:7daa08b81954360a1111d03364febcb3dcfeb723bcc12ce3eb3ed3e53f2323ed
    Port:          <none>
    Host Port:     <none>
    Command:
      /root/job_complete.py
    Args:
      -j
      dev-oof-music-cassandra-job-config
    State:          Running
      Started:      Tue, 25 Dec 2018 11:41:58 +0000
    Last State:     Terminated
      Reason:       Error
      Exit Code:    1
      Started:      Tue, 25 Dec 2018 11:31:49 +0000
      Finished:     Tue, 25 Dec 2018 11:41:53 +0000
    Ready:          False
    Restart Count:  2
    Environment:
      NAMESPACE:  onap (v1:metadata.namespace)
    Mounts:
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-rm7hn (ro)
  music-tomcat-war:
    Container ID:
    Image:         nexus3.onap.org:10001/onap/music/music:3.0.24
    Image ID:
    Port:          <none>
    Host Port:     <none>
    Command:
      cp
      /app/MUSIC.war
      /webapps
    State:          Waiting
      Reason:       PodInitializing
    Ready:          False
    Restart Count:  0
    Environment:    <none>
    Mounts:
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-rm7hn (ro)
      /webapps from shared-data (rw)
Containers:
  music-tomcat:
    Container ID:
    Image:          nexus3.onap.org:10001/library/tomcat:8.5
    Image ID:
    Port:           8080/TCP
    Host Port:      0/TCP
    State:          Waiting
      Reason:       PodInitializing
    Ready:          False
    Restart Count:  0
    Liveness:       tcp-socket :8080 delay=100s timeout=50s period=10s #success=1 #failure=3
    Readiness:      tcp-socket :8080 delay=100s timeout=50s period=10s #success=1 #failure=3
    Environment:    <none>
    Mounts:
      /etc/localtime from localtime (ro)
      /opt/app/music/etc/music.properties from properties-music (rw)
      /usr/local/tomcat/webapps from shared-data (rw)
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-rm7hn (ro)
Conditions:
  Type              Status
  Initialized       False
  Ready             False
  ContainersReady   False
  PodScheduled      True
Volumes:
  shared-data:
    Type:    EmptyDir (a temporary directory that shares a pod's lifetime)
    Medium:
  localtime:
    Type:          HostPath (bare host directory volume)
    Path:          /etc/localtime
    HostPathType:
  properties-music:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      dev-oof-music-tomcat-configmap
    Optional:  false
  default-token-rm7hn:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  default-token-rm7hn
    Optional:    false
QoS Class:       BestEffort
Node-Selectors:  <none>
Tolerations:     node.kubernetes.io/not-ready:NoExecute for 300s
                 node.kubernetes.io/unreachable:NoExecute for 300s
Events:
  Type    Reason     Age               From               Message
  ----    ------     ----              ----               -------
  Normal  Scheduled  27m               default-scheduler  Successfully assigned onap/dev-oof-music-tomcat-685fd777c9-8qmll to kub3
  Normal  Pulling    26m               kubelet, kub3      pulling image "oomk8s/readiness-check:2.0.0"
  Normal  Pulled     26m               kubelet, kub3      Successfully pulled image "oomk8s/readiness-check:2.0.0"
  Normal  Created    26m               kubelet, kub3      Created container
  Normal  Started    26m               kubelet, kub3      Started container
  Normal  Pulling    5m (x3 over 25m)  kubelet, kub3      pulling image "oomk8s/readiness-check:2.0.0"
  Normal  Pulled     5m (x3 over 25m)  kubelet, kub3      Successfully pulled image "oomk8s/readiness-check:2.0.0"
  Normal  Created    5m (x3 over 25m)  kubelet, kub3      Created container
  Normal  Started    5m (x3 over 25m)  kubelet, kub3      Started container
ubuntu@kub4:~$ kubectl logs -f dev-oof-music-tomcat-685fd777c9-8qmll -c music-tomcat-zookeeper-readiness -n onap
2018-12-25 11:20:58,478 - INFO - Checking if zookeeper  is ready
2018-12-25 11:21:32,325 - INFO - zookeeper is ready!
2018-12-25 11:21:32,326 - INFO - zookeeper is ready!
ubuntu@kub4:~$ kubectl logs -f dev-oof-music-tomcat-685fd777c9-8qmll -c music-tomcat-cassandra-readiness -n onap
2018-12-25 11:41:59,688 - INFO - Checking if dev-oof-music-cassandra-job-config is complete
2018-12-25 11:42:00,014 - INFO - dev-oof-music-cassandra-job-config has not succeeded yet
2018-12-25 11:42:05,019 - INFO - Checking if dev-oof-music-cassandra-job-config is complete
2018-12-25 11:42:05,305 - INFO - dev-oof-music-cassandra-job-config has not succeeded yet
[... the same check/not-succeeded pair repeats every 5 seconds ...]
2018-12-25 11:44:24,566 - INFO - Checking if dev-oof-music-cassandra-job-config is complete
2018-12-25 11:44:25,153 - INFO - dev-oof-music-cassandra-job-config has not succeeded yet
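For context, the job_complete.py script in the readiness-check image is essentially a polling loop: it asks the Kubernetes API whether the named Job has succeeded and retries on an interval until it does. A minimal sketch of that pattern (the check function below is a stand-in stub, not the actual ONAP script):

```python
import time

def wait_for_job(check_complete, interval=5.0, timeout=600.0, sleep=time.sleep):
    """Poll check_complete() until it returns True; give up after `timeout` seconds."""
    waited = 0.0
    while waited < timeout:
        if check_complete():
            return True
        sleep(interval)
        waited += interval
    return False

# Stub check that "completes" on the third poll, standing in for a Job status query.
calls = {"n": 0}
def fake_check():
    calls["n"] += 1
    return calls["n"] >= 3

completed = wait_for_job(fake_check, interval=1, timeout=10, sleep=lambda s: None)
```

In the logs above the check never returns true, which is why the tomcat pods stay in Init:1/3: the init container keeps polling until the Cassandra config Job reports success.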
ubuntu@kub4:~$ kubectl describe pod dev-oof-oof-cmso-service-6c485cdff-pbzb6 -n onap
Name:           dev-oof-oof-cmso-service-6c485cdff-pbzb6
Namespace:      onap
Node:           kub3/192.168.13.151
Start Time:     Tue, 25 Dec 2018 11:20:07 +0000
Labels:         app=oof-cmso-service
                pod-template-hash=270417899
                release=dev-oof
Annotations:    <none>
Status:         Pending
IP:             10.42.224.93
Controlled By:  ReplicaSet/dev-oof-oof-cmso-service-6c485cdff
Init Containers:
  oof-cmso-service-readiness:
    Container ID:  docker://bb4ccdfaf3ba6836e606685de4bbe069da2e5193f165ae466f768dad85b71908
    Image:         oomk8s/readiness-check:2.0.0
    Image ID:      docker-pullable://oomk8s/readiness-check@sha256:7daa08b81954360a1111d03364febcb3dcfeb723bcc12ce3eb3ed3e53f2323ed
    Port:          <none>
    Host Port:     <none>
    Command:
      /root/ready.py
    Args:
      --container-name
      cmso-db
    State:          Terminated
      Reason:       Completed
      Exit Code:    0
      Started:      Tue, 25 Dec 2018 11:22:53 +0000
      Finished:     Tue, 25 Dec 2018 11:25:01 +0000
    Ready:          True
    Restart Count:  0
    Environment:
      NAMESPACE:  onap (v1:metadata.namespace)
    Mounts:
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-rm7hn (ro)
  db-init:
    Container ID:   docker://dbc9fadd1140584043b8f690974a4d626f64d12ef5002108b7b5c29148981e23
    Image:          nexus3.onap.org:10001/onap/optf-cmso-dbinit:1.0.1
    Image ID:       docker-pullable://nexus3.onap.org:10001/onap/optf-cmso-dbinit@sha256:c5722a319fb0d91ad4d533597cdee2b55fc5c51d0a8740cf02cbaa1969c8554f
    Port:           <none>
    Host Port:      <none>
    State:          Waiting
      Reason:       CrashLoopBackOff
    Last State:     Terminated
      Reason:       Error
      Exit Code:    1
      Started:      Tue, 25 Dec 2018 11:48:31 +0000
      Finished:     Tue, 25 Dec 2018 11:48:41 +0000
    Ready:          False
    Restart Count:  9
    Environment:
      DB_HOST:      oof-cmso-dbhost.onap
      DB_PORT:      3306
      DB_USERNAME:  root
      DB_SCHEMA:    cmso
      DB_PASSWORD:  <set to the key 'db-root-password' in secret 'dev-oof-cmso-db'>  Optional: false
    Mounts:
      /share/etc/config from dev-oof-oof-cmso-service-config (rw)
      /share/logs from dev-oof-oof-cmso-service-logs (rw)
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-rm7hn (ro)
Containers:
  oof-cmso-service:
    Container ID:
    Image:          nexus3.onap.org:10001/onap/optf-cmso-service:1.0.1
    Image ID:
    Port:           8080/TCP
    Host Port:      0/TCP
    State:          Waiting
      Reason:       PodInitializing
    Ready:          False
    Restart Count:  0
    Liveness:       tcp-socket :8080 delay=120s timeout=50s period=10s #success=1 #failure=3
    Readiness:      tcp-socket :8080 delay=100s timeout=50s period=10s #success=1 #failure=3
    Environment:
      DB_HOST:      oof-cmso-dbhost.onap
      DB_PORT:      3306
      DB_USERNAME:  cmso-admin
      DB_SCHEMA:    cmso
      DB_PASSWORD:  <set to the key 'user-password' in secret 'dev-oof-cmso-db'>  Optional: false
    Mounts:
      /share/debug-logs from dev-oof-oof-cmso-service-logs (rw)
      /share/etc/config from dev-oof-oof-cmso-service-config (rw)
      /share/logs from dev-oof-oof-cmso-service-logs (rw)
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-rm7hn (ro)
Conditions:
  Type              Status
  Initialized       False
  Ready             False
  ContainersReady   False
  PodScheduled      True
Volumes:
  dev-oof-oof-cmso-service-config:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      dev-oof-oof-cmso-service
    Optional:  false
  dev-oof-oof-cmso-service-logs:
    Type:    EmptyDir (a temporary directory that shares a pod's lifetime)
    Medium:
  default-token-rm7hn:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  default-token-rm7hn
    Optional:    false
QoS Class:       BestEffort
Node-Selectors:  <none>
Tolerations:     node.kubernetes.io/not-ready:NoExecute for 300s
                 node.kubernetes.io/unreachable:NoExecute for 300s
Events:
  Type     Reason                  Age                From               Message
  ----     ------                  ----               ----               -------
  Normal   Scheduled               30m                default-scheduler  Successfully assigned onap/dev-oof-oof-cmso-service-6c485cdff-pbzb6 to kub3
  Warning  FailedCreatePodSandBox  29m                kubelet, kub3      Failed create pod sandbox: rpc error: code = Unknown desc = [failed to set up sandbox container "7d02bb1144aaaf2479a741c971bad617ea532717e7e72d71e2bfeeac992a7451" network for pod "dev-oof-oof-cmso-service-6c485cdff-pbzb6": NetworkPlugin cni failed to set up pod "dev-oof-oof-cmso-service-6c485cdff-pbzb6_onap" network: No MAC address found, failed to clean up sandbox container "7d02bb1144aaaf2479a741c971bad617ea532717e7e72d71e2bfeeac992a7451" network for pod "dev-oof-oof-cmso-service-6c485cdff-pbzb6": NetworkPlugin cni failed to teardown pod "dev-oof-oof-cmso-service-6c485cdff-pbzb6_onap" network: failed to get IP addresses for "eth0": <nil>]
  Normal   SandboxChanged          29m                kubelet, kub3      Pod sandbox changed, it will be killed and re-created.
  Normal   Pulling                 27m                kubelet, kub3      pulling image "oomk8s/readiness-check:2.0.0"
  Normal   Pulled                  27m                kubelet, kub3      Successfully pulled image "oomk8s/readiness-check:2.0.0"
  Normal   Created                 27m                kubelet, kub3      Created container
  Normal   Started                 27m                kubelet, kub3      Started container
  Normal   Pulling                 23m (x4 over 25m)  kubelet, kub3      pulling image "nexus3.onap.org:10001/onap/optf-cmso-dbinit:1.0.1"
  Normal   Pulled                  23m (x4 over 25m)  kubelet, kub3      Successfully pulled image "nexus3.onap.org:10001/onap/optf-cmso-dbinit:1.0.1"
  Normal   Created                 23m (x4 over 25m)  kubelet, kub3      Created container
  Normal   Started                 23m (x4 over 25m)  kubelet, kub3      Started container
  Warning  BackOff                 4m (x80 over 24m)  kubelet, kub3      Back-off restarting failed container
ubuntu@kub4:~$ kubectl logs -f dev-oof-oof-cmso-service-6c485cdff-pbzb6 -c oof-cmso-service-readiness -n onap
2018-12-25 11:22:54,683 - INFO - Checking if cmso-db  is ready
2018-12-25 11:23:02,186 - INFO - Checking if cmso-db  is ready
2018-12-25 11:23:12,938 - INFO - cmso-db is not ready.
[... the check repeats every ~8 seconds while the DB starts up ...]
2018-12-25 11:24:58,238 - INFO - Checking if cmso-db  is ready
2018-12-25 11:25:01,340 - INFO - cmso-db is ready!
ubuntu@kub4:~$ kubectl logs -f dev-oof-oof-cmso-service-6c485cdff-pbzb6 -c db-init -n onap
VM_ARGS=

  .   ____          _            __ _ _
 /\\ / ___'_ __ _ _(_)_ __  __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
 \\/  ___)| |_)| | | | | || (_| |  ) ) ) )
  '  |____| .__|_| |_|_| |_\__, | / / / /
 =========|_|==============|___/=/_/_/_/
 :: Spring Boot ::        (v2.0.6.RELEASE)

2018-12-25 11:48:36.187  INFO 8 --- [           main] o.o.o.c.liquibase.LiquibaseApplication   : Starting LiquibaseApplication on dev-oof-oof-cmso-service-6c485cdff-pbzb6 with PID 8 (/opt/app/cmso-dbinit/app.jar started by root in /opt/app/cmso-dbinit)
2018-12-25 11:48:36.199  INFO 8 --- [           main] o.o.o.c.liquibase.LiquibaseApplication   : No active profile set, falling back to default profiles: default
2018-12-25 11:48:36.310  INFO 8 --- [           main] s.c.a.AnnotationConfigApplicationContext : Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@d44fc21: startup date [Tue Dec 25 11:48:36 UTC 2018]; root of context hierarchy
2018-12-25 11:48:40.336  INFO 8 --- [           main] com.zaxxer.hikari.HikariDataSource       : HikariPool-1 - Starting...
2018-12-25 11:48:40.754  INFO 8 --- [           main] com.zaxxer.hikari.HikariDataSource       : HikariPool-1 - Start completed.
2018-12-25 11:48:41.044  WARN 8 --- [           main] s.c.a.AnnotationConfigApplicationContext : Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'liquibase' defined in class path resource [org/onap/optf/cmso/liquibase/LiquibaseData.class]: Invocation of init method failed; nested exception is liquibase.exception.LockException: liquibase.exception.DatabaseException: liquibase.exception.DatabaseException: java.sql.SQLTransactionRollbackException: (conn=327) Deadlock found when trying to get lock; try restarting transaction
2018-12-25 11:48:41.045  INFO 8 --- [           main] com.zaxxer.hikari.HikariDataSource       : HikariPool-1 - Shutdown initiated...
2018-12-25 11:48:41.109  INFO 8 --- [           main] com.zaxxer.hikari.HikariDataSource       : HikariPool-1 - Shutdown completed.
2018-12-25 11:48:41.177  INFO 8 --- [           main] ConditionEvaluationReportLoggingListener :

Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.
2018-12-25 11:48:41.223 ERROR 8 --- [           main] o.s.boot.SpringApplication               : Application run failed

org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'liquibase' defined in class path resource [org/onap/optf/cmso/liquibase/LiquibaseData.class]: Invocation of init method failed; nested exception is liquibase.exception.LockException: liquibase.exception.DatabaseException: liquibase.exception.DatabaseException: java.sql.SQLTransactionRollbackException: (conn=327) Deadlock found when trying to get lock; try restarting transaction
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1694) ~[spring-beans-5.0.10.RELEASE.jar!/:5.0.10.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:573) ~[spring-beans-5.0.10.RELEASE.jar!/:5.0.10.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:495) ~[spring-beans-5.0.10.RELEASE.jar!/:5.0.10.RELEASE]
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317) ~[spring-beans-5.0.10.RELEASE.jar!/:5.0.10.RELEASE]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) ~[spring-beans-5.0.10.RELEASE.jar!/:5.0.10.RELEASE]
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315) ~[spring-beans-5.0.10.RELEASE.jar!/:5.0.10.RELEASE]
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) ~[spring-beans-5.0.10.RELEASE.jar!/:5.0.10.RELEASE]
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:759) ~[spring-beans-5.0.10.RELEASE.jar!/:5.0.10.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:867) ~[spring-context-5.0.10.RELEASE.jar!/:5.0.10.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:548) ~[spring-context-5.0.10.RELEASE.jar!/:5.0.10.RELEASE]
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:754) [spring-boot-2.0.6.RELEASE.jar!/:2.0.6.RELEASE]
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:386) [spring-boot-2.0.6.RELEASE.jar!/:2.0.6.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:307) [spring-boot-2.0.6.RELEASE.jar!/:2.0.6.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1242) [spring-boot-2.0.6.RELEASE.jar!/:2.0.6.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1230) [spring-boot-2.0.6.RELEASE.jar!/:2.0.6.RELEASE]
at org.onap.optf.cmso.liquibase.LiquibaseApplication.main(LiquibaseApplication.java:45) [classes!/:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_181]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_181]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_181]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_181]
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48) [app.jar:na]
at org.springframework.boot.loader.Launcher.launch(Launcher.java:87) [app.jar:na]
at org.springframework.boot.loader.Launcher.launch(Launcher.java:50) [app.jar:na]
at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51) [app.jar:na]
Caused by: liquibase.exception.LockException: liquibase.exception.DatabaseException: liquibase.exception.DatabaseException: java.sql.SQLTransactionRollbackException: (conn=327) Deadlock found when trying to get lock; try restarting transaction
at liquibase.lockservice.StandardLockService.acquireLock(StandardLockService.java:242) ~[liquibase-core-3.5.5.jar!/:na]
at liquibase.lockservice.StandardLockService.waitForLock(StandardLockService.java:170) ~[liquibase-core-3.5.5.jar!/:na]
at liquibase.Liquibase.update(Liquibase.java:196) ~[liquibase-core-3.5.5.jar!/:na]
at liquibase.Liquibase.update(Liquibase.java:192) ~[liquibase-core-3.5.5.jar!/:na]
at liquibase.integration.spring.SpringLiquibase.performUpdate(SpringLiquibase.java:431) ~[liquibase-core-3.5.5.jar!/:na]
at liquibase.integration.spring.SpringLiquibase.afterPropertiesSet(SpringLiquibase.java:388) ~[liquibase-core-3.5.5.jar!/:na]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1753) ~[spring-beans-5.0.10.RELEASE.jar!/:5.0.10.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1690) ~[spring-beans-5.0.10.RELEASE.jar!/:5.0.10.RELEASE]
... 23 common frames omitted
Caused by: liquibase.exception.DatabaseException: liquibase.exception.DatabaseException: java.sql.SQLTransactionRollbackException: (conn=327) Deadlock found when trying to get lock; try restarting transaction
at liquibase.database.AbstractJdbcDatabase.commit(AbstractJdbcDatabase.java:1159) ~[liquibase-core-3.5.5.jar!/:na]
at liquibase.lockservice.StandardLockService.acquireLock(StandardLockService.java:233) ~[liquibase-core-3.5.5.jar!/:na]
... 30 common frames omitted
Caused by: liquibase.exception.DatabaseException: java.sql.SQLTransactionRollbackException: (conn=327) Deadlock found when trying to get lock; try restarting transaction
at liquibase.database.jvm.JdbcConnection.commit(JdbcConnection.java:126) ~[liquibase-core-3.5.5.jar!/:na]
at liquibase.database.AbstractJdbcDatabase.commit(AbstractJdbcDatabase.java:1157) ~[liquibase-core-3.5.5.jar!/:na]
... 31 common frames omitted
Caused by: java.sql.SQLTransactionRollbackException: (conn=327) Deadlock found when trying to get lock; try restarting transaction
at org.mariadb.jdbc.internal.util.exceptions.ExceptionMapper.get(ExceptionMapper.java:179) ~[mariadb-java-client-2.2.6.jar!/:na]
at org.mariadb.jdbc.internal.util.exceptions.ExceptionMapper.getException(ExceptionMapper.java:110) ~[mariadb-java-client-2.2.6.jar!/:na]
at org.mariadb.jdbc.MariaDbStatement.executeExceptionEpilogue(MariaDbStatement.java:228) ~[mariadb-java-client-2.2.6.jar!/:na]
at org.mariadb.jdbc.MariaDbStatement.executeInternal(MariaDbStatement.java:334) ~[mariadb-java-client-2.2.6.jar!/:na]
at org.mariadb.jdbc.MariaDbStatement.execute(MariaDbStatement.java:386) ~[mariadb-java-client-2.2.6.jar!/:na]
at org.mariadb.jdbc.MariaDbConnection.commit(MariaDbConnection.java:709) ~[mariadb-java-client-2.2.6.jar!/:na]
at com.zaxxer.hikari.pool.ProxyConnection.commit(ProxyConnection.java:368) ~[HikariCP-2.7.9.jar!/:na]
at com.zaxxer.hikari.pool.HikariProxyConnection.commit(HikariProxyConnection.java) ~[HikariCP-2.7.9.jar!/:na]
at liquibase.database.jvm.JdbcConnection.commit(JdbcConnection.java:123) ~[liquibase-core-3.5.5.jar!/:na]
... 32 common frames omitted
Caused by: java.sql.SQLException: Deadlock found when trying to get lock; try restarting transaction
Query is: COMMIT
at org.mariadb.jdbc.internal.util.LogQueryTool.exceptionWithQuery(LogQueryTool.java:119) ~[mariadb-java-client-2.2.6.jar!/:na]
at org.mariadb.jdbc.internal.protocol.AbstractQueryProtocol.executeQuery(AbstractQueryProtocol.java:200) ~[mariadb-java-client-2.2.6.jar!/:na]
at org.mariadb.jdbc.MariaDbStatement.executeInternal(MariaDbStatement.java:328) ~[mariadb-java-client-2.2.6.jar!/:na]
... 37 common frames omitted

ubuntu@kub4:~$
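On the db-init failure itself: the MariaDB error ("Deadlock found when trying to get lock; try restarting transaction") surfaces here through Liquibase's StandardLockService.acquireLock, which may indicate several db-init runs hitting the same schema's changelog lock table at once. The server's own advice is to retry the aborted transaction; a generic retry-with-backoff sketch of that idea (hypothetical illustration, not the CMSO code) looks like:

```python
import random
import time

class DeadlockError(Exception):
    """Stand-in for java.sql.SQLTransactionRollbackException (MariaDB deadlock)."""

def run_with_deadlock_retry(txn, attempts=5, base_delay=0.1, sleep=time.sleep):
    """Re-run a transaction when the database aborts it with a deadlock,
    following the server's 'try restarting transaction' advice."""
    for attempt in range(attempts):
        try:
            return txn()
        except DeadlockError:
            if attempt == attempts - 1:
                raise
            # Exponential backoff with jitter so colliding clients desynchronize.
            sleep(base_delay * (2 ** attempt) * (1 + random.random()))

# Stub transaction that deadlocks twice, then commits.
state = {"tries": 0}
def flaky_txn():
    state["tries"] += 1
    if state["tries"] < 3:
        raise DeadlockError("(conn=327) Deadlock found when trying to get lock")
    return "committed"

result = run_with_deadlock_retry(flaky_txn, sleep=lambda s: None)
```

In practice the init container's restart by Kubernetes plays a similar role, so deleting the CrashLoopBackOff pod (or scaling to a single replica while db-init runs) is often enough to let one run win the lock.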




-=-=-=-=-=-=-=-=-=-=-=-
Links: You receive all messages sent to this group.

View/Reply Online (#14697): https://lists.onap.org/g/onap-discuss/message/14697
Mute This Topic: https://lists.onap.org/mt/28846240/21656
Mute #oof: https://lists.onap.org/mk?hashtag=oof&subid=2740164
Group Owner: [email protected]
Unsubscribe: https://lists.onap.org/g/onap-discuss/unsub  
[[email protected]]
-=-=-=-=-=-=-=-=-=-=-=-
