Hi OOM experts, I'm running into problems installing ONAP via OOM. I first deployed from the dublin branch with the components aaf, aai, appc, and log selected, following the "OOM Quick Start Guide" in the dublin documentation, but many pods ended up in Error, Init, or CrashLoopBackOff. I then checked out the casablanca branch and tried again; many pods are still stuck in Error, Init, or CrashLoopBackOff.
Environment:
  experiment-vm    (Kubernetes master): 8 GB RAM,  4 vCPUs,  80 GB disk, image ubuntu-18.04
  develop-for-onap (Kubernetes worker): 40 GB RAM, 16 vCPUs, 320 GB disk

helm deploy development local/onap --namespace onap -f onap/resources/environments/dev.yaml
fetching local/onap
release "development" deployed
release "development-aaf" deployed
release "development-aai" deployed
release "development-appc" deployed
release "development-log" deployed
release "development-robot" deployed

root@experiment-vm:~/oom/kubernetes# helm list
NAME                REVISION  UPDATED                         STATUS    CHART        NAMESPACE
development         1         Fri Jul 26 05:59:06 2019        DEPLOYED  onap-3.0.0   onap
development-aaf     1         Fri Jul 26 05:59:06 2019        DEPLOYED  aaf-3.0.0    onap
development-aai     1         Fri Jul 26 05:59:08 2019        DEPLOYED  aai-3.0.0    onap
development-appc    1         Fri Jul 26 05:59:14 2019        DEPLOYED  appc-3.0.0   onap
development-log     1         Fri Jul 26 05:59:18 2019        DEPLOYED  log-3.0.0    onap
development-robot   1         Fri Jul 26 05:59:22 2019        DEPLOYED  robot-3.0.0  onap

root@experiment-vm:~/oom# kubectl get pod -n onap
NAME                                                    READY   STATUS             RESTARTS   AGE
development-aaf-aaf-cm-89c5964bc-vnb9h                  0/1     Init:1/2           9          98m
development-aaf-aaf-cs-658c4d86b9-dtqkh                 1/1     Running            0          98m
development-aaf-aaf-fs-6c857fb68d-wn5pc                 0/1     Init:1/2           9          98m
development-aaf-aaf-gui-6d58dcbb5b-n9sfs                0/1     Init:1/2           9          98m
development-aaf-aaf-hello-7d84c4c6ff-8k7kv              0/1     Init:1/2           9          98m
development-aaf-aaf-locate-7c75554bb8-stsdv             0/1     Init:1/2           9          98m
development-aaf-aaf-oauth-7f4c6b6d66-8bvmq              0/1     Init:1/2           9          98m
development-aaf-aaf-service-55cd9b76f4-98tsw            0/1     CrashLoopBackOff   18         98m
development-aaf-aaf-sms-85566ddf99-jfh42                0/1     Running            21         98m
development-aaf-aaf-sms-quorumclient-0                  1/1     Running            0          98m
development-aaf-aaf-sms-quorumclient-1                  1/1     Running            0          93m
development-aaf-aaf-sms-quorumclient-2                  1/1     Running            0          76m
development-aaf-aaf-sms-vault-0                         1/2     CrashLoopBackOff   21         98m
development-aaf-aaf-sshsm-distcenter-z476n              0/1     Completed          0          98m
development-aaf-aaf-sshsm-testca-bp7hq                  0/1     Init:Error         0          88m
development-aaf-aaf-sshsm-testca-d2v8m                  0/1     Completed          0          77m
development-aaf-aaf-sshsm-testca-zlbdx                  0/1     Init:Error         0          98m
development-aai-aai-5fbd69b74b-cxkxm                    0/1     Init:0/1           9          98m
development-aai-aai-babel-5674478d8-65nmj               2/2     Running            0          98m
development-aai-aai-cassandra-0                         1/1     Running            17         98m
development-aai-aai-champ-77d94fb96d-ms9wg              1/2     Running            0          98m
development-aai-aai-data-router-55d9d6689d-929r8        1/2     CrashLoopBackOff   27         98m
development-aai-aai-elasticsearch-558fbb9c5-8m5w2       1/1     Running            0          98m
development-aai-aai-gizmo-85d57fbfb9-d4ldb              2/2     Running            0          98m
development-aai-aai-graphadmin-bc58dff78-ffcjs          0/2     Init:0/1           9          98m
development-aai-aai-graphadmin-create-db-schema-4svjc   0/1     Error              0          16m
development-aai-aai-graphadmin-create-db-schema-9t92z   0/1     Error              0          47m
development-aai-aai-graphadmin-create-db-schema-9vfgg   0/1     Error              0          10m
development-aai-aai-graphadmin-create-db-schema-bw5bb   0/1     Error              0          64m
development-aai-aai-graphadmin-create-db-schema-c74wk   0/1     Init:Error         0          98m
development-aai-aai-graphadmin-create-db-schema-dflcf   0/1     Error              0          72m
development-aai-aai-graphadmin-create-db-schema-gfg7f   0/1     Error              0          57m
development-aai-aai-graphadmin-create-db-schema-h46wm   0/1     Error              0          34m
development-aai-aai-graphadmin-create-db-schema-jbdzz   0/1     Error              0          87m
development-aai-aai-graphadmin-create-db-schema-jxr8q   0/1     Error              0          22m
development-aai-aai-graphadmin-create-db-schema-p5kqm   0/1     Error              0          68m
development-aai-aai-modelloader-6dbbc59488-prqkw        2/2     Running            0          98m
development-aai-aai-resources-58c8655d7-fd4db           0/2     Init:0/1           9          98m
development-aai-aai-search-data-5d9555d457-jhk97        2/2     Running            0          98m
development-aai-aai-sparky-be-64f4655cb5-hv2p2          0/2     Init:0/1           2          98m
development-aai-aai-spike-8675765db5-ldklj              0/2     Init:0/1           9          98m
development-aai-aai-traversal-96d68799-h2mjx            0/2     Init:0/1           9          98m
development-appc-appc-0                                 0/2     Init:0/1           9          98m
development-appc-appc-ansible-server-6f5fb5ffd4-npmk9   0/1     Init:0/1           9          98m
development-appc-appc-cdt-d4ff956dd-jq96x               1/1     Running            0          98m
development-appc-appc-db-0                              0/1     CrashLoopBackOff   23         98m
development-appc-appc-dgbuilder-5f6564c755-648jv        0/1     Init:0/1           9          98m
development-log-log-elasticsearch-744c776b6f-dwfs8      1/1     Running            0          98m
development-log-log-kibana-69d787557f-ckfn2             0/1     Running            13         98m
development-log-log-logstash-796f87c6bd-6frnq           1/1     Running            0          98m
development-log-log-logstash-796f87c6bd-8sf48           1/1     Running            0          98m
development-log-log-logstash-796f87c6bd-fz6n6           1/1     Running            0          98m
development-log-log-logstash-796f87c6bd-v4bhp           1/1     Running            0          98m
development-log-log-logstash-796f87c6bd-wlbzn           1/1     Running            0          98m
development-robot-robot-597dcb6fd9-9qjxd                1/1     Running            0          98m
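For anyone reproducing this, a convenience one-liner (not from the OOM guide) to narrow a listing like the one above down to the unhealthy pods — the STATUS column is field 3 of the default `kubectl get pods` output:

```shell
# Convenience filter: keep only pods whose STATUS is not Running or Completed.
# Against a live cluster you would run:
#   kubectl get pods -n onap --no-headers | awk '$3 !~ /Running|Completed/'
# Demonstrated here on two sample rows taken from the listing above:
unhealthy=$(printf '%s\n' \
  'development-aaf-aaf-cs-658c4d86b9-dtqkh   1/1   Running            0    98m' \
  'development-appc-appc-db-0                0/1   CrashLoopBackOff   23   98m' \
  | awk '$3 !~ /Running|Completed/')
echo "$unhealthy"
```

This also catches the `Init:*` and `Error` states, since none of them match either pattern.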

Logs from a CrashLoopBackOff pod:

root@experiment-vm:~/oom# kubectl logs development-aaf-aaf-service-55cd9b76f4-98tsw -n onap
Sleeping 0
Done
2019-07-26 07:35:12,633 WARN [init] 2019-07-26T07:35:12.627+0000 INIT [init] Loading CADI Properties from /opt/app/osaaf/etc/org.osaaf.aaf.service.props
2019-07-26 07:35:12,635 WARN [init] 2019-07-26T07:35:12.635+0000 INIT [init] Loading CADI Properties from /opt/app/osaaf/local/org.osaaf.aaf.props
2019-07-26 07:35:12,636 WARN [init] 2019-07-26T07:35:12.636+0000 INIT [init] Loading CADI Properties from /opt/app/osaaf/local/org.osaaf.aaf.location.props
2019-07-26 07:35:12,646 WARN [init] 2019-07-26T07:35:12.646+0000 INIT [init] Loading CADI Properties from /opt/app/osaaf/local/org.osaaf.aaf.cred.props
2019-07-26 07:35:12,674 WARN [init] 2019-07-26T07:35:12.674+0000 INIT [init] cadi_keyfile points to /opt/app/osaaf/local/org.osaaf.aaf.keyfile
2019-07-26 07:35:12,679 WARN [init] 2019-07-26T07:35:12.679+0000 INIT [init] Loading CADI Properties from /opt/app/osaaf/etc/org.osaaf.aaf.log4j.props
2019-07-26 07:35:12,680 WARN [init] 2019-07-26T07:35:12.680+0000 INIT [init] Loading CADI Properties from /opt/app/osaaf/local/org.osaaf.aaf.cassandra.props
2019-07-26 07:35:12,681 WARN [init] 2019-07-26T07:35:12.681+0000 INIT [init] Loading CADI Properties from /opt/app/osaaf/etc/org.osaaf.aaf.orgs.props
2019-07-26 07:35:12,682 WARN [init] 2019-07-26T07:35:12.681+0000 INIT [init] cadi_keyfile points to /opt/app/osaaf/local/org.osaaf.aaf.keyfile
2019-07-26 07:35:12,685 WARN [init] 2019-07-26T07:35:12.685+0000 INIT [init] cadi_keyfile points to /opt/app/osaaf/local/org.osaaf.aaf.keyfile
2019-07-26 07:35:12,698 WARN [init] 2019-07-26T07:35:12.698+0000 INIT [init] AAF Root NS is org.osaaf.aaf, and AAF Company Root is org.osaaf
2019-07-26 07:35:12,750 WARN [init] 2019-07-26T07:35:12.750+0000 INIT [init] Cass Port =  9042
2019-07-26 07:35:12,750 WARN [init] 2019-07-26T07:35:12.750+0000 INIT [init] Cass User =  cassandra
2019-07-26 07:35:12,751 WARN [init] 2019-07-26T07:35:12.751+0000 INIT [init] cadi_keyfile points to /opt/app/osaaf/local/org.osaaf.aaf.keyfile
2019-07-26 07:35:12,831 WARN [init] 2019-07-26T07:35:12.831+0000 INIT [init] Cass ResetExceptions =  com.datastax.driver.core.exceptions.NoHostAvailableException:"no host was tried":"Connection has been closed"
2019-07-26 07:35:12,834 WARN [init] 2019-07-26T07:35:12.834+0000 INIT [init] Service Latitude,Longitude = 38.000000,-72.000000
2019-07-26 07:35:12,834 WARN [init] 2019-07-26T07:35:12.834+0000 INIT [init] Cass Clusters =  aaf-cass.onap
java.lang.IllegalArgumentException: Failed to add contact point: aaf-cass.onap
        at com.datastax.driver.core.Cluster$Builder.addContactPoint(Cluster.java:922)
        at com.datastax.driver.core.Cluster$Builder.addContactPoints(Cluster.java:942)
        at org.onap.aaf.auth.dao.CassAccess.cluster(CassAccess.java:146)
        at org.onap.aaf.auth.service.AAF_Service.<init>(AAF_Service.java:95)
        at org.onap.aaf.auth.service.AAF_Service.main(AAF_Service.java:229)
Caused by: java.net.UnknownHostException: aaf-cass.onap: Temporary failure in name resolution
        at java.net.Inet4AddressImpl.lookupAllHostAddr(Native Method)
        at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:929)
        at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1324)
        at java.net.InetAddress.getAllByName0(InetAddress.java:1277)
        at java.net.InetAddress.getAllByName(InetAddress.java:1193)
        at java.net.InetAddress.getAllByName(InetAddress.java:1127)
        at com.datastax.driver.core.Cluster$Builder.addContactPoint(Cluster.java:919)
        ... 4 more


root@experiment-vm:~/oom# kubectl logs development-aaf-aaf-sms-vault-0 -n onap
Error from server (BadRequest): a container name must be specified for pod development-aaf-aaf-sms-vault-0, choose one of: [aaf-sms-vault aaf-sms-vault-backend]

root@experiment-vm:~/oom# kubectl logs development-aai-aai-data-router-55d9d6689d-929r8 -n onap
Error from server (BadRequest): a container name must be specified for pod development-aai-aai-data-router-55d9d6689d-929r8, choose one of: [aai-data-router filebeat-onap] or one of the init containers: [init-sysctl]
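Side note on the two BadRequest replies above: they only mean these pods run more than one container, so `kubectl logs` needs `-c`/`--container`. A sketch, with container names copied straight from the error messages:

```shell
# Requires a live cluster/kubeconfig; skip otherwise.
command -v kubectl >/dev/null 2>&1 || exit 0

# Logs from one specific container of a multi-container pod:
kubectl logs development-aaf-aaf-sms-vault-0 -n onap -c aaf-sms-vault
kubectl logs development-aai-aai-data-router-55d9d6689d-929r8 -n onap -c aai-data-router

# Init containers are addressed the same way:
kubectl logs development-aai-aai-data-router-55d9d6689d-929r8 -n onap -c init-sysctl
```
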

root@experiment-vm:~/oom# kubectl logs development-appc-appc-db-0 -n onap
+ CONTAINER_SCRIPTS_DIR=/usr/share/container-scripts/mysql
+ EXTRA_DEFAULTS_FILE=/etc/my.cnf.d/galera.cnf
+ '[' -z onap ']'
+ echo 'Galera: Finding peers'
Galera: Finding peers
++ hostname -f
++ cut -d. -f2
+ K8S_SVC_NAME=appc-dbhost
+ echo 'Using service name: appc-dbhost'
+ cp /usr/share/container-scripts/mysql/galera.cnf /etc/my.cnf.d/galera.cnf
Using service name: appc-dbhost
+ /usr/bin/peer-finder -on-start=/usr/share/container-scripts/mysql/configure-galera.sh -service=appc-dbhost
2019/07/26 07:36:41 lookup appc-dbhost on 10.96.0.10:53: read udp 172.16.1.84:53274->10.96.0.10:53: i/o timeout
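Both crash loops above share the same symptom: in-cluster DNS lookups failing (`aaf-cass.onap: Temporary failure in name resolution` in the aaf-service trace, and the `10.96.0.10:53 ... i/o timeout` from peer-finder here). So it may be worth checking CoreDNS/kube-dns health before anything component-specific; a sketch, assuming the standard `k8s-app=kube-dns` label:

```shell
# Requires a live cluster/kubeconfig; skip otherwise.
command -v kubectl >/dev/null 2>&1 || exit 0

# Is the cluster DNS deployment (the 10.96.0.10 service seen above) healthy?
kubectl get pods -n kube-system -l k8s-app=kube-dns -o wide
kubectl logs -n kube-system -l k8s-app=kube-dns --tail=50

# Can a throwaway pod resolve the names that failed?
# (busybox:1.28 is commonly used here because nslookup in newer busybox images is unreliable)
kubectl run dns-test --rm -it --restart=Never --image=busybox:1.28 -n onap \
  -- nslookup aaf-cass.onap
```
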
Logs from a pod in Init:Error status:

root@experiment-vm:~/oom# kubectl logs development-aaf-aaf-sshsm-testca-bp7hq -n onap
Error from server (BadRequest): container "aaf-sshsm-testca" in pod "development-aaf-aaf-sshsm-testca-bp7hq" is waiting to start: PodInitializing

root@experiment-vm:~/oom# kubectl logs development-aai-aai-graphadmin-create-db-schema-4svjc -n onap
Project Build Version: 1.0.1
chown: changing ownership of '/opt/app/aai-graphadmin/resources/etc/auth/aai_keystore': Read-only file system
chown: changing ownership of '/opt/app/aai-graphadmin/resources/etc/appprops/janusgraph-realtime.properties': Read-only file system
chown: changing ownership of '/opt/app/aai-graphadmin/resources/etc/appprops/aaiconfig.properties': Read-only file system
chown: changing ownership of '/opt/app/aai-graphadmin/resources/etc/appprops/janusgraph-cached.properties': Read-only file system
chown: changing ownership of '/opt/app/aai-graphadmin/resources/localhost-access-logback.xml': Read-only file system
chown: changing ownership of '/opt/app/aai-graphadmin/resources/logback.xml': Read-only file system
chown: changing ownership of '/opt/app/aai-graphadmin/resources/application.properties': Read-only file system
Fri Jul 26 07:25:05 UTC 2019    Starting /opt/app/aai-graphadmin/bin/createDBSchema.sh
---- NOTE --- about to open graph (takes a little while)--------;
Exception in thread "main" java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48)
        at org.springframework.boot.loader.Launcher.launch(Launcher.java:87)
        at org.springframework.boot.loader.Launcher.launch(Launcher.java:50)
        at org.springframework.boot.loader.PropertiesLauncher.main(PropertiesLauncher.java:595)
Caused by: java.lang.ExceptionInInitializerError
        at org.onap.aai.dbmap.AAIGraph.getInstance(AAIGraph.java:103)
        at org.onap.aai.schema.GenTester.main(GenTester.java:126)
        ... 8 more
Caused by: java.lang.RuntimeException: Failed to instantiate graphs
        at org.onap.aai.dbmap.AAIGraph.<init>(AAIGraph.java:85)
        at org.onap.aai.dbmap.AAIGraph.<init>(AAIGraph.java:57)
        at org.onap.aai.dbmap.AAIGraph$Helper.<clinit>(AAIGraph.java:90)
        ... 10 more
Caused by: java.lang.IllegalArgumentException: Could not instantiate implementation: org.janusgraph.diskstorage.cassandra.astyanax.AstyanaxStoreManager
        at org.janusgraph.util.system.ConfigurationUtil.instantiate(ConfigurationUtil.java:69)
        at org.janusgraph.diskstorage.Backend.getImplementationClass(Backend.java:477)
        at org.janusgraph.diskstorage.Backend.getStorageManager(Backend.java:409)
        at org.janusgraph.graphdb.configuration.GraphDatabaseConfiguration.<init>(GraphDatabaseConfiguration.java:1376)
        at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:164)
        at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:133)
        at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:113)
        at org.onap.aai.dbmap.AAIGraph.loadGraph(AAIGraph.java:115)
        at org.onap.aai.dbmap.AAIGraph.<init>(AAIGraph.java:82)
        ... 12 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.janusgraph.util.system.ConfigurationUtil.instantiate(ConfigurationUtil.java:58)
        ... 20 more
Caused by: org.janusgraph.diskstorage.TemporaryBackendException: Temporary failure in storage backend
        at org.janusgraph.diskstorage.cassandra.astyanax.AstyanaxStoreManager.ensureKeyspaceExists(AstyanaxStoreManager.java:619)
        at org.janusgraph.diskstorage.cassandra.astyanax.AstyanaxStoreManager.<init>(AstyanaxStoreManager.java:314)
        ... 25 more
Caused by: com.netflix.astyanax.connectionpool.exceptions.PoolTimeoutException: PoolTimeoutException: [host=development-aai-aai-cassandra-2.aai-cassandra(development-aai-aai-cassandra-2.aai-cassandra):9160, latency=30002(30002), attempts=3]Timed out waiting for connection
        at com.netflix.astyanax.connectionpool.impl.SimpleHostConnectionPool.waitForConnection(SimpleHostConnectionPool.java:231)
        at com.netflix.astyanax.connectionpool.impl.SimpleHostConnectionPool.borrowConnection(SimpleHostConnectionPool.java:198)
        at com.netflix.astyanax.connectionpool.impl.RoundRobinExecuteWithFailover.borrowConnection(RoundRobinExecuteWithFailover.java:84)
        at com.netflix.astyanax.connectionpool.impl.AbstractExecuteWithFailoverImpl.tryOperation(AbstractExecuteWithFailoverImpl.java:117)
        at com.netflix.astyanax.connectionpool.impl.AbstractHostPartitionConnectionPool.executeWithFailover(AbstractHostPartitionConnectionPool.java:352)
        at com.netflix.astyanax.thrift.ThriftClusterImpl.executeSchemaChangeOperation(ThriftClusterImpl.java:146)
        at com.netflix.astyanax.thrift.ThriftClusterImpl.internalCreateKeyspace(ThriftClusterImpl.java:321)
        at com.netflix.astyanax.thrift.ThriftClusterImpl.addKeyspace(ThriftClusterImpl.java:294)
        at org.janusgraph.diskstorage.cassandra.astyanax.AstyanaxStoreManager.ensureKeyspaceExists(AstyanaxStoreManager.java:614)
        ... 26 more
Failed to run the tool /opt/app/aai-graphadmin/bin/createDBSchema.sh successfully
Failed to run the createDBSchema.sh
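One detail in that trace seems worth noting: the connection pool times out against `development-aai-aai-cassandra-2.aai-cassandra:9160`, while the pod listing above only shows `development-aai-aai-cassandra-0` (itself restarted 17 times). So before rerunning the job it may help to confirm that all expected Cassandra replicas exist and are ready; a sketch (the grep patterns are guesses, adjust to whatever names your cluster actually shows):

```shell
# Requires a live cluster/kubeconfig; skip otherwise.
command -v kubectl >/dev/null 2>&1 || exit 0

# Are all expected Cassandra replicas present and ready?
kubectl get statefulset -n onap | grep -i cassandra
kubectl get pods -n onap -o wide | grep aai-cassandra

# Ring status from the replica that does exist
# (nodetool is assumed to be available inside the Cassandra image):
kubectl exec -n onap development-aai-aai-cassandra-0 -- nodetool status
```
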

root@experiment-vm:~/oom# kubectl describe pod development-appc-appc-db-0 -n onap
Name:           development-appc-appc-db-0
Namespace:      onap
Priority:       0
Node:           develop-for-onap/192.168.0.13
Start Time:     Fri, 26 Jul 2019 05:59:16 +0000
Labels:         app=development-appc-appc-db
                controller-revision-hash=development-appc-appc-db-659d74df8f
                statefulset.kubernetes.io/pod-name=development-appc-appc-db-0
Annotations:    cni.projectcalico.org/podIP: 172.16.1.84/32
                pod.alpha.kubernetes.io/initialized: true
Status:         Running
IP:             172.16.1.84
Controlled By:  StatefulSet/development-appc-appc-db
Init Containers:
  mariadb-galera-prepare:
    Container ID:  docker://5de3d72dae00fc752011b47380a37e645448f667ca0c77460e2d7d47ac20727c
    Image:         nexus3.onap.org:10001/busybox
    Image ID:      docker-pullable://busybox@sha256:9f1003c480699be56815db0f8146ad2e22efea85129b5b5983d0e0fb52d9ab70
    Port:          <none>
    Host Port:     <none>
    Command:
      sh
      -c
      chown -R 27:27 /var/lib/mysql
    State:          Terminated
      Reason:       Completed
      Exit Code:    0
      Started:      Fri, 26 Jul 2019 06:18:46 +0000
      Finished:     Fri, 26 Jul 2019 06:18:46 +0000
    Ready:          True
    Restart Count:  0
    Environment:    <none>
    Mounts:
      /var/lib/mysql from development-appc-appc-db-data (rw)
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-gkxcr (ro)
Containers:
  appc-db:
    Container ID:   docker://c538aa0ac0a0fca0eb79144db6fe6bc5bef900729909b5393fc94501801c4525
    Image:          nexus3.onap.org:10001/adfinissygroup/k8s-mariadb-galera-centos:v002
    Image ID:       docker-pullable://nexus3.onap.org:10001/adfinissygroup/k8s-mariadb-galera-centos@sha256:fbcb842f30065ae94532cb1af9bb03cc6e2acaaf896d87d0ec38da7dd09a3dde
    Ports:          3306/TCP, 4444/TCP, 4567/TCP, 4568/TCP
    Host Ports:     0/TCP, 0/TCP, 0/TCP, 0/TCP
    State:          Waiting
      Reason:       CrashLoopBackOff
    Last State:     Terminated
      Reason:       Error
      Exit Code:    137
      Started:      Fri, 26 Jul 2019 07:43:41 +0000
      Finished:     Fri, 26 Jul 2019 07:45:01 +0000
    Ready:          False
    Restart Count:  25
    Liveness:       exec [mysqladmin ping] delay=30s timeout=5s period=10s #success=1 #failure=3
    Readiness:      exec [/usr/share/container-scripts/mysql/readiness-probe.sh] delay=15s timeout=1s period=10s #success=1 #failure=3
    Environment:
      POD_NAMESPACE:        onap (v1:metadata.namespace)
      MYSQL_USER:           my-user
      MYSQL_PASSWORD:       <set to the key 'user-password' in secret 'development-appc-appc-db'>  Optional: false
      MYSQL_DATABASE:       my-database
      MYSQL_ROOT_PASSWORD:  <set to the key 'db-root-password' in secret 'development-appc-appc-db'>  Optional: false
    Mounts:
      /etc/localtime from localtime (ro)
      /var/lib/mysql from development-appc-appc-db-data (rw)
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-gkxcr (ro)
Conditions:
  Type              Status
  Initialized       True
  Ready             False
  ContainersReady   False
  PodScheduled      True
Volumes:
  development-appc-appc-db-data:
    Type:       PersistentVolumeClaim (a reference to a PersistentVolumeClaim in the same namespace)
    ClaimName:  development-appc-appc-db-data-development-appc-appc-db-0
    ReadOnly:   false
  localtime:
    Type:          HostPath (bare host directory volume)
    Path:          /etc/localtime
    HostPathType:
  default-token-gkxcr:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  default-token-gkxcr
    Optional:    false
QoS Class:       BestEffort
Node-Selectors:  <none>
Tolerations:     node.kubernetes.io/not-ready:NoExecute for 300s
                 node.kubernetes.io/unreachable:NoExecute for 300s
Events:
  Type     Reason     Age                   From                       Message
  ----     ------     ----                  ----                       -------
  Warning  BackOff    9m1s (x226 over 78m)  kubelet, develop-for-onap  Back-off restarting failed container
  Warning  Unhealthy  4m (x185 over 88m)    kubelet, develop-for-onap  Readiness probe failed: ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/lib/mysql/mysql.sock' (2 "No such file or directory")

root@experiment-vm:~/oom# kubectl describe pod development-aaf-aaf-service-55cd9b76f4-98tsw -n onap
Name:           development-aaf-aaf-service-55cd9b76f4-98tsw
Namespace:      onap
Priority:       0
Node:           develop-for-onap/192.168.0.13
Start Time:     Fri, 26 Jul 2019 05:59:07 +0000
Labels:         app=aaf-service
                pod-template-hash=55cd9b76f4
                release=development-aaf
Annotations:    cni.projectcalico.org/podIP: 172.16.1.60/32
Status:         Running
IP:             172.16.1.60
Controlled By:  ReplicaSet/development-aaf-aaf-service-55cd9b76f4
Init Containers:
  aaf-service-config-container:
    Container ID:   docker://8b305c72faca84a1379c21fe9b943b3277e7d32ee30fd11d8fc4ab8b5d5cd5aa
    Image:          nexus3.onap.org:10001/onap/aaf/aaf_config:2.1.8
    Image ID:       docker-pullable://nexus3.onap.org:10001/onap/aaf/aaf_config@sha256:900671567f107fed9c9f595c41ad51fe44eaea72f341cd9e39245bdfcbf94bee
    Port:           <none>
    Host Port:      <none>
    State:          Terminated
      Reason:       Completed
      Exit Code:    0
      Started:      Fri, 26 Jul 2019 06:08:30 +0000
      Finished:     Fri, 26 Jul 2019 06:08:32 +0000
    Ready:          True
    Restart Count:  0
    Environment:
      HOSTNAME:         aaf.onap
      AAF_ENV:          DEV
      AAF_REGISTER_AS:  aaf-service.onap
      LATITUDE:         38.0
      LONGITUDE:        -72.0
      CASS_HOST:        aaf-cass.onap
      AAF_LOCATOR_AS:   aaf-locate.onap
    Mounts:
      /opt/app/osaaf from aaf-service-config-vol (rw)
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-gkxcr (ro)
  aaf-service-readiness:
    Container ID:  docker://353fbf8b8becff511f8bd7ebf96b23b2a6bcd131bc722f49f5f742981fe5991a
    Image:         oomk8s/readiness-check:2.0.0
    Image ID:      docker-pullable://oomk8s/readiness-check@sha256:7daa08b81954360a1111d03364febcb3dcfeb723bcc12ce3eb3ed3e53f2323ed
    Port:          <none>
    Host Port:     <none>
    Command:
      /root/ready.py
    Args:
      --container-name
      aaf-cs
    State:          Terminated
      Reason:       Completed
      Exit Code:    0
      Started:      Fri, 26 Jul 2019 06:08:33 +0000
      Finished:     Fri, 26 Jul 2019 06:08:34 +0000
    Ready:          True
    Restart Count:  0
    Environment:
      NAMESPACE:  onap (v1:metadata.namespace)
    Mounts:
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-gkxcr (ro)
Containers:
  aaf-service:
    Container ID:  docker://6b4ada5cf83a1f1538795c75c477df13f2a315509a4417aa7d4dea8d2113d20f
    Image:         nexus3.onap.org:10001/onap/aaf/aaf_service:2.1.8
    Image ID:      docker-pullable://nexus3.onap.org:10001/onap/aaf/aaf_service@sha256:24a91ad2fb700518a9e72e53093a9eee1580ca5cc2cd876addf23a9779d81110
    Port:          <none>
    Host Port:     <none>
    Command:
      /bin/bash
      /opt/app/aaf/pod/pod_wait.sh
      aaf_service
      sleep
      0
      cd /opt/app/aaf;bin/service
    State:          Waiting
      Reason:       CrashLoopBackOff
    Last State:     Terminated
      Reason:       Completed
      Exit Code:    0
      Started:      Fri, 26 Jul 2019 07:46:10 +0000
      Finished:     Fri, 26 Jul 2019 07:46:31 +0000
    Ready:          False
    Restart Count:  20
    Liveness:       tcp-socket :8100 delay=300s timeout=1s period=10s #success=1 #failure=3
    Readiness:      tcp-socket :8100 delay=30s timeout=1s period=10s #success=1 #failure=3
    Environment:    <none>
    Mounts:
      /etc/localtime from localtime (ro)
      /opt/app/osaaf from aaf-service-config-vol (rw)
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-gkxcr (ro)
Conditions:
  Type              Status
  Initialized       True
  Ready             False
  ContainersReady   False
  PodScheduled      True
Volumes:
  localtime:
    Type:          HostPath (bare host directory volume)
    Path:          /etc/localtime
    HostPathType:
  aaf-service-config-vol:
    Type:       EmptyDir (a temporary directory that shares a pod's lifetime)
    Medium:
    SizeLimit:  <unset>
  default-token-gkxcr:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  default-token-gkxcr
    Optional:    false
QoS Class:       BestEffort
Node-Selectors:  <none>
Tolerations:     node.kubernetes.io/not-ready:NoExecute for 300s
                 node.kubernetes.io/unreachable:NoExecute for 300s
Events:
  Type     Reason   Age                  From                       Message
  ----     ------   ----                 ----                       -------
  Normal   Pulled   30m (x15 over 87m)   kubelet, develop-for-onap  Container image "nexus3.onap.org:10001/onap/aaf/aaf_service:2.1.8" already present on machine
  Warning  BackOff  24s (x381 over 87m)  kubelet, develop-for-onap  Back-off restarting failed container

root@experiment-vm:~/oom# kubectl describe pod development-aaf-aaf-sms-85566ddf99-jfh42 -n onap
Name:           development-aaf-aaf-sms-85566ddf99-jfh42
Namespace:      onap
Priority:       0
Node:           develop-for-onap/192.168.0.13
Start Time:     Fri, 26 Jul 2019 05:59:07 +0000
Labels:         app=aaf-sms
                pod-template-hash=85566ddf99
                release=development-aaf
Annotations:    cni.projectcalico.org/podIP: 172.16.1.64/32
Status:         Running
IP:             172.16.1.64
Controlled By:  ReplicaSet/development-aaf-aaf-sms-85566ddf99
Init Containers:
  aaf-sms-readiness:
    Container ID:  docker://9c9f80f420eb936d45f0112111b0314551ec48718d4ed4c92fba5f6d92d9047c
    Image:         oomk8s/readiness-check:2.0.0
    Image ID:      docker-pullable://oomk8s/readiness-check@sha256:7daa08b81954360a1111d03364febcb3dcfeb723bcc12ce3eb3ed3e53f2323ed
    Port:          <none>
    Host Port:     <none>
    Command:
      /root/ready.py
    Args:
      --container-name
      aaf-sms-vault
      --container-name
      aaf-sms-vault-backend
    State:          Terminated
      Reason:       Completed
      Exit Code:    0
      Started:      Fri, 26 Jul 2019 05:59:23 +0000
      Finished:     Fri, 26 Jul 2019 05:59:44 +0000
    Ready:          True
    Restart Count:  0
    Environment:
      NAMESPACE:  onap (v1:metadata.namespace)
    Mounts:
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-gkxcr (ro)
Containers:
  aaf-sms:
    Container ID:  docker://7481866fc5d665e409dd7994b9285452056923d21d1719a04b2fce7608615a91
    Image:         nexus3.onap.org:10001/onap/aaf/sms:3.0.1
    Image ID:      docker-pullable://nexus3.onap.org:10001/onap/aaf/sms@sha256:d5b64947edb93848acacaa9820234aa29e58217db9f878886b7bafae00fdb436
    Port:          10443/TCP
    Host Port:     0/TCP
    Command:
      /sms/bin/sms
    State:          Waiting
      Reason:       CrashLoopBackOff
    Last State:     Terminated
      Reason:       Error
      Exit Code:    2
      Started:      Fri, 26 Jul 2019 07:44:11 +0000
      Finished:     Fri, 26 Jul 2019 07:45:39 +0000
    Ready:          False
    Restart Count:  23
    Liveness:       http-get https://:10443/v1/sms/quorum/status delay=10s timeout=1s period=30s #success=1 #failure=3
    Readiness:      http-get https://:10443/v1/sms/quorum/status delay=10s timeout=1s period=30s #success=1 #failure=3
    Environment:    <none>
    Mounts:
      /etc/localtime from localtime (ro)
      /sms/auth from development-aaf-aaf-sms-auth (rw)
      /sms/smsconfig.json from aaf-sms (rw,path="smsconfig.json")
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-gkxcr (ro)
Conditions:
  Type              Status
  Initialized       True
  Ready             False
  ContainersReady   False
  PodScheduled      True
Volumes:
  localtime:
    Type:          HostPath (bare host directory volume)
    Path:          /etc/localtime
    HostPathType:
  aaf-sms:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      development-aaf-aaf-sms
    Optional:  false
  development-aaf-aaf-sms-auth:
    Type:       PersistentVolumeClaim (a reference to a PersistentVolumeClaim in the same namespace)
    ClaimName:  development-aaf-aaf-sms
    ReadOnly:   false
  default-token-gkxcr:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  default-token-gkxcr
    Optional:    false
QoS Class:       BestEffort
Node-Selectors:  <none>
Tolerations:     node.kubernetes.io/not-ready:NoExecute for 300s
                 node.kubernetes.io/unreachable:NoExecute for 300s
Events:
  Type     Reason     Age                    From                       Message
  ----     ------     ----                   ----                       -------
  Warning  Unhealthy  13m (x62 over 88m)     kubelet, develop-for-onap  Readiness probe failed: Get https://172.16.1.64:10443/v1/sms/quorum/status: dial tcp 172.16.1.64:10443: connect: connection refused
  Warning  BackOff    3m44s (x210 over 71m)  kubelet, develop-for-onap  Back-off restarting failed container
root@experiment-vm:~/oom# kubectl describe pod development-aai-aai-graphadmin-create-db-schema-9t92z -n onap
Name:           development-aai-aai-graphadmin-create-db-schema-9t92z
Namespace:      onap
Priority:       0
Node:           develop-for-onap/192.168.0.13
Start Time:     Fri, 26 Jul 2019 06:50:02 +0000
Labels:         app=aai-graphadmin-job
                controller-uid=61391fff-d38c-4005-b20f-0c1ef284d4d7
                job-name=development-aai-aai-graphadmin-create-db-schema
                release=development-aai
Annotations:    checksum/config: 7bab071acc4462a150ee1b669121edd29cd762ccb5cea80a9edaaff42fd872ac
                cni.projectcalico.org/podIP: 172.16.1.103/32
Status:         Failed
IP:             172.16.1.103
Controlled By:  Job/development-aai-aai-graphadmin-create-db-schema
Init Containers:
  aai-graphadmin-readiness:
    Container ID:  docker://7bcddb7af2a13273a44cecca5d698c16effbb6c2d7d0f97519aada8d463a3ba8
    Image:         oomk8s/readiness-check:2.0.0
    Image ID:      docker-pullable://oomk8s/readiness-check@sha256:7daa08b81954360a1111d03364febcb3dcfeb723bcc12ce3eb3ed3e53f2323ed
    Port:          <none>
    Host Port:     <none>
    Command:
      /root/ready.py
    Args:
      --container-name
      aai-cassandra
    State:          Terminated
      Reason:       Completed
      Exit Code:    0
      Started:      Fri, 26 Jul 2019 06:50:04 +0000
      Finished:     Fri, 26 Jul 2019 06:54:45 +0000
    Ready:          True
    Restart Count:  0
    Environment:
      NAMESPACE:  onap (v1:metadata.namespace)
    Mounts:
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-gkxcr (ro)
Containers:
  aai-graphadmin-job:
    Container ID:  docker://8fcb53ebe5c232a06968ba17f98f06de0e7d783a305cca1a58a8bdc48562cc03
    Image:         nexus3.onap.org:10001/onap/aai-graphadmin:1.0.4
    Image ID:      docker-pullable://nexus3.onap.org:10001/onap/aai-graphadmin@sha256:0be05f5b36d4ec7b5410e965bd8f7655e4be49d15804e67d7fe49718d43bca53
    Port:          <none>
    Host Port:     <none>
    Command:
      /bin/bash
      docker-entrypoint.sh
      createDBSchema.sh
    State:          Terminated
      Reason:       Error
      Exit Code:    1
      Started:      Fri, 26 Jul 2019 06:54:47 +0000
      Finished:     Fri, 26 Jul 2019 06:57:07 +0000
    Ready:          False
    Restart Count:  0
    Environment:
      LOCAL_USER_ID:   1000
      LOCAL_GROUP_ID:  1000
    Mounts:
      /etc/localtime from localtime (ro)
      /opt/aai/logroot/AAI-GA from development-aai-aai-graphadmin-logs (rw)
      /opt/app/aai-graphadmin/resources/application.properties from development-aai-aai-graphadmin-springapp-conf (rw,path="application.properties")
      /opt/app/aai-graphadmin/resources/etc/appprops/aaiconfig.properties from development-aai-aai-graphadmin-aaiconfig-conf (rw,path="aaiconfig.properties")
      /opt/app/aai-graphadmin/resources/etc/appprops/janusgraph-cached.properties from development-aai-aai-graphadmin-db-cached-conf (rw,path="janusgraph-cached.properties")
      /opt/app/aai-graphadmin/resources/etc/appprops/janusgraph-realtime.properties from development-aai-aai-graphadmin-db-real-conf (rw,path="janusgraph-realtime.properties")
      /opt/app/aai-graphadmin/resources/etc/auth/aai_keystore from development-aai-aai-graphadmin-auth-truststore-sec (rw,path="aai_keystore")
      /opt/app/aai-graphadmin/resources/localhost-access-logback.xml from development-aai-aai-graphadmin-localhost-access-log-conf (rw,path="localhost-access-logback.xml")
      /opt/app/aai-graphadmin/resources/logback.xml from development-aai-aai-graphadmin-log-conf (rw,path="logback.xml")
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-gkxcr (ro)
Conditions:
  Type              Status
  Initialized       True
  Ready             False
  ContainersReady   False
  PodScheduled      True
Volumes:
  localtime:
    Type:          HostPath (bare host directory volume)
    Path:          /etc/localtime
    HostPathType:
  filebeat-conf:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      aai-filebeat
    Optional:  false
  development-aai-aai-graphadmin-logs:
    Type:          HostPath (bare host directory volume)
    Path:          /dockerdata-nfs/development-aai/aai/aai-graphadmin-create-db-schema
    HostPathType:
  development-aai-aai-graphadmin-filebeat:
    Type:       EmptyDir (a temporary directory that shares a pod's lifetime)
    Medium:
    SizeLimit:  <unset>
  development-aai-aai-graphadmin-log-conf:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      development-aai-aai-graphadmin-log
    Optional:  false
  development-aai-aai-graphadmin-localhost-access-log-conf:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      development-aai-aai-graphadmin-localhost-access-log-configmap
    Optional:  false
  development-aai-aai-graphadmin-db-real-conf:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      development-aai-aai-graphadmin-db-real-configmap
    Optional:  false
  development-aai-aai-graphadmin-db-cached-conf:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      development-aai-aai-graphadmin-db-cached-configmap
    Optional:  false
  development-aai-aai-graphadmin-aaiconfig-conf:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      development-aai-aai-graphadmin-aaiconfig-configmap
    Optional:  false
  development-aai-aai-graphadmin-springapp-conf:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      development-aai-aai-graphadmin-springapp-configmap
    Optional:  false
  development-aai-aai-graphadmin-realm-conf:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      development-aai-aai-graphadmin-realm-configmap
    Optional:  false
  development-aai-aai-graphadmin-auth-truststore-sec:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  aai-auth-truststore-secret
    Optional:    false
  default-token-gkxcr:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  default-token-gkxcr
    Optional:    false
QoS Class:       BestEffort
Node-Selectors:  <none>
Tolerations:     node.kubernetes.io/not-ready:NoExecute for 300s
                 node.kubernetes.io/unreachable:NoExecute for 300s
Events:
  Type    Reason   Age   From                       Message
  ----    ------   ----  ----                       -------
  Normal  Pulled   58m   kubelet, develop-for-onap  Container image "nexus3.onap.org:10001/onap/aai-graphadmin:1.0.4" already present on machine
  Normal  Created  58m   kubelet, develop-for-onap  Created container aai-graphadmin-job
  Normal  Started  58m   kubelet, develop-for-onap  Started container aai-graphadmin-job
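Note that the readiness init container completed (aai-cassandra was reported up), but the job container itself then ran `createDBSchema.sh` for about two minutes and exited with code 1, so the schema creation failed against Cassandra/JanusGraph. The actual stack trace should be in the pod log and in the HostPath log directory shown above; this is how I would pull it (pod name and path are taken from the describe output above):

```
# Stdout/stderr of the failed job container:
kubectl logs development-aai-aai-graphadmin-create-db-schema-9t92z -n onap

# On the worker node develop-for-onap, the mounted log directory:
ls /dockerdata-nfs/development-aai/aai/aai-graphadmin-create-db-schema
```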

     



View/Reply Online (#18314): https://lists.onap.org/g/onap-discuss/message/18314