Hi Jimmy,

Thanks for the reply.

As you suggested, we took AAI/OOM from the master branch. With this, 
"aai-aai-resources" is in a crash loop and "aai-aai-graphadmin-create-db-schema" 
is in an init-error state.



We also checked the md5 hash of "org.onap.aai.p12" (shown below); it matches 
the hash you gave in the comment above.

root@demoelalto-nfs:~/oom/kubernetes/aai/components/aai-resources/resources/config/aaf#
 md5sum org.onap.aai.p12
67ae173d38ba718ea41d69a8264458c6  org.onap.aai.p12
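A matching md5sum only shows the file bytes are intact; it does not confirm that the password configured in org.onap.aai.props can actually open the store (the "Mac verify error: invalid password?" lines in the attached container log are exactly that failure). A sketch of the check, using a throwaway keystore so it is self-contained; in the real deployment only the final `openssl pkcs12 -in ... -noout` command would be run, against org.onap.aai.p12 with the configured password:

```shell
# Build a throwaway p12 for illustration; in practice, run only the last
# command against org.onap.aai.p12 with the password from org.onap.aai.props.
openssl req -x509 -newkey rsa:2048 -nodes -keyout demo.key -out demo.crt \
  -subj "/CN=demo" -days 1 2>/dev/null
openssl pkcs12 -export -inkey demo.key -in demo.crt -out demo.p12 \
  -passout pass:demo
# Exit status 0 means the MAC verified; a wrong password prints
# "Mac verify error: invalid password?" and exits non-zero.
openssl pkcs12 -in demo.p12 -noout -passin pass:demo && echo "keystore opens"
```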

Please find attached the "aai_logs" file, containing the health check output, 
pod logs, container logs, etc.

Note: only AAI (aai/oom) was taken from the master branch; all other 
components are from the El Alto branch repos.


@FREEMAN, BRIAN D: since the aai-resources pod is in a crash loop we are unable 
to exec into it, but we did check the aaf-locate endpoint; the output is below.

url: 
https://aaf-locate.onap:8095/locate/onap.org.osaaf.aaf.service:2.1

{
  "endpoint": [{
    "name": "onap.org.osaaf.aaf.service",
    "major": 2,
    "minor": 1,
    "patch": 0,
    "pkg": 0,
    "latitude": 38.0,
    "longitude": -72.0,
    "protocol": "https",
    "subprotocol": ["TLSv1.1", "TLSv1.2"],
    "hostname": "aaf-service.onap",
    "port": 8100
  }]
}
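For reference, the lookup can be reproduced from any pod with cluster DNS. A sketch; the `curl` line is shown as a comment because it only resolves inside the cluster, and the `sed` extraction assumes the response shape shown above:

```shell
# From inside the cluster, this is the query that produced the JSON above:
#   curl -sk https://aaf-locate.onap:8095/locate/onap.org.osaaf.aaf.service:2.1
# Extracting the advertised target host/port from the response without jq:
response='{"endpoint": [{"hostname": "aaf-service.onap", "port": 8100}]}'
host=$(printf '%s' "$response" | sed -n 's/.*"hostname": *"\([^"]*\)".*/\1/p')
port=$(printf '%s' "$response" | sed -n 's/.*"port": *\([0-9]*\).*/\1/p')
echo "$host:$port"   # clients are handed off to this address by the locator
```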


Regards,

________________________________________________________

Velugubantla Praveen

Engineer - CTO-Common

L&T TECHNOLOGY SERVICES LIMITED

L3 Building, Manyata Embassy Business Park,
Nagawara Hobli, Bengaluru-560045

________________________________________________________

Mobile: +91 9154111420


www.LTTS.com


________________________________
From: [email protected] <[email protected]> on behalf of 
Jimmy Forsyth via lists.onap.org <[email protected]>
Sent: Monday, April 20, 2020 7:05 PM
To: FREEMAN, BRIAN D <[email protected]>; Velugubantla Praveen 
<[email protected]>; [email protected] 
<[email protected]>; [email protected] <[email protected]>; 
FRANEY, JOHN J <[email protected]>; Vivekanandan Muthukrishnan 
<[email protected]>; [email protected] 
<[email protected]>; [email protected] 
<[email protected]>; 陈庄洋 <[email protected]>
Cc: JOMY JOSE <[email protected]>; Devangam Manjunatha 
<[email protected]>; Sudarshan K.S <[email protected]>
Subject: Re: [onap-discuss] AAF Authentication Error #aai [403-Access Denied]


Also, please see my comment on 
https://jira.onap.org/browse/AAI-2872 to verify your deployment.



Thanks,

jimmy



From: "FREEMAN, BRIAN D" <[email protected]>
Date: Monday, April 20, 2020 at 9:23 AM
To: "[email protected]" <[email protected]>, 
"[email protected]" <[email protected]>, 
"[email protected]" <[email protected]>, "FRANEY, JOHN J" 
<[email protected]>, "FORSYTH, JAMES" <[email protected]>, Vivekanandan Muthukrishnan 
<[email protected]>, "[email protected]" 
<[email protected]>, "[email protected]" 
<[email protected]>, 陈庄洋 <[email protected]>
Cc: JOMY JOSE <[email protected]>, Devangam Manjunatha 
<[email protected]>, "Sudarshan K.S" <[email protected]>
Subject: RE: [onap-discuss] AAF Authentication Error #aai [403-Access Denied]



"Certificate unknown" seems to imply that the certificate patch did not result 
in the right certificate being in the Java configuration on the AAI pod, so the 
HTTPS connection to AAF for authentication is now failing the certificate check.



If you log into the aai-resources pod, what certificate does it see, and does 
that cert chain match the cert presented by 
https://aaf-locate.onap:8095/locate/onap.org.osaaf.aaf.service:2.1?
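A sketch of that comparison (the truststore name comes from the aai-resources pod mounts; the storepass and the fingerprint values below are placeholders, not the real values):

```shell
# What the server presents (run from a pod that can reach AAF):
#   echo | openssl s_client -connect aaf-locate.onap:8095 2>/dev/null \
#     | openssl x509 -noout -fingerprint -sha256
# What the client trusts (truststore file per the OOM chart mounts):
#   keytool -list -keystore truststoreONAPall.jks -storepass <storepass> \
#     | grep -i fingerprint
# The handshake succeeds only when the server's chain leads to a CA in the
# truststore; modelled here as a string comparison with placeholder values.
server_fp="AA:BB:CC"
trusted_fp="AA:BB:CC"
[ "$server_fp" = "$trusted_fp" ] && echo "trusted" || echo "certificate unknown"
```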



Brian







From: [email protected] <[email protected]>
Sent: Saturday, April 18, 2020 10:41 AM
To: [email protected]; [email protected]; FRANEY, JOHN J 
<[email protected]>; FREEMAN, BRIAN D <[email protected]>; FORSYTH, JAMES 
<[email protected]>; Vivekanandan Muthukrishnan 
<[email protected]>; [email protected]; 
[email protected]; 陈庄洋 <[email protected]>
Cc: JOMY JOSE <[email protected]>; Devangam Manjunatha 
<[email protected]>; Sudarshan K.S <[email protected]>
Subject: Re: [onap-discuss] AAF Authentication Error #aai [403-Access Denied]





Hi Team,



A few days back we deployed ONAP El Alto with the latest certificate patches 
for AAI, DMAAP, and DCAE. Since then we have been unable to run the robot demo 
init or perform the network preload (both fail with 403 ACCESS DENIED). I also 
posted this issue to the community. After reading Hari's mail, I checked the 
aai-resources pod and found we are hitting the same issue. The error log is 
attached; please help us resolve this.





Regards,

________________________________________________________

Velugubantla Praveen

Engineer - CTO-Common

L&T TECHNOLOGY SERVICES LIMITED

L3 Building, Manyata Embassy Business Park,
Nagawara Hobli, Bengaluru-560045

________________________________________________________

Mobile: +91 9154111420



www.LTTS.com

[cid:[email protected]]



________________________________

From: [email protected] <[email protected]> on behalf of 
FRANEY, JOHN J via lists.onap.org <[email protected]>
Sent: Saturday, April 18, 2020 4:47 AM
To: [email protected] <[email protected]>; 
[email protected] <[email protected]>
Subject: Re: [onap-discuss] AAF Authentication Error #aai



The error means the server side TLS certificate is not trusted by your client.



Trust is established from the truststore.  The truststore must contain the root 
ca certificate that signed the server side certificate.



This is a configuration error.  Maybe you missed a step during installation.  
Your JVM may not be pointing to the right truststore.
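Both points can be checked from inside the pod. A sketch; the truststore path matches the aai-resources mounts shown in the logs below, while `<storepass>` is a placeholder and the alias names in the toy model are hypothetical:

```shell
# 1) What does the truststore actually contain? The root CA that signed the
#    AAF server certificate must be among these entries:
#   keytool -list \
#     -keystore /opt/app/aai-resources/resources/aaf/truststoreONAPall.jks \
#     -storepass <storepass>
# 2) Which truststore is the JVM actually using? Check for an explicit
#    -Djavax.net.ssl.trustStore on the java command line; if absent, the
#    JRE's default cacerts is in effect:
#   ps -ef | grep javax.net.ssl.trustStore
# Toy model of the trust decision: the server chain's root issuer must
# match an entry in the truststore, or the handshake fails.
truststore_cas="AAF_RootCA intermediateCA_9"   # hypothetical alias list
server_issuer="AAF_RootCA"
case " $truststore_cas " in
  *" $server_issuer "*) echo "handshake can succeed" ;;
  *)                    echo "not trusted: handshake fails" ;;
esac
```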





John





-------- Original message --------

From: "hariharan.38 via lists.onap.org" <[email protected]>

Date: 4/17/20 7:03 PM (GMT-05:00)

To: [email protected]

Subject: [onap-discuss] AAF Authentication Error #aai



Hi All,
Currently, when I send a curl request to AAI, it fails with a 403 Forbidden 
error. When I checked the console logs of the aai-resources pod, I saw the 
errors shown in the attached jpg. Even though my AAF pods are running fine, the 
request still fails with 403. I am currently using the Dublin release. Can 
anyone please help me with this issue?

Regards,
Hari

L&T Technology Services Ltd

www.LTTS.com

L&T Technology Services Limited (LTTS) is committed to safeguard your data 
privacy. For more information to view our commitment towards data privacy under 
GDPR, please visit the privacy policy on our website www.Ltts.com. This Email 
may contain confidential or privileged information for the intended recipient 
(s). If you are not the intended recipient, please do not use or disseminate 
the information, notify the sender and delete it from your system.




-=-=-=-=-=-=-=-=-=-=-=-
Links: You receive all messages sent to this group.

View/Reply Online (#20807): https://lists.onap.org/g/onap-discuss/message/20807
Mute This Topic: https://lists.onap.org/mt/73108068/21656
Mute #aai: https://lists.onap.org/mk?hashtag=aai&subid=2740164
Group Owner: [email protected]
Unsubscribe: https://lists.onap.org/g/onap-discuss/unsub  
[[email protected]]
-=-=-=-=-=-=-=-=-=-=-=-

root@demoelalto-nfs:~/oom/kubernetes/robot# ./ete-k8s.sh onap health
++ export NAMESPACE=onap
++ NAMESPACE=onap
+++ kubectl --namespace onap get pods
+++ sed 's/ .*//'
+++ grep robot
++ POD=demo-robot-robot-59d7c9dc45-qmrxc
++ TAGS='-i health'
+++ dirname ./ete-k8s.sh
++ DIR=.
++ SCRIPTDIR=scripts/etescript
++ ETEHOME=/var/opt/ONAP
++ [[ health == \e\x\e\c\s\c\r\i\p\t ]]
+++ kubectl --namespace onap exec demo-robot-robot-59d7c9dc45-qmrxc -- bash -c 
'ls -1q /share/logs/ | wc -l'
++ export GLOBAL_BUILD_NUMBER=3
++ GLOBAL_BUILD_NUMBER=3
+++ printf %04d 3
++ OUTPUT_FOLDER=0003_ete_health
++ DISPLAY_NUM=93
++ VARIABLEFILES='-V /share/config/robot_properties.py'
++ VARIABLES='-v GLOBAL_BUILD_NUMBER:4584'
++ kubectl --namespace onap exec demo-robot-robot-59d7c9dc45-qmrxc -- 
/var/opt/ONAP/runTags.sh -V /share/config/robot_properties.py -v 
GLOBAL_BUILD_NUMBER:4584 -d /share/logs/0003_ete_health -i health --display 93
Starting Xvfb on display :93 with res 1280x1024x24
Executing robot tests at log level TRACE
==============================================================================
Testsuites                                                                    
==============================================================================
Testsuites.Health-Check :: Test that ONAP components are available via basi...
==============================================================================
Basic A&AI Health Check                                               | FAIL |
Test timeout 20 seconds exceeded.
------------------------------------------------------------------------------
Basic AAF Health Check                                                | PASS |
------------------------------------------------------------------------------
Basic AAF SMS Health Check                                            | PASS |
------------------------------------------------------------------------------



=====================================================================================================

root@demoelalto-nfs:/home/ubuntu# kubectl get pods -o wide | grep aai
demo-aai-aai-5cb69c65dc-4c52h                    0/1   Init:0/1           14   142m   10.42.6.65    demoelalto-k8s-02   <none>   <none>
demo-aai-aai-babel-7dd487684f-xs6cv              2/2   Running            0    142m   10.42.9.32    demoelalto-k8s-03   <none>   <none>
demo-aai-aai-data-router-69b494d694-xnp7t        2/2   Running            0    142m   10.42.8.44    demoelalto-k8s-05   <none>   <none>
demo-aai-aai-elasticsearch-7f5d9986b7-q7zgj      1/1   Running            0    142m   10.42.5.38    demoelalto-k8s-04   <none>   <none>
demo-aai-aai-graphadmin-5f85c4cd4f-pd9ll         2/2   Running            0    142m   10.42.7.36    demoelalto-k8s-06   <none>   <none>
demo-aai-aai-graphadmin-create-db-schema-m5s6k   0/1   Init:Error         0    142m   10.42.6.64    demoelalto-k8s-02   <none>   <none>
demo-aai-aai-graphadmin-create-db-schema-zdb4l   0/1   Completed          0    131m   10.42.7.46    demoelalto-k8s-06   <none>   <none>
demo-aai-aai-graphgraph-5cd45884fd-t6v9j         1/1   Running            0    142m   10.42.4.56    demoelalto-k8s-08   <none>   <none>
demo-aai-aai-modelloader-f99546598-vmdd4         2/2   Running            0    142m   10.42.10.55   demoelalto-k8s-07   <none>   <none>
demo-aai-aai-resources-ccfcdbff6-pp48k           1/2   CrashLoopBackOff   29   142m   10.42.11.54   demoelalto-k8s-01   <none>   <none>
demo-aai-aai-schema-service-5d7659ff4b-74fgj     2/2   Running            0    142m   10.42.3.44    demoelalto-k8s-09   <none>   <none>
demo-aai-aai-search-data-5b54cc9bf9-55dcc        2/2   Running            0    142m   10.42.9.33    demoelalto-k8s-03   <none>   <none>
demo-aai-aai-sparky-be-86676cf967-2k42m          0/2   Init:0/1           0    142m   10.42.11.53   demoelalto-k8s-01   <none>   <none>
demo-aai-aai-traversal-747bbf597c-s9smq          2/2   Running            0    142m   10.42.8.46    demoelalto-k8s-05   <none>   <none>
demo-pomba-pomba-aaictxbuilder-8865f8768-xm6l8   2/2   Running            0    130m   10.42.3.63    demoelalto-k8s-09   <none>   <none>


=====================================================================================================
Describe pod output for aai-resources
=====================================================================================================

root@demoelalto-nfs:/home/ubuntu# kubectl describe pod 
demo-aai-aai-resources-ccfcdbff6-pp48k
Name:           demo-aai-aai-resources-ccfcdbff6-pp48k
Namespace:      onap
Priority:       0
Node:           demoelalto-k8s-01/30.0.0.9
Start Time:     Tue, 21 Apr 2020 07:05:20 +0000
Labels:         app=aai-resources
                pod-template-hash=ccfcdbff6
                release=demo-aai
Annotations:    checksum/config: 
165150d6457f3c1f5dc0bbb2af5ce0d2e70f9fe8c99dea3bc2b22dacb220ba6e
                cni.projectcalico.org/podIP: 10.42.11.54/32
                msb.onap.org/service-info:
                  [ { "serviceName": "_aai-cloudInfrastructure", "version": 
"v11", "url": "/aai/v11/cloud-infrastructure", "protocol": "REST", "port": 
"8447...
Status:         Running
IP:             10.42.11.54
Controlled By:  ReplicaSet/demo-aai-aai-resources-ccfcdbff6
Init Containers:
  aai-resources-readiness:
    Container ID:  
docker://4887af543f2537e9f89b7493e880a6af82338b784b821c41a73999f2454e7873
    Image:         Elalto-Repo:5000/readiness-check:2.0.2
    Image ID:      
docker-pullable://Elalto-Repo:5000/readiness-check@sha256:f369e309921a383d48c5330413fc9a36180e491c3c4cbf48931b57fd52ab4738
    Port:          <none>
    Host Port:     <none>
    Command:
      /root/job_complete.py
    Args:
      --job-name
      demo-aai-aai-graphadmin-create-db-schema
    State:          Terminated
      Reason:       Completed
      Exit Code:    0
      Started:      Tue, 21 Apr 2020 07:16:16 +0000
      Finished:     Tue, 21 Apr 2020 07:18:12 +0000
    Ready:          True
    Restart Count:  1
    Environment:
      NAMESPACE:  onap (v1:metadata.namespace)
    Mounts:
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-xv57w 
(ro)
Containers:
  aai-resources:
    Container ID:   
docker://a92d55a614cdbec6b96f61c1778172210c5709115dfb987e04cf501acceb43cd
    Image:          Elalto-Repo:5000/onap/aai-resources:1.5.1
    Image ID:       
docker-pullable://Elalto-Repo:5000/onap/aai-resources@sha256:62029e600cc9f85f00f0881ce888c7fb076bccbffe0227b304d817e8a98e7ba1
    Ports:          8447/TCP, 5005/TCP
    Host Ports:     0/TCP, 0/TCP
    State:          Waiting
      Reason:       CrashLoopBackOff
    Last State:     Terminated
      Reason:       Error
      Exit Code:    1
      Started:      Tue, 21 Apr 2020 09:36:42 +0000
      Finished:     Tue, 21 Apr 2020 09:36:49 +0000
    Ready:          False
    Restart Count:  31
    Readiness:      tcp-socket :8447 delay=60s timeout=1s period=10s #success=1 
#failure=3
    Environment:
      LOCAL_USER_ID:   1000
      LOCAL_GROUP_ID:  1000
    Mounts:
      /etc/localtime from localtime (ro)
      /opt/aai/logroot/AAI-RES from demo-aai-aai-resources-logs (rw)
      /opt/app/aai-resources/resources/aaf/bath_config.csv from 
demo-aai-aai-resources-aaf-certs (rw,path="bath_config.csv")
      /opt/app/aai-resources/resources/aaf/org.onap.aai.keyfile from 
demo-aai-aai-resources-aaf-certs (rw,path="org.onap.aai.keyfile")
      /opt/app/aai-resources/resources/aaf/org.onap.aai.p12 from 
demo-aai-aai-resources-aaf-certs (rw,path="org.onap.aai.p12")
      /opt/app/aai-resources/resources/aaf/org.onap.aai.props from 
demo-aai-aai-resources-aaf-properties (rw,path="org.onap.aai.props")
      /opt/app/aai-resources/resources/aaf/org.osaaf.location.props from 
demo-aai-aai-resources-aaf-properties (rw,path="org.osaaf.location.props")
      /opt/app/aai-resources/resources/aaf/permissions.properties from 
demo-aai-aai-resources-aaf-properties (rw,path="permissions.properties")
      /opt/app/aai-resources/resources/aaf/truststoreONAPall.jks from 
aai-common-aai-auth-mount (rw,path="truststoreONAPall.jks")
      /opt/app/aai-resources/resources/application.properties from 
demo-aai-aai-resources-config (rw,path="application.properties")
      /opt/app/aai-resources/resources/cadi.properties from 
demo-aai-aai-resources-aaf-properties (rw,path="cadi.properties")
      /opt/app/aai-resources/resources/etc/appprops/aaiconfig.properties from 
demo-aai-aai-resources-config (rw,path="aaiconfig.properties")
      
/opt/app/aai-resources/resources/etc/appprops/janusgraph-cached.properties from 
demo-aai-aai-resources-config (rw,path="janusgraph-cached.properties")
      
/opt/app/aai-resources/resources/etc/appprops/janusgraph-realtime.properties 
from demo-aai-aai-resources-config (rw,path="janusgraph-realtime.properties")
      /opt/app/aai-resources/resources/etc/auth/aai_keystore from 
demo-aai-aai-resources-auth-truststore-sec (rw,path="aai_keystore")
      /opt/app/aai-resources/resources/etc/auth/realm.properties from 
demo-aai-aai-resources-config (rw,path="realm.properties")
      /opt/app/aai-resources/resources/localhost-access-logback.xml from 
demo-aai-aai-resources-config (rw,path="localhost-access-logback.xml")
      /opt/app/aai-resources/resources/logback.xml from 
demo-aai-aai-resources-config (rw,path="logback.xml")
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-xv57w 
(ro)
  filebeat-onap:
    Container ID:   
docker://e1f4aa83f1ca464660b29125fb579bf4e9d7717a048748ea9d008737c8cd9f1f
    Image:          Elalto-Repo:5000/beats/filebeat:5.5.0
    Image ID:       
docker-pullable://Elalto-Repo:5000/beats/filebeat@sha256:a0b7095e1cb51706383c9bb4a54f86e328935a03003f1ea1acfd224ae9dd69c5
    Port:           <none>
    Host Port:      <none>
    State:          Running
      Started:      Tue, 21 Apr 2020 07:18:28 +0000
    Ready:          True
    Restart Count:  0
    Environment:    <none>
    Mounts:
      /usr/share/filebeat/data from demo-aai-aai-resources-filebeat (rw)
      /usr/share/filebeat/filebeat.yml from filebeat-conf 
(rw,path="filebeat.yml")
      /var/log/onap from demo-aai-aai-resources-logs (rw)
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-xv57w 
(ro)
Conditions:
  Type              Status
  Initialized       True 
  Ready             False 
  ContainersReady   False 
  PodScheduled      True 
Volumes:
  aai-common-aai-auth-mount:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  aai-common-aai-auth
    Optional:    false
  localtime:
    Type:          HostPath (bare host directory volume)
    Path:          /etc/localtime
    HostPathType:  
  filebeat-conf:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      aai-filebeat
    Optional:  false
  demo-aai-aai-resources-logs:
    Type:       EmptyDir (a temporary directory that shares a pod's lifetime)
    Medium:     
    SizeLimit:  <unset>
  demo-aai-aai-resources-filebeat:
    Type:       EmptyDir (a temporary directory that shares a pod's lifetime)
    Medium:     
    SizeLimit:  <unset>
  demo-aai-aai-resources-config:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      demo-aai-aai-resources-configmap
    Optional:  false
  demo-aai-aai-resources-aaf-properties:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      demo-aai-aai-resources-aaf-props
    Optional:  false
  demo-aai-aai-resources-aaf-certs:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  demo-aai-aai-resources-aaf-keys
    Optional:    false
  demo-aai-aai-resources-auth-truststore-sec:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  aai-common-truststore
    Optional:    false
  default-token-xv57w:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  default-token-xv57w
    Optional:    false
QoS Class:       BestEffort
Node-Selectors:  <none>
Tolerations:     node.kubernetes.io/not-ready:NoExecute for 300s
                 node.kubernetes.io/unreachable:NoExecute for 300s
Events:
  Type     Reason   Age                    From                        Message
  ----     ------   ----                   ----                        -------
  Normal   Pulled   23m (x28 over 141m)    kubelet, demoelalto-k8s-01  
Container image "Elalto-Repo:5000/onap/aai-resources:1.5.1" already present on 
machine
  Warning  BackOff  4m2s (x612 over 140m)  kubelet, demoelalto-k8s-01  Back-off 
restarting failed container


=====================================================================================================
aai-resources container logs
=====================================================================================================


root@demoelalto-nfs:/home/ubuntu# kubectl logs 
demo-aai-aai-resources-ccfcdbff6-pp48k -c aai-resources
Creating mailbox file: No such file or directory
chown: /opt/app/aai-resources/resources/logback.xml: Read-only file system
chown: /opt/app/aai-resources/resources/localhost-access-logback.xml: Read-only 
file system
chown: /opt/app/aai-resources/resources/aaf/org.onap.aai.props: Read-only file 
system
chown: /opt/app/aai-resources/resources/aaf/truststoreONAPall.jks: Read-only 
file system
chown: /opt/app/aai-resources/resources/aaf/org.osaaf.location.props: Read-only 
file system
chown: /opt/app/aai-resources/resources/aaf/permissions.properties: Read-only 
file system
chown: /opt/app/aai-resources/resources/aaf/org.onap.aai.keyfile: Read-only 
file system
chown: /opt/app/aai-resources/resources/aaf/bath_config.csv: Read-only file 
system
chown: /opt/app/aai-resources/resources/aaf/org.onap.aai.p12: Read-only file 
system
chown: 
/opt/app/aai-resources/resources/etc/appprops/janusgraph-realtime.properties: 
Read-only file system
chown: /opt/app/aai-resources/resources/etc/appprops/aaiconfig.properties: 
Read-only file system
chown: 
/opt/app/aai-resources/resources/etc/appprops/janusgraph-cached.properties: 
Read-only file system
chown: /opt/app/aai-resources/resources/etc/auth/realm.properties: Read-only 
file system
chown: /opt/app/aai-resources/resources/etc/auth/aai_keystore: Read-only file 
system
chown: /opt/app/aai-resources/resources/application.properties: Read-only file 
system
chown: /opt/app/aai-resources/resources/cadi.properties: Read-only file system
chown: /opt/bulkprocess_load: No such file or directory

Tue Apr 21 09:36:42 UTC 2020    Starting 
/opt/app/aai-resources/scripts/updatePem.sh
Mac verify error: invalid password?
Mac verify error: invalid password?

Tue Apr 21 09:36:43 UTC 2020    Done /opt/app/aai-resources/scripts/updatePem.sh
2020-04-21 09:36:44.075  INFO   --- [           main] 
org.onap.aai.util.AAIConfig              : Initializing AAIConfig
2020-04-21 09:36:44.081 DEBUG   --- [           main] 
org.onap.aai.util.AAIConfig              : Reloading config from 
/opt/app/aai-resources/./resources/etc/appprops/aaiconfig.properties
2020-04-21 09:36:44.082  INFO   --- [           main] 
org.onap.aai.util.AAIConfig              : A&AI Server Node Name = 
aai.config.nodename

  .   ____          _            __ _ _
 /\\ / ___'_ __ _ _(_)_ __  __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
 \\/  ___)| |_)| | | | | || (_| |  ) ) ) )
  '  |____| .__|_| |_|_| |_\__, | / / / /
 =========|_|==============|___/=/_/_/_/
 :: Spring Boot ::       (v1.5.21.RELEASE)

09:36:48,010 |-INFO in LogbackRequestLog - Will use configuration resource 
[/localhost-access-logback.xml]
09:36:48,016 |-INFO in ch.qos.logback.access.joran.action.ConfigurationAction - 
debug attribute not set
09:36:48,017 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - About 
to instantiate appender of type 
[ch.qos.logback.core.rolling.RollingFileAppender]
09:36:48,017 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - Naming 
appender as [ACCESS]
09:36:48,017 |-INFO in c.q.l.core.rolling.TimeBasedRollingPolicy@388357135 - No 
compression will be used
09:36:48,017 |-INFO in c.q.l.core.rolling.TimeBasedRollingPolicy@388357135 - 
Will use the pattern 
/opt/app/aai-resources/logs/ajsc-jetty/localhost_access.log.%d{yyyy-MM-dd} for 
the active file
09:36:48,017 |-INFO in 
c.q.l.core.rolling.DefaultTimeBasedFileNamingAndTriggeringPolicy - The date 
pattern is 'yyyy-MM-dd' from file name pattern 
'/opt/app/aai-resources/logs/ajsc-jetty/localhost_access.log.%d{yyyy-MM-dd}'.
09:36:48,017 |-INFO in 
c.q.l.core.rolling.DefaultTimeBasedFileNamingAndTriggeringPolicy - Roll-over at 
midnight.
09:36:48,018 |-INFO in 
c.q.l.core.rolling.DefaultTimeBasedFileNamingAndTriggeringPolicy - Setting 
initial period to Tue Apr 21 07:18:43 GMT 2020
09:36:48,030 |-INFO in ch.qos.logback.core.rolling.RollingFileAppender[ACCESS] 
- Active log file name: 
/opt/app/aai-resources/logs/ajsc-jetty/localhost_access.log
09:36:48,030 |-INFO in ch.qos.logback.core.rolling.RollingFileAppender[ACCESS] 
- File property is set to 
[/opt/app/aai-resources/logs/ajsc-jetty/localhost_access.log]
09:36:48,031 |-INFO in ch.qos.logback.core.joran.action.AppenderRefAction - 
Attaching appender named [ACCESS] to null
09:36:48,031 |-INFO in ch.qos.logback.access.joran.action.ConfigurationAction - 
End of configuration.
09:36:48,031 |-INFO in ch.qos.logback.access.joran.JoranConfigurator@3911c2a7 - 
Registering current configuration as safe fallback point

2020-04-21T09:36:48.787+0000 INIT [cadi] Loading CADI Properties from 
/opt/app/aai-resources/resources/aaf/org.osaaf.location.props
2020-04-21T09:36:48.787+0000 INIT [cadi] Loading CADI Properties from 
/opt/app/aai-resources/resources/aaf/org.onap.aai.props
2020-04-21T09:36:48.790+0000 INIT [cadi] cadi_keyfile points to 
/opt/app/aai-resources/resources/aaf/org.onap.aai.keyfile
2020-04-21T09:36:48.804+0000 INIT [cadi] hostname is set to aai-resources
2020-04-21T09:36:48.804+0000 INIT [cadi] basic_realm is set to aai-resources
2020-04-21T09:36:48.804+0000 INIT [cadi] aaf_default_realm is set to 
aai-resources
2020-04-21T09:36:48.804+0000 INIT [cadi] aaf_id is set to [email protected]
2020-04-21 09:36:49.032 ERROR 1 --- [           main] org.onap.aai.ResourcesApp 
               : Problems starting ResourcesApp Error connecting to 
SchemaService - Please Investigate
2020-04-21 09:36:49.039 ERROR 1 --- [           main] 
org.onap.aai.logging.ErrorLogHelper      : ERR.5.4.3025 
ex=org.onap.aai.exceptions.AAIException: Error connecting to SchemaService - 
Please Investigate ClassName- org.onap.aai.ResourcesApp :LineNumber- 219 
:MethodName- schemaServiceExceptionTranslator ClassName- 
org.onap.aai.ResourcesApp :LineNumber- 151 :MethodName- main ClassName- 
sun.reflect.NativeMethodAccessorImpl :LineNumber- -2 :MethodName- invoke0 
ClassName- sun.reflect.NativeMethodAccessorImpl :LineNumber- 62 :MethodName- 
invoke ClassName- sun.reflect.DelegatingMethodAccessorImpl :LineNumber- 43 
:MethodName- invoke ClassName- java.lang.reflect.Method :LineNumber- 498 
:MethodName- invoke ClassName- org.springframework.boot.loader.MainMethodRunner 
:LineNumber- 48 :MethodName- run ClassName- 
org.springframework.boot.loader.Launcher :LineNumber- 87 :MethodName- launch 
ClassName- org.springframework.boot.loader.Launcher :LineNumber- 50 
:MethodName- launch ClassName- 
org.springframework.boot.loader.PropertiesLauncher :LineNumber- 595 
:MethodName- main
2020-04-21 09:36:49.040 ERROR 1 --- [           main] 
org.onap.aai.logging.ErrorLogHelper      : ERR.5.4.3025 
ex=org.onap.aai.exceptions.AAIException: Unable to start embedded container; 
nested exception is 
org.springframework.boot.context.embedded.EmbeddedServletContainerException: 
Unable to start embedded Jetty servlet container, resolve and restart Resources 
ClassName- org.onap.aai.logging.ErrorLogHelper :LineNumber- 631 :MethodName- 
logError ClassName- org.onap.aai.ResourcesApp :LineNumber- 156 :MethodName- 
main ClassName- sun.reflect.NativeMethodAccessorImpl :LineNumber- -2 
:MethodName- invoke0 ClassName- sun.reflect.NativeMethodAccessorImpl 
:LineNumber- 62 :MethodName- invoke ClassName- 
sun.reflect.DelegatingMethodAccessorImpl :LineNumber- 43 :MethodName- invoke 
ClassName- java.lang.reflect.Method :LineNumber- 498 :MethodName- invoke 
ClassName- org.springframework.boot.loader.MainMethodRunner :LineNumber- 48 
:MethodName- run ClassName- org.springframework.boot.loader.Launcher 
:LineNumber- 87 :MethodName- launch ClassName- 
org.springframework.boot.loader.Launcher :LineNumber- 50 :MethodName- launch 
ClassName- org.springframework.boot.loader.PropertiesLauncher :LineNumber- 595 
:MethodName- main
Exception in thread "main" java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48)
        at org.springframework.boot.loader.Launcher.launch(Launcher.java:87)
        at org.springframework.boot.loader.Launcher.launch(Launcher.java:50)
        at org.springframework.boot.loader.PropertiesLauncher.main(PropertiesLauncher.java:595)
Caused by: org.onap.aai.exceptions.AAIException: Error connecting to SchemaService - Please Investigate
        at org.onap.aai.ResourcesApp.schemaServiceExceptionTranslator(ResourcesApp.java:219)
        at org.onap.aai.ResourcesApp.main(ResourcesApp.java:151)
        ... 8 more
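When reading a long chained trace like the one above, the actionable line is the deepest "Caused by:", not the outer InvocationTargetException. A minimal, self-contained sketch (not part of any AAI tooling, just an illustration of how to pull the root cause out of a captured log):

```python
def root_cause(trace: str) -> str:
    """Return the innermost 'Caused by:' line of a Java stack trace.

    In a Java trace the last 'Caused by:' is the root failure; if there is
    no cause chain, fall back to the first line of the trace.
    """
    causes = [line.strip() for line in trace.splitlines()
              if line.strip().startswith("Caused by:")]
    return causes[-1] if causes else trace.splitlines()[0].strip()

# Abbreviated copy of the trace above.
trace = """Exception in thread "main" java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
Caused by: org.onap.aai.exceptions.AAIException: Error connecting to SchemaService - Please Investigate
        at org.onap.aai.ResourcesApp.schemaServiceExceptionTranslator(ResourcesApp.java:219)
        ... 8 more"""

print(root_cause(trace))
```

Applied to the trace above, this points straight at the SchemaService connection failure, which is why aai-resources keeps crash-looping.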




==========================================================================================================


root@demoelalto-nfs:/home/ubuntu# kubectl describe pod demo-aai-aai-5cb69c65dc-4c52h
Name:           demo-aai-aai-5cb69c65dc-4c52h
Namespace:      onap
Priority:       0
Node:           demoelalto-k8s-02/30.0.0.23
Start Time:     Tue, 21 Apr 2020 07:05:21 +0000
Labels:         app=aai
                pod-template-hash=5cb69c65dc
                release=demo-aai
Annotations:    checksum/config: 93d1021ee25ea8067d298bddb1d5eb5110476ab400b6188823e725d98e5e61fb
                cni.projectcalico.org/podIP: 10.42.6.65/32
Status:         Pending
IP:             10.42.6.65
Controlled By:  ReplicaSet/demo-aai-aai-5cb69c65dc
Init Containers:
  aai-readiness:
    Container ID:  docker://777fe9f02be9460ab10410343c84b111d76c99e20f3f217c83521f2087f6c9c6
    Image:         Elalto-Repo:5000/readiness-check:2.0.2
    Image ID:      docker-pullable://Elalto-Repo:5000/readiness-check@sha256:f369e309921a383d48c5330413fc9a36180e491c3c4cbf48931b57fd52ab4738
    Port:          <none>
    Host Port:     <none>
    Command:
      /root/ready.py
    Args:
      --container-name
      aai-resources
      --container-name
      aai-traversal
      --container-name
      aai-graphadmin
    State:          Running
      Started:      Tue, 21 Apr 2020 09:37:56 +0000
    Last State:     Terminated
      Reason:       Error
      Exit Code:    1
      Started:      Tue, 21 Apr 2020 09:27:48 +0000
      Finished:     Tue, 21 Apr 2020 09:37:55 +0000
    Ready:          False
    Restart Count:  15
    Environment:
      NAMESPACE:  onap (v1:metadata.namespace)
    Mounts:
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-xv57w (ro)
Containers:
  aai:
    Container ID:   
    Image:          Elalto-Repo:5000/aaionap/haproxy:1.4.0
    Image ID:       
    Port:           8443/TCP
    Host Port:      0/TCP
    State:          Waiting
      Reason:       PodInitializing
    Ready:          False
    Restart Count:  0
    Liveness:       tcp-socket :8443 delay=120s timeout=1s period=10s #success=1 #failure=3
    Readiness:      http-get https://:8443/aai/util/echo delay=10s timeout=1s period=10s #success=1 #failure=3
    Environment:    <none>
    Mounts:
      /dev/log from aai-service-log (rw)
      /etc/localtime from localtime (ro)
      /etc/ssl/private/aai.pem from aai-pem (rw,path="aai.pem")
      /usr/local/etc/haproxy/haproxy.cfg from haproxy-cfg (rw,path="haproxy.cfg")
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-xv57w (ro)
Conditions:
  Type              Status
  Initialized       False 
  Ready             False 
  ContainersReady   False 
  PodScheduled      True 
Volumes:
  localtime:
    Type:          HostPath (bare host directory volume)
    Path:          /etc/localtime
    HostPathType:  
  aai-service-log:
    Type:          HostPath (bare host directory volume)
    Path:          /dev/log
    HostPathType:  
  haproxy-cfg:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      aai-deployment-configmap
    Optional:  false
  aai-pem:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  aai-haproxy-secret
    Optional:    false
  default-token-xv57w:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  default-token-xv57w
    Optional:    false
QoS Class:       BestEffort
Node-Selectors:  <none>
Tolerations:     node.kubernetes.io/not-ready:NoExecute for 300s
                 node.kubernetes.io/unreachable:NoExecute for 300s
Events:
  Type    Reason   Age                   From                        Message
  ----    ------   ----                  ----                        -------
  Normal  Pulled   5m9s (x16 over 157m)  kubelet, demoelalto-k8s-02  Container image "Elalto-Repo:5000/readiness-check:2.0.2" already present on machine
  Normal  Created  5m9s (x16 over 157m)  kubelet, demoelalto-k8s-02  Created container aai-readiness
  Normal  Started  5m9s (x16 over 157m)  kubelet, demoelalto-k8s-02  Started container aai-readiness


==========================================================================================================

root@demoelalto-nfs:/home/ubuntu# kubectl describe pod demo-aai-aai-sparky-be-86676cf967-2k42m
Name:           demo-aai-aai-sparky-be-86676cf967-2k42m
Namespace:      onap
Priority:       0
Node:           demoelalto-k8s-01/30.0.0.9
Start Time:     Tue, 21 Apr 2020 07:05:21 +0000
Labels:         app=aai-sparky-be
                pod-template-hash=86676cf967
                release=demo-aai
Annotations:    cni.projectcalico.org/podIP: 10.42.11.53/32
Status:         Pending
IP:             10.42.11.53
Controlled By:  ReplicaSet/demo-aai-aai-sparky-be-86676cf967
Init Containers:
  aai-sparky-be-readiness:
    Container ID:  docker://8e82c51f57195e23320af33f7b9b0daec88b09336abaa64eaf2b129b92f2857c
    Image:         Elalto-Repo:5000/readiness-check:2.0.2
    Image ID:      docker-pullable://Elalto-Repo:5000/readiness-check@sha256:f369e309921a383d48c5330413fc9a36180e491c3c4cbf48931b57fd52ab4738
    Port:          <none>
    Host Port:     <none>
    Command:
      /root/ready.py
    Args:
      --container-name
      aai-elasticsearch
      --container-name
      aai-search-data
      --container-name
      aai
    State:          Running
      Started:      Tue, 21 Apr 2020 07:05:27 +0000
    Ready:          False
    Restart Count:  0
    Environment:
      NAMESPACE:  onap (v1:metadata.namespace)
    Mounts:
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-xv57w (ro)
Containers:
  aai-sparky-be:
    Container ID:   
    Image:          Elalto-Repo:5000/onap/sparky-be:1.5.1
    Image ID:       
    Port:           8000/TCP
    Host Port:      0/TCP
    State:          Waiting
      Reason:       PodInitializing
    Ready:          False
    Restart Count:  0
    Liveness:       tcp-socket :8000 delay=120s timeout=1s period=10s #success=1 #failure=3
    Readiness:      tcp-socket :8000 delay=10s timeout=1s period=10s #success=1 #failure=3
    Environment:    <none>
    Mounts:
      /etc/localtime from localtime (ro)
      /opt/app/sparky/config/application-oxm-default.properties from demo-aai-aai-sparky-be-properties (rw,path="application-oxm-default.properties")
      /opt/app/sparky/config/application-oxm-override.properties from demo-aai-aai-sparky-be-properties (rw,path="application-oxm-override.properties")
      /opt/app/sparky/config/application-oxm-schema-prod.properties from demo-aai-aai-sparky-be-properties (rw,path="application-oxm-schema-prod.properties")
      /opt/app/sparky/config/application-resources.properties from demo-aai-aai-sparky-be-properties (rw,path="application-resources.properties")
      /opt/app/sparky/config/application-ssl.properties from demo-aai-aai-sparky-be-properties (rw,path="application-ssl.properties")
      /opt/app/sparky/config/application.properties from demo-aai-aai-sparky-be-properties (rw,path="application.properties")
      /opt/app/sparky/config/auth/client-cert-onap.p12 from demo-aai-aai-sparky-be-auth-config (rw,path="client-cert-onap.p12")
      /opt/app/sparky/config/auth/csp-cookie-filter.properties from demo-aai-aai-sparky-be-auth-config (rw,path="csp-cookie-filter.properties")
      /opt/app/sparky/config/auth/org.onap.aai.p12 from demo-aai-aai-sparky-be-auth-config (rw,path="org.onap.aai.p12")
      /opt/app/sparky/config/auth/truststoreONAPall.jks from aai-common-aai-auth-mount (rw,path="truststoreONAPall.jks")
      /opt/app/sparky/config/portal/ from demo-aai-aai-sparky-be-portal-config (rw)
      /opt/app/sparky/config/portal/BOOT-INF/classes/ from demo-aai-aai-sparky-be-portal-config-props (rw)
      /opt/app/sparky/config/roles.config from demo-aai-aai-sparky-be-properties (rw,path="roles.config")
      /opt/app/sparky/config/users.config from demo-aai-aai-sparky-be-properties (rw,path="users.config")
      /var/log/onap from demo-aai-aai-sparky-be-logs (rw)
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-xv57w (ro)
  filebeat-onap:
    Container ID:   
    Image:          Elalto-Repo:5000/beats/filebeat:5.5.0
    Image ID:       
    Port:           <none>
    Host Port:      <none>
    State:          Waiting
      Reason:       PodInitializing
    Ready:          False
    Restart Count:  0
    Environment:    <none>
    Mounts:
      /usr/share/filebeat/data from aai-sparky-filebeat (rw)
      /usr/share/filebeat/filebeat.yml from filebeat-conf (rw,path="filebeat.yml")
      /var/log/onap from demo-aai-aai-sparky-be-logs (rw)
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-xv57w (ro)
Conditions:
  Type              Status
  Initialized       False 
  Ready             False 
  ContainersReady   False 
  PodScheduled      True 
Volumes:
  localtime:
    Type:          HostPath (bare host directory volume)
    Path:          /etc/localtime
    HostPathType:  
  demo-aai-aai-sparky-be-properties:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      demo-aai-aai-sparky-be-prop
    Optional:  false
  demo-aai-aai-sparky-be-config:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      demo-aai-aai-sparky-be
    Optional:  false
  demo-aai-aai-sparky-be-portal-config:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      demo-aai-aai-sparky-be-portal
    Optional:  false
  demo-aai-aai-sparky-be-portal-config-props:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      demo-aai-aai-sparky-be-portal-props
    Optional:  false
  demo-aai-aai-sparky-be-auth-config:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  demo-aai-aai-sparky-be
    Optional:    false
  aai-common-aai-auth-mount:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  aai-common-aai-auth
    Optional:    false
  filebeat-conf:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      aai-filebeat
    Optional:  false
  demo-aai-aai-sparky-be-logs:
    Type:       EmptyDir (a temporary directory that shares a pod's lifetime)
    Medium:     
    SizeLimit:  <unset>
  aai-sparky-filebeat:
    Type:       EmptyDir (a temporary directory that shares a pod's lifetime)
    Medium:     
    SizeLimit:  <unset>
  modeldir:
    Type:       EmptyDir (a temporary directory that shares a pod's lifetime)
    Medium:     
    SizeLimit:  <unset>
  default-token-xv57w:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  default-token-xv57w
    Optional:    false
QoS Class:       BestEffort
Node-Selectors:  <none>
Tolerations:     node.kubernetes.io/not-ready:NoExecute for 300s
                 node.kubernetes.io/unreachable:NoExecute for 300s
Events:          <none>
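Note that the sparky-be pod is only a downstream casualty: its ready.py init container waits on aai, whose own init container waits on aai-resources, which is the pod that actually crash-loops on the SchemaService error. A small sketch of that wait chain (the dependency edges come from the --container-name arguments shown above; the ready/unready states are assumed from the logs in this thread, not queried from the cluster):

```python
# Wait graph taken from the two ready.py invocations above.
deps = {
    "aai": ["aai-resources", "aai-traversal", "aai-graphadmin"],
    "aai-sparky-be": ["aai-elasticsearch", "aai-search-data", "aai"],
}

# Assumed states: aai-resources is in CrashLoopBackOff, graphadmin's schema
# job failed, so traversal/graphadmin never become ready either.
ready = {
    "aai-resources": False,
    "aai-traversal": False,
    "aai-graphadmin": False,
    "aai-elasticsearch": True,
    "aai-search-data": True,
}

def blocking(component: str) -> list:
    """Return the leaf dependencies that keep `component` from starting."""
    out = []
    for dep in deps.get(component, []):
        if dep in deps:                    # non-leaf: recurse into its deps
            out += blocking(dep)
        elif not ready.get(dep, False):    # leaf that is not ready
            out.append(dep)
    return sorted(set(out))

print(blocking("aai-sparky-be"))
# → ['aai-graphadmin', 'aai-resources', 'aai-traversal']
```

So fixing the aai-resources/SchemaService connection should let the rest of the chain (aai, then sparky-be) initialize on its own.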

-----------------------------------------------------------------------------------

root@demoelalto-nfs:/home/ubuntu# kubectl logs demo-aai-aai-sparky-be-86676cf967-2k42m -c aai-sparky-be
Error from server (BadRequest): container "aai-sparky-be" in pod "demo-aai-aai-sparky-be-86676cf967-2k42m" is waiting to start: PodInitializing
