I will need to redeploy Elasticsearch, correct?

Thanks and Regards,
Sunil Muniyal


On Tue, Sep 8, 2020 at 4:05 PM William Guo <[email protected]> wrote:

> Could you try with this version?
> <elasticsearch.version>6.4.1</elasticsearch.version>
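>
> For context, a minimal sketch of where that property would typically sit,
> assuming the parent pom.xml drives the Elasticsearch client version through
> a <properties> entry (the surrounding XML here is illustrative):
>
>     <properties>
>         <!-- illustrative: version of the ES client the service module builds against -->
>         <elasticsearch.version>6.4.1</elasticsearch.version>
>     </properties>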
>
>
> Thanks,
> William
>
> On Tue, Sep 8, 2020 at 5:59 PM Sunil Muniyal <[email protected]>
> wrote:
>
>> Hi William / Dev group,
>>
>> I have deployed ES 7.9 (the latest version, single node) and configured it.
>> I also get the default response page when hitting
>> http://<ES HOST IP>:9200/
>>
>> When I try to create the griffin index using the given JSON string:
>>
>> curl -k -H "Content-Type: application/json" -X PUT http://<replaced with my 
>> ES host IP>:9200/griffin \
>>  -d '{
>>     "aliases": {},
>>     "mappings": {
>>         "accuracy": {
>>             "properties": {
>>                 "name": {
>>                     "fields": {
>>                         "keyword": {
>>                             "ignore_above": 256,
>>                             "type": "keyword"
>>                         }
>>                     },
>>                     "type": "text"
>>                 },
>>                 "tmst": {
>>                     "type": "date"
>>                 }
>>             }
>>         }
>>     },
>>     "settings": {
>>         "index": {
>>             "number_of_replicas": "2",
>>             "number_of_shards": "5"
>>         }
>>     }
>> }'
>>
>>
>> *I get the error below:*
>>
>> *{"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"Root
>> mapping definition has unsupported parameters:  [accuracy :
>> {properties={name={fields={keyword={ignore_above=256, type=keyword}},
>> type=text},
>> tmst={type=date}}}]"}],"type":"mapper_parsing_exception","reason":"Failed
>> to parse mapping [_doc]: Root mapping definition has unsupported
>> parameters:  [accuracy :
>> {properties={name={fields={keyword={ignore_above=256, type=keyword}},
>> type=text},
>> tmst={type=date}}}]","caused_by":{"type":"mapper_parsing_exception","reason":"Root
>> mapping definition has unsupported parameters:  [accuracy :
>> {properties={name={fields={keyword={ignore_above=256, type=keyword}},
>> type=text}, tmst={type=date}}}]"}},"status":400}*
>>
>> It seems like the JSON string is missing some values or is incorrectly
>> formatted.
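>>
>> For what it's worth, the error suggests a version mismatch rather than a
>> missing value: Elasticsearch 7.x no longer accepts custom mapping types, so
>> the "accuracy" level under "mappings" is rejected. A hedged sketch of the
>> same request adjusted for ES 7.x (type level removed, host placeholder as
>> above):
>>
>> curl -k -H "Content-Type: application/json" -X PUT http://<ES HOST IP>:9200/griffin \
>>  -d '{
>>     "aliases": {},
>>     "mappings": {
>>         "properties": {
>>             "name": {
>>                 "fields": {
>>                     "keyword": { "ignore_above": 256, "type": "keyword" }
>>                 },
>>                 "type": "text"
>>             },
>>             "tmst": { "type": "date" }
>>         }
>>     },
>>     "settings": {
>>         "index": {
>>             "number_of_replicas": "2",
>>             "number_of_shards": "5"
>>         }
>>     }
>> }'
>>
>> On a single node, "number_of_replicas": "2" only leaves the index yellow; it
>> does not block creation. The alternative is to keep the original JSON and run
>> an ES 6.x node, matching the 6.4.1 client version suggested in the reply
>> above.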
>>
>> It would be great if you could please help.
>>
>> Thanks and Regards,
>> Sunil Muniyal
>>
>>
>> On Mon, Sep 7, 2020 at 8:16 PM Sunil Muniyal <[email protected]>
>> wrote:
>>
>>> Thank you for the response, William.
>>>
>>> I have started preparing for the ES deployment and should attempt it
>>> tomorrow.
>>>
>>> In the meantime, I will also wait for the Dev team in case they have any
>>> additional inputs.
>>>
>>> Thanks and Regards,
>>> Sunil Muniyal
>>>
>>>
>>> On Mon, Sep 7, 2020 at 8:06 PM William Guo <[email protected]> wrote:
>>>
>>>> If dev confirms it to be mandatory then, if I understand correctly, I will
>>>> need to:
>>>> 1. Deploy and Configure ES
>>>> 2. Update application.properties to include ES details and create ES
>>>> index
>>>> 3. Rebuild Maven package and rerun the Griffin service
>>>>
>>>> *Right, you need to package the ES environment configuration into your jar.*
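>>>>
>>>> As a minimal sketch, the relevant section of the service module's
>>>> application.properties would look something like this (the host is a
>>>> placeholder; user and password are only needed if your ES requires auth):
>>>>
>>>> # Elasticsearch endpoint where Griffin stores and reads metrics
>>>> elasticsearch.host=<ES HOST IP>
>>>> elasticsearch.port=9200
>>>> elasticsearch.scheme=http
>>>> # elasticsearch.user = user
>>>> # elasticsearch.password = password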
>>>>
>>>> There is no need to reload the data into Hadoop (Hive), correct?
>>>>
>>>> *No*
>>>>
>>>> On a side note, is there any other documentation of Griffin available or
>>>> underway which would help to get the details below while integrating it
>>>> with Cloudera Hadoop?
>>>> 1. What are the exact port requirements (internal and external)?
>>>> *Check the logs and make sure all the external endpoints configured in the
>>>> properties files are accessible.*
>>>> 2. Which packages will be required?
>>>> *None.*
>>>> 3. Any Java dependencies?
>>>> *Java 1.8.*
>>>> 4. If we have the Cloudera Hadoop cluster kerberized (secured), what are
>>>> the dependencies or additional configurations needed?
>>>> *There should be no extra dependencies, apart from the transitive
>>>> dependencies pulled in by Spark and Hadoop.*
>>>>
>>>> On Mon, Sep 7, 2020 at 6:42 PM Sunil Muniyal <
>>>> [email protected]> wrote:
>>>>
>>>>> Ohh ok.
>>>>>
>>>>> If dev confirms it to be mandatory then, if I understand correctly, I will
>>>>> need to:
>>>>> 1. Deploy and Configure ES
>>>>> 2. Update application.properties to include ES details and create ES
>>>>> index
>>>>> 3. Rebuild Maven package and rerun the Griffin service
>>>>>
>>>>> There is no need to reload the data into Hadoop (Hive), correct?
>>>>>
>>>>> On a side note, is there any other documentation of Griffin available or
>>>>> underway which would help to get the details below while integrating it
>>>>> with Cloudera Hadoop?
>>>>> 1. What are the exact port requirements (internal and external)?
>>>>> 2. Which packages will be required?
>>>>> 3. Any Java dependencies?
>>>>> 4. If we have the Cloudera Hadoop cluster kerberized (secured), what are
>>>>> the dependencies or additional configurations needed?
>>>>>
>>>>> I know some of the above information can be found in the deployment guide
>>>>> on GitHub. However, I am checking whether any other formal documentation
>>>>> has been made available.
>>>>>
>>>>> Thanks and Regards,
>>>>> Sunil Muniyal
>>>>>
>>>>>
>>>>> On Mon, Sep 7, 2020 at 4:05 PM William Guo <[email protected]> wrote:
>>>>>
>>>>>> cc dev for double checking.
>>>>>>
>>>>>> The measure module emits metrics and stores them in Elasticsearch, and the
>>>>>> UI fetches those metrics from Elasticsearch.
>>>>>> So Elasticsearch should be considered mandatory.
>>>>>>
>>>>>> Thanks,
>>>>>> William
>>>>>>
>>>>>> On Mon, Sep 7, 2020 at 6:32 PM Sunil Muniyal <
>>>>>> [email protected]> wrote:
>>>>>>
>>>>>>> Thank you for the quick response, William.
>>>>>>>
>>>>>>> I have not configured Elasticsearch since it is not deployed.
>>>>>>>
>>>>>>> In application.properties, I just added dummy values (as below) to pass
>>>>>>> the validation check and get Griffin up and running.
>>>>>>>
>>>>>>> # elasticsearch
>>>>>>> # elasticsearch.host = <IP>
>>>>>>> # elasticsearch.port = <elasticsearch rest port>
>>>>>>> # elasticsearch.user = user
>>>>>>> # elasticsearch.password = password
>>>>>>> elasticsearch.host=localhost
>>>>>>> elasticsearch.port=9200
>>>>>>> elasticsearch.scheme=http
>>>>>>>
>>>>>>> Is Elasticsearch a mandatory requirement for using Griffin?
>>>>>>>
>>>>>>> Thanks and Regards,
>>>>>>> Sunil Muniyal
>>>>>>>
>>>>>>>
>>>>>>> On Mon, Sep 7, 2020 at 3:58 PM William Guo <[email protected]> wrote:
>>>>>>>
>>>>>>>> Could you check whether those metrics have actually been written into
>>>>>>>> ES or not?
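>>>>>>>>
>>>>>>>> As a hedged sketch of how to check (index name griffin as in the deploy
>>>>>>>> guide; the host is a placeholder):
>>>>>>>>
>>>>>>>> curl -s "http://<ES HOST IP>:9200/griffin/_search?pretty&size=5"
>>>>>>>>
>>>>>>>> If the hits array comes back empty, the measure jobs are not writing any
>>>>>>>> metrics into ES.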
>>>>>>>>
>>>>>>>>
>>>>>>>> On Mon, Sep 7, 2020 at 6:23 PM Sunil Muniyal <
>>>>>>>> [email protected]> wrote:
>>>>>>>>
>>>>>>>>> Hello William,
>>>>>>>>>
>>>>>>>>> I was able to bypass this error by entering default field values for
>>>>>>>>> LDAP, Elasticsearch and Livy in application.properties, and Griffin is
>>>>>>>>> now running successfully.
>>>>>>>>>
>>>>>>>>> By following the article below, I have created a test measure and then
>>>>>>>>> a job which triggers that measure.
>>>>>>>>>
>>>>>>>>> https://github.com/apache/griffin/blob/master/griffin-doc/ui/user-guide.md
>>>>>>>>>
>>>>>>>>> I have allowed the job to be triggered multiple times; however, I still
>>>>>>>>> can't see any metrics related to the job. I don't see anything in the
>>>>>>>>> *health* or *mydashboard* tabs either. Also, as you can see in the
>>>>>>>>> screenshot below, even on the *DQ Metrics* tab I still do not see the
>>>>>>>>> created measure in the drop-down list.
>>>>>>>>>
>>>>>>>>> [image: image.png]
>>>>>>>>>
>>>>>>>>> *Test job executed multiple times:*
>>>>>>>>> [image: image.png]
>>>>>>>>>
>>>>>>>>> Please advise if anything is mis-configured.
>>>>>>>>>
>>>>>>>>> Thanks and Regards,
>>>>>>>>> Sunil Muniyal
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Mon, Sep 7, 2020 at 12:40 PM Sunil Muniyal <
>>>>>>>>> [email protected]> wrote:
>>>>>>>>>
>>>>>>>>>> Hello William,
>>>>>>>>>>
>>>>>>>>>> Thank you for the reply.
>>>>>>>>>>
>>>>>>>>>> This helped; I had actually missed adding the property in
>>>>>>>>>> application.properties.
>>>>>>>>>>
>>>>>>>>>> Now the other challenge is that, along with ES and Livy, I am also not
>>>>>>>>>> using LDAP, and it is hitting the error *unable to resolve ldap.url
>>>>>>>>>> property*. Of course it will, since the property is not configured.
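>>>>>>>>>>
>>>>>>>>>> A hedged sketch of a workaround, assuming you only need the placeholders
>>>>>>>>>> to resolve because LDAP auth is not actually used: ldap.url is the key
>>>>>>>>>> from the error, the other keys are the LDAP entries from the
>>>>>>>>>> application.properties template, and every value below is a dummy.
>>>>>>>>>>
>>>>>>>>>> # dummy values only; LDAP login is not actually used
>>>>>>>>>> ldap.url=ldap://localhost:389
>>>>>>>>>> ldap.email=@example.com
>>>>>>>>>> ldap.searchBase=DC=example,DC=com
>>>>>>>>>> ldap.searchPattern=(sAMAccountName={0})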
>>>>>>>>>>
>>>>>>>>>> Please suggest.
>>>>>>>>>>
>>>>>>>>>> Thanks and Regards,
>>>>>>>>>> Sunil Muniyal
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Sun, Sep 6, 2020 at 7:26 PM William Guo <[email protected]>
>>>>>>>>>> wrote:
>>>>>>>>>>
>>>>>>>>>>> hi Sunil Muniyal,
>>>>>>>>>>>
>>>>>>>>>>> Could you check this property in your griffin properties file?
>>>>>>>>>>>
>>>>>>>>>>> internal.event.listeners
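>>>>>>>>>>>
>>>>>>>>>>> As a hedged sketch, the line that placeholder refers to would look
>>>>>>>>>>> something like this in application.properties (the value is the one from
>>>>>>>>>>> the properties template in the repo, so treat it as illustrative):
>>>>>>>>>>>
>>>>>>>>>>> # hook class(es) notified on job events; a defined value resolves the placeholder
>>>>>>>>>>> internal.event.listeners=GriffinJobEventHook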
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> Thanks,
>>>>>>>>>>>
>>>>>>>>>>> William
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Thu, Sep 3, 2020 at 11:05 PM Sunil Muniyal <
>>>>>>>>>>> [email protected]> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Hello,
>>>>>>>>>>>>
>>>>>>>>>>>> I am attempting to integrate Griffin with Cloudera Hadoop by following
>>>>>>>>>>>> the article below:
>>>>>>>>>>>>
>>>>>>>>>>>> https://github.com/apache/griffin/blob/master/griffin-doc/deploy/deploy-guide.md
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> I have followed everything as instructed, apart from the points below:
>>>>>>>>>>>> 1. Using Cloudera Hadoop 5.15 and the relevant configurations instead of
>>>>>>>>>>>> Apache Hadoop
>>>>>>>>>>>> 2. Not using Elasticsearch as it is not applicable
>>>>>>>>>>>> 3. Not using Livy as it is not applicable.
>>>>>>>>>>>>
>>>>>>>>>>>> The Maven build is successful and produced two jars, under service/target
>>>>>>>>>>>> and measure/target, which I have uploaded to HDFS.
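>>>>>>>>>>>>
>>>>>>>>>>>> For reference, a hedged sketch of the build, upload and start steps
>>>>>>>>>>>> described here (jar names, HDFS path and the -DskipTests option are
>>>>>>>>>>>> illustrative):
>>>>>>>>>>>>
>>>>>>>>>>>> # build all modules from the source root
>>>>>>>>>>>> mvn clean install -DskipTests
>>>>>>>>>>>> # upload the measure jar to HDFS
>>>>>>>>>>>> hdfs dfs -put measure/target/measure-<version>.jar /griffin/
>>>>>>>>>>>> # start the service jar and capture its output in service.out
>>>>>>>>>>>> nohup java -jar <path to griffin service jar> > service.out 2>&1 &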
>>>>>>>>>>>>
>>>>>>>>>>>> However, *starting griffin-service.jar using the nohup command* is failing
>>>>>>>>>>>> with the error below:
>>>>>>>>>>>> *Caused by: java.lang.IllegalArgumentException: Could not
>>>>>>>>>>>> resolve placeholder 'internal.event.listeners' in string value
>>>>>>>>>>>> "#{'${internal.event.listeners}'.split(',')}"*
>>>>>>>>>>>> *        at
>>>>>>>>>>>> org.springframework.util.PropertyPlaceholderHelper.parseStringValue(PropertyPlaceholderHelper.java:174)
>>>>>>>>>>>> ~[spring-core-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]*
>>>>>>>>>>>> *        at
>>>>>>>>>>>> org.springframework.util.PropertyPlaceholderHelper.replacePlaceholders(PropertyPlaceholderHelper.java:126)
>>>>>>>>>>>> ~[spring-core-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]*
>>>>>>>>>>>> *        at
>>>>>>>>>>>> org.springframework.core.env.AbstractPropertyResolver.doResolvePlaceholders(AbstractPropertyResolver.java:236)
>>>>>>>>>>>> ~[spring-core-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]*
>>>>>>>>>>>>
>>>>>>>>>>>> I have searched through a lot of articles with no luck.
>>>>>>>>>>>>
>>>>>>>>>>>> It would be great if someone could help me fix this.
>>>>>>>>>>>>
>>>>>>>>>>>> Also attached is the output of the nohup command, as written to
>>>>>>>>>>>> service.out.
>>>>>>>>>>>>
>>>>>>>>>>>> Thanks and Regards,
>>>>>>>>>>>> Sunil Muniyal
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> ---------------------------------------------------------------------
>>>>>>>>>>>> To unsubscribe, e-mail: [email protected]
>>>>>>>>>>>> For additional commands, e-mail: [email protected]
>>>>>>>>>>>
>>>>>>>>>>>
