Ohh ok.

If dev confirms it is mandatory then, as I understand it, I will need to:
1. Deploy and configure ES
2. Update application.properties to include the ES details and create the ES index
3. Rebuild the Maven package and restart the Griffin service

There is no need to reload the data into Hadoop (Hive), correct?
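
For step 2, as far as I can tell from the deploy guide, the index creation would look roughly like the below (the host is a placeholder for our environment; please correct me if the index name or mapping has changed):

curl -k -H "Content-Type: application/json" -X PUT http://<es-host>:9200/griffin \
 -d '{
    "aliases": {},
    "mappings": {
        "accuracy": {
            "properties": {
                "name": { "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }, "type": "text" },
                "tmst": { "type": "date" }
            }
        }
    },
    "settings": { "index": { "number_of_replicas": "2", "number_of_shards": "5" } }
}'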

On a side note, is there any other Griffin documentation available or
underway that would cover the following details for integrating it with
Cloudera Hadoop?
1. What are the exact port requirements (internal and external)?
2. Which packages will be required?
3. Are there any Java dependencies?
4. If the Cloudera Hadoop cluster is kerberized (secured), what dependencies
or additional configuration are needed?

I know some of the above information can be found in the deployment guide
on GitHub; however, I am checking whether any other formal documentation is
available for the same.

Thanks and Regards,
Sunil Muniyal


On Mon, Sep 7, 2020 at 4:05 PM William Guo <[email protected]> wrote:

> cc dev for double checking.
>
> The measure emits metrics and stores them in Elasticsearch, and the UI
> fetches those metrics from Elasticsearch.
> So Elasticsearch should be mandatory.
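>
> As a rough illustration (the exact keys can differ by version), the measure
> job sends metrics to ES through a sink entry in its environment config,
> something like:
>
> "sinks": [
>   {
>     "type": "ELASTICSEARCH",
>     "config": {
>       "method": "post",
>       "api": "http://<es-host>:9200/griffin/accuracy"
>     }
>   }
> ]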
>
> Thanks,
> William
>
> On Mon, Sep 7, 2020 at 6:32 PM Sunil Muniyal <[email protected]>
> wrote:
>
>> Thank you for the quick response, William.
>>
>> I have not configured ElasticSearch since it is not deployed.
>>
>> In application.properties, I just added dummy information (as below) to
>> pass the validation check and get Griffin up and running.
>>
>> # elasticsearch
>> # elasticsearch.host = <IP>
>> # elasticsearch.port = <elasticsearch rest port>
>> # elasticsearch.user = user
>> # elasticsearch.password = password
>> elasticsearch.host=localhost
>> elasticsearch.port=9200
>> elasticsearch.scheme=http
>>
>> Is ElasticSearch a mandatory requirement to use Griffin?
>>
>> Thanks and Regards,
>> Sunil Muniyal
>>
>>
>> On Mon, Sep 7, 2020 at 3:58 PM William Guo <[email protected]> wrote:
>>
>>> Could you check whether those metrics have actually been written into ES or not?
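>>>
>>> For example, something like the below (assuming the default index name
>>> 'griffin') should return the metric documents if the measure job wrote any:
>>>
>>> curl "http://<es-host>:9200/griffin/_search?pretty&size=5"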
>>>
>>>
>>> On Mon, Sep 7, 2020 at 6:23 PM Sunil Muniyal <[email protected]>
>>> wrote:
>>>
>>>> Hello William,
>>>>
>>>> I was able to bypass this error by entering the default field values
>>>> for LDAP, ElasticSearch and Livy in application.properties and
>>>> successfully got Griffin running.
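>>>>
>>>> For Livy, for example, the value was just a dummy endpoint along the lines of:
>>>>
>>>> livy.uri=http://localhost:8998/batches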
>>>>
>>>> By following the below article, I have created a test measure and then
>>>> a job which triggers that measure.
>>>>
>>>> https://github.com/apache/griffin/blob/master/griffin-doc/ui/user-guide.md
>>>>
>>>> I have allowed the job to be triggered multiple times; however, I still
>>>> cannot see any metrics related to the job. Nor do I see anything in the
>>>> *health* or *mydashboard* tabs. Also, as you can see in the screenshot
>>>> below, on the *DQ Metrics* tab I still do not see the created measure
>>>> in the drop-down list.
>>>>
>>>> [image: image.png]
>>>>
>>>> *Test job executed multiple times:*
>>>> [image: image.png]
>>>>
>>>> Please advise if anything is misconfigured.
>>>>
>>>> Thanks and Regards,
>>>> Sunil Muniyal
>>>>
>>>>
>>>> On Mon, Sep 7, 2020 at 12:40 PM Sunil Muniyal <
>>>> [email protected]> wrote:
>>>>
>>>>> Hello William,
>>>>>
>>>>> Thank you for the reply.
>>>>>
>>>>> This helped; I had actually missed adding the property to
>>>>> application.properties.
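>>>>>
>>>>> For reference, the line I added was along these lines (value taken from the
>>>>> sample service configuration; please correct me if the default hook name differs):
>>>>>
>>>>> internal.event.listeners=GriffinJobEventHook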
>>>>>
>>>>> Now the other challenge is that, along with ES and Livy, I am also not
>>>>> using LDAP, and startup is hitting the error *unable to resolve ldap.url
>>>>> property*. Of course it does, since the property is not configured.
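>>>>>
>>>>> I assume dummy placeholder entries along these lines (values based on the
>>>>> sample configuration, since we do not actually use LDAP) would let the
>>>>> validation pass:
>>>>>
>>>>> ldap.url=ldap://localhost:389
>>>>> ldap.email=@example.com
>>>>> ldap.searchBase=DC=org,DC=example
>>>>> ldap.searchPattern=(sAMAccountName={0})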
>>>>>
>>>>> Please suggest.
>>>>>
>>>>> Thanks and Regards,
>>>>> Sunil Muniyal
>>>>>
>>>>>
>>>>> On Sun, Sep 6, 2020 at 7:26 PM William Guo <[email protected]> wrote:
>>>>>
>>>>>> Hi Sunil Muniyal,
>>>>>>
>>>>>> Could you check this property in your griffin properties file?
>>>>>>
>>>>>> internal.event.listeners
>>>>>>
>>>>>>
>>>>>>
>>>>>> Thanks,
>>>>>>
>>>>>> William
>>>>>>
>>>>>>
>>>>>> On Thu, Sep 3, 2020 at 11:05 PM Sunil Muniyal <
>>>>>> [email protected]> wrote:
>>>>>>
>>>>>>> Hello,
>>>>>>>
>>>>>>> I am attempting to integrate Griffin with Cloudera Hadoop by
>>>>>>> following the article below:
>>>>>>>
>>>>>>> https://github.com/apache/griffin/blob/master/griffin-doc/deploy/deploy-guide.md
>>>>>>>
>>>>>>>
>>>>>>> I have followed everything as instructed, apart from the following:
>>>>>>> 1. Using Cloudera Hadoop 5.15 and the relevant configurations instead of
>>>>>>> Apache Hadoop
>>>>>>> 2. Not using Elasticsearch, as it is not applicable
>>>>>>> 3. Not using Livy, as it is not applicable.
>>>>>>>
>>>>>>> The Maven build is successful and produces two jars, under service/target
>>>>>>> and measure/target, which I have uploaded to HDFS.
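>>>>>>>
>>>>>>> For completeness, the build and upload steps were roughly as below (the jar
>>>>>>> version and HDFS path are specific to my environment):
>>>>>>>
>>>>>>> mvn clean install -DskipTests
>>>>>>> hdfs dfs -mkdir -p /griffin
>>>>>>> hdfs dfs -put measure/target/measure-<version>.jar /griffin/griffin-measure.jar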
>>>>>>>
>>>>>>> However, *starting griffin-service.jar using the nohup command* is
>>>>>>> failing with the error below:
>>>>>>> Caused by: java.lang.IllegalArgumentException: Could not resolve
>>>>>>> placeholder 'internal.event.listeners' in string value
>>>>>>> "#{'${internal.event.listeners}'.split(',')}"
>>>>>>>         at org.springframework.util.PropertyPlaceholderHelper.parseStringValue(PropertyPlaceholderHelper.java:174) ~[spring-core-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]
>>>>>>>         at org.springframework.util.PropertyPlaceholderHelper.replacePlaceholders(PropertyPlaceholderHelper.java:126) ~[spring-core-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]
>>>>>>>         at org.springframework.core.env.AbstractPropertyResolver.doResolvePlaceholders(AbstractPropertyResolver.java:236) ~[spring-core-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]
>>>>>>>
>>>>>>> I have searched through a lot of articles, with no luck.
>>>>>>>
>>>>>>> It would be great if someone could help me fix this.
>>>>>>>
>>>>>>> Also attached is the output of the nohup command, written to
>>>>>>> service.out.
>>>>>>>
>>>>>>> Thanks and Regards,
>>>>>>> Sunil Muniyal
>>>>>>>
>>>>>>
>>>>>>
