Hi Lionel,
I’m not really sure about the answers to the questions you asked, but I’ll
list the steps I followed; I’m pretty sure you’ll be able to figure out the answers from them.
* Downloaded the Livy 0.4.0 zip and unzipped it.
* Renamed conf/livy.conf.template to conf/livy.conf.
* Executed bin/livy-server.
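In shell terms, those steps were roughly as follows (the archive name matches
the Livy 0.4.0-incubating binary release; your download location may differ):

unzip livy-0.4.0-incubating-bin.zip
cd livy-0.4.0-incubating-bin
mv conf/livy.conf.template conf/livy.conf
bin/livy-server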
Attached are the latest Griffin and Livy logs, along with my Livy conf file.
Regards
Ashwini
From: [email protected] <[email protected]> On Behalf Of Lionel Liu
Sent: 15 April 2018 14:48
To: Ashwini Kumar Gupta <[email protected]>
Cc: [email protected]
Subject: Re:RE: Re:RE: Can't get output after creating measure
Hi Ashwini,
I’ve read the griffin_log.txt you attached. According to this:
org.apache.griffin.core.util.FSUtil : Setting
fs.defaultFS:hdfs://hdfs-default-name
did you set fs.defaultFS to the correct HDFS name? It should be something like
“hdfs://quickstart.cloudera:8020”, the same value set in core-site.xml in the
Hadoop configuration directory.
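For example, core-site.xml normally contains a property like this (the host
and port below are the Cloudera quickstart defaults, so substitute your own):

<property>
  <name>fs.defaultFS</name>
  <value>hdfs://quickstart.cloudera:8020</value>
</property>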
And according to this:
"path" : "dt=20180415 AND hour=06/_DONE"
did you enter the done file path as a “where” clause? It should be like
“dt=20180415/hour=06/_DONE”, i.e. a path relative to the “root.path”.
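As a rough sketch, a done-file predicate in the connector config would then
look like this (the "file.exist" type and the root.path value here are
assumptions for illustration, not taken from your measure):

"predicates" : [ {
  "type" : "file.exist",
  "config" : {
    "root.path" : "hdfs:///griffin/demo_src",
    "path" : "dt=20180415/hour=06/_DONE"
  }
} ]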
Actually, you can also IGNORE the done-file configuration if you don’t have
any done file to check before calculation; Griffin will then submit the job
directly every time.
In your later email, I think you succeeded in submitting jobs to Livy, but
they still fail.
Would you please send me the livy.log?
Where did you deploy Livy? Can it access your HDFS via “hdfs://<path>”
directly, or does it need a full URI like
“hdfs://quickstart.cloudera:8020/<path>”? If the latter, you need to give the
full path in the configuration so Livy can access the HDFS path.
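For example, in sparkJob.properties that would mean writing out the authority
in every path, something like this (a sketch assuming the quickstart host and
that sparkJob.file is the key holding your measure jar):

sparkJob.file = hdfs://quickstart.cloudera:8020/griffin/griffin-measure.jar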
BTW, the datanucleus jars come from the Hive library, not from Livy, so you’ve
done the right thing.
Thanks
Lionel, Liu
At 2018-04-15 13:47:08, "Ashwini Kumar Gupta"
<[email protected]<mailto:[email protected]>> wrote:
Hi Lionel,
I think people couldn’t see the image in the previous mail, so I’m attaching
it. Also, I don’t see the measure name when I click on DQ Matrix and My
Dashboards. Please see attached.
Regards
Ashwini
From: Ashwini Kumar Gupta
<[email protected]<mailto:[email protected]>>
Sent: 15 April 2018 12:07
To: [email protected]<mailto:[email protected]>;
Lionel Liu <[email protected]<mailto:[email protected]>>
Subject: RE: Re:RE: Can't get output after creating measure
Update:
Griffin was not able to reach HDFS because, while creating the DONE file in
the measure, I gave the path as /user/warehouse. However, I should have used a
fully qualified name; I think this was the issue.
Now that I have corrected it I no longer get the “can’t reach hdfs” error, but
I’m still not getting the output.
Livy dashboard: (inline screenshot attached)
Livy log file:
Warning: Skip remote jar hdfs:///griffin/griffin-measure.jar.
Warning: Skip remote jar hdfs:///livy/datanucleus-api-jdo-3.2.6.jar.
Warning: Skip remote jar hdfs:///livy/datanucleus-core-3.2.10.jar.
Warning: Skip remote jar hdfs:///livy/datanucleus-rdbms-3.2.9.jar.
java.lang.ClassNotFoundException: org.apache.griffin.measure.Application
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:176)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:689)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
NOTE: I have placed griffin-measure at hdfs:///griffin/griffin-measure.jar,
and the datanucleus files under hdfs:///livy/ (e.g.
hdfs:///livy/datanucleus-api-jdo-3.2.6.jar).
I copied the datanucleus jar files from the Hive library folder, since Livy
0.4.0 doesn’t ship these jars.
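To double-check the placement, the jars can be listed with the standard HDFS
CLI (output omitted here):

hdfs dfs -ls hdfs:///griffin/griffin-measure.jar
hdfs dfs -ls hdfs:///livy/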
Please suggest corrections.
Regards
Ashwini
From: Ashwini Kumar Gupta
<[email protected]<mailto:[email protected]>>
Sent: 15 April 2018 11:41
To: Lionel Liu <[email protected]>
Cc: [email protected]
Subject: RE: Re:RE: Can't get output after creating measure
Hello Lionel,
As suggested, I kept only one configuration, i.e.
sparkJob.jars =
hdfs:///livy/datanucleus-api-jdo-3.2.6.jar;\
hdfs:///livy/datanucleus-core-3.2.10.jar;\
hdfs:///livy/datanucleus-rdbms-3.2.9.jar
* I kept env.json, griffin-measure.jar, and hive-site.xml in
hdfs:///griffin/griffin-measure.jar.
* I created an accuracy measure and a job with a 1-minute cron expression.
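(For reference, a one-minute schedule in Quartz cron syntax, which Griffin’s
scheduler uses, is written as below; this is the generic form, not necessarily
the exact string I entered:)

0 0/1 * * * ?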
Attached is the log file. It seems Griffin can’t reach HDFS, although my
Cloudera HDFS is up.
Regards
Ashwini
From: Lionel Liu <[email protected]>
Sent: 13 April 2018 15:14
To: Ashwini Kumar Gupta
<[email protected]<mailto:[email protected]>>
Cc: [email protected]<mailto:[email protected]>
Subject: Re: Re:RE: Can't get output after creating measure
Hi Ashwini,
I've read your document; here are my answers:
Question
• Do I keep both of them?
You should keep only the effective “sparkJob.jars” parameter.
• Do I have to copy hive-site.xml to HDFS and give the HDFS path in
spark.yarn.dist.files?
You’d better copy hive-site.xml to HDFS, because Livy can only submit Spark
applications in cluster mode, so hive-site.xml must be accessible from every
node.
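A minimal sketch of that, assuming hive-site.xml sits in the usual Cloudera
location and hdfs:///griffin/ is your working directory:

hdfs dfs -put /etc/hive/conf/hive-site.xml hdfs:///griffin/

and then in sparkJob.properties:

spark.yarn.dist.files = hdfs:///griffin/hive-site.xml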
About the livy log:
According to the Livy log, it seems your sparkJob.properties configuration
doesn’t take effect: Livy is trying to find
hdfs:///griffin/griffin-measure.jar, not
hdfs:///user/griffin/griffin-measure.jar.
Please correct sparkJob.properties, rebuild the service module, and try again.
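The relevant keys should end up looking something like this (a sketch only;
sparkJob.file and sparkJob.className follow the default sparkJob.properties
layout, and the class name matches the ClassNotFoundException in your log):

sparkJob.file = hdfs:///user/griffin/griffin-measure.jar
sparkJob.className = org.apache.griffin.measure.Application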
Thanks,
Lionel
On Fri, Apr 13, 2018 at 4:16 PM, Ashwini Kumar Gupta
<[email protected]<mailto:[email protected]>> wrote:
Hello Lionel,
Apologies for the delayed reply; I was trying all my options before raising an
issue.
I’m attaching my installation steps. Please let me know what’s wrong with
them.
Regards
Ashwin
From: [email protected]<mailto:[email protected]>
<[email protected]<mailto:[email protected]>> On Behalf Of Lionel Liu
Sent: 10 April 2018 18:11
To: [email protected]<mailto:[email protected]>;
Ashwini Kumar Gupta
<[email protected]<mailto:[email protected]>>
Subject: Re:RE: Can't get output after creating measure
Hi Ashwini,
It works the same on a Linux OS; we need to check the logs to figure out what
happened. It might be a configuration mistake or an input mistake.
I recommend you try our Docker image first, by following this doc:
https://github.com/apache/incubator-griffin/blob/master/griffin-doc/docker/griffin-docker-guide.md
--
Regards,
Lionel, Liu
At 2018-04-10 19:25:01, "Ashwini Kumar Gupta"
<[email protected]<mailto:[email protected]>> wrote:
>Hello Lionel,
>
>I’m running this in the Cloudera VM. Will that change anything?
>
>Regards
>Ashwin
>
>From: Lionel Liu <[email protected]>
>Sent: 10 April 2018 15:26
>To: [email protected]<mailto:[email protected]>;
>Ashwini Kumar Gupta
><[email protected]<mailto:[email protected]>>
>Subject: Re: Can't get output after creating measure
>
>Hi Ashwini,
>
>First, you could check the Griffin service log to see whether it has
>triggered the job instance.
>Then, the Griffin service will submit a Spark application with its
>configuration to Livy; you can check the Livy log to verify that it was
>submitted correctly.
>After that, you need to check the Spark cluster to verify that the
>application has been accepted; once it runs, you can get the application log
>through YARN.
>
>An error at any of these steps can block the result.
>
>Thanks,
>Lionel
>
>On Tue, Apr 10, 2018 at 5:01 PM, William Guo
><[email protected]<mailto:[email protected]<mailto:[email protected]%3cmailto:[email protected]>>>
> wrote:
>hi Ashwin,
>
>Could you show us your log here?
>
>Thanks,
>William
>
>On Tue, Apr 10, 2018 at 3:35 PM, Ashwini Kumar Gupta <
>[email protected]<mailto:[email protected]<mailto:[email protected]%3cmailto:[email protected]>>>
> wrote:
>
>> Hello Team,
>>
>> I have been trying to install and use Griffin, but I get no output when
>> I click on DQ Matrix.
>>
>> I created a measure and a job to run.
>> The sequence in which I run all the services is:
>>
>>
>> 1. Elasticsearch
>> 2. Jar file
>>
>> I also noticed that Griffin is not creating the mapping in ES.
>>
>> Can you please tell me where I'm going wrong?
>>
>> Thanks
>> Ashwin
>>
>
2018-04-15 02:31:00.706 INFO 1753 --- [ryBean_Worker-1]
o.a.griffin.core.job.SparkSubmitJob :
{"id":2,"state":"starting","appId":null,"appInfo":{"driverLogUrl":null,"sparkUiUrl":null},"log":["stdout:
","\nstderr: "]}
2018-04-15 02:31:00.723 INFO 1753 --- [ryBean_Worker-1]
o.a.griffin.core.job.SparkSubmitJob : Delete predicate
job(PG,test_job_predicate_1523784660232) success.
Hibernate: select jobinstanc0_.id as id1_7_0_, jobinstanc0_.created_date as
created_2_7_0_, jobinstanc0_.modified_date as modified3_7_0_,
jobinstanc0_.app_id as app_id4_7_0_, jobinstanc0_.app_uri as app_uri5_7_0_,
jobinstanc0_.predicate_job_deleted as predicat6_7_0_,
jobinstanc0_.expire_timestamp as expire_t7_7_0_,
jobinstanc0_.predicate_group_name as predicat8_7_0_,
jobinstanc0_.predicate_job_name as predicat9_7_0_, jobinstanc0_.session_id as
session10_7_0_, jobinstanc0_.state as state11_7_0_, jobinstanc0_.timestamp as
timesta12_7_0_ from job_instance_bean jobinstanc0_ where jobinstanc0_.id=?
Hibernate: update job_instance_bean set created_date=?, modified_date=?,
app_id=?, app_uri=?, predicate_job_deleted=?, expire_timestamp=?,
predicate_group_name=?, predicate_job_name=?, session_id=?, state=?,
timestamp=? where id=?
Hibernate: select distinct jobinstanc0_.id as id1_7_, jobinstanc0_.created_date
as created_2_7_, jobinstanc0_.modified_date as modified3_7_,
jobinstanc0_.app_id as app_id4_7_, jobinstanc0_.app_uri as app_uri5_7_,
jobinstanc0_.predicate_job_deleted as predicat6_7_,
jobinstanc0_.expire_timestamp as expire_t7_7_,
jobinstanc0_.predicate_group_name as predicat8_7_,
jobinstanc0_.predicate_job_name as predicat9_7_, jobinstanc0_.session_id as
session10_7_, jobinstanc0_.state as state11_7_, jobinstanc0_.timestamp as
timesta12_7_ from job_instance_bean jobinstanc0_ where jobinstanc0_.state in
('starting' , 'not_started' , 'recovering' , 'idle' , 'running' , 'busy')
Hibernate: select jobinstanc0_.id as id1_7_0_, jobinstanc0_.created_date as
created_2_7_0_, jobinstanc0_.modified_date as modified3_7_0_,
jobinstanc0_.app_id as app_id4_7_0_, jobinstanc0_.app_uri as app_uri5_7_0_,
jobinstanc0_.predicate_job_deleted as predicat6_7_0_,
jobinstanc0_.expire_timestamp as expire_t7_7_0_,
jobinstanc0_.predicate_group_name as predicat8_7_0_,
jobinstanc0_.predicate_job_name as predicat9_7_0_, jobinstanc0_.session_id as
session10_7_0_, jobinstanc0_.state as state11_7_0_, jobinstanc0_.timestamp as
timesta12_7_0_ from job_instance_bean jobinstanc0_ where jobinstanc0_.id=?
Hibernate: update job_instance_bean set created_date=?, modified_date=?,
app_id=?, app_uri=?, predicate_job_deleted=?, expire_timestamp=?,
predicate_group_name=?, predicate_job_name=?, session_id=?, state=?,
timestamp=? where id=?
Hibernate: select jobschedul0_.id as id1_8_0_, jobschedul0_.created_date as
created_2_8_0_, jobschedul0_.modified_date as modified3_8_0_,
jobschedul0_.cron_expression as cron_exp4_8_0_, jobschedul0_.job_name as
job_name5_8_0_, jobschedul0_.measure_id as measure_6_8_0_,
jobschedul0_.predicate_config as predicat7_8_0_, jobschedul0_.time_zone as
time_zon8_8_0_, segments1_.job_schedule_id as job_sche7_6_1_, segments1_.id as
id1_6_1_, segments1_.id as id1_6_2_, segments1_.created_date as created_2_6_2_,
segments1_.modified_date as modified3_6_2_, segments1_.baseline as
baseline4_6_2_, segments1_.data_connector_name as data_con5_6_2_,
segments1_.segment_range_id as segment_6_6_2_, segmentran2_.id as id1_12_3_,
segmentran2_.created_date as created_2_12_3_, segmentran2_.modified_date as
modified3_12_3_, segmentran2_.data_begin as data_beg4_12_3_,
segmentran2_.length as length5_12_3_ from job_schedule jobschedul0_ left outer
join job_data_segment segments1_ on jobschedul0_.id=segments1_.job_schedule_id
left outer join segment_range segmentran2_ on
segments1_.segment_range_id=segmentran2_.id where jobschedul0_.id=?
2018-04-15 02:32:00.075 INFO 1753 --- [ryBean_Worker-2]
o.s.b.f.config.PropertiesFactoryBean : Loading properties file from class
path resource [application.properties]
2018-04-15 02:32:00.076 INFO 1753 --- [ryBean_Worker-2]
o.a.griffin.core.util.PropertiesUtil : Read properties successfully from
/application.properties.
Hibernate: select griffinjob0_.id as id2_5_0_, griffinjob0_.created_date as
created_3_5_0_, griffinjob0_.modified_date as modified4_5_0_,
griffinjob0_.deleted as deleted5_5_0_, griffinjob0_.job_name as job_name6_5_0_,
griffinjob0_.measure_id as measure_7_5_0_, griffinjob0_.metric_name as
metric_n8_5_0_, griffinjob0_.quartz_group_name as quartz_g9_5_0_,
griffinjob0_.quartz_job_name as quartz_10_5_0_, jobinstanc1_.job_id as
job_id13_7_1_, jobinstanc1_.id as id1_7_1_, jobinstanc1_.id as id1_7_2_,
jobinstanc1_.created_date as created_2_7_2_, jobinstanc1_.modified_date as
modified3_7_2_, jobinstanc1_.app_id as app_id4_7_2_, jobinstanc1_.app_uri as
app_uri5_7_2_, jobinstanc1_.predicate_job_deleted as predicat6_7_2_,
jobinstanc1_.expire_timestamp as expire_t7_7_2_,
jobinstanc1_.predicate_group_name as predicat8_7_2_,
jobinstanc1_.predicate_job_name as predicat9_7_2_, jobinstanc1_.session_id as
session10_7_2_, jobinstanc1_.state as state11_7_2_, jobinstanc1_.timestamp as
timesta12_7_2_ from job griffinjob0_ left outer join job_instance_bean
jobinstanc1_ on griffinjob0_.id=jobinstanc1_.job_id where griffinjob0_.id=? and
griffinjob0_.type='griffin_job'
Hibernate: select griffinmea0_.id as id1_9_0_, griffinmea0_1_.created_date as
created_2_9_0_, griffinmea0_1_.modified_date as modified3_9_0_,
griffinmea0_1_.deleted as deleted4_9_0_, griffinmea0_1_.description as
descript5_9_0_, griffinmea0_1_.dq_type as dq_type6_9_0_, griffinmea0_1_.name as
name7_9_0_, griffinmea0_1_.organization as organiza8_9_0_, griffinmea0_1_.owner
as owner9_9_0_, griffinmea0_.evaluate_rule_id as evaluate4_4_0_,
griffinmea0_.process_type as process_1_4_0_, griffinmea0_.rule_description as
rule_des2_4_0_, datasource1_.measure_id as measure_5_1_1_, datasource1_.id as
id1_1_1_, datasource1_.id as id1_1_2_, datasource1_.created_date as
created_2_1_2_, datasource1_.modified_date as modified3_1_2_, datasource1_.name
as name4_1_2_, connectors2_.data_source_id as data_so10_0_3_, connectors2_.id
as id1_0_3_, connectors2_.id as id1_0_4_, connectors2_.created_date as
created_2_0_4_, connectors2_.modified_date as modified3_0_4_,
connectors2_.config as config4_0_4_, connectors2_.data_time_zone as
data_tim5_0_4_, connectors2_.data_unit as data_uni6_0_4_, connectors2_.name as
name7_0_4_, connectors2_.type as type8_0_4_, connectors2_.version as
version9_0_4_, evaluateru3_.id as id1_2_5_, evaluateru3_.created_date as
created_2_2_5_, evaluateru3_.modified_date as modified3_2_5_ from
griffin_measure griffinmea0_ inner join measure griffinmea0_1_ on
griffinmea0_.id=griffinmea0_1_.id left outer join data_source datasource1_ on
griffinmea0_.id=datasource1_.measure_id left outer join data_connector
connectors2_ on datasource1_.id=connectors2_.data_source_id left outer join
evaluate_rule evaluateru3_ on griffinmea0_.evaluate_rule_id=evaluateru3_.id
where griffinmea0_.id=?
Hibernate: select predicates0_.data_connector_id as data_con6_11_0_,
predicates0_.id as id1_11_0_, predicates0_.id as id1_11_1_,
predicates0_.created_date as created_2_11_1_, predicates0_.modified_date as
modified3_11_1_, predicates0_.config as config4_11_1_, predicates0_.type as
type5_11_1_ from segment_predicate predicates0_ where
predicates0_.data_connector_id=?
Hibernate: select rules0_.evaluate_rule_id as evaluat11_10_0_, rules0_.id as
id1_10_0_, rules0_.id as id1_10_1_, rules0_.created_date as created_2_10_1_,
rules0_.modified_date as modified3_10_1_, rules0_.details as details4_10_1_,
rules0_.dq_type as dq_type5_10_1_, rules0_.dsl_type as dsl_type6_10_1_,
rules0_.metric as metric7_10_1_, rules0_.name as name8_10_1_, rules0_.record as
record9_10_1_, rules0_.rule as rule10_10_1_ from rule rules0_ where
rules0_.evaluate_rule_id=?
Hibernate: select predicates0_.data_connector_id as data_con6_11_0_,
predicates0_.id as id1_11_0_, predicates0_.id as id1_11_1_,
predicates0_.created_date as created_2_11_1_, predicates0_.modified_date as
modified3_11_1_, predicates0_.config as config4_11_1_, predicates0_.type as
type5_11_1_ from segment_predicate predicates0_ where
predicates0_.data_connector_id=?
Hibernate: select griffinjob0_.id as id2_5_1_, griffinjob0_.created_date as
created_3_5_1_, griffinjob0_.modified_date as modified4_5_1_,
griffinjob0_.deleted as deleted5_5_1_, griffinjob0_.job_name as job_name6_5_1_,
griffinjob0_.measure_id as measure_7_5_1_, griffinjob0_.metric_name as
metric_n8_5_1_, griffinjob0_.quartz_group_name as quartz_g9_5_1_,
griffinjob0_.quartz_job_name as quartz_10_5_1_, jobinstanc1_.job_id as
job_id13_7_3_, jobinstanc1_.id as id1_7_3_, jobinstanc1_.id as id1_7_0_,
jobinstanc1_.created_date as created_2_7_0_, jobinstanc1_.modified_date as
modified3_7_0_, jobinstanc1_.app_id as app_id4_7_0_, jobinstanc1_.app_uri as
app_uri5_7_0_, jobinstanc1_.predicate_job_deleted as predicat6_7_0_,
jobinstanc1_.expire_timestamp as expire_t7_7_0_,
jobinstanc1_.predicate_group_name as predicat8_7_0_,
jobinstanc1_.predicate_job_name as predicat9_7_0_, jobinstanc1_.session_id as
session10_7_0_, jobinstanc1_.state as state11_7_0_, jobinstanc1_.timestamp as
timesta12_7_0_ from job griffinjob0_ left outer join job_instance_bean
jobinstanc1_ on griffinjob0_.id=jobinstanc1_.job_id where griffinjob0_.id=? and
griffinjob0_.type='griffin_job'
Hibernate: insert into job_instance_bean (created_date, modified_date, app_id,
app_uri, predicate_job_deleted, expire_timestamp, predicate_group_name,
predicate_job_name, session_id, state, timestamp) values (?, ?, ?, ?, ?, ?, ?,
?, ?, ?, ?)
Hibernate: update job_instance_bean set job_id=? where id=?
Hibernate: select jobinstanc0_.id as id1_7_, jobinstanc0_.created_date as
created_2_7_, jobinstanc0_.modified_date as modified3_7_, jobinstanc0_.app_id
as app_id4_7_, jobinstanc0_.app_uri as app_uri5_7_,
jobinstanc0_.predicate_job_deleted as predicat6_7_,
jobinstanc0_.expire_timestamp as expire_t7_7_,
jobinstanc0_.predicate_group_name as predicat8_7_,
jobinstanc0_.predicate_job_name as predicat9_7_, jobinstanc0_.session_id as
session10_7_, jobinstanc0_.state as state11_7_, jobinstanc0_.timestamp as
timesta12_7_ from job_instance_bean jobinstanc0_ where
jobinstanc0_.predicate_job_name=?
2018-04-15 02:32:00.397 INFO 1753 --- [ryBean_Worker-3]
o.a.griffin.core.job.SparkSubmitJob : {
"measure.type" : "griffin",
"id" : 2,
"name" : "test_job",
"owner" : "test",
"description" : null,
"organization" : null,
"deleted" : false,
"timestamp" : 1523781120000,
"dq.type" : "accuracy",
"process.type" : "batch",
"data.sources" : [ {
"id" : 3,
"name" : "source",
"connectors" : [ {
"id" : 3,
"name" : "source1523773590187",
"type" : "HIVE",
"version" : "1.2",
"predicates" : [ ],
"data.unit" : "1hour",
"data.time.zone" : "",
"config" : {
"database" : "griffin",
"table.name" : "emp_src",
"where" : "dt=20180415 AND hour=01"
}
} ]
}, {
"id" : 4,
"name" : "target",
"connectors" : [ {
"id" : 4,
"name" : "target1523773598244",
"type" : "HIVE",
"version" : "1.2",
"predicates" : [ ],
"data.unit" : "1hour",
"data.time.zone" : "",
"config" : {
"database" : "griffin",
"table.name" : "emp_tgt",
"where" : "dt=20180415 AND hour=01"
}
} ]
} ],
"evaluate.rule" : {
"id" : 2,
"rules" : [ {
"id" : 2,
"rule" : "source.id=target.id AND source.name=target.name AND
source.city=target.city",
"name" : "accuracy",
"dsl.type" : "griffin-dsl",
"dq.type" : "accuracy"
} ]
},
"measure.type" : "griffin"
}
2018-04-15 02:32:00.469 INFO 1753 --- [ryBean_Worker-3]
o.a.griffin.core.job.SparkSubmitJob :
{"id":3,"state":"starting","appId":null,"appInfo":{"driverLogUrl":null,"sparkUiUrl":null},"log":["stdout:
","\nstderr: "]}
2018-04-15 02:32:00.482 INFO 1753 --- [ryBean_Worker-3]
o.a.griffin.core.job.SparkSubmitJob : Delete predicate
job(PG,test_job_predicate_1523784720158) success.
Hibernate: select jobinstanc0_.id as id1_7_0_, jobinstanc0_.created_date as
created_2_7_0_, jobinstanc0_.modified_date as modified3_7_0_,
jobinstanc0_.app_id as app_id4_7_0_, jobinstanc0_.app_uri as app_uri5_7_0_,
jobinstanc0_.predicate_job_deleted as predicat6_7_0_,
jobinstanc0_.expire_timestamp as expire_t7_7_0_,
jobinstanc0_.predicate_group_name as predicat8_7_0_,
jobinstanc0_.predicate_job_name as predicat9_7_0_, jobinstanc0_.session_id as
session10_7_0_, jobinstanc0_.state as state11_7_0_, jobinstanc0_.timestamp as
timesta12_7_0_ from job_instance_bean jobinstanc0_ where jobinstanc0_.id=?
Hibernate: update job_instance_bean set created_date=?, modified_date=?,
app_id=?, app_uri=?, predicate_job_deleted=?, expire_timestamp=?,
predicate_group_name=?, predicate_job_name=?, session_id=?, state=?,
timestamp=? where id=?
Hibernate: select distinct jobinstanc0_.id as id1_7_, jobinstanc0_.created_date
as created_2_7_, jobinstanc0_.modified_date as modified3_7_,
jobinstanc0_.app_id as app_id4_7_, jobinstanc0_.app_uri as app_uri5_7_,
jobinstanc0_.predicate_job_deleted as predicat6_7_,
jobinstanc0_.expire_timestamp as expire_t7_7_,
jobinstanc0_.predicate_group_name as predicat8_7_,
jobinstanc0_.predicate_job_name as predicat9_7_, jobinstanc0_.session_id as
session10_7_, jobinstanc0_.state as state11_7_, jobinstanc0_.timestamp as
timesta12_7_ from job_instance_bean jobinstanc0_ where jobinstanc0_.state in
('starting' , 'not_started' , 'recovering' , 'idle' , 'running' , 'busy')
Hibernate: select jobinstanc0_.id as id1_7_0_, jobinstanc0_.created_date as
created_2_7_0_, jobinstanc0_.modified_date as modified3_7_0_,
jobinstanc0_.app_id as app_id4_7_0_, jobinstanc0_.app_uri as app_uri5_7_0_,
jobinstanc0_.predicate_job_deleted as predicat6_7_0_,
jobinstanc0_.expire_timestamp as expire_t7_7_0_,
jobinstanc0_.predicate_group_name as predicat8_7_0_,
jobinstanc0_.predicate_job_name as predicat9_7_0_, jobinstanc0_.session_id as
session10_7_0_, jobinstanc0_.state as state11_7_0_, jobinstanc0_.timestamp as
timesta12_7_0_ from job_instance_bean jobinstanc0_ where jobinstanc0_.id=?
Hibernate: update job_instance_bean set created_date=?, modified_date=?,
app_id=?, app_uri=?, predicate_job_deleted=?, expire_timestamp=?,
predicate_group_name=?, predicate_job_name=?, session_id=?, state=?,
timestamp=? where id=?
Hibernate: select griffinjob0_.id as id2_5_, griffinjob0_.created_date as
created_3_5_, griffinjob0_.modified_date as modified4_5_, griffinjob0_.deleted
as deleted5_5_, griffinjob0_.job_name as job_name6_5_, griffinjob0_.measure_id
as measure_7_5_, griffinjob0_.metric_name as metric_n8_5_,
griffinjob0_.quartz_group_name as quartz_g9_5_, griffinjob0_.quartz_job_name as
quartz_10_5_ from job griffinjob0_ where griffinjob0_.type='griffin_job' and
griffinjob0_.deleted=?
Hibernate: select jobinstanc0_.job_id as job_id13_7_0_, jobinstanc0_.id as
id1_7_0_, jobinstanc0_.id as id1_7_1_, jobinstanc0_.created_date as
created_2_7_1_, jobinstanc0_.modified_date as modified3_7_1_,
jobinstanc0_.app_id as app_id4_7_1_, jobinstanc0_.app_uri as app_uri5_7_1_,
jobinstanc0_.predicate_job_deleted as predicat6_7_1_,
jobinstanc0_.expire_timestamp as expire_t7_7_1_,
jobinstanc0_.predicate_group_name as predicat8_7_1_,
jobinstanc0_.predicate_job_name as predicat9_7_1_, jobinstanc0_.session_id as
session10_7_1_, jobinstanc0_.state as state11_7_1_, jobinstanc0_.timestamp as
timesta12_7_1_ from job_instance_bean jobinstanc0_ where jobinstanc0_.job_id=?
Hibernate: select abstractjo0_.id as id2_5_, abstractjo0_.created_date as
created_3_5_, abstractjo0_.modified_date as modified4_5_, abstractjo0_.deleted
as deleted5_5_, abstractjo0_.job_name as job_name6_5_, abstractjo0_.measure_id
as measure_7_5_, abstractjo0_.metric_name as metric_n8_5_,
abstractjo0_.quartz_group_name as quartz_g9_5_, abstractjo0_.quartz_job_name as
quartz_10_5_, abstractjo0_.type as type1_5_ from job abstractjo0_ where
abstractjo0_.deleted=?
Hibernate: select jobinstanc0_.job_id as job_id13_7_0_, jobinstanc0_.id as
id1_7_0_, jobinstanc0_.id as id1_7_1_, jobinstanc0_.created_date as
created_2_7_1_, jobinstanc0_.modified_date as modified3_7_1_,
jobinstanc0_.app_id as app_id4_7_1_, jobinstanc0_.app_uri as app_uri5_7_1_,
jobinstanc0_.predicate_job_deleted as predicat6_7_1_,
jobinstanc0_.expire_timestamp as expire_t7_7_1_,
jobinstanc0_.predicate_group_name as predicat8_7_1_,
jobinstanc0_.predicate_job_name as predicat9_7_1_, jobinstanc0_.session_id as
session10_7_1_, jobinstanc0_.state as state11_7_1_, jobinstanc0_.timestamp as
timesta12_7_1_ from job_instance_bean jobinstanc0_ where jobinstanc0_.job_id=?
Hibernate: select measure0_.id as id1_9_, measure0_.created_date as
created_2_9_, measure0_.modified_date as modified3_9_, measure0_.deleted as
deleted4_9_, measure0_.description as descript5_9_, measure0_.dq_type as
dq_type6_9_, measure0_.name as name7_9_, measure0_.organization as
organiza8_9_, measure0_.owner as owner9_9_, measure0_1_.evaluate_rule_id as
evaluate4_4_, measure0_1_.process_type as process_1_4_,
measure0_1_.rule_description as rule_des2_4_, measure0_2_.metric_name as
metric_n1_3_, measure0_2_.virtual_job_id as virtual_3_3_, case when
measure0_1_.id is not null then 1 when measure0_2_.id is not null then 2 when
measure0_.id is not null then 0 end as clazz_ from measure measure0_ left outer
join griffin_measure measure0_1_ on measure0_.id=measure0_1_.id left outer join
external_measure measure0_2_ on measure0_.id=measure0_2_.id where
measure0_.deleted=?
Hibernate: select evaluateru0_.id as id1_2_0_, evaluateru0_.created_date as
created_2_2_0_, evaluateru0_.modified_date as modified3_2_0_ from evaluate_rule
evaluateru0_ where evaluateru0_.id=?
Hibernate: select rules0_.evaluate_rule_id as evaluat11_10_0_, rules0_.id as
id1_10_0_, rules0_.id as id1_10_1_, rules0_.created_date as created_2_10_1_,
rules0_.modified_date as modified3_10_1_, rules0_.details as details4_10_1_,
rules0_.dq_type as dq_type5_10_1_, rules0_.dsl_type as dsl_type6_10_1_,
rules0_.metric as metric7_10_1_, rules0_.name as name8_10_1_, rules0_.record as
record9_10_1_, rules0_.rule as rule10_10_1_ from rule rules0_ where
rules0_.evaluate_rule_id=?
Hibernate: select datasource0_.measure_id as measure_5_1_0_, datasource0_.id as
id1_1_0_, datasource0_.id as id1_1_1_, datasource0_.created_date as
created_2_1_1_, datasource0_.modified_date as modified3_1_1_, datasource0_.name
as name4_1_1_ from data_source datasource0_ where datasource0_.measure_id=?
Hibernate: select connectors0_.data_source_id as data_so10_0_0_,
connectors0_.id as id1_0_0_, connectors0_.id as id1_0_1_,
connectors0_.created_date as created_2_0_1_, connectors0_.modified_date as
modified3_0_1_, connectors0_.config as config4_0_1_,
connectors0_.data_time_zone as data_tim5_0_1_, connectors0_.data_unit as
data_uni6_0_1_, connectors0_.name as name7_0_1_, connectors0_.type as
type8_0_1_, connectors0_.version as version9_0_1_ from data_connector
connectors0_ where connectors0_.data_source_id=?
Hibernate: select predicates0_.data_connector_id as data_con6_11_0_,
predicates0_.id as id1_11_0_, predicates0_.id as id1_11_1_,
predicates0_.created_date as created_2_11_1_, predicates0_.modified_date as
modified3_11_1_, predicates0_.config as config4_11_1_, predicates0_.type as
type5_11_1_ from segment_predicate predicates0_ where
predicates0_.data_connector_id=?
Hibernate: select connectors0_.data_source_id as data_so10_0_0_,
connectors0_.id as id1_0_0_, connectors0_.id as id1_0_1_,
connectors0_.created_date as created_2_0_1_, connectors0_.modified_date as
modified3_0_1_, connectors0_.config as config4_0_1_,
connectors0_.data_time_zone as data_tim5_0_1_, connectors0_.data_unit as
data_uni6_0_1_, connectors0_.name as name7_0_1_, connectors0_.type as
type8_0_1_, connectors0_.version as version9_0_1_ from data_connector
connectors0_ where connectors0_.data_source_id=?
Hibernate: select predicates0_.data_connector_id as data_con6_11_0_,
predicates0_.id as id1_11_0_, predicates0_.id as id1_11_1_,
predicates0_.created_date as created_2_11_1_, predicates0_.modified_date as
modified3_11_1_, predicates0_.config as config4_11_1_, predicates0_.type as
type5_11_1_ from segment_predicate predicates0_ where
predicates0_.data_connector_id=?
Hibernate: select abstractjo0_.id as id2_5_, abstractjo0_.created_date as
created_3_5_, abstractjo0_.modified_date as modified4_5_, abstractjo0_.deleted
as deleted5_5_, abstractjo0_.job_name as job_name6_5_, abstractjo0_.measure_id
as measure_7_5_, abstractjo0_.metric_name as metric_n8_5_,
abstractjo0_.quartz_group_name as quartz_g9_5_, abstractjo0_.quartz_job_name as
quartz_10_5_, abstractjo0_.type as type1_5_ from job abstractjo0_ where
abstractjo0_.deleted=?
Hibernate: select jobinstanc0_.job_id as job_id13_7_0_, jobinstanc0_.id as
id1_7_0_, jobinstanc0_.id as id1_7_1_, jobinstanc0_.created_date as
created_2_7_1_, jobinstanc0_.modified_date as modified3_7_1_,
jobinstanc0_.app_id as app_id4_7_1_, jobinstanc0_.app_uri as app_uri5_7_1_,
jobinstanc0_.predicate_job_deleted as predicat6_7_1_,
jobinstanc0_.expire_timestamp as expire_t7_7_1_,
jobinstanc0_.predicate_group_name as predicat8_7_1_,
jobinstanc0_.predicate_job_name as predicat9_7_1_, jobinstanc0_.session_id as
session10_7_1_, jobinstanc0_.state as state11_7_1_, jobinstanc0_.timestamp as
timesta12_7_1_ from job_instance_bean jobinstanc0_ where jobinstanc0_.job_id=?
Hibernate: select measure0_.id as id1_9_, measure0_.created_date as
created_2_9_, measure0_.modified_date as modified3_9_, measure0_.deleted as
deleted4_9_, measure0_.description as descript5_9_, measure0_.dq_type as
dq_type6_9_, measure0_.name as name7_9_, measure0_.organization as
organiza8_9_, measure0_.owner as owner9_9_, measure0_1_.evaluate_rule_id as
evaluate4_4_, measure0_1_.process_type as process_1_4_,
measure0_1_.rule_description as rule_des2_4_, measure0_2_.metric_name as
metric_n1_3_, measure0_2_.virtual_job_id as virtual_3_3_, case when
measure0_1_.id is not null then 1 when measure0_2_.id is not null then 2 when
measure0_.id is not null then 0 end as clazz_ from measure measure0_ left outer
join griffin_measure measure0_1_ on measure0_.id=measure0_1_.id left outer join
external_measure measure0_2_ on measure0_.id=measure0_2_.id where
measure0_.deleted=?
Hibernate: select evaluateru0_.id as id1_2_0_, evaluateru0_.created_date as
created_2_2_0_, evaluateru0_.modified_date as modified3_2_0_ from evaluate_rule
evaluateru0_ where evaluateru0_.id=?
Hibernate: select rules0_.evaluate_rule_id as evaluat11_10_0_, rules0_.id as
id1_10_0_, rules0_.id as id1_10_1_, rules0_.created_date as created_2_10_1_,
rules0_.modified_date as modified3_10_1_, rules0_.details as details4_10_1_,
rules0_.dq_type as dq_type5_10_1_, rules0_.dsl_type as dsl_type6_10_1_,
rules0_.metric as metric7_10_1_, rules0_.name as name8_10_1_, rules0_.record as
record9_10_1_, rules0_.rule as rule10_10_1_ from rule rules0_ where
rules0_.evaluate_rule_id=?
Hibernate: select datasource0_.measure_id as measure_5_1_0_, datasource0_.id as
id1_1_0_, datasource0_.id as id1_1_1_, datasource0_.created_date as
created_2_1_1_, datasource0_.modified_date as modified3_1_1_, datasource0_.name
as name4_1_1_ from data_source datasource0_ where datasource0_.measure_id=?
Hibernate: select connectors0_.data_source_id as data_so10_0_0_,
connectors0_.id as id1_0_0_, connectors0_.id as id1_0_1_,
connectors0_.created_date as created_2_0_1_, connectors0_.modified_date as
modified3_0_1_, connectors0_.config as config4_0_1_,
connectors0_.data_time_zone as data_tim5_0_1_, connectors0_.data_unit as
data_uni6_0_1_, connectors0_.name as name7_0_1_, connectors0_.type as
type8_0_1_, connectors0_.version as version9_0_1_ from data_connector
connectors0_ where connectors0_.data_source_id=?
Hibernate: select predicates0_.data_connector_id as data_con6_11_0_,
predicates0_.id as id1_11_0_, predicates0_.id as id1_11_1_,
predicates0_.created_date as created_2_11_1_, predicates0_.modified_date as
modified3_11_1_, predicates0_.config as config4_11_1_, predicates0_.type as
type5_11_1_ from segment_predicate predicates0_ where
predicates0_.data_connector_id=?
Hibernate: select connectors0_.data_source_id as data_so10_0_0_,
connectors0_.id as id1_0_0_, connectors0_.id as id1_0_1_,
connectors0_.created_date as created_2_0_1_, connectors0_.modified_date as
modified3_0_1_, connectors0_.config as config4_0_1_,
connectors0_.data_time_zone as data_tim5_0_1_, connectors0_.data_unit as
data_uni6_0_1_, connectors0_.name as name7_0_1_, connectors0_.type as
type8_0_1_, connectors0_.version as version9_0_1_ from data_connector
connectors0_ where connectors0_.data_source_id=?
Hibernate: select predicates0_.data_connector_id as data_con6_11_0_,
predicates0_.id as id1_11_0_, predicates0_.id as id1_11_1_,
predicates0_.created_date as created_2_11_1_, predicates0_.modified_date as
modified3_11_1_, predicates0_.config as config4_11_1_, predicates0_.type as
type5_11_1_ from segment_predicate predicates0_ where
predicates0_.data_connector_id=?
Hibernate: select jobschedul0_.id as id1_8_0_, jobschedul0_.created_date as
created_2_8_0_, jobschedul0_.modified_date as modified3_8_0_,
jobschedul0_.cron_expression as cron_exp4_8_0_, jobschedul0_.job_name as
job_name5_8_0_, jobschedul0_.measure_id as measure_6_8_0_,
jobschedul0_.predicate_config as predicat7_8_0_, jobschedul0_.time_zone as
time_zon8_8_0_, segments1_.job_schedule_id as job_sche7_6_1_, segments1_.id as
id1_6_1_, segments1_.id as id1_6_2_, segments1_.created_date as created_2_6_2_,
segments1_.modified_date as modified3_6_2_, segments1_.baseline as
baseline4_6_2_, segments1_.data_connector_name as data_con5_6_2_,
segments1_.segment_range_id as segment_6_6_2_, segmentran2_.id as id1_12_3_,
segmentran2_.created_date as created_2_12_3_, segmentran2_.modified_date as
modified3_12_3_, segmentran2_.data_begin as data_beg4_12_3_,
segmentran2_.length as length5_12_3_ from job_schedule jobschedul0_ left outer
join job_data_segment segments1_ on jobschedul0_.id=segments1_.job_schedule_id
left outer join segment_range segmentran2_ on
segments1_.segment_range_id=segmentran2_.id where jobschedul0_.id=?
2018-04-15 02:33:00.069 INFO 1753 --- [ryBean_Worker-4]
o.s.b.f.config.PropertiesFactoryBean : Loading properties file from class
path resource [application.properties]
2018-04-15 02:33:00.070 INFO 1753 --- [ryBean_Worker-4]
o.a.griffin.core.util.PropertiesUtil : Read properties successfully from
/application.properties.
Hibernate: select griffinjob0_.id as id2_5_0_, griffinjob0_.created_date as
created_3_5_0_, griffinjob0_.modified_date as modified4_5_0_,
griffinjob0_.deleted as deleted5_5_0_, griffinjob0_.job_name as job_name6_5_0_,
griffinjob0_.measure_id as measure_7_5_0_, griffinjob0_.metric_name as
metric_n8_5_0_, griffinjob0_.quartz_group_name as quartz_g9_5_0_,
griffinjob0_.quartz_job_name as quartz_10_5_0_, jobinstanc1_.job_id as
job_id13_7_1_, jobinstanc1_.id as id1_7_1_, jobinstanc1_.id as id1_7_2_,
jobinstanc1_.created_date as created_2_7_2_, jobinstanc1_.modified_date as
modified3_7_2_, jobinstanc1_.app_id as app_id4_7_2_, jobinstanc1_.app_uri as
app_uri5_7_2_, jobinstanc1_.predicate_job_deleted as predicat6_7_2_,
jobinstanc1_.expire_timestamp as expire_t7_7_2_,
jobinstanc1_.predicate_group_name as predicat8_7_2_,
jobinstanc1_.predicate_job_name as predicat9_7_2_, jobinstanc1_.session_id as
session10_7_2_, jobinstanc1_.state as state11_7_2_, jobinstanc1_.timestamp as
timesta12_7_2_ from job griffinjob0_ left outer join job_instance_bean
jobinstanc1_ on griffinjob0_.id=jobinstanc1_.job_id where griffinjob0_.id=? and
griffinjob0_.type='griffin_job'
Hibernate: select griffinmea0_.id as id1_9_0_, griffinmea0_1_.created_date as
created_2_9_0_, griffinmea0_1_.modified_date as modified3_9_0_,
griffinmea0_1_.deleted as deleted4_9_0_, griffinmea0_1_.description as
descript5_9_0_, griffinmea0_1_.dq_type as dq_type6_9_0_, griffinmea0_1_.name as
name7_9_0_, griffinmea0_1_.organization as organiza8_9_0_, griffinmea0_1_.owner
as owner9_9_0_, griffinmea0_.evaluate_rule_id as evaluate4_4_0_,
griffinmea0_.process_type as process_1_4_0_, griffinmea0_.rule_description as
rule_des2_4_0_, datasource1_.measure_id as measure_5_1_1_, datasource1_.id as
id1_1_1_, datasource1_.id as id1_1_2_, datasource1_.created_date as
created_2_1_2_, datasource1_.modified_date as modified3_1_2_, datasource1_.name
as name4_1_2_, connectors2_.data_source_id as data_so10_0_3_, connectors2_.id
as id1_0_3_, connectors2_.id as id1_0_4_, connectors2_.created_date as
created_2_0_4_, connectors2_.modified_date as modified3_0_4_,
connectors2_.config as config4_0_4_, connectors2_.data_time_zone as
data_tim5_0_4_, connectors2_.data_unit as data_uni6_0_4_, connectors2_.name as
name7_0_4_, connectors2_.type as type8_0_4_, connectors2_.version as
version9_0_4_, evaluateru3_.id as id1_2_5_, evaluateru3_.created_date as
created_2_2_5_, evaluateru3_.modified_date as modified3_2_5_ from
griffin_measure griffinmea0_ inner join measure griffinmea0_1_ on
griffinmea0_.id=griffinmea0_1_.id left outer join data_source datasource1_ on
griffinmea0_.id=datasource1_.measure_id left outer join data_connector
connectors2_ on datasource1_.id=connectors2_.data_source_id left outer join
evaluate_rule evaluateru3_ on griffinmea0_.evaluate_rule_id=evaluateru3_.id
where griffinmea0_.id=?
Hibernate: select predicates0_.data_connector_id as data_con6_11_0_,
predicates0_.id as id1_11_0_, predicates0_.id as id1_11_1_,
predicates0_.created_date as created_2_11_1_, predicates0_.modified_date as
modified3_11_1_, predicates0_.config as config4_11_1_, predicates0_.type as
type5_11_1_ from segment_predicate predicates0_ where
predicates0_.data_connector_id=?
Hibernate: select rules0_.evaluate_rule_id as evaluat11_10_0_, rules0_.id as
id1_10_0_, rules0_.id as id1_10_1_, rules0_.created_date as created_2_10_1_,
rules0_.modified_date as modified3_10_1_, rules0_.details as details4_10_1_,
rules0_.dq_type as dq_type5_10_1_, rules0_.dsl_type as dsl_type6_10_1_,
rules0_.metric as metric7_10_1_, rules0_.name as name8_10_1_, rules0_.record as
record9_10_1_, rules0_.rule as rule10_10_1_ from rule rules0_ where
rules0_.evaluate_rule_id=?
Hibernate: select predicates0_.data_connector_id as data_con6_11_0_,
predicates0_.id as id1_11_0_, predicates0_.id as id1_11_1_,
predicates0_.created_date as created_2_11_1_, predicates0_.modified_date as
modified3_11_1_, predicates0_.config as config4_11_1_, predicates0_.type as
type5_11_1_ from segment_predicate predicates0_ where
predicates0_.data_connector_id=?
Hibernate: select griffinjob0_.id as id2_5_1_, griffinjob0_.created_date as
created_3_5_1_, griffinjob0_.modified_date as modified4_5_1_,
griffinjob0_.deleted as deleted5_5_1_, griffinjob0_.job_name as job_name6_5_1_,
griffinjob0_.measure_id as measure_7_5_1_, griffinjob0_.metric_name as
metric_n8_5_1_, griffinjob0_.quartz_group_name as quartz_g9_5_1_,
griffinjob0_.quartz_job_name as quartz_10_5_1_, jobinstanc1_.job_id as
job_id13_7_3_, jobinstanc1_.id as id1_7_3_, jobinstanc1_.id as id1_7_0_,
jobinstanc1_.created_date as created_2_7_0_, jobinstanc1_.modified_date as
modified3_7_0_, jobinstanc1_.app_id as app_id4_7_0_, jobinstanc1_.app_uri as
app_uri5_7_0_, jobinstanc1_.predicate_job_deleted as predicat6_7_0_,
jobinstanc1_.expire_timestamp as expire_t7_7_0_,
jobinstanc1_.predicate_group_name as predicat8_7_0_,
jobinstanc1_.predicate_job_name as predicat9_7_0_, jobinstanc1_.session_id as
session10_7_0_, jobinstanc1_.state as state11_7_0_, jobinstanc1_.timestamp as
timesta12_7_0_ from job griffinjob0_ left outer join job_instance_bean
jobinstanc1_ on griffinjob0_.id=jobinstanc1_.job_id where griffinjob0_.id=? and
griffinjob0_.type='griffin_job'
Hibernate: insert into job_instance_bean (created_date, modified_date, app_id,
app_uri, predicate_job_deleted, expire_timestamp, predicate_group_name,
predicate_job_name, session_id, state, timestamp) values (?, ?, ?, ?, ?, ?, ?,
?, ?, ?, ?)
Hibernate: update job_instance_bean set job_id=? where id=?
Hibernate: select jobinstanc0_.id as id1_7_, jobinstanc0_.created_date as
created_2_7_, jobinstanc0_.modified_date as modified3_7_, jobinstanc0_.app_id
as app_id4_7_, jobinstanc0_.app_uri as app_uri5_7_,
jobinstanc0_.predicate_job_deleted as predicat6_7_,
jobinstanc0_.expire_timestamp as expire_t7_7_,
jobinstanc0_.predicate_group_name as predicat8_7_,
jobinstanc0_.predicate_job_name as predicat9_7_, jobinstanc0_.session_id as
session10_7_, jobinstanc0_.state as state11_7_, jobinstanc0_.timestamp as
timesta12_7_ from job_instance_bean jobinstanc0_ where
jobinstanc0_.predicate_job_name=?
2018-04-15 02:33:00.479 INFO 1753 --- [ryBean_Worker-5]
o.a.griffin.core.job.SparkSubmitJob : {
"measure.type" : "griffin",
"id" : 2,
"name" : "test_job",
"owner" : "test",
"description" : null,
"organization" : null,
"deleted" : false,
"timestamp" : 1523781180000,
"dq.type" : "accuracy",
"process.type" : "batch",
"data.sources" : [ {
"id" : 3,
"name" : "source",
"connectors" : [ {
"id" : 3,
"name" : "source1523773590187",
"type" : "HIVE",
"version" : "1.2",
"predicates" : [ ],
"data.unit" : "1hour",
"data.time.zone" : "",
"config" : {
"database" : "griffin",
"table.name" : "emp_src",
"where" : "dt=20180415 AND hour=01"
}
} ]
}, {
"id" : 4,
"name" : "target",
"connectors" : [ {
"id" : 4,
"name" : "target1523773598244",
"type" : "HIVE",
"version" : "1.2",
"predicates" : [ ],
"data.unit" : "1hour",
"data.time.zone" : "",
"config" : {
"database" : "griffin",
"table.name" : "emp_tgt",
"where" : "dt=20180415 AND hour=01"
}
} ]
} ],
"evaluate.rule" : {
"id" : 2,
"rules" : [ {
"id" : 2,
"rule" : "source.id=target.id AND source.name=target.name AND
source.city=target.city",
"name" : "accuracy",
"dsl.type" : "griffin-dsl",
"dq.type" : "accuracy"
} ]
},
"measure.type" : "griffin"
}
2018-04-15 02:33:00.563 INFO 1753 --- [ryBean_Worker-5]
o.a.griffin.core.job.SparkSubmitJob :
{"id":4,"state":"starting","appId":null,"appInfo":{"driverLogUrl":null,"sparkUiUrl":null},"log":["stdout:
","\nstderr: "]}
2018-04-15 02:33:00.573 INFO 1753 --- [ryBean_Worker-5]
o.a.griffin.core.job.SparkSubmitJob : Delete predicate
job(PG,test_job_predicate_1523784780136) success.
Hibernate: select jobinstanc0_.id as id1_7_0_, jobinstanc0_.created_date as
created_2_7_0_, jobinstanc0_.modified_date as modified3_7_0_,
jobinstanc0_.app_id as app_id4_7_0_, jobinstanc0_.app_uri as app_uri5_7_0_,
jobinstanc0_.predicate_job_deleted as predicat6_7_0_,
jobinstanc0_.expire_timestamp as expire_t7_7_0_,
jobinstanc0_.predicate_group_name as predicat8_7_0_,
jobinstanc0_.predicate_job_name as predicat9_7_0_, jobinstanc0_.session_id as
session10_7_0_, jobinstanc0_.state as state11_7_0_, jobinstanc0_.timestamp as
timesta12_7_0_ from job_instance_bean jobinstanc0_ where jobinstanc0_.id=?
Hibernate: update job_instance_bean set created_date=?, modified_date=?,
app_id=?, app_uri=?, predicate_job_deleted=?, expire_timestamp=?,
predicate_group_name=?, predicate_job_name=?, session_id=?, state=?,
timestamp=? where id=?
Hibernate: select distinct jobinstanc0_.id as id1_7_, jobinstanc0_.created_date
as created_2_7_, jobinstanc0_.modified_date as modified3_7_,
jobinstanc0_.app_id as app_id4_7_, jobinstanc0_.app_uri as app_uri5_7_,
jobinstanc0_.predicate_job_deleted as predicat6_7_,
jobinstanc0_.expire_timestamp as expire_t7_7_,
jobinstanc0_.predicate_group_name as predicat8_7_,
jobinstanc0_.predicate_job_name as predicat9_7_, jobinstanc0_.session_id as
session10_7_, jobinstanc0_.state as state11_7_, jobinstanc0_.timestamp as
timesta12_7_ from job_instance_bean jobinstanc0_ where jobinstanc0_.state in
('starting' , 'not_started' , 'recovering' , 'idle' , 'running' , 'busy')
Hibernate: select jobinstanc0_.id as id1_7_0_, jobinstanc0_.created_date as
created_2_7_0_, jobinstanc0_.modified_date as modified3_7_0_,
jobinstanc0_.app_id as app_id4_7_0_, jobinstanc0_.app_uri as app_uri5_7_0_,
jobinstanc0_.predicate_job_deleted as predicat6_7_0_,
jobinstanc0_.expire_timestamp as expire_t7_7_0_,
jobinstanc0_.predicate_group_name as predicat8_7_0_,
jobinstanc0_.predicate_job_name as predicat9_7_0_, jobinstanc0_.session_id as
session10_7_0_, jobinstanc0_.state as state11_7_0_, jobinstanc0_.timestamp as
timesta12_7_0_ from job_instance_bean jobinstanc0_ where jobinstanc0_.id=?
Hibernate: update job_instance_bean set created_date=?, modified_date=?,
app_id=?, app_uri=?, predicate_job_deleted=?, expire_timestamp=?,
predicate_group_name=?, predicate_job_name=?, session_id=?, state=?,
timestamp=? where id=?