From: Preetam Shingavi <[email protected]>
Date: Monday, February 24, 2020 at 2:23 PM
To: "[email protected]" <[email protected]>
Subject: Spark sql output sink to Elastic search - No fields in result value
Hi,
I have a Hive table and want to run a spark-sql rule against it, sinking the
metric result to Elasticsearch. Right now the columns projected by the
spark-sql query do not show up in the result.
Here is the DQ job:
{
  "name": "fms_count_score_measure",
  "measure.type": "griffin",
  "process.type": "BATCH",
  "owner": "test",
  "description": "fms count measure description",
  "data.sources": [
    {
      "name": "fms_ip",
      "connectors": [
        {
          "name": "fms_ip_connector",
          "type": "HIVE",
          "version": "1.2",
          "data.unit": "1hour",
          "data.time.zone": "UTC(WET,GMT)",
          "config": {
            "database": "default",
            "table.name": "fms",
            "where": "me_type='ITEM_PUBLISHED'"
          }
        }
      ]
    }
  ],
  "evaluate.rule": {
    "rules": [
      {
        "dsl.type": "spark-sql",
        "name": "fms_count_score",
        "rule": "select count(*) as `ip_count` from fms_ip",
        "out.dataframe.name": "fms_ip_cnt",
        "out": [
          {
            "type": "metric",
            "name": "fms_ip_cnt"
          }
        ]
      }
    ]
  },
  "sinks": [
    "CONSOLE",
    "ELASTICSEARCH",
    "HDFS"
  ]
}
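To double-check the query itself, here is the sanity check I would run in
spark-shell (just a sketch; it assumes the same Hive metastore as the
connector above, and the val names are mine):

// Mirrors the connector config: database "default", table "fms",
// filtered on me_type='ITEM_PUBLISHED'.
val fmsIp = spark.sql(
  "select * from default.fms where me_type = 'ITEM_PUBLISHED'")
fmsIp.createOrReplaceTempView("fms_ip")

// The rule from the DQ job, unchanged.
val result = spark.sql("select count(*) as `ip_count` from fms_ip")
result.printSchema()  // should show exactly one column: ip_count
result.show()

If this shows the ip_count column as expected, I assume the problem is on the
sink side rather than in the query.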
The metric that gets written is:
{"name":"FMS_Comp_score_Job","tmst":1582578720000,"value":{}}
I expected ‘ip_count’ to show up inside the value field, but I cannot figure
out why it comes back empty.
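One thing I was not sure about from the measure configuration guide: whether
the metric out block needs an explicit "flatten" setting for the projected
columns to be carried into value. This is purely a guess on my side:

"out": [
  {
    "type": "metric",
    "name": "fms_ip_cnt",
    "flatten": "entries"
  }
]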
Any suggestions are highly appreciated 😊
Thank you in advance!
Regards,
Preetam