Hello,

I am trying to create a measure whose rule is written directly in Spark-SQL 
instead of Griffin-DSL, using Postman to call the create-measure API. The 
measure is created successfully, and the job is created and executes 
successfully.

However, the output metrics of the job runs are not persisted in 
ElasticSearch. An entry is created in Elastic, but its "metricValues" array 
is NULL.

The same SQL query works fine when run directly in spark-shell.
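
For reference, here is roughly how I verify it in spark-shell (a minimal 
sketch, assuming a spark-shell session with Hive support enabled so that 
demo_src in the default database is visible):

    // Run in spark-shell; demo_src is the Hive table referenced in the
    // measure's data source config below.
    spark.sql("SELECT count(id) AS cnt, max(age) AS Max_Age FROM demo_src").show()
    // Returns a single row with the count and the max age, as expected.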

I am not using Docker; I am building the environment (Griffin 3.0) on my 
local machine. All measures created through the UI execute well, and 
measures created via Postman with a griffin-dsl rule also work well.

Below is the JSON body I am passing to the add-measure API call from 
Postman. Please help me understand what is going wrong.

{
   "name": "custom_profiling_measure_2",
   "measure.type": "griffin",
   "dq.type": "PROFILING",
   "rule.description": {
     "details": [
       {
         "name": "id",
         "infos": "Total Count"
       }
     ]
   },
   "process.type": "BATCH",
   "owner": "test",
   "description": "custom_profiling_measure_2",
   "data.sources": [
     {
       "name": "source",
       "connectors": [
         {
           "name": "source123",
           "type": "HIVE",
           "version": "1.2",
           "data.unit": "1day",
           "data.time.zone": "",
           "config": {
             "database": "default",
             "table.name": "demo_src",
             "where": ""
           }
         }
       ]
     }
   ],
   "evaluate.rule": {
     "out.dataframe.name": "profiling_2",
     "rules": [
       {
         "dsl.type": "spark-sql",
         "dq.type": "PROFILING",
         "rule": "SELECT count(id) AS cnt, max(age) AS Max_Age from demo_src",
         "out.dataframe.name": "id_count_2"
       }
     ]
   }
}

Regards,

Vikram
