Hi Karan,

Do you mean that you want to put your profiling config files in an HDFS directory, and let Griffin scan the directory to pick up the config files at run time?
The Griffin measure module doesn't support this at the moment. If you want that behavior, you can refer to the code entry point and implement your own param file reader:

https://github.com/apache/incubator-griffin/blob/master/measure/src/main/scala/org/apache/griffin/measure/Application.scala#L170
https://github.com/apache/incubator-griffin/tree/master/measure/src/main/scala/org/apache/griffin/measure/config/reader

But in my opinion, the measure module may not be the appropriate place for such work. It looks more like scheduling work that should happen before the Griffin jobs are submitted.

Thanks,
Lionel

On Mon, May 28, 2018 at 3:21 PM, Karan Gupta <[email protected]> wrote:

> Hi Lionel,
>
> Thank you for your response. I created a single custom rule for multiple
> sources. Now I am trying to run profiling jobs where my source is not
> tightly coupled inside a rule. I want to run profiling jobs by just
> pointing to an HDFS directory instead of a specific file <griffin should
> pick up the file name from the directory at run time>.
> Is it possible to do that through Griffin?
>
> Thank you,
> Karan Gupta
> ------------------------------
> Any comments or statements made in this email are not necessarily those of
> Tavant Technologies. The information transmitted is intended only for the
> person or entity to which it is addressed and may contain confidential
> and/or privileged material. If you have received this in error, please
> contact the sender and delete the material from any computer. All emails
> sent from or to Tavant Technologies may be subject to our monitoring
> procedures.
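For what it's worth, here is a minimal Scala sketch of the kind of pre-submission "scan a directory, pick a config file" step suggested above. It is purely illustrative and not part of Griffin's API: the object and method names are hypothetical, and it runs against the local filesystem via `java.nio` for simplicity; against HDFS you would do the equivalent listing with `org.apache.hadoop.fs.FileSystem` instead.

```scala
import java.nio.file.{Files, Path}
import scala.jdk.CollectionConverters._

// Hypothetical helper, not Griffin code: scan a directory and return the
// .json config files in it, newest first, so a scheduling step can pick
// the most recent one and hand it to the measure job as its param file.
object ConfigScanner {
  def listConfigFiles(dir: Path): Seq[Path] =
    Files.list(dir).iterator().asScala
      .filter(_.toString.endsWith(".json"))
      .toSeq
      .sortBy(p => -Files.getLastModifiedTime(p).toMillis)
}
```

A wrapper script or scheduler could call something like this right before `spark-submit`, then pass the chosen path as the measure module's param file argument, keeping the directory-scanning logic outside the measure module itself.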
