Thanks. I extracted the Hadoop configuration, set my arbitrary variable on it,
and was able to read it inside the InputFormat via JobContext.getConfiguration().
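Here is a minimal sketch of what worked for me. MyInputFormat stands for my
custom InputFormat from the message below (assumed to extend the new-API
FileInputFormat[LongWritable, Text]); the path and the "developer" key are
placeholders:

  import org.apache.hadoop.conf.Configuration
  import org.apache.hadoop.io.{LongWritable, Text}
  import org.apache.spark.{SparkConf, SparkContext}

  val sc = new SparkContext(new SparkConf().setAppName("example"))

  // Copy Spark's Hadoop configuration and set the arbitrary variable on it,
  // instead of setting it on SparkConf.
  val hadoopConf = new Configuration(sc.hadoopConfiguration)
  hadoopConf.set("developer", "MyName")

  // Pass that configuration when creating the RDD.
  val rdd = sc.newAPIHadoopFile(
    "hdfs:///path/to/input",   // placeholder path
    classOf[MyInputFormat],
    classOf[LongWritable],
    classOf[Text],
    hadoopConf)

  // Inside the InputFormat, context.getConfiguration().get("developer")
  // now returns "MyName" instead of null.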

On Mon, Feb 23, 2015 at 12:04 PM, Tom Vacek <minnesota...@gmail.com> wrote:

> SparkConf settings aren't propagated into the Hadoop Configuration that your
> InputFormat sees, so you can't pass arbitrary variables that way.  You can use
> SparkContext's hadoopRDD and create a JobConf (with whatever variables you
> want), and then grab them out of the JobConf in your RecordReader.
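> A minimal sketch of that approach (the key, path, and input format are
> placeholders; substitute your own old-API InputFormat for TextInputFormat):
>
>   import org.apache.hadoop.io.{LongWritable, Text}
>   import org.apache.hadoop.mapred.{FileInputFormat, JobConf, TextInputFormat}
>
>   // Build a JobConf on top of Spark's Hadoop configuration and set
>   // whatever variables you want on it.
>   val jobConf = new JobConf(sc.hadoopConfiguration)
>   jobConf.set("developer", "MyName")
>   FileInputFormat.setInputPaths(jobConf, "hdfs:///path/to/input")
>
>   val rdd = sc.hadoopRDD(jobConf, classOf[TextInputFormat],
>     classOf[LongWritable], classOf[Text])
>
>   // The JobConf handed to your InputFormat's getRecordReader carries the
>   // variable: job.get("developer") returns "MyName".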
>
> On Sun, Feb 22, 2015 at 4:28 PM, hnahak <harihar1...@gmail.com> wrote:
>
>> Hi,
>>
>> I have written a custom InputFormat and RecordReader for Spark, and I need
>> to use user variables from the Spark client program.
>>
>> I added them in SparkConf
>>
>>   val sparkConf = new SparkConf().setAppName(args(0)).set("developer", "MyName")
>>
>> *and in the InputFormat class:*
>>
>>     protected boolean isSplitable(JobContext context, Path filename) {
>>         System.out.println("######################################### Developer "
>>             + context.getConfiguration().get("developer"));
>>         return false;
>>     }
>>
>> but it returns *null*. Is there any way I can pass user variables to my
>> custom code?
>>
>> Thanks !!
>>
>>
>>
>


-- 
{{{H2N}}}-----(@:
