Not sure what the problem could be, but I would suggest you double-check
that the said property is part of the SparkConf object being created in the
code (just by logging it).
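A minimal sketch of that logging check (assuming a plain Scala driver; the property name you grep for is whatever your job expects):

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf()
// Dump every key/value pair the SparkConf actually holds before the
// context is created, to confirm the expected property made it in.
conf.getAll.foreach { case (key, value) => println(s"$key = $value") }
```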
Cheers,
Subroto Sanyal
On Wed, Aug 10, 2016 at 1:39 PM, Aneela Saleem <ane...@platalytics.com>
wrote:
> The
yes... you can set the property in the conf file, or you can set the property
explicitly in the Spark configuration object used while creating the
SparkContext/JavaSparkContext.
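A minimal sketch of the programmatic route (app name and property shown only as examples; SparkConf values must be set before the context is constructed):

```scala
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("my-app")
  // Set the property explicitly before creating the context;
  // the conf is cloned at SparkContext creation, so later changes
  // are not picked up.
  .set("spark.yarn.security.tokens.hbase.enabled", "true")
val sc = new SparkContext(conf)
```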
Cheers,
Subroto Sanyal
On Wed, Aug 10, 2016 at 12:09 PM, Aneela Saleem <ane...@platalytics.com>
wrote:
> Thank
hi Aneela
By any chance are you missing the property:
spark.yarn.security.tokens.hbase.enabled
This was introduced as part of the fix:
https://github.com/apache/spark/pull/8134/files
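For reference, such a property can also be supplied in spark-defaults.conf or at submit time, as a config fragment (assuming the property applies to your Spark version):

```
# spark-defaults.conf
spark.yarn.security.tokens.hbase.enabled  true

# or equivalently at submit time:
#   spark-submit --conf spark.yarn.security.tokens.hbase.enabled=true ...
```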
Cheers,
Subroto Sanyal
On Wed, Aug 10, 2016 at 11:53 AM, Aneela Saleem <ane...@platalytics.com>
Congratulations Rajesh…
Cheers,
Subroto Sanyal
On Sep 11, 2013, at 6:17 PM, ramkrishna vasudevan wrote:
Hi All,
Please join me in welcoming Rajeshbabu (Rajesh) as our new HBase committer.
Rajesh has been there for more than a year and has been solving some very
good bugs around
Good to see this. Thanks a lot to Priyank, Ramakrishna, Rajesh.
_
/(|
( :
__\ \ _
() `|
()| |
().__|
(___)__.|_
Cheers,
Subroto Sanyal
On Aug 13, 2013, at 9:28 AM, ramkrishna
Hi Manish,
Please include protobuf-java-*.jar in the dependencies.
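If the project is built with Maven, the equivalent dependency declaration would be (the version shown is illustrative; match it to the protobuf version your HBase release bundles):

```xml
<dependency>
  <groupId>com.google.protobuf</groupId>
  <artifactId>protobuf-java</artifactId>
  <!-- illustrative version; check your HBase distribution's lib/ directory -->
  <version>2.5.0</version>
</dependency>
```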
Cheers,
Subroto Sanyal
On Aug 7, 2013, at 3:02 PM, manish dunani wrote:
Hi,
I have a MapReduce job which does some read operations on HBase
tables. I have configured the cluster in secure mode, including secure HBase.
I am running the job (a classical MR job) from a custom client running as user
subroto.
The mentioned user has valid principal
column=cf:c, timestamp=1339581566227, value=value3
Thanks again for the correct pointers…. :-)
Cheers,
Subroto Sanyal
On Jun 14, 2012, at 9:07 AM, Sonal Goyal wrote:
Are you doing something specific with the RecordReader? Maybe you can post
more
Hi Anoop,
Thanks a lot for the explanation.
Need to go through the documentation properly…. ;-)
Cheers,
Subroto Sanyal
On Jun 14, 2012, at 10:15 AM, Anoop Sam John wrote:
Hi Subroto
Scan.addColumn you can use when you want only specific columns to be
retrieved in the scan. You can
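Anoop's point about Scan.addColumn can be sketched as follows (family and qualifier names are illustrative, echoing the cf:c column mentioned earlier in the thread):

```scala
import org.apache.hadoop.hbase.client.Scan
import org.apache.hadoop.hbase.util.Bytes

val scan = new Scan()
// Retrieve only the single column cf:c instead of entire rows,
// cutting down the data shipped back to the client.
scan.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("c"))
```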
reason(s) behind it??
Cheers,
Subroto Sanyal
Hi Sonal,
The Scan is being created by:
void org.apache.hadoop.hbase.mapreduce.TableInputFormat.setConf(Configuration
configuration)
I am not providing any other scan options…. :-(
Cheers,
Subroto Sanyal
On Jun 13, 2012, at 1:30 PM, Sonal Goyal wrote:
Hi Subroto,
How are you configuring