I was able to resolve this by changing hdfs-site.xml, as I mentioned in my
initial thread.
Thanks,
Asmath
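For reference, the kind of hdfs-site.xml change involved is replacing a suffixed duration value with a plain number that an older DFS client can parse. The property shown below is a common culprit for this exact error (its newer default is "30s"), but this is a sketch - verify which property your cluster actually sets to "30s":

```
<!-- hdfs-site.xml: an older DFS client parses this value with
     Long.parseLong, so a suffixed default like "30s" triggers
     NumberFormatException. A plain number parses for both old and
     new clients. Property name shown as a typical example only. -->
<property>
  <name>dfs.client.datanode-restart.timeout</name>
  <value>30</value>
</property>
```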
> On Apr 12, 2021, at 8:35 PM, Peng Lei wrote:

Hi KhajaAsmath Mohammed,
Please check the configuration of "spark.speculation.interval" - just pass
"30" to it.
override def start(): Unit = {
  backend.start()

  if (!isLocal && conf.get(SPECULATION_ENABLED)) {
    logInfo("Starting speculative execution thread")
    speculationScheduler.scheduleWithFixedDelay(
      () => Utils.tryOrStopSparkContext(sc) { checkSpeculatableTasks() },
      SPECULATION_INTERVAL_MS, SPECULATION_INTERVAL_MS, TimeUnit.MILLISECONDS)
  }
}
Something is passing this invalid 30s value, yes. Hard to say which
property it is. I'd check if your cluster config sets anything with the
value 30s - whatever is reading this property is not expecting it.
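One way to follow this advice is to scan the cluster's `*-site.xml` files for suffixed duration values. The sketch below is illustrative (the class name `ConfigScan` and the regex-based approach are my own, not part of any Hadoop tooling); it flags values like "30s" that an older numeric-only parser would reject:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ConfigScan {
    // Matches Hadoop-style suffixed durations such as "30s", "5m", "100ms".
    private static final Pattern DURATION =
        Pattern.compile("<value>\\s*(\\d+(?:ms|s|m|h|d))\\s*</value>");

    // Returns every suffixed duration value found in the given XML text;
    // plain numeric values (e.g. "3") are not flagged.
    static List<String> findDurationValues(String xml) {
        List<String> hits = new ArrayList<>();
        Matcher m = DURATION.matcher(xml);
        while (m.find()) {
            hits.add(m.group(1));
        }
        return hits;
    }

    public static void main(String[] args) {
        String sample =
            "<configuration>"
            + "<property><name>dfs.client.datanode-restart.timeout</name>"
            + "<value>30s</value></property>"
            + "<property><name>dfs.replication</name><value>3</value></property>"
            + "</configuration>";
        System.out.println(findDurationValues(sample)); // prints [30s]
    }
}
```

In practice you would read each config file's contents into the string instead of the inline sample.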
On Mon, Apr 12, 2021, 2:25 PM KhajaAsmath Mohammed wrote:
Hi Sean,

Do you think there is anything that could cause this with the DFS client?
java.lang.NumberFormatException: For input string: "30s"
        at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
        at java.lang.Long.parseLong(Long.java:589)
        at java.lang.Long.parseL
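The failure above can be reproduced in isolation: `Long.parseLong` rejects any unit suffix outright. A duration-aware reader strips the suffix first - sketched below with a hypothetical helper (`parseMillis` is illustrative, not Hadoop's API; newer Hadoop clients do this via `Configuration.getTimeDuration`):

```java
public class ParseDemo {
    // Minimal duration-aware parse, for illustration only: strip a
    // trailing time-unit suffix before parsing the numeric part.
    static long parseMillis(String value) {
        String v = value.trim();
        if (v.endsWith("ms")) {
            return Long.parseLong(v.substring(0, v.length() - 2));
        }
        if (v.endsWith("s")) {
            return Long.parseLong(v.substring(0, v.length() - 1)) * 1000L;
        }
        return Long.parseLong(v); // bare number, treated as milliseconds
    }

    public static void main(String[] args) {
        // The failing code path calls Long.parseLong directly, so any
        // unit suffix raises NumberFormatException - exactly the trace above.
        try {
            Long.parseLong("30s");
        } catch (NumberFormatException e) {
            System.out.println("Long.parseLong fails on: " + e.getMessage());
        }
        System.out.println(parseMillis("30s")); // prints 30000
    }
}
```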
I am using the Spark HBase connector provided by Hortonworks. I was able to
run without issues in my local environment but have this issue in EMR.
Thanks,
Asmath
> On Apr 12, 2021, at 2:15 PM, Sean Owen wrote:

Somewhere you're passing a property that expects a number, but giving it
"30s". Is it a time property somewhere that really just wants ms or
something? But most (all?) time properties in Spark should accept that type
of input anyway. It really depends on which property has a problem and what
is setting it.
Hi,

I am getting a weird error when running a Spark job in an EMR cluster. The
same program runs on my local machine. Is there anything I need to do to
resolve this?
21/04/12 18:48:45 ERROR SparkContext: Error initializing SparkContext.
java.lang.NumberFormatException: For input string: "30s"
I tried