ob: failed to compile: java.lang.NullPointerException
Hi,
We have a Spark job that reads Avro data from an S3 location, does some
processing, and writes it back to S3. Of late it has been failing with the
exception below:
Application application_1529346471665_0020 failed 1 times due to AM
Hi,
I get
java.lang.NullPointerException at
org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:128)
When I try to createDataFrame using the sparkSession, see below:
SparkConf conf = new SparkConf().setMaster().setAppName("test")
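The NPE at `sessionState$lzycompute` fires on first use of the session, not at construction, so the real problem is usually earlier configuration (here `setMaster` appears to have lost its master URL argument, e.g. "local[*]", in the archive). A plain-Java sketch, no Spark dependency, of how a lazily computed field turns a missing required setting into a deferred NPE (class and field names are illustrative only):

```java
// Illustrative only: mimics how a lazily computed field (like
// SparkSession.sessionState) defers a missing required setting
// into an NPE at first use, e.g. inside createDataFrame.
public class LazyInitNpe {
    static class Session {
        private final String master;  // required setting; null if never supplied
        private String state;         // computed lazily, like sessionState

        Session(String master) { this.master = master; }

        String state() {
            if (state == null) {
                state = "state-for-" + master.trim(); // NPE here if master was never set
            }
            return state;
        }
    }

    public static void main(String[] args) {
        Session ok = new Session("local[*]");
        System.out.println(ok.state());      // prints state-for-local[*]
        Session bad = new Session(null);     // construction succeeds anyway
        try {
            bad.state();                     // ...but first use blows up
        } catch (NullPointerException e) {
            System.out.println("NPE deferred to first use");
        }
    }
}
```

The point is that the stack trace names the first *use* of the session; the fix almost always lives in how the conf or builder was set up.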
Hi,
I am trying to submit a job to Spark to count the number of words in a specific
Kafka topic, but I get the exception below when I check the log:
. failed with unrecoverable exception: java.lang.NullPointerException
The command that I run follows:
./scripts/dm-spark-submit.sh --class
(SparkSubmit.scala)
Caused by: java.lang.NullPointerException
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.agg_doAggregateWithKeys$(Unknown Source)
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
lue': 'foobar'}]
>
> [Stage 9:> (0 + 2) / 2]
> [Stage 9:=> (1 + 1) / 2]
> WARN 2016-05-02 17:23:55,240 org.apache.spark.scheduler.TaskSetManager: Lost task
Maybe you were trying to embed pictures for the error and your code - but
they didn't go through.
On Mon, May 2, 2016 at 10:32 AM, meson10 wrote:
> Hi,
>
> I am trying to save an RDD to Cassandra but I am running into the following
> error:
>
>
>
> The Python code looks
Hi,
I am trying to save an RDD to Cassandra but I am running into the following
error:
The Python code looks like this:
I am using DSE 4.8.6 which runs Spark 1.4.2
I ran through a bunch of existing posts on this mailing list and have
already performed the following routines:
* Ensure that
> On 11 Dec 2015, at 05:14, michael_han wrote:
>
> Hi Sarala,
> I found the reason: it's because Spark still needs Hadoop support when it
> runs. I think it's a bug in Spark that is still not fixed ;)
>
It's related to how the Hadoop filesystem APIs are used to access
Hi Sarala,
I found the reason: it's because Spark still needs Hadoop support when it
runs. I think it's a bug in Spark that is still not fixed ;)
After I downloaded winutils.exe and followed the steps in the workaround
below, it works fine:
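The linked workaround appears truncated in the archive. For reference, the commonly cited steps for this NPE on Windows (hedged; the install path below is only an example, adjust it to wherever winutils.exe was placed) are:

```shell
:: Windows workaround usually cited for the missing-winutils NullPointerException:
:: put winutils.exe in a bin\ directory and point HADOOP_HOME at its parent.
set HADOOP_HOME=C:\hadoop
set PATH=%HADOOP_HOME%\bin;%PATH%
```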
-hadoop2.6.0.jar
with the same error as before: Spark java.lang.NullPointerException
spark-submit --master local --name "SparkTest App" --class
com.qad.SparkTest1 target/Spark-Test-1.0.jar --jars
c:/spark-1.5.2-bin-hadoop2.6/lib/spark-assembly-1.5.2-hadoop2.6.0.jar
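One thing worth checking in the command above: `spark-submit` treats everything after the primary application jar as arguments to the application's `main`, so a `--jars` placed after `target/Spark-Test-1.0.jar` is never seen by `spark-submit` itself. A sketch of the same command with all options before the jar (paths copied verbatim from the post; whether `--jars` is needed at all here is a separate question):

```shell
# All spark-submit options must precede the application jar;
# anything after the jar is passed to the application's main().
spark-submit --master local --name "SparkTest App" \
  --class com.qad.SparkTest1 \
  --jars c:/spark-1.5.2-bin-hadoop2.6/lib/spark-assembly-1.5.2-hadoop2.6.0.jar \
  target/Spark-Test-1.0.jar
```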
>
> I am trying to customize the Twitter Example TD did by only printing
> messages that have a GeoLocation.
>
> I am getting a NullPointerException:
>
> java.lang.NullPointerException
> at Twitter$$anonfun$1.apply(Twitter.scala:64)
> at Twitt
Hello!
I am trying to customize the Twitter Example TD did by only printing
messages that have a GeoLocation.
I am getting a NullPointerException:
java.lang.NullPointerException
	at Twitter$$anonfun$1.apply(Twitter.scala:64)
	at Twitter$$anonfun$1.apply(Twitter.scala:64)
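The usual cause at a line like `Twitter.scala:64` is that `getGeoLocation` returns null for most tweets, and something inside the transformation then dereferences it. Filtering out the nulls before mapping avoids the NPE. A plain-Java sketch of that shape, with a stand-in `Status` type (no twitter4j or Spark dependency; all names here are illustrative):

```java
import java.util.List;
import java.util.Objects;
import java.util.stream.Collectors;

// Illustrative stand-in: most tweets carry no geo information, so the
// field is null and must be filtered out before it is dereferenced.
public class GeoFilter {
    record Status(String text, String geoLocation) {}

    static List<String> geoTexts(List<Status> statuses) {
        return statuses.stream()
                .filter(s -> Objects.nonNull(s.geoLocation())) // drop null-geo tweets first
                .map(Status::text)                             // now safe to use
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Status> batch = List.of(
                new Status("hello", "52.5,13.4"),
                new Status("no geo", null));
        System.out.println(geoTexts(batch)); // [hello]
    }
}
```

In the streaming example the same filter goes on the DStream before the print/map step.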
Usually when the SparkContext throws an NPE it means that it has been shut
down due to some earlier failure.
On Wed, Oct 22, 2014 at 5:29 PM, arthur.hk.c...@gmail.com wrote:
Thanks
Best Regards
On Thu, Oct 23, 2014 at 5:59 AM, arthur.hk.c...@gmail.com wrote:
Hi,
I got java.lang.NullPointerException. Please help!
sqlContext.sql("select l_orderkey, l_linenumber, l_partkey, l_quantity,
l_shipdate, L_RETURNFLAG, L_LINESTATUS from lineitem limit
10").collect().foreach(println);
2014-10-23 08:20:12,024 INFO [sparkDriver-akka.actor.default-dispatcher-31
I'm guessing the other result was wrong, or just never evaluated here. The
RDD transforms being lazy may have let it be expressed, but it wouldn't
work. Nested RDDs are not supported.
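The standard fix for an unsupported nested-RDD pattern is to materialize the small side on the driver first (`collect()`, or a broadcast variable) and let the outer transformation's closure capture only that local copy. A plain-Java sketch of the shape, using ordinary collections in place of RDDs (illustrative only, not Spark API):

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

// Ordinary collections stand in for RDDs: the "small" dataset is
// materialized locally first, and the outer transformation captures
// only that local copy; in Spark this is collect() or a broadcast.
public class NoNestedRdds {
    public static void main(String[] args) {
        List<Integer> smallSide = List.of(2, 4);        // stands in for smallRdd.collect()
        Set<Integer> lookup = new HashSet<>(smallSide); // local lookup table

        List<Integer> kept = List.of(1, 2, 3, 4, 5).stream()
                .filter(lookup::contains)               // closure sees local data only
                .collect(Collectors.toList());

        System.out.println(kept); // [2, 4]
    }
}
```

When the small side is too big to collect, the Spark-level answer is a join between the two RDDs rather than any nesting.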
On Mon, Mar 17, 2014 at 4:01 PM, anny9699 anny9...@gmail.com wrote:
Hi Andrew,
Thanks for the reply.