Re: class not found exception Logging while running JavaKMeansExample

2016-08-16 Thread Ted Yu
The class is:
core/src/main/scala/org/apache/spark/internal/Logging.scala

So it is in spark-core.



Re: class not found exception Logging while running JavaKMeansExample

2016-08-16 Thread subash basnet
Hello Yuzhihong,

I didn't understand how to apply what you suggested in JavaKMeansExample.java.
I still get the Logging exception while creating the SparkSession:

Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/spark/internal/Logging
    at com.dfki.spark.kmeans.KMeansSpark.JavaKMeansExample.main(JavaKMeansExample.java:43)
Caused by: java.lang.ClassNotFoundException:
org.apache.spark.internal.Logging

The exception occurs at the builder() call on line 43:

42  SparkSession spark = SparkSession
43      .builder()
44      .appName("JavaKMeansExample")
45      .getOrCreate();

I have added all the necessary log4j and slf4j dependencies in the pom. I
still cannot figure out which dependencies I am missing.

Best Regards,
Subash Basnet



Re: class not found exception Logging while running JavaKMeansExample

2016-08-15 Thread Ted Yu
Logging has become private in the 2.0 release:

private[spark] trait Logging {



class not found exception Logging while running JavaKMeansExample

2016-08-15 Thread subash basnet
Hello all,

I am trying to run the JavaKMeansExample from the Spark examples project. I am
getting a class-not-found exception:

Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/spark/internal/Logging
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
    at com.dfki.spark.kmeans.KMeansSpark.JavaKMeansExample.main(JavaKMeansExample.java:43)
Caused by: java.lang.ClassNotFoundException:
org.apache.spark.internal.Logging
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

I have added all the logging-related dependencies as below:

	<dependency>
		<groupId>org.slf4j</groupId>
		<artifactId>slf4j-api</artifactId>
		<version>${slf4j.version}</version>
	</dependency>
	<dependency>
		<groupId>org.slf4j</groupId>
		<artifactId>slf4j-log4j12</artifactId>
		<version>${slf4j.version}</version>
		<scope>${hadoop.deps.scope}</scope>
	</dependency>
	<dependency>
		<groupId>org.slf4j</groupId>
		<artifactId>jul-to-slf4j</artifactId>
		<version>${slf4j.version}</version>
	</dependency>
	<dependency>
		<groupId>org.slf4j</groupId>
		<artifactId>jcl-over-slf4j</artifactId>
		<version>${slf4j.version}</version>
	</dependency>
	<dependency>
		<groupId>log4j</groupId>
		<artifactId>log4j</artifactId>
		<version>${log4j.version}</version>
	</dependency>
	<dependency>
		<groupId>commons-logging</groupId>
		<artifactId>commons-logging</artifactId>
		<version>1.2</version>
	</dependency>

What dependencies could I be missing, any idea?

Regards,
Subash Basnet
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>

	<groupId>com.apache.spark</groupId>
	<artifactId>gettingStarted</artifactId>
	<version>0.0.1-SNAPSHOT</version>
	<packaging>jar</packaging>

	<name>gettingStarted</name>
	<url>http://maven.apache.org</url>

	<properties>
		<slf4j.version>1.7.16</slf4j.version>
		<log4j.version>1.2.17</log4j.version>
		<!-- the remaining property names were stripped by the archive; their
		     values were: UTF-8, UTF-8, 1.8, 3.3.3, 3.4.1, 3.2.2, 2.11.8,
		     2.11, 1.10, 2.4, 64m, 512m, 512m -->
	</properties>

	<dependencies>
		<dependency>
			<groupId>org.slf4j</groupId>
			<artifactId>slf4j-api</artifactId>
			<version>${slf4j.version}</version>
			<scope>${hadoop.deps.scope}</scope>
		</dependency>
		<dependency>
			<groupId>org.slf4j</groupId>
			<artifactId>slf4j-log4j12</artifactId>
			<version>${slf4j.version}</version>
			<scope>${hadoop.deps.scope}</scope>
		</dependency>
		<dependency>
			<groupId>org.slf4j</groupId>
			<artifactId>jul-to-slf4j</artifactId>
			<version>${slf4j.version}</version>
		</dependency>
		<dependency>
			<groupId>org.slf4j</groupId>
			<artifactId>jcl-over-slf4j</artifactId>
			<version>${slf4j.version}</version>
		</dependency>
		<dependency>
			<groupId>log4j</groupId>
			<artifactId>log4j</artifactId>
			<version>${log4j.version}</version>
			<scope>${hadoop.deps.scope}</scope>
		</dependency>
		<dependency>
			<groupId>junit</groupId>
			<artifactId>junit</artifactId>
			<version>3.8.1</version>
			<scope>test</scope>
		</dependency>
		<dependency>
			<groupId>org.apache.spark</groupId>
			<artifactId>spark-core_2.10</artifactId>
			<version>1.6.0</version>
		</dependency>
		<dependency>
			<groupId>commons-logging</groupId>
			<artifactId>commons-logging</artifactId>
			<version>1.2</version>
		</dependency>
		<dependency>
			<groupId>org.apache.spark</groupId>
			<artifactId>spark-sql_2.10</artifactId>
			<version>2.0.0</version>
		</dependency>
		<dependency>
			<groupId>org.apache.spark</groupId>
			<artifactId>spark-mllib_2.10</artifactId>
			<version>2.0.0</version>
		</dependency>
		<dependency>
			<groupId>org.apache.hbase</groupId>
			<artifactId>hbase-client</artifactId>
			<version>1.1.3</version>
		</dependency>
	</dependencies>
</project>
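Note that in the pom above, spark-core_2.10 is pinned at 1.6.0 while spark-sql_2.10 and spark-mllib_2.10 are at 2.0.0. Since org.apache.spark.internal.Logging lives in spark-core and only exists in the 2.0.x line, this version mix would produce exactly the NoClassDefFoundError reported. A sketch of an aligned set (assuming Spark 2.0.0 throughout; the _2.11 suffix here is an assumption matching the Scala 2.11 values in the properties) might look like:

```xml
<!-- hypothetical aligned versions: every Spark artifact on 2.0.0, same Scala suffix -->
<dependency>
	<groupId>org.apache.spark</groupId>
	<artifactId>spark-core_2.11</artifactId>
	<version>2.0.0</version>
</dependency>
<dependency>
	<groupId>org.apache.spark</groupId>
	<artifactId>spark-sql_2.11</artifactId>
	<version>2.0.0</version>
</dependency>
<dependency>
	<groupId>org.apache.spark</groupId>
	<artifactId>spark-mllib_2.11</artifactId>
	<version>2.0.0</version>
</dependency>
```

Mixing Scala binary suffixes (_2.10 vs _2.11) across Spark artifacts causes similar class-loading failures, so the suffix should be consistent as well.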

-
To unsubscribe e-mail: user-unsubscr...@spark.apache.org

Exception Logging

2014-10-16 Thread Ge, Yao (Y.)
I need help to better trap exceptions in map functions. What is the best way
to catch an exception and provide helpful diagnostic information, such as the
source of the input (the file name, and ideally the line number if I am
processing a text file)?

-Yao


Re: Exception Logging

2014-10-16 Thread Yana Kadiyska
You can put a try/catch block in the map function and log the exception. The
only tricky part is that the exception log will be located on the executor
machine. Even if you don't do any trapping, you should see the exception
stack trace in the executors' stderr log, which is visible through the UI
(if your app crashes the executor, you can still see it as the last executor
that ran on a given worker). Things like println and logging do work inside
map; you just have to remember that everything happens on the remote machine.
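A minimal sketch of the pattern described above, catching per-record failures and tagging them with the file name and line number. This is plain Java so it runs standalone; the class name TrapInMap, the file name, and the parse step are all hypothetical, and in Spark the same try/catch would sit inside the lambda passed to JavaRDD.map:

```java
import java.util.ArrayList;
import java.util.List;

public class TrapInMap {
    // Parse one line; on failure, record the file name and line number and
    // return a fallback value instead of letting the exception kill the task.
    static int parseOrLog(String fileName, int lineNumber, String line, List<String> log) {
        try {
            return Integer.parseInt(line.trim());
        } catch (NumberFormatException e) {
            // In Spark this message would land in the executor's stderr log.
            log.add(fileName + ":" + lineNumber + " bad record \"" + line + "\": " + e);
            return -1; // sentinel for bad records; could also be filtered out later
        }
    }

    public static void main(String[] args) {
        List<String> log = new ArrayList<>();
        String[] lines = {"10", "oops", "30"};  // stand-in for a text file's contents
        List<Integer> parsed = new ArrayList<>();
        for (int i = 0; i < lines.length; i++) {
            // i + 1 gives a 1-based line number for the diagnostic message
            parsed.add(parseOrLog("input.txt", i + 1, lines[i], log));
        }
        System.out.println(parsed);                      // [10, -1, 30]
        System.out.println(log.size() + " bad record(s)"); // 1 bad record(s)
    }
}
```

To carry the line number into a real RDD pipeline, zipWithIndex on the input RDD is one way to pair each line with its position before mapping.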
