Things I tried and the errors I got:
    String path = "/home/ubuntu/spark-0.9.1/SimpleApp/target/simple-project-1.0-allinone.jar";
    ...
        .set(path)

    $ mvn package
    [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:2.0.2:compile (default-compile) on project simple-project: Compilation failure
    [ERROR] /home/ubuntu/spark-0.9.1/SimpleApp/src/main/java/SimpleApp.java:[14,23] error: method set in class SparkConf cannot be applied to given types;
    [ERROR] -> [Help 1]
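
As far as I can tell, the first error is because SparkConf.set is meant for key/value configuration entries, so a lone jar path does not match any overload; the form that compiles is the one already used further down in SimpleApp.java:

    conf.set("spark.executor.memory", "1g");   // set(key, value), not set(jarPath)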

    String path = "/home/ubuntu/spark-0.9.1/SimpleApp/target/simple-project-1.0-allinone.jar";
    ...
        .setJars(path)

    $ mvn package
    [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:2.0.2:compile (default-compile) on project simple-project: Compilation failure
    [ERROR] /home/ubuntu/spark-0.9.1/SimpleApp/src/main/java/SimpleApp.java:[14,23] error: no suitable method found for setJars(String)
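
The second message suggests setJars does exist, just not with a single String argument; in the 0.9.1 API it takes an array of jar paths. A rough sketch of what the call inside SimpleApp's main might look like (untested on the cluster):

    String path = "/home/ubuntu/spark-0.9.1/SimpleApp/target/simple-project-1.0-allinone.jar";
    SparkConf conf = new SparkConf()
            .setMaster("spark://10.35.23.13:7077")
            .setAppName("My app")
            .set("spark.executor.memory", "1g")
            // setJars expects String[] (or a Scala Seq), not a single String
            .setJars(new String[] { path });
    // alternatively (also untested), after the JavaSparkContext is created:
    // sc.addJar(path);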

Content of my pom.xml file (maybe the problem arises here):

<project>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <configuration>
          <source>1.7</source>
          <target>1.7</target>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>1.5</version>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
            <configuration>
              <shadedArtifactAttached>true</shadedArtifactAttached>
              <shadedClassifierName>allinone</shadedClassifierName>
              <artifactSet>
                <includes>
                  <include>*:*</include>
                </includes>
              </artifactSet>
              <filters>
                <filter>
                  <artifact>*:*</artifact>
                  <excludes>
                    <exclude>META-INF/*.SF</exclude>
                    <exclude>META-INF/*.DSA</exclude>
                    <exclude>META-INF/*.RSA</exclude>
                  </excludes>
                </filter>
              </filters>
              <transformers>
                <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                  <resource>reference.conf</resource>
                </transformer>
                <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                  <resource>META-INF/spring.handlers</resource>
                </transformer>
                <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                  <resource>META-INF/spring.schemas</resource>
                </transformer>
                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                  <manifestEntries>
                    <Main-Class>com.echoed.chamber.Main</Main-Class>
                  </manifestEntries>
                </transformer>
              </transformers>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
  <groupId>edu.berkeley</groupId>
  <artifactId>simple-project</artifactId>
  <modelVersion>4.0.0</modelVersion>
  <name>Simple Project</name>
  <packaging>jar</packaging>
  <version>1.0</version>
  <repositories>
    <repository>
      <id>Akka repository</id>
      <url>http://repo.akka.io/releases</url>
    </repository>
  </repositories>
  <dependencies>
    <dependency> <!-- Spark dependency -->
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>0.9.1</version>
    </dependency>
  </dependencies>
</project>

Date: Thu, 1 May 2014 20:52:59 -0700
From: selme...@yahoo.com
To: u...@spark.incubator.apache.org
Subject: Re: java.lang.ClassNotFoundException

Hi,

You should include the jar file of your project, for example:

    conf.set("yourjarfilepath.jar")
Joe

On Friday, May 2, 2014 7:39 AM, proofmoore [via Apache Spark User List] <[hidden email]> wrote:

Hello. I followed the "A Standalone App in Java" part of the tutorial:
https://spark.apache.org/docs/0.8.1/quick-start.html

The Spark standalone cluster looks like it's running without a problem: http://i.stack.imgur.com/7bFv8.png

I have built a fat jar for running this Java app on the cluster. Before running mvn package:

    find .
    ./pom.xml
    ./src
    ./src/main
    ./src/main/java
    ./src/main/java/SimpleApp.java

The content of SimpleApp.java is:

    import org.apache.spark.api.java.*;
    import org.apache.spark.api.java.function.Function;
    import org.apache.spark.SparkConf;
    import org.apache.spark.SparkContext;

    public class SimpleApp {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                    .setMaster("spark://10.35.23.13:7077")
                    .setAppName("My app")
                    .set("spark.executor.memory", "1g");
            JavaSparkContext sc = new JavaSparkContext(conf);
            String logFile = "/home/ubuntu/spark-0.9.1/test_data";
            JavaRDD<String> logData = sc.textFile(logFile).cache();

            long numAs = logData.filter(new Function<String, Boolean>() {
                public Boolean call(String s) {
                    return s.contains("a");
                }
            }).count();

            System.out.println("Lines with a: " + numAs);
        }
    }

This program only works when the master is set with setMaster("local"). Otherwise I get this error: http://i.stack.imgur.com/doRSn.png

Thanks,
Ibrahim