I have seen issues with some versions of the Scala Maven plugin auto-detecting 
the wrong JAVA_HOME when both a JRE and a JDK are present on the system. 
Setting JAVA_HOME explicitly to a JDK skips the plugin's auto-detect logic and 
avoids the problem.
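
A minimal sketch of that workaround: point JAVA_HOME at a JDK install and 
confirm javac is actually there before building. The `is_jdk` helper and the 
`/usr/lib/jvm/java-8-openjdk` path are illustrative (the path matches typical 
Arch/Manjaro layouts); adjust to your own installation.

```shell
# Returns success if the given directory looks like a JDK
# (i.e. it ships the javac compiler, which a bare JRE does not).
is_jdk() {
  [ -x "$1/bin/javac" ]
}

# Example usage -- path is an assumption, substitute your JDK location:
#   export JAVA_HOME=/usr/lib/jvm/java-8-openjdk
#   is_jdk "$JAVA_HOME" || echo "JAVA_HOME points at a JRE, not a JDK" >&2
```

With JAVA_HOME set this way, the plugin uses it directly instead of guessing.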


These may be related: https://github.com/davidB/scala-maven-plugin/pull/227 and 
https://github.com/davidB/scala-maven-plugin/issues/221 

Rob


From: Sean Owen <sro...@gmail.com>
Date: Tuesday, 30 April 2019 at 00:18
To: Shmuel Blitz <shmuel.bl...@similarweb.com>
Cc: dev <dev@spark.apache.org>
Subject: Re: Spark build can't find javac


Your JAVA_HOME is pointing to a JRE rather than a JDK installation, or you've 
actually installed the JRE. Only the JDK has javac, etc.


On Mon, Apr 29, 2019 at 4:36 PM Shmuel Blitz <shmuel.bl...@similarweb.com> 
wrote:

Hi,


I'm trying to build Spark on Manjaro with OpenJDK version 1.8.0_212, and I'm 
getting the following error:


Cannot run program "/usr/lib/jvm/java-8-openjdk/jre/bin/javac": error=2, No 
such file or directory

> which javac

/usr/bin/javac


It only runs when I set JAVA_HOME as follows:

> export JAVA_HOME=/usr/lib/jvm/default


Any idea what the issue is?

-- 

Shmuel Blitz 
Data Analysis Team Leader 
Email: shmuel.bl...@similarweb.com 
www.similarweb.com 
