Got curious about this IntelliJ stuff.

I recall using sbt rather than Maven, so go to the terminal in your IntelliJ
and verify what is installed:

 sbt -version
sbt version in this project: 1.3.4
sbt script version: 1.3.4

 scala -version
Scala code runner version 2.11.7 -- Copyright 2002-2013, LAMP/EPFL

java --version
openjdk 11.0.7 2020-04-14
OpenJDK Runtime Environment AdoptOpenJDK (build 11.0.7+10)
OpenJDK 64-Bit Server VM AdoptOpenJDK (build 11.0.7+10, mixed mode)

For now, in the directory where you have Main.scala, create a build.sbt file:

// The simplest possible sbt build file is just one line:

scalaVersion := "2.11.7"
// That is, to create a valid sbt build, all you've got to do is define the
// version of Scala you'd like your project to use.

// To learn more about multi-project builds, head over to the official sbt
// documentation at http://www.scala-sbt.org/documentation.html
libraryDependencies += "org.scala-lang.modules" %% "scala-parser-combinators" % "1.1.2"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.1.0"
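
The build.sbt shown above is trimmed; the jar name further down
(michtest_2.11-1.0.jar) suggests the full file also sets a project name and
version. A minimal sketch of what those extra settings might look like,
alongside the lines already shown (the values here are my assumptions, not
copied from the original file):

// Assumed additions to the build.sbt above -- adjust to your project.
// With these, sbt package produces michtest_2.11-1.0.jar under
// target/scala-2.11 (sbt lower-cases the project name for the artifact).
name := "MichTest"
version := "1.0"
organization := "org.example"

One caveat worth flagging: Spark 2.1.0 officially targets Java 7/8, so
although it compiles under the OpenJDK 11 shown above, running it there may
fail; Java 11 support only arrived in Spark 3.0.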


This is my Main.scala file, a copy from sparkbyexamples
<https://sparkbyexamples.com/spark/spark-setup-run-with-scala-intellij/>


package org.example

import org.apache.spark.sql.SparkSession

object SparkSessionTest extends App {
  val spark = SparkSession.builder()
    .master("local[1]")
    .appName("SparkByExample")
    .getOrCreate()

  println("First SparkContext:")
  println("APP Name :" + spark.sparkContext.appName)
  println("Deploy Mode :" + spark.sparkContext.deployMode)
  println("Master :" + spark.sparkContext.master)

  val sparkSession2 = SparkSession.builder()
    .master("local[1]")
    .appName("SparkByExample-test")
    .getOrCreate()

  println("Second SparkContext:")
  println("APP Name :" + sparkSession2.sparkContext.appName)
  println("Deploy Mode :" + sparkSession2.sparkContext.deployMode)
  println("Master :" + sparkSession2.sparkContext.master)
}
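
Worth noting: the second builder call does not create a second SparkContext.
getOrCreate() returns the already-active session, so the "Second
SparkContext" block should print the same app name ("SparkByExample") as the
first. A one-line check you could append inside the object if you want to
confirm it (my addition, not part of the sparkbyexamples code):

  // getOrCreate() hands back the existing active session, so both
  // references share one and the same SparkContext.
  assert(spark.sparkContext eq sparkSession2.sparkContext)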

Go back to the terminal, in the directory where you have both files,
build.sbt and Main.scala.


*sbt clean*

[info] Loading global plugins from C:\Users\admin\.sbt\1.0\plugins
[info] Loading project definition from
D:\temp\intellij\MichTest\src\main\scala\com\ctp\training\scala\project
[info] Loading settings for project scala from build.sbt ...
[info] Set current project to MichTest (in build
file:/D:/temp/intellij/MichTest/src/main/scala/com/ctp/training/scala/)
[success] Total time: 0 s, completed Feb 27, 2022 9:54:10 AM

*sbt compile*
[info] Loading global plugins from C:\Users\admin\.sbt\1.0\plugins
[info] Loading project definition from
D:\temp\intellij\MichTest\src\main\scala\com\ctp\training\scala\project
[info] Loading settings for project scala from build.sbt ...
[info] Set current project to MichTest (in build
file:/D:/temp/intellij/MichTest/src/main/scala/com/ctp/training/scala/)
[info] Executing in batch mode. For better performance use sbt's shell
[warn] There may be incompatibilities among your library dependencies; run
'evicted' to see detailed eviction warnings.
[info] Compiling 1 Scala source to
D:\temp\intellij\MichTest\src\main\scala\com\ctp\training\scala\target\scala-2.11\classes
...
[success] Total time: 5 s, completed Feb 27, 2022 9:55:10 AM

*sbt package*

[info] Loading global plugins from C:\Users\admin\.sbt\1.0\plugins
[info] Loading project definition from
D:\temp\intellij\MichTest\src\main\scala\com\ctp\training\scala\project
[info] Loading settings for project scala from build.sbt ...
[info] Set current project to MichTest (in build
file:/D:/temp/intellij/MichTest/src/main/scala/com/ctp/training/scala/)
[success] Total time: 1 s, completed Feb 27, 2022 9:56:48 AM

*ls*

    Directory:
D:\temp\intellij\MichTest\src\main\scala\com\ctp\training\scala

Mode                 LastWriteTime         Length Name
----                 -------------         ------ ----
d-----         2/27/2022   7:04 AM                null
d-----         2/27/2022   8:33 AM                project
d-----         2/27/2022   9:08 AM                spark-warehouse
d-----         2/27/2022   9:55 AM                target
-a----         2/27/2022   9:17 AM           3511 build.sbt

Note that you now have a target directory and underneath it scala-2.11 (in my
case) with the jar file michtest_2.11-1.0.jar. Strictly speaking this is a
thin jar holding only your compiled classes (3938 bytes); sbt package does not
bundle the Spark dependencies, so for a true uber/fat jar you would need
something like the sbt-assembly plugin.

*ls*

    Directory:
D:\temp\intellij\MichTest\src\main\scala\com\ctp\training\scala\target\scala-2.11

Mode                 LastWriteTime         Length Name
----                 -------------         ------ ----
d-----         2/27/2022   9:55 AM                classes
d-----         2/27/2022   9:55 AM                update
-a----         2/27/2022   9:56 AM           3938 *michtest_2.11-1.0.jar*
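
Once the jar exists, the usual way to run it is spark-submit. A sketch,
assuming you have a Spark distribution installed with its bin directory on
the PATH (ideally one matching the spark-sql version in build.sbt), using the
class and jar names from above:

spark-submit --class org.example.SparkSessionTest --master local[1] target\scala-2.11\michtest_2.11-1.0.jar

Because sbt package produces a thin jar, the Spark classes themselves come
from the spark-submit runtime rather than from the jar.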


These are old versions, but this still shows how to create a jar file with sbt.

HTH


  view my Linkedin profile
<https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>



 https://en.everybodywiki.com/Mich_Talebzadeh



*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.




On Sat, 26 Feb 2022 at 22:48, Sean Owen <sro...@gmail.com> wrote:

> I don't think any of that is related, no.
> How are your dependencies set up? Manually with IJ, or in a build file
> (Maven, Gradle)? Normally you do the latter and dependencies are taken care
> of for you, but your app would definitely have to express a dependency on
> Scala libs.
>
> On Sat, Feb 26, 2022 at 4:25 PM Bitfox <bit...@bitfox.top> wrote:
>
>> Java SDK installed?
>>
>> On Sun, Feb 27, 2022 at 5:39 AM Sachit Murarka <connectsac...@gmail.com>
>> wrote:
>>
>>> Hello ,
>>>
>>> Thanks for replying. I have installed the Scala plugin in IntelliJ first,
>>> but it's still giving the same error:
>>>
>>> Cannot find project Scala library 2.12.12 for module SparkSimpleApp
>>>
>>> Thanks
>>> Rajat
>>>
>>> On Sun, Feb 27, 2022, 00:52 Bitfox <bit...@bitfox.top> wrote:
>>>
>>>> You need to install Scala first; the current version for Spark is
>>>> 2.12.15.
>>>> I would suggest you install Scala by sdk, which works great.
>>>>
>>>> Thanks
>>>>
>>>> On Sun, Feb 27, 2022 at 12:10 AM rajat kumar <
>>>> kumar.rajat20...@gmail.com> wrote:
>>>>
>>>>> Hello Users,
>>>>>
>>>>> I am trying to create a Spark application using Scala (IntelliJ).
>>>>> I have installed the Scala plugin in IntelliJ but am still getting the below error:
>>>>>
>>>>> Cannot find project Scala library 2.12.12 for module SparkSimpleApp
>>>>>
>>>>>
>>>>> Could anyone please help what I am doing wrong?
>>>>>
>>>>> Thanks
>>>>>
>>>>> Rajat
>>>>>
>>>>
