Hi,
Thanks, everybody, for helping me solve my problem :)
As Zhu said, I had to use mapPartitionsWithIndex in my code.
Thanks,
Have a nice day,
Anahita
On Wed, Mar 29, 2017 at 2:51 AM, Shixiong(Ryan) Zhu wrote:
> mapPartitionsWithSplit was removed in Spark 2.0.0. You can
> use mapPartitionsWithIndex instead.
mapPartitionsWithSplit was removed in Spark 2.0.0. You can
use mapPartitionsWithIndex instead.
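For reference, a minimal sketch of the replacement (the app name, local master and sample RDD below are only illustrative, not taken from Anahita's code):

import org.apache.spark.{SparkConf, SparkContext}

object MapPartitionsWithIndexExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("mpwi-example").setMaster("local[2]"))
    val rdd = sc.parallelize(1 to 10, numSlices = 3)

    // mapPartitionsWithIndex takes the same (index, iterator) => iterator
    // function that the removed mapPartitionsWithSplit used to take;
    // here we simply tag each element with its partition index.
    val tagged = rdd.mapPartitionsWithIndex { (partitionIndex, iter) =>
      iter.map(value => (partitionIndex, value))
    }

    tagged.collect().foreach(println)
    sc.stop()
  }
}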
On Tue, Mar 28, 2017 at 3:52 PM, Anahita Talebi wrote:
> Thanks.
> I tried this one, as well. Unfortunately I still get the same error.
>
>
> On Wednesday, March 29, 2017, Marco Mistroni wrote:
>
>> 1.7.5
Thanks.
I tried this one, as well. Unfortunately I still get the same error.
On Wednesday, March 29, 2017, Marco Mistroni wrote:
> 1.7.5
>
> On 28 Mar 2017 10:10 pm, "Anahita Talebi" wrote:
>
>> Hi,
>>
>> Thanks for your answer.
>> What is the version of "org.slf4j" % "slf4j-api" in your sbt file?
1.7.5
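In build.sbt terms that would be (a sketch of just that dependency line, not Marco's actual file):

// slf4j-api is a plain Java artifact, so it uses a single % (no Scala suffix)
libraryDependencies += "org.slf4j" % "slf4j-api" % "1.7.5"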
On 28 Mar 2017 10:10 pm, "Anahita Talebi" wrote:
> Hi,
>
> Thanks for your answer.
> What is the version of "org.slf4j" % "slf4j-api" in your sbt file?
> I think the problem might come from this part.
>
> On Tue, Mar 28, 2017 at 11:02 PM, Marco Mistroni wrote:
>
>> Hello
>> uhm i have a project whose build.sbt is closest to yours, where i am using
>> spark 2.1, scala 2.11 and scalatest (i upgraded to 3.0.0) and it works fine
Hello again,
I just tried to change the scalatest version to 3.0.0 and remove the libraries
breeze, netlib and scopt, but I still get the same error.
On Tue, Mar 28, 2017 at 11:02 PM, Marco Mistroni wrote:
> Hello
> uhm i have a project whose build.sbt is closest to yours, where i am using
> spark 2.1, scala 2.11 and scalatest (i upgraded to 3.0.0) and it works fine
Hi,
Thanks for your answer.
What is the version of "org.slf4j" % "slf4j-api" in your sbt file?
I think the problem might come from this part.
On Tue, Mar 28, 2017 at 11:02 PM, Marco Mistroni wrote:
> Hello
> uhm i have a project whose build.sbt is closest to yours, where i am using
> spark 2.1, scala 2.11 and scalatest (i upgraded to 3.0.0) and it works fine
Hello
uhm i have a project whose build.sbt is closest to yours, where i am using
spark 2.1, scala 2.11 and scalatest (i upgraded to 3.0.0) and it works fine
in my projects, though i don't have any of the following libraries that you
mention:
- breeze
- netlib.all
- scopt
hth
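For comparison, a rough sketch of a build.sbt with that combination (my reconstruction from the versions Marco mentions, not his actual file; the project name and the provided scope are assumptions):

// build.sbt -- hypothetical minimal setup: Spark 2.1, Scala 2.11, scalatest 3.0.0
name := "spark-sbt-example"
version := "0.1.0"
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  // %% appends the Scala binary version (_2.11) to the artifact name
  "org.apache.spark" %% "spark-core" % "2.1.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "2.1.0" % "provided",
  "org.scalatest"    %% "scalatest"  % "3.0.0" % "test"
)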
Hi,
Thanks for your answer.
I first changed the Scala version to 2.11.8 and kept the Spark version at
1.5.2 (the old version). Then I changed the scalatest version to "3.0.1".
With this configuration, I could compile and run the code and generate
the .jar file.
When I changed the Spark version to 2.1.0, however, I got the same error again.
I personally never add the _scala version to the dependency but always
cross-compile. This seems to be the cleanest. Additionally, Spark dependencies and
Hadoop dependencies should be provided, not compile. Scalatest seems to be
outdated.
I would also not use a local repo, but an artefact manager.
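A sketch of what that advice looks like in sbt (the Hadoop artifact and all version numbers here are assumptions, not from the original build file):

libraryDependencies ++= Seq(
  // %% cross-compiles against scalaVersion instead of hard-coding _2.10/_2.11
  "org.apache.spark"  %% "spark-core"    % "2.1.0" % "provided", // supplied by the cluster
  "org.apache.hadoop"  % "hadoop-client" % "2.7.3" % "provided", // supplied by the cluster
  "org.scalatest"     %% "scalatest"     % "3.0.1" % "test"      // only needed at test time
)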
Hi,
Thanks for your answer. I just changed the sbt file and set the Scala
version to 2.10.4,
but I still get the same error:
[info] Compiling 4 Scala sources to
/Users/atalebi/Desktop/new_version_proxcocoa-master/target/scala-2.10/classes...
[error]
/Users/atalebi/Desktop/new_version_proxcocoa-mast
Hello
that looks to me like there's something dodgy with your Scala installation.
Though Spark 2.0 is built on Scala 2.11, it still supports 2.10... i suggest
you change one thing at a time in your sbt:
first the Spark version. run it and see if it works,
then amend the Scala version.
hth
marco
Hello,
Thank you all for your informative answers.
I actually changed the Scala version to 2.11.8 and the Spark version to
2.1.0 in the build.sbt.
Except for these two (the Scala and Spark versions), I kept the same values
for the rest of the build.sbt file.
Adding to the advice given by others ... Spark 2.1.0 works with Scala 2.11, so set:
scalaVersion := "2.11.8"
When you see something like:
"org.apache.spark" % "spark-core_2.10" % "1.5.2"
that means that library `spark-core` is compiled against Scala 2.10,
so you would have to change that to 2.11 to match, or use %% and let sbt
append the Scala version for you.
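As a sketch of the two equivalent forms (versions chosen to match the rest of the thread):

scalaVersion := "2.11.8"

// explicit suffix: you must keep _2.11 in sync with scalaVersion yourself
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"

// %% appends the Scala binary version for you, so this also resolves to spark-core_2.11
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0"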
On Mar 28, 2017, at 12:11 AM, Jörn Franke wrote:
> Usually you define the dependencies to the Spark library as provided. You also
> seem to mix different Spark versions, which should be avoided.
> The Hadoop library seems to be outdated and should also only be provided.
> The other dependencies you could assemble in a fat jar.
check these versions
function create_build_sbt_file {
BUILD_SBT_FILE=${GEN_APPSDIR}/scala/${APPLICATION}/build.sbt
[ -f ${BUILD_SBT_FILE} ] && rm -f ${BUILD_SBT_FILE}
cat >> $BUILD_SBT_FILE << !
lazy val root = (project in file(".")).
settings(
name := "${APPLICATION}
Usually you define the dependencies to the Spark library as provided. You also
seem to mix different Spark versions, which should be avoided.
The Hadoop library seems to be outdated and should also only be provided.
The other dependencies you could assemble in a fat jar.
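A minimal sketch of that setup with sbt-assembly (the plugin version and the scopt dependency are assumptions for illustration):

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")

// build.sbt
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.1.0" % "provided", // left out of the fat jar
  "com.github.scopt" %% "scopt"      % "3.5.0"               // compile scope: packaged into the fat jar
)

Running `sbt assembly` then produces a single jar containing your code and the compile-scoped dependencies, while the provided Spark and Hadoop jars come from the cluster at runtime.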