Hi,
I'm trying to transform one RDD twice. Inside foreachPartition I apply two
map transformations to the partition. The first time it works fine and I get
results, but the second time I call map on it, it behaves as if the RDD had
no elements.
This is my code:
val credentialsIdsScala:
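A likely cause (assuming both maps are applied to the same partition Iterator inside foreachPartition): a Scala Iterator is single-pass, so the first traversal exhausts it and the second map sees nothing. The RDD itself still has its elements. A minimal sketch of the symptom and a possible fix:

```scala
// A Scala Iterator can be traversed only once.
val it = Iterator(1, 2, 3)
val first = it.map(_ * 2).toList   // consumes the iterator
val second = it.map(_ + 1).toList  // iterator already exhausted: empty

// Fix sketch (names are illustrative): buffer the partition once,
// then derive both results from the buffered collection.
// rdd.foreachPartition { part =>
//   val rows = part.toList        // materialize the single-pass iterator
//   val a = rows.map(row => transformA(row))
//   val b = rows.map(row => transformB(row))
// }
```

If the two maps are instead separate transformations on the RDD itself, this sketch does not apply and the full code would be needed to diagnose further.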
Hi,
I'm trying to create an application that programmatically submits a jar
file to a Spark standalone cluster running on my local PC. However, I always
get the error WARN TaskSetManager:66 - Lost task 1.0 in stage 0.0 (TID
1, 192.168.2.68, executor 0): java.lang.RuntimeException: Stream
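For reference, one supported way to submit a jar programmatically is Spark's own SparkLauncher API, which shells out to spark-submit under the hood. A minimal sketch (the jar path, main class, and master URL below are placeholders, not taken from the original post):

```scala
import org.apache.spark.launcher.SparkLauncher

// Illustrative values only -- substitute your own jar, class, and master URL.
val process = new SparkLauncher()
  .setAppResource("/path/to/app.jar")
  .setMainClass("com.example.Main")
  .setMaster("spark://192.168.2.68:7077")
  .setConf(SparkLauncher.EXECUTOR_MEMORY, "1g")
  .launch()

process.waitFor()
```

"Lost task" errors like the one above often point to a version or classpath mismatch between the driver and the executors rather than to the submission mechanism itself, so it is worth checking that both sides run the same Spark and Scala versions.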
Hi,
I need to develop a service that recommends to a user other, similar users
they can connect to. For each user I have data about their preferences for
specific items in the form:
user, item, preference
1, 75, 0.89
2, 168, 0.478
2, 99, 0.321
3, 31, 0.012
So
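One common approach to this shape of data is collaborative filtering with Spark MLlib's ALS, which learns a latent-factor vector per user; similarity between users can then be computed from those vectors. A sketch under the assumption that the triples above live in a CSV file (file name and ALS parameters are illustrative):

```scala
import org.apache.spark.mllib.recommendation.{ALS, Rating}

// Parse "user, item, preference" lines into Ratings.
val ratings = sc.textFile("preferences.csv").map { line =>
  val Array(user, item, pref) = line.split(",").map(_.trim)
  Rating(user.toInt, item.toInt, pref.toDouble)
}

// Train a matrix-factorization model; rank and iterations are placeholders.
val model = ALS.train(ratings, 10, 10)

// Each user gets a latent-factor vector; cosine similarity between two
// users' vectors gives a ranking of "similar users" to recommend.
val userVectors = model.userFeatures // RDD[(Int, Array[Double])]
```

Whether ALS fits depends on requirements not stated in the post (data size, implicit vs. explicit preferences), so this is only one candidate design.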
Hi,
I have a mixed Java/Scala project and have already been using Spark from
Scala code in local mode. Now some new team members need to develop
functionality that uses Spark from Java code, and they are not familiar
with Scala. I know it's not possible to have two Spark contexts in
the
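Since only one SparkContext can be active per JVM, one option is to share the existing Scala context with the Java code through the JavaSparkContext wrapper rather than creating a second context. A minimal sketch (app name and master are placeholders):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.api.java.JavaSparkContext

// The single context the Scala side already uses.
val sc = new SparkContext(
  new SparkConf().setMaster("local[*]").setAppName("mixed-project"))

// Wrap the same context for the Java side; no second context is created.
val jsc = JavaSparkContext.fromSparkContext(sc)
```

The JavaSparkContext (or, in newer versions, a shared SparkSession) can then be passed into the Java classes, so both halves of the project operate on the same underlying context.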