connect mobile app with Spark backend

2015-06-18 Thread Ralph Bergmann
Hi,


I'm new to Spark and need some architecture tips :-)

I need a way to connect the mobile app to the Spark backend, to upload
data to it and download data from it.

The use case: the user does something in the app, and these changes are
uploaded to the backend, where Spark calculates something. When the user
opens the app again, it downloads the newly calculated data.

My plan is that the mobile app talks to a Jersey/Tomcat server, and this
server loads the data into Spark and starts the jobs.

But what is the best way to upload the data to Spark and to start the job?

Currently Jersey, Tomcat and Spark are on the same machine.

I found spark-jobserver[1], but I'm not sure if it is the right choice.
The mobile app uploads JSON, Jersey converts it into POJOs to do
something with it, and then converts it back to JSON to load it into
Spark, which converts it into POJOs again.

I also thought about Spark Streaming, but doesn't that mean the
streaming job has to run 24/7?



[1] ... https://github.com/spark-jobserver/spark-jobserver
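To avoid the double JSON/POJO round trip, the Jersey layer could hand the app's JSON straight to spark-jobserver over its REST API. A minimal sketch follows; the host, port, app name, and job class are placeholders I've made up for illustration, not values from this thread (spark-jobserver's README documents the "POST /jobs?appName=...&classPath=..." endpoint, 8090 being its default port):

```java
// Sketch of a client for spark-jobserver's job-submission endpoint.
// Only the URL-building part is shown; the JSON payload from the mobile
// app would be POSTed as the request body (e.g. via HttpURLConnection).
public class JobServerClient {

    // Builds the spark-jobserver submission URL:
    //   POST /jobs?appName=<uploaded jar name>&classPath=<job class>
    static String jobUrl(String host, int port, String appName, String classPath) {
        return "http://" + host + ":" + port
                + "/jobs?appName=" + appName + "&classPath=" + classPath;
    }

    public static void main(String[] args) {
        // Placeholder values -- adjust to your own jar and job class.
        System.out.println(jobUrl("localhost", 8090,
                "myapp", "com.example.MyJob"));
        // prints: http://localhost:8090/jobs?appName=myapp&classPath=com.example.MyJob
    }
}
```

With this approach Jersey never needs to deserialize the payload at all unless it wants to validate it; the job on the Spark side parses the JSON once.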

-- 

Ralph Bergmann


www  http://www.dasralph.de | http://www.the4thFloor.eu
mail ra...@dasralph.de
skype    dasralph

facebook https://www.facebook.com/dasralph
google+  https://plus.google.com/+RalphBergmann
xing https://www.xing.com/profile/Ralph_Bergmann3
linkedin https://www.linkedin.com/in/ralphbergmann
gulp https://www.gulp.de/Profil/RalphBergmann.html
github   https://github.com/the4thfloor


pgp key id   0x421F9B78
pgp fingerprint  CEE3 7AE9 07BE 98DF CD5A E69C F131 4A8E 421F 9B78

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: ClassNotFoundException

2015-03-17 Thread Ralph Bergmann
Hi Kevin,


Yes, I can test it. Does that mean I have to build Spark from the git
repository?


Ralph

Am 17.03.15 um 02:59 schrieb Kevin (Sangwoo) Kim:
 Hi Ralph,
 
 It seems like the https://issues.apache.org/jira/browse/SPARK-6299 issue,
 which I'm working on.
 I submitted a PR for it; would you test it?
 
 Regards,
 Kevin






Re: unable to access spark @ spark://debian:7077

2015-03-16 Thread Ralph Bergmann
I can access the management webpage on port 8080 from my Mac, and it
tells me that the master and one slave are running and that I can reach
them on port 7077.

But the port scanner shows that port 8080 is open and port 7077 is not,
even though I ran the scanner on the same machine where Spark is running.


Ralph


Am 16.03.15 um 13:51 schrieb Sean Owen:
 Are you sure the master / slaves started?
 Do you have network connectivity between the two?
 Do you have multiple interfaces maybe?
 Does debian resolve correctly and as you expect to the right host/interface?





ClassNotFoundException

2015-03-16 Thread Ralph Bergmann
Hi,


I want to run the JavaSparkPi example[1] on a remote Spark server, but I
get a ClassNotFoundException.

When I run it locally it works, but not remotely.

I added the spark-core lib as a dependency. Do I need more?

Any ideas?

Thanks Ralph


[1] ...
https://github.com/apache/spark/blob/master/examples/src/main/java/org/apache/spark/examples/JavaSparkPi.java
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/03/16 17:02:45 INFO CoarseGrainedExecutorBackend: Registered signal handlers 
for [TERM, HUP, INT]
2015-03-16 17:02:45.624 java[5730:1133038] Unable to load realm info from 
SCDynamicStore
15/03/16 17:02:45 WARN NativeCodeLoader: Unable to load native-hadoop library 
for your platform... using builtin-java classes where applicable
15/03/16 17:02:45 INFO SecurityManager: Changing view acls to: dasralph
15/03/16 17:02:45 INFO SecurityManager: Changing modify acls to: dasralph
15/03/16 17:02:45 INFO SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users with view permissions: Set(dasralph); users 
with modify permissions: Set(dasralph)
15/03/16 17:02:46 INFO Slf4jLogger: Slf4jLogger started
15/03/16 17:02:46 INFO Remoting: Starting remoting
15/03/16 17:02:46 INFO Remoting: Remoting started; listening on addresses 
:[akka.tcp://driverPropsFetcher@10.0.0.10:54973]
15/03/16 17:02:46 INFO Utils: Successfully started service 'driverPropsFetcher' 
on port 54973.
15/03/16 17:02:46 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down 
remote daemon.
15/03/16 17:02:46 INFO SecurityManager: Changing view acls to: dasralph
15/03/16 17:02:46 INFO SecurityManager: Changing modify acls to: dasralph
15/03/16 17:02:46 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon 
shut down; proceeding with flushing remote transports.
15/03/16 17:02:46 INFO SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users with view permissions: Set(dasralph); users 
with modify permissions: Set(dasralph)
15/03/16 17:02:46 INFO Slf4jLogger: Slf4jLogger started
15/03/16 17:02:46 INFO Remoting: Starting remoting
15/03/16 17:02:46 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut 
down.
15/03/16 17:02:46 INFO Remoting: Remoting started; listening on addresses 
:[akka.tcp://sparkExecutor@10.0.0.10:54977]
15/03/16 17:02:46 INFO Utils: Successfully started service 'sparkExecutor' on 
port 54977.
15/03/16 17:02:46 INFO AkkaUtils: Connecting to MapOutputTracker: 
akka.tcp://sparkDriver@10.0.0.10:54945/user/MapOutputTracker
15/03/16 17:02:46 INFO AkkaUtils: Connecting to BlockManagerMaster: 
akka.tcp://sparkDriver@10.0.0.10:54945/user/BlockManagerMaster
15/03/16 17:02:46 INFO DiskBlockManager: Created local directory at 
/var/folders/5p/s1k2jrqx38ncxkm4wlflgfvwgn/T/spark-185c8652-0244-42ff-90b4-fd9c7dbde7b3/spark-8a9ba955-de6f-4ab1-8995-1474ac1ba3a9/spark-2533211f-c399-467f-851d-6b9ec89defdc/blockmgr-82e98f66-d592-42a7-a4fa-1ab78780814b
15/03/16 17:02:46 INFO MemoryStore: MemoryStore started with capacity 265.4 MB
15/03/16 17:02:46 INFO AkkaUtils: Connecting to OutputCommitCoordinator: 
akka.tcp://sparkDriver@10.0.0.10:54945/user/OutputCommitCoordinator
15/03/16 17:02:46 INFO CoarseGrainedExecutorBackend: Connecting to driver: 
akka.tcp://sparkDriver@10.0.0.10:54945/user/CoarseGrainedScheduler
15/03/16 17:02:46 INFO WorkerWatcher: Connecting to worker 
akka.tcp://sparkWorker@10.0.0.10:58715/user/Worker
15/03/16 17:02:46 INFO WorkerWatcher: Successfully connected to 
akka.tcp://sparkWorker@10.0.0.10:58715/user/Worker
15/03/16 17:02:46 INFO CoarseGrainedExecutorBackend: Successfully registered 
with driver
15/03/16 17:02:46 INFO Executor: Starting executor ID 0 on host 10.0.0.10
15/03/16 17:02:47 INFO NettyBlockTransferService: Server created on 54983
15/03/16 17:02:47 INFO BlockManagerMaster: Trying to register BlockManager
15/03/16 17:02:47 INFO BlockManagerMaster: Registered BlockManager
15/03/16 17:02:47 INFO AkkaUtils: Connecting to HeartbeatReceiver: 
akka.tcp://sparkDriver@10.0.0.10:54945/user/HeartbeatReceiver
15/03/16 17:02:47 INFO CoarseGrainedExecutorBackend: Got assigned task 0
15/03/16 17:02:47 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
15/03/16 17:02:47 INFO CoarseGrainedExecutorBackend: Got assigned task 1
15/03/16 17:02:47 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
15/03/16 17:02:47 INFO TorrentBroadcast: Started reading broadcast variable 0
15/03/16 17:02:47 INFO MemoryStore: ensureFreeSpace(1679) called with curMem=0, 
maxMem=278302556
15/03/16 17:02:47 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in 
memory (estimated size 1679.0 B, free 265.4 MB)
15/03/16 17:02:47 INFO BlockManagerMaster: Updated info of block 
broadcast_0_piece0
15/03/16 17:02:47 INFO TorrentBroadcast: Reading broadcast variable 0 took 188 
ms
15/03/16 17:02:47 INFO MemoryStore: ensureFreeSpace(2312) called with 
curMem=1679, maxMem=278302556
15/03/16 17:02:47 INFO MemoryStore: Block broadcast_0 stored as values 
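(A common cause of a ClassNotFoundException that appears only against a remote master is that the application jar is never shipped to the executors: locally the classes are on the driver's classpath, remotely they are not. A sketch of the usual fix, assuming the Spark 1.3 Java API; the jar path and app name below are placeholders, and `spark://debian:7077` is taken from the other thread:)

```java
// Sketch: ship the jar containing JavaSparkPi to the executors.
// Without setJars (or spark-submit's equivalent), remote executors
// cannot load your application classes.
SparkConf conf = new SparkConf()
        .setAppName("JavaSparkPi")
        .setMaster("spark://debian:7077")
        .setJars(new String[] { "target/my-spark-app-1.0.jar" }); // placeholder path
JavaSparkContext sc = new JavaSparkContext(conf);
```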

Re: unable to access spark @ spark://debian:7077

2015-03-16 Thread Ralph Bergmann
Okay, I think I found the mistake:

the Eclipse Maven plugin suggested version 1.2.1 of the spark-core lib,
but I use Spark 1.3.0.

After fixing that, I can access the Spark server.
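(For reference, the fix amounts to pinning the client dependency to the same version as the server. A hedged pom fragment, assuming a Spark 1.3.0 standalone server; the Scala-suffix choice and `provided` scope depend on your setup:)

```xml
<!-- The driver-side Spark version must match the running server (1.3.0 here). -->
<dependency>
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-core_2.10</artifactId>
   <version>1.3.0</version>
   <scope>provided</scope>
</dependency>
```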


Ralph


Am 16.03.15 um 14:39 schrieb Ralph Bergmann:
 I can access the management webpage on port 8080 from my Mac, and it
 tells me that the master and one slave are running and that I can reach
 them on port 7077.
 
 But the port scanner shows that port 8080 is open and port 7077 is not,
 even though I ran the scanner on the same machine where Spark is running.
 
 
 Ralph




unable to access spark @ spark://debian:7077

2015-03-16 Thread Ralph Bergmann
Hi,


I'm taking my first steps with Spark, but I can't access Spark, which
runs on my Linux server, from my Mac.

I start Spark with sbin/start-all.sh

When I open the web UI on port 8080 it shows that everything is running
and that Spark can be reached on port 7077, but connecting to that port
doesn't work.

I scanned the Linux machine with nmap, and port 7077 isn't open.

On the Mac side I get this error message:

Using Spark's default log4j profile:
org/apache/spark/log4j-defaults.properties
15/03/16 09:11:41 INFO SparkContext: Running Spark version 1.3.0
2015-03-16 09:11:41.782 java[1004:46676] Unable to load realm info from
SCDynamicStore
15/03/16 09:11:41 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
15/03/16 09:11:42 INFO SecurityManager: Changing view acls to: dasralph
15/03/16 09:11:42 INFO SecurityManager: Changing modify acls to: dasralph
15/03/16 09:11:42 INFO SecurityManager: SecurityManager: authentication
disabled; ui acls disabled; users with view permissions: Set(dasralph);
users with modify permissions: Set(dasralph)
15/03/16 09:11:43 INFO Slf4jLogger: Slf4jLogger started
15/03/16 09:11:43 INFO Remoting: Starting remoting
15/03/16 09:11:43 INFO Remoting: Remoting started; listening on
addresses :[akka.tcp://sparkDriver@imac_wlan.lan:52886]
15/03/16 09:11:43 INFO Utils: Successfully started service 'sparkDriver'
on port 52886.
15/03/16 09:11:43 INFO SparkEnv: Registering MapOutputTracker
15/03/16 09:11:43 INFO SparkEnv: Registering BlockManagerMaster
15/03/16 09:11:43 INFO DiskBlockManager: Created local directory at
/var/folders/h3/r2qtlmbn1cd6ctj_rcyyq_24gn/T/spark-9cce9d78-a0e6-4fb5-8cf6-00d91c764927/blockmgr-bd444818-a50a-4ea0-9cf6-3b2545f32238
15/03/16 09:11:43 INFO MemoryStore: MemoryStore started with capacity
1966.1 MB
15/03/16 09:11:43 INFO HttpFileServer: HTTP File server directory is
/var/folders/h3/r2qtlmbn1cd6ctj_rcyyq_24gn/T/spark-dd67dc02-c0b7-4167-b8d5-29f057cfb253/httpd-8534edfe-46b8-49ea-9273-3e8e47947332
15/03/16 09:11:43 INFO HttpServer: Starting HTTP Server
15/03/16 09:11:44 INFO Server: jetty-8.y.z-SNAPSHOT
15/03/16 09:11:44 INFO AbstractConnector: Started
SocketConnector@0.0.0.0:52913
15/03/16 09:11:44 INFO Utils: Successfully started service 'HTTP file
server' on port 52913.
15/03/16 09:11:44 INFO SparkEnv: Registering OutputCommitCoordinator
15/03/16 09:11:44 INFO Server: jetty-8.y.z-SNAPSHOT
15/03/16 09:11:44 INFO AbstractConnector: Started
SelectChannelConnector@0.0.0.0:4040
15/03/16 09:11:44 INFO Utils: Successfully started service 'SparkUI' on
port 4040.
15/03/16 09:11:44 INFO SparkUI: Started SparkUI at http://imac_wlan.lan:4040
15/03/16 09:11:44 INFO AppClient$ClientActor: Connecting to master
akka.tcp://sparkMaster@debian:7077/user/Master...
15/03/16 09:11:45 WARN ReliableDeliverySupervisor: Association with
remote system [akka.tcp://sparkMaster@debian:7077] has failed, address
is now gated for [5000] ms. Reason is: [Disassociated].
15/03/16 09:12:04 INFO AppClient$ClientActor: Connecting to master
akka.tcp://sparkMaster@debian:7077/user/Master...
15/03/16 09:12:04 WARN ReliableDeliverySupervisor: Association with
remote system [akka.tcp://sparkMaster@debian:7077] has failed, address
is now gated for [5000] ms. Reason is: [Disassociated].
15/03/16 09:12:24 INFO AppClient$ClientActor: Connecting to master
akka.tcp://sparkMaster@debian:7077/user/Master...
15/03/16 09:12:24 WARN ReliableDeliverySupervisor: Association with
remote system [akka.tcp://sparkMaster@debian:7077] has failed, address
is now gated for [5000] ms. Reason is: [Disassociated].
15/03/16 09:12:44 ERROR SparkDeploySchedulerBackend: Application has
been killed. Reason: All masters are unresponsive! Giving up.
15/03/16 09:12:44 ERROR TaskSchedulerImpl: Exiting due to error from
cluster scheduler: All masters are unresponsive! Giving up.
15/03/16 09:12:44 WARN SparkDeploySchedulerBackend: Application ID is
not initialized yet.
15/03/16 09:12:45 INFO NettyBlockTransferService: Server created on 53666
15/03/16 09:12:45 INFO BlockManagerMaster: Trying to register BlockManager
15/03/16 09:12:45 INFO BlockManagerMasterActor: Registering block
manager imac_wlan.lan:53666 with 1966.1 MB RAM, BlockManagerId(driver,
imac_wlan.lan, 53666)
15/03/16 09:12:45 INFO BlockManagerMaster: Registered BlockManager
15/03/16 09:12:45 ERROR MetricsSystem: Sink class
org.apache.spark.metrics.sink.MetricsServlet cannot be instantialized


What's going wrong?

Ralph
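(One thing worth checking with "Disassociated" warnings like the above: in Spark 1.x the master URL the driver uses must match, character for character, the spark://host:port string shown at the top of the port-8080 web UI, because Akka rejects connections addressed to a different host name. A sketch, assuming the Java API; the app name is a placeholder:)

```java
// Sketch: copy the master URL verbatim from the web UI on port 8080.
// "debian" here must be exactly the host name the master advertises,
// not an IP or alias that resolves to the same machine.
SparkConf conf = new SparkConf()
        .setAppName("FirstSteps") // placeholder
        .setMaster("spark://debian:7077");
```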


How to connect a mobile app (Android/iOS) with a Spark backend?

2015-02-18 Thread Ralph Bergmann | the4thFloor.eu
Hi,


I have dependency problems using spark-core inside an HttpServlet
(see my other mail).

Maybe my approach is wrong?

What I want to do: I'm developing a mobile app (Android and iOS) and
want to connect it to Spark on the backend side.

To do this I want to use Tomcat. The app asks Tomcat via HTTPS for the
needed data, and Tomcat asks Spark.

Is this the right way, or is there a better way to connect my mobile
apps to the Spark backend?

I hope I'm not the first one who wants to do this.



Ralph





Re: How to connect a mobile app (Android/iOS) with a Spark backend?

2015-02-18 Thread Ralph Bergmann | the4thFloor.eu
Hi,


Am 18.02.15 um 15:58 schrieb Sean Owen:
 That said, it depends a lot on what you are trying to do. What are you
 trying to do? You just say you're connecting to spark.

There are two tasks I want to solve with Spark.

1) The user opens the mobile app, and the app sends a ping to the
backend. When this happens, the backend has to collect some data from
other servers via HTTP and do some processing on that data.

2) The mobile app can download the data from 1). In this case the
backend has to find/create the right data (depending on user location,
rating, etc.).


Ralph





Berlin Apache Spark Meetup

2015-02-17 Thread Ralph Bergmann | the4thFloor.eu
Hi,


there is a small Spark Meetup group in Berlin, Germany :-)
http://www.meetup.com/Berlin-Apache-Spark-Meetup/

Please add this group to the Meetup list at
https://spark.apache.org/community.html


Ralph




spark-core in a servlet

2015-02-17 Thread Ralph Bergmann | the4thFloor.eu
Hi,


I want to use spark-core inside an HttpServlet. I use Maven for the
build, but I have a dependency problem :-(

I get this error message:

ClassCastException:
com.sun.jersey.server.impl.container.servlet.JerseyServletContainerInitializer
cannot be cast to javax.servlet.ServletContainerInitializer

When I add these exclusions it builds, but then other classes are not
found at runtime:

<dependency>
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-core_2.11</artifactId>
   <version>1.2.1</version>
   <exclusions>
      <exclusion>
         <groupId>org.apache.hadoop</groupId>
         <artifactId>hadoop-client</artifactId>
      </exclusion>
      <exclusion>
         <groupId>org.eclipse.jetty</groupId>
         <artifactId>*</artifactId>
      </exclusion>
   </exclusions>
</dependency>


What can I do?


Thanks a lot!,

Ralph
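(A ClassCastException between two copies of a javax.servlet class usually means the servlet API is on the classpath twice: once from Tomcat and once transitively via spark-core. A hedged exclusion sketch; the exact transitive coordinates vary by Spark version, so check `mvn dependency:tree` for the real offenders:)

```xml
<!-- Sketch: keep only the container's own servlet API on the classpath
     by excluding the copies spark-core drags in transitively. -->
<dependency>
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-core_2.11</artifactId>
   <version>1.2.1</version>
   <exclusions>
      <exclusion>
         <groupId>javax.servlet</groupId>
         <artifactId>*</artifactId>
      </exclusion>
      <exclusion>
         <groupId>org.eclipse.jetty.orbit</groupId>
         <artifactId>javax.servlet</artifactId>
      </exclusion>
   </exclusions>
</dependency>
```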

-- 

Ralph Bergmann

iOS and Android app developer


www  http://www.the4thFloor.eu

mail ra...@the4thfloor.eu
skype    dasralph

google+  https://plus.google.com/+RalphBergmann
xing https://www.xing.com/profile/Ralph_Bergmann3
linkedin https://www.linkedin.com/in/ralphbergmann
gulp https://www.gulp.de/Profil/RalphBergmann.html
github   https://github.com/the4thfloor


pgp key id   0x421F9B78
pgp fingerprint  CEE3 7AE9 07BE 98DF CD5A E69C F131 4A8E 421F 9B78
<project
   xmlns="http://maven.apache.org/POM/4.0.0"
   xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
   xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
   <modelVersion>4.0.0</modelVersion>

   <name>SparkWordCount</name>
   <version>1.0-SNAPSHOT</version>
   <groupId>eu.the4thfloor</groupId>
   <artifactId>SparkWordCount</artifactId>
   <packaging>war</packaging>
   <url>http://maven.apache.org</url>

   <properties>
      <jdk.version>1.7</jdk.version>
   </properties>

   <dependencies>
      <dependency>
         <groupId>junit</groupId>
         <artifactId>junit</artifactId>
         <version>4.11</version>
         <scope>test</scope>
      </dependency>
      <dependency>
         <groupId>javax.servlet</groupId>
         <artifactId>javax.servlet-api</artifactId>
         <version>3.1.0</version>
         <scope>provided</scope>
      </dependency>
      <dependency>
         <groupId>org.apache.spark</groupId>
         <artifactId>spark-core_2.11</artifactId>
         <version>1.2.1</version>
         <exclusions>
            <exclusion>
               <groupId>org.apache.hadoop</groupId>
               <artifactId>hadoop-client</artifactId>
            </exclusion>
            <exclusion>
               <groupId>org.eclipse.jetty</groupId>
               <artifactId>*</artifactId>
            </exclusion>
         </exclusions>
      </dependency>
   </dependencies>

   <build>
      <finalName>SparkWordCount</finalName>
      <plugins>
         <!-- Eclipse project -->
         <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-eclipse-plugin</artifactId>
            <version>2.9</version>
            <configuration>
               <!-- Always download and attach dependencies source code -->
               <downloadSources>true</downloadSources>
               <downloadJavadocs>false</downloadJavadocs>
               <!-- Avoid having to type mvn eclipse:eclipse -Dwtpversion=2.0 -->
               <wtpversion>2.0</wtpversion>
            </configuration>
         </plugin>
         <!-- Set JDK Compiler Level -->
         <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>2.3.2</version>
            <configuration>
               <source>${jdk.version}</source>
               <target>${jdk.version}</target>
            </configuration>
         </plugin>
         <!-- For Maven Tomcat Plugin -->
         <plugin>
            <groupId>org.apache.tomcat.maven</groupId>
            <artifactId>tomcat7-maven-plugin</artifactId>
            <version>2.2</version>
            <configuration>
               <path>/SparkWordCount</path>
            </configuration>
         </plugin>
      </plugins>
   </build>
</project>
