Hi Pat,

I have adapted the integration test for my app. Attached are the integration
test as I'm running it, the data file, and the screen output of the
integration test. In the output you can see the same error as before, even
though I now have much more data.

I wanted to implement a small example (a sort of 'Hello world' example, one
in which I could work out the numbers by hand) so I could understand how
PredictionIO and UR work before I move on to implementing real-world
apps... but this is turning out to be more difficult than anticipated.

Any help is greatly appreciated!

Best regards,
Noelia



On 5 October 2017 at 19:22, Pat Ferrel <p...@occamsmachete.com> wrote:

> Ok, that config should work. Does the integration test pass?
>
> The data you are using is extremely small and, though it does look like it
> has cooccurrences, they may not meet the minimum “big-data” thresholds used by
> default. Try adding more data, or use the handmade example data: rename
> purchase to view and discard the existing view data if you wish.
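>
> A rough sketch of that rename, assuming the handmade example data is a plain
> text file with the event name spelled out on each line (the file names below
> are only placeholders):
>
> # sketch: copy the handmade example data, turning "purchase" events into "view"
> with open("data/sample-handmade-data.txt") as src, \
>      open("data/view-only-data.txt", "w") as dst:
>     for line in src:
>         dst.write(line.replace("purchase", "view"))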
>
> The error is very odd and I’ve never seen it. If the integration test
> works I can only surmise it's your data.
>
>
> On Oct 5, 2017, at 12:02 AM, Noelia Osés Fernández <no...@vicomtech.org>
> wrote:
>
> SPARK: spark-1.6.3-bin-hadoop2.6
>
> PIO: 0.11.0-incubating
>
> Scala: whatever gets installed when installing PIO 0.11.0-incubating; I
> haven't installed Scala separately.
>
> UR: ActionML's UR v0.6.0, I suppose, as that's the last version mentioned in
> the README file. I have attached the UR zip file I downloaded from the
> ActionML GitHub account.
>
> Thank you for your help!!
>
> On 4 October 2017 at 17:20, Pat Ferrel <p...@occamsmachete.com> wrote:
>
>> What versions of Scala, Spark, PIO, and UR are you using?
>>
>>
>> On Oct 4, 2017, at 6:10 AM, Noelia Osés Fernández <no...@vicomtech.org>
>> wrote:
>>
>> Hi all,
>>
>> I'm still trying to create a very simple app to learn how to use PredictionIO,
>> and I'm still having trouble. pio build runs with no problem, but when I run
>> pio train I get a very long error message related to serialisation (error
>> message copied below).
>>
>> pio status reports system is all ready to go.
>>
>> The app I'm trying to build is very simple: it only has 'view' events.
>> Here's the engine.json:
>>
>> *===========================================================*
>> {
>>   "comment":" This config file uses default settings for all but the
>> required values see README.md for docs",
>>   "id": "default",
>>   "description": "Default settings",
>>   "engineFactory": "com.actionml.RecommendationEngine",
>>   "datasource": {
>>     "params" : {
>>       "name": "tiny_app_data.csv",
>>       "appName": "TinyApp",
>>       "eventNames": ["view"]
>>     }
>>   },
>>   "algorithms": [
>>     {
>>       "comment": "simplest setup where all values are default, popularity based backfill, must add eventsNames",
>>       "name": "ur",
>>       "params": {
>>         "appName": "TinyApp",
>>         "indexName": "urindex",
>>         "typeName": "items",
>>         "comment": "must have data for the first event or the model will not build, other events are optional",
>>         "eventNames": ["view"]
>>       }
>>     }
>>   ]
>> }
>> *===========================================================*
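>>
>> Note: the engine.json files that ship with the UR template (for example the
>> handmade example in the 0.6.0 download) also contain a sparkConf section with
>> Kryo/Mahout serializer settings. Whether its absence matters here is only a
>> guess, but since the failure below is a serialisation error it may be worth
>> diffing this file against something along the following lines (a sketch based
>> on the template's sample engine.json; check the exact keys and values in the
>> file that ships with the download):
>>
>> "sparkConf": {
>>   "spark.serializer": "org.apache.spark.serializer.KryoSerializer",
>>   "spark.kryo.registrator": "org.apache.mahout.sparkbindings.io.MahoutKryoRegistrator",
>>   "spark.kryo.referenceTracking": "false",
>>   "spark.kryoserializer.buffer": "300m",
>>   "es.index.auto.create": "true"
>> },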
>>
>> The data I'm using is:
>>
>> "u1","i1"
>> "u2","i1"
>> "u2","i2"
>> "u3","i2"
>> "u3","i3"
>> "u4","i4"
>>
>> meaning user u viewed item i.
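>>
>> To work the numbers out by hand, here is a rough sketch of the raw item
>> co-occurrence counts in this tiny data set (plain Python counts only; the
>> UR's actual CCO model applies an LLR test on top of counts like these, so
>> this is illustrative rather than the real model):
>>
>> from collections import defaultdict
>> from itertools import combinations
>>
>> views = [("u1", "i1"), ("u2", "i1"), ("u2", "i2"),
>>          ("u3", "i2"), ("u3", "i3"), ("u4", "i4")]
>>
>> # items seen by each user
>> items_by_user = defaultdict(set)
>> for user, item in views:
>>     items_by_user[user].add(item)
>>
>> # count how many users saw each pair of items together
>> cooccur = defaultdict(int)
>> for items in items_by_user.values():
>>     for a, b in combinations(sorted(items), 2):
>>         cooccur[(a, b)] += 1
>>
>> # prints {('i1', 'i2'): 1, ('i2', 'i3'): 1} -- i4 never co-occurs with anything
>> print(dict(cooccur))
>>
>> Each item pair co-occurs at most once here, which is the kind of count that
>> can fall below the default "big-data" correlation thresholds.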
>>
>> The data has been added to the database with the following python code:
>>
>> *===========================================================*
>> """
>> Import sample data for recommendation engine
>> """
>>
>> import predictionio
>> import argparse
>> import random
>>
>> RATE_ACTIONS_DELIMITER = ","
>> SEED = 1
>>
>>
>> def import_events(client, file):
>>   random.seed(SEED)
>>   count = 0
>>   print "Importing data..."
>>
>>   items = []
>>   users = []
>>   f = open(file, 'r')
>>   for line in f:
>>     data = line.rstrip('\r\n').split(RATE_ACTIONS_DELIMITER)
>>     users.append(data[0])
>>     items.append(data[1])
>>     client.create_event(
>>       event="view",
>>       entity_type="user",
>>       entity_id=data[0],
>>       target_entity_type="item",
>>       target_entity_id=data[1]
>>     )
>>     print "Event: " + "view" + " entity_id: " + data[0] + "
>> target_entity_id: " + data[1]
>>     count += 1
>>   f.close()
>>
>>   users = set(users)
>>   items = set(items)
>>   print "All users: " + str(users)
>>   print "All items: " + str(items)
>>   for item in items:
>>     client.create_event(
>>       event="$set",
>>       entity_type="item",
>>       entity_id=item
>>     )
>>     count += 1
>>
>>
>>   print "%s events are imported." % count
>>
>>
>> if __name__ == '__main__':
>>   parser = argparse.ArgumentParser(
>>     description="Import sample data for recommendation engine")
>>   parser.add_argument('--access_key', default='invalid_access_key')
>>   parser.add_argument('--url', default="http://localhost:7070")
>>   parser.add_argument('--file', default="./data/tiny_app_data.csv")
>>
>>   args = parser.parse_args()
>>   print args
>>
>>   client = predictionio.EventClient(
>>     access_key=args.access_key,
>>     url=args.url,
>>     threads=5,
>>     qsize=500)
>>   import_events(client, args.file)
>> *===========================================================*
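>>
>> One side note on the import script above, probably unrelated to the training
>> error but visible in the attached output: the CSV fields keep their
>> surrounding double quotes, so events are created for entity ids like "u1"
>> with the quotes included. If that is not intended, stripping the quotes on
>> import is roughly a one-line change:
>>
>> data = [field.strip('"') for field in line.rstrip('\r\n').split(RATE_ACTIONS_DELIMITER)]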
>>
>> My pio-env.sh is the following:
>>
>> *===========================================================*
>> #!/usr/bin/env bash
>> #
>> # Copy this file as pio-env.sh and edit it for your site's configuration.
>> #
>> # Licensed to the Apache Software Foundation (ASF) under one or more
>> # contributor license agreements.  See the NOTICE file distributed with
>> # this work for additional information regarding copyright ownership.
>> # The ASF licenses this file to You under the Apache License, Version 2.0
>> # (the "License"); you may not use this file except in compliance with
>> # the License.  You may obtain a copy of the License at
>> #
>> #    http://www.apache.org/licenses/LICENSE-2.0
>> #
>> # Unless required by applicable law or agreed to in writing, software
>> # distributed under the License is distributed on an "AS IS" BASIS,
>> # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
>> # See the License for the specific language governing permissions and
>> # limitations under the License.
>> #
>>
>> # PredictionIO Main Configuration
>> #
>> # This section controls core behavior of PredictionIO. It is very likely that
>> # you need to change these to fit your site.
>>
>> # SPARK_HOME: Apache Spark is a hard dependency and must be configured.
>> # SPARK_HOME=$PIO_HOME/vendors/spark-2.0.2-bin-hadoop2.7
>> SPARK_HOME=$PIO_HOME/vendors/spark-1.6.3-bin-hadoop2.6
>>
>> POSTGRES_JDBC_DRIVER=$PIO_HOME/lib/postgresql-42.1.4.jar
>> MYSQL_JDBC_DRIVER=$PIO_HOME/lib/mysql-connector-java-5.1.41.jar
>>
>> # ES_CONF_DIR: You must configure this if you have advanced configuration for
>> #              your Elasticsearch setup.
>> # ES_CONF_DIR=/opt/elasticsearch
>> #ES_CONF_DIR=$PIO_HOME/vendors/elasticsearch-1.7.6
>>
>> # HADOOP_CONF_DIR: You must configure this if you intend to run PredictionIO
>> #                  with Hadoop 2.
>> # HADOOP_CONF_DIR=/opt/hadoop
>>
>> # HBASE_CONF_DIR: You must configure this if you intend to run PredictionIO
>> #                 with HBase on a remote cluster.
>> # HBASE_CONF_DIR=$PIO_HOME/vendors/hbase-1.0.0/conf
>>
>> # Filesystem paths that PredictionIO uses as block storage.
>> PIO_FS_BASEDIR=$HOME/.pio_store
>> PIO_FS_ENGINESDIR=$PIO_FS_BASEDIR/engines
>> PIO_FS_TMPDIR=$PIO_FS_BASEDIR/tmp
>>
>> # PredictionIO Storage Configuration
>> #
>> # This section controls programs that make use of PredictionIO's built-in
>> # storage facilities. Default values are shown below.
>> #
>> # For more information on storage configuration please refer to
>> # http://predictionio.incubator.apache.org/system/anotherdatastore/
>>
>> # Storage Repositories
>>
>> # Default is to use PostgreSQL
>> PIO_STORAGE_REPOSITORIES_METADATA_NAME=pio_meta
>> PIO_STORAGE_REPOSITORIES_METADATA_SOURCE=ELASTICSEARCH
>>
>> PIO_STORAGE_REPOSITORIES_EVENTDATA_NAME=pio_event
>> PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=HBASE
>>
>> PIO_STORAGE_REPOSITORIES_MODELDATA_NAME=pio_model
>> PIO_STORAGE_REPOSITORIES_MODELDATA_SOURCE=LOCALFS
>>
>> # Storage Data Sources
>>
>> # PostgreSQL Default Settings
>> # Please change "pio" to your database name in PIO_STORAGE_SOURCES_PGSQL_URL
>> # Please change PIO_STORAGE_SOURCES_PGSQL_USERNAME and
>> # PIO_STORAGE_SOURCES_PGSQL_PASSWORD accordingly
>> PIO_STORAGE_SOURCES_PGSQL_TYPE=jdbc
>> PIO_STORAGE_SOURCES_PGSQL_URL=jdbc:postgresql://localhost/pio
>> PIO_STORAGE_SOURCES_PGSQL_USERNAME=pio
>> PIO_STORAGE_SOURCES_PGSQL_PASSWORD=pio
>>
>> # MySQL Example
>> # PIO_STORAGE_SOURCES_MYSQL_TYPE=jdbc
>> # PIO_STORAGE_SOURCES_MYSQL_URL=jdbc:mysql://localhost/pio
>> # PIO_STORAGE_SOURCES_MYSQL_USERNAME=pio
>> # PIO_STORAGE_SOURCES_MYSQL_PASSWORD=pio
>>
>> # Elasticsearch Example
>> # PIO_STORAGE_SOURCES_ELASTICSEARCH_TYPE=elasticsearch
>> # PIO_STORAGE_SOURCES_ELASTICSEARCH_HOSTS=localhost
>> # PIO_STORAGE_SOURCES_ELASTICSEARCH_PORTS=9200
>> # PIO_STORAGE_SOURCES_ELASTICSEARCH_SCHEMES=http
>> # PIO_STORAGE_SOURCES_ELASTICSEARCH_HOME=$PIO_HOME/vendors/elasticsearch-5.2.1
>> # Elasticsearch 1.x Example
>> PIO_STORAGE_SOURCES_ELASTICSEARCH_TYPE=elasticsearch
>> PIO_STORAGE_SOURCES_ELASTICSEARCH_CLUSTERNAME=myprojectES
>> PIO_STORAGE_SOURCES_ELASTICSEARCH_HOSTS=localhost
>> PIO_STORAGE_SOURCES_ELASTICSEARCH_PORTS=9300
>> PIO_STORAGE_SOURCES_ELASTICSEARCH_HOME=$PIO_HOME/vendors/elasticsearch-1.7.6
>>
>> # Local File System Example
>> PIO_STORAGE_SOURCES_LOCALFS_TYPE=localfs
>> PIO_STORAGE_SOURCES_LOCALFS_PATH=$PIO_FS_BASEDIR/models
>>
>> # HBase Example
>> PIO_STORAGE_SOURCES_HBASE_TYPE=hbase
>> PIO_STORAGE_SOURCES_HBASE_HOME=$PIO_HOME/vendors/hbase-1.2.6
>>
>>
>> *===========================================================*
>> *Error message:*
>>
>> *===========================================================*
>> [ERROR] [TaskSetManager] Task 2.0 in stage 10.0 (TID 24) had a not
>> serializable result: org.apache.mahout.math.RandomAccessSparseVector
>> Serialization stack:
>>     - object not serializable (class: 
>> org.apache.mahout.math.RandomAccessSparseVector,
>> value: {3:1.0,2:1.0})
>>     - field (class: scala.Tuple2, name: _2, type: class java.lang.Object)
>>     - object (class scala.Tuple2, (2,{3:1.0,2:1.0})); not retrying
>> [ERROR] [TaskSetManager] Task 3.0 in stage 10.0 (TID 25) had a not
>> serializable result: org.apache.mahout.math.RandomAccessSparseVector
>> Serialization stack:
>>     - object not serializable (class: 
>> org.apache.mahout.math.RandomAccessSparseVector,
>> value: {0:1.0,3:1.0})
>>     - field (class: scala.Tuple2, name: _2, type: class java.lang.Object)
>>     - object (class scala.Tuple2, (3,{0:1.0,3:1.0})); not retrying
>> [ERROR] [TaskSetManager] Task 1.0 in stage 10.0 (TID 23) had a not
>> serializable result: org.apache.mahout.math.RandomAccessSparseVector
>> Serialization stack:
>>     - object not serializable (class: 
>> org.apache.mahout.math.RandomAccessSparseVector,
>> value: {1:1.0})
>>     - field (class: scala.Tuple2, name: _2, type: class java.lang.Object)
>>     - object (class scala.Tuple2, (1,{1:1.0})); not retrying
>> [ERROR] [TaskSetManager] Task 0.0 in stage 10.0 (TID 22) had a not
>> serializable result: org.apache.mahout.math.RandomAccessSparseVector
>> Serialization stack:
>>     - object not serializable (class: 
>> org.apache.mahout.math.RandomAccessSparseVector,
>> value: {0:1.0})
>>     - field (class: scala.Tuple2, name: _2, type: class java.lang.Object)
>>     - object (class scala.Tuple2, (0,{0:1.0})); not retrying
>> Exception in thread "main" org.apache.spark.SparkException: Job aborted
>> due to stage failure: Task 2.0 in stage 10.0 (TID 24) had a not
>> serializable result: org.apache.mahout.math.RandomAccessSparseVector
>> Serialization stack:
>>     - object not serializable (class: 
>> org.apache.mahout.math.RandomAccessSparseVector,
>> value: {3:1.0,2:1.0})
>>     - field (class: scala.Tuple2, name: _2, type: class java.lang.Object)
>>     - object (class scala.Tuple2, (2,{3:1.0,2:1.0}))
>>     at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
>>     at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$
>> 1.apply(DAGScheduler.scala:1419)
>>     at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$
>> 1.apply(DAGScheduler.scala:1418)
>>     at scala.collection.mutable.ResizableArray$class.foreach(Resiza
>> bleArray.scala:59)
>>     at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
>>     at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGSchedu
>> ler.scala:1418)
>>     at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskS
>> etFailed$1.apply(DAGScheduler.scala:799)
>>     at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskS
>> etFailed$1.apply(DAGScheduler.scala:799)
>>     at scala.Option.foreach(Option.scala:236)
>>     at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(
>> DAGScheduler.scala:799)
>>     at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOn
>> Receive(DAGScheduler.scala:1640)
>>     at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onRe
>> ceive(DAGScheduler.scala:1599)
>>     at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onRe
>> ceive(DAGScheduler.scala:1588)
>>     at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
>>     at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.
>> scala:620)
>>     at org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
>>     at org.apache.spark.SparkContext.runJob(SparkContext.scala:1952)
>>     at org.apache.spark.rdd.RDD$$anonfun$fold$1.apply(RDD.scala:1088)
>>     at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperati
>> onScope.scala:150)
>>     at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperati
>> onScope.scala:111)
>>     at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
>>     at org.apache.spark.rdd.RDD.fold(RDD.scala:1082)
>>     at org.apache.mahout.sparkbindings.drm.CheckpointedDrmSpark.com
>> puteNRow(CheckpointedDrmSpark.scala:188)
>>     at org.apache.mahout.sparkbindings.drm.CheckpointedDrmSpark.
>> nrow$lzycompute(CheckpointedDrmSpark.scala:55)
>>     at org.apache.mahout.sparkbindings.drm.CheckpointedDrmSpark.
>> nrow(CheckpointedDrmSpark.scala:55)
>>     at org.apache.mahout.sparkbindings.drm.CheckpointedDrmSpark.new
>> RowCardinality(CheckpointedDrmSpark.scala:219)
>>     at com.actionml.IndexedDatasetSpark$.apply(Preparator.scala:213)
>>     at com.actionml.Preparator$$anonfun$3.apply(Preparator.scala:71)
>>     at com.actionml.Preparator$$anonfun$3.apply(Preparator.scala:49)
>>     at scala.collection.TraversableLike$$anonfun$map$1.apply(
>> TraversableLike.scala:244)
>>     at scala.collection.TraversableLike$$anonfun$map$1.apply(
>> TraversableLike.scala:244)
>>     at scala.collection.immutable.List.foreach(List.scala:318)
>>     at scala.collection.TraversableLike$class.map(TraversableLike.
>> scala:244)
>>     at scala.collection.AbstractTraversable.map(Traversable.scala:105)
>>     at com.actionml.Preparator.prepare(Preparator.scala:49)
>>     at com.actionml.Preparator.prepare(Preparator.scala:32)
>>     at org.apache.predictionio.controller.PPreparator.prepareBase(
>> PPreparator.scala:37)
>>     at org.apache.predictionio.controller.Engine$.train(Engine.scala:671)
>>     at org.apache.predictionio.controller.Engine.train(Engine.scala:177)
>>     at org.apache.predictionio.workflow.CoreWorkflow$.runTrain(
>> CoreWorkflow.scala:67)
>>     at org.apache.predictionio.workflow.CreateWorkflow$.main(Create
>> Workflow.scala:250)
>>     at org.apache.predictionio.workflow.CreateWorkflow.main(CreateW
>> orkflow.scala)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAcce
>> ssorImpl.java:62)
>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMe
>> thodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:498)
>>     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy
>> $SparkSubmit$$runMain(SparkSubmit.scala:731)
>>     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit
>> .scala:181)
>>     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>>     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>
>> *===========================================================*
>> Thank you all for your help.
>>
>> Best regards,
>> noelia
>>
>>
>

Attachment: integration-test
Description: Binary data

Integration test for The Universal Recommender.


Checking status, should exit if pio is not running.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/spark/pio-data-hdfs-assembly-0.11.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/pio-assembly-0.11.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
[INFO] [Management$] Inspecting PredictionIO...
[INFO] [Management$] PredictionIO 0.11.0-incubating is installed at /home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating
[INFO] [Management$] Inspecting Apache Spark...
[INFO] [Management$] Apache Spark is installed at /home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/vendors/spark-1.6.3-bin-hadoop2.6
[INFO] [Management$] Apache Spark 1.6.3 detected (meets minimum requirement of 1.3.0)
[INFO] [Management$] Inspecting storage backend connections...
[INFO] [Storage$] Verifying Meta Data Backend (Source: ELASTICSEARCH)...
[INFO] [Storage$] Verifying Model Data Backend (Source: LOCALFS)...
[INFO] [Storage$] Verifying Event Data Backend (Source: HBASE)...
[INFO] [Storage$] Test writing to Event Store (App Id 0)...
[INFO] [HBLEvents] The table pio_event:events_0 doesn't exist yet. Creating now...
[INFO] [HBLEvents] Removing table pio_event:events_0...
[INFO] [Management$] Your system is all ready to go.

Checking to see if handmade app exists, should exit if not.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/spark/pio-data-hdfs-assembly-0.11.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/pio-assembly-0.11.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
[INFO] [Pio$]     App Name: TinyApp
[INFO] [Pio$]       App ID: 2
[INFO] [Pio$]  Description: 
[INFO] [Pio$]   Access Key: khM_SRwwRAHZJrGlnxuObT7nhEUPBkoPNPk115YddeZG5LOWHgamuYvvVKjcQlKk | (all)

Deleting TinyApp app data since importing data throws error otherwise
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/spark/pio-data-hdfs-assembly-0.11.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/pio-assembly-0.11.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
[INFO] [Pio$] Data of the following app (default channel only) will be deleted. Are you sure?
[INFO] [Pio$]     App Name: TinyApp
[INFO] [Pio$]       App ID: 2
[INFO] [Pio$]  Description: None
[INFO] [HBLEvents] Removing table pio_event:events_2...
[INFO] [App$] Removed Event Store for the app ID: 2
[INFO] [HBLEvents] The table pio_event:events_2 doesn't exist yet. Creating now...
[INFO] [App$] Initialized Event Store for the app ID: 2

Importing data for integration test
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/spark/pio-data-hdfs-assembly-0.11.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/pio-assembly-0.11.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Access key: khM_SRwwRAHZJrGlnxuObT7nhEUPBkoPNPk115YddeZG5LOWHgamuYvvVKjcQlKk
Namespace(access_key='khM_SRwwRAHZJrGlnxuObT7nhEUPBkoPNPk115YddeZG5LOWHgamuYvvVKjcQlKk', file='./data/tiny_app_data.csv', url='http://localhost:7070')
Importing data...
Event: view entity_id: "u1" target_entity_id: "i1"
Event: view entity_id: "u2" target_entity_id: "i1"
Event: view entity_id: "u2" target_entity_id: "i2"
Event: view entity_id: "u3" target_entity_id: "i2"
Event: view entity_id: "u3" target_entity_id: "i3"
Event: view entity_id: "u4" target_entity_id: "i4"
Event: view entity_id: "u1" target_entity_id: "i5"
Event: view entity_id: "u2" target_entity_id: "i5"
Event: view entity_id: "u2" target_entity_id: "i6"
Event: view entity_id: "u3" target_entity_id: "i6"
Event: view entity_id: "u3" target_entity_id: "i7"
Event: view entity_id: "u4" target_entity_id: "i8"
Event: view entity_id: "u1" target_entity_id: "i9"
Event: view entity_id: "u2" target_entity_id: "i9"
Event: view entity_id: "u2" target_entity_id: "i10"
Event: view entity_id: "u3" target_entity_id: "i10"
Event: view entity_id: "u3" target_entity_id: "i11"
Event: view entity_id: "u4" target_entity_id: "i12"
Event: view entity_id: "u1" target_entity_id: "i13"
Event: view entity_id: "u2" target_entity_id: "i13"
Event: view entity_id: "u2" target_entity_id: "i14"
Event: view entity_id: "u3" target_entity_id: "i14"
Event: view entity_id: "u3" target_entity_id: "i15"
Event: view entity_id: "u4" target_entity_id: "i16"
Event: view entity_id: "u1" target_entity_id: "i17"
Event: view entity_id: "u2" target_entity_id: "i17"
Event: view entity_id: "u2" target_entity_id: "i18"
Event: view entity_id: "u3" target_entity_id: "i18"
Event: view entity_id: "u3" target_entity_id: "i19"
Event: view entity_id: "u4" target_entity_id: "i20"
Event: view entity_id: "u1" target_entity_id: "i21"
Event: view entity_id: "u2" target_entity_id: "i21"
Event: view entity_id: "u2" target_entity_id: "i22"
Event: view entity_id: "u3" target_entity_id: "i22"
Event: view entity_id: "u3" target_entity_id: "i23"
Event: view entity_id: "u4" target_entity_id: "i24"
Event: view entity_id: "u1" target_entity_id: "i25"
Event: view entity_id: "u2" target_entity_id: "i25"
Event: view entity_id: "u2" target_entity_id: "i26"
Event: view entity_id: "u3" target_entity_id: "i26"
Event: view entity_id: "u3" target_entity_id: "i27"
Event: view entity_id: "u4" target_entity_id: "i28"
Event: view entity_id: "u1" target_entity_id: "i29"
Event: view entity_id: "u2" target_entity_id: "i29"
Event: view entity_id: "u2" target_entity_id: "i30"
Event: view entity_id: "u3" target_entity_id: "i30"
Event: view entity_id: "u3" target_entity_id: "i31"
Event: view entity_id: "u4" target_entity_id: "i32"
Event: view entity_id: "u1" target_entity_id: "i33"
Event: view entity_id: "u2" target_entity_id: "i33"
Event: view entity_id: "u2" target_entity_id: "i34"
Event: view entity_id: "u3" target_entity_id: "i34"
Event: view entity_id: "u3" target_entity_id: "i35"
Event: view entity_id: "u4" target_entity_id: "i36"
Event: view entity_id: "u1" target_entity_id: "i37"
Event: view entity_id: "u2" target_entity_id: "i37"
Event: view entity_id: "u2" target_entity_id: "i38"
Event: view entity_id: "u3" target_entity_id: "i38"
Event: view entity_id: "u3" target_entity_id: "i39"
Event: view entity_id: "u4" target_entity_id: "i40"
Event: view entity_id: "u1" target_entity_id: "i41"
Event: view entity_id: "u2" target_entity_id: "i41"
Event: view entity_id: "u2" target_entity_id: "i42"
Event: view entity_id: "u3" target_entity_id: "i42"
Event: view entity_id: "u3" target_entity_id: "i43"
Event: view entity_id: "u4" target_entity_id: "i44"
Event: view entity_id: "u1" target_entity_id: "i45"
Event: view entity_id: "u2" target_entity_id: "i45"
Event: view entity_id: "u2" target_entity_id: "i46"
Event: view entity_id: "u3" target_entity_id: "i46"
Event: view entity_id: "u3" target_entity_id: "i47"
Event: view entity_id: "u4" target_entity_id: "i48"
Event: view entity_id: "u1" target_entity_id: "i49"
Event: view entity_id: "u2" target_entity_id: "i49"
Event: view entity_id: "u2" target_entity_id: "i50"
Event: view entity_id: "u3" target_entity_id: "i50"
Event: view entity_id: "u3" target_entity_id: "i51"
Event: view entity_id: "u4" target_entity_id: "i52"
Event: view entity_id: "u1" target_entity_id: "i53"
Event: view entity_id: "u2" target_entity_id: "i53"
Event: view entity_id: "u2" target_entity_id: "i54"
Event: view entity_id: "u3" target_entity_id: "i54"
Event: view entity_id: "u3" target_entity_id: "i55"
Event: view entity_id: "u4" target_entity_id: "i56"
Event: view entity_id: "u1" target_entity_id: "i57"
Event: view entity_id: "u2" target_entity_id: "i57"
Event: view entity_id: "u2" target_entity_id: "i58"
Event: view entity_id: "u3" target_entity_id: "i58"
Event: view entity_id: "u3" target_entity_id: "i59"
Event: view entity_id: "u4" target_entity_id: "i60"
Event: view entity_id: "u1" target_entity_id: "i61"
Event: view entity_id: "u2" target_entity_id: "i61"
Event: view entity_id: "u2" target_entity_id: "i62"
Event: view entity_id: "u3" target_entity_id: "i62"
Event: view entity_id: "u3" target_entity_id: "i63"
Event: view entity_id: "u4" target_entity_id: "i64"
Event: view entity_id: "u1" target_entity_id: "i65"
Event: view entity_id: "u2" target_entity_id: "i65"
Event: view entity_id: "u2" target_entity_id: "i66"
Event: view entity_id: "u3" target_entity_id: "i66"
Event: view entity_id: "u3" target_entity_id: "i67"
Event: view entity_id: "u4" target_entity_id: "i68"
Event: view entity_id: "u1" target_entity_id: "i69"
Event: view entity_id: "u2" target_entity_id: "i69"
Event: view entity_id: "u2" target_entity_id: "i70"
Event: view entity_id: "u3" target_entity_id: "i70"
Event: view entity_id: "u3" target_entity_id: "i71"
Event: view entity_id: "u4" target_entity_id: "i72"
Event: view entity_id: "u1" target_entity_id: "i73"
Event: view entity_id: "u2" target_entity_id: "i73"
Event: view entity_id: "u2" target_entity_id: "i74"
Event: view entity_id: "u3" target_entity_id: "i74"
Event: view entity_id: "u3" target_entity_id: "i75"
Event: view entity_id: "u4" target_entity_id: "i76"
Event: view entity_id: "u1" target_entity_id: "i77"
Event: view entity_id: "u2" target_entity_id: "i77"
Event: view entity_id: "u2" target_entity_id: "i78"
Event: view entity_id: "u3" target_entity_id: "i78"
Event: view entity_id: "u3" target_entity_id: "i79"
Event: view entity_id: "u4" target_entity_id: "i80"
Event: view entity_id: "u1" target_entity_id: "i81"
Event: view entity_id: "u2" target_entity_id: "i81"
Event: view entity_id: "u2" target_entity_id: "i82"
Event: view entity_id: "u3" target_entity_id: "i82"
Event: view entity_id: "u3" target_entity_id: "i83"
Event: view entity_id: "u4" target_entity_id: "i84"
Event: view entity_id: "u1" target_entity_id: "i85"
Event: view entity_id: "u2" target_entity_id: "i85"
Event: view entity_id: "u2" target_entity_id: "i86"
Event: view entity_id: "u3" target_entity_id: "i86"
Event: view entity_id: "u3" target_entity_id: "i87"
Event: view entity_id: "u4" target_entity_id: "i88"
Event: view entity_id: "u1" target_entity_id: "i89"
Event: view entity_id: "u2" target_entity_id: "i89"
Event: view entity_id: "u2" target_entity_id: "i90"
Event: view entity_id: "u3" target_entity_id: "i90"
Event: view entity_id: "u3" target_entity_id: "i91"
Event: view entity_id: "u4" target_entity_id: "i92"
Event: view entity_id: "u1" target_entity_id: "i93"
Event: view entity_id: "u2" target_entity_id: "i93"
Event: view entity_id: "u2" target_entity_id: "i94"
Event: view entity_id: "u3" target_entity_id: "i94"
Event: view entity_id: "u3" target_entity_id: "i95"
Event: view entity_id: "u4" target_entity_id: "i96"
Event: view entity_id: "u1" target_entity_id: "i97"
Event: view entity_id: "u2" target_entity_id: "i97"
Event: view entity_id: "u2" target_entity_id: "i98"
Event: view entity_id: "u3" target_entity_id: "i98"
Event: view entity_id: "u3" target_entity_id: "i99"
Event: view entity_id: "u4" target_entity_id: "i100"
All users: set(['"u3"', '"u4"', '"u1"', '"u2"'])
All items: set(['"i63"', '"i97"', '"i52"', '"i13"', '"i72"', '"i8"', '"i44"', '"i33"', '"i20"', '"i1"', '"i87"', '"i38"', '"i90"', '"i14"', '"i62"', '"i43"', '"i73"', '"i53"', '"i30"', '"i98"', '"i23"', '"i86"', '"i91"', '"i15"', '"i78"', '"i65"', '"i42"', '"i39"', '"i54"', '"i70"', '"i25"', '"i99"', '"i22"', '"i85"', '"i3"', '"i16"', '"i79"', '"i84"', '"i64"', '"i41"', '"i92"', '"i55"', '"i71"', '"i24"', '"i31"', '"i49"', '"i2"', '"i83"', '"i93"', '"i17"', '"i76"', '"i67"', '"i40"', '"i27"', '"i56"', '"i36"', '"i5"', '"i100"', '"i48"', '"i82"', '"i10"', '"i77"', '"i66"', '"i47"', '"i57"', '"i37"', '"i4"', '"i94"', '"i61"', '"i11"', '"i74"', '"i29"', '"i81"', '"i50"', '"i46"', '"i69"', '"i18"', '"i26"', '"i7"', '"i34"', '"i89"', '"i58"', '"i95"', '"i60"', '"i28"', '"i80"', '"i51"', '"i12"', '"i75"', '"i9"', '"i68"', '"i45"', '"i32"', '"i19"', '"i21"', '"i35"', '"i88"', '"i59"', '"i6"', '"i96"'])
250 events are imported.

Building and deploying model
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/spark/pio-data-hdfs-assembly-0.11.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/pio-assembly-0.11.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
[WARN] [Template$] /home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/TinyApp/template.json does not exist. Template metadata will not be available. (This is safe to ignore if you are not working on a template.)
[INFO] [Engine$] Using command '/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/sbt/sbt' at /home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/TinyApp to build.
[INFO] [Engine$] If the path above is incorrect, this process will fail.
[INFO] [Engine$] Uber JAR disabled. Making sure lib/pio-assembly-0.11.0-incubating.jar is absent.
[INFO] [Engine$] Going to run: /home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/sbt/sbt  package assemblyPackageDependency in /home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/TinyApp
[INFO] [Engine$] Compilation finished successfully.
[INFO] [Engine$] Looking for an engine...
[INFO] [Engine$] Found universal-recommender-assembly-0.6.0-deps.jar
[INFO] [Engine$] Found universal-recommender_2.10-0.6.0.jar
[INFO] [Engine$] Build finished successfully.
[INFO] [Pio$] Your engine is ready for training.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/spark/pio-data-hdfs-assembly-0.11.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/pio-assembly-0.11.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
[WARN] [Template$] /home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/TinyApp/template.json does not exist. Template metadata will not be available. (This is safe to ignore if you are not working on a template.)
[WARN] [WorkflowUtils$] Environment variable POSTGRES_JDBC_DRIVER is pointing to a nonexistent file /home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/postgresql-42.1.4.jar. Ignoring.
[WARN] [WorkflowUtils$] Environment variable MYSQL_JDBC_DRIVER is pointing to a nonexistent file /home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/mysql-connector-java-5.1.41.jar. Ignoring.
[INFO] [Runner$] Submission command: /home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/vendors/spark-1.6.3-bin-hadoop2.6/bin/spark-submit --driver-memory 2g --executor-memory 2g --class org.apache.predictionio.workflow.CreateWorkflow --jars file:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/TinyApp/target/scala-2.10/universal-recommender-assembly-0.6.0-deps.jar,file:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/TinyApp/target/scala-2.10/universal-recommender_2.10-0.6.0.jar,file:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/spark/pio-data-localfs-assembly-0.11.0-incubating.jar,file:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/spark/pio-data-elasticsearch1-assembly-0.11.0-incubating.jar,file:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/spark/pio-data-hbase-assembly-0.11.0-incubating.jar,file:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/spark/pio-data-hdfs-assembly-0.11.0-incubating.jar,file:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/spark/pio-data-jdbc-assembly-0.11.0-incubating.jar --files file:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/conf/log4j.properties --driver-class-path /home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/conf:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/postgresql-42.1.4.jar:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/mysql-connector-java-5.1.41.jar --driver-java-options -Dpio.log.dir=/home file:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/pio-assembly-0.11.0-incubating.jar --engine-id com.actionml.RecommendationEngine --engine-version 2a8c7fa1a7a0e35ac1c8d1e3e4371c8663eedab2 --engine-variant file:/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/TinyApp/engine.json --verbosity 0 --json-extractor Both --env 
PIO_STORAGE_SOURCES_HBASE_TYPE=hbase,PIO_ENV_LOADED=1,PIO_STORAGE_REPOSITORIES_METADATA_NAME=pio_meta,PIO_FS_BASEDIR=/home/.pio_store,PIO_STORAGE_SOURCES_ELASTICSEARCH_HOSTS=localhost,PIO_STORAGE_SOURCES_HBASE_HOME=/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/vendors/hbase-1.2.6,PIO_HOME=/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating,PIO_FS_ENGINESDIR=/home/.pio_store/engines,PIO_STORAGE_SOURCES_LOCALFS_PATH=/home/.pio_store/models,PIO_STORAGE_SOURCES_PGSQL_URL=jdbc:postgresql://localhost/pio,PIO_STORAGE_SOURCES_ELASTICSEARCH_TYPE=elasticsearch,PIO_STORAGE_REPOSITORIES_METADATA_SOURCE=ELASTICSEARCH,PIO_STORAGE_REPOSITORIES_MODELDATA_SOURCE=LOCALFS,PIO_STORAGE_REPOSITORIES_EVENTDATA_NAME=pio_event,PIO_STORAGE_SOURCES_PGSQL_PASSWORD=pio,PIO_STORAGE_SOURCES_ELASTICSEARCH_CLUSTERNAME=antolaturES,PIO_STORAGE_SOURCES_ELASTICSEARCH_HOME=/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/vendors/elasticsearch-1.7.6,PIO_STORAGE_SOURCES_PGSQL_TYPE=jdbc,PIO_FS_TMPDIR=/home/.pio_store/tmp,PIO_STORAGE_SOURCES_PGSQL_USERNAME=pio,PIO_STORAGE_REPOSITORIES_MODELDATA_NAME=pio_model,PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=HBASE,PIO_CONF_DIR=/home/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/conf,PIO_STORAGE_SOURCES_ELASTICSEARCH_PORTS=9300,PIO_STORAGE_SOURCES_LOCALFS_TYPE=localfs
[INFO] [RecommendationEngine$] 

               _   _             __  __ _
     /\       | | (_)           |  \/  | |
    /  \   ___| |_ _  ___  _ __ | \  / | |
   / /\ \ / __| __| |/ _ \| '_ \| |\/| | |
  / ____ \ (__| |_| | (_) | | | | |  | | |____
 /_/    \_\___|\__|_|\___/|_| |_|_|  |_|______|


      
[INFO] [Engine] Extracting datasource params...
[INFO] [WorkflowUtils$] No 'name' is found. Default empty String will be used.
[INFO] [Engine] Datasource params: (,DataSourceParams(TinyApp,List(view),None,None))
[INFO] [Engine] Extracting preparator params...
[INFO] [Engine] Preparator params: (,Empty)
[INFO] [Engine] Extracting serving params...
[INFO] [Engine] Serving params: (,Empty)
[WARN] [Utils] Your hostname, CPU00265U resolves to a loopback address: 127.0.1.1; using 192.168.15.205 instead (on interface eth0)
[WARN] [Utils] Set SPARK_LOCAL_IP if you need to bind to another address
[INFO] [Remoting] Starting remoting
[INFO] [Remoting] Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.15.205:48794]
[INFO] [DataSource] 
╔════════════════════════════════════════════════════════════╗
║ Init DataSource                                            ║
║ ══════════════════════════════════════════════════════════ ║
║ App name                      TinyApp                      ║
║ Event window                  None                         ║
║ Event names                   List(view)                   ║
║ Min events per user           None                         ║
╚════════════════════════════════════════════════════════════╝

[INFO] [URAlgorithm] 
╔════════════════════════════════════════════════════════════╗
║ Init URAlgorithm                                           ║
║ ══════════════════════════════════════════════════════════ ║
║ App name                      TinyApp                      ║
║ ES index name                 urindex                      ║
║ ES type name                  items                        ║
║ RecsModel                     all                          ║
║ Event names                   List(view)                   ║
║ ══════════════════════════════════════════════════════════ ║
║ Random seed                   -248772398                   ║
║ MaxCorrelatorsPerEventType    50                           ║
║ MaxEventsPerEventType         500                          ║
║ BlacklistEvents               List(view)                   ║
║ ══════════════════════════════════════════════════════════ ║
║ User bias                     1.0                          ║
║ Item bias                     1.0                          ║
║ Max query events              100                          ║
║ Limit                         20                           ║
║ ══════════════════════════════════════════════════════════ ║
║ Rankings:                                                  ║
║ popular                       Some(popRank)                ║
╚════════════════════════════════════════════════════════════╝

[INFO] [Engine$] EngineWorkflow.train
[INFO] [Engine$] DataSource: com.actionml.DataSource@433ef204
[INFO] [Engine$] Preparator: com.actionml.Preparator@359ceb13
[INFO] [Engine$] AlgorithmList: List(com.actionml.URAlgorithm@71789580)
[INFO] [Engine$] Data sanity check is on.
[WARN] [TableInputFormatBase] Cannot resolve the host name for CPU00265U.vicomtech.es/127.0.1.1 because of javax.naming.NameNotFoundException: DNS name not found [response code 3]; remaining name '1.1.0.127.in-addr.arpa'
[INFO] [DataSource] Received events List(view)
[WARN] [TableInputFormatBase] Cannot resolve the host name for CPU00265U.vicomtech.es/127.0.1.1 because of javax.naming.NameNotFoundException: DNS name not found [response code 3]; remaining name '1.1.0.127.in-addr.arpa'
[INFO] [Engine$] com.actionml.TrainingData does not support data sanity check. Skipping check.
[INFO] [Preparator] EventName: view
[ERROR] [Executor] Exception in task 3.0 in stage 10.0 (TID 25)
[ERROR] [Executor] Exception in task 0.0 in stage 10.0 (TID 22)
[ERROR] [Executor] Exception in task 1.0 in stage 10.0 (TID 23)
[ERROR] [TaskSetManager] Task 3.0 in stage 10.0 (TID 25) had a not serializable result: org.apache.mahout.math.RandomAccessSparseVector
Serialization stack:
	- object not serializable (class: org.apache.mahout.math.RandomAccessSparseVector, value: {0:1.0,33:1.0,25:1.0,50:1.0,75:1.0,9:1.0,67:1.0,65:1.0,59:1.0,26:1.0,16:1.0,17:1.0,23:1.0,18:1.0,56:1.0,47:1.0,51:1.0,48:1.0,52:1.0,38:1.0,40:1.0,42:1.0,44:1.0,41:1.0,4:1.0,12:1.0,28:1.0,69:1.0,71:1.0,96:1.0,19:1.0,62:1.0,21:1.0,64:1.0,6:1.0,68:1.0,89:1.0,11:1.0,93:1.0,73:1.0,10:1.0,1:1.0,90:1.0,83:1.0,7:1.0,82:1.0,88:1.0,5:1.0,84:1.0,80:1.0})
	- field (class: scala.Tuple2, name: _2, type: class java.lang.Object)
	- object (class scala.Tuple2, (3,{0:1.0,33:1.0,25:1.0,50:1.0,75:1.0,9:1.0,67:1.0,65:1.0,59:1.0,26:1.0,16:1.0,17:1.0,23:1.0,18:1.0,56:1.0,47:1.0,51:1.0,48:1.0,52:1.0,38:1.0,40:1.0,42:1.0,44:1.0,41:1.0,4:1.0,12:1.0,28:1.0,69:1.0,71:1.0,96:1.0,19:1.0,62:1.0,21:1.0,64:1.0,6:1.0,68:1.0,89:1.0,11:1.0,93:1.0,73:1.0,10:1.0,1:1.0,90:1.0,83:1.0,7:1.0,82:1.0,88:1.0,5:1.0,84:1.0,80:1.0})); not retrying
[ERROR] [Executor] Exception in task 2.0 in stage 10.0 (TID 24)
[ERROR] [TaskSetManager] Task 0.0 in stage 10.0 (TID 22) had a not serializable result: org.apache.mahout.math.RandomAccessSparseVector
Serialization stack:
	- object not serializable (class: org.apache.mahout.math.RandomAccessSparseVector, value: {0:1.0,25:1.0,50:1.0,67:1.0,71:1.0,19:1.0,23:1.0,62:1.0,21:1.0,56:1.0,6:1.0,47:1.0,68:1.0,48:1.0,11:1.0,10:1.0,52:1.0,90:1.0,38:1.0,88:1.0,5:1.0,83:1.0,82:1.0,44:1.0,40:1.0})
	- field (class: scala.Tuple2, name: _2, type: class java.lang.Object)
	- object (class scala.Tuple2, (0,{0:1.0,25:1.0,50:1.0,67:1.0,71:1.0,19:1.0,23:1.0,62:1.0,21:1.0,56:1.0,6:1.0,47:1.0,68:1.0,48:1.0,11:1.0,10:1.0,52:1.0,90:1.0,38:1.0,88:1.0,5:1.0,83:1.0,82:1.0,44:1.0,40:1.0})); not retrying
[ERROR] [TaskSetManager] Task 1.0 in stage 10.0 (TID 23) had a not serializable result: org.apache.mahout.math.RandomAccessSparseVector
Serialization stack:
	- object not serializable (class: org.apache.mahout.math.RandomAccessSparseVector, value: {66:1.0,29:1.0,70:1.0,91:1.0,58:1.0,37:1.0,13:1.0,8:1.0,94:1.0,30:1.0,57:1.0,22:1.0,20:1.0,35:1.0,97:1.0,60:1.0,27:1.0,72:1.0,3:1.0,34:1.0,77:1.0,46:1.0,81:1.0,86:1.0,43:1.0})
	- field (class: scala.Tuple2, name: _2, type: class java.lang.Object)
	- object (class scala.Tuple2, (1,{66:1.0,29:1.0,70:1.0,91:1.0,58:1.0,37:1.0,13:1.0,8:1.0,94:1.0,30:1.0,57:1.0,22:1.0,20:1.0,35:1.0,97:1.0,60:1.0,27:1.0,72:1.0,3:1.0,34:1.0,77:1.0,46:1.0,81:1.0,86:1.0,43:1.0})); not retrying
[ERROR] [TaskSetManager] Task 2.0 in stage 10.0 (TID 24) had a not serializable result: org.apache.mahout.math.RandomAccessSparseVector
Serialization stack:
	- object not serializable (class: org.apache.mahout.math.RandomAccessSparseVector, value: {99:1.0,49:1.0,75:1.0,74:1.0,9:1.0,98:1.0,65:1.0,24:1.0,59:1.0,26:1.0,16:1.0,17:1.0,18:1.0,39:1.0,51:1.0,55:1.0,31:1.0,32:1.0,36:1.0,45:1.0,41:1.0,42:1.0,4:1.0,78:1.0,12:1.0,54:1.0,53:1.0,95:1.0,96:1.0,61:1.0,69:1.0,28:1.0,63:1.0,64:1.0,2:1.0,89:1.0,15:1.0,14:1.0,76:1.0,92:1.0,93:1.0,73:1.0,1:1.0,7:1.0,79:1.0,80:1.0,84:1.0,85:1.0,87:1.0,33:1.0})
	- field (class: scala.Tuple2, name: _2, type: class java.lang.Object)
	- object (class scala.Tuple2, (2,{99:1.0,49:1.0,75:1.0,74:1.0,9:1.0,98:1.0,65:1.0,24:1.0,59:1.0,26:1.0,16:1.0,17:1.0,18:1.0,39:1.0,51:1.0,55:1.0,31:1.0,32:1.0,36:1.0,45:1.0,41:1.0,42:1.0,4:1.0,78:1.0,12:1.0,54:1.0,53:1.0,95:1.0,96:1.0,61:1.0,69:1.0,28:1.0,63:1.0,64:1.0,2:1.0,89:1.0,15:1.0,14:1.0,76:1.0,92:1.0,93:1.0,73:1.0,1:1.0,7:1.0,79:1.0,80:1.0,84:1.0,85:1.0,87:1.0,33:1.0})); not retrying
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 3.0 in stage 10.0 (TID 25) had a not serializable result: org.apache.mahout.math.RandomAccessSparseVector
Serialization stack:
	- object not serializable (class: org.apache.mahout.math.RandomAccessSparseVector, value: {0:1.0,33:1.0,25:1.0,50:1.0,75:1.0,9:1.0,67:1.0,65:1.0,59:1.0,26:1.0,16:1.0,17:1.0,23:1.0,18:1.0,56:1.0,47:1.0,51:1.0,48:1.0,52:1.0,38:1.0,40:1.0,42:1.0,44:1.0,41:1.0,4:1.0,12:1.0,28:1.0,69:1.0,71:1.0,96:1.0,19:1.0,62:1.0,21:1.0,64:1.0,6:1.0,68:1.0,89:1.0,11:1.0,93:1.0,73:1.0,10:1.0,1:1.0,90:1.0,83:1.0,7:1.0,82:1.0,88:1.0,5:1.0,84:1.0,80:1.0})
	- field (class: scala.Tuple2, name: _2, type: class java.lang.Object)
	- object (class scala.Tuple2, (3,{0:1.0,33:1.0,25:1.0,50:1.0,75:1.0,9:1.0,67:1.0,65:1.0,59:1.0,26:1.0,16:1.0,17:1.0,23:1.0,18:1.0,56:1.0,47:1.0,51:1.0,48:1.0,52:1.0,38:1.0,40:1.0,42:1.0,44:1.0,41:1.0,4:1.0,12:1.0,28:1.0,69:1.0,71:1.0,96:1.0,19:1.0,62:1.0,21:1.0,64:1.0,6:1.0,68:1.0,89:1.0,11:1.0,93:1.0,73:1.0,10:1.0,1:1.0,90:1.0,83:1.0,7:1.0,82:1.0,88:1.0,5:1.0,84:1.0,80:1.0}))
	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1418)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1418)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
	at scala.Option.foreach(Option.scala:236)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:799)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1640)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1599)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1588)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1952)
	at org.apache.spark.rdd.RDD$$anonfun$fold$1.apply(RDD.scala:1088)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
	at org.apache.spark.rdd.RDD.fold(RDD.scala:1082)
	at org.apache.mahout.sparkbindings.drm.CheckpointedDrmSpark.computeNRow(CheckpointedDrmSpark.scala:188)
	at org.apache.mahout.sparkbindings.drm.CheckpointedDrmSpark.nrow$lzycompute(CheckpointedDrmSpark.scala:55)
	at org.apache.mahout.sparkbindings.drm.CheckpointedDrmSpark.nrow(CheckpointedDrmSpark.scala:55)
	at org.apache.mahout.sparkbindings.drm.CheckpointedDrmSpark.newRowCardinality(CheckpointedDrmSpark.scala:219)
	at com.actionml.IndexedDatasetSpark$.apply(Preparator.scala:213)
	at com.actionml.Preparator$$anonfun$3.apply(Preparator.scala:71)
	at com.actionml.Preparator$$anonfun$3.apply(Preparator.scala:49)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
	at scala.collection.immutable.List.foreach(List.scala:318)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
	at scala.collection.AbstractTraversable.map(Traversable.scala:105)
	at com.actionml.Preparator.prepare(Preparator.scala:49)
	at com.actionml.Preparator.prepare(Preparator.scala:32)
	at org.apache.predictionio.controller.PPreparator.prepareBase(PPreparator.scala:37)
	at org.apache.predictionio.controller.Engine$.train(Engine.scala:671)
	at org.apache.predictionio.controller.Engine.train(Engine.scala:177)
	at org.apache.predictionio.workflow.CoreWorkflow$.runTrain(CoreWorkflow.scala:67)
	at org.apache.predictionio.workflow.CreateWorkflow$.main(CreateWorkflow.scala:250)
	at org.apache.predictionio.workflow.CreateWorkflow.main(CreateWorkflow.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
"u1","i1"
"u2","i1"
"u2","i2"
"u3","i2"
"u3","i3"
"u4","i4"
"u1","i5"
"u2","i5"
"u2","i6"
"u3","i6"
"u3","i7"
"u4","i8"
"u1","i9"
"u2","i9"
"u2","i10"
"u3","i10"
"u3","i11"
"u4","i12"
"u1","i13"
"u2","i13"
"u2","i14"
"u3","i14"
"u3","i15"
"u4","i16"
"u1","i17"
"u2","i17"
"u2","i18"
"u3","i18"
"u3","i19"
"u4","i20"
"u1","i21"
"u2","i21"
"u2","i22"
"u3","i22"
"u3","i23"
"u4","i24"
"u1","i25"
"u2","i25"
"u2","i26"
"u3","i26"
"u3","i27"
"u4","i28"
"u1","i29"
"u2","i29"
"u2","i30"
"u3","i30"
"u3","i31"
"u4","i32"
"u1","i33"
"u2","i33"
"u2","i34"
"u3","i34"
"u3","i35"
"u4","i36"
"u1","i37"
"u2","i37"
"u2","i38"
"u3","i38"
"u3","i39"
"u4","i40"
"u1","i41"
"u2","i41"
"u2","i42"
"u3","i42"
"u3","i43"
"u4","i44"
"u1","i45"
"u2","i45"
"u2","i46"
"u3","i46"
"u3","i47"
"u4","i48"
"u1","i49"
"u2","i49"
"u2","i50"
"u3","i50"
"u3","i51"
"u4","i52"
"u1","i53"
"u2","i53"
"u2","i54"
"u3","i54"
"u3","i55"
"u4","i56"
"u1","i57"
"u2","i57"
"u2","i58"
"u3","i58"
"u3","i59"
"u4","i60"
"u1","i61"
"u2","i61"
"u2","i62"
"u3","i62"
"u3","i63"
"u4","i64"
"u1","i65"
"u2","i65"
"u2","i66"
"u3","i66"
"u3","i67"
"u4","i68"
"u1","i69"
"u2","i69"
"u2","i70"
"u3","i70"
"u3","i71"
"u4","i72"
"u1","i73"
"u2","i73"
"u2","i74"
"u3","i74"
"u3","i75"
"u4","i76"
"u1","i77"
"u2","i77"
"u2","i78"
"u3","i78"
"u3","i79"
"u4","i80"
"u1","i81"
"u2","i81"
"u2","i82"
"u3","i82"
"u3","i83"
"u4","i84"
"u1","i85"
"u2","i85"
"u2","i86"
"u3","i86"
"u3","i87"
"u4","i88"
"u1","i89"
"u2","i89"
"u2","i90"
"u3","i90"
"u3","i91"
"u4","i92"
"u1","i93"
"u2","i93"
"u2","i94"
"u3","i94"
"u3","i95"
"u4","i96"
"u1","i97"
"u2","i97"
"u2","i98"
"u3","i98"
"u3","i99"
"u4","i100"
