I'm sorry, I misunderstood. Oracle is not yet supported.
https://github.com/apache/predictionio/pull/387

On Mon, Oct 15, 2018 at 2:38 PM takako shimamoto <chiboch...@gmail.com> wrote:
>
> This is because Oracle internally converts empty strings to NULL values;
> Oracle simply won't let you insert an empty string. I think this was
> overlooked when Oracle support was added, and I'm not sure how to fix
> it at the moment.
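>
> A minimal sketch of the behaviour (demo_status is a throwaway table made
> up for illustration; the STATUS column of pio_meta_evaluationinstances is
> likewise declared NOT NULL, per the ORA-01400 message):
>
>     CREATE TABLE demo_status (status VARCHAR2(255) NOT NULL);
>
>     -- Oracle treats '' as NULL, so this fails with
>     -- ORA-01400: cannot insert NULL into ("..."."DEMO_STATUS"."STATUS"),
>     -- the same error pio eval hits when it writes '' for the status.
>     INSERT INTO demo_status (status) VALUES ('');
>
>     -- The equivalent insert succeeds on PostgreSQL or MySQL, where ''
>     -- is an ordinary non-NULL value.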
>
> On Sun, Oct 14, 2018 at 11:24 AM Praveen Prasannakumar
> <praveen2399wo...@gmail.com> wrote:
> >>
> >> Hello Team
> >
> > I am using the Classification template (Naive Bayes algorithm). I am able
> > to build, train, and get responses from the engine. I am using an Oracle
> > database as the event data store, and I am running into a SQL exception
> > while doing pio eval. Please help me.
> >
> > Command used:
> > pio eval org.example.classification.AccuracyEvaluation 
> > org.example.classification.EngineParamsList
> >
> > SQL exception:
> >
> >  [StatementExecutor$$anon$1] SQL execution failed (Reason: ORA-01400: 
> > cannot insert NULL into ("DMA"."PIO_META_EVALUATIONINSTANCES"."STATUS")
> >
> > ):
> >
> >
> >
> > Please find below the complete logs I got during pio eval.
> >
> >>
> >> Logs:
> >>
> >> bash-4.1$ pwd
> >>
> >> /proj/PredictionIO-0.12.0-incubating/MyClassification
> >>
> >> bash-4.1$ pio eval org.example.classification.AccuracyEvaluation 
> >> org.example.classification.EngineParamsList
> >>
> >> [WARN] [WorkflowUtils$] Environment variable POSTGRES_JDBC_DRIVER is 
> >> pointing to a nonexistent file 
> >> /proj/PredictionIO-0.12.0-incubating/lib/postgresql-42.0.0.jar. Ignoring.
> >>
> >> [WARN] [WorkflowUtils$] Environment variable MYSQL_JDBC_DRIVER is pointing 
> >> to a nonexistent file 
> >> /proj/PredictionIO-0.12.0-incubating/lib/mysql-connector-java-5.1.41.jar. 
> >> Ignoring.
> >>
> >> [INFO] [Runner$] Submission command: 
> >> /proj/PredictionIO-0.12.0-incubating/vendors/spark-2.1.1-bin-hadoop2.6/bin/spark-submit
> >>  --class org.apache.predictionio.workflow.CreateWorkflow --jars 
> >> file:/proj/PredictionIO-0.12.0-incubating/lib/ojdbc6_g.jar,file:/proj/PredictionIO-0.12.0-incubating/MyClassification/target/scala-2.11/template-scala-parallel-classification-assembly-0.1-SNAPSHOT-deps.jar,file:/proj/PredictionIO-0.12.0-incubating/MyClassification/target/scala-2.11/template-scala-parallel-classification_2.11-0.1-SNAPSHOT.jar,file:/proj/PredictionIO-0.12.0-incubating/lib/spark/pio-data-s3-assembly-0.12.0-incubating.jar,file:/proj/PredictionIO-0.12.0-incubating/lib/spark/pio-data-localfs-assembly-0.12.0-incubating.jar,file:/proj/PredictionIO-0.12.0-incubating/lib/spark/pio-data-hdfs-assembly-0.12.0-incubating.jar,file:/proj/PredictionIO-0.12.0-incubating/lib/spark/pio-data-hbase-assembly-0.12.0-incubating.jar,file:/proj/PredictionIO-0.12.0-incubating/lib/spark/pio-data-jdbc-assembly-0.12.0-incubating.jar,file:/proj/PredictionIO-0.12.0-incubating/lib/spark/pio-data-elasticsearch-assembly-0.12.0-incubating.jar
> >>  --files file:/proj/PredictionIO-0.12.0-incubating/conf/log4j.properties 
> >> --driver-class-path 
> >> /proj/PredictionIO-0.12.0-incubating/conf:/proj/PredictionIO-0.12.0-incubating/lib/postgresql-42.0.0.jar:/proj/PredictionIO-0.12.0-incubating/lib/mysql-connector-java-5.1.41.jar:/proj/PredictionIO-0.12.0-incubating/lib/ojdbc6_g.jar
> >>  --driver-java-options -Dpio.log.dir=/proj/PredictionIO-0.12.0-incubating 
> >> file:/proj/PredictionIO-0.12.0-incubating/lib/pio-assembly-0.12.0-incubating.jar
> >>  --engine-id org.example.classification.ClassificationEngine 
> >> --engine-version c8ba776f63d156ef316149846eeaf940d3c8caf8 --engine-variant 
> >> file:/proj/PredictionIO-0.12.0-incubating/MyClassification/engine.json 
> >> --verbosity 0 --evaluation-class 
> >> org.example.classification.AccuracyEvaluation 
> >> --engine-params-generator-class 
> >> org.example.classification.EngineParamsList --json-extractor Both --env 
> >> PIO_ENV_LOADED=1,PIO_STORAGE_SOURCES_ORACLE_TYPE=jdbc,PIO_STORAGE_REPOSITORIES_METADATA_NAME=pio_meta,PIO_FS_BASEDIR=/proj/PredictionIO-0.12.0-incubating/.pio_store,PIO_STORAGE_SOURCES_ORACLE_URL=jdbc:oracle:thin:@//b-6/DSSW,PIO_HOME=/proj/PredictionIO-0.12.0-incubating,PIO_FS_ENGINESDIR=/proj/PredictionIO-0.12.0-incubating/.pio_store/engines,PIO_STORAGE_REPOSITORIES_METADATA_SOURCE=ORACLE,PIO_STORAGE_REPOSITORIES_MODELDATA_SOURCE=ORACLE,PIO_STORAGE_REPOSITORIES_EVENTDATA_NAME=pio_event,PIO_STORAGE_SOURCES_ORACLE_PASSWORD=dma123,PIO_FS_TMPDIR=/proj/PredictionIO-0.12.0-incubating/.pio_store/tmp,PIO_STORAGE_REPOSITORIES_MODELDATA_NAME=pio_model,PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=ORACLE,PIO_CONF_DIR=/proj/PredictionIO-0.12.0-incubating/conf,PIO_STORAGE_SOURCES_ORACLE_USERNAME=
> >>
> >> [INFO] [CoreWorkflow$] runEvaluation started
> >>
> >> [INFO] [log] Logging initialized @20767ms
> >>
> >> [INFO] [Server] jetty-9.2.z-SNAPSHOT
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@7da31a40{/jobs,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@28ee7bee{/jobs/json,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@1b5a1d85{/jobs/job,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@31e130bf{/jobs/job/json,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@54755dd9{/stages,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@f1f7db2{/stages/json,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@4462efe1{/stages/stage,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@7c3e4b1a{/stages/stage/json,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@2db4ad1{/stages/pool,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@765d55d5{/stages/pool/json,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@2513a118{/storage,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@2bfb583b{/storage/json,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@73ae0257{/storage/rdd,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@6fc1020a{/storage/rdd/json,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@5762658b{/environment,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@2629d5dc{/environment/json,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@2596d7f4{/executors,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@42a0501e{/executors/json,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@6aa3bfc{/executors/threadDump,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@6e4599c0{/executors/threadDump/json,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@7dffda8b{/static,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@3d1f558a{/,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@6abdec0e{/api,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@28f4f300{/jobs/job/kill,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@2b5c4f17{/stages/stage/kill,null,AVAILABLE,@Spark}
> >>
> >> [INFO] [ServerConnector] Started Spark@50448409{HTTP/1.1}{0.0.0.0:4040}
> >>
> >> [INFO] [Server] Started @20949ms
> >>
> >> [INFO] [ContextHandler] Started 
> >> o.s.j.s.ServletContextHandler@62cba181{/metrics/json,null,AVAILABLE,@Spark}
> >>
> >> [ERROR] [StatementExecutor$$anon$1] SQL execution failed (Reason: 
> >> ORA-01400: cannot insert NULL into 
> >> ("DMA"."PIO_META_EVALUATIONINSTANCES"."STATUS")
> >>
> >> ):
> >>
> >>
> >>
> >>    INSERT INTO pio_meta_evaluationinstances VALUES( 
> >> '9cc6f545-195f-44b7-aac4-5c20e21851e0', '', '2018-10-12 15:42:05.657', 
> >> '2018-10-12 15:42:05.676', 
> >> 'org.example.classification.AccuracyEvaluation', 
> >> 'org.example.classification.EngineParamsList', '', 
> >> 'PIO_ENV_LOADED=1,PIO_STORAGE_SOURCES_ORACLE_TYPE=jdbc,PIO_STORAGE_REPOSITORIES_METADATA_NAME=pio_met...
> >>  (833)', 'spark.executor.extraClassPath=.', '', '', '')
> >>
> >>
> >>
> >> [INFO] [ServerConnector] Stopped Spark@50448409{HTTP/1.1}{0.0.0.0:4040}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@2b5c4f17{/stages/stage/kill,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@28f4f300{/jobs/job/kill,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@6abdec0e{/api,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@3d1f558a{/,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@7dffda8b{/static,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@6e4599c0{/executors/threadDump/json,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@6aa3bfc{/executors/threadDump,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@42a0501e{/executors/json,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@2596d7f4{/executors,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@2629d5dc{/environment/json,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@5762658b{/environment,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@6fc1020a{/storage/rdd/json,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@73ae0257{/storage/rdd,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@2bfb583b{/storage/json,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@2513a118{/storage,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@765d55d5{/stages/pool/json,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@2db4ad1{/stages/pool,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@7c3e4b1a{/stages/stage/json,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@4462efe1{/stages/stage,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@f1f7db2{/stages/json,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@54755dd9{/stages,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@31e130bf{/jobs/job/json,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@1b5a1d85{/jobs/job,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@28ee7bee{/jobs/json,null,UNAVAILABLE,@Spark}
> >>
> >> [INFO] [ContextHandler] Stopped 
> >> o.s.j.s.ServletContextHandler@7da31a40{/jobs,null,UNAVAILABLE,@Spark}
> >>
> >> Exception in thread "main" 
> >> java.sql.SQLIntegrityConstraintViolationException: ORA-01400: cannot 
> >> insert NULL into ("DMA"."PIO_META_EVALUATIONINSTANCES"."STATUS")
> >>
> >>
> >>
> >>         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:459)
> >>
> >>         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:400)
> >>
> >>         at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:999)
> >>
> >>         at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:539)
> >>
> >>         at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:235)
> >>
> >>         at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:543)
> >>
> >>         at 
> >> oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:239)
> >>
> >>         at 
> >> oracle.jdbc.driver.T4CPreparedStatement.executeForRows(T4CPreparedStatement.java:1448)
> >>
> >>         at 
> >> oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1764)
> >>
> >>         at 
> >> oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:4401)
> >>
> >>         at 
> >> oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java:4568)
> >>
> >>         at 
> >> oracle.jdbc.driver.OraclePreparedStatementWrapper.executeUpdate(OraclePreparedStatementWrapper.java:5579)
> >>
> >>         at 
> >> org.apache.commons.dbcp2.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:98)
> >>
> >>         at 
> >> org.apache.commons.dbcp2.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:98)
> >>
> >>         at 
> >> scalikejdbc.StatementExecutor$$anonfun$executeUpdate$1.apply$mcI$sp(StatementExecutor.scala:330)
> >>
> >>         at 
> >> scalikejdbc.StatementExecutor$$anonfun$executeUpdate$1.apply(StatementExecutor.scala:330)
> >>
> >>         at 
> >> scalikejdbc.StatementExecutor$$anonfun$executeUpdate$1.apply(StatementExecutor.scala:330)
> >>
> >>         at 
> >> scalikejdbc.StatementExecutor$NakedExecutor.apply(StatementExecutor.scala:18)
> >>
> >>         at 
> >> scalikejdbc.StatementExecutor$$anon$1.scalikejdbc$StatementExecutor$LoggingSQLAndTiming$$super$apply(StatementExecutor.scala:310)
> >>
> >>         at 
> >> scalikejdbc.StatementExecutor$LoggingSQLAndTiming$class.apply(StatementExecutor.scala:254)
> >>
> >>         at 
> >> scalikejdbc.StatementExecutor$$anon$1.scalikejdbc$StatementExecutor$LoggingSQLIfFailed$$super$apply(StatementExecutor.scala:310)
> >>
> >>         at 
> >> scalikejdbc.StatementExecutor$LoggingSQLIfFailed$class.apply(StatementExecutor.scala:287)
> >>
> >>         at 
> >> scalikejdbc.StatementExecutor$$anon$1.apply(StatementExecutor.scala:310)
> >>
> >>         at 
> >> scalikejdbc.StatementExecutor.executeUpdate(StatementExecutor.scala:330)
> >>
> >>         at 
> >> scalikejdbc.DBSession$$anonfun$updateWithFilters$1.apply(DBSession.scala:432)
> >>
> >>         at 
> >> scalikejdbc.DBSession$$anonfun$updateWithFilters$1.apply(DBSession.scala:430)
> >>
> >>         at scalikejdbc.LoanPattern$class.using(LoanPattern.scala:18)
> >>
> >>         at 
> >> scalikejdbc.ActiveSession.scalikejdbc$DBSession$$super$using(DBSession.scala:586)
> >>
> >>         at scalikejdbc.DBSession$class.using(DBSession.scala:30)
> >>
> >>         at scalikejdbc.ActiveSession.using(DBSession.scala:586)
> >>
> >>         at 
> >> scalikejdbc.DBSession$class.updateWithFilters(DBSession.scala:429)
> >>
> >>         at scalikejdbc.ActiveSession.updateWithFilters(DBSession.scala:586)
> >>
> >>         at 
> >> scalikejdbc.DBSession$class.updateWithFilters(DBSession.scala:407)
> >>
> >>         at scalikejdbc.ActiveSession.updateWithFilters(DBSession.scala:586)
> >>
> >>         at scalikejdbc.SQLUpdate.apply(SQL.scala:539)
> >>
> >>         at 
> >> org.apache.predictionio.data.storage.jdbc.JDBCEvaluationInstances$$anonfun$2.apply(JDBCEvaluationInstances.scala:82)
> >>
> >>         at 
> >> org.apache.predictionio.data.storage.jdbc.JDBCEvaluationInstances$$anonfun$2.apply(JDBCEvaluationInstances.scala:67)
> >>
> >>         at 
> >> scalikejdbc.DBConnection$$anonfun$3.apply(DBConnection.scala:305)
> >>
> >>         at 
> >> scalikejdbc.DBConnection$class.scalikejdbc$DBConnection$$rollbackIfThrowable(DBConnection.scala:274)
> >>
> >>         at scalikejdbc.DBConnection$class.localTx(DBConnection.scala:303)
> >>
> >>         at scalikejdbc.DB.localTx(DB.scala:60)
> >>
> >>         at scalikejdbc.DB$.localTx(DB.scala:257)
> >>
> >>         at 
> >> org.apache.predictionio.data.storage.jdbc.JDBCEvaluationInstances.insert(JDBCEvaluationInstances.scala:67)
> >>
> >>         at 
> >> org.apache.predictionio.workflow.CoreWorkflow$.runEvaluation(CoreWorkflow.scala:129)
> >>
> >>         at 
> >> org.apache.predictionio.workflow.Workflow$.runEvaluationViaCoreWorkflow(Workflow.scala:129)
> >>
> >>         at 
> >> org.apache.predictionio.workflow.Workflow$.runEvaluationTypeless(Workflow.scala:109)
> >>
> >>         at 
> >> org.apache.predictionio.workflow.Workflow$.runEvaluation(Workflow.scala:89)
> >>
> >>         at 
> >> org.apache.predictionio.workflow.CreateWorkflow$.main(CreateWorkflow.scala:274)
> >>
> >>         at 
> >> org.apache.predictionio.workflow.CreateWorkflow.main(CreateWorkflow.scala)
> >>
> >>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>
> >>         at 
> >> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >>
> >>         at 
> >> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >>
> >>         at java.lang.reflect.Method.invoke(Method.java:498)
> >>
> >>         at 
> >> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
> >>
> >>         at 
> >> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
> >>
> >>         at 
> >> org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
> >>
> >>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
> >>
> >>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> >>
> >>
> >>
> >>
