Hi,
My MLEngine uses PAlgorithm for training and predicting the model. It requires
saving the model by extending IPersistentModel and IPersistentModelLoader.
I followed these steps and am saving the model as shown below:

def save(id: String, params: AlgorithmParams, sc: SparkContext): Boolean = {
  ldaModel.save(sc, s"/tmp/${id}/ldamodel")
  corpus.saveAsObjectFile(s"/tmp/${id}/ldaCorpus")
  sc.parallelize(Seq(vocab)).saveAsObjectFile(s"/tmp/${id}/ldaVocab")
  true
}
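
For context, the class around that method roughly follows the usual IPersistentModel / IPersistentModelLoader pattern; this is a simplified sketch, and the class name, constructor fields, and member types here are illustrative rather than my exact code:

import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD
import org.apache.spark.mllib.clustering.DistributedLDAModel
import org.apache.spark.mllib.linalg.Vector
import io.prediction.controller.{IPersistentModel, IPersistentModelLoader}

// Model wrapper: save() persists each part of the model under /tmp/<id>/.
class LDAModelWrapper(
    val ldaModel: DistributedLDAModel,
    val corpus: RDD[(Long, Vector)],
    val vocab: Map[String, Int]
  ) extends IPersistentModel[AlgorithmParams] {

  def save(id: String, params: AlgorithmParams, sc: SparkContext): Boolean = {
    ldaModel.save(sc, s"/tmp/${id}/ldamodel")
    corpus.saveAsObjectFile(s"/tmp/${id}/ldaCorpus")
    sc.parallelize(Seq(vocab)).saveAsObjectFile(s"/tmp/${id}/ldaVocab")
    true
  }
}

// Companion loader: reads the three parts back for the predict phase.
object LDAModelWrapper
  extends IPersistentModelLoader[AlgorithmParams, LDAModelWrapper] {

  def apply(id: String, params: AlgorithmParams,
            sc: Option[SparkContext]): LDAModelWrapper = {
    new LDAModelWrapper(
      DistributedLDAModel.load(sc.get, s"/tmp/${id}/ldamodel"),
      sc.get.objectFile[(Long, Vector)](s"/tmp/${id}/ldaCorpus"),
      sc.get.objectFile[Map[String, Int]](s"/tmp/${id}/ldaVocab").first()
    )
  }
}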

Training completes properly, but while saving the model it throws the exception below:

Log:

...

[INFO] [Engine$] EngineWorkflow.train completed
[INFO] [Engine] engineInstanceId=afcabdee-9f1b-4454-8e80-e20185673769

[ERROR] [InsertIntoHadoopFsRelation] Aborting job.
[ERROR] [DefaultWriterContainer] Job job_201610061606_0000 aborted.
Exception in thread "main" org.apache.spark.SparkException: Job aborted.

...

Caused by: org.apache.spark.SparkException: Job aborted due to stage
failure: Task serialization failed: java.lang.StackOverflowError

Please suggest how I can resolve this issue.

Thank you

Regards,
Bansari
