The plain error is like this:

[ERROR] [Storage$] Error initializing storage client for source LOCALFS
Exception in thread "main" org.apache.predictionio.data.storage.StorageClientException: Data source LOCALFS was not properly initialized.
        at org.apache.predictionio.data.storage.Storage$$anonfun$10.apply(Storage.scala:282)
        at org.apache.predictionio.data.storage.Storage$$anonfun$10.apply(Storage.scala:282)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.predictionio.data.storage.Storage$.getDataObject(Storage.scala:281)
        at org.apache.predictionio.data.storage.Storage$.getDataObjectFromRepo(Storage.scala:266)
        at org.apache.predictionio.data.storage.Storage$.getModelDataModels(Storage.scala:382)
        at org.apache.predictionio.workflow.CoreWorkflow$.runTrain(CoreWorkflow.scala:79)
        at org.apache.predictionio.workflow.CreateWorkflow$.main(CreateWorkflow.scala:250)
        at org.apache.predictionio.workflow.CreateWorkflow.main(CreateWorkflow.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Without doing sudo, LOCALFS doesn't initialize correctly, and I don't
know how else I could run it. Thanks
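
For reference, a minimal sketch of the LOCALFS-related entries in pio-env.sh,
assuming the defaults from the install guide (the exact path may differ on your
setup):

    PIO_STORAGE_REPOSITORIES_MODELDATA_NAME=pio_model
    PIO_STORAGE_REPOSITORIES_MODELDATA_SOURCE=LOCALFS
    PIO_STORAGE_SOURCES_LOCALFS_TYPE=localfs
    PIO_STORAGE_SOURCES_LOCALFS_PATH=$PIO_FS_BASEDIR/models

If the directory behind PIO_STORAGE_SOURCES_LOCALFS_PATH ended up owned by root,
pio train fails without sudo; changing its ownership back to your own user is the
usual alternative to running everything as root.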

On Sun, Mar 26, 2017 at 11:56 PM, Vaghawan Ojha <[email protected]>
wrote:

> Where would I set JAVA_OPTS? Is it in pio-env.sh?
>
> On Sun, Mar 26, 2017 at 11:49 PM, Marius Rabenarivo <
> [email protected]> wrote:
>
>> You have to add pass-through parameters to the pio train command
>>
>> pio train -- --executor-memory 4g --driver-memory 4g
>>
>> and set the JAVA_OPTS="-Xmx4g" environment variable
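>>
>> As a sketch (assuming roughly 4 GB of memory is actually available), run from
>> the engine template directory:
>>
>> export JAVA_OPTS="-Xmx4g"
>> pio train -- --executor-memory 4g --driver-memory 4g
>>
>> Exporting JAVA_OPTS in the shell before running pio train, or putting it in
>> pio-env.sh (which the pio script sources), are the usual places for it.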
>>
>> 2017-03-26 21:37 GMT+04:00 Vaghawan Ojha <[email protected]>:
>>
>>> Hi,
>>> Thanks, but the error was because I was not inside the template dir while
>>> running pio build. It built successfully now, but it seems that at every step
>>> there are some strange errors waiting for me. Now it actually fails at
>>> training. Can you suggest anything from the train log?
>>> I'm sorry, but these errors are really hard to work out unless I ask for help.
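>>>
>>> For anyone hitting the same build error, a sketch of what worked here (the
>>> directory name is just the example from the recommendation quickstart):
>>>
>>> cd MyRecommendation   # the engine template directory
>>> pio build --verbose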
>>>
>>> Thank you very much
>>>
>>> On Sun, Mar 26, 2017 at 10:00 PM, Marius Rabenarivo <
>>> [email protected]> wrote:
>>>
>>>> Hi,
>>>>
>>>> The error is :
>>>>
>>>> [ERROR] [Storage$] Error initializing storage client for source PGSQL
>>>>
>>>> I think you need to change it to HBASE if you want to use HBase
>>>>
>>>> PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=PGSQL
>>>> ->
>>>> PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=HBASE
>>>>
>>>> in your pio-env.sh
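>>>>
>>>> A sketch of the related source entries in pio-env.sh (the HBase version in
>>>> the path is only an example; point it at whatever you installed):
>>>>
>>>> PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=HBASE
>>>> PIO_STORAGE_SOURCES_HBASE_TYPE=hbase
>>>> PIO_STORAGE_SOURCES_HBASE_HOME=$PIO_HOME/vendors/hbase-1.2.6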
>>>>
>>>> And start HBase beforehand if you are not using the pio-start-all script.
>>>>
>>>> If you want to use PostgreSQL, pio-start-all attempts to start it too.
>>>>
>>>> 2017-03-26 19:29 GMT+04:00 Vaghawan Ojha <[email protected]>:
>>>>
>>>>> I followed the manual install procedure; everything was fine until
>>>>> I stumbled at pio build.
>>>>>
>>>>> I have a directory something like this: /abc/pio0.0.10/pio, and inside
>>>>> that another dir pio, so in total it would be like:
>>>>> /abc/pio0.0.10/pio /
>>>>>
>>>>> Where do I actually run pio build? Inside /abc/pio0.0.10 or
>>>>> /abc/pio0.0.10/pio / ?
>>>>>
>>>>> I don't know, but I get some weird errors which I can't properly
>>>>> diagnose. I've attached my log file here. I followed the engine
>>>>> template quickstart here:
>>>>> http://predictionio.incubator.apache.org/templates/recommendation/quickstart/
>>>>>
>>>>
>>>
>>
>
