Yes, a site revamp left some partials broken; thanks, they are fixed now.

HBase 1.2.4 and Spark 1.6.3 are what I use.

Please let me know if there are any other errors in that tutorial; it has just 
been edited, so I hope it's put back together correctly. BTW you can PR any 
changes here: https://github.com/actionml/docs.actionml.com

On Dec 22, 2016, at 1:50 AM, Bruno LEBON <[email protected]> wrote:

The link to HBase 1.2.3 is also dead, so I downloaded the latest stable version, 
which is 1.2.4. I hope it is fully compatible.

2016-12-22 10:27 GMT+01:00 Bruno LEBON <[email protected]>:
OK, thanks for your answer, I'll keep an eye out for the announcement. At this 
address, right? http://actionml.com/docs/pio_versions

Also, I am following the tutorial here: http://actionml.com/docs/small_ha_cluster
and there is a dead link under "Download Services On All Hosts - 1 Download" 
(the link for Spark's archive).
It is:
http://www.us.apache.org/dist/spark/spark-1.6.2/spark-1.6.3-bin-hadoop2.6.tgz 
I think it should be:
http://www.us.apache.org/dist/spark/spark-1.6.3/spark-1.6.3-bin-hadoop2.6.tgz 

Thanks for your time and help, much appreciated!
Bruno

2016-12-18 18:58 GMT+01:00 Pat Ferrel <[email protected]>:
There was a bug in this feature in the Apache PIO version that has been fixed 
in the SNAPSHOT. We will do a source tag to fix it before the next release. 
The page you reference is being changed now to advise an Apache PIO install as 
the root source of the project going forward. Keep an eye out here for the 
announcement.


On Dec 7, 2016, at 9:01 AM, Bruno LEBON <[email protected]> wrote:

Hi,

I am following the instructions here: http://actionml.com/docs/pio_versions
to add the ability to delete events that are too old to a template other than 
the Universal Recommender. I want to add it to the ecommerce recommendation 
template.

However, it doesn't seem to work.

I have PredictionIO 0.9.7-aml and the ecommerce-recommendation template from 
apache-incubator (version 0.4.0).

I get the following three errors:
[ERROR] [Console$] [error] /home/aml/incubator-predictionio-template-ecom-recommender/src/main/scala/DataSource.scala:21: not found: type SelfCleaningDataSource
[ERROR] [Console$] [error] with SelfCleaningDataSource {
[ERROR] [Console$] [error]      ^
[ERROR] [Console$] [error] /home/aml/incubator-predictionio-template-ecom-recommender/src/main/scala/DataSource.scala:26: value eventWindow is not a member of org.template.ecommercerecommendation.DataSourceParams
[ERROR] [Console$] [error]   override def eventWindow = dsp.eventWindow
[ERROR] [Console$] [error]                                  ^
[ERROR] [Console$] [error] /home/aml/incubator-predictionio-template-ecom-recommender/src/main/scala/DataSource.scala:31: not found: value cleanPersistedPEvents
[ERROR] [Console$] [error]     cleanPersistedPEvents(sc)
[ERROR] [Console$] [error]     ^
[ERROR] [Console$] [error] three errors found



The first is related to the fact that DataSourceParams has no eventWindow 
member declared, so I added it to the declaration of DataSourceParams, but 
which type should I give it? Is it an object? I don't know Scala, so I am not 
sure how to modify the code to make it work. In engine.json it refers to:
  "datasource": {
    "params" : {
      "name": "some-name",
      "appName": "autocleangites22",
      "eventNames": ["view"],
      "eventWindow": {
        "duration": "3650 days",
        "removeDuplicates": false,
        "compressProperties": false
      }
    }
so the code accesses eventWindow.duration.
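
For reference, a minimal sketch of the shape those changes take, modeled on 
how the Universal Recommender wires the trait in. This is an untested sketch, 
not the template's official code: the imports assume the Apache PIO SNAPSHOT 
layout (on 0.9.7-aml the same classes live under io.prediction.*), Query and 
TrainingData are the template's own classes, and the body of readTraining is 
elided.

import org.apache.predictionio.controller.{EmptyActualResult, EmptyEvaluationInfo, PDataSource, Params}
import org.apache.predictionio.core.SelfCleaningDataSource
import org.apache.predictionio.data.storage.EventWindow
import org.apache.spark.SparkContext

// eventWindow must be a field of the params case class: PIO deserializes the
// "eventWindow" block of engine.json into it, which fixes
// "value eventWindow is not a member of ...DataSourceParams".
case class DataSourceParams(
  appName: String,
  eventNames: List[String],
  eventWindow: Option[EventWindow] = None // no "eventWindow" block => no cleaning
) extends Params

class DataSource(val dsp: DataSourceParams)
  extends PDataSource[TrainingData, EmptyEvaluationInfo, Query, EmptyActualResult]
  with SelfCleaningDataSource { // mixing in the trait fixes "not found: type"

  // the two members the trait needs from the host data source
  override def appName: String = dsp.appName
  override def eventWindow: Option[EventWindow] = dsp.eventWindow

  override def readTraining(sc: SparkContext): TrainingData = {
    // prune and/or compact old events before reading training data; this is
    // the call that fails with "not found: value cleanPersistedPEvents" when
    // the trait is not mixed in
    cleanPersistedPEvents(sc)
    // ... the template's existing read logic stays here unchanged ...
    ???
  }
}

Note that if the PIO build on the classpath does not ship SelfCleaningDataSource 
at all, the first error above is expected no matter how the template is changed.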


The second error says that the trait (what would be called an interface in 
Java, at least) SelfCleaningDataSource is not found in PredictionIO. When I 
have a look at the code I do find the trait, under the @DeveloperApi annotation:
@DeveloperApi
trait SelfCleaningDataSource {

and here too (for the method generating the third error):
@DeveloperApi
  def cleanPersistedPEvents(sc: SparkContext): Unit ={

which makes me wonder whether there is a special way to compile the source 
code so that the code under this annotation gets included?
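
On the @DeveloperApi question: in PredictionIO, as in Spark, it appears to be 
purely a documentation marker, so annotated code is compiled and packaged like 
any other code and no special build step should be needed; a "not found" error 
rather suggests the trait is absent from the PIO build on the classpath. A 
hypothetical sketch of what such a marker annotation looks like (the package 
name is assumed):

package org.apache.predictionio.annotation

import scala.annotation.StaticAnnotation

// A marker annotation: it adds no behavior and does not gate compilation;
// it only labels an API as developer-facing and subject to change.
class DeveloperApi extends StaticAnnotation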

Thanks in advance for your help,
Bruno




