Hi Michael,

I'm fairly sure the Riak key annotation was recently enabled on the client side, but I'm not 100% sure of its usage yet (I haven't used it myself). In the meantime, for testing, you can annotate the property with @RiakKey inside your POJO; that should do it. Then just treat that property like any other property (generate its getter and setter):

public class Account {

    @RiakKey
    private String accountId;
    ...
    ...
    ...
}

HTH,

Guido.

On 11/11/13 22:06, Michael Guymon wrote:
So I had to add @JsonProperty("metadata") for the @RiakUsermeta to appear in the serialized json being processed by the Reduce phase. I have been using "ejsLog('/tmp/map_reduce.log', JSON.stringify(values));" to see what is being passed in.

One last question, the field with @RiakKey is always null. I added the @JsonProperty("id") and it is null in the serialized json as well. Is there a step I am missing in the store to populate the @RiakKey?

thanks,
Michael

On 11/11/2013 04:12 PM, Brian Roach wrote:
In your map function you simply add a qualifier to detect tombstones:

if (values[i].metadata['X-Riak-Deleted'] == 'true')
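Spelled out as a complete map function (a sketch: the function name is hypothetical, and it assumes Riak's standard JavaScript map signature, where each sibling lives in value.values):

```javascript
// Sketch of a map function that skips tombstones before collecting values.
function mapSkipTombstones(value, keyData, arg) {
  var out = [];
  var values = value.values;
  for (var i = 0; i < values.length; i++) {
    // Deleted objects carry the X-Riak-Deleted metadata entry set to 'true'.
    if (values[i].metadata['X-Riak-Deleted'] == 'true') {
      continue;
    }
    out.push(JSON.parse(values[i].data));
  }
  return out;
}
```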

- Roach

On Mon, Nov 11, 2013 at 1:59 PM, Michael Guymon
<[email protected]> wrote:
Ahh, yes, now that makes sense. I see with @RiakUsermeta or @RiakTombstone it is possible to filter the results of the MapReduce for tombstones. Is it
possible to add a phase to reduce the tombstones instead of manually
filtering the final result?

thanks,
Michael


On 11/11/2013 03:16 PM, Brian Roach wrote:
Michael -

You have something stored in that bucket that isn't the JSON you're
expecting when you run your second map/reduce. As I mentioned, there's
nothing special about how the Java client works; it just serializes
the POJO instance using Jackson.

My suggestion would be to use curl / your browser (or the Java client)
to see what that is: list the keys and check the contents.
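If curl isn't handy, the same inspection can be done from a map phase. A minimal diagnostic sketch (the function name is my own; it assumes the standard JS map signature):

```javascript
// Diagnostic sketch: return each object's key and raw stored data, unparsed,
// so you can see exactly what is in the bucket without assuming it is JSON.
function mapInspect(value, keyData, arg) {
  return [{ key: value.key, data: value.values[0].data }];
}
```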

I notice you're using the ".withoutFetch()" option when storing;
that's guaranteed to create a sibling if you have allow_multi=true set
in the bucket. If that's the case then that behavior is expected; both
versions are stored in Riak.

Also worth noting: if you've recently deleted something
(explicitly, via a delete operation) it's very likely a
tombstone will get passed to map/reduce. If you're doing explicit deletes from
Riak you need to check the object metadata for the
<<"X-Riak-Deleted">> header being true, and then ignore that object in
your map function.

- Roach


On Mon, Nov 11, 2013 at 12:46 PM, Michael Guymon
<[email protected]> wrote:
Hi Roach,

Thanks for taking a moment to give me a hand with this. Let me try and be
a bit more clear about what I am trying to figure out. My first step is a
class Account:

public class Account implements Serializable {
      private String email;
}

Storing the account via

myBucket.store("key", account).withoutFetch().execute();

then retrieving it with a map reduce using JS, along the lines of:

var accounts = [];
for (i = 0; i < values.length; i++) {
    if (values[i].email == '[email protected]') {
        accounts.push(values[i]);
    }
}
return accounts;

works as expected.
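For reference, the same filter wrapped as a complete, parameterized map function (a sketch; the function name and the use of arg to carry the target address are my additions, not from the thread):

```javascript
// Sketch: filter stored objects by email, with the target address passed as arg.
function mapByEmail(value, keyData, arg) {
  var data = JSON.parse(value.values[0].data);
  return data.email === arg ? [data] : [];
}
```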

Now I updated the Class Account to have the name property:

public class Account implements Serializable {
      private String name;
      private String email;
}

and storing it with data to the same bucket, for the same key. Attempting
to map/reduce on "name", I get a

{"phase":1,"error":"[{<<\"lineno\">>,1},{<<\"message\">>,<<\"TypeError: values[i].name is undefined\">>},{<<\"source\">>,<<\"unknown\">>}]","input":null,"type":null,"stack":null}.

If I change the bucket to a new one, the Map Reduce runs successfully
without the above error.
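One way to make the map phase tolerant of objects stored before the class gained "name" is to guard the field access (a sketch; the function name is hypothetical):

```javascript
// Sketch: skip objects whose stored JSON predates the "name" field,
// instead of throwing "values[i].name is undefined".
function mapByName(value, keyData, arg) {
  var data = JSON.parse(value.values[0].data);
  if (typeof data.name === 'undefined') {
    return []; // old-format object: no "name" field yet
  }
  return data.name === arg ? [data] : [];
}
```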

This is Riak 1.4.2 running on Ubuntu 13.04

thanks,
Michael


On 11/11/2013 02:32 PM, Brian Roach wrote:

Hi Michael,

I'm somewhat confused by your question; map/reduce doesn't really have
anything to do with your Java POJO/class.

When using the Riak Java client and storing a POJO, the default
converter (JSONConverter) uses the Jackson JSON library and converts
the instance of your POJO into a JSON string and stores it in Riak.

If you change that POJO class and store more things, the resulting
JSON is obviously going to be different (in your case having an
additional field named "minty").

When doing Map/Reduce, whatever JavaScript or Erlang functions you
provide are executing in Riak and being given the data stored in Riak
(the JSON you stored); they have no connection to Java.

Can you expand on "Now the map reduce fails for that the new
property" with what exactly the problem is? It sounds like you have a
problem with your JavaScript or Erlang function(s).

Thanks!
- Roach


On Mon, Nov 11, 2013 at 12:07 PM, Michael Guymon
<[email protected]> wrote:

Hello,

I have a (hopefully dumb) question about working with the Java client and
POJOs. I just started tinkering with Riak and have created a simple
Account POJO, happily crammed it into a bucket "test1", and map/reduced
it (hooray). The problem starts when I updated the Account class, adding
a new String property "minty". Now the map reduce fails for that the new
property in the bucket "test1". It seems like the POJO is always being
serialized to the format of the older Account class. If I create a new
bucket, "test2", and cram and reduce anew, everything works again.

I have been grepping around the docs, but have not been able to zero in
on my issue. Am I doing something boneheaded? Is it possible to update a
bucket to support a modified POJO class?

thanks,
Michael

_______________________________________________
riak-users mailing list
[email protected]
http://lists.basho.com/mailman/listinfo/riak-users_lists.basho.com





