When doing JSON deserialization on Sequel model objects, I'm seeing some
strange effects. Namely, a deserialized object looks like you did
`ModelSubclass.new` and filled in column values without any checking
or casting being done on their type. For instance, accessing a date
column will return a date string instead of a proper date class
object. What's worse, these newly deserialized objects cannot be
modified and saved to persist the changes to the
database -- they throw an exception like "Sequel::DatabaseError:
Mysql::Error: Duplicate entry '1' for key 1". These objects return
true when '.new?' is called, and it appears to me that this value
causes Sequel to attempt an INSERT instead of an UPDATE when
'.save_changes' is called (or something similar), thereby causing a
primary key conflict. Try the sample code below in irb for further
clarification on these points.
I suppose what I'm really looking for is a clean way to have the
deserialize step ('JSON.parse(foo)') optionally do things like
typecasting of columns, updating to current values from the db, and
setting of model instance flags (at least @new and perhaps
@modified)... This could be done with options like 'JSON.parse(foo,
{:typecast=>true, :autoRefresh=>false, :assumeNew=>true})'. I presume
that it would be best to do typecasting by default, and NOT do
the .refresh by default, but I'm not sure what would be best (in
general) for the @new and @modified flags.
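In the meantime, something along these lines could live in application code. This is only a sketch of the proposed options, not existing Sequel API -- the `ModelJSON.fixup` name and its option handling are my invention; it just bundles the post-parse fixups described in this thread, taking the model instance that `JSON.parse` already produced:

```ruby
require 'json'

# Hypothetical helper -- NOT existing Sequel API. Centralizes the
# post-parse fixups behind the proposed options; pass it the model
# instance that JSON.parse already produced.
module ModelJSON
  DEFAULTS = {:typecast => true, :autoRefresh => false, :assumeNew => false}

  def self.fixup(obj, opts = {})
    opts = DEFAULTS.merge(opts)
    if opts[:autoRefresh]
      obj.refresh                          # re-reads (and typecasts) from the db
    elsif opts[:typecast]
      obj.keys.each{|k| obj[k] = obj[k] }  # self-assignment triggers typecasting
    end
    obj.instance_eval do
      @new = opts[:assumeNew]
      @modified = opts[:assumeNew]
      @changed_columns = []
    end
    obj
  end
end

# usage (given the setup below):
#   p2 = ModelJSON.fixup(JSON.parse(p1.to_json))
```

Reaching into @new/@modified/@changed_columns via instance_eval is of course still as dirty as the one-liners below; the point is only that it happens in one place.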
What I'm doing now looks like this:
deserializedModelObject = JSON.parse(serializedString).tap{|o|
  o.keys.each{|k| o[k] = o[k] }
}.instance_eval{@new = @modified = false; @changed_columns = []; self}
I would much rather NOT do this myself for every deserialization...
especially because in my desired usage, deserialized Sequel model
objects are buried inside other non-Sequel-model objects.
################
# set up example
################
require 'sequel'
require 'json'
require 'time' # for Time.parse below
Sequel.mysql(@db_name, :user => @db_user, :password => @db_password, :host => @db_host)
Sequel::Model.plugin(:schema)
Sequel::Model.plugin(:json_serializer)
class Ptime < Sequel::Model
  set_schema do
    primary_key :id, :type=>Bignum, :unsigned=>true, :auto_increment=>true
    DateTime :begin, :null=>false
    DateTime :finish
  end
  create_table unless table_exists?
end
Ptime.create(:begin=>Time.parse("1970-01-01 00:00:00"))
################
# retrieve entry from db
################
p1=Ptime.all.first
p1.begin.class # => Time
p1.begin # => 1970-01-01 00:00:00 -0500
p1.new? # => false
p1.modified? # => false
################
# retrieve same entry and perform serialization/deserialization steps
# note:
# - the datetime field is not properly typecast
# - '.new?' shows true, though this essentially matches an existing record in the db
# - '.modified?' shows true, though '.changed_columns' shows none changed!
# - double-equals shows this object matches the original, triple-equals doesn't
#   (because of the lack of typecasting on the 'begin' field, this may be
#   type-specific (datetime in this case; varchar/strings pass this fine))
################
p2=JSON.parse(p1.to_json)
p2.begin.class # => String
p2.begin # => "1970-01-01 00:00:00 -0500"
p2.new? # => true
p2.modified? # => true
p2.changed_columns # => []
p1==p2 # => false
p1===p2 # => true
################
# trigger typecast by assigning column to itself
# note:
# - triple-equals now shows equivalence
# - '.changed_columns' now (rightfully?) shows modified column
################
p2.begin=p2.begin
p2.begin.class # => Time
p2.begin # => 1970-01-01 00:00:00 -0500
p2.new? # => true
p2.modified? # => true
p2.changed_columns # => [:begin]
p1==p2 # => true
p1===p2 # => true
################
# '.refresh' does the typecast as well, but keeps '.changed_columns' empty
# (still shows true for '.modified?'). caveat: '.refresh' retrieves values
# from the db; this may or may not be what your application expects...
################
p3=JSON.parse(p1.to_json)
p3.refresh
p3.begin.class # => Time
p3.begin # => 1970-01-01 00:00:00 -0500
p3.new? # => true
p3.modified? # => true
p3.changed_columns # => []
p1==p3 # => true
p1===p3 # => true
################
# regardless, either of these fails with
# pk uniqueness errors when persisting changes back to the db:
################
p2.finish = Time.parse("2012-12-21 11:11:00 UTC")
p2.save_changes # => Sequel::DatabaseError: Mysql::Error: Duplicate entry '1' for key 1 ...
p3.update(:finish => "2012-12-21 11:11:00 UTC") # => Sequel::DatabaseError: Mysql::Error: Duplicate entry '1' for key 1 ...
################
# however, setting the instance member '@new' to false
# fixes the constraint problem on persisting changes:
# (remember, we are making all these changes on the actual data
# stored in the dbms AND on the same row in the ptimes table,
# so you might want to follow all these commands with a
# 'select * from ptimes;' in a mysql (or equivalent) console)
################
p2.instance_eval{@new = false}
p2.finish = Time.parse("2012-12-21 11:11:00 UTC")
p2.save_changes # => success
p3.instance_eval{@new = false}
p3.update(:finish => "2000-01-01 00:00:00 UTC") # => success
################
# so, I've resigned myself to doing stuff like this after a
# deserialize to make my objects behave better, though it makes me
# feel dirty (note that the '.refresh' step not only performs typecasting,
# but updates all columns to their current values in the db, which is
# fine for my current needs, but may not always be desirable; the '.tap'
# step performs typecasting but preserves values, though it's 1.9 only)
################
p5 = JSON.parse(p1.to_json).refresh.instance_eval{@new = @modified = false; self}
p6 = JSON.parse(p1.to_json).tap{|o|
  o.keys.each{|k| o[k] = o[k] }
}.instance_eval{@new = @modified = false; @changed_columns = []; self}
--
You received this message because you are subscribed to the Google Groups
"sequel-talk" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to
[email protected].
For more options, visit this group at
http://groups.google.com/group/sequel-talk?hl=en.