So I made a change which allows passing a map of deserializers instead
of having them initialized globally. Otherwise it's more or less the
same as the previous version; let me know if you feel it makes the
reader too complex.
http://gist.github.com/565764
Here's sample usage with that approach.
Sorry, I can't accept any patch that modifies behavior globally. What
happens when two different libraries try to parse JSON with different
deserializers?
The only thing I would consider is a function that is passed into
read-json and invoked in read-json-object. But even that seems like adding
You can already extend the Write-JSON protocol to any type. But it
doesn't work in reverse. JSON has no standardized way to express types
beyond Object/Array/String/Number, so any deserialization will always
be application-specific.
-S
On Sep 3, 8:58 am, Baishampayan Ghose b.gh...@gmail.com
The problem I was trying to avoid is having to do a second pass over
the data after it comes out of the parser, it's more expensive and
it's also ugly for nested data structures. Would using defonce- and
defmacro- from clojure-contrib address the problem with namespace
collisions?
On Sep 3, 12:01
No. I'm talking about collisions when multiple deserialization
functions are added from different sources. It cannot be a global
setting.
-S
On Sep 3, 1:28 pm, Dmitri dmitri.sotni...@gmail.com wrote:
The problem I was trying to avoid is having to do a second pass over
the data after it comes
That's a very good point; I can't think of a good way to address that
off the top of my head. I agree that passing in a function isn't
really great either.
On Sep 3, 3:17 pm, Stuart Sierra the.stuart.sie...@gmail.com wrote:
No. I'm talking about collisions when multiple deserialization
functions
I added the *deserializers* atom and converted read-json-object to a
macro:
(def *deserializers* (atom {}))

(defn add-deserializer [k deserializer]
  (swap! *deserializers* assoc k deserializer))

(defn remove-deserializer [k]
  (swap! *deserializers* dissoc k))

(defmacro
I posted the complete file on github here http://gist.github.com/549771
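For reference, usage of that registry might look like the following sketch. The "date" tag and the ISO date format are my assumptions for illustration, not something defined in the gist:

```clojure
;; The registry from the message above, reproduced for context.
(def *deserializers* (atom {}))

(defn add-deserializer [k deserializer]
  (swap! *deserializers* assoc k deserializer))

(defn remove-deserializer [k]
  (swap! *deserializers* dissoc k))

;; Hypothetical usage: register a parser under a "date" tag that an
;; extended reader could look up when it encounters that key.
(add-deserializer "date"
  (fn [s] (.parse (java.text.SimpleDateFormat. "yyyy-MM-dd") s)))

;; Looking the function up and applying it yields a java.util.Date.
((@*deserializers* "date") "2010-09-03")
```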
--
You received this message because you are subscribed to the Google
Groups Clojure group.
To post to this group, send email to clojure@googlegroups.com
Note that posts from new members are moderated - please be patient
Thanks, I'll give it a try.
On Aug 25, 12:00 pm, Dmitri dmitri.sotni...@gmail.com wrote:
I posted the complete file on github here: http://gist.github.com/549771
On Aug 23, 9:03 pm, Dmitri dmitri.sotni...@gmail.com wrote:
Would there be an issue with adding something like that to contrib?
I don't want to add anything that impacts performance in the plain
parsing case.
-S
I understand the desire to keep the parser clean, but at the same time
the ability to register custom data deserializers would be very
convenient. Would something like the following help with the
performance issue? If no deserializers were registered, there would
only be a one-time penalty.
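A sketch of that guard, assuming the *deserializers* atom proposed earlier in the thread; the fast path pays only an emptiness check, and apply-deserializers is a hypothetical post-processing step, not a function in contrib:

```clojure
(def *deserializers* (atom {}))

(defn apply-deserializers
  "Post-process one parsed JSON object. When no deserializers are
  registered, the only cost is a single emptiness check on the atom;
  otherwise each key is looked up in the registry and its value is
  transformed if a deserializer is found."
  [m]
  (if (empty? @*deserializers*)
    m
    (into {}
          (map (fn [[k v]]
                 (if-let [f (get @*deserializers* k)]
                   [k (f v)]
                   [k v]))
               m))))
```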
Transforming the data after it comes out of the parser can be
cumbersome with complex data structures, though; it would be nice to
have a way for the parser to return the data in the desired format.
I updated clojure.contrib.json with the ability to add custom
deserializers:
(def *deserializers*
I suppose one could override the (private) read-json-object function
to transform maps after they are read, based on the presence of
certain keys. But that would seriously complicate the reader. It's
probably easier to transform the data after it comes back from the
JSON parser.
-S
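The post-parse transformation suggested here might look like the following sketch using clojure.walk/postwalk. The "timestamp" key and date format are hypothetical conventions chosen for illustration; postwalk visits nested maps too, which is what makes this workable for the complex structures mentioned above:

```clojure
(require '[clojure.walk :as walk])

;; Hypothetical convention: any map value under the key "timestamp"
;; is an ISO-style date string to be revived as a java.util.Date.
(defn parse-date [s]
  (.parse (java.text.SimpleDateFormat. "yyyy-MM-dd") s))

(defn revive-dates
  "Walk the parsed JSON structure bottom-up and convert every
  \"timestamp\" value into a Date, however deeply nested."
  [parsed]
  (walk/postwalk
    (fn [x]
      (if (and (map? x) (contains? x "timestamp"))
        (assoc x "timestamp" (parse-date (get x "timestamp")))
        x))
    parsed))
```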
On Aug 20,
I'm currently using Dan Larkin's clojure-json, and it provides a way
to serialize and deserialize dates; it also provides the option to
specify custom serializers, e.g.:
(defn date-encoder
  [date writer pad current-indent start-token-indent indent-size]
  (.append writer (str
Extending the writer is pretty trivial:

(defn write-date [date]
  (.format (new java.text.SimpleDateFormat "MMM dd, hh:mm:ss a") date))

(extend Date Write-JSON
  {:write-json write-date})
but it seems like deserializing a date wouldn't be quite so trivial.
Since there is no standard for how to represent dates in JSON, it is
unlikely to be built in. But you can extend the writer with
application-specific date formats.
-S
On Aug 20, 2:15 pm, Dmitri dmitri.sotni...@gmail.com wrote:
I'm currently using Dan Larkin's clojure-json, and it provides a
My concern is more to do with the reader. I think extending the writer
works quite well; it would be nice if it was possible to do the same
thing with the reader, so you could specify how to deserialize
specific types of data. Right now it seems to be baked into
read-json-reader and there's no easy