On Thu, Mar 29, 2012 at 10:04 AM, Andrew Dunstan <and...@dunslane.net> wrote:
> 1. I've been in discussion with some people about adding simple JSON
> extract functions. We already have some (i.e. xpath()) for XML.
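For context, this is the kind of extraction we already get on the XML side with the existing xpath() function, which takes an XPath expression and an xml value and returns an xml[] of matching nodes:

```sql
-- Existing XML extraction in PostgreSQL: xpath(text, xml) -> xml[]
SELECT xpath('/order/item/text()',
             '<order><item>widget</item><item>sprocket</item></order>'::xml);
-- returns an xml[] containing the two text nodes
```

A hypothetical jsonpath-style function would presumably play the same role for json values.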
I've built a couple of applications that push data in and out of xml via manual composition going out and xpath coming in. TBH, I found this to be a pretty tedious way of developing a general application structure and a couple of notches down from the more sql driven approach. Not that jsonpath/xpath aren't wonderful functions -- but I think for general information passing there's a better way.

Your json work is a great start in marrying document level database features with a relational backend. My take is that storing rich data inside the database in json format, while tempting, is generally a mistake. Unless the document is a black box, it should be decomposed and stored relationally and marked back up into a document as it goes out the door. This is why brevity and flexibility of syntax is so important when marshaling data in and out of transport formats. It encourages people to take the right path and get the best of both worlds -- a rich backend with strong constraints that can natively speak json, so that writing data driven web services is easy.

What I'm saying is that jsonpath probably isn't the whole story: another way of bulk moving json into native backend structures without parsing would also be very helpful. For example, being able to cast a json document into a record or a record array would be just amazing.

merlin

-- 
Sent via pgsql-hackers mailing list (pgsql-hackers@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-hackers
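To make the last idea concrete, here's a sketch of what such a json-to-record facility might look like. None of this syntax exists today; the cast, the composite type `person`, and the helper `json_to_recordset` are all hypothetical illustrations of the proposal:

```sql
-- Hypothetical: a composite type matched field-by-field against json keys
CREATE TYPE person AS (id int, name text);

-- Hypothetical: cast a single json object straight into a typed record,
-- with no application-side parsing
SELECT ('{"id": 1, "name": "fred"}'::json)::person;

-- Hypothetical: a set-returning helper that decomposes a json array of
-- objects into rows, ready to join or insert relationally
SELECT p.id, p.name
FROM json_to_recordset('[{"id":1,"name":"fred"},
                         {"id":2,"name":"barney"}]') AS p(id int, name text);
```

The point of the sketch is the workflow: json comes in the door, lands directly in strongly typed relational structures, and gets marked back up into json only on the way out.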