Thanks, Joni.

I've been playing with just that for-comprehension syntax over the
weekend. How would I do it if I had multiple packets?
{
  "packets": [
    {
      "node": "00:1D:C9:00:04:9F",
      "dt": 1254553581405,
      "temp": 27.5
    },
    {
      "node": "00:1D:C9:00:04:9E",
      "dt": 1254553582405,
      "temp": 24.3
    }
  ]
}

I've had some problems iterating over the parsed results. If I do:
for {
  json <- parse(s)
  JField("node", JString(node)) <- json
  JField("dt", JInt(dt)) <- json
  JField("temp", JDouble(temp)) <- json
} yield .... // construct Packet here

I will end up with 8 Packets. Should I be doing something like
JArray(json) <- parse(s)?
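I suspect the 8 Packets are the cross product: each field pattern
matches against every matching field in the whole document, so 2 nodes
x 2 dts x 2 temps = 8 tuples. Here's a plain-Scala sketch of what I
think is going on (the Field case class is just a hypothetical
stand-in for lift-json's JField, not the real AST) -- scoping the
field patterns to one packet's fields at a time avoids the blow-up:

```scala
// Hypothetical stand-in for lift-json's JField, for illustration only.
case class Field(name: String, value: Any)

val packet1 = List(
  Field("node", "00:1D:C9:00:04:9F"),
  Field("dt", 1254553581405L),
  Field("temp", 27.5))
val packet2 = List(
  Field("node", "00:1D:C9:00:04:9E"),
  Field("dt", 1254553582405L),
  Field("temp", 24.3))

// Matching each pattern against all fields of the whole tree at once...
val allFields = packet1 ++ packet2

val crossProduct = for {
  Field("node", node) <- allFields
  Field("dt", dt)     <- allFields
  Field("temp", temp) <- allFields
} yield (node, dt, temp)
println(crossProduct.size) // 2 * 2 * 2 = 8

// ...whereas iterating packet by packet pairs each node with its
// own dt and temp.
val perPacket = for {
  fields              <- List(packet1, packet2)
  Field("node", node) <- fields
  Field("dt", dt)     <- fields
  Field("temp", temp) <- fields
} yield (node, dt, temp)
println(perPacket.size) // 2
```

So presumably the real fix is the same shape: first bind each packet
object, then match node/dt/temp within that one object's fields.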

Thanks for your help,
Peter

On Oct 4, 3:08 pm, Joni Freeman <freeman.j...@gmail.com> wrote:
> > I don't know how hard would it be to add this feature, so I don't know
> > if this is a reasonable request. This would make making JSON API
> > endpoints really easy for me and I hope for other people too.
>
> This certainly sounds like a reasonable feature request, I will take a
> deeper look at it.
>
> Meanwhile, you can use a tmp case class as Kevin noted, or use a
> for-comprehension to query the json. Something like:
>
> {
>   "packet": {
>     "node": "00:1D:C9:00:04:9F",
>     "dt": 1254553581405,
>     "temp": 27.5
>   }
> }
>
> val json = parse(s)
> for {
>   JField("node", JString(node)) <- json
>   JField("dt", JInt(dt)) <- json
>   JField("temp", JDouble(temp)) <- json
> } yield .... // construct Packet here
>
> That's a bit verbose but quite flexible. This test case contains more
> query examples: http://github.com/dpp/liftweb/blob/master/lift-json/src/test/scala/ne...
>
> Cheers, Joni
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Lift" group.
To post to this group, send email to liftweb@googlegroups.com
To unsubscribe from this group, send email to 
liftweb+unsubscr...@googlegroups.com
For more options, visit this group at 
http://groups.google.com/group/liftweb?hl=en
-~----------~----~----~----~------~----~------~--~---