On Tue, Dec 4, 2012 at 9:37 PM, Johan Tibell <johan.tib...@gmail.com> wrote:

> Hi Oleg,
>
> On Tue, Dec 4, 2012 at 9:13 PM,  <o...@okmij.org> wrote:
> > I have been doing, for several months, constant-space processing of large XML
> > files using iteratees. The file contains many XML elements (which are
> > a bit more complex than a number). An element can be processed
> > independently. After the parser finishes with one element and dumps
> > the related data, the processing of the next element starts anew, so
> > to speak. No significant state is accumulated for the overall parsing
> > sans the counters of processed and bad elements, for statistics. XML
> > is somewhat like JSON, only more complex: an XML parser has to deal
> > with namespaces, parsed entities, CDATA sections, and other
> > interesting stuff. Therefore, I'm quite sure there should not be
> > fundamental problems in constant-space parsing of JSON.
> >
> > BTW, the parser itself is described there
> >         http://okmij.org/ftp/Streams.html#xml
>
> It certainly is possible (using a SAX style parser). What you can't
> have (I think) is a function:
>
>     decode :: FromJSON a => ByteString -> Maybe a
>
> and constant-memory parsing at the same time. The return type here
> says that we will return Nothing if parsing fails. We can only do so
> after looking at the whole input (otherwise how would we know if it's
> malformed?).
>

I thought it was possible to get around this with a lazy pattern, such
as "Wadler's force function" [1]?

(untested code)

-- Always yields the Just constructor immediately; the payload is a
-- thunk, so the underlying computation is not forced up front.
force :: Maybe a -> Maybe a
force y =
  let Just x = y
  in Just x

lazyDecode :: FromJSON a => ByteString -> Maybe a
lazyDecode = force . decode
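As a sanity check, here is a minimal, self-contained sketch of why the
trick works (only the force definition above is assumed; no aeson
involved): pattern matching on the result of force succeeds without
evaluating its argument at all, so a consumer can commit to the Just
branch before any parsing has happened.

```haskell
-- Wadler's force function: the irrefutable let pattern means the
-- Just constructor is built eagerly, while the payload x remains
-- an unevaluated thunk.
force :: Maybe a -> Maybe a
force y =
  let Just x = y
  in Just x

main :: IO ()
main =
  -- Even with a completely undefined input, the constructor match
  -- succeeds; only demanding the payload would diverge.
  case force (undefined :: Maybe Int) of
    Just _  -> putStrLn "got Just without evaluating the input"
    Nothing -> putStrLn "got Nothing"
```

The flip side is that a lazyDecode built this way can never actually
return Nothing: a malformed document only surfaces as an error when the
payload is demanded, which is exactly the trade-off that buys back
constant-memory behaviour.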

[1] http://www.haskell.org/haskellwiki/Maintaining_laziness
_______________________________________________
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe
