Thanks for the pointer, Mukesh. I'll go over the blog.
Changing the XML parser to another one from Hackage (xml) helped, but not
fully. I think I would need to change to ByteString. But for now, I split
the program into smaller programs and it seems to work.
Regards,
Kashyap
On Sat, Mar
Hi Kashyap
I am not sure if this is the solution to your problem, but try using ByteString
rather than String in
parseXML' :: String -> XMLAST
parseXML' str = f ast
  where
    ast = parse (spaces >> xmlParser) "" str
    f (Right x) = x
    f (Left _)  = CouldNotParse
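A minimal sketch of that switch, assuming parsec 3's Text.Parsec.ByteString module (the thread's XMLAST parser is elided here; the toy `digits` parser below stands in for xmlParser, purely to show the ByteString plumbing):

```haskell
import qualified Data.ByteString.Char8 as B
import Text.Parsec
import Text.Parsec.ByteString ()  -- imports the Stream ByteString Identity Char instance

-- Toy parser standing in for xmlParser: skip spaces, read digits.
digits :: Parsec B.ByteString () String
digits = spaces >> many1 digit

main :: IO ()
main = print (parse digits "" (B.pack "  12345"))
```

Running it prints `Right "12345"`; the only change from the String version is the input type and the instance import, which is what makes the swap cheap to try.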
Also see this post[1] My Space is
Hi folks,
I've run into more issues with my report generation tool. I'd really
appreciate some help.
I've created a repro project on github to demonstrate the problem.
git://github.com/ckkashyap/haskell-perf-repro.git
There is a template XML file that needs to be replicated several times.
I got some profiling done and got this PDF generated. I see unhealthy
growth in my XML parser.
On Fri, Mar 22, 2013 at 8:12 PM, C K Kashyap ckkash...@gmail.com wrote:
Hi folks,
I've run into more issues with my report generation tool. I'd really
appreciate some help.
I've created a
Oops...I sent out the earlier message accidentally.
I got some profiling done and got this PDF generated. I see unhealthy
growth in my XML parser.
https://github.com/ckkashyap/haskell-perf-repro/blob/master/RSXP.hs
I must not be using parsec efficiently.
Regards,
Kashyap
On Sat, Mar 23,
On 03/19/2013 07:12 AM, Edward Kmett wrote:
Konstantin,
Please allow me to elaborate on Dan's point -- or at least the point
that I believe that Dan is making.
Using,
let bug = Control.DeepSeq.rnf str `seq` fileContents2Bug str
or ($!!) will create a value that *when forced* causes the
On Tue, Mar 19, 2013 at 2:01 PM, Konstantin Litvinenko
to.darkan...@gmail.com wrote:
Yes. You (and Dan) are totally right. 'let' just binds the expression, without
evaluating it. Dan's evaluate trick forces rnf to run before hClose. As I
said, it's a tricky part, especially for a newbie like me :)
To place
On 03/17/2013 07:08 AM, C K Kashyap wrote:
I am working on an automation that periodically fetches bug data from
our bug tracking system and creates static HTML reports. Things worked
fine when the bugs were in the order of 200 or so. Now I am trying to
run it against 3000 bugs and suddenly I
On 18 March 2013 21:01, Konstantin Litvinenko to.darkan...@gmail.com wrote:
On 03/17/2013 07:08 AM, C K Kashyap wrote:
I am working on an automation that periodically fetches bug data from
our bug tracking system and creates static HTML reports. Things worked
fine when the bugs were in the
Thanks Konstantin ... I'll try that out too...
Regards,
Kashyap
On Mon, Mar 18, 2013 at 3:31 PM, Konstantin Litvinenko
to.darkan...@gmail.com wrote:
On 03/17/2013 07:08 AM, C K Kashyap wrote:
I am working on an automation that periodically fetches bug data from
our bug tracking system
Do note that deepSeq alone won't (I think) change anything in your
current code. bug will deepSeq the file contents. And the cons will
seq bug. But nothing is evaluating the cons. And further, the cons
isn't seqing the tail, so none of that will collapse, either. So the
file descriptors will still
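A tiny illustration of that point (a sketch, not the original code): seq forces only the outermost constructor, so neither the head nor the tail of a cons cell is touched.

```haskell
main :: IO ()
main = do
  -- Both head and tail are bottoms, yet the cons cell itself is
  -- already in weak head normal form, so seq has nothing to run.
  let xs = error "head forced" : error "tail forced" :: [Int]
  xs `seq` putStrLn "seq stopped at the (:) constructor"
```

If seq reached into either field, this program would crash; instead it prints the message, which is exactly why seq'ing the cons collapses nothing underneath it.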
On 03/18/2013 06:06 PM, Dan Doel wrote:
Do note that deepSeq alone won't (I think) change anything in your
current code. bug will deepSeq the file contents.
rnf fully evaluates 'bug' by reading all file content. Later hClose will
close it and we're done. Not reading all content will lead to semi
Konstantin,
Please allow me to elaborate on Dan's point -- or at least the point that I
believe that Dan is making.
Using,
let bug = Control.DeepSeq.rnf str `seq` fileContents2Bug str
or ($!!) will create a value that *when forced* causes the rnf to occur.
As you don't look at bug until much
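A compact sketch of the difference, with a pure value standing in for the file contents (deepseq ships with GHC): the let alone runs nothing; evaluate is what actually triggers the rnf.

```haskell
import Control.DeepSeq (rnf)
import Control.Exception (evaluate)

main :: IO ()
main = do
  let str = "pretend file contents"
      bug = rnf str `seq` length str  -- nothing runs yet; bug is a thunk
  n <- evaluate bug                   -- forcing bug runs the rnf first
  print n
```

In the real program the point of the evaluate is to sequence the full read before hClose; here it just prints 21, but the ordering guarantee is the same.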
On Sun, Mar 17, 2013 at 3:08 PM, C K Kashyap ckkash...@gmail.com wrote:
It's a small snippet, and I've put in comments stating how I run out of
file handles, or how the file simply doesn't get read, due to lazy IO.
I realize that adding ($!) by trial and error is going to be
futile.
One thing that typically isn't mentioned in these situations is that
you can add more laziness. I'm unsure if it would work from just your
snippet, but it might.
The core problem is that something like:
mapM readFile names
will open all the files at once. Applying any processing to the file
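One way to sketch that extra laziness (the helper name here is illustrative, not a library function) is to interleave the tail of the mapM, so each action runs only when its result is demanded:

```haskell
import Data.IORef
import System.IO.Unsafe (unsafeInterleaveIO)

-- A lazier mapM: the head action runs now, the rest only on demand,
-- so e.g. files would be opened one at a time instead of all at once.
mapMLazy :: (a -> IO b) -> [a] -> IO [b]
mapMLazy _ []     = return []
mapMLazy f (x:xs) = do
  y  <- f x
  ys <- unsafeInterleaveIO (mapMLazy f xs)
  return (y : ys)

main :: IO ()
main = do
  counter <- newIORef (0 :: Int)  -- counts how many actions have run
  ys <- mapMLazy (\x -> modifyIORef counter (+ 1) >> return x) [1 .. 10 :: Int]
  readIORef counter >>= print     -- only the first action ran
  print (take 3 ys)               -- demand three results
  readIORef counter >>= print     -- now three actions have run
```

The usual caveat applies: unsafeInterleaveIO makes the timing of effects depend on evaluation order, which is exactly the lazy-IO subtlety this thread is about.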
Hi Kashyap,
you could also use iteratees or conduits for a task like that. The beauty
of such libraries is that they can ensure that a resource is always
properly disposed of. See this simple example:
https://gist.github.com/anonymous/5183107
It prints the first line of each file given as an
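For comparison, the core guarantee can be sketched with plain bracketed IO, a simplification of what iteratees and conduits automate: withFile closes the handle whether the body finishes or throws, and hGetLine reads strictly, so no unread thunk outlives the handle.

```haskell
import System.IO

-- Return the first line of each file. Each withFile closes its handle
-- before the next file is opened, so at most one handle is live.
firstLines :: [FilePath] -> IO [String]
firstLines = mapM $ \fp -> withFile fp ReadMode hGetLine

main :: IO ()
main = do
  writeFile "a.txt" "alpha\nrest of a"
  writeFile "b.txt" "beta\n"
  firstLines ["a.txt", "b.txt"] >>= print
```

The streaming libraries add composable processing on top, but the resource-disposal guarantee they provide is this bracketing pattern, applied systematically.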
Thanks everyone,
Dan, MapMI worked for me ...
Regards,
Kashyap
On Mon, Mar 18, 2013 at 12:42 AM, Petr Pudlák petr@gmail.com wrote:
Hi Kashyap,
you could also use iteratees or conduits for a task like that. The beauty
of such libraries is that they can ensure that a resource is always
Hi,
I am working on an automation that periodically fetches bug data from our
bug tracking system and creates static HTML reports. Things worked fine
when the bugs were in the order of 200 or so. Now I am trying to run it
against 3000 bugs and suddenly I see things like - too many open handles,