Thank you for the ideas on using an external parser.
Now I have three questions:
1. Is it possible to read a CSV file streaming-style (for example, record by
record) without loading everything into memory? Even if I use some external
parsing solution like XSLT, or write something myself in a language other
than J, I will end up with a large CSV instead of a large XML file, so it
makes no difference. The reason I need to parse it this way is that there
are some rows I won't need; those would be discarded depending on their
field values.
If it is not possible, I would do more of the work outside of J, in this
first-stage XML -> CSV parser.
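To illustrate the streaming, record-by-record filtering described in question 1, here is a minimal Python sketch of such a pre-parser stage (the field names and the filtering condition are hypothetical; only one record is held in memory at a time):

```python
import csv

def filter_csv(src_path, dst_path, keep):
    """Stream src_path record by record, writing only rows where keep(row) is true."""
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.DictReader(src)  # yields one record at a time
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:            # the whole file is never loaded at once
            if keep(row):
                writer.writerow(row)

# Hypothetical usage: keep only rows whose "status" field is "active"
# filter_csv("big.csv", "small.csv", lambda row: row["status"] == "active")
```

The same shape works whether the filter runs before J (shrinking the file) or whether J itself reads the file in chunks.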
2. Is there a way to call an external program from a J script? If so, is it
possible to wait for it to finish?
If it is not possible, there are definitely ways to run J from other
programs.
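For that reverse direction, a minimal Python sketch of running a J script from another program and blocking until it exits (the path to the jconsole binary is an assumption; adjust it for your installation):

```python
import subprocess

# Hypothetical path to the J console binary; adjust for your installation.
JCONSOLE = "/usr/bin/jconsole"

def run_j_script(script_path, jconsole=JCONSOLE):
    """Run a J script with jconsole and wait for it to finish."""
    # subprocess.run blocks until the child process exits,
    # raising CalledProcessError on a nonzero exit status.
    result = subprocess.run(
        [jconsole, script_path],
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout
```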
3. Can someone give me a few pointers on how to use the api/expat library?
Do I need to familiarize myself with Expat (the C library) itself, or
should a good understanding of J plus reading the small test in the package
directory be enough?
I could send an example file, as Devon McCormick suggested.

Right now I am working through the book "J: The Natural Language for
Analytic Computing" and playing around with problems like Project Euler,
but I could really see myself using J in serious work.

Best regards,
MG


On Wed, 11 Aug 2021 at 09:51, <[email protected]> wrote:

> In similar situations (though my files are not huge) I extract what I want
> into flattened CSV using one or more XQuery scripts, and then load the CSV
> files with J. The code is clean, compact and easy to maintain. For
> recurrent XQuery patterns, m4 occasionally comes to the rescue. Expect
> minor portability issues when using different XQuery processors
> (extensions, language level...).
>
>
>
> Never got round to SAX parsing beyond tutorials, so I cannot compare.
>
>
> From: Mariusz Grasko <[email protected]>
> To: [email protected]
> Subject: [Jprogramming] Is it a good idea to use J for reading large XML
> files?
> Date: 10/08/2021 18:05:45 Europe/Paris
>
> Hi,
>
> We are an ecommerce company and have a lot of integrations with suppliers;
> product info is nearly always in XML files. I am thinking about using J as
> an analysis tool. Do you think working with large files that need to be
> parsed SAX-style, without reading everything at once, is a good idea in J?
> Also, is this even advantageous (as in, would the code be terse)? Right now
> XML parsing is done in Golang, so if parsing in J is not very good we could
> rely more on CSV exports. CSV handling is definitely very good in J.
> I am hoping that XML parsing is also very good in J and the code would
> become much smaller; if that is the case, I would think about using J for
> XML with new suppliers.
>
> Best Regards
> M.G.
> ----------------------------------------------------------------------
> For information about J forums see http://www.jsoftware.com/forums.htm
>
