Hi Bobby,
If the data isn't corrupted in some way and the process is still this
fragile, you may need to think of a whole new approach. If it takes
logging in as root, you'll probably end up doing the job yourself - not
an ideal choice ("Do my job for me once and I thank you, do it twice and
it's yours"). I'm not sure what that approach might be, since logical
XML data groups can be quite large and variable. XML seems to sacrifice
size for function, as some other things we all know and love do.
Whatever you end up with needs to be robust and very scalable. Sorry I
can't offer any ideas at the moment.
Regards,
Charlie Noah
On 01-12-2012 11:02 AM, Bobby Worley wrote:
I suppose sequential reads are one approach I may need to consider,
especially if I develop this as a tool for our end users (doing the XML
import).
I was able to process the large file - it took some jumping through
hoops, but I got the file imported: 20,669 records. I set ulimit -d to
"unlimited", but in order to do so I had to log in as root, set ulimit,
then go into UV. Then the program completed. This is on a backup UV
server with no other users.
- Bob Worley
Coburn Supply Co
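[Editor's note: the workaround Bob describes above can be sketched as a
shell session. This is a minimal sketch under stated assumptions: the
way you start UniVerse is site-specific (the `uv` command shown is a
placeholder), and raising the hard data-segment limit typically
requires a root login, as Bob found.]

```shell
ulimit -d            # show the current data-segment limit (KB, or "unlimited")
ulimit -d unlimited  # raise it; may fail without root if the hard limit is lower
ulimit -d            # confirm the new limit is in effect
# ...then start UniVerse from this SAME shell so the limit is inherited,
# e.g. (site-specific placeholder):
# uv
```

The key point is that the limit applies to the shell's child processes,
so UV must be started from the session in which ulimit was raised.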
-----Original Message-----
From: [email protected]
[mailto:[email protected]] On Behalf Of Joshua
Gallant
Sent: Thursday, January 12, 2012 10:30 AM
To: U2 Users List
Subject: Re: [U2] Extracting XML attributes
When working with large XML files in the past I've always run into
issues like this, but I use a combination of xmapopen, xmapreadnext, and
xmapclose. I could never come up with a great way to use the UV tools
to parse a large file all at once, so I open the XML as a sequential
file and parse it until I have a full record. I then process that one
record with the XML functions. There might be 15 other ways, but this
has always served its purpose.
- Josh
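[Editor's note: the record-at-a-time approach Josh describes can be
sketched in Python using the standard library's streaming parser. This
is an illustrative analogue, not UniVerse BASIC, and the <record> tag
name and <feed> wrapper are hypothetical placeholders for whatever the
actual file uses.]

```python
# Stream a large XML file one record at a time instead of building a
# full DOM, keeping memory usage flat regardless of file size.
import xml.etree.ElementTree as ET
from io import BytesIO

def stream_records(source, record_tag="record"):
    """Yield one parsed record element at a time from an XML source."""
    context = ET.iterparse(source, events=("start", "end"))
    _, root = next(context)          # first event is the document root
    for event, elem in context:
        if event == "end" and elem.tag == record_tag:
            yield elem               # caller processes this one record
            root.clear()             # drop processed records from memory
```

Each yielded element is a fully parsed record, so the per-record
processing can still use DOM-style calls; only the whole-file DOM is
avoided.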
On 1/12/12 11:10 AM, "Bobby Worley"<[email protected]> wrote:
Does anybody have any experience reading very large XML files into
UniVerse on AIX using the PrepareXML() function?
We received a 176MB XML file, and I'm running out of memory. I've maxed
out ulimit -d and it still blows up with this error:
ERROR MESSAGE A DOM error occurred during parsing.
-----Original Message-----
From: [email protected]
[mailto:[email protected]] On Behalf Of Bobby Worley
Sent: Wednesday, November 16, 2011 7:34 AM
To: U2 Users List
Subject: Re: [U2] Extracting XML attributes
Turns out it is a memory issue. My XML file is 16MB. Setting ulimit -d
750000 resolved the issue.
-----Original Message-----
From: [email protected]
[mailto:[email protected]] On Behalf Of Bobby Worley
Sent: Friday, November 11, 2011 10:42 AM
To: U2 Users List
Subject: Re: [U2] Extracting XML attributes
My latest challenge:
PREPARE.XML FEED.XML MYXML
Prepare the XMLDOM failed.
XMLParser error message: A DOM error occurred during parsing.
UNIVERSE RELLEVEL
001 X
002 11.1.0
003 PICK
004 PICK.FORMAT
005 11.1.0
AIX Version 5.3.0.0
FEED.XML is 16MB.
It prepares just fine on UV 10.1.17. Unfortunately, it won't list on UV
10.1 because UV 10.1 is not aware of namespaces.
This is making me pull my hair out... I don't need this on a Friday.
Bob Worley
Coburn Supply
_______________________________________________
U2-Users mailing list
[email protected]
http://listserver.u2ug.org/mailman/listinfo/u2-users