Hi All,

Just wondering: could you use LiveCode to split and
compress this 38 GB text file into 10,000 smaller files?
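
Something along these lines should work. This is only a
minimal sketch: the handler name, the paths, and the chunk
size (roughly 38 GB / 10,000) are my own assumptions, and it
assumes your engine build can read files larger than 4 GB:

-- minimal sketch: split a big text file into compressed chunks
-- pSourcePath, pDestFolder and the 4 MB chunk size are assumptions
on splitAndCompress pSourcePath, pDestFolder
   local tChunkSize, tChunkNum
   put 4000000 into tChunkSize -- ~38 GB / 10,000 chunks
   put 0 into tChunkNum
   open file pSourcePath for binary read
   repeat forever
      read from file pSourcePath for tChunkSize
      if it is empty then exit repeat
      add 1 to tChunkNum
      -- compress() produces gzip-format data
      put compress(it) into URL ("binfile:" & pDestFolder & "/chunk" & tChunkNum & ".gz")
   end repeat
   close file pSourcePath
end splitAndCompress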

Some years ago, when I was interested in creating
an Offline Wikipedia Reader (using LiveCode),
I ran into the same problem: gathering all parts
of an article from compressed files.

A Wikipedia article could start in the middle of one
compressed file and end at the beginning of the next.

The script to gather all parts of an article did this:
1) decompress the file where the article starts,
2) if the end tag of that article is not in the
decompressed data, then
3) decompress the next file, search there for the end
of the article, and append it to the previously
decompressed data.
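
In LiveCode terms, the logic looked roughly like the sketch
below; the chunk naming scheme, the parameters, and the
handler name are assumptions for illustration:

-- minimal sketch of the gathering logic
function gatherArticle pStartChunk, pEndTag, pFolder
   local tData
   -- 1) decompress the chunk where the article starts
   put decompress(URL ("binfile:" & pFolder & "/chunk" & pStartChunk & ".gz")) into tData
   -- 2) + 3) if the end tag is missing, append the next chunk's data
   if tData does not contain pEndTag then
      put decompress(URL ("binfile:" & pFolder & "/chunk" & (pStartChunk + 1) & ".gz")) after tData
   end if
   -- return everything up to and including the end tag
   -- (locating the article's start tag within tData is left out here)
   return char 1 to (offset(pEndTag, tData) + length(pEndTag) - 1) of tData
end gatherArticle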

This simple algorithm would fail if a really large
Wikipedia article spanned three compressed files,
but even today no Wikipedia article is that large.

Alejandro
