Niklas,
Thanks for the info
Kevin
Niklas Nebel wrote:
Kevin Cullis wrote:
I've got an issue with importing a print-to-text (Linux) text document containing tables of data into a spreadsheet. The file is a straight text dump with a large header (about 132 rows of header data), followed by fixed-width tables of about 12,000 rows of data. I've tried Insert > Link to External Data (didn't work), and File > Open with "Text CSV" selected, but that goes to the "ASCII filter options" dialog first rather than the "Text Import" dialog that allows selecting fixed widths, etc. Once the large header is deleted, there's no problem getting to "Text Import", where you can select the columns to parse the rows and columns into a spreadsheet. We have too many files like this (we're talking TBs of data) to keep deleting the header by hand just to get the data into a spreadsheet.
Are there null bytes in the header? The Calc CSV filter doesn't accept them. If you don't mind modifying the file, you can open it in Writer and save it again; that removes the null bytes, and you can then open the file in Calc.
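[Editor's note: since Kevin mentions terabytes of files, round-tripping each one through Writer may not scale. Assuming null bytes are indeed the problem, they can also be stripped on the command line before opening the file in Calc. This is a sketch, not part of Niklas's suggestion; the filenames and the 132-row header count are taken from Kevin's description and are otherwise hypothetical:]

```shell
# Strip null bytes so Calc goes to the "Text Import" dialog instead of
# "ASCII filter options". data.txt / data_clean.txt are hypothetical names.
tr -d '\0' < data.txt > data_clean.txt

# Optionally drop the ~132-row header in the same pass (row count is an
# assumption based on Kevin's description; adjust as needed):
tr -d '\0' < data.txt | tail -n +133 > data_clean.txt
```

A loop over `*.txt` would let the whole directory be cleaned unattended rather than editing each file interactively.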
Niklas
---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
