I'll take a quick second to remind folks of VFP2C32.FLL for
accessing files greater than 2GB. Functions are FGetsEx, FCloseEx,
FReadEx, FOpenEx, etc.
--Mike
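(Untested sketch of how those calls might look, assuming the "Ex" functions take the same arguments as the native low-level file functions they mirror -- check the VFP2C32 docs for the exact signatures. "bigfile.txt" is a placeholder name.)

```foxpro
* Load the library, open a file that may exceed 2GB, read one
* line, and close it again. Only the functions named above are
* used; the signatures are assumed to match the native ones.
SET LIBRARY TO VFP2C32.FLL ADDITIVE

LOCAL lnHandle, lcLine
lnHandle = FOpenEx("bigfile.txt")
lcLine   = FGetsEx(lnHandle)
= FCloseEx(lnHandle)
```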
On 2017-04-20 13:21, Paul H. Tarver wrote:
In this case, I would probably get each line of data into memory or a
cursor as quickly as possible to speed up the parsing (i.e., FILETOSTR()
if you can, or FGETS()/FREAD() to a cursor if you can't). However, I
believe the low-level file functions would be easier to use here.
FGETS() returns a maximum of 8,192 bytes. (If you need more characters
per line, you can FREAD() up to 65,535 bytes, but using FREAD() will
require more programming to set the read length correctly for each line
and to position your file pointer manually.)
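(An untested sketch of that FREAD() approach: read a chunk, keep only the first line, and FSEEK() back so the file pointer sits at the start of the next line. "test.txt" is just the example name used below.)

```foxpro
LOCAL lnHandle, lcChunk, lnCRLF, lcLine
lnHandle = FOPEN("test.txt")
lcChunk  = FREAD(lnHandle, 65535)          && grab up to 64K
lnCRLF   = AT(CHR(13) + CHR(10), lcChunk)  && end of the first line
IF lnCRLF > 0
   lcLine = LEFT(lcChunk, lnCRLF - 1)
   * rewind past the unread part of the chunk (+1 skips the CRLF)
   = FSEEK(lnHandle, lnCRLF + 1 - LEN(lcChunk), 1)
ELSE
   lcLine = lcChunk                        && no CRLF: last line
ENDIF
= FCLOSE(lnHandle)
```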
I haven't tested this code, but I think it would look something like this:
LOCAL lnFileHandle, lnFileSize, lcLineData, lnFldLoop, lcDestFld, laDataArray[1]
USE DestTable IN 0  && assumes the number and order of fields match the text file you are importing
lnFileHandle = FOPEN("test.txt")
lnFileSize = FSEEK(lnFileHandle, 0, 2)  && seek to end of file to get its size
IF lnFileSize <= 0
   WAIT WINDOW "Error! Empty file!" NOWAIT
ELSE
   = FSEEK(lnFileHandle, 0)             && back to the top of the file
   DO WHILE !FEOF(lnFileHandle)
      lcLineData = FGETS(lnFileHandle, 8192)
      APPEND BLANK IN DestTable         && one new record per line
      FOR lnFldLoop = 1 TO ALINES(laDataArray, lcLineData, "|")
         lcDestFld = FIELD(lnFldLoop, 'DestTable')
         REPLACE (lcDestFld) WITH laDataArray[lnFldLoop] IN DestTable
      ENDFOR
   ENDDO
ENDIF
= FCLOSE(lnFileHandle)  && close the file
You can get field-level specific if you want to convert imported
character data into numeric, date, binary or memo fields by doing direct
replacements instead of the field-loop process shown above.
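(For example -- untested, and the field names (cName, nAmount, dInvoice, mNotes) and array positions are made up for illustration:)

```foxpro
* Parse one line, then convert each piece to the target field's
* type with a direct REPLACE instead of the generic loop.
= ALINES(laDataArray, lcLineData, "|")
REPLACE cName    WITH laDataArray[1], ;
        nAmount  WITH VAL(laDataArray[2]), ;
        dInvoice WITH CTOD(laDataArray[3]), ;
        mNotes   WITH laDataArray[4] ;
   IN DestTable
```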
The bigger challenge that I see is that when a field is bigger than 254
characters, it is typically reserved for user input/notes/memos/etc.,
and invariably a user will enter delimiter values inside the field,
which can affect your parse routine.
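(One untested way to guard against that, assuming the free-text/memo field is the LAST column: compare the piece count from ALINES() against FCOUNT() and glue any surplus pieces back into the final field before replacing:)

```foxpro
* If the user typed "|" inside the notes field, ALINES() returns
* more pieces than the table has fields; rejoin the extras.
LOCAL lnPieces, lnFields, lnLoop, lcExtra
lnPieces = ALINES(laDataArray, lcLineData, "|")
lnFields = FCOUNT('DestTable')
IF lnPieces > lnFields
   lcExtra = laDataArray[lnFields]
   FOR lnLoop = lnFields + 1 TO lnPieces
      lcExtra = lcExtra + "|" + laDataArray[lnLoop]
   ENDFOR
   laDataArray[lnFields] = lcExtra
ENDIF
```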
Paul H. Tarver
-----Original Message-----
From: Matt Wiedeman [mailto:[email protected]]
Sent: Thursday, April 20, 2017 9:58 AM
To: [email protected]
Subject: Text file import
Hello everyone,
I need to set up a job to import a pipe-delimited text file. This is
easy enough, but one of the fields is larger than 254 characters. If I
use a memo field, it does not import that field. I started to set up a
routine to step through each character and store the fields manually,
but I would rather not do it that way.
Does anyone have a function or tip they can share to resolve this situation?
_______________________________________________
Post Messages to: [email protected]
Subscription Maintenance: http://mail.leafe.com/mailman/listinfo/profox
OT-free version of this list: http://mail.leafe.com/mailman/listinfo/profoxtech
Searchable Archive: http://leafe.com/archives/search/profox
This message:
http://leafe.com/archives/byMID/profox/[email protected]
** All postings, unless explicitly stated otherwise, are the opinions of the
author, and do not constitute legal or medical advice. This statement is added
to the messages for those lawyers who are too stupid to see the obvious.