Here's a quick & dirty idea to resolve your problem.
Read the file in with the low-level file functions (FOPEN()/FGETS()/FPUTS()). When you get to the field that's too long, chop it into two parts, put another pipe between them, and spit it all out to a new text file. Then you can simply APPEND FROM that new text file.
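A minimal sketch of that idea, assuming the over-long column is the 5th field and the target table has two C(254) columns (part1, part2) feeding a memo field (bignote). All file names, field positions, and column names here are made up for illustration; adjust them to the real layout. Note FGETS() only reads 254 bytes by default, so you must pass a bigger byte count:

```
* Sketch only: split field 5 into two <=254-char pieces, separated by an
* extra pipe, so APPEND FROM DELIMITED can import them as char fields.
lnIn  = FOPEN("import.txt")        && low-level open, read-only
lnOut = FCREATE("import2.txt")     && new text file to append from
DO WHILE NOT FEOF(lnIn)
    lcLine = FGETS(lnIn, 8192)     && default cap is 254 bytes -- raise it!
    lcBig  = GETWORDNUM(lcLine, 5, "|")
    * Always insert the extra pipe (even when the second half is empty)
    * so every row ends up with the same number of columns.
    lnPos  = AT("|", lcLine, 4) + 1    && start of field 5
    lcLine = STUFF(lcLine, lnPos, LEN(lcBig), ;
             LEFT(lcBig, 254) + "|" + SUBSTR(lcBig, 255))
    = FPUTS(lnOut, lcLine)
ENDDO
= FCLOSE(lnIn)
= FCLOSE(lnOut)

* Import, then glue the two halves back together into the memo field.
USE mytable
APPEND FROM import2.txt DELIMITED WITH CHARACTER "|"
REPLACE ALL bignote WITH part1 + RTRIM(part2)   && char fields pad with spaces
```

One caveat with this approach: it assumes the long field never contains a literal pipe of its own, and the RTRIM() will drop any trailing spaces that were genuinely part of the data.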
-K- On 4/20/2017 10:58 AM, Matt Wiedeman wrote:
Hello everyone, I need to set up a job to import a pipe-delimited text file. This is easy enough, but one of the fields is longer than 254 characters. If I use a memo field, it does not import that field. I started to set up a routine to step through each character and store the fields manually, but I would rather not do it that way. Does anyone have a function or tip they can share to resolve this situation?
[excessive quoting removed by server]
_______________________________________________
Post Messages to: [email protected]
Subscription Maintenance: http://mail.leafe.com/mailman/listinfo/profox
OT-free version of this list: http://mail.leafe.com/mailman/listinfo/profoxtech
Searchable Archive: http://leafe.com/archives/search/profox
This message: http://leafe.com/archives/byMID/profox/[email protected]
** All postings, unless explicitly stated otherwise, are the opinions of the author, and do not constitute legal or medical advice. This statement is added to the messages for those lawyers who are too stupid to see the obvious.

