This is a question on best practices and prudent programming.

I'm importing text data from a Big Company to update a client's item table. It's a fixed-field text file, and the specs for the data file specify a maximum line length of 700 characters.
The test file I'm using is 15K lines (8.5 MB).
I import the data into a temp table:

CREATE TEMP TABLE `RawPkgSpecTmp` +
  (`RowNbr` INTEGER, +
  `PkgSpecData` TEXT (700), +
  `ColLength` = (SLEN(PkgSpecData)) INTEGER)

AUTONUM RowNbr IN RawPkgSpecTmp USING 1

--Load data
LOAD RawPkgSpecTmp FROM PkgSpec.TXT  +
  AS FORMATTED USING PkgSpecData 1 700

I cursor through the RawPkgSpecTmp table a row at a time for processing, loading the column PkgSpecData into the variable vPkgSpecData, then using SGETs to extract relevant data.
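For context, the per-row loop looks roughly like this. This is only a sketch: the cursor name c1 and the indicator variables are mine, and the exact FETCH/INDICATOR syntax should be checked against the R:Base documentation for your version.

DECLARE c1 CURSOR FOR +
  SELECT PkgSpecData, ColLength FROM RawPkgSpecTmp ORDER BY RowNbr
OPEN c1
FETCH c1 INTO vPkgSpecData INDICATOR iv1, vColLength INDICATOR iv2
WHILE SQLCODE = 0 THEN
  -- SGETs against .vPkgSpecData go here
  FETCH c1 INTO vPkgSpecData INDICATOR iv1, vColLength INDICATOR iv2
ENDWHILE
DROP CURSOR c1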

Most of the rows are 446 characters; I assume the line ends with a CR there rather than being padded out to position 700. Occasionally a row is 668 characters, and there is a clump of data starting at position 661 that I need.

So the question is:
Is it safe to always do a SET VAR vXtraPartNbr = (SGET(.vPkgSpecData,8,661)) even if there are only 446 characters in vPkgSpecData?

Or, for safety, should I check the length of the row first:
IF vColLength = 668 THEN
  SET VAR vXtraPartNbr = (SGET(.vPkgSpecData,8,661))
ENDIF
to make sure I'm not wandering off into the data-hinterlands in a way that could come back to bite me someday? R:Base doesn't seem to mind: vXtraPartNbr is null, as expected, when the row is short, and there are no apparent issues or error messages.
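One belt-and-suspenders variant would be to measure the length at the point of use rather than trusting a stored value, and to test with >= instead of = so trailing padding doesn't make the check fail. A sketch (the NULL assignment is just to make the short-row case explicit):

SET VAR vColLength = (SLEN(.vPkgSpecData))
IF vColLength >= 668 THEN
  SET VAR vXtraPartNbr = (SGET(.vPkgSpecData,8,661))
ELSE
  SET VAR vXtraPartNbr = NULL
ENDIF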

Just asking, FMI.
Doug
p.s. This question came about because I was getting "Out of Dynamic Space" errors. The error was related to a subsequent command and totally unrelated to the SGET. But it got me wondering...
