On 24 Jan 2006 10:44:32 -0800, "IamIan" <[EMAIL PROTECTED]> wrote:
>I searched the archives but couldn't find anyone else with this
>problem. Basically I'm grabbing all ASCII files in a directory and
>doing geoprocessing on them. I need to calculate a z-factor based on
>the latitude of the ASCII file being worked on, which is in the
>filename. If I type in the code manually it works and reads the
>latitude value from the ASCII filename, but when run within ArcGIS it
>crashes when it gets to int(LatString). Isnumber() returned false for
>Latitude as well. Is there something different about reading values
>from an ASCII filename?

Aren't you curious as to what the value of LatString was that failed?
Don't you know how to find out?

>
>import sys, os, win32com.client, string, gc
>
># Get a list of ASCII files in the workspace for ASCII To Raster
>conversion
>filenames = os.listdir(gp.workspace)
>filenames = [filename.lower()
>for filename in filenames
>if (filename[-4:].lower() == ".asc" and filename[0] != "-" )]

Indentation of the above two lines would improve readability.

>for filename in filenames:

I would try

    print repr(filename)

here, to see what you are dealing with.

>
>    # For each ASCII file, create Hillshade.
>    # account for latitude by computing Z units using radians
>    Latitude = filename[1:3]
>    LatString = str(Latitude)

You probably won't need a print repr(LatString) here if you see the
above print.

>    LatInt = int(LatString)
>    radians = LatInt * 0.0174532925
>    zFactor = 1/(113200 * (cos(radians)))
>

BTW, capitalizing the first letter of python variable names is counter
to usual convention.

Regards,
Bengt Richter
--
http://mail.python.org/mailman/listinfo/python-list
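A minimal sketch of the fix the comments above point toward. It assumes filenames shaped like "n45w120.asc" (a letter, then two latitude digits, matching the filename[1:3] slice in the quoted code) and checks that the sliced characters really are digits before calling int(), which is the call that crashed. It also imports math for cos(), which the quoted code uses but never imports, and replaces the hand-typed 0.0174532925 multiplier with math.radians.

```python
import math
import os
import re


def z_factor_for(filename):
    """Return a hillshade z-factor from the latitude embedded in an
    ASCII-grid filename, or None if no latitude can be parsed.

    Assumes names like "n45w120.asc": one letter, then two latitude
    digits. A file that doesn't match (the likely cause of the crash
    in the original post) is reported as None instead of blowing up
    in int().
    """
    m = re.match(r"^[a-z](\d{2})", filename.lower())
    if m is None:
        return None
    lat_deg = int(m.group(1))
    # math.radians replaces the hand-written degrees-to-radians factor
    return 1.0 / (113200 * math.cos(math.radians(lat_deg)))


def asc_files(workspace):
    """List .asc files in workspace, skipping names starting with '-',
    mirroring the list comprehension in the quoted code."""
    return [f.lower() for f in os.listdir(workspace)
            if f.lower().endswith(".asc") and not f.startswith("-")]
```

Running z_factor_for over the listing and skipping the None results would surface exactly which filenames fail to parse, which is what the print repr(filename) suggestion is meant to reveal.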