If you need to dig into the .dbf file to get the headers, there are programs out there that can do it. OpenOffice Calc can open the .dbf file and show the metadata headers, which gives you what you need to write the .dbfawk file.
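If you'd rather not fire up Calc, the field names can also be pulled straight out of the file. This is only a rough sketch of the dBase III layout (a 32-byte file header followed by 32-byte field descriptors, each starting with an 11-byte NUL-padded name, terminated by a 0x0D byte), demonstrated here on a synthetic header rather than a real TIGER .dbf:

```python
import struct
import io

def dbf_field_names(stream):
    """Return the field (column) names from a dBase III .dbf header.

    Layout: a 32-byte file header, then one 32-byte field descriptor
    per column, terminated by a 0x0D byte.  Each descriptor begins
    with an 11-byte NUL-padded field name.
    """
    stream.seek(0)
    header = stream.read(32)
    # Bytes 8-9 hold the little-endian offset where data records begin,
    # i.e. the total length of the header area.
    header_len = struct.unpack_from("<H", header, 8)[0]
    names = []
    pos = 32
    while pos < header_len:
        descriptor = stream.read(32)
        if descriptor[:1] == b"\r":   # 0x0D terminates the descriptor list
            break
        names.append(descriptor[:11].split(b"\x00", 1)[0].decode("ascii"))
        pos += 32
    return names

# Synthetic one-column .dbf header for demonstration:
hdr = bytearray(32)
struct.pack_into("<H", hdr, 8, 32 + 32 + 1)   # file header + 1 descriptor + 0x0D
field = b"NAME".ljust(11, b"\x00") + b"C" + bytes(20)  # 32-byte descriptor, type 'C'
print(dbf_field_names(io.BytesIO(bytes(hdr) + field + b"\r")))  # ['NAME']
```

Running that against a real shapefile's .dbf should list the columns you need to reference in the .dbfawk.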

I have found that the feds are great at updating the data records but very, very sloppy about supplying the metadata info needed for .dbfawk creation.

73 from 807,

Richard, N6NKO


Troy M. Campbell wrote:
All,
For some reason I decided to start from scratch with all my maps: spent
a week compiling the latest TIGER files, downloading dbfawks and
shapefiles, tweaked this and that, etc., until I was pretty tired of it.
Then I "standardized" the zoom levels to get the rendering to appear as
fast as possible while staying as generic as possible.
The last task was to get the GNIS files for populated places, historical
places, and some others from http://geonames.usgs.gov/domestic/download_data.htm.
Imagine my dismay when I found out that they are in a format that xastir
won't read.
No problem, that's why perl exists. Oooops, not only did the format
change, but they dropped the "estimated population" field. I'm pretty
sure that xastir uses this to pick what to show at each zoom level.
Does anyone know where the files can be gotten in the original format
(new data, old format)? Or has anyone overcome the problem already?

73 de Troy, KC0MIC
_______________________________________________
Xastir mailing list
[email protected]
http://lists.xastir.org/cgi-bin/mailman/listinfo/xastir
