Re: [postgis-users] identify rows in a shapefile that have illegal characters for UTF-8 encoding
Hello,

I often work with PostGIS data in ArcGIS, using original shapefiles encoded in LATIN1, and I have no character problems. My main problem comes from NULL values, which prevent ArcGIS from displaying the attribute data; usually all my geometries are there, and I can even work with symbology, just without being able to display the data I'm working on. Did you check for NULL values, or did you try importing the data into ArcGIS as a query layer, using a WHERE clause to exclude NULL data?

Hugues

-----Original message-----
From: postgis-users-boun...@postgis.refractions.net [mailto:postgis-users-boun...@postgis.refractions.net] On behalf of Mark Volz
Sent: Monday, October 22, 2012 22:01
To: postgis-users@postgis.refractions.net
Subject: [postgis-users] identify rows in a shapefile that have illegal characters for UTF-8 encoding

Hello,

I am trying to load my parcels into PostGIS; the data will eventually be consumed by MapServer and ArcGIS. Initially, when I loaded my data, I received a warning that I should change my encoding from UTF-8 to LATIN1. Doing so allowed me to load the data into PostGIS; however, I could not consume the data in ArcGIS. So from what I observed, I need to stick with UTF-8 encoding. I have determined that the legal-description field in my parcels is stopping me from loading the shapefile into PostGIS using UTF-8.

How can I find out which rows in my shapefile have illegal characters for UTF-8 encoding?

Thank You
Mark Volz
GIS Specialist

___
postgis-users mailing list
postgis-users@postgis.refractions.net
http://postgis.refractions.net/mailman/listinfo/postgis-users
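To answer Mark's question directly, here is a minimal Python sketch for flagging the offending rows. It assumes you can get each record's field values as raw bytes (for example, by reading the .dbf with a DBF library in binary/raw mode); the function and sample data below are illustrative, not part of any PostGIS tool.

```python
def find_non_utf8_rows(rows):
    """Return (row_index, error) pairs for rows whose raw bytes
    do not decode as valid UTF-8."""
    bad = []
    for i, raw in enumerate(rows):
        try:
            raw.decode("utf-8")
        except UnicodeDecodeError as e:
            bad.append((i, str(e)))
    return bad

# LATIN1 'e-acute' is the lone byte 0xE9, which is illegal on its
# own in UTF-8 -- exactly the kind of value that breaks the load.
sample = [b"LOT 1 BLOCK 2", b"R\xe9sidence parcel", b"SW 1/4 SEC 12"]
print(find_non_utf8_rows(sample))  # row 1 is flagged
```

Rows reported by this check are the ones whose text was written in LATIN1 (or some other single-byte encoding) rather than UTF-8.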
Re: [postgis-users] identify rows in a shapefile that have illegal characters for UTF-8 encoding
On Mon, Oct 22, 2012 at 1:01 PM, Mark Volz markv...@co.lyon.mn.us wrote:

> I am trying to load my parcels into PostGIS, which will eventually be consumed by MapServer and ArcGIS. Initially when I loaded my data I received a warning that I should change my encoding from UTF-8 to LATIN1.

How did you change your encoding? In your database, or in your data load? If you just ran

  shp2pgsql -W LATIN1 shpfile.shp tblename

then the non-ASCII characters in your dbf file would have been transcoded to UTF-8 during the load and landed nicely in the database with the right UTF code points.

> Doing so allowed me to load data into PostGIS however, I could not consume the data in ArcGIS.

This seems fishy. If your database is UTF-8 and you load using the -W flag as above, everything is pretty bog standard and ArcGIS should be able to read it fine (particularly since the libpq library does all the transcoding for client apps -- ArcGIS doesn't even have to think about transcoding, just declare the encoding it desires!).

> How can I find out which rows in my shapefile have illegal characters for UTF-8 encoding?

There are no illegal characters for UTF-8; UTF-8 can represent any and all characters (and does). There's something else going on.

P.
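To illustrate what the -W flag accomplishes, here is a sketch of the transcoding step shp2pgsql effectively performs on each attribute value when told the DBF is LATIN1 (the sample bytes are hypothetical DBF content, not output from the tool itself):

```python
# With -W LATIN1, shp2pgsql interprets the DBF bytes as LATIN1,
# then hands the text to the database, which stores it as UTF-8.
raw = b"R\xe9sidence"         # LATIN1 bytes from the .dbf (0xE9 = e-acute)
text = raw.decode("latin-1")  # decode using the declared encoding
utf8 = text.encode("utf-8")   # 0xE9 becomes the two-byte sequence 0xC3 0xA9
print(utf8)                   # b'R\xc3\xa9sidence' -- now valid UTF-8
```

This is why declaring the source encoding at load time, rather than changing the database encoding, is the right fix: every LATIN1 byte has a well-defined UTF-8 representation, so nothing is "illegal" once the transcoding happens.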