Hello Felix,

Comments inline...

> We have some shapefiles which we want to, as the final result, show
> on Google Maps. These shapefiles are made of thousands of polygons,
> each being 250 x 250 meters. These polygons have an integer
> feature associated, let's say from 0 to 200. If the feature is below
> "50", the polygon must be rendered in red. If between "50" and "150",
> in yellow, and if above "150", in green.
>

Great, looks like your filters below do a good job of that  
classification.

> Our maximum visualization zoom level in Google Maps is 14, which
> gives a resolution of ~ 7.4 m/px at the location of interest (each
> polygon is then ~ 34 px). The shapefile is 93000 m wide and 250000
> m high. With these data, we believe we need to render the
> shapefile to a file (PNG) of 12567 x 33784 px in size to keep the
> accuracy of the original shapefile.
>

Wow, that's over a 1.2 GB image if that were only a blank GeoTIFF. :)
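
Back-of-envelope, just so you can see where that figure comes from (nothing assumed here beyond your 12567 x 33784 dimensions):

width, height = 12567, 33784
pixels = width * height          # ~425 million pixels
print(pixels * 3 / 1e9)          # ~1.27 GB as uncompressed RGB on disk
print(pixels * 4 / 1e9)          # ~1.70 GB for a 32-bit RGBA buffer held in memory while rendering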

Okay, though note that rendering to a single giant image is not the
usual pathway for creating web tiles.

> The code to render the png is:
>
> from mapnik import *
>
> m = Map(12567,33784,"+proj=latlong +datum=WGS84")
> s = Style()
>
> a_rule=Rule()
> a_rule.filter = Filter('[Value] > -87 and [Value] <= -78')
> a_rule.symbols.append(PolygonSymbolizer(Color('#FF0000')))
>
> b_rule=Rule()
> b_rule.filter = Filter('[Value] > -78 and [Value] <= -72')
> b_rule.symbols.append(PolygonSymbolizer(Color('#FFFF00')))
>
> c_rule=Rule()
> c_rule.filter = Filter('[Value] > -72')
> c_rule.symbols.append(PolygonSymbolizer(Color('#00FF00')))
>
> s.rules.append(a_rule)
> s.rules.append(b_rule)
> s.rules.append(c_rule)
>
> m.append_style('My Style',s)
> lyr = Layer('E')
> lyr.datasource = Shapefile(file='C:/shapefiles/E')
> lyr.styles.append('My Style')
> m.layers.append(lyr)
> m.zoom_to_box(lyr.envelope())
> render_to_file(m,'C:/shapefiles/E.png', 'png')


Note that you can also call `save_map()` to serialize all those styles
to XML, and then use `load_map()` to reload them.
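
A minimal sketch, assuming the Map `m` from your script above and a hypothetical output path:

from mapnik import save_map, load_map, Map

save_map(m, 'C:/shapefiles/E_style.xml')    # write layers + styles out as a Mapnik XML mapfile

m2 = Map(12567, 33784)
load_map(m2, 'C:/shapefiles/E_style.xml')   # rebuild an equivalent map from that XML
m2.zoom_to_box(m2.layers[0].envelope())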

> With that file size (12567,33784), we get a memory error. Is it  
> possible to increase the memory heap used by python in order to  
> generate this file?
>

I'm not sure, but good question. How much memory does your machine
have? While I don't think you want to try to render a single image
that large, I'm also keen to know whether Mapnik's rendering runs into
memory limits before hitting the system's own limits.

> If we reduce the file size (e.g. 8378 x 22523), the file is rendered,
> but due to some antialiasing effect the borders of the polygons are
> blurred. We attach a picture in which you can see this effect and
> another showing how we would like to see it, as rendered by MapInfo
> when loading the shapefile directly with a thematic map.

Okay, yes, that is an interesting rendering artifact that I've seen
before with Mapnik when only a PolygonSymbolizer is used. What happens
if you add a LineSymbolizer as well that matches the color of the
polygons? Filing a Trac ticket on this would be great!
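
For example, something like this on your first rule (just a sketch; the 0.5 px stroke width is a guess you may want to tune):

# Pair each fill with a thin outline of the same color so adjacent
# polygon edges overlap and the antialiased seams disappear.
a_rule = Rule()
a_rule.filter = Filter('[Value] > -87 and [Value] <= -78')
a_rule.symbols.append(PolygonSymbolizer(Color('#FF0000')))
a_rule.symbols.append(LineSymbolizer(Color('#FF0000'), 0.5))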

> Is it possible to disable antialiasing using mapnik?
>

No, but you could file a ticket on Trac to discuss this. I would
assume that the AGG rendering backend will always antialias, while the
Cairo rendering backend (currently only available in trunk) might have
some options for controlling the antialiasing level, but I've not
looked into it.

> Once we have the file generated, we want to generate tiles for  
> Google Maps from it either using gdal2tiles

Sure, this approach makes logical sense if you are going to ultimately
generate tiles with gdal2tiles. What you may need to do is render your
shapefile in at least 3-4 chunks and then use gdal_merge.py to combine
them before chopping them up into tiles with gdal2tiles.
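
A rough sketch of the chunking, reusing the `m` and `lyr` from your script (the strip count and output paths are arbitrary):

# Render the layer extent as 4 horizontal strips at the same m/px resolution.
env = lyr.envelope()
strips = 4
strip_height = (env.maxy - env.miny) / strips

for i in range(strips):
    miny = env.miny + i * strip_height
    m.resize(12567, 33784 // strips)
    m.zoom_to_box(Envelope(env.minx, miny, env.maxx, miny + strip_height))
    render_to_file(m, 'C:/shapefiles/E_strip_%d.png' % i, 'png')

Each strip would also need its own worldfile so that gdal_merge.py knows where to place it when mosaicking the strips back together.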

> or tilecache + mapnik.

If you go this route you can skip rendering the single GB+ image
entirely, since TileCache is smart enough to create all the individual
tiny tiles needed for web mapping directly from the vector file as
styled by Mapnik.
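
On the TileCache side that is just a layer entry in tilecache.cfg pointing at the XML you get from `save_map()` (the layer name and path here are hypothetical, and I'm assuming a TileCache recent enough to have the spherical_mercator option):

[E_polygons]
type=Mapnik
mapfile=/var/www/mapfiles/E_style.xml
spherical_mercator=true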

> In the first case, we need the PNG file to be georeferenced. Is it
> possible from mapnik to generate a world file for the PNG according
> to the extents and bounds of the source shapefile?
>

Yes, I've written a small Python program that will do this for you
called nik2img (http://code.google.com/p/mapnik-utils/wiki/Nik2Img).
Just feed nik2img the XML mapfile that you generate via `save_map()`,
along with the flag --worldfile .wld, and it should output a worldfile
suitable for GDAL. I've yet to test the worldfile generation across
various projections, so patches are welcome if you find subtle shifts.
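
If you would rather do it by hand, a worldfile is only six lines (pixel sizes, two rotation terms, and the coordinates of the center of the upper-left pixel), so you can write one straight from the map extent. A minimal sketch, assuming the `m` from your script after `zoom_to_box()`:

def write_worldfile(m, path):
    """Write a .wld for the map's current extent (no rotation assumed)."""
    env = m.envelope()
    xres = (env.maxx - env.minx) / m.width    # map units per pixel in x
    yres = (env.maxy - env.miny) / m.height   # map units per pixel in y
    f = open(path, 'w')
    f.write('%f\n0.0\n0.0\n-%f\n' % (xres, yres))
    f.write('%f\n%f\n' % (env.minx + xres / 2, env.maxy - yres / 2))
    f.close()

write_worldfile(m, 'C:/shapefiles/E.wld')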


> We haven't explored yet the tilecache + mapnik approach.

I would go this route myself, or look into using generate_tiles.py
(http://wiki.openstreetmap.org/index.php/Deploying_your_own_Slippy_Map).

> gdal2tiles provides an option to resample the source file in order
> to generate missing pixels for the desired zoom level, so we believe
> we can provide a smaller source file which we can generate with
> mapnik from the shapefile, and then resample to the desired zoom
> level without losing accuracy. Is it possible to make something
> similar with mapnik + tilecache?

I think you will want to use TileCache with a Mapnik XML stylesheet
that has min and max scale denominators, so that the tiles generated
at each zoom level are styled differently/appropriately.
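
In the Python API those are just the min_scale/max_scale attributes on a Rule, which `save_map()` writes out as MinScaleDenominator/MaxScaleDenominator in the XML. A sketch using your a_rule, with made-up scale values:

# Only draw the fine-grained polygons when reasonably zoomed in;
# the scale denominators here are illustrative, not tuned.
a_rule.max_scale = 500000   # skip this rule when zoomed out beyond ~1:500,000
a_rule.min_scale = 1000     # and when zoomed in closer than ~1:1,000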

Cheers,

Dane


>
>
> Thanks in advance.
>
> -- 
> Félix
>
> [attached: mapinfo.png, mapnik.png]

_______________________________________________
Mapnik-users mailing list
[email protected]
https://lists.berlios.de/mailman/listinfo/mapnik-users
