Hi Martin,

On Wed, Feb 6, 2013 at 3:38 AM, Larry Shaffer <[email protected]> wrote:
> Hi Martin,
>
> On Wed, Feb 6, 2013 at 2:53 AM, Martin Dobias <[email protected]> wrote:
>
>> On Wed, Feb 6, 2013 at 10:20 AM, Larry Shaffer <[email protected]> wrote:
>>> Thanks for the explanation. Figured it wouldn't be that simple. So, is it
>>> worth the effort to fix this (the 1.8 to 2.0 index issue)? I think many
>>> users will find manually updating their data-defined mappings, for every
>>> layer in a project that has them, very annoying. This has the potential
>>> to become a very large time-sucking problem for many users.
>>
>> As far as I know, it should be a problem only in the new data-defined
>> labeling - field names are used in other places.
>
> Yes, they should definitely be moved to field-name-based data-defined
> mappings. Even doing that, though, will require a means of auto-conversion
> (as with the API change) to properly map the old index value to its field
> name, only once, to migrate over.
>
>>> Could a lightweight QHash<old index, field name> be populated inline with
>>> the current building of the attribute vectors, only for those providers
>>> that would differ from before? The only-used-for-reads QHash could be
>>> used to create an auto-update routine for data-defined mappings. Is
>>> something like that reasonable to implement? Is there another approach,
>>> or any at all?
>>
>> I think the real problem is only with PostGIS layers - if I'm not
>> wrong, the holes in 1.x appear when the geometry column is not at
>> the end of the list of fields, so for a layer with columns geom,
>> attr1, attr2 you would get attribute indices 1, 2. As you can see,
>> implementing such a mapping would depend on the table structure. The
>> problem is that the stored indices get broken even with a change of
>> table structure - so even if we implemented some backward
>> compatibility for PostGIS (that would take table structure into
>> account), it might still cause problems.
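Martin's hole scenario above can be sketched in plain Python, no QGIS
required (attribute_indices is just an illustrative helper, not any actual
QGIS API):

```python
# Sketch of how 1.x attribute indices get "holes" when the geometry column
# is not last: column positions are kept as indices, but the geometry column
# is skipped, so (geom, attr1, attr2) yields attribute indices 1 and 2.

def attribute_indices(columns, geom_column):
    """Map 1.x-style attribute index -> field name, skipping the geometry column."""
    return {i: name for i, name in enumerate(columns) if name != geom_column}

# Geometry first: indices start at 1 (a "hole" at 0)
print(attribute_indices(["geom", "attr1", "attr2"], "geom"))
# -> {1: 'attr1', 2: 'attr2'}

# Geometry last: contiguous indices, no hole
print(attribute_indices(["attr1", "attr2", "geom"], "geom"))
# -> {0: 'attr1', 1: 'attr2'}
```

So any stored index from a 1.x project file is only trustworthy if the
provider never produced such holes for that table layout.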
> I understand, as this is the case with most of the labeled layers in my
> PostGIS projects. Many were created from shapefile imports, then the
> specialized labeling fields were added. The geometry column is basically
> never the last one. All of those layers are at least partially messed up,
> data-defined-wise.
>
> My thought is to use that QHash in
> QgsPalLayerSettings::_readDataDefinedProperty to convert the stored field
> index property to its field name when the value is found to be an integer
> (indicating an old-style stored index). I would also save the new field
> name mapping in a different property built off of the enum's literal
> string value, e.g.
>
>   key="/labeling/dataDefined/Bold" value="label_bold"
>
> instead of
>
>   key="labeling/dataDefinedProperty1" value="9"
>
> erasing the old index-based property on successful write of the new one.
> If the QHash is empty, then the currently stored index value can be
> 'trusted' to map to the proper field name, i.e. the provider never had
> holes to begin with. Does that make sense to you? Any ideas on adding such
> a backward-compatibility QHash to the base data provider and implementing
> it in just the postgres provider (unless there are others that had QMap
> hole issues)?

I implemented this in a very simple manner in a feature branch [0], which
is also available as a patch [1]. Using the following in the PyQGIS
console, with a PostGIS layer selected:

  pv = iface.activeLayer().dataProvider()
  pv.palAttributeIndexNames()

returns, for my test layer:

  {0: PyQt4.QtCore.QString(u'id'),
   2: PyQt4.QtCore.QString(u'Label_Text'),
   3: PyQt4.QtCore.QString(u'Label_FontSize'),
   ... }

which matches the previous index-to-name mapping (the geom column is index
#1) and can be used to fix the issue with QgsPalLabeling, make an
auto-updating function, and move to field-name mappings. If run on a
non-PostGIS layer, it returns an empty QHash, implying the fix is not
needed for that provider.
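For clarity, the one-time conversion I have in mind could look roughly like
this plain-Python sketch (convert_data_defined and its arguments are
hypothetical names, not the actual QGIS API; the dict stands in for the
provider's backward-compatibility QHash<int, QString>):

```python
# Sketch of the proposed migration logic in
# QgsPalLayerSettings::_readDataDefinedProperty: a stored value that is an
# integer is an old-style index; if the provider supplies a non-empty
# index-to-name hash (PostGIS with holes), look the name up there, otherwise
# trust the index against the current field list.

def convert_data_defined(stored_value, old_index_names, current_fields):
    """Return the field name for a stored data-defined property value."""
    if isinstance(stored_value, str) and not stored_value.isdigit():
        return stored_value            # already migrated to a field name
    index = int(stored_value)
    if old_index_names:                # provider had index holes (e.g. PostGIS)
        return old_index_names.get(index)
    return current_fields[index]       # no holes: index maps directly

# Old PostGIS project: geom column was index 1, so stored index 2 -> 'Label_Text'
old_map = {0: "id", 2: "Label_Text", 3: "Label_FontSize"}
print(convert_data_defined("2", old_map, ["id", "Label_Text", "Label_FontSize"]))
# -> Label_Text
```

After a successful lookup, the new name-based property would be written and
the old index-based property erased, so the hash is only ever consulted on
the first open of a <2.0 project file.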
However, I don't want to mess with data providers without sound advice
from those in the know. :^) Please let me know whether such an approach is
a reasonable solution, whether I'm missing something, or if you have a
better idea.

[0] https://github.com/dakcarto/Quantum-GIS/tree/labeling_vector-api-fix_1 ,
    https://github.com/dakcarto/Quantum-GIS/commit/a8bda0575db5783f53f44054a3a522953c9ffbcf
[1] http://drive.dakotacarto.com/qgis/qgsdataprovider-qgspallabeling_patch.diff

Regards, and thanks for your help,

Larry

> Of course, regardless of adding an auto-conversion, if the user adjusts
> the table structure before the conversion is done there will be messed-up
> field mappings, just like before. Since the conversion should only happen
> once, on the first opening of the <2.0 project file (the only time the
> QHash is used), there is a high likelihood the auto-conversion will work
> transparently with no issues. Then, the QHash fix can be deprecated
> sometime in the future.
>
> Also, moving to field-name-based data-defined mappings should allow join
> tables to function better for such mappings.
>
> Regards,
>
> Larry
>
>> Martin
_______________________________________________
Qgis-developer mailing list
[email protected]
http://lists.osgeo.org/mailman/listinfo/qgis-developer
