Hi Larry,

Thanks a lot for your patch.
It took me some time to understand how it works.

I noticed that overrideWithExistingCompatibleDbfFieldDef is used
for AttributeType.INTEGER and AttributeType.DOUBLE,
but for DOUBLE I could not find a case where the overriding method
can change the dbf field.
It seems to me that, currently, the original number of decimals of a double
will never be preserved (I mean in the OpenJUMP code)?
What do you think?

Michaël

A more general solution than my last post is:

            if (columnType == AttributeType.INTEGER) {
                fields[f] = new DbfFieldDef(columnName, 'N', 11, 0);  //LDB: previously 16
                DbfFieldDef fromFile = overrideWithExistingCompatibleDbfFieldDef(fields[f], fieldMap);
                if (fromFile.fieldnumdec == 0)
                    fields[f] = fromFile;
                f++;
            } else if (columnType == AttributeType.DOUBLE) {
                fields[f] = new DbfFieldDef(columnName, 'N', 33, 16);
                DbfFieldDef fromFile = overrideWithExistingCompatibleDbfFieldDef(fields[f], fieldMap);
                if (fromFile.fieldnumdec > 0)
                    fields[f] = fromFile;
                f++;
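
In other words: a DOUBLE column whose on-disk definition still has
decimals keeps that definition on save (fieldnumdec > 0), while a
column that has been retyped from DOUBLE to INTEGER falls back to the
fresh N 11.0, because the old definition fails the fieldnumdec == 0
test.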

Larry

On Fri, Mar 1, 2013 at 10:38 AM, Larry Becker <becker.la...@gmail.com> wrote:

    I have a fix that works in SkyJUMP and should work in OpenJUMP. It
    isn't elegant, but it works by avoiding calling
    overrideWithExistingCompatibleDbfFieldDef when a type change is
    detected.  There may be a more direct solution, but I didn't find
    one.

    The patch file for SkyJUMP is:

    ### Eclipse Workspace Patch 1.0
    #P SkyJumpSVN
    Index: com/vividsolutions/jump/io/ShapefileWriter.java
    ===================================================================
    --- com/vividsolutions/jump/io/ShapefileWriter.java (revision 2)
    +++ com/vividsolutions/jump/io/ShapefileWriter.java (working copy)
    @@ -407,11 +407,13 @@

                 if (columnType == AttributeType.INTEGER) {
                     fields[f] = new DbfFieldDef(columnName, 'N', 11, 0);  //LDB: previously 16
    -                fields[f] = overrideWithExistingCompatibleDbfFieldDef(fields[f], fieldMap);
    +                if ((fieldMap != null) && !fieldMap.toString().endsWith("N 33.16}"))
    +                    fields[f] = overrideWithExistingCompatibleDbfFieldDef(fields[f], fieldMap);
                     f++;
                 } else if (columnType == AttributeType.DOUBLE) {
                     fields[f] = new DbfFieldDef(columnName, 'N', 33, 16);
    -                fields[f] = overrideWithExistingCompatibleDbfFieldDef(fields[f], fieldMap);
    +                if ((fieldMap != null) && !fieldMap.toString().endsWith("N 11.0}"))
    +                    fields[f] = overrideWithExistingCompatibleDbfFieldDef(fields[f], fieldMap);
                     f++;
                 } else if (columnType == AttributeType.STRING) {
                     int maxlength = findMaxStringLength(featureCollection, t);

    Larry


    On Fri, Mar 1, 2013 at 9:09 AM, Larry Becker
    <becker.la...@gmail.com <mailto:becker.la...@gmail.com>> wrote:

        I can reproduce it in SkyJUMP.

        I think this is the result of using
        overrideWithExistingCompatibleDbfFieldDef in ShapefileWriter.  My
        rationale behind this method, introduced a few years ago (July
        2010), is that the size of the numbers entered during an edit
        session should not automatically lead to a reduction of
        precision in the shapefile.  That was the case before this
        method was introduced: I had shapefiles for which there
        were published data standards, but SkyJUMP would change the
        number of digits each time I saved the file.  The change
        achieved its goal, but it may not work correctly for this
        particular case due to the type change.
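
        For reference, the intent of the method is roughly the
        following (just a sketch of the idea, not the actual
        SkyJUMP/OpenJUMP source; the fieldMap lookup and the
        DbfFieldDef member names here are from memory and may not
        match the source exactly):

            // Sketch: prefer the field definition already stored in the .dbf on
            // disk when it is compatible with the freshly computed one, so that
            // re-saving does not silently shrink the declared width or decimals.
            DbfFieldDef overrideWithExistingCompatibleDbfFieldDef(DbfFieldDef fresh, Map fieldMap) {
                if (fieldMap == null) return fresh;   // no pre-existing .dbf to honour
                DbfFieldDef existing =
                    (DbfFieldDef) fieldMap.get(fresh.fieldname.toString().trim());
                // "compatible" is taken here to mean the same dBase type character ('N', 'C', ...)
                if (existing != null && existing.fieldtype == fresh.fieldtype) {
                    return existing;   // keep the published width/precision from the file
                }
                return fresh;
            }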

        I agree with Jukka.  It may be an unusual case, but the
        reliability of shapefile IO is crucial to the JUMP family.
        This must be fixed.  I will work on it too.

        BTW, the result is the same if you Save Dataset As instead of
        Save Selected Datasets.

        Larry


        On Thu, Feb 28, 2013 at 7:00 AM, Rahkonen Jukka
        <jukka.rahko...@mmmtike.fi> wrote:

            Hi,

            This is bad. I have tested only with r3277 but not with
            any older versions.  I can reproduce this:

            - Start OJ
            - Create a layer, add attribute "attr" of type DOUBLE
            - Make a point, set attr=1.0
            - Save as a shapefile
            - Delete layer, read in the saved shapefile, everything OK
            - Edit schema, change "attr" into type INTEGER
            - Do "Save selected dataset"
            - Shapefile is now corrupted

            OJ cannot open this saved shapefile. The error is

            java.io.EOFException
                    at java.io.DataInputStream.readFully(Unknown Source)
                    at java.io.DataInputStream.readFully(Unknown Source)
                    at com.vividsolutions.jump.io.EndianDataInputStream.readByteLEnum(EndianDataInputStream.java:75)
                    at org.geotools.dbffile.DbfFile.GetDbfRec(DbfFile.java:230)
                    at com.vividsolutions.jump.io.ShapefileReader.read(ShapefileReader.java:179)
                    at com.vividsolutions.jump.io.datasource.DelegatingCompressedFileHandler.read(DelegatingCompressedFileHandler.java:80)
                    at com.vividsolutions.jump.io.datasource.ReaderWriterFileDataSource$1.executeQuery(ReaderWriterFileDataSource.java:61)
                    at org.openjump.core.ui.io.file.DataSourceFileLayerLoader.open(DataSourceFileLayerLoader.java:107)
                    at org.openjump.core.ui.plugin.file.open.OpenFileWizard.run(OpenFileWizard.java:131)
                    at org.openjump.core.ui.plugin.AbstractWizardPlugin.run(AbstractWizardPlugin.java:73)
                    at com.vividsolutions.jump.workbench.ui.task.TaskMonitorManager$TaskWrapper.run(TaskMonitorManager.java:152)
                    at java.lang.Thread.run(Unknown Source)

            GDAL cannot open this shapefile either. It suggests that
            there is something wrong with the .dbf file.  From ogrinfo:
            Layer name: spoil
            Geometry: Point
            Feature Count: 1
            Extent: (280.000000, 127.000000) - (280.000000, 127.000000)
            Layer SRS WKT:
            (unknown)
            attr: Real (33.16)
            ERROR 1: fread(34) failed on DBF file.
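
            (For a single field declared as N 33.16 the dBase record should
            be 1 deletion-flag byte + 33 field bytes = 34 bytes, which is
            the fread(34) GDAL attempts; the failure presumably means the
            record data actually written is shorter than the width promised
            in the header.)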

            -Jukka Rahkonen-

            