I can reproduce it in SkyJUMP.

I think this is the result of using
overrideWithExistingCompatibleDbfFieldDef in ShapeWriter.   My rationale
behind this method, introduced a few years ago (July 2010), is that the
size of the numbers entered during an edit session should not
automatically lead to a reduction of precision in the shapefile.  That
was the case before this procedure was introduced: I had shapefiles for
which there were published data standards, but SkyJUMP would change the
number of digits each time I saved the file.   The change achieved its
goal, but it apparently does not handle this particular case correctly
because the field type changes.
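To make the failure mode concrete: a .dbf header declares a fixed byte width for every field, and each record must be exactly 1 byte (deletion flag) plus the sum of the field widths. If the writer keeps the old DOUBLE field definition (width 33, as in Jukka's ogrinfo output below) while the records are written at a different width, the declared record length no longer matches the data, and readers run off the end of the file, hence the EOFException. A minimal sketch of that consistency check (hypothetical helper functions, not OpenJUMP code):

```python
# Hypothetical sketch, not OpenJUMP code: checks whether a .dbf file's
# size is consistent with the record length declared in its header.
import struct

def dbf_record_size(header: bytes) -> int:
    # Bytes 10-11 of the DBF header hold the record length
    # (1 deletion-flag byte + the sum of all field widths), little-endian.
    return struct.unpack_from("<H", header, 10)[0]

def dbf_size_consistent(header: bytes, num_records: int,
                        header_size: int, file_size: int) -> bool:
    # The data section must hold num_records records of the declared
    # size, followed by the 0x1A end-of-file marker.
    expected = header_size + num_records * dbf_record_size(header) + 1
    return file_size >= expected

# Example: a single-field header like the one in the report.
hdr = bytearray(32)
struct.pack_into("<H", hdr, 8, 65)   # header size: 32 + 32 (one field) + 1
struct.pack_into("<H", hdr, 10, 34)  # record size: 1 flag byte + 33-byte field
print(dbf_size_consistent(bytes(hdr), 1, 65, 90))   # too small: False
```

If records were written shorter than the header declares, a check like this fails, and a sequential reader such as GetDbfRec hits EOF partway through what it believes is the last record.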

I agree with Jukka.  It may be an unusual case, but the reliability of
shapefile IO is crucial to the JUMP family.  This must be fixed.  I will
work on it too.

BTW, the result is the same if you use "Save Dataset As" instead of
"Save Selected Datasets".

Larry

On Thu, Feb 28, 2013 at 7:00 AM, Rahkonen Jukka
<jukka.rahko...@mmmtike.fi> wrote:

> Hi,
>
> This is bad. I have tested only with r3277 but not with any older
> versions.  I can reproduce this:
>
> - Start OJ
> - Create a layer, add attribute "attr" of type DOUBLE
> - Make a point, set attr=1.0
> - Save as a shapefile
> - Delete layer, read in the saved shapefile, everything OK
> - Edit schema, change "attr" into type INTEGER
> - Do "Save selected dataset"
> - Shapefile is now corrupted
>
> OJ cannot open this saved shapefile. The error is
>
> java.io.EOFException
>         at java.io.DataInputStream.readFully(Unknown Source)
>         at java.io.DataInputStream.readFully(Unknown Source)
>         at com.vividsolutions.jump.io.EndianDataInputStream.readByteLEnum(EndianDataInputStream.java:75)
>         at org.geotools.dbffile.DbfFile.GetDbfRec(DbfFile.java:230)
>         at com.vividsolutions.jump.io.ShapefileReader.read(ShapefileReader.java:179)
>         at com.vividsolutions.jump.io.datasource.DelegatingCompressedFileHandler.read(DelegatingCompressedFileHandler.java:80)
>         at com.vividsolutions.jump.io.datasource.ReaderWriterFileDataSource$1.executeQuery(ReaderWriterFileDataSource.java:61)
>         at org.openjump.core.ui.io.file.DataSourceFileLayerLoader.open(DataSourceFileLayerLoader.java:107)
>         at org.openjump.core.ui.plugin.file.open.OpenFileWizard.run(OpenFileWizard.java:131)
>         at org.openjump.core.ui.plugin.AbstractWizardPlugin.run(AbstractWizardPlugin.java:73)
>         at com.vividsolutions.jump.workbench.ui.task.TaskMonitorManager$TaskWrapper.run(TaskMonitorManager.java:152)
>         at java.lang.Thread.run(Unknown Source)
>
> GDAL cannot open this shapefile either. It suggests that there is
> something wrong with the .dbf file.  From ogrinfo:
> Layer name: spoil
> Geometry: Point
> Feature Count: 1
> Extent: (280.000000, 127.000000) - (280.000000, 127.000000)
> Layer SRS WKT:
> (unknown)
> attr: Real (33.16)
> ERROR 1: fread(34) failed on DBF file.
>
> -Jukka Rahkonen-
>
>
> ------------------------------------------------------------------------------
> Everyone hates slow websites. So do we.
> Make your web apps faster with AppDynamics
> Download AppDynamics Lite for free today:
> http://p.sf.net/sfu/appdyn_d2d_feb
> _______________________________________________
> Jump-pilot-devel mailing list
> Jump-pilot-devel@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/jump-pilot-devel
>