Re: [JPP-Devel] Manage new attribute types BOOLEAN and LONG in jml and shp drivers

2015-03-16 Thread Rahkonen Jukka (MML)
Hi Michaël,


My test file can now be successfully saved to and read back from shapefile and JML.

I also made a test with the GDAL dev version using the created shapefile:


ogrinfo -so -al datatype_test.shp
INFO: Open of `datatype_test.shp'
  using driver `ESRI Shapefile' successful.

Layer name: datatype_test
Geometry: Point
Feature Count: 1
Extent: (310.00, 406.00) - (310.00, 406.00)
Layer SRS WKT:
(unknown)
string_att: String (6.0)
char_attri: String (14.0)
varchar_at: String (17.0)
longvarcha: String (21.0)
text_attri: String (14.0)
boolean_at: String (1.0)
bit_attrib: String (1.0)
smallint_a: Integer64 (11.0)
tinyint_at: Integer64 (11.0)
integer_at: Integer64 (11.0)
long_attri: Real (21.0)
bigint_att: Real (21.0)
decimal_at: Real (33.16)
numeric_at: Real (33.16)
bigdecimal: Real (33.16)
float_attr: Real (33.16)
double_att: Real (33.16)
real_attri: Real (33.16)
date_attri: Date (10.0)
time_attri: Date (10.0)
timestamp_: Date (10.0)
object_att: String (1.0)


Notice that all the short integers are interpreted as long integers (Integer64)
and the long ones as Reals. Perhaps you should consider making the numbers a
little bit shorter? According to this ticket

http://trac.osgeo.org/gdal/ticket/3615 even a number with 10 digits (10.0) can
be too big for an Integer. I suppose that the biggest Integer is 4,294,967,295.
And numbers with 20 digits can be bigger than the largest Long integer,
18,446,744,073,709,551,615.
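The limits above can be checked directly in Java. Note that Java's int and long are signed, so their maxima are 2,147,483,647 and 9,223,372,036,854,775,807; the figures quoted above are the unsigned 32- and 64-bit maxima. A small sketch:

```java
public class IntLimits {
    public static void main(String[] args) {
        // Java's signed 32-bit and 64-bit integer maxima
        System.out.println(Integer.MAX_VALUE); // 2147483647 (10 digits)
        System.out.println(Long.MAX_VALUE);    // 9223372036854775807 (19 digits)

        // A 10-digit dbf number such as 4294967295 already overflows int,
        // so a field declared as N(10) cannot safely be read back as Integer
        long tenDigits = 4294967295L;
        System.out.println(tenDigits > Integer.MAX_VALUE); // true
    }
}
```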


The GDAL shp driver

http://www.gdal.org/drv_shapefile.html behaves this way when it creates
numeric fields:

  *   Integer fields without an explicit width are treated as width 9, and extended to 10 or 11 if needed.

  *   Integer64 fields without an explicit width are treated as width 18, and extended to 19 or 20 if needed.

I made some tests on what GDAL does when reading. It appears to report
numbers only up to (9.0) as Integers and up to (18.0) as Long Integers.


I wonder if it would be better to change the shapefile writer of OpenJUMP to
create Integer fields as (9.0) by default and change the format to (10.0) only
if the integer field contains values between 1,000,000,000 and 4,294,967,295.
Values bigger than the upper limit should not be accepted into an integer field
because they are invalid everywhere.


There seems to be another issue with Long integers in shapefiles. Long
integers can need up to 20 digits, but the standard dBase format has an
18-digit limit for numbers

http://www.clicketyclick.dk/databases/xbase/format/data_types.html

Some versions have extended that to 20 digits. For best interoperability I
think that OpenJUMP should create the Long fields as (18.0) by default and
(19.0) or (20.0) only if needed.
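A minimal sketch of the width rules proposed above (the class and method names are illustrative, not OpenJUMP's actual API):

```java
public class DbfNumericWidth {

    // Proposed Integer widths: N(9) by default (GDAL reads N(<=9) back as
    // Integer), widened to 10 only when the data contains 10-digit values.
    static int integerFieldWidth(int maxDigits) {
        return maxDigits <= 9 ? 9 : 10;
    }

    // Proposed Long widths: N(18) by default (GDAL reads N(<=18) back as
    // Integer64), widened to 19 or 20 only if the data needs it.
    static int longFieldWidth(int maxDigits) {
        return maxDigits <= 18 ? 18 : Math.min(maxDigits, 20);
    }

    public static void main(String[] args) {
        System.out.println(integerFieldWidth(7));  // 9
        System.out.println(integerFieldWidth(10)); // 10
        System.out.println(longFieldWidth(19));    // 19
    }
}
```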


-Jukka Rahkonen-





Michaël Michaud  wrote:

 Hi Jukka,

 Thank you for the test and sorry for the exceptions.
I just completed with BIGINT, TIME and NUMERIC.
Shapefile driver will not really handle all these types. I've just handled
long and boolean in a specific way. Other types are just mapped to old
types.

 This is how the new types are supposed to be converted to dbf and then back to
 OpenJUMP:

CHAR, VARCHAR, LONGVARCHAR, TEXT, STRING, OBJECT  -> C        -> STRING
FLOAT, DOUBLE, REAL, NUMERIC, DECIMAL, BIGDECIMAL -> N(33,16) -> DOUBLE
TINYINT, SMALLINT, INTEGER                        -> N(11)    -> INTEGER
LONG, BIGINT                                      -> N(21)    -> LONG
DATE, TIME, TIMESTAMP                             -> D        -> DATE
BOOLEAN, BIT                                      -> L        -> BOOLEAN
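The mapping table above could be sketched as a simple lookup (illustrative only; the actual ShapefileWriter branches on AttributeType constants rather than strings):

```java
public class DbfTypeMapping {
    // dbf field type character for each OpenJUMP attribute type,
    // following the conversion table in the mail above
    static char dbfTypeFor(String attributeType) {
        switch (attributeType) {
            case "CHAR": case "VARCHAR": case "LONGVARCHAR":
            case "TEXT": case "STRING": case "OBJECT":
                return 'C'; // character field
            case "FLOAT": case "DOUBLE": case "REAL":
            case "NUMERIC": case "DECIMAL": case "BIGDECIMAL":
            case "TINYINT": case "SMALLINT": case "INTEGER":
            case "LONG": case "BIGINT":
                return 'N'; // numeric field (width/decimals vary by type)
            case "DATE": case "TIME": case "TIMESTAMP":
                return 'D'; // date field
            case "BOOLEAN": case "BIT":
                return 'L'; // logical field
            default:
                throw new IllegalArgumentException(attributeType);
        }
    }

    public static void main(String[] args) {
        System.out.println(dbfTypeFor("BIGINT"));  // N
        System.out.println(dbfTypeFor("BOOLEAN")); // L
    }
}
```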

The only data types that I've sometimes missed are boolean and long.
That's why I tried to map them in a way that preserves type information when
you save to dbf and back.
For the other data types, my main concern is just to make the drivers
compatible with the UI.

Michaël

On 15/03/2015 18:26, Rahkonen Jukka (MML) wrote:

Hi,


I made a test file with one point and one attribute of each selectable data
type. However, OpenJUMP is not totally ready for handling all of them.


Saving the JML file as a shapefile stops with the following error:


java.lang.Exception: ShapefileWriter: unsupported AttributeType found in featurecollection. : BIGINT
at com.vividsolutions.jump.io.ShapefileWriter.writeDbf(ShapefileWriter.java:537)
at com.vividsolutions.jump.io.ShapefileWriter.write(ShapefileWriter.java:292)
at com.vividsolutions.jump.io.datasource.ReaderWriterFileDataSource$1.executeUpdate(ReaderWriterFileDataSource.java:73)
at com.vividsolutions.jump.workbench.datasource.AbstractSaveDatasetAsPlugIn.run(AbstractSaveDatasetAsPlugIn.java:28)
at com.vividsolutions.jump.workbench.ui.task.TaskMonitorManager$TaskWrapper.run(TaskMonitorManager.java:152)
at java.lang.Thread.run(Unknown Source)


Next I tried to edit the schema and remove the BIGINT attribute, but it was not
so easy. Changes in the schema can be confirmed only after first removing all
attributes of the following data types:


CHAR

VARCHAR


Re: [JPP-Devel] Manage new attribute types BOOLEAN and LONG in jml and shp drivers

2015-03-16 Thread Michaël Michaud

Jukka,

OK, I tried to implement it more or less as described in your previous mail.

One other drawback of this change is that all previous shapefiles
containing integers saved as N(11,0) will now be read back as longs.


Let me know if anyone thinks this can be a problem,

Michaël

[JPP-Devel] SVN: [4341] core/trunk/src

2015-03-16 Thread jump-pilot-svn
Revision: 4341
  http://sourceforge.net/p/jump-pilot/code/4341
Author:   michaudm
Date: 2015-03-16 23:03:34 + (Mon, 16 Mar 2015)
Log Message:
---
Change integer field length in dbf file format (length = 3, 6, 9, 12, 15, 18, ...
according to real data length). Integers with more than 9 digits are interpreted
as long.
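The "steps of 3" idea from the log message could be sketched as the following helper (a hypothetical approximation; the committed code, shown in the diff below, branches explicitly up to 9 for integers and up to 18 for longs, then falls back to the exact data length):

```java
public class SteppedWidth {
    // Width grows in steps of 3 (3, 6, 9, 12, 15, 18, ...) to cover
    // the longest value actually present in the data.
    static int steppedWidth(int maxDigits) {
        if (maxDigits <= 0) return 3;
        return ((maxDigits + 2) / 3) * 3; // round up to next multiple of 3
    }

    public static void main(String[] args) {
        System.out.println(steppedWidth(5));  // 6
        System.out.println(steppedWidth(10)); // 12
    }
}
```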

Modified Paths:
--
core/trunk/src/com/vividsolutions/jump/io/ShapefileWriter.java
core/trunk/src/com/vividsolutions/jump/workbench/ui/AttributeTablePanel.java
core/trunk/src/org/geotools/dbffile/DbfFile.java

Modified: core/trunk/src/com/vividsolutions/jump/io/ShapefileWriter.java
===
--- core/trunk/src/com/vividsolutions/jump/io/ShapefileWriter.java	2015-03-15 22:10:27 UTC (rev 4340)
+++ core/trunk/src/com/vividsolutions/jump/io/ShapefileWriter.java	2015-03-16 23:03:34 UTC (rev 4341)
@@ -437,18 +437,46 @@
         if (columnType == AttributeType.INTEGER ||
             columnType == AttributeType.SMALLINT ||
             columnType == AttributeType.TINYINT) {
-            fields[f] = new DbfFieldDef(columnName, 'N', 11, 0);  //LDB: previously 16
+            int maxlength = findMaxStringLength(featureCollection, t);
+            if (maxlength <= 3) fields[f] = new DbfFieldDef(columnName, 'N', 3, 0);
+            else if (maxlength <= 6) fields[f] = new DbfFieldDef(columnName, 'N', 6, 0);
+            else if (maxlength <= 9) fields[f] = new DbfFieldDef(columnName, 'N', 9, 0);
+            else fields[f] = new DbfFieldDef(columnName, 'N', maxlength, 0);
             DbfFieldDef fromFile = overrideWithExistingCompatibleDbfFieldDef(fields[f], fieldMap);
             if (fromFile.fieldnumdec == 0)
                 fields[f] = fromFile;
             f++;
-        } else if (columnType == AttributeType.LONG || columnType == AttributeType.BIGINT) {
-            fields[f] = new DbfFieldDef(columnName, 'N', 21, 0);
+        }
+
+        else if (columnType == AttributeType.LONG ||
+                 columnType == AttributeType.BIGINT) {
+            int maxlength = findMaxStringLength(featureCollection, t);
+            if (maxlength <= 12) fields[f] = new DbfFieldDef(columnName, 'N', 12, 0);
+            else if (maxlength <= 15) fields[f] = new DbfFieldDef(columnName, 'N', 15, 0);
+            else if (maxlength <= 18) fields[f] = new DbfFieldDef(columnName, 'N', 18, 0);
+            else fields[f] = new DbfFieldDef(columnName, 'N', maxlength, 0);
             DbfFieldDef fromFile = overrideWithExistingCompatibleDbfFieldDef(fields[f], fieldMap);
             if (fromFile.fieldnumdec == 0)
                 fields[f] = fromFile;
             f++;
-        } else if (columnType == AttributeType.DOUBLE ||
+        }
+
+        //if (columnType == AttributeType.INTEGER ||
+        //        columnType == AttributeType.SMALLINT ||
+        //        columnType == AttributeType.TINYINT) {
+        //    fields[f] = new DbfFieldDef(columnName, 'N', 11, 0);  //LDB: previously 16
+        //    DbfFieldDef fromFile = overrideWithExistingCompatibleDbfFieldDef(fields[f], fieldMap);
+        //    if (fromFile.fieldnumdec == 0)
+        //        fields[f] = fromFile;
+        //    f++;
+        //} else if (columnType == AttributeType.LONG || columnType == AttributeType.BIGINT) {
+        //    fields[f] = new DbfFieldDef(columnName, 'N', 21, 0);
+        //    DbfFieldDef fromFile = overrideWithExistingCompatibleDbfFieldDef(fields[f], fieldMap);
+        //    if (fromFile.fieldnumdec == 0)
+        //        fields[f] = fromFile;
+        //    f++;
+        //}
+        else if (columnType == AttributeType.DOUBLE ||
                 columnType == AttributeType.REAL ||
                 columnType == AttributeType.FLOAT ||
                 columnType == AttributeType.NUMERIC ||
@@ -706,6 +734,7 @@
 return Math.max(1, maxlen); //LDB: don't allow zero length strings
 }
 
+
 /**
  * Find the generic geometry type of the feature collection.
  * Simple method - find the 1st non null geometry and its type

Modified: 
core/trunk/src/com/vividsolutions/jump/workbench/ui/AttributeTablePanel.java
===
--- core/trunk/src/com/vividsolutions/jump/workbench/ui/AttributeTablePanel.java	2015-03-15 22:10:27 UTC (rev 4340)
+++ core/trunk/src/com/vividsolutions/jump/workbench/ui/AttributeTablePanel.java	2015-03-16 23:03:34 UTC (rev 4341)
@@ -38,6 +38,9 @@
 import java.awt.event.MouseEvent;
 import java.awt.event.MouseMotionAdapter;
 import java.awt.image.BufferedImage;
+import java.math.BigDecimal;
+import java.sql.Time;
+import java.sql.Timestamp;
 import java.text.DateFormat;
 import 

Re: [JPP-Devel] Question about loading an image on OJ workbench via Layer.class

2015-03-16 Thread edgar . soldin
you mean like the other image framework beside pirol/sextante
 com.vividsolutions.jump.workbench.imagery.ReferencedImagesLayer
?

..ede

On 16.03.2015 20:34, Giuseppe Aruta wrote:
 Hi all,
 I want to build a plugin that programmatically loads an image file into the OJ
 workbench through Layer.class (not RasterImageLayer.class), for instance a
 layer whose path I already know (e.g. D:/temp/test.tif, or jpg, or png).
 I am working on a simple OJ affine warp transformation plugin for
 images.
 Do we have some defined methods (like those in RasterImageIO.class or
 GridFloat.class for Sextante layers)? Or is there an OJ class that could be
 used as a sample?
 Thanks for the reply.
 Peppe
 
 
 
 --
 Dive into the World of Parallel Programming The Go Parallel Website, sponsored
 by Intel and developed in partnership with Slashdot Media, is your hub for all
 things parallel software development, from weekly thought leadership blogs to
 news, videos, case studies, tutorials and more. Take a look and join the 
 conversation now. http://goparallel.sourceforge.net/
 
 
 
 ___
 Jump-pilot-devel mailing list
 Jump-pilot-devel@lists.sourceforge.net
 https://lists.sourceforge.net/lists/listinfo/jump-pilot-devel
 
