Ryan Ripken wrote:
> I'm running into some trouble writing features I get from a wfs server
> to a local shapefile.
>
> I've read through the csv2shp and the query lab example pages.
>
> I have a collection of features from:
>
> FeatureCollection<SimpleFeatureType, SimpleFeature> features =
> dataStore.getFeatureSource( dataStore.getTypeNames()[0] ).getFeatures(
> query );
>
> I build a ShapefileDataStore and createSchema using the schema I get
> from the dataStore.getSchema( dataStore.getTypeNames()[0] )
>
> when I add my FeatureCollection to the ShapefileDataStore I get an
> IndexOutOfBoundsException.
>
> org.geotools.data.DataSourceException: Could not create MapunitPoly out
> of provided feature: 459154
> at
> org.geotools.data.AbstractFeatureStore.addFeatures(AbstractFeatureStore.java:266)
> at WFSExample4.writeToShpFile(WFSExample4.java:206)
> at WFSExample4.dataAccess(WFSExample4.java:172)
> at WFSExample4.main(WFSExample4.java:73)
> Caused by: java.lang.IndexOutOfBoundsException: Index: 1, Size: 1
> at java.util.ArrayList.RangeCheck(Unknown Source)
> at java.util.ArrayList.get(Unknown Source)
> at
> org.geotools.feature.simple.SimpleFeatureImpl.setAttributes(SimpleFeatureImpl.java:203)
> at
> org.geotools.data.AbstractFeatureStore.addFeatures(AbstractFeatureStore.java:264)
> ... 3 more
>
>
> I've found that if I iterate through the FeatureCollection, call
> getType on one of the individual features, and then use that to set
> the schema on the ShapefileDataStore, I can then store the features.
> I've verified that they do indeed go out to disk and the shapes look
> right, but all the attribute names are null and the additional
> attributes aren't present.
>
> Why doesn't the schema in the individual features match the schema
> reported from the wfs dataStore?
I don't know about this specific case, but in general you cannot expect
a createSchema call to create a schema that's exactly equal to the one
you provided: each storage format has specific limitations that
createSchema tries to work around in some way.
Examples:
- shapefile data stores can have just one geometry, and it has to be
the first attribute in the feature type
- Oracle uppercases all attribute names, whether you like it or not
So in order to actually do a datastore to datastore conversion you
have to:
- get and set attributes using their names instead of their positions
- in some cases, know how the attribute names have been changed
to follow the specific store's limitations
The latter is unfortunately a kind of knowledge that is not exposed
in any way by the data stores, you just have to know it.
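For your WFS to shapefile direction, something along these lines should
work (untested sketch, the variable names are made up; imports are the
same as in the script further down, plus org.geotools.feature.FeatureCollection).
Two shapefile specific gotchas: the format does not store the geometry
attribute name, so on read-back the geometry is called "the_geom", and
dbf column names are limited to 10 characters.
----------------------------------------------------------------
// Untested sketch: write a WFS FeatureCollection into a shapefile store,
// copying attributes by name against the schema the shapefile reports.
// "features" and "shpStore" are placeholders for your own variables.
void copyToShapefile(FeatureCollection<SimpleFeatureType, SimpleFeature> features,
        ShapefileDataStore shpStore) throws IOException {
    String typeName = shpStore.getTypeNames()[0];
    SimpleFeatureType targetSchema = shpStore.getSchema(typeName);
    FeatureStore target = (FeatureStore) shpStore.getFeatureSource(typeName);
    SimpleFeatureBuilder builder = new SimpleFeatureBuilder(targetSchema);

    FeatureIterator<SimpleFeature> fi = features.features();
    try {
        while (fi.hasNext()) {
            SimpleFeature source = fi.next();
            for (AttributeDescriptor ad : targetSchema.getAttributeDescriptors()) {
                if (ad.equals(targetSchema.getGeometryDescriptor())) {
                    // the shapefile does not keep the geometry attribute name,
                    // so set the default geometry explicitly
                    builder.set(ad.getLocalName(), source.getDefaultGeometry());
                } else {
                    // names longer than 10 chars get truncated by the dbf; those
                    // won't match the source attribute and will come out null
                    builder.set(ad.getLocalName(), source.getAttribute(ad.getLocalName()));
                }
            }
            target.addFeatures(DataUtilities.collection(builder.buildFeature(null)));
        }
    } finally {
        fi.close();
    }
}
----------------------------------------------------------------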
Here is a little script that I use to import shapefiles into
Oracle, it might be useful:
----------------------------------------------------------------
import java.io.File;
import java.io.IOException;
import java.io.Serializable;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

import org.geotools.data.DataStore;
import org.geotools.data.DataUtilities;
import org.geotools.data.DefaultTransaction;
import org.geotools.data.FeatureStore;
import org.geotools.data.Transaction;
import org.geotools.data.oracle.OracleNGDataStoreFactory;
import org.geotools.data.shapefile.ShapefileDataStore;
import org.geotools.feature.FeatureIterator;
import org.geotools.feature.simple.SimpleFeatureBuilder;
import org.geotools.jdbc.JDBCDataStoreFactory;
import org.opengis.feature.simple.SimpleFeature;
import org.opengis.feature.simple.SimpleFeatureType;
import org.opengis.feature.type.AttributeDescriptor;
import org.opengis.filter.Filter;

public class OracleImporter {

    public static void main(String[] args) throws IOException {
        // source shapefile
        String path = "/home/aaime/devel/gs2.0.x/data/release/data/shapefiles/states.shp";
        ShapefileDataStore shp = new ShapefileDataStore(new File(path).toURL());

        // connect to Oracle
        Map<Serializable, Object> params = new HashMap<Serializable, Object>();
        params.put(JDBCDataStoreFactory.USER.key, "usr");
        params.put(JDBCDataStoreFactory.PASSWD.key, "pwd");
        params.put(JDBCDataStoreFactory.HOST.key, "localhost");
        params.put(JDBCDataStoreFactory.PORT.key, OracleNGDataStoreFactory.PORT.sample);
        params.put(JDBCDataStoreFactory.DATABASE.key, "xe");
        params.put(JDBCDataStoreFactory.DBTYPE.key, "oracle");
        DataStore oracle = new OracleNGDataStoreFactory().createDataStore(params);
        if (oracle != null && oracle.getTypeNames() != null)
            System.out.println("Oracle connected");

        // create the target table if it's not there yet (Oracle uppercases the name)
        String typeName = shp.getTypeNames()[0].toUpperCase();
        if (!Arrays.asList(oracle.getTypeNames()).contains(typeName))
            oracle.createSchema(shp.getSchema());

        FeatureStore oraStore = (FeatureStore) oracle.getFeatureSource(typeName);
        oraStore.removeFeatures(Filter.INCLUDE);

        // build target features against the schema Oracle actually created,
        // copying attributes by name (uppercased on the Oracle side)
        SimpleFeatureType targetSchema = (SimpleFeatureType) oraStore.getSchema();
        SimpleFeatureBuilder builder = new SimpleFeatureBuilder(targetSchema);
        FeatureIterator fi = shp.getFeatureSource().getFeatures().features();
        SimpleFeatureType sourceSchema = shp.getSchema();
        Transaction t = new DefaultTransaction();
        oraStore.setTransaction(t);
        try {
            while (fi.hasNext()) {
                SimpleFeature source = (SimpleFeature) fi.next();
                for (AttributeDescriptor ad : sourceSchema.getAttributeDescriptors()) {
                    String attribute = ad.getLocalName();
                    builder.set(attribute.toUpperCase(), source.getAttribute(attribute));
                }
                oraStore.addFeatures(DataUtilities.collection(builder.buildFeature(null)));
            }
            t.commit();
        } finally {
            fi.close();
            t.close();
        }
    }
}
----------------------------------------------------------------
Cheers
Andrea
--
Andrea Aime
OpenGeo - http://opengeo.org
Expert service straight from the developers.