In this case the hard-coded nesting limit is present in the JSON-lib library:
   https://github.com/aalmiray/Json-lib

   
https://github.com/aalmiray/Json-lib/blob/master/src/main/java/net/sf/json/util/JSONBuilder.java#L61

   But, of course, if jettison is used in other encoding processes it may need to 
be forked too, since it has the same hard-coded nesting limit (and the same value).


   From "Andrea Aime" [email protected]
   To [email protected]
   Cc "Geoserver-devel" [email protected]
   Date Sat, 29 Jun 2019 10:56:29 +0200
   Subject Re: [Geoserver-devel] JSON Nesting too deep exception due to hard-coded limit
   On Fri, Jun 28, 2019 at 7:09 PM [email protected] 
<[email protected]> wrote:

   Hi community.

   I'm reproducing an exception using Complex Features and the WFS GeoJSONBuilder 
on GeoServer 2.16 (master), with this stack trace:
   net.sf.json.JSONException: Nesting too deep.
       at net.sf.json.util.JSONBuilder.push(JSONBuilder.java:270)
       at net.sf.json.util.JSONBuilder.object(JSONBuilder.java:240)
       at org.geoserver.wfs.json.ComplexGeoJsonWriter.encodeFeature(ComplexGeoJsonWriter.java:105)
       at org.geoserver.wfs.json.ComplexGeoJsonWriter.encodeComplexAttribute(ComplexGeoJsonWriter.java:563)
       at org.geoserver.wfs.json.ComplexGeoJsonWriter.encodeProperty(ComplexGeoJsonWriter.java:409)
       at org.geoserver.wfs.json.ComplexGeoJsonWriter.encodeProperty(ComplexGeoJsonWriter.java:383)
       at java.util.ArrayList.forEach(ArrayList.java:1257)

   This exception is not present in previous versions (2.14 for example); it looks 
like we are facing it now because some extra "properties" attributes were recently 
added to the GeoJSON encoding.

   Bit of context: in 2.16.x full features are always encoded as GeoJSON features, 
no matter where they sit in the containment hierarchy; that preserves their identity 
and makes them recognizable as features. Before, they were plain JSON objects 
without type or identity.
   However, if one has a model in which everything is an identifiable feature, 
this adds an extra level of "properties" per object.
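   Just to illustrate (the fragment below is made up for this email, not output 
captured from the failing request): with every nested feature now encoded as a full 
GeoJSON feature, each level of the containment hierarchy contributes both the 
feature object and its "properties" map, so the JSON depth grows roughly twice as 
fast as the model depth:

      {
        "type": "Feature",
        "id": "parent.1",
        "properties": {
          "child": {
            "type": "Feature",
            "id": "child.1",
            "properties": {
              "grandchild": {
                "type": "Feature",
                "id": "grandchild.1",
                "properties": { "name": "leaf" }
              }
            }
          }
        }
      }

   In 2.14 those nested objects were plain JSON objects, so the same model needed 
roughly half the nesting depth.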
   We could also discuss having a flag in the WFS configuration controlling whether 
to preserve the nature and identity of nested features (and/or a request parameter; 
I believe there is already a map of "format options" in WFS, but I'm not 100% sure).
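   If that map is indeed there, a request-level override could hypothetically look 
like this (the flag name below is invented purely for illustration, it does not 
exist today):

      .../ows?service=WFS&version=2.0.0&request=GetFeature&typeNames=...&outputFormat=application/json&format_options=nestedFeatures:plain

   with the WFS configuration supplying the default whenever the option is omitted.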

   Checking for the root problem, it seems the json-lib JSONBuilder class has a 
hard-coded limit of 20 nesting levels, as we can see at:

   
https://github.com/aalmiray/Json-lib/blob/master/src/main/java/net/sf/json/util/JSONBuilder.java#L61

   Since we added new levels to the GeoJSON encoding it would be a good idea to 
relax this limit a bit, but as you can see the problem is that the limit is not 
configurable, since the static variable is final. So I see two solutions to this:

     * Fork json-lib and relax the limit or make it configurable.
     * Use reflection to modify the value at runtime.
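   For reference, the relevant part of JSONBuilder boils down to something like the 
sketch below (paraphrased and simplified from the class linked above, with names and 
the exception type changed so it compiles standalone; it is not a verbatim copy of 
the library source):

      // Simplified sketch of the depth tracking in net.sf.json.util.JSONBuilder.
      // The limit is a private static final primitive, so it cannot be changed
      // from configuration without patching the class itself.
      public class JSONBuilderSketch {

          private static final int MAX_DEPTH = 20; // the hard-coded limit

          private int top = 0; // current nesting depth while encoding

          // Invoked every time a new object or array scope is opened.
          void push() {
              if (top >= MAX_DEPTH) {
                  // The real class throws net.sf.json.JSONException here.
                  throw new IllegalStateException("Nesting too deep.");
              }
              top++;
          }

          // Invoked when a scope is closed.
          void pop() {
              top--;
          }
      }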

   A third option we cannot afford in the short term: rewriting the JSON encoder 
using another library (Jackson?). And yep, it's a lot of work, hence the 
"cannot afford".
   Running inside a security manager is uncommon, but not unheard of... the 
intersection of complex JSON users with deeply nested models and users running 
inside a security manager might as well be empty, but I'd be more comfortable 
having our own fork of jettison, considering that the library is basically dead 
anyway (there is a semi-dormant version at 
https://github.com/jettison-json/jettison still limiting encoding to 20, but I 
believe there were backwards compatibility issues; Torben, is it jettison or am I 
confusing it with another library?); we have forked other libraries for similar 
reasons.
   We should probably schedule a sprint in the future to clean up our JSON/GeoJSON 
management situation, since both the new batch of OGC services and practical usage 
now agree on favoring JSON over XML.

   Of course the first solution sounds better, since reflection can be problematic 
in some environments (for example, blocked by a security manager).
   I have already made a fork relaxing the limit to 5000 at:
   https://github.com/fernandor777/Json-lib/tree/limit-fix-2.4

   A fixed limit is still fixed... I'd recommend making it at least tunable with a 
system variable, e.g.

   maxFeatures = Integer.parseInt(System.getProperty("json.maxFeatures", "5000"))

   Also, 5000 seems like a large limit; I'm guessing it would allow 5000 by 5000 
objects in memory before encoding, and that seems too big a number. How about 
defaulting to 100?
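   A minimal sketch of that idea, assuming it lives in the forked JSONBuilder (the 
holder class, field names and fallback handling below are assumptions made for 
illustration, not existing code; the property name and values follow the suggestion 
above):

      // Sketch of a tunable nesting limit read once from a system property,
      // replacing the hard-coded constant in the forked JSONBuilder.
      public final class JsonDepthLimit {

          // -Djson.maxFeatures=5000 raises the limit without a rebuild;
          // the default of 100 follows the safer value proposed above.
          public static final int MAX_DEPTH = readLimit(100);

          private static int readLimit(int fallback) {
              String value = System.getProperty("json.maxFeatures");
              if (value == null) {
                  return fallback;
              }
              try {
                  return Integer.parseInt(value.trim());
              } catch (NumberFormatException e) {
                  // Ignore malformed values and keep the default.
                  return fallback;
              }
          }

          private JsonDepthLimit() {
              // static holder only, not meant to be instantiated
          }
      }

   The push() check would then compare against JsonDepthLimit.MAX_DEPTH instead of 
the literal 20.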
   Cheers
   Andrea