Hi, I have a question about a use case and would like your opinion.

So basically, with the new GDPR regulation coming into effect soon, we need to 
mask all outgoing data from our API. 

To accomplish this we have implemented a Java servlet filter that intercepts 
all outgoing data. 

The problem

   - Many of our APIs return heavy JSON responses 
   - We also proxy Elasticsearch through our Java instance, which means very 
   large JSON payloads


The filter does the following (current solution):

   1. Converts the byte[] to a JsonNode using ObjectReader#readTree
   2. Traverses the JsonNode and masks the needed data using regexes 
   3. Writes the node back out as bytes with an ObjectWriter
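
To make the question concrete, here is a minimal sketch of what the filter does today. The masking rule (a hypothetical email-address regex replaced with "***") and the class/method names are my own placeholders, not the actual production code:

```java
import java.io.IOException;
import java.util.Iterator;
import java.util.Map;
import java.util.regex.Pattern;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ArrayNode;
import com.fasterxml.jackson.databind.node.ObjectNode;

public class TreeMasker {
    private static final ObjectMapper MAPPER = new ObjectMapper();
    // Hypothetical rule: mask anything that looks like an email address
    private static final Pattern SENSITIVE = Pattern.compile("[\\w.+-]+@[\\w.-]+");

    // Step 1: bytes -> tree; Step 2: traverse & mask; Step 3: tree -> bytes.
    // The whole document is materialized in memory as a JsonNode tree.
    public static byte[] mask(byte[] json) throws IOException {
        JsonNode root = MAPPER.readTree(json);
        maskNode(root);
        return MAPPER.writeValueAsBytes(root);
    }

    private static void maskNode(JsonNode node) {
        if (node.isObject()) {
            ObjectNode obj = (ObjectNode) node;
            Iterator<Map.Entry<String, JsonNode>> fields = obj.fields();
            while (fields.hasNext()) {
                Map.Entry<String, JsonNode> e = fields.next();
                JsonNode child = e.getValue();
                if (child.isTextual()) {
                    obj.put(e.getKey(),
                            SENSITIVE.matcher(child.asText()).replaceAll("***"));
                } else {
                    maskNode(child);
                }
            }
        } else if (node.isArray()) {
            ArrayNode arr = (ArrayNode) node;
            for (int i = 0; i < arr.size(); i++) {
                JsonNode child = arr.get(i);
                if (child.isTextual()) {
                    arr.set(i, MAPPER.getNodeFactory().textNode(
                            SENSITIVE.matcher(child.asText()).replaceAll("***")));
                } else {
                    maskNode(child);
                }
            }
        }
    }
}
```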


Ideally I would like to work on the data as a stream, without loading it all 
into memory, so I looked into your streaming API, but I found it quite 
difficult to apply to my case. 
What I failed to accomplish with the streaming API was processing the data as 
it flows through and making the necessary changes (masking), i.e. parsing and 
writing the data sequentially. 

For now, my current solution works by loading the full response into memory. 

Questions

   - How much does this affect performance compared to using the streaming API? 
   - Is it possible to improve my current solution?
   - If I were to use the streaming API, how would I need to approach the 
   problem? I want to parse and write the stream sequentially 


-- 
You received this message because you are subscribed to the Google Groups 
"jackson-user" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
For more options, visit https://groups.google.com/d/optout.