Inefficient InputStream reading in JSONDataSource
-------------------------------------------------

                 Key: AXIS2-4731
                 URL: https://issues.apache.org/jira/browse/AXIS2-4731
             Project: Axis2
          Issue Type: Sub-task
          Components: modules
    Affects Versions: 1.5.1
            Reporter: Jean Marc


I am experiencing 100% CPU in InputStream.read() when JSONDataSource reads
about 30 KB of JSON data.
Reading the stream one character at a time with InputStream.read() is highly
inefficient, since we don't know the size of the JSON data up front;
a reasonably sized char buffer should be used instead, to cut down on CPU
time and to reduce the number of calls to StringBuilder.append().
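
For context, the kind of loop being criticized reads the stream one byte at
a time, roughly like this (a sketch of the old pattern, not the exact
JSONDataSource source):

    StringBuilder sb = new StringBuilder();
    int b;
    while ((b = jsonInputStream.read()) != -1) {
        // One method call per byte, and the cast silently assumes every
        // byte is a complete character.
        sb.append((char) b);
    }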

Note also that regardless of the solution there must be a charset conversion
between an InputStream and a Reader (or a String object, for that matter),
and that the cast between int and char in the old code will corrupt an
InputStream containing multibyte characters. Since JSON data is usually sent
as UTF-8, we could hard-code UTF-8 in the InputStreamReader rather than rely
on the platform default charset, which would otherwise cause problems on
Windows:

    // Decode the bytes as UTF-8 explicitly (instead of the platform default
    // charset) and read through a 512-char buffer instead of one character
    // at a time.
    BufferedReader in = new BufferedReader(
            new InputStreamReader(jsonInputStream, "UTF-8"));
    StringBuilder sb = new StringBuilder(512);
    char[] tempBuf = new char[512];
    int readLen;

    while ((readLen = in.read(tempBuf)) != -1) {
        sb.append(tempBuf, 0, readLen);
    }
    tempBuf = null;
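
As a self-contained illustration (the class and method names below are made
up for this sketch and are not part of Axis2), the same loop round-trips
multibyte UTF-8 data correctly, which the old int-to-char cast cannot do:

    import java.io.BufferedReader;
    import java.io.ByteArrayInputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.InputStreamReader;

    // Hypothetical stand-alone check, not Axis2 code: the euro sign is three
    // bytes in UTF-8, so a per-byte int-to-char cast would turn it into three
    // garbage characters, while the Reader-based loop restores it intact.
    public class JsonReadCheck {

        static String readAll(InputStream jsonInputStream) throws IOException {
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(jsonInputStream, "UTF-8"));
            StringBuilder sb = new StringBuilder(512);
            char[] tempBuf = new char[512];
            int readLen;
            while ((readLen = in.read(tempBuf)) != -1) {
                sb.append(tempBuf, 0, readLen);
            }
            return sb.toString();
        }

        public static void main(String[] args) throws Exception {
            String original = "{\"price\":\"100 \u20ac\"}";
            InputStream is =
                    new ByteArrayInputStream(original.getBytes("UTF-8"));
            System.out.println(readAll(is).equals(original)); // prints: true
        }
    }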

