[ https://issues.apache.org/jira/browse/KARAF-5632?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16375937#comment-16375937 ]

ASF GitHub Bot commented on KARAF-5632:
---------------------------------------

jbonofre closed pull request #464: #KARAF-5632: Reuse in HeapDumpProvider now the copy function of LogDu…
URL: https://github.com/apache/karaf/pull/464
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

diff --git a/diagnostic/core/src/main/java/org/apache/karaf/diagnostic/core/internal/HeapDumpProvider.java b/diagnostic/core/src/main/java/org/apache/karaf/diagnostic/core/internal/HeapDumpProvider.java
index 03c50cfcf7..cf1ed40022 100644
--- a/diagnostic/core/src/main/java/org/apache/karaf/diagnostic/core/internal/HeapDumpProvider.java
+++ b/diagnostic/core/src/main/java/org/apache/karaf/diagnostic/core/internal/HeapDumpProvider.java
@@ -52,10 +52,7 @@ public void createDump(DumpDestination destination) throws Exception {
             File heapDumpFile = new File("heapdump.txt");
             in = new FileInputStream(heapDumpFile);
             out = destination.add("heapdump.txt");
-            byte[] buffer = new byte[2048];
-            while ((in.read(buffer) != -1)) {
-                out.write(buffer);
-            }
+            LogDumpProvider.copy(in, out);
             // remove the original dump
             if (heapDumpFile.exists()) {
                 heapDumpFile.delete();
diff --git a/diagnostic/core/src/main/java/org/apache/karaf/diagnostic/core/internal/LogDumpProvider.java b/diagnostic/core/src/main/java/org/apache/karaf/diagnostic/core/internal/LogDumpProvider.java
index ba5a24e51c..740cc2eafa 100644
--- a/diagnostic/core/src/main/java/org/apache/karaf/diagnostic/core/internal/LogDumpProvider.java
+++ b/diagnostic/core/src/main/java/org/apache/karaf/diagnostic/core/internal/LogDumpProvider.java
@@ -87,7 +87,7 @@ public void createDump(DumpDestination destination) throws Exception {
      * @param outputStream Destination stream.
      * @throws IOException When IO operation fails.
      */
-    private void copy(InputStream inputStream, OutputStream outputStream) throws IOException {
+    static void copy(InputStream inputStream, OutputStream outputStream) throws IOException {
         byte[] buffer = new byte[4096];
         int n = 0;
         while (-1 != (n = inputStream.read(buffer))) {
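For clarity on why the removed loop corrupted the dump: `in.read(buffer)` may fill only part of the buffer, but the old code wrote the whole buffer on every iteration, padding the output with stale bytes. Below is a minimal self-contained sketch of the corrected behavior, assuming the shared `LogDumpProvider.copy` writes only the `n` bytes actually read (the hunk above is truncated before that line); `CopyDemo` is a hypothetical name for illustration:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class CopyDemo {

    // Sketch of the shared copy helper: write only the n bytes actually read.
    // The old HeapDumpProvider loop wrote the full 2048-byte buffer on every
    // read, regardless of how many bytes read() returned, which corrupted the
    // HPROF file. Buffer size 4096 matches LogDumpProvider.
    static void copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[4096];
        int n;
        while (-1 != (n = in.read(buffer))) {
            out.write(buffer, 0, n);
        }
    }

    public static void main(String[] args) throws IOException {
        // 10000 bytes, deliberately not a multiple of the buffer size.
        byte[] data = new byte[10000];
        for (int i = 0; i < data.length; i++) {
            data[i] = (byte) i;
        }
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        copy(new ByteArrayInputStream(data), out);
        // A correct copy preserves the exact length; the buggy loop would have
        // emitted 3 full 4096-byte buffers (12288 bytes) for this input.
        System.out.println(out.size());
    }
}
```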


 

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


> dev:create-dump creates invalid dump file
> -----------------------------------------
>
>                 Key: KARAF-5632
>                 URL: https://issues.apache.org/jira/browse/KARAF-5632
>             Project: Karaf
>          Issue Type: Bug
>    Affects Versions: 3.0.8
>         Environment: karaf 3.0.8, windows 10
>            Reporter: Richard Hierlmeier
>            Assignee: Jean-Baptiste Onofré
>            Priority: Major
>             Fix For: 3.0.9
>
>         Attachments: screenshot-1.png
>
>
> When creating a heap dump with the {{dev:create-dump}} command, the dump 
> cannot be opened with the Eclipse Memory Analyzer; I get the following error:
> {noformat}
> The HPROF parser encountered a violation of the HPROF specification that it 
> could not safely handle. This could be due to file truncation or a bug in the 
> JVM. Please consider filing a bug at eclipse.org. To continue parsing the 
> dump anyway, you can use -DhprofStrictnessWarning=true or set the strictness 
> mode under Preferences > HPROF Parser > Parser Strictness. See the inner 
> exception for details.
> (Possibly) Invalid HPROF file: Expected to read another 83.886.080 bytes, but 
> only 1.500 bytes are available.
> {noformat}
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
