[
https://issues.apache.org/jira/browse/HADOOP-8151?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Harsh J updated HADOOP-8151:
----------------------------
Resolution: Fixed
Fix Version/s: 3.0.0
Target Version/s: (was: 2.0.3-alpha)
Hadoop Flags: Reviewed
Status: Resolved (was: Patch Available)
Committed this to trunk, thanks Matt.
> Error handling in snappy decompressor throws invalid exceptions
> ---------------------------------------------------------------
>
> Key: HADOOP-8151
> URL: https://issues.apache.org/jira/browse/HADOOP-8151
> Project: Hadoop Common
> Issue Type: Bug
> Components: io, native
> Affects Versions: 1.0.2, 2.0.0-alpha
> Reporter: Todd Lipcon
> Assignee: Matt Foley
> Fix For: 3.0.0, 1.0.3
>
> Attachments: HADOOP-8151-branch-1.0.patch, HADOOP-8151.patch, HADOOP-8151.patch
>
>
> SnappyDecompressor.c has the following code in a few places:
> {code}
> THROW(env, "Ljava/lang/InternalError", "Could not decompress data. Buffer length is too small.");
> {code}
> This is incorrect, though: the THROW macro expects the plain JNI class name, without the leading "L" used in field/method descriptors. As a result, a ClassNotFoundException for Ljava.lang.InternalError is thrown instead of the intended InternalError.
--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators.
For more information on JIRA, see: http://www.atlassian.com/software/jira