Thanks, I sussed out the main issue - encoding!
I changed the handler thus:
public String convertToString(Object obj, int sqlTypeCode) throws ConversionException
{
    if (obj == null)
    {
        return "";
    }
    else
    {
        log.debug("encoding");
        try
        {
            // Base64-encode the byte[] and turn the result into a UTF-8 string
            return new String(Base64.encodeBase64((byte[]) obj), "UTF-8");
        }
        catch (Exception e)
        {
            log.error("exception", e);
            return "";
        }
    }
}
It's ugly, but it works.
I am extremely concerned that large data fields will break this app and there won't be anything I can do about it. Individual fields can potentially hold gigabytes of data. One day, when you find some time, you should consider passing a byte stream around instead of a byte[] - something along the lines of the sketch below.
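Just to illustrate what I mean - this is only a rough sketch, not DdlUtils code; it assumes a commons-codec version that provides Base64OutputStream (1.4+), and the class and method names here are made up:

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

import org.apache.commons.codec.binary.Base64OutputStream;

public class StreamingBase64Writer
{
    /**
     * Copies binary column data to the XML output as Base64 without ever
     * holding the whole value in memory as a byte[].
     */
    public void writeBase64(InputStream blobStream, OutputStream xmlOut) throws IOException
    {
        if (blobStream == null)
        {
            return; // nothing to write for a NULL column
        }
        // Base64OutputStream encodes everything written to it and passes
        // the encoded bytes on to the wrapped stream
        Base64OutputStream encoder = new Base64OutputStream(xmlOut);
        byte[] buffer = new byte[8192];
        int read;
        while ((read = blobStream.read(buffer)) != -1)
        {
            encoder.write(buffer, 0, read);
        }
        // closing the encoder writes the trailing Base64 padding;
        // note that it also closes the wrapped xmlOut
        encoder.close();
    }
}

That way memory use stays constant no matter how big the field is.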
Cheers
Jason.
Thomas Dudziak wrote:
On 4/12/06, Jason <[EMAIL PROTECTED]> wrote:
<writeDataToFile outputFile="${database.backup.path}/src/schema/${database.backup.path}-data.xml">
    <converter jdbcType="BINARY"
               className="org.apache.ddlutils.io.converters.ByteArrayBase64Converter"/>
</writeDataToFile>
with the latest JDBC driver, but no luck.
It throws exceptions of various ilks - I tried trapping them and chasing them a bit, but it doesn't seem that a byte[] is even passed - maybe - dunno, I'm tired.
No, I think you're on the right track; it's just that there is a bug in the converter (or in commons-codec, if you want) in that it does not properly handle a null bytea value (i.e. the byte[] is null). This is why you get the NullPointerException.
I can fix this tomorrow at the earliest, but you could create your own converter by adapting the ByteArrayBase64Converter and adding if-statements for textRep == null / obj == null, in which case you simply return null.
Tom
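For reference, a minimal sketch of the adapted converter Tom describes, with null guards in both directions - the SqlTypeConverter method signatures here are assumed to match the convertToString snippet at the top of this mail:

import org.apache.commons.codec.binary.Base64;

import org.apache.ddlutils.io.converters.ConversionException;
import org.apache.ddlutils.io.converters.SqlTypeConverter;

public class NullSafeByteArrayBase64Converter implements SqlTypeConverter
{
    public Object convertFromString(String textRep, int sqlTypeCode) throws ConversionException
    {
        // a null text representation maps to a null byte[] instead of an NPE
        return (textRep == null) ? null : Base64.decodeBase64(textRep.getBytes());
    }

    public String convertToString(Object obj, int sqlTypeCode) throws ConversionException
    {
        // a null byte[] (e.g. a NULL bytea column) maps to a null string
        return (obj == null) ? null : new String(Base64.encodeBase64((byte[]) obj));
    }
}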