On Mon, Jun 25, 2012 at 6:05 PM, Sean P. DeNigris <[email protected]> wrote:

>
> Mariano Martinez Peck wrote
> >
> > Sean asked me if the package contained non-ASCII characters... and yes,
> > it has WideStrings directly in the literals of a method (this is on
> > purpose). The method is #testWideString.
> >
> >
>
> testWideString contains wide characters, but the write is not failing
> inside testWideString, because zip writing is buffered in 4096-byte
> chunks, so it fails at the beginning of that buffer write (in
> testTimestamp).
> If I remove the wide-character literals from testWideString and resave
> the package, I can unzip it and file in source.st.
>
>

Wow, I love this community. Pavel found the problem, and Sean debugged it
and found the real cause.
Thank you guys, you cover our backs when I don't have much time :)
So, as a workaround: since the literals of a CompiledMethod are visited
like any other object, it is not really necessary to have a test with
WideStrings in the source code.
So now I just have:

testWideString

    self assertSerializationEqualityOf: 'aString' asWideString.
    self assertSerializationEqualityOf: (WideString streamContents: [ :stream |
        2000 timesRepeat: [
            stream nextPut: (256 to: 1000) atRandom asCharacter ] ]).


and that should work :)




> The write logic is in ZipArchiveMember>>copyDataWithCRCTo:
>                ...
>                data := self readRawChunk: (4096 min: readDataRemaining).
>                aStream nextPutAll: data.
>                ...
>
> When #readRawChunk: returns a WideString, all hell breaks loose. I don't
> know what the solution is, but at least we found the problem...
>
>
We should try to write a unit test for this. I will at least open an issue ;)
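
Something like this could be a starting point (just a rough sketch, and I am
assuming ZipArchive's #addString:as: and #writeTo: selectors and making up
the test name, so it may need adjusting to the real protocol):

testWritingArchiveWithWideStringMember

    "Rough sketch: build an archive whose member data is a WideString
     bigger than the 4096-byte write buffer, and check that writing it
     does not break. #addString:as: and #writeTo: are the ZipArchive
     selectors I remember; adjust if the protocol differs."
    | wide archive |
    wide := WideString streamContents: [ :stream |
        5000 timesRepeat: [
            stream nextPut: (256 to: 1000) atRandom asCharacter ] ].
    archive := ZipArchive new.
    archive addString: wide as: 'wide.st'.
    self
        shouldnt: [ archive writeTo: (WriteStream on: ByteArray new) ]
        raise: Error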





-- 
Mariano
http://marianopeck.wordpress.com
