Hmm, interesting! I have to study this further and read the specs.
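For what it's worth, the relevant part of the spec is probably HTTP/1.1 Content-Encoding: the server seems to describe tzdata-latest.tar.gz as Content-Type: application/x-tar with a gzip Content-Encoding, so a client that honours Content-Encoding decodes the gzip layer before handing the body over, while wget just saves the raw bytes. That would explain the 686080-byte tar versus the 206251-byte .gz. A quick way to check what the server actually declares is a HEAD request from a workspace; this is only a sketch, and it assumes a Zn version where #head, #response and the header accessors used below behave as shown:

| client response |
client := ZnClient new.
client url: 'http://www.iana.org/time-zones/repository/tzdata-latest.tar.gz'.
client head.	"HEAD request: headers only, no body to decode"
response := client response.
Transcript
	show: 'Content-Type: ', (response headers at: 'Content-Type'); cr;
	show: 'Content-Encoding: ',
		(response headers at: 'Content-Encoding' ifAbsent: [ 'none declared' ]); cr.
client close.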
On 17 Aug 2012, at 22:29, Paul DeBruicker <[email protected]> wrote:

> Hi Sven,
> 
> Updating certainly lets me download it, but I'm then faced with the problem
> that the resulting file is a tar archive and is no longer gzipped.
> 
> E.g.
> 
> wget http://www.iana.org/time-zones/repository/tzdata-latest.tar.gz
> 
> paul@paul-laptop:~/pharo/Pharo-1.4$ ls -la tzdata-latest.tar.gz
> -rw-rw-r-- 1 paul paul 206251 Aug  2 21:08 tzdata-latest.tar.gz
> 
> paul@paul-laptop:~/pharo/Pharo-1.4$ file tzdata-latest.tar.gz
> tzdata-latest.tar.gz: gzip compressed data, from Unix, max compression
> 
> and after running the download in Pharo, as you saw, the file is 686 kB.
> 
> paul@paul-laptop:~/pharo/Pharo-1.4$ ls -la tzdata-latest.tar.gz
> -rw-rw-r-- 1 paul paul 686080 Aug 17 13:23 tzdata-latest.tar.gz
> 
> paul@paul-laptop:~/pharo/Pharo-1.4$ file tzdata-latest.tar.gz
> tzdata-latest.tar.gz: POSIX tar archive (GNU)
> 
> Is there a way to not unzip the file when downloading it through Pharo? I
> admit it's a minor quibble, but it's not expected.
> 
> Thanks
> 
> Paul
> 
> On 08/17/2012 11:47 AM, Sven Van Caekenberghe wrote:
>> Paul,
>> 
>> On 17 Aug 2012, at 19:55, Paul DeBruicker <[email protected]> wrote:
>> 
>>> Thanks for looking into it, Sven.
>>> 
>>> Pharo 1.4 (Summer) 14457. Eliot's 2585 VM. Ubuntu 12.04 64-bit.
>> 
>> You seem to have hit a problem in Zn that was recently fixed (beginning of
>> August); if you update Zn it works.
>> 
>> Regards,
>> 
>> Sven
>> 
>>> On 08/17/2012 10:49 AM, Sven Van Caekenberghe wrote:
>>>> Works for me, Pharo 2.0 #20201
>>>> 
>>>> [sven@voyager:~/smalltalk]$ tar tvfz tzdata-latest.tar.gz
>>>> -rw-r--r--  0 0      0       44941 Jul 19 02:45 africa
>>>> -rw-r--r--  0 0      0       15827 Jul 19 02:30 antarctica
>>>> -rw-r--r--  0 0      0      112760 Jul 25 16:13 asia
>>>> -rw-r--r--  0 0      0       68423 Jul 25 16:40 australasia
>>>> -rw-r--r--  0 0      0      121390 Jul 25 16:13 europe
>>>> -rw-r--r--  0 0      0      135756 Jul 25 16:13 northamerica
>>>> -rw-r--r--  0 0      0       73854 Jul 25 16:13 southamerica
>>>> -rw-r--r--  0 0      0        1190 Jul 19 02:30 pacificnew
>>>> -rw-r--r--  0 0      0        2955 Jul 19 02:30 etcetera
>>>> -rw-r--r--  0 0      0        4083 Jul 19 02:30 backward
>>>> -rw-r--r--  0 0      0        1546 Jul 19 02:30 systemv
>>>> -rw-r--r--  0 0      0         393 Jul 19 02:30 factory
>>>> -rw-r--r--  0 0      0       19306 Jul 19 02:30 solar87
>>>> -rw-r--r--  0 0      0       19324 Jul 19 02:30 solar88
>>>> -rw-r--r--  0 0      0       19600 Jul 19 02:30 solar89
>>>> -rw-r--r--  0 0      0        4326 Jul 19 02:30 iso3166.tab
>>>> -rw-r--r--  0 0      0       19913 Jul 19 02:30 zone.tab
>>>> -rw-r--r--  0 0      0        3186 Jul 25 16:13 leapseconds
>>>> -rw-r--r--  0 0      0         680 Jul 19 02:30 yearistype.sh
>>>> [sven@voyager:~/smalltalk]$ file !$
>>>> file tzdata-latest.tar.gz
>>>> tzdata-latest.tar.gz: POSIX tar archive (GNU)
>>>> [sven@voyager:~/smalltalk]$ ls -la !$
>>>> ls -la tzdata-latest.tar.gz
>>>> -rw-r--r--@ 1 sven sven 686080 Aug 17 19:47 tzdata-latest.tar.gz
>>>> 
>>>> What image, platform, VM are you on?
>>>> 
>>>> On 17 Aug 2012, at 19:19, Paul DeBruicker <[email protected]> wrote:
>>>> 
>>>>> I'm trying to download the Olson time-zone database so Chronos can
>>>>> automatically update its ruleset.
>>>>> 
>>>>> In a workspace, when I do:
>>>>> 
>>>>> ZnClient new
>>>>>     url: 'http://www.iana.org/time-zones/repository/tzdata-latest.tar.gz';
>>>>>     downloadTo: FileDirectory default pathName.
>>>>> 
>>>>> I get a SubscriptOutOfBounds error. The file is ~207 kB. The
>>>>> GZipReadStream>>#getFirstBuffer method gets the first 65,536 bytes and
>>>>> then hits the out-of-bounds error.
>>>>> Moving the limit in that method to 262,144 (1 << 18 rather than 1 << 16)
>>>>> gets us to another error, from which I don't know the right way to
>>>>> proceed.
>>>>> 
>>>>> I think that because the content type is 'application/x-tar' it shouldn't
>>>>> hit the GZipReadStream class at all, but should just be a ReadStream that
>>>>> does not get decompressed in the image and is instead sent straight to
>>>>> disk. I'm not sure how to implement that inside Zinc, though, and would
>>>>> welcome ideas to try.
>>>>> 
>>>>> Thanks
>>>>> 
>>>>> Paul
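On the request above for ideas: one thing that might be worth trying is to ask Zn for a streaming response and copy the entity stream to a file yourself, instead of going through #downloadTo:. Whether this actually bypasses the gzip Content-Encoding decoding depends on where this Zn version applies it, so the snippet below is only a sketch to experiment with, not a confirmed fix; it assumes #streaming:, ZnUtils class>>streamFrom:to: and FileDirectory>>forceNewFileNamed: are all present in the image.

| client out |
client := ZnClient new.
client
	streaming: true;	"ask for a streaming entity instead of an in-memory body"
	url: 'http://www.iana.org/time-zones/repository/tzdata-latest.tar.gz';
	get.
out := (FileDirectory default forceNewFileNamed: 'tzdata-latest.tar.gz') binary.
[ ZnUtils streamFrom: client response entity stream to: out ]
	ensure: [ out close. client close ].

If that still yields the decompressed bytes, a more pragmatic fallback might be to accept the decoded tar as-is, or to re-compress it in the image with GZipWriteStream before handing it to Chronos.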
