2009/11/21 Yanrui Li <liyanrui...@gmail.com>:
> 2009/11/21 Hans Hagen <pra...@wxs.nl>:
>> Yanrui Li wrote:
>>>
>>> 2009/11/20 Hans Hagen <pra...@wxs.nl>:
>>>>
>>>> Yanrui Li wrote:
>>>>>
>>>>> I did so and solved that problem. However, I cannot use Chinese OTF
>>>>> fonts such as 'AdobeSongStd-Light.otf', because LuaTeX needs a very
>>>>> large amount of memory to load them during the *enhancing* stage.
>>>>> This is a test file:
>>>>>
>>>>> \definefont[song][name:adobesongstd]
>>>>> \starttext
>>>>> \song 测试
>>>>> \stoptext
>>>>>
>>>>> The process of compiling stops here:
>>>>>
>>>>> !load otf       : loading:
>>>>> /usr/share/fonts/adobe/AdobeSongStd-Light.otf (hash:
>>>>> adobesongstd-light)
>>>>> !load otf       : file size: 16229772
>>>>> !load otf       : enhancing ...
>>>>
>>>> that's only the first time
>>>>
>>>
>>> But it looks abnormal. It is still *enhancing* after consuming 3.3GB
>>> of memory and running for about a minute. Finally I got the message:
>>
>> fixed in beta, was some kind of loop
>>
>
> Sorry for the noise. The problem still exists in the latest beta
> (2009.11.21 00:06, linux x86), which I installed after completely
> removing the previous one. The problem only arises when loading Chinese
> OTF fonts; loading Chinese TTF fonts works normally.
>

I found out that the problem is caused by the new
'resolvers.split_path' function in data-res.lua. It can be solved by
using the old 'resolvers.split_path' function instead of the new one.
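
For anyone curious what is involved, below is a minimal sketch in plain
Lua of what a path-splitting helper such as resolvers.split_path
generally does: break a search-path specification into a table of
components. This is not the actual code from data-res.lua; the reuse of
the name, the separator set and the example path are assumptions for
illustration only.

-- minimal sketch, not the real data-res.lua implementation
local function split_path(str, separators)
    separators = separators or ";:"    -- assumed separator characters
    local result = {}
    -- collect every run of characters that is not a separator
    for component in string.gmatch(str, "([^" .. separators .. "]+)") do
        result[#result + 1] = component
    end
    return result
end

-- usage example with a hypothetical search path
local parts = split_path("/usr/share/fonts/adobe:/usr/share/fonts/truetype")
for i, p in ipairs(parts) do
    print(i, p)
end

If a splitter like this ever failed to advance over some input, its
caller could loop, which would match the runaway memory use seen during
the *enhancing* stage; whether that is what actually happens in the new
function I cannot say.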

-- 
Best regards,

Li Yanrui