On 2/9/2018 3:34 PM, Patrick Gundlach wrote:
> there is a test for the type being virtual .. can you omit that flag? just
> pretend it's a real font .. works here (i need to check why that test is
> there, it might not be needed in luatex any more)
>
> so no type="virtual"
that was the culprit! Without that type="virtual" everything is
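For reference, the working setup might be sketched like this (a sketch only; the font names and the define_font wiring are hypothetical, but the decisive detail, per the exchange above, is that the table carries fonts and commands while the type = "virtual" flag is omitted):

```lua
-- Sketch of the workaround: build the "virtual" table but omit the flag.
-- "myfont" and "myfont-reencoded" are made-up names for illustration.
callback.register("define_font", function(name, size)
  if name ~= "myfont-reencoded" then
    return font.read_tfm(name, size)        -- normal loading for other fonts
  end
  local f = font.read_tfm("myfont", size)   -- the real base font
  f.fonts = { { name = "myfont", size = size } }
  for slot, chr in pairs(f.characters) do
    chr.commands = { { "slot", 1, slot } }  -- identity mapping here; a real
  end                                       -- re-encoder would remap slots
  -- crucially, no f.type = "virtual": with that flag set, the test
  -- mentioned above made LuaTeX skip font expansion for this font
  return f
end)
```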
On 2/9/2018 2:17 PM, Patrick Gundlach wrote:
some more details (before I make a test case)
the glyphs of the real font have an expansion_factor of -1 (post_linebreak)
and the virtual fonts have an expansion_factor of 0 (so the question is: why).
The PDF also shows the expected differences:
real:
BT
/F11 9.9 Tf 0.99 0 0 1 28.346 8
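The expansion_factor values quoted above can be inspected with a small post_linebreak_filter that walks the glyphs of each line (a sketch; expansion_factor is a standard glyph-node field in LuaTeX, while the log format here is made up):

```lua
-- Sketch: log the expansion_factor of every glyph after line breaking.
local GLYPH = node.id("glyph")
local HLIST = node.id("hlist")

callback.register("post_linebreak_filter", function(head, groupcode)
  for line in node.traverse_id(HLIST, head) do
    for g in node.traverse_id(GLYPH, line.head) do
      -- 0 means "not expanded", which is what the virtual-font
      -- glyphs showed in this thread
      texio.write_nl(string.format("glyph U+%04X expansion_factor=%d",
                                   g.char, g.expansion_factor or 0))
    end
  end
  return head
end)
```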
> Font expansion looks at a glyph and its properties. It actually is not
> interested in what that glyph is; only the expansion factors and the
> width matter.
ok, then I will reduce my Lua files to a minimal variant. I will probably find
my problem then.
Thank you!
Patrick
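As an aside, the expansion parameters Hans refers to live on the font itself. For a Lua-loaded font they can be sketched like this (field names as in the LuaTeX manual; the numbers are arbitrary examples, and \adjustspacing must be positive for expansion to take effect at all):

```lua
-- Sketch: expansion parameters on a Lua font table, in thousandths:
-- up to 3% stretch, 2% shrink, applied in 0.5% steps.
local f = font.read_tfm("myfont", tex.sp("10pt"))
f.stretch, f.shrink, f.step = 30, 20, 5
```

On the TeX end the same request can presumably be made for an already-loaded font with \expandglyphsinfont.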
On 2/9/2018 10:34 AM, Patrick Gundlach wrote:
Hello all,
I always come up with vague questions, so here is another one.
I use a virtual font to re-encode a real font, and it seems that font expansion
is no longer applied. Is this a well-known issue? Both fonts (the real font and
the virtual font