I have uploaded my code:

https://github.com/kusumotonorio/factor/blob/win-transparent-background-text/basis/windows/uniscribe/uniscribe.factor


I want to know how slow the code is when it draws text with a transparent background, so I am writing
a test app to measure it. But the app has an issue.
I try to measure the time by actually drawing a label with a transparent background over and over again, and that finishes very quickly. Creating an image from the text takes a long time (cases 3: and 4:), so the label benchmarks should actually take longer.
What's wrong with my code?
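
(For reference, a single transparent-background conversion can also be timed directly in the listener, roughly like this; the font setup simply mirrors the transparent label in the code below.)

USING: accessors colors fonts kernel prettyprint tools.time
windows.uniscribe windows.uniscribe.private ;
[
    <font>
        "monospace" >>name
        36 >>size
        T{ rgba f 0.0 0.0 0.0 1.0 } >>foreground
        T{ rgba f 0.0 0.0 1.0 0.0 } >>background
    "0000" <script-string> script-string>image drop
] benchmark .    ! prints the elapsed time in nanoseconds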


! Copyright (C) 2019 KUSUMOTO Norio.
! See http://factorcode.org/license.txt for BSD license.
USING: kernel accessors locals math formatting ui ui.gadgets.labels
ui.commands ui.gadgets ui.gadgets.toolbar ui.gadgets.tracks ui.gestures
fonts colors tools.time memory math.ranges namespaces arrays sequences
windows.uniscribe windows.uniscribe.private ;
IN: uni-bench

TUPLE: uni-bench-gadget < track
    opaque-label
    transparent-label
    opaque-time
    transparent-time ;

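! Build 5000 four-digit test strings ("0001" ... "5000") in a global.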
SYMBOL: test-strings
V{ } clone test-strings set-global
1 5000 [a,b]  [
    "%04d" sprintf test-strings get push
] each
test-strings get >array test-strings set-global

: <uni-bench-gadget> ( -- gadget )
    vertical uni-bench-gadget new-track
    {
        "  Press any of the keys listed below"
        ""
        "    1: Opaque background label (5000 times)"
        "    2: Transparent background label (5000 times)"
        "    3: Opaque script-string>image (5000 times)"
        "    4: Transparent script-string>image (5000 times)"
        ""
    } <label> f track-add

    "0000" <label> >>opaque-label
    "0000" <label> >>transparent-label
    "" <label> >>opaque-time
    "" <label> >>transparent-time

    dup opaque-label>>
    <font>
    "monospace" >>name
    T{ rgba f 0.0 0.0 0.0 1.0 } >>foreground
    T{ rgba f 0.0 0.0 1.0 1.0 } >>background
    36 >>size
    >>font f track-add

    dup transparent-label>>
    <font>
    "monospace" >>name
    T{ rgba f 0.0 0.0 0.0 1.0 } >>foreground
    T{ rgba f 0.0 0.0 1.0 0.0 } >>background
    36 >>size
    >>font f track-add

    dup opaque-time>> " Opaque:" label-on-left f track-add
    dup transparent-time>> " Transparent:" label-on-left f track-add ;

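! 1: and 2: set each test string on the label and request a relayout,
! then display the elapsed time in milliseconds.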
:: com-benchmark-opaque ( gadget -- )
    gadget opaque-label>> :> test-label
    gc
    [
        test-strings get-global [
            test-label swap >>text relayout-1
        ] each
    ] benchmark
    gadget opaque-time>>
    swap 1000000 / "%d ms" sprintf >>text relayout-1 ;

:: com-benchmark-transparent ( gadget -- )
    gadget transparent-label>> :> test-label
    gc
    [
        test-strings get-global [
            test-label swap >>text relayout-1
        ] each
    ] benchmark
    gadget transparent-time>>
    swap 1000000 / "%d ms" sprintf >>text relayout-1 ;


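! 3: and 4: convert each test string to an image directly with
! script-string>image, bypassing the label gadgets.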
:: com-benchmark-opaque-script-string>image ( gadget -- )
    gadget opaque-label>> font>> :> test-font
    gc
    [
        test-strings get-global [
            test-font swap <script-string> script-string>image drop
        ] each
    ] benchmark
    gadget opaque-time>>
    swap 1000000 / "%d ms" sprintf >>text relayout-1 ;

:: com-benchmark-transparent-script-string>image ( gadget -- )
    gadget transparent-label>> font>> :> test-font
    gc
    [
        test-strings get-global [
            test-font swap <script-string> script-string>image drop
        ] each
    ] benchmark
    gadget transparent-time>>
    swap 1000000 / "%d ms" sprintf >>text relayout-1 ;


uni-bench-gadget "gestures" f {
    { T{ key-down { sym "1" } } com-benchmark-opaque }
    { T{ key-down { sym "2" } } com-benchmark-transparent }
    { T{ key-down { sym "3" } } com-benchmark-opaque-script-string>image }
    { T{ key-down { sym "4" } } com-benchmark-transparent-script-string>image }
} define-command-map

M: uni-bench-gadget pref-dim* drop { 300 250 } ;

MAIN-WINDOW: uni-bench { { title "Uniscribe benchmark" } }
    <uni-bench-gadget> >>gadgets ;



On 2019/05/30 21:17, KUSUMOTO Norio wrote:
Thank you, Alexander! I will read it and study it. I tried the idea I wrote earlier. I've only tried it a little, but it seems to work well. Yay!
<https://pbs.twimg.com/media/D70GEGkVUAUxgWb.png>
-- KUSUMOTO Norio
On 2019/05/30 at 20:19, Alexander Ilin <ajs...@yandex.ru> wrote:
Hello! I think I have figured out the algorithm for the image processing. I'm not sure if it's the same one that you came up with. https://github.com/factor/factor/issues/152#issuecomment-497292323 Also, I don't know if there are standard WinAPI functions to perform the necessary operation, namely copying a color channel into the alpha channel, but I suspect there might be, in which case we won't lose performance there.
On 30.05.2019 at 03:50, "KUSUMOTO Norio" <kusum...@na.rim.or.jp> wrote:
Although it seems that transferring the appropriate data to the device context is the 'right' solution, I have a hunch that it will be a difficult task. So I am beginning to think that we should take another approach, a bit of a cheat. It's like a chroma key: Uniscribe draws the text on a colored background instead of a transparent one, and a word replaces that color with transparency when converting the bitmap to a Factor image. It may be slow, but we don't have many chances to draw characters on a transparent background. For example, on a button label, once Factor creates an image with such characters, Factor reuses that image, so I think there are few problems. This approach can also localize the changes. I can't define the word that does this special image conversion myself, but I think it's not a difficult task for someone familiar with the image-processing words.
---=====---
Александр
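
For reference, the chroma-key word I describe above might look roughly like this. This is only a sketch: the word name and the R/G/B/A channel order are my assumptions (the Windows bitmap is probably BGRA, so the offsets would need adjusting), and Alexander's variant would instead copy one color channel's value into the alpha byte.

USING: accessors kernel locals math math.ranges sequences ;

:: blue-key>alpha ( image -- image )
    image bitmap>> :> bytes
    0 bytes length 4 /i [a,b) [| n |
        n 4 * :> i
        i     bytes nth 0   =        ! red   = 0
        i 1 + bytes nth 0   = and    ! green = 0
        i 2 + bytes nth 255 = and    ! blue  = 255 -> key color
        [ 0 i 3 + bytes set-nth ] when
    ] each
    image ;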


