> Patrik Stridvall <ps@leissner.se> writes:
> 
> > Alexandre, can you describe what you dislike so much about 
> _my_ solution?
> > Your argument that the Wine binary is large enough as it is,
> > is not very persuasive even though, I do admit you have a point.
> > I think having a cleaner, more maintainable and faster solution
> > is worth the somewhat larger binaries.
> 
> It is only faster if you assume infinite RAM and cache, which few of
> us have on our machines... in real life doubling the size of the code
> has a large performance impact.

Yes, but both variants will not be in memory (or in cache) unless you run
one Unicode and one ASCII application with almost identical API usage.
If you only run ASCII applications, the code in memory/cache will
actually be smaller than now, since the "conversion" variant need not be
in memory. Not to mention the reduced need for data memory/cache
for storing the converted strings.
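
For clarity, this is roughly the shape of the "conversion" variant I mean;
the function names are invented for the example, it is not actual Wine code:

#include <windows.h>

extern BOOL WINAPI DoSomethingW( LPCWSTR name );

/* ASCII entry point implemented by converting and calling the Unicode
   one; this is the code (and the allocation) that an ASCII-only build
   would no longer carry around */
BOOL WINAPI DoSomethingA( LPCSTR nameA )
{
    INT len = MultiByteToWideChar( CP_ACP, 0, nameA, -1, NULL, 0 );
    LPWSTR nameW = HeapAlloc( GetProcessHeap(), 0, len * sizeof(WCHAR) );
    BOOL ret;

    MultiByteToWideChar( CP_ACP, 0, nameA, -1, nameW, len );
    ret = DoSomethingW( nameW );
    HeapFree( GetProcessHeap(), 0, nameW );
    return ret;
}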

> And I also strongly disagree with the
> "more maintainable" part; 

Do you consider a Windows _application_ that can be compiled as both ASCII
and Unicode less maintainable?

Sure, you get TCHAR, LPTSTR and friends in the code, sizeof(TCHAR)
is not always 1, and similar details. But do you _really_ consider the
application less maintainable? If you do not, why do you consider my
solution less maintainable?
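
To be concrete: the following compiles unchanged as either an ASCII or a
Unicode application, depending only on whether UNICODE is defined; the
strings are of course just an example:

#include <windows.h>

int WINAPI WinMain( HINSTANCE inst, HINSTANCE prev, LPSTR cmdline, int show )
{
    /* TEXT() expands to "..." or L"...", LPCTSTR to LPCSTR or LPCWSTR,
       and MessageBox resolves to MessageBoxA or MessageBoxW */
    LPCTSTR title = TEXT("Generic character types");
    MessageBox( NULL, TEXT("One source, two builds"), title, MB_OK );
    return 0;
}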

> compile-time options are a pain to maintain
> and debug 

Actually, my solution is not primarily a compile-time option.
Supporting multiple internal formats is not very useful,
even if it can easily be done if anybody so wishes.
Supporting ASCII only or Unicode only is more useful, though.

We will just have, with the source code in foo.c:

--- foo.ascii.c ---
#undef __WINE__
#define ASCII
#include "foo.c"
--- foo.unicode.c ---
#undef __WINE__
#define UNICODE
#include "foo.c"
---------------------

foo.ascii.c and foo.unicode.c are simply generated by:

.c.ascii.c :
        $(BUILD) @BUILDFLAGS@ -o $@ -ascii $<

.c.unicode.c :
        $(BUILD) @BUILDFLAGS@ -o $@ -unicode $<
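
And foo.c itself would be written against a generic character type, much
like a Windows application is; the macro names below are only placeholders
to show the idea, not a worked-out proposal:

--- foo.c (sketch only) ---
#include <windows.h>

#ifdef UNICODE
typedef WCHAR           WINE_TCHAR;          /* placeholder names */
# define WINE_TEXT(x)   L##x
# define WINE_NAME(fn)  fn##W
#else
typedef char            WINE_TCHAR;
# define WINE_TEXT(x)   x
# define WINE_NAME(fn)  fn##A
#endif

/* compiled once with -ascii and once with -unicode, this one body
   produces both DoSomethingA and DoSomethingW */
BOOL WINAPI WINE_NAME(DoSomething)( const WINE_TCHAR *name )
{
    static const WINE_TCHAR none[] = WINE_TEXT("<none>");
    if (!name) name = none;
    /* ... the real work, done directly on 'name' ... */
    return TRUE;
}
---------------------------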
        
> (not to mention that compilation times are doubled 
> of course).

Yes, that is unavoidable, but note that it applies _only_ to functions with
two variants, and you save the time needed to compile the "conversion"
variant.

> > Note that it doesn't
> > hurt embedded system, my solution allows an ASCII only Wine.
> 
> Embedded systems may want Unicode too (CE is mostly Unicode AFAIK).

Sure, you can run Unicode only if you wish.

Note that Unicode only means that Wine can't handle ASCII; the
application can of course convert ASCII to Unicode itself for
all system calls, which is what a normal Unicode application does.
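
That is, an application with ASCII data would do the conversion itself
before the call, roughly like this (the function is invented for the
example, error handling omitted):

#include <windows.h>

void set_title_from_ascii( HWND hwnd, const char *ascii )
{
    WCHAR buffer[256];

    /* convert to Unicode and use the W entry point, which is all
       a Unicode-only Wine would need to provide */
    MultiByteToWideChar( CP_ACP, 0, ascii, -1, buffer, 256 );
    SetWindowTextW( hwnd, buffer );
}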
 
> > It is not the double conversion that seems wasteful, it's the extra
> > memory allocations.
> 
> There is nothing that says we have to allocate memory on every
> conversion. There are a lot of possible optimizations, like the
> per-thread Unicode string buffer that ntdll uses.

Yes, I know. Of course we can optimize the current solution,
but that doesn't mean that better solutions, like possibly
mine, don't exist. In fact many other solutions can use
that optimization too, so it is not really an argument
for the current solution.
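
For reference, the kind of optimization meant here is roughly the
following; the names are invented and gcc's __thread is used only for
brevity, the point being that a reusable per-thread buffer is not tied
to the current one-internal-format design:

#include <windows.h>

/* one conversion buffer per thread, reused across calls, so a
   conversion wrapper needs no HeapAlloc/HeapFree per call */
static __thread WCHAR conv_buffer[1024];

static const WCHAR *convert_to_unicode( const char *str )
{
    if (!str) return NULL;
    if (!MultiByteToWideChar( CP_ACP, 0, str, -1, conv_buffer,
                              sizeof(conv_buffer)/sizeof(WCHAR) ))
        return NULL;  /* string too long for the static buffer */
    return conv_buffer;
}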
