On Fri, 26 Feb 2010, Gustavo Sverzut Barbieri wrote:

On Fri, Feb 26, 2010 at 8:12 AM, Vincent Torri <vto...@univ-evry.fr> wrote:
On Fri, 26 Feb 2010, Gustavo Sverzut Barbieri wrote:

On Fri, Feb 26, 2010 at 4:22 AM, Vincent Torri <vto...@univ-evry.fr>
wrote:

On Thu, 25 Feb 2010, Enlightenment SVN wrote:

Log:
 TRUE/FALSE are gone, use EINA_TRUE/EINA_FALSE instead.

Author:       barbieri
Date:         2010-02-25 16:59:11 -0800 (Thu, 25 Feb 2010)
New Revision: 46500

Modified:
 trunk/eina/src/include/eina_types.h

Modified: trunk/eina/src/include/eina_types.h
===================================================================
--- trunk/eina/src/include/eina_types.h       2010-02-26 00:57:20 UTC
(rev 46499)
+++ trunk/eina/src/include/eina_types.h       2010-02-26 00:59:11 UTC
(rev 46500)
@@ -214,25 +214,13 @@
#endif /* ! __GNUC__ && ! _WIN32 && ! __SUNPRO_C */


-/* remove this TRUE/FALSE redifinitions */
-
/**
- * @deprecated Use #EINA_TRUE instead.
- */
-#ifndef TRUE
-# define TRUE 1
-#endif
-
-/**
- * @deprecated Use #EINA_FALSE instead.
- */
-#ifndef FALSE
-# define FALSE 0
-#endif
-
-/**
 * @typedef Eina_Bool
 * Type to mimic a boolean.
+ *
+ * @note it differs from stdbool.h as this is defined as an unsigned
+ *       char to make it usable by bitfields (Eina_Bool name:1) and
+ *       also take as few bytes as possible.
 */
typedef unsigned char Eina_Bool;

Bit-fields of char type are an extension of gcc (and of some other compilers).
The C standard says that "fields are packed into 'storage units', which
are typically machine words". Compile a small example with gcc -pedantic
and you'll see a warning. C99 allows _Bool, though.

I'm not at all sure that they take as few bytes as possible in the sense
that you implicitly suggest (one char -> only 1 byte in memory). They are
always aligned to 32 bits. They just need to be consecutive to take as
little space in memory as possible. If you use some gcc options to pack
bitfields, be aware that the library might not work at all with vc++,
as vc++ packs bitfields differently. That said, you'll certainly add
that Windows is my territory, not yours :p

damn, if it breaks then we need to convert from char to int ASAP, so
we don't break the ABI after release.

I'm quite sure this is in lots and lots of code paths in EFL/E17 now,
so it should "just work", no?

I've found that option for gcc (mingw):

-mms-bitfields

It should solve the problem when compiling the EFL on Windows.

Anyway, I think that your note is not entirely correct.

but should we keep using it as uchar or move to uint?

uint is the standard. Using uchar should not be a problem on Windows if I pass -mms-bitfields.

I'm really wondering whether there is any size improvement when using a char rather than an int.

Vincent
_______________________________________________
enlightenment-devel mailing list
enlightenment-devel@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/enlightenment-devel