So, to make it clearer -- sorry if I'm a bit slow on this, but I like things black-on-white and crystal-clear:
At the moment (as far as I understand it) _all_ the records and typedefs from the whole include-chain triggered by a single "import_from" are stored in a database to be read by orogen. This allows usage of "struct timespec" although no one explicitly did "import_from time.h". It also allowed usage of "uint16_t" for properties although no one explicitly did "import_from stdint.h". This behaviour was a consequence of how gccxml works. Is this parse-and-store-everything still desired? It makes the database large -- an eigen-based example yields ~200 RecordDecls from system headers and ~130 from eigen headers.

Now, given a set of header files, the clang-based tool should:

- give all TypeDecls defined somewhere in the include-chain of the given headers, with their CanonicalName and field-layout. If a TypeDecl is a RecordDecl: for each field, recursively return either the offset and BuiltinType, or the CanonicalName and field-layout.
- give _all_ mappings "typedef" -> "CanonicalName" declared somewhere in the include-chain of the given header file.
- give all enum values as strings.

The outputs from different calls are then merged and collected in some typelib database, however that is organized...

Additionally, reject Records with:

- pointers
- virtual functions
- references
- bitfields?

And warn about Records with:

- field types of architecture-dependent size, like int

Hm?
_______________________________________________
Rock-dev mailing list
[email protected]
http://www.dfki.de/mailman/cgi-bin/listinfo/rock-dev
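To make the "for each field, recursively return offset plus BuiltinType, or CanonicalName plus field-layout" point concrete, here is a minimal sketch of that recursive walk. It uses Python's ctypes on hand-declared structs instead of clang-parsed headers, and the struct names and the `layout` helper are invented for illustration -- the real tool would of course query clang's record layout instead:

```python
import ctypes

# Hand-declared records standing in for something parsed out of a header.
class Inner(ctypes.Structure):
    _fields_ = [("a", ctypes.c_uint16), ("b", ctypes.c_double)]

class Outer(ctypes.Structure):
    _fields_ = [("x", ctypes.c_int32), ("inner", Inner)]

def layout(record, base=0):
    """Recursively yield (offset, type-name) for every leaf field."""
    for name, ftype in record._fields_:
        # Field descriptors on the class carry the byte offset.
        off = base + getattr(record, name).offset
        if issubclass(ftype, ctypes.Structure):
            # RecordDecl case: recurse into the nested record's fields.
            yield from layout(ftype, off)
        else:
            # BuiltinType case: emit the accumulated offset and the type.
            yield off, ftype.__name__

for off, tname in layout(Outer):
    print(off, tname)
```

The offsets printed depend on the host's alignment rules, which is exactly the information typelib needs to record per architecture.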
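The reject/warn rules could then be a simple filter applied to each record after the walk. A toy sketch follows -- the field model, the kind names, and the set of architecture-dependent types are all invented here for illustration; a real implementation would inspect clang's cursors and types instead:

```python
# Invented stand-in for clang field info: (name, kind, type spelling).
POINTER, REFERENCE, BITFIELD, BUILTIN = (
    "pointer", "reference", "bitfield", "builtin")

# Guessed set of builtin types whose width depends on the target.
ARCH_DEPENDENT = {"int", "unsigned int", "long", "unsigned long", "size_t"}

def check_record(fields, has_virtual_methods=False):
    """Return (reject_reasons, warnings) for one record."""
    reasons, warnings = [], []
    if has_virtual_methods:
        reasons.append("has virtual functions")
    for name, kind, spelling in fields:
        if kind in (POINTER, REFERENCE, BITFIELD):
            # Hard rejection: these cannot be marshalled by typelib.
            reasons.append("field '%s' is a %s" % (name, kind))
        elif kind == BUILTIN and spelling in ARCH_DEPENDENT:
            # Soft warning: layout differs between architectures.
            warnings.append("field '%s' has architecture-dependent "
                            "type '%s'" % (name, spelling))
    return reasons, warnings

# Something like "struct Node { Node *next; int count; };" would be
# rejected for the pointer and warned about for the plain int.
reasons, warnings = check_record(
    [("next", POINTER, "Node *"), ("count", BUILTIN, "int")])
```

Whether bitfields belong in the hard-reject list (the "bitfields?" above) is exactly the open question; moving them to the warning branch would be a one-line change.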
