Thanks Raul,
Showing my ignorance, but how does the compiler affect the stored
structure? Is this an endianness issue, or do different compilers lay
out the same type differently in other ways? I think gcc has been used
until recently, but the Intel compiler is likely to be used going
forward.
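
(Thinking about it some more: I gather byte order is really a property
of the platform rather than the compiler, while padding/alignment is
the compiler's choice, so the field offsets may need checking as well
as the decoding. The endian half at least is easy to express in J -
the same four byte values read both ways:

   256 #. 1 0 0 0        NB. bytes 01 00 00 00 read big-endian
16777216
   256 #. |. 1 0 0 0     NB. the same bytes read little-endian
1

so whichever order the file uses just comes down to a reverse-or-not
before the base-256 decode.)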

It shouldn't be too big an issue to get test files to check against.
Yes, sorting out the byte size and bit-conversion method for each C
type seems like the way to go - something along the lines of the rough
sketch below.
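
To make that concrete, here is a first cut, assuming a little-endian
x86 file written with gcc's default layout (ints at offsets 0 4 8,
shorts at 12 14 16, two pad bytes, the four floats at 20, so 36 bytes
per record - I worked those offsets out by hand, so they need checking
against a real test file). All the names here (uint, int4, flt4,
parseINDIV, readINDIV, the file name) are just placeholders:

uint =: 3 : '256 #. |. a. i. y'                     NB. unsigned little-endian
int4 =: 3 : '(uint y) - (2^32) * (2^31) <: uint y'  NB. signed 32-bit
int2 =: 3 : '(uint y) - (2^16) * (2^15) <: uint y'  NB. signed 16-bit

flt4 =: 3 : 0                          NB. IEEE 754 single precision
 b =. uint y
 s =. <. b % 2^31                      NB. sign bit
 e =. 256 | <. b % 2^23                NB. biased exponent
 m =. (2^23) | b                       NB. mantissa
 nrm =. (e ~: 0) * (2 ^ e - 127) * 1 + m % 2^23
 den =. (e = 0) * (2 ^ _126) * m % 2^23             NB. zeros/denormals
 (_1 ^ s) * nrm + den                  NB. Inf and NaN not handled
)

parseINDIV =: 3 : 0                    NB. y is one 36-byte record
 'id sire dam' =. int4"1 _4 ]\ 12 {. y
 breed =. uint 2 {. 12 }. y
 yob   =. uint 2 {. 14 }. y
 inb   =. int2 2 {. 16 }. y
 brd   =. flt4"1 _4 ]\ 16 {. 20 }. y
 id, sire, dam, breed, yob, inb, brd
)

readINDIV =: 3 : 'parseINDIV"1 _36 ]\ 1!:1 < y'     NB. file -> numeric table

Then readINDIV 'indiv.bin' should give one numeric row per record
(assuming the file length is an exact multiple of 36). Replacing the
hard-coded offsets with a table of (field name; byte size; converter)
per C type would get to the meta-function Raul describes.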

On Wed, Jun 1, 2011 at 10:16 AM, Raul Miller <[email protected]> wrote:
> On Tue, May 31, 2011 at 5:52 PM, Ric Sherlock <[email protected]> wrote:
>> I need to interact with some binary files that are basically stored C
>> structures.
>> Before I attempt to reinvent the wheel, I was wondering if anybody has
>> any tools/utilities they are willing to share for reading such files
>> and converting them to J arrays?
>> I have the declared type structure of the files - for example:
>>
>> struct INDIV  {
>>    int id;
>>    int sireid;
>>    int damid;
>>    unsigned short breed;
>>    unsigned short yob;
>>    short inbreed;
>>    float brd[4];
>> };
>>
>> Alternatively I'm also open to advice on how best to approach the task.
>
> Note that you need to know which compiler is being used before you can
> interpret the structs.
>
> Note also that the best approach probably also involves a test file so
> you can quickly see if your code is working right.
>
> That said, each of the types involved corresponds to a size and a bit
> interpreter, and if you had names in one locale for sizes and names in
> another locale for methods to convert the bits to J form I imagine you
> could build a "record -> read function" meta function.
>
> --
> Raul