Hello everyone,

I am currently in the process of porting GCC to an ISA I designed with a few others (spec [1]). Using the old ggx/moxie tutorial as a guideline and looking at the source of other backends, as well as the quite decent gccint documentation, I managed to get basic programs running (binutils, especially gas, was also not too hard to get working). I am writing here partly because the gccint documentation tells me to do so (I am unsure whether our project will ever be mature enough to be added to GCC, but I don't think it hurts to strive for that), and partly because I need a bit of help with some things I can't find documentation for.

One of the big challenges I am facing is that our ISA is configurable: depending on the target CPU, the maximum width the hardware supports, including for pointers, is sometimes 16 bits and sometimes 32 or 64 bits. Am I seeing it correctly that POINTER_SIZE has to be a compile-time constant and can therefore not easily be changed by the backend during compilation based on command-line arguments?
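Concretely, this is roughly what I would hope to be able to write in our target header (a sketch; TARGET_16BIT and TARGET_64BIT are hypothetical flags that would be derived from our -m command-line options):

    /* etca.h -- sketch.  TARGET_16BIT and TARGET_64BIT are hypothetical
       flags derived from -m command-line options.  The hope is that
       POINTER_SIZE may be an expression evaluated when the compiler
       runs, rather than a literal fixed when the compiler is built.  */
    #define POINTER_SIZE (TARGET_16BIT ? 16 : TARGET_64BIT ? 64 : 32)
    #define Pmode (TARGET_16BIT ? HImode : TARGET_64BIT ? DImode : SImode)

I believe I have seen other backends define target macros as expressions over flags like this, but I am not sure whether that is actually supported for POINTER_SIZE and Pmode specifically.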

Also, in another backend I saw comments suggesting that libgcc (or newlib?) does not work that well on systems where int is 16 bits. Is that still true, and if so, what is the best workaround?

And now something more concrete that I am having a hard time debugging: I am getting `invalid_void` errors, seemingly triggered by the absence of `gen_return`, when compiling with anything higher than -O0. How do I correctly provide an implementation for that, or disable its use? Our epilogue is non-trivial, and it doesn't look like the other backends use something like `(define_insn "return" ...)` for this; a sketch of what I have been trying instead is below.
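For reference, this is roughly the pattern I have been experimenting with in our .md file, modeled on what the moxie port does (a sketch; `etca_expand_epilogue` is a hypothetical helper in our etca.c that emits the non-trivial epilogue sequence):

    ;; Sketch: expand the epilogue through a hypothetical C helper,
    ;; since our epilogue is too complex for a single insn pattern.
    (define_expand "epilogue"
      [(return)]
      ""
    {
      etca_expand_epilogue ();
      DONE;
    })

Do I additionally need a `(define_insn "return" ...)` or `simple_return` pattern so that the optimizers stop looking for `gen_return`, or is there a way to tell them that it is not available?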

Many thanks in advance,

MegaIng


[1]: https://github.com/ETC-A/etca-spec
