Hi guys,

Go gurus, please help me with the following problem.

I want to build a simple calculator for algebraic expressions, aka a computer 
algebra system. In this calculator every object has a `type Ex uintptr` 
(expression). An optimization I want to introduce is that atomic objects, 
like small numbers and symbols, are encoded in a single 64-bit integer, 
while more complicated objects, like polynomials, are stored separately in 
larger pieces of data, e.g., arrays. To implement that I want the 3 low bits 
of my Ex to encode the type of every expression. So if I see 000 then I 
know it is a pointer to a Container (assuming that every pointer is aligned 
to 8 bytes), if 001 then it is a 61-bit integer, if 010 then it is a symbol, 
etc.
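To make the scheme concrete, here is a minimal sketch of the tagging I have in mind (the tag values and helper names like `fromInt`/`toInt` are just illustrations, not a finished design):

```go
package main

import "fmt"

// Ex is a tagged 64-bit word: the low 3 bits encode the kind of
// expression, the remaining 61 bits hold the payload.
type Ex uintptr

const (
	tagMask    Ex = 0b111
	tagPointer Ex = 0b000 // pointer to a Container (8-byte aligned)
	tagInt     Ex = 0b001 // 61-bit small integer
	tagSymbol  Ex = 0b010 // interned symbol index
)

// fromInt packs a small integer into an Ex.
func fromInt(n int64) Ex { return Ex(n<<3) | tagInt }

// toInt unpacks it; the arithmetic shift on int64 restores the sign.
func toInt(e Ex) int64 { return int64(e) >> 3 }

func main() {
	e := fromInt(-42)
	fmt.Println(e&tagMask == tagInt, toInt(e)) // true -42
}
```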

With such a notation, I am not sure my Containers are safe from the GC: 
currently Ex is defined as uintptr, which is not a real pointer in Go, so 
the GC may conclude that there are no references to a given Container and 
collect it. Can anyone comment on that and suggest a correct implementation 
of the idea above?
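One workaround I have been considering (not sure it is the idiomatic one) is to keep all Containers in a Go slice and store a slice index in the payload bits instead of a raw pointer, so the slice itself keeps every Container reachable for the GC. A rough sketch, with made-up names:

```go
package main

import "fmt"

// Ex is a tagged 64-bit word; the low 3 bits encode the kind.
type Ex uintptr

// Container holds a non-atomic expression, e.g. a polynomial's terms.
type Container struct {
	Terms []Ex
}

const (
	tagMask      Ex = 0b111
	tagContainer Ex = 0b000
)

// containers keeps every Container reachable, so the GC never
// collects one while an Ex handle to it exists.
var containers []*Container

// newContainer registers c and returns its tagged handle (index << 3).
func newContainer(c *Container) Ex {
	containers = append(containers, c)
	return Ex(len(containers)-1) << 3 // tag bits are 000
}

// container resolves a handle back to its Container.
func container(e Ex) *Container {
	return containers[e>>3]
}

func main() {
	e := newContainer(&Container{})
	fmt.Println(e&tagMask == tagContainer, container(e) != nil)
}
```

The obvious drawback is that entries in the slice are never freed, so this trades the GC-safety question for a manual lifetime question.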

Best regards,
Oleksandr.

-- 
You received this message because you are subscribed to the Google Groups 
"golang-nuts" group.