The idea of a very large number (a megabit, or 100 kilobits) is that it allows
for an extensive assignment of meaning-by-position. But only a small number of
bits would be used during any calculation, so a compression method would be
employed. I would not actually do any calculations with million-bit numbers.
The calculations would be done with an algorithm that used compressed
representations of the numbers, so they would not have to be decompressed to
be used. Another alternative that I am thinking of is to use a series of
'sentences', but the sentences would not be thought of as text (in the common
sense) but as numbers. I might also use a combination of the two methods, or
of other methods.
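As a rough sketch of what I mean by computing on the compressed
representations directly (this is just an illustration, not the actual
design): store only the positions of the 1-bits of a million-bit number, and
do bitwise operations on those position sets without ever expanding the full
number.

```python
class SparseBits:
    """A huge bit-vector stored as the set of its set-bit positions.

    Bitwise AND/OR become set intersection/union, so two "million-bit"
    numbers with only a handful of bits set are combined without ever
    materializing a million-bit value.
    """

    def __init__(self, positions=()):
        self.positions = frozenset(positions)

    def __and__(self, other):
        # Bitwise AND = intersection of set-bit positions.
        return SparseBits(self.positions & other.positions)

    def __or__(self, other):
        # Bitwise OR = union of set-bit positions.
        return SparseBits(self.positions | other.positions)

    def bit_count(self):
        # Number of 1-bits, read straight off the compressed form.
        return len(self.positions)


# Two "million-bit" numbers with only a few meaningful positions set:
a = SparseBits([3, 100_000, 999_999])
b = SparseBits([3, 999_999])

print(sorted((a & b).positions))  # -> [3, 999999]
print((a | b).bit_count())        # -> 3
```

The point is that the cost of an operation scales with the number of
meaningful bits, not with the nominal width of the number.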

The mathematics that I am thinking about now would be like a preliminary
indexing of information. Instead of using traditional computational methods, I
might use sets of specialized computational methods developed for the system I
am considering. If the computational methods - the mathematical abstractions
that guide the computations - were separated from the conceptual database,
then it might be possible to use them through a fast look-up. (Traditional
computational methods use an extremely efficient set of mathematical
abstractions, or algorithms: the program does not have to do an extensive
search through the database of concepts in order to find how it should make a
computation using traditional mathematics.) I wonder if I could design
something that was similarly efficient for my specially designed computational
methods. The solution to that little sub-problem is to keep the mathematical
abstractions of computation separate from the rest of the concept database.
They could be kept in RAM using an efficient look-up method.
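To illustrate the separation I have in mind (the names here are only
hypothetical): the concept database can be as large and slow to search as it
likes, while the computational abstractions sit in a small in-RAM hash table,
so finding how to make a computation is a single constant-time look-up rather
than a search.

```python
# Hypothetical concept database: imagine this as very large and
# expensive to search through.
concept_db = {
    "number": "a concept entry ...",
    "addition": "a concept entry ...",
    "position": "a concept entry ...",
}

# The mathematical abstractions of computation, kept separate from the
# concept database. A Python dict is a hash table, so each look-up is
# effectively constant time.
methods = {
    "add": lambda x, y: x + y,
    "mul": lambda x, y: x * y,
}

def compute(op, x, y):
    # Direct look-up of the method; no search through concept_db at all.
    return methods[op](x, y)

print(compute("add", 2, 3))  # -> 5
print(compute("mul", 4, 5))  # -> 20
```

The concept database is never consulted on the computation path; it only
supplies meaning, while the small methods table supplies procedure.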
Jim Bromer


On Wed, Jun 12, 2019 at 8:58 AM <[email protected]> wrote:

> This probably sounds stupid, but...
> With the generation idea, about getting the A.I. to develop images from an
> algebraic model: it has a problem where the output is no longer symbolic
> (it's pictures). I wonder if it might be easier for the computer to describe
> it in words, as if it were a picture... so I wonder if there is a textual
> language you could develop, which passes on to a graphics unit in turn
> after it.
>
> And a million bits gives you a very, very big number, more than most
> quantum computers would bother to support, if "they" had them. :)
> *Artificial General Intelligence List <https://agi.topicbox.com/latest>*
> / AGI / see discussions <https://agi.topicbox.com/groups/agi> +
> participants <https://agi.topicbox.com/groups/agi/members> + delivery
> options <https://agi.topicbox.com/groups/agi/subscription> Permalink
> <https://agi.topicbox.com/groups/agi/T395236743964cb4b-M1a5d67cb599e68b101067488>
>

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T395236743964cb4b-Mace9e68543027da8d8fa099a
Delivery options: https://agi.topicbox.com/groups/agi/subscription
