I know of things we can do to GPT or DALL-E to make them more AGI-like. PM me or post below if this interests you like it interests me.
GPT has short-term memory: if you present it with "my cat ate food, my cat ate", it will predict "food" with high confidence. GPT also has a somewhat passable form of online learning, because it can learn in batches as long as its network is big enough. GPT therefore has long-term memory and can store its own thoughts. This can be improved if we _try_. A recent OpenAI.com blog post shows they tried mental RL and it works, and Facebook's Blender did something similar, which also worked.

I have first-hand experience generating with GPT, Jukebox, iGPT, etc., and I have seen what DALL-E made. It produced nearly human-level completions that were long and took the whole input into account, even on first tries. I know related methods that do similar things, such as word2vec and PPM, so I know it is not just repeating parts of its data: it intelligently predicts novel data. I have dreamt new music, like Ninja Gaiden Black songs; it was similar, but completely new.

My brain, like GPT, can predict the rest of a song, text, image, or video quickly, without putting much thought into what I'm composing. GPT does not critically ponder how to edit a piece. So we need the GPT algorithm because it can do what my brain does for the automatic part, and it is the world's most efficient technique for doing so.

I'm looking for others who are interested in AGI and DALL-E. If you are interested in building the future of DALL-E, PM me or post beneath. We will discuss it in my group E-MERGE and learn together what has been proven to work and what future parts/tricks we should look into, so that we can build a picture and understand how to create AGI.

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/T52b88d09b39e3984-Mc34b54e8b8f7329653a116ee
Delivery options: https://agi.topicbox.com/groups/agi/subscription
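The "my cat ate food, my cat ate" example above can be illustrated with a toy predictor in the spirit of the PPM-style methods mentioned in the post. This is not GPT, just a minimal sketch I'm adding for illustration: it counts word bigrams inside the prompt itself and predicts the most frequent follower of the last word, which is enough to recover "food" from the repeated pattern.

```python
from collections import Counter, defaultdict

def predict_next_word(prompt: str) -> str:
    """Toy bigram predictor: count which word follows each word in the
    prompt, then return the most common follower of the last word."""
    words = prompt.lower().replace(",", "").split()
    followers = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        followers[prev][nxt] += 1
    last = words[-1]
    if not followers[last]:
        return ""  # last word never appeared earlier, so no prediction
    return followers[last].most_common(1)[0][0]

print(predict_next_word("my cat ate food, my cat ate"))  # → food
```

Of course, GPT does vastly more than this (it generalizes across contexts it has never seen verbatim), but the sketch shows the basic sense in which the prompt acts as short-term memory.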
