Look up AIXI for the relevance of compression to AGI. You are rather obligated to do this, since "AGI" -- the term -- originated with AIXI. If anyone is going to get "banned" from an "AGI" group, it should be those who are attempting to appropriate the term "AGI" for vague definitions better suited to the less specific term "AI".
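For reference, Hutter's AIXI action rule is an expectimax over all environment programs, weighted by a Solomonoff-style prior. This is a sketch from memory -- see Hutter's "Universal Artificial Intelligence" for the exact statement and index conventions:

```latex
a_k := \arg\max_{a_k} \sum_{o_k r_k} \cdots \max_{a_m} \sum_{o_m r_m}
        \left( r_k + \cdots + r_m \right)
        \sum_{q \,:\, U(q,\, a_1 \ldots a_m) \,=\, o_1 r_1 \ldots o_m r_m} 2^{-\ell(q)}
```

The 2^{-ℓ(q)} factor is where compression enters: shorter programs q that reproduce the observed history dominate the prior, so finding good compressions of experience and finding good predictive models are the same problem.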
Here's the way to think about AIXI/AGI:

    AIXI = Compression * Sequential Decision Theory
    Compression = Science = "Is"
    Sequential Decision Theory = Engineering = "Ought"

On Sat, Mar 21, 2020 at 10:24 AM Alan Grimes via AGI <[email protected]> wrote:

> I had been holding back on flaming the compression threads because there
> is some level of validity to the approach.
>
> I mostly come from an engineering mindset, as in taking a robot or
> avatar and implementing all the capabilities of the human baseline.
>
> Another approach is to build the theoretical model first and then apply
> it to specific cases such as robotics, etc.
>
> Now this second approach is a bit beyond what I am comfortable thinking
> about, but I can't deny it's a valid approach.
>
> OK, compression. What links compression to AGI?
>
> Compression can be considered a valid approach to AGI to the extent that
> the compression achieved reflects the creation of a functional
> conceptualization -- that is, a conceptualization that can be separated
> from the mechanics of compression and decompression and used in
> cognitive processes in general. In these threads, I'm just seeing a
> sprint for compression ratio, without any focus on extracting or using
> concepts that are actually useful to AGI research.
>
> Second, there is the focus on text compression. Text is an example of
> low-to-medium information-density structured data. Audio and video
> recordings also have these properties and are vastly more interesting as
> research targets, since they admit none of the stupid hacks that can be
> implemented with something as limited as text, such as hard-coded
> knowledge. So that's my second gripe: there is no focus on generalizable
> principles of data preparation and pipelining for data in general that
> would plausibly lead to AGI. =|
>
> --
> Clowns feed off of funny money;
> Funny money comes from the FED
> so NO FED -> NO CLOWNS!!!
>
> Powers are not rights.
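On Alan's question "What links compression to AGI?": a compressor shrinks data exactly to the extent that it models the data's regularities, so compression ratio is a crude proxy for modeling power. A minimal sketch of this point, using Python's zlib as a stand-in compressor (the data and variable names are my own, purely illustrative):

```python
import random
import zlib

# Structured data: repetitive English-like text. A model of its
# regularities (here, zlib's LZ77 back-references) shrinks it a lot.
structured = b"the cat sat on the mat and the dog sat on the log " * 40

# Unstructured data: pseudo-random bytes of the same length.
# With no regularities to model, little or no compression is possible.
rng = random.Random(0)
noise = bytes(rng.randrange(256) for _ in range(len(structured)))

def ratio(data: bytes) -> float:
    """Compressed size over original size; lower means more structure found."""
    return len(zlib.compress(data, 9)) / len(data)

print(f"structured: {ratio(structured):.3f}")  # well below 1.0
print(f"noise:      {ratio(noise):.3f}")       # near (or slightly above) 1.0
```

Of course zlib's "model" is a dictionary of byte repeats, not a reusable concept -- which is exactly Alan's gripe about ratio-sprinting versus extracting conceptualizations that survive outside the codec.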
> ------------------------------------------
> Artificial General Intelligence List: AGI
> Permalink: https://agi.topicbox.com/groups/agi/T2a0cd9d392f9ff94-Mec144b450769e4af70208222
> Delivery options: https://agi.topicbox.com/groups/agi/subscription
