Confirmation bias is a violation of the "necessity" clause in Ockham's Razor:
do not multiply entities beyond necessity.  The choice to eliminate
observations that you _think_ are "noise", or that you _think_ have no
bearing on making a model more predictive, is unjustifiable in the limit of
Solomonoff Induction if, for no other reason, than that the Kolmogorov
complexity SI relies on is uncomputable: you can never certify in advance
that a discarded observation carries no information for the shortest model.
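To make that concrete, here is a toy sketch of my own (not anything from the
thread): a 2^-length prior over a couple of hypothetical hypotheses, in the
spirit of Solomonoff induction, showing how deleting an observation you deem
"noise" flips which model dominates. The hypotheses, their description
lengths, and the data are all made up for illustration.

```python
def posterior(hypotheses, data):
    """Weight each hypothesis consistent with the data by
    2^-description_length, then renormalize (a crude stand-in for
    the Solomonoff prior over programs)."""
    weights = {
        name: 2.0 ** -length
        for name, (predict, length) in hypotheses.items()
        if all(predict(i) == bit for i, bit in data)
    }
    total = sum(weights.values())
    return {name: w / total for name, w in weights.items()}

# Hypothetical hypotheses: (prediction function, description length in bits).
hypotheses = {
    "all_zeros":    (lambda i: 0,                      5),
    "spike_each_5": (lambda i: 1 if i % 5 == 4 else 0, 12),
}

full_data     = [(0, 0), (1, 0), (2, 0), (3, 0), (4, 1)]  # keep everything
filtered_data = full_data[:-1]                            # the "1" discarded as noise

print(posterior(hypotheses, full_data))      # only spike_each_5 survives
print(posterior(hypotheses, filtered_data))  # all_zeros now dominates
```

The point of the toy: with the full data, the shorter model is falsified and
the longer one carries all the weight; throw out the inconvenient bit and the
posterior swings the other way.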

On Thu, Oct 10, 2019 at 1:04 PM <[email protected]> wrote:

> I don't know what you mean by confirmation bias, but lossy matching is
> definitely, imo, the most important thing a neural network does for people,
> even though it's actually slower. Exact lossless matching is quicker and
> starts off simpler to code, but ends up more complex in the end before it
> actually does anything any good.
>
> But confirmation bias???

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T8c8ee84b385720a5-Mc668a17ffc80b6898c0e629d
Delivery options: https://agi.topicbox.com/groups/agi/subscription
