You're making gross assumptions about what similarity is without leaving any
wiggle room, which is fine and might work for your purposes. I just
generalized the formula in a particular way.

 

I like my formula.

 

Michael didn't put any limits on the numbers :)

 

John 

 

From: Aaron Hosford [mailto:[email protected]] 
Sent: Friday, February 21, 2014 7:46 PM
To: AGI
Subject: Re: [agi] Numeric Similarity

 

How would you go about efficiently estimating Kolmogorov complexity for
floating point numbers?

 

The formula f(a, b) = 1 / (1 + d(a, b)) should do the job regardless of the
choice of distance metric d, so long as d conforms to the formal definition
of a distance metric. It's really a matter of what dimension of similarity
you want to look at. There is linear distance, d(a, b) = |a - b|; your
Kolmogorov-based distance, d(a, b) = |K(a) - K(b)|; harmonic distance,
d(a, b) = p * q (where p / q = |a / b| with gcd(p, q) = 1, only good for
nonzero a, b); discrete distance, d(a, b) = 0 if a = b and 1 otherwise; or
even other definitions based on Kolmogorov complexity, like d(a, b) =
K(|a - b|). I think, though, that for most applications the simplest
definition, d(a, b) = |a - b|, is going to make the most sense.
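For concreteness, the options above can be sketched in a few lines of Python. This is my illustration, not something from the thread: the helper names are mine, and the harmonic variant is restricted to nonzero integers so the ratio can be reduced exactly.

```python
from fractions import Fraction

def similarity(a, b, d):
    """Map any distance metric d to a similarity score in (0, 1]."""
    return 1.0 / (1.0 + d(a, b))

def linear_distance(a, b):
    return abs(a - b)

def discrete_distance(a, b):
    # The standard discrete metric: 0 when equal, 1 otherwise.
    return 0 if a == b else 1

def harmonic_distance(a, b):
    # Reduce |a / b| to lowest terms p / q and return p * q.
    # Only good for nonzero integers a, b.
    r = Fraction(abs(a), abs(b))
    return r.numerator * r.denominator

print(similarity(3, 5, linear_distance))    # 1 / (1 + 2) = 0.333...
print(similarity(4, 4, discrete_distance))  # 1.0
print(similarity(2, 6, harmonic_distance))  # 2/6 -> 1/3 -> 1*3, so 0.25
```

Any function satisfying the metric axioms can be dropped in for `d`; the identity d(a, a) = 0 guarantees that identical values always score exactly 1.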

 

On Fri, Feb 21, 2014 at 6:27 PM, <[email protected]> wrote:


K() would be the Kolmogorov complexity. Matt Mahoney always complains that
Kolmogorov complexity is not computable, but never mentions that it is estimable.

John




On 2014-02-21 19:05, Piaget Modeler wrote:

I like this too. How would one define K?

~PM

-------------------------


From: [email protected]
To: [email protected]
Subject: RE: [agi] Numeric Similarity
Date: Fri, 21 Feb 2014 16:18:16 -0500

It could also be something like this:

Similarity(A, B) = 1 / (1 + |K'(A) - K'(B)|)

where K'(A) is the estimated complexity of A. The K' function depends on the
observer's formalism and available resources.

John
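One standard way to make a K'-style estimate concrete (my sketch, not from the thread) is to use compressed length as an upper bound on complexity. Note the caveat: for a single 8-byte float the zlib header overhead dominates, so at this scale the estimate is very coarse; the trick works better on larger encodings.

```python
import struct
import zlib

def k_estimate(x):
    """Crude upper-bound complexity estimate: the length in bytes of the
    zlib-compressed IEEE-754 encoding of x, standing in for K'(x)."""
    raw = struct.pack(">d", x)  # 8-byte big-endian double
    return len(zlib.compress(raw, 9))

def similarity(a, b):
    # Similarity(A, B) = 1 / (1 + |K'(A) - K'(B)|)
    return 1.0 / (1.0 + abs(k_estimate(a) - k_estimate(b)))

print(similarity(3.14, 3.14))  # identical inputs always score 1.0
print(similarity(1.0, 2.0))
```

Any estimator that upper-bounds the true complexity can be substituted for `k_estimate`; the formula itself only needs the two estimates.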

From: Piaget Modeler [mailto:[email protected]]
Sent: Friday, February 21, 2014 3:46 PM
To: AGI
Subject: RE: [agi] Numeric Similarity



Actually, Aaron Hosford just recommended

 1 / (1 + |a - b|),

which I like much better.

Thanks Aaron.

~PM

-------------------------



From: [email protected]
To: [email protected]
Subject: RE: [agi] Numeric Similarity
Date: Fri, 21 Feb 2014 06:02:46 -0800

Thanks to all respondents.

In the end I found a classic numeric similarity metric: 1 - | a - b |

It's not ideal since numeric scores can dominate other attribute
scores.

Ergo, I have to devise a good weighting scheme.

Nothing's perfect I suppose.

Cheers,

~ PM
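The weighting scheme mentioned above could be as simple as a convex combination (a hypothetical sketch of mine, not PM's actual scheme). Note that 1 - |a - b| only stays in [0, 1] when the attribute values are themselves scaled to [0, 1], which the helper below assumes.

```python
def numeric_similarity(a, b):
    # The classic metric from the thread; assumes values scaled to
    # [0, 1], otherwise the score can go negative.
    return 1.0 - abs(a - b)

def weighted_similarity(scores, weights):
    # Hypothetical weighting scheme: a convex combination, so the
    # result stays in the same [0, 1] range as the per-attribute scores
    # and no single attribute can dominate beyond its weight.
    total = sum(weights)
    return sum(w * s for w, s in zip(weights, scores)) / total

numeric = numeric_similarity(0.2, 0.5)  # 0.7
overall = weighted_similarity([numeric, 1.0], [0.3, 0.7])
print(overall)  # 0.3*0.7 + 0.7*1.0 = 0.91
```

Dividing by the weight total normalizes the weights, so they can be chosen on any convenient scale.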






-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/23050605-2da819ff
Modify Your Subscription: https://www.listbox.com/member/?
Powered by Listbox: http://www.listbox.com

 

