Jochen Schulz:
Could you please send me an email with an existing From: address? I
tried to reply to you but apparently your From: is forged.
Sorry for the delay, I'll send you an email.
In the meantime I have created a fast BK-tree implementation for
Psyco, on my PC it's about 26+ times faster
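Jochen's module itself isn't quoted in the thread; as background, a BK-tree stores items in a tree keyed by their distance to each node, and the triangle inequality lets a search prune whole subtrees. A minimal sketch (not Jochen's implementation; the class name and the integer toy metric are illustrative only):

```python
class BKTree:
    """Minimal BK-tree sketch for a discrete metric.
    Not Jochen Schulz's code; illustrative only."""

    def __init__(self, distance):
        self.distance = distance
        self.root = None  # node = (item, {distance: child_node})

    def add(self, item):
        if self.root is None:
            self.root = (item, {})
            return
        node, children = self.root
        while True:
            d = self.distance(item, node)
            if d in children:
                # descend into the child at the same distance
                node, children = children[d]
            else:
                children[d] = (item, {})
                return

    def search(self, item, radius):
        """Return all stored items within `radius` of `item`."""
        if self.root is None:
            return []
        results, stack = [], [self.root]
        while stack:
            node, children = stack.pop()
            d = self.distance(item, node)
            if d <= radius:
                results.append(node)
            # triangle inequality: only subtrees whose edge distance
            # lies in [d - radius, d + radius] can contain matches
            for dist, child in children.items():
                if d - radius <= dist <= d + radius:
                    stack.append(child)
        return results
```

With `abs(a - b)` as the metric, `BKTree(lambda a, b: abs(a - b))` answers range queries such as "all stored numbers within 2 of 6" without scanning every item.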
* [EMAIL PROTECTED]:
Please plug such good things. It seems the Python community is an
endless source of interesting modules I didn't know about. Your
(single) module looks very nice. I'll take a better look later.
Could you please send me an email with an existing From: address? I
tried to
I believe the best way to implement this would be a binary search
(bisect?) on the actual times, which would be O(log N). Though since
they are timestamps they should be monotonically increasing, in which
case at least you don't have to go to the expense of sorting them.
Some kind of
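A minimal sketch of that bisect approach (the class and method names are invented for illustration; it assumes timestamps arrive in increasing order, so appending keeps the key list sorted with no explicit sort):

```python
import bisect

class TimestampMap:
    """Map a timestamp to the object whose interval contains it.
    put() assumes monotonically increasing timestamps, so the key
    list stays sorted for free."""

    def __init__(self):
        self._times = []
        self._values = []

    def put(self, t, obj):
        self._times.append(t)   # already in order if t is monotonic
        self._values.append(obj)

    def get(self, t):
        """Return the object whose start time is the largest one <= t."""
        i = bisect.bisect_right(self._times, t)  # O(log N)
        if i == 0:
            raise KeyError(t)
        return self._values[i - 1]
```

Each `get()` is a single O(log N) binary search; each `put()` is an O(1) append.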
Steven Clark [EMAIL PROTECTED] wrote:
Thanks for the reply. Can you explain how I could be bitten by
floating point precision here?
I'm familiar with how/why 1.3*3 != 3.9, etc., but I'm not sure how it
applies here, or what you are gaining by converting to int.
Well, it depends on how you get
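The `1.3*3 != 3.9` surprise mentioned above comes straight from binary floating point; a two-line demonstration:

```python
# Binary doubles cannot represent 1.3 or 3.9 exactly, so the product
# picks up rounding error in the last bit:
print(repr(1.3 * 3))   # 3.9000000000000004
print(1.3 * 3 == 3.9)  # False

# Rounding both sides to a fixed number of digits restores equality:
print(round(1.3 * 3, 9) == round(3.9, 9))  # True
```

The same last-bit error is what can push a float key just past (or short of) a range boundary in a lookup structure.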
* Steven Clark:
Hi all-
I'm looking for a data structure that is a bit like a dictionary or a
hash map. In particular, I want a mapping of floats to objects.
However, I want to map a RANGE of floats to an object.
This solution may be more than you actually need, but I implemented two
metric
bisect is definitely the way to go. You should take care with
floating point precision, though. One way to do this is to choose a
number of digits of precision that you want, and then internally to
your class, multiply the keys by 10**precision and truncate, so that
you are working
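A sketch of that fixed-precision trick (the precision value is an arbitrary choice; `round()` is used instead of bare `int()` truncation, since truncation can drop a value like 0.299999... into the wrong integer bucket):

```python
PRECISION = 6  # digits kept; pick to suit your data

def to_key(x, precision=PRECISION):
    """Convert a float to an int key with a fixed number of decimal
    digits, so nearly-equal floats map to the same integer key.
    round() rather than plain truncation avoids the 0.299999... case."""
    return int(round(x * 10 ** precision))

# Two floats that differ only by rounding error get the same key:
a = 0.1 + 0.2   # 0.30000000000000004
b = 0.3
assert a != b
assert to_key(a) == to_key(b)
```

Internally the class then compares and hashes exact integers instead of inexact floats.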
Jochen Schulz:
This solution may be more than you actually need, but I implemented two
metric space indexes which would do what you want (and I wanted to plug
it anyway :)):
Please plug such good things. It seems the Python community is an
endless source of interesting modules I didn't know
Few more notes on the code:
You may use @property in such situations (or you may just use plain
attributes, dropping the property). Note that Python doesn't inline
function calls the way Java's HotSpot often does.
def __children(self):
    raise NotImplementedError()
children =
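The @property suggestion above can be illustrated like this (class and attribute names are invented for the example):

```python
class Node:
    """Sketch of the @property suggestion: expose children through an
    attribute-style accessor instead of an explicit getter call."""

    def __init__(self, children=()):
        self._children = list(children)

    @property
    def children(self):
        return self._children

node = Node([1, 2, 3])
print(node.children)  # read without parentheses: [1, 2, 3]
```

Callers write `node.children` rather than `node.children()`, and the class can later add validation inside the property without changing its interface.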
More bits from your code:
neighbours = list()
is better written as:
neighbours = []
If you have a recent enough version of Python you can use:
candidate_is_neighbour = any(distance < n[1] for n in neighbours)
Instead of:
candidate_is_neighbour = bool([1 for n in neighbours if distance < n[1]])
It's shorter
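Besides being shorter, `any()` short-circuits on the first match instead of building the whole intermediate list. A self-contained comparison (the sample pairs are invented, and the `<` operator is a guess at the comparison the archive stripped from the quoted code):

```python
# (obj, distance) pairs, as the quoted code appears to use them
neighbours = [("a", 5.0), ("b", 2.0), ("c", 7.0)]
distance = 3.0

# Old spelling: materialises a full list, then tests its truth value.
old_style = bool([1 for n in neighbours if distance < n[1]])

# any() stops at the first True and reads as plain English.
new_style = any(distance < n[1] for n in neighbours)

assert old_style == new_style
```

The generator version also needs no throwaway `1` placeholder, since `any()` consumes the boolean results directly.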
Hi all-
I'm looking for a data structure that is a bit like a dictionary or a
hash map. In particular, I want a mapping of floats to objects.
However, I want to map a RANGE of floats to an object.
This will be used for timestamped storage / lookup, where the float
represents the timestamp.
I know that foo.get() will be called many times for each foo.put(). Is
there any way to achieve O(1) performance for foo.get(), maybe via
some kind of hash function? Or is the best thing to use some kind of
binary search?
If you know something about the density of the input values, O(1) is
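One way knowledge of the input density can buy average-case O(1) is fixed-width bucketing; a sketch (the class name and bucket width are illustrative, not from the thread, and it assumes you can pick a width no larger than the smallest stored range):

```python
class BucketMap:
    """Sketch: O(1) average lookup by hashing each float into a
    fixed-width bucket. Assumes the bucket width is chosen from
    knowledge of the input density."""

    def __init__(self, width):
        self.width = width
        self.buckets = {}

    def put(self, start, end, obj):
        # register obj in every bucket its [start, end) range touches
        b = int(start // self.width)
        while b * self.width < end:
            self.buckets.setdefault(b, []).append((start, end, obj))
            b += 1

    def get(self, t):
        # one dict hash, then a scan of (usually) one or two entries
        for start, end, obj in self.buckets.get(int(t // self.width), ()):
            if start <= t < end:
                return obj
        raise KeyError(t)
```

Lookup cost stays O(1) on average as long as only a bounded number of ranges overlap any single bucket; dense overlapping ranges degrade it back toward a scan, which is why the density assumption matters.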
Steven Clark wrote:
Hi all-
I'm looking for a data structure that is a bit like a dictionary or a
hash map. In particular, I want a mapping of floats to objects.
However, I want to map a RANGE of floats to an object.
This will be used for timestamped storage / lookup, where the float
If you can impose a partial order on your ranges then you can get O(n lg n)
random access using a heap data structure.
You'll have to implement your own heap, but heap search is easy to implement
(it's Heapify that might require a little thinking).
This will only work, of course, if your ranges
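The custom heap the poster describes isn't shown in the thread; as a stdlib illustration of the ordering idea, `heapq` can keep `(start, end, obj)` ranges ordered by their start time (the sample tuples are invented, and this only pops in order rather than implementing the heap search described above):

```python
import heapq

# Ranges as (start, end, obj) tuples: heapq orders by the first
# element, so the range with the earliest start sits at the top.
ranges = []
heapq.heappush(ranges, (2.5, 4.0, "b"))
heapq.heappush(ranges, (0.0, 2.5, "a"))
heapq.heappush(ranges, (4.0, 9.0, "c"))

# Popping yields the ranges in start order.
ordered = [heapq.heappop(ranges)[2] for _ in range(len(ranges))]
print(ordered)  # ['a', 'b', 'c']
```

A full range search against the heap would still need the Heapify-style machinery the poster mentions; `heapq` alone only gives ordered insertion and removal.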
On Apr 7, 3:58 pm, Steve Holden [EMAIL PROTECTED] wrote:
I believe the best way to implement this would be a binary search
(bisect?) on the actual times, which would be O(log N).
bisect is definitely the way to go. You should take care with
floating point precision, though. One way to do