On Nov 10, 11:33 am, Jennie <namedotpor...@gmail.com> wrote:
> What is the best way to solve the following problem in Python 3.3?
>
> >>> import math
> >>> class Point:
> ...     def __init__(self, x=0, y=0):
> ...         self.x = x
> ...         self.y = y
> ...     def __sub__(self, other):
> ...         return Point(self.x - other.x, self.y - other.y)
> ...     def distance(self, point=Point()):
> ...         """Return the distance from `point`."""
> ...         return math.sqrt((self - point).x ** 2 + (self - point).y ** 2)

Before you do anything else, introduce a Vector class into your app.
The difference between two Points is not a Point; it's a Vector.
Give your Vector class a magnitude() method, then have Point.distance
return (self - point).magnitude().
To get the distance of a point from the origin, don't give distance()
a default argument. Quite apart from the design question, the default
point=Point() is evaluated while the class body is still executing,
when the name Point isn't bound yet, so it raises NameError. Instead,
define another method called distance_from_origin().
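
Something along these lines, as an untested sketch (the exact names
are just my choice):

import math

class Vector:
    """A displacement between two Points."""
    def __init__(self, x=0, y=0):
        self.x = x
        self.y = y

    def magnitude(self):
        # Length of the vector: sqrt(x**2 + y**2).
        return math.hypot(self.x, self.y)

class Point:
    def __init__(self, x=0, y=0):
        self.x = x
        self.y = y

    def __sub__(self, other):
        # Point - Point yields a Vector, not another Point.
        return Vector(self.x - other.x, self.y - other.y)

    def distance(self, point):
        """Return the distance from `point`."""
        return (self - point).magnitude()

    def distance_from_origin(self):
        """Return the distance from (0, 0)."""
        return self.distance(Point())

With that, Point(3, 4).distance_from_origin() gives 5.0, and so does
Point(1, 1).distance(Point(4, 5)). Note that distance_from_origin()
creates its Point() inside the method body, which runs only after the
class exists, so the name-binding problem above never arises.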
