Hi experts!
I have:
* a list of Q 'NODES' = [(x,y)_1, ..., (x,y)_Q], where each element
(x,y) represents the spatial position of a node in 2D Cartesian space.
* a matrix 'H' with QxQ elements {H_k,l}, where
H_k,l = 0 if nodes 'k' and 'l' aren't joined by an edge, and H_k,l = the length of
the edge joining them otherwise.
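For concreteness, here is a small sketch of those two structures using toy data of my own (three hypothetical nodes; the positions and edge set are not from the post above):

```python
import math

# Toy example: Q = 3 nodes at 2D Cartesian positions (hypothetical data).
NODES = [(0.0, 0.0), (3.0, 0.0), (0.0, 4.0)]
Q = len(NODES)

def dist(p, q):
    """Euclidean distance between two 2D points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

# H[k][l] = 0 if nodes k and l aren't joined by an edge,
# otherwise H[k][l] = the length of that edge.
edges = {(0, 1), (0, 2)}  # join node 0 to nodes 1 and 2; 1-2 stays unconnected
H = [[0.0] * Q for _ in range(Q)]
for k, l in edges:
    H[k][l] = H[l][k] = dist(NODES[k], NODES[l])

print(H[0][1])  # 3.0 (edge length)
print(H[1][2])  # 0.0 (no edge)
```

H is symmetric here because the edges are undirected; a directed graph would drop the `H[l][k]` assignment.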
The solution below may not be optimal, but it's something you could figure
out easily enough yourself. Also, I believe this question would be better asked
on Stack Overflow, as it is not an actual matplotlib issue but rather a
programming problem (and one that shows no effort).
Let me first redefine your