Dear Matplotlib users,

I need to plot a (time x distance) array of measurements with an 
associated color bar. While the time values are evenly spaced (i.e. 0, 1, 
2, etc., one per row), the distances are not evenly distributed (e.g. they 
are fixed at 1.22, 1.53, 1.84, 2.11, ...), although they are always the same.

My question is simple: how can I set the 'extent' argument so that the 
plot shows the real distance values, rather than evenly spaced ones?

Any suggestion or comment would be greatly appreciated!

A minimal example is given below:

Thanks a lot in advance

Jose.

#=================================
import matplotlib.pyplot as plt
import numpy as np

# fake data
random = np.random.randint(0, 300, size=(55, 127))

fig = plt.figure()
ax = fig.add_subplot(111)

# stretch the image vertically so the axes are readable
myaspect = 100.0

# How can I adjust this extent to my REAL distances
# (e.g. 1.22, 1.53, 1.84, 2.01, ...)?
myextent = [-400, 800, 0, 10]

cax = ax.imshow(random, aspect=myaspect, extent=myextent, vmin=0, vmax=300)
ax.set_ylabel('Time (ms)')
ax.set_xlabel('Distance (mm)')

colorbar = fig.colorbar(cax, ticks=[0, 100, 200, 300])
colorbar.ax.set_yticklabels(['0', '100', '200', '300'])
colorbar.ax.set_ylabel('Measurement')

plt.show()

#=================================
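
In case it helps frame the question: I wonder whether pcolormesh, which 
takes explicit coordinate arrays, would be a better fit than imshow's 
extent here. The sketch below is only a guess on my part, and the 
'distance' array in it is made up just to imitate my uneven spacing:

#=================================
import matplotlib.pyplot as plt
import numpy as np

# fake data, same shape as above
data = np.random.randint(0, 300, size=(55, 127))

# made-up, unevenly spaced distances (one value per column)
distance = 1.22 + np.cumsum(np.random.uniform(0.2, 0.4, size=127))
# evenly spaced times (one value per row)
time = np.arange(55)

fig = plt.figure()
ax = fig.add_subplot(111)

# pcolormesh places each column at its real distance on the x axis;
# shading='auto' treats the coordinates as cell centres (matplotlib >= 3.3),
# older versions would need cell edges instead
mesh = ax.pcolormesh(distance, time, data, vmin=0, vmax=300, shading='auto')
ax.set_ylabel('Time (ms)')
ax.set_xlabel('Distance (mm)')

colorbar = fig.colorbar(mesh, ticks=[0, 100, 200, 300])
colorbar.ax.set_ylabel('Measurement')

plt.show()
#=================================

If the extent argument really can do this directly, I would of course 
prefer to stick with imshow.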


