G'day all,
 
For my mapping work I use very detailed polyline and contour data (amongst others), digitized from 1:50,000 scale maps. The line data are extremely finely sampled and hence reproduce beautifully all curved portions of lines at that scale. However, the ultimate scale at which my maps will be produced is 1:200,000 or 1:400,000, a linear reduction of 4 or even 8 times, so the precision of the original data is far too high for the end product.
 
The question is: is anyone aware of a routine through which one can pass the data files, and which will reduce the data density in such a way that the exactness is still maintained at the smaller scale? There is a mathematical rule which says that the spatial sampling frequency needed to restore the original curve without aliasing is dictated by the 'curviness' of the sampled line (the term 'Nyquist frequency' comes to mind).
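For reference, one widely used approach to this kind of line generalisation is the Douglas-Peucker algorithm, which recursively drops vertices whose perpendicular deviation from the simplified line falls below a chosen tolerance. A minimal sketch in Python, assuming the input is a list of (x, y) tuples in ground units (the function names here are illustrative, not from any particular package):

```python
import math

def perpendicular_distance(pt, start, end):
    """Distance from pt to the line through start and end."""
    (x, y), (x1, y1), (x2, y2) = pt, start, end
    dx, dy = x2 - x1, y2 - y1
    length = math.hypot(dx, dy)
    if length == 0.0:
        return math.hypot(x - x1, y - y1)
    # Cross product gives twice the triangle's area; divide by the base length.
    return abs(dx * (y1 - y) - (x1 - x) * dy) / length

def douglas_peucker(points, tolerance):
    """Drop vertices that deviate from the simplified line by no more than tolerance."""
    if len(points) < 3:
        return list(points)
    # Find the vertex farthest from the chord joining the two endpoints.
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    # If every intermediate vertex is within tolerance, keep only the endpoints.
    if dmax <= tolerance:
        return [points[0], points[-1]]
    # Otherwise recurse on the two halves and splice, dropping the duplicated join point.
    left = douglas_peucker(points[:index + 1], tolerance)
    right = douglas_peucker(points[index:], tolerance)
    return left[:-1] + right
```

Because the tolerance is a deviation threshold rather than a fixed resampling step, flat stretches lose many vertices while tight curves keep theirs, which is the curvature-driven behaviour described above. A tolerance derived from the target scale (e.g. 0.2 mm on paper at 1:200,000 is 40 m on the ground) keeps the simplified line visually faithful at the reduced scale.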
 
Does anyone have any thoughts on this?
 
TIA
andré
 
 
 
from: andré boessenkool
po box 101 - vlottenburg 7604
south africa
tel (+27 21) 881 3188
fax (+27 21) 881 3189
[EMAIL PROTECTED]
