> I keep hearing about this better data from commercial propagation
> software but can't find any reference to it on any of their marketing
> material nor references to how they actually do it -- even assuming it's
> proprietary, I don't even see hints about it anywhere.
A lot of the original 30" and 3" data obtained from the Gummint had problems
with it. The biggest problems seem to have been at the edge of
latitude/longitude boundaries (e.g. right at 40.000 degrees N). I've run
models for clients and found what looks like a weird ridge line blocking the
signal, where I knew there wasn't one, only to determine that there were
erroneous terrain elevations in the data set along that line. RadioSoft and
others have hand-edited a lot of the 3" data and also supplemented it with
10-meter and 30-meter data to improve the accuracy. Keep in mind that the
NGDC 30- and 3-second data was derived by digitizing topo maps (quite often
1:250,000 scale maps, which have wide contour intervals), so, at best, it's
only as accurate as those maps were. There is also a newer 30-meter and
3-second data set called NED (National Elevation Dataset) that is quite a
bit more accurate than the old NGDC 30-second database.
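For what it's worth, the boundary artifact I'm describing is easy to spot
programmatically. Here's a rough Python sketch (the file name, grid layout,
and threshold are all hypothetical, not anybody's actual code) that flags
suspicious elevation jumps along an integer-degree seam like 40.000 N:

    # Sketch only: flag suspicious elevation jumps along an integer-degree seam.
    # Assumes 'elev' is a north-up grid covering 39.5N-40.5N; file name and
    # layout are made up for illustration.
    import numpy as np

    elev = np.load("terrain_39_5N_to_40_5N.npy")        # hypothetical grid file
    rows, cols = elev.shape
    deg_per_row = 1.0 / rows                             # grid spans 1 degree of latitude
    seam_row = int(round((40.5 - 40.0) / deg_per_row))   # row index nearest 40.000 N

    # Compare the jump across the seam with the typical row-to-row relief.
    jump_at_seam = np.abs(elev[seam_row] - elev[seam_row - 1])
    typical_relief = np.median(np.abs(np.diff(elev, axis=0)))

    suspect = jump_at_seam > 10 * typical_relief         # arbitrary threshold
    print(f"{suspect.sum()} of {cols} columns look like seam artifacts at 40.000 N")

Columns where the seam jump is far larger than the local relief are exactly
the kind of phantom "ridge line" I ran into.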
There is also a new data set from SRTM (the Shuttle Radar Topography
Mission) where a majority of the earth's surface was measured using
microwave interferometry. The accuracy isn't necessarily superior to
existing data (I think 50 feet was the predicted elevation accuracy), but
the nice (or bad) thing about the data is that it includes obstructions
(buildings, water tanks, etc.). Maybe Bob NO6B knows more about it - I
think JPL was involved in it?
> I always wonder if it's just Bravo Sierra from the people trying to sell
> the software.
No, really, it's not.
> Never met anyone who said their job was driving around gathering
> geologic data better than USGS who worked for a software company.
The editing of the data is mainly limited to fixing problems in the digital
domain, not going out and doing survey work.
> Geographical data for Geocoding and mapping at one company I worked at
> was all from the same commercial providers -- all our competitors used
> the same data, too. We could confirm it by looking at their output
> after one of our users reported a data error... yep, they have the same
> error we do. We'd fix ours, and not too far into the future we'd notice
> all the competitors had fixed theirs also, even without that "fix" being
> sent to the upstream data provider. (In other words, we all watched
> each other's changes and then double-checked them for ourselves, but we
> didn't have any better raw data to start out with than the next guy.)
Non sequitur follows, read only if you're bored.
Back in my misspent youth I wrote software for a company (in Denver by the
way, Nate) that catered to the oil exploration industry. One of the aspects
of the business was the hand-digitizing of USGS 7.5' topo maps. The reason
these maps had to be hand digitized (at least at that time, back in the mid
80's) was that in many western states oil and gas well locations were
surveyed with respect to a gridding system whereby territories were broken
down into 1 mile x 1 mile "sections". Sections were grouped into 6x6
clusters called townships. Each township was numbered by township (Y) and
range (X). This township/range/section system was created shortly after the
Louisiana Purchase so the government could sell tracts of land to the public
in quasi-uniform pieces. The next time you fly over a western state take
notice how so much of the landscape is "square" - roads run perfectly
north/south/east/west, tracts of land are squares, etc., for the most part
at 1 mile intervals along section boundaries. It's because of this system.
Anyway, in a perfect world, each section would have been perfectly square,
thereby making the lat/lon of its corner points determinable by automated
means, but in reality, the surveys were done by guys that probably spent
most of their paycheck on whiskey. They were out there wandering around
pounding iron pegs in the ground and making little rock piles in uninhabited
territory - what else was there to do? And quite often, when they ran into
some kind of terrain irregularity, river, stream, pond, etc., they took
whatever shortcut they wanted, such as just using the stream as one edge of
the section rather than crossing over the stream to set points to form a
square. The boundaries of all of these township/range/sections weren't in
any USGS, NGDC, or state BLM database with any degree of accuracy, yet all
of the wells were surveyed based on their distance from the side or corner
of the section in which they were located. As more and more of the analysis
of the data from these wells came to be done by computer rather than directly
by the geologists, geophysicists, and geochemists, knowing the exact location of
the well (in terms of lat/lon) became extremely important, hence the need to
digitize all of these thousands of maps.
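To make that concrete: once the section corners had been digitized, locating
a well from its survey call was just arithmetic. A toy Python example
(simplified flat-earth math, made-up coordinates, nothing like our actual
production code) might look like:

    # Toy example: convert a well's footage call to an approximate lat/lon,
    # given a digitized section corner. Flat-earth approximation; the corner
    # coordinates and footages below are made up.
    import math

    FT_PER_DEG_LAT = 364000.0                  # roughly 69 miles per degree
    def ft_per_deg_lon(lat_deg):
        return FT_PER_DEG_LAT * math.cos(math.radians(lat_deg))

    # Hypothetical digitized southwest corner of a section (from a 7.5' topo)
    sw_corner_lat, sw_corner_lon = 40.1234, -104.5678

    # Survey call: 660 ft north of the south line, 1980 ft east of the west line
    north_ft, east_ft = 660.0, 1980.0

    well_lat = sw_corner_lat + north_ft / FT_PER_DEG_LAT
    well_lon = sw_corner_lon + east_ft / ft_per_deg_lon(sw_corner_lat)
    print(f"Approximate well location: {well_lat:.6f}, {well_lon:.6f}")

The hard part was never this arithmetic - it was getting trustworthy corner
coordinates out of the hand-digitized maps in the first place.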
So, what's the point in all of this? Mainly just trivia, but the point I
wanted to make is that there is topographic data being produced by outfits
other than just the Government. And yes, I've seen surveyors out at sites doing
field measurements, including using laser sighting techniques along
microwave paths. Sometimes you do need to get out from behind the
keyboard...
> So run a Radio Mobile plot and see if it matches your commercial
> software's plot or what the percentage difference is. You could then
> feel "comfortable" with it based solely on how well it matched a plot
> from your other software.
I'm fairly certain that RM is just as accurate as RadioSoft, EDX, Vsoft, et
al when you're looking at just the accuracy of the algorithms (Longley-Rice,
et al). Any differences you are likely to find should be limited to the
terrain data and whether or not supplemental datasets (such as LULC,
land-use/land-cover) are used to apply corrections. It's just a matter of
GIGO. The commercial packages are well integrated in the sense that all of
the various databases (terrain data, streets, city borders, county borders,
city centers/name, population, FCC databases, etc.) are able to be read
directly by the software, with various ways of overlaying data sets and of
doing supplemental calculations based on that data (such as
counting how many people receive a signal > 100 uV).
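As a trivial illustration of that last kind of calculation: once all the
layers are on a common grid it reduces to masking and summing. A hypothetical
Python sketch (assumed file and array names, not any package's real API):

    # Sketch of a coverage/population count, assuming the propagation model
    # already produced a field-strength grid co-registered with a population
    # grid. File and array names are hypothetical.
    import numpy as np

    field_uV = np.load("predicted_field_uV.npy")     # predicted signal per cell, uV
    population = np.load("census_population.npy")    # people per cell, same grid

    covered = field_uV > 100.0                       # cells with signal > 100 uV
    people_covered = population[covered].sum()
    print(f"Estimated population receiving > 100 uV: {people_covered:,.0f}")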
I've done many miles of drive testing broadcast FM stations using automated
signal measurement systems and have found that the models can correlate very
closely to the real-world measurements, with the exception of terrain extremes -
extremely mountainous terrain and coastal/wetland areas. Urban areas with a
lot of terrain obstructions (high rise buildings and the like) will
sometimes not correlate well with drive test data due to excessive
multipath. But for 99+% of the country, models can track measurements very
closely.
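The correlation I'm talking about is nothing fancy, either. Given paired
predicted and measured values along the drive route you can get the gist with
a few lines (hypothetical CSV layout and column names):

    # Sketch: compare drive-test measurements against model predictions along
    # a route. The file and column names are hypothetical.
    import numpy as np

    data = np.genfromtxt("drive_test.csv", delimiter=",", names=True)
    predicted_dBuV = data["predicted_dBuV"]
    measured_dBuV = data["measured_dBuV"]

    error = predicted_dBuV - measured_dBuV
    print(f"mean error:  {error.mean():.1f} dB")
    print(f"std dev:     {error.std():.1f} dB")
    print(f"correlation: {np.corrcoef(predicted_dBuV, measured_dBuV)[0, 1]:.2f}")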
I use RadioSoft Comstudy Pro and also have EDX's older DOS-based packages.
I like Comstudy for the most part, though it does have some deficiencies.
They routinely release updates which incrementally fix problems and add
features. We also use RadioSoft for repeater coordination analyses.
--- Jeff