Quoting Matt Mills <[EMAIL PROTECTED]>:

> Hey,
>
> Just signed up...
> Wanted to add a bit to what Adam Hill said here. First off, we're not
> 501(c)(3) tax exempt yet (we are an incorporated non-profit, just not
> tax exempt), but we're working on it. We expect to be tax exempt
> within the next 6 months to 1 year.
>
> As we speak we have a dual Xeon 3 GHz machine with 1 TB of storage
> going online in Seattle. Next to my desk we have our next server, a
> dual-core Opteron with 800 GB of storage, which will be online within
> a few weeks (they shipped us the wrong chassis).
>
> We're going to have about 48U of rack space available in the Seattle
> Westin (datacenter) for either our own equipment or other geo-related
> projects that we'll be sponsoring. It will sit behind a gigabit
> switch being donated by Foundry Networks (foundry.com) and be on 450
> Mbps (3xOC3) of bandwidth being donated by Randy Bush (psg.com).
>
> While it is true that initially we're only going to be offering the
> World Wind tile format, we're going to be working on getting WMS
> access to it up as soon as possible.
That's great news!  Also note that if you have lots of storage you can
just stick Squid in front of your WMS; all the tiles will then get
cached automatically as WMS clients make the same requests.  Which is
why a spec for how to divide up the world would be quite valuable -
and I'm happy to push forward with what World Wind has so far (though
I haven't had a chance to check it out yet...).  Once that's going on
the server side we can experiment with the p2p thing, which I do
believe can work even with the small JPEGs.  The key, I think, is for
the client to figure out whether it can get a tile faster from the p2p
network or from the server.  If the server is fast to use initially,
then the network cache will just build up, and soon it will be faster
to ask a neighboring computer on the same LAN than to go all the way
to the server.  This is less of a concern for big popular datasets
with really solid hosting, more for little guys wanting to set up a
WMS.  It could be a great service if your foundation could provide
transparent caching to help take the load off new WMS servers as they
go up.  But I'm afraid that would quickly generate amazing amounts of
data.
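The caching point above hinges on clients asking for tiles in exactly the same way. Here is a minimal Python sketch of the idea, assuming a hypothetical WMS endpoint and a 36-degree level-zero tile grid (the constant and path are illustrative assumptions, not anything from World Wind's actual config): if every client derives its requests from one fixed grid and serializes the query string identically, a proxy like Squid sees byte-identical URLs and serves repeat requests straight from cache.

```python
# Sketch, not a definitive implementation.  Assumptions: a made-up WMS
# base URL, 36-degree level-zero tiles that halve at each level.  The
# WMS 1.1.1 GetMap parameter names themselves are standard.
from urllib.parse import urlencode

LEVEL_ZERO_DEG = 36.0  # degrees per tile side at level 0 (assumed)

def tile_bbox(level, row, col):
    """Geographic bounding box (minx, miny, maxx, maxy) of one tile."""
    size = LEVEL_ZERO_DEG / (2 ** level)
    minx = -180.0 + col * size
    miny = -90.0 + row * size
    return (minx, miny, minx + size, miny + size)

def getmap_url(base, layer, level, row, col, tile_px=512):
    """Canonical WMS 1.1.1 GetMap URL for one grid tile.

    Parameters are emitted in a fixed (sorted) order so every client
    produces the identical string for the identical tile -- which is
    what makes a URL-keyed cache like Squid effective.
    """
    minx, miny, maxx, maxy = tile_bbox(level, row, col)
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",
        "BBOX": f"{minx:.6f},{miny:.6f},{maxx:.6f},{maxy:.6f}",
        "WIDTH": str(tile_px),
        "HEIGHT": str(tile_px),
        "FORMAT": "image/jpeg",
    }
    query = urlencode(sorted(params.items()))
    return f"{base}?{query}"
```

Two independent clients asking for the same tile emit the same URL, so the second request never reaches the WMS at all; an ad-hoc BBOX computed from a viewport would miss the cache every time.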

But as David points out, the process is more important - do try to
make public on the wiki all the lessons you learn about loading up all
the data and setting up a WMS and all.

best regards,

Chris

> To reduce the cost of equipment, we've decided to get a separate
> "processing cluster" which is (eventually) going to be composed of
> 100 high-end Pentium IIIs (1 GHz, 256 MB RAM, 20 GB disk). This is,
> obviously, to reduce our costs while still having an appropriate
> amount of "umph". (We can get these Pentium IIIs for anywhere from
> free to $50 each.)
>
> Anyway, feel free to let me know off-list if anyone needs anything in
> the way of hosting or such, [EMAIL PROTECTED], and I'll see if we can
> get something arranged.
>
> Oh, and BTW, we're in talks with GeoEye to get some IKONOS imagery
> released for use in World Wind ;)
>
>
> Matt Mills
> Director, The Free Earth Foundation
> Office: (267)-895-0096
>
> -----Original Message-----
> From: Adam Hill [mailto:[EMAIL PROTECTED]
> Sent: Tuesday, January 24, 2006 11:00 AM
> To: [email protected]
> Cc: Chris Holmes
> Subject: Re: [Geowanking] Multi-Resolution Dataset Wiki
>
> The FEF is a small 501(c)(3), started by a bunch of WW hackers, that
> pretty much only has bootstrap funding from AdSense and donations on
> WorldWind Central (http://www.worldwindcentral.com). We don't have an
> informational website up yet (they are/were busy working on a
> processing cluster).
>
> Basically we are looking for free datasets to archive, process and
> tile. Currently we are WorldWind-centric on the serving side, since
> our donors are only providing space and not a full hosting
> environment; WW has a very simple level/x/y grid that can be served
> up with only HTTP, no WMS required. The P2P idea is attractive, but
> it doesn't seem scalable with hundreds of thousands of small
> JPGs/PNGs. Someone prove me wrong :)
>
> If we had some processing power that was scalable to hundreds of
> thousands of users we could set up WMS/WFS as well. We have also
> thought about how to present a 'default tile grid' spec to the OGC -
> www.ceteranet.com/nww-tile-struct.pdf is the current way WW does it,
> but we are unsure about how it would be received.
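The level/x/y grid described above can be sketched in a few lines of Python. The nww-tile-struct.pdf document is the authoritative spec; the 36-degree level-zero tile size and the file-path layout below are assumptions for illustration only. The point is that a tile address is pure arithmetic on lat/lon, so tiles map directly to static files and plain HTTP is enough - no WMS query parsing required.

```python
# Sketch of World Wind-style tile addressing.  ASSUMPTIONS: level-zero
# tiles of 36 x 36 degrees that halve in size at each level, and a
# hypothetical on-disk path layout; consult nww-tile-struct.pdf for
# the real scheme.
import math

LEVEL_ZERO_DEG = 36.0  # degrees per tile side at level 0 (assumed)

def tile_address(lat, lon, level):
    """Return (row, col) of the tile containing (lat, lon) at `level`.

    Rows count up from -90 latitude, columns east from -180 longitude.
    """
    size = LEVEL_ZERO_DEG / (2 ** level)
    row = int(math.floor((lat + 90.0) / size))
    col = int(math.floor((lon + 180.0) / size))
    return row, col

def tile_path(layer, level, lat, lon, ext="jpg"):
    """Hypothetical static path layout: layer/level/row/row_col.ext."""
    row, col = tile_address(lat, lon, level)
    return f"{layer}/{level}/{row:04d}/{row:04d}_{col:04d}.{ext}"
```

Because the address is deterministic, any dumb HTTP server (or a pile of donated Pentium IIIs behind one) can serve the tiles as flat files.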
> _______________________________________________
> Geowanking mailing list
> [email protected]
> http://lists.burri.to/mailman/listinfo/geowanking


***
Chris Holmes
The Open Planning Project
thoughts at: http://cholmes.wordpress.com

