You have asked some important questions, which I have tried to answer; however, I must ask you to keep on topic, and to keep the size and number of your posts to a reasonable level, so as not to damage our signal-to-noise ratio.
Write-access to any open-source code-base is founded on personal trust and proven performance, in terms of code commits and communication skill. Earning a role in this project won't be very hard for those who put in the leg-work.

Cheers,

On Tue, Dec 16, 2008 at 2:05 AM, Mark Malewski <[email protected]> wrote:
> > (provided you haven't seen any LL copyrighted code).
>
> <Closing my eyes>
>
> I see nothing, I know nothing. ;-)
>
> > We are hopeful that this means OpenSim will allow you to work on reX and
> > OpenSim at the same time
>
> This would be good.
>
> > We understand that some of you have been using reX in production
> > environments, and we will continue to recommend reX 0.5 for real-world use.
>
> Ok, now how different is reX 0.5 from reX 0.4 (the current version I'm using)? Is 0.5 an actual official build, or are you referring to the current updated CVS as 0.5?
>
> Has an official 0.5 been released (as a stable build)? Please bear with me, because I'm still getting my feet wet, and stumbling through all this. I'm still trying to learn exactly where we are at (and where we are headed, the timeline, and projected future build dates). So I do apologize for all the questions. ;-)
>
> > However it is our intention that reX-NG would become usable by
> > the end of 2009.
>
> Ok, now I'm really confused. So what is "reX-NG"? Is this based on the 0.5 code, or is this a complete re-write?
>
> Also, how "compatible" is this future "reX-NG" with the current OpenSim main CVS?
>
> Will reX-NG be identical to the OpenSim main CVS (base code), so that all the features in OpenSim will be available inside the new "reX-NG"?
>
> I just want to make sure that the two are somewhat compatible (with additional features being built on top of the OpenSim source, making realXtend a "more advanced" version of OpenSim, with more advanced/additional features).
> At least this is how I currently understand how/why these two forked projects currently exist, correct?
>
> Please correct me/clarify if I'm wrong, because I'm still trying to understand the direction that realXtend is headed, and I just want to make sure that realXtend is built upon the same basic foundation as the OpenSim project (with just additional functionality added).
>
> I do agree that severing our ties with LL is the best way to move forward.
>
> > The legacy viewer will have another release, 0.5, to tidy up loose
> > ends we feel we have left. After that we will concentrate all of 2009
> > on a more permanent solution: reX-Next Generation.
>
> Ok, this is good to know. At least we have an idea of "where we are headed" right now.
>
> The next question I have is: What exactly is "reX-Next Generation" (reX-NG)?
>
> Does this even exist yet? Is this based off of the 0.5 build? Is it still backwards compatible with the OpenSim project? Is this a complete 100% re-write? What on earth is it? Is it just what we are calling the "completion" of the OpenSim & realXtend integration (the re-integration of the OpenSim and realXtend projects back into one main fork, instead of 2 separate forks)?
>
> How "compatible" is reX-NG with the current OpenSim project, and what are the main differences between reX-NG and OpenSim?
>
> I'm still trying to get a better understanding of exactly where we are heading. What is our current "Road Map" looking like for 2009, and have we set up projected dates and timelines?
>
> Is there any way that maybe Jani could conjure up some more pretty slides for us (maybe edit those old slides she had done previously, and just give us a bit of a "clearer picture" of where we are headed for 2009)?
>
> I'm just trying to see a basic outline of what we are doing, where we are headed, and the current (and future) Road Map for the project.
> > Since arriving at realXtend I have been working hard to help define a
> > strategy and plan for 2009.
>
> Yes, I feel your pain. ;-)
>
> But we definitely need to have a strategy and a plan for 2009, so we (as developers) at least have an idea of where we are headed with this project, and what our goals/projected timetables are for the future 2009 builds.
>
> > We felt we had no choice but to undertake writing our own viewer from
> > scratch.
>
> I completely agree, and this is by far the simplest and best choice. We need to just "break free" from LL.
>
> The direction I would like to head with the viewers/browsers is incompatible and inconsistent with LL, and I would like to see a LOT of additional features added to the browser(s)/viewer(s) that would require additional features on the backend server builds (features that are completely incompatible with LL).
>
> So there is no reason to be "weighed down" by LL. We'll just move forward, working hand-in-hand with OpenSim, and leave LL in the dust.
>
> I see at least 3 different browser/viewer builds that I would like to get done for the Windows and OS X/Linux side.
>
> Then also 1 separate "thin portable" browser version (thin client) for embedded system devices and mobile devices (iPhone 3G, Blackberry Storm, and G1 Android).
>
> The 3 browser versions for the PC side will be much more powerful (supporting higher-end 3D graphics engines/processors/high-end graphics cards, etc.) and will have higher-end features (to take advantage of the larger screens and higher performance/better graphics of large PCs), but the embedded systems (and mobile devices) should at least have basic functionality (instant messaging, and some simple thin-client functionality).
>
> I believe our three main "high end" platforms should be Windows, OS X, and Linux. We can do OpenGL 2.1 on OS X and Linux right now today.
> On Windows, I can probably get OpenGL support done within 3 months.
>
> This will at least take us to the "next level" as far as graphics capabilities are concerned. I don't want to get "flamed" or bombarded by people saying "it's not possible" to push those kinds of graphics over the internet.
>
> That's not true. We've done it on the government side for the past 30-40 years. So please don't bombard me with the silly nonsense. I'll explain more later, but we'll probably have to move towards a more advanced engine in the later browser versions.
>
> I'm working on trying to get a U.S. Government project released (for civilian use), and hopefully I should have it released "Open Source" by mid to late next year. I believe this should be the basis of our next-generation graphics engine.
>
> I can't say too much more than that, because it's still a classified government project, but it's an old legacy version (id Tech System 5) which we abandoned almost 15+ years ago. It's still a million times better than ANYTHING that even exists in the civilian sector, and probably better than anything that will even exist in the civilian sector over the next 20-30 years.
>
> So if I can get it declassified and released (as open source) for civilian use, then I think it would be a good basis for our "Next Generation" browser. This probably wouldn't happen till late 2009 or early 2010. The government is very slow moving, and even though I have a LOT of "pull", I'm still bogged down by politics (security issues, etc.). We're still trying to clean up the code a bit, and "sanitize" it a bit, so that it can be released for civilian use.
>
> Again, I don't have the current funding or resources dedicated to the project (as this is something I'm trying to do on the side, for the civilian community), but I do believe it's the "wave of the future".
>
> If you could only see what I see (on a daily basis), then trust me...
> you would understand that the way civilians are currently doing this is all completely backwards (and extremely primitive).
>
> So I already have the "vision" of where we need to be; it's only a matter of getting some of this technology filtered down to "civilian use" so that we could use some of it to head in the "right direction".
>
> Civilians are currently in the "early VR stages" that the U.S. Government was in back in the early 1950's. We're at least 58 years ahead of civilians right now, and about $8 trillion ahead, as far as research and development budgets are concerned.
>
> So I can't say too much about what the U.S. is doing, but I do know that most civilians are really wasting their time with "dead end" technologies.
>
> I do see some good "key points" here in realXtend (for practical civilian use), and that's the only reason I'm interested in trying to see this project move forward, closer in line to something similar to where the U.S. Government is.
>
> Again, the real-world civilian budgets are nothing compared to what the U.S. Government spends on Military/Government research and development; plus, civilian hardware is such garbage compared to what the U.S. Government currently uses, and we are also "crippled" by the old legacy garbage Internet network that is currently in place (for civilian use). I've taken all these "crippling" factors into consideration, but I still think we could build a somewhat "enjoyable" and extremely "realistic" experience for civilian use.
>
> I firmly believe I could launch this project about 30+ years "ahead of schedule" over the next 2-3 years, if I can get various U.S. Government (old legacy dinosaur/mothballed) projects declassified and tossed into the Open Source public domain community.
>
> It would at least give civilians a decent "leap forward" in technology.
> Again, this is all going to be an "uphill battle", but having seen (and having worked with) both sides (civilian and military), I do understand how "crippled" the civilians really are (as far as VR).
>
> The research and development being done on the civilian side is almost non-existent, and is so incredibly primitive that it's almost as if civilians are still rubbing two rocks together, trying to discover "fire".
>
> Sometimes I try not to laugh when I read various threads about various technologies (especially when I see the stupidity that Microsoft is doing). I nearly pee in my pants every time I see Microsoft open their mouth about VR.
>
> I know most of those guys over on the ESP project, and their Microsoft "earth" project, and it seems almost comical with the cartoonish garbage they are working on.
>
> I believe I know the direction we need to head. It's hard for me to "throw out hints" as far as technology is concerned (because I have to watch what I say, or "Big Brother" will put me in the tank). So I have to stay pretty "tight lipped", but I can assure you that the "Battlefield Visualization" that I started and developed (between 1991-2001) is way too advanced for most civilians to even understand or comprehend.
>
> I will just leave it at that. There are "foreign nationals" on this thread, and this is not a discussion that we can have on a mailing list thread. All I can say is, I agree with the "core beliefs" of what realXtend is trying to do.
>
> I do see the "good" in getting this technology pushed out to civilians for "civilian use". I see lots of benefits in it. I believe it would be good for the "common good" of society to move forward with this.
>
> I do think it would "revolutionize" the way civilians view the "internet" and the "social browsing" experience.
>
> So far, with what you have, and the resources you have (and are working with), you're doing extremely well.
> (Extremely primitive, but as far as civilians are concerned, it's a good "first step".)
>
> Unfortunately, you have a VERY VERY VERY long road ahead of you. It's like being a 45-year-old adult, watching a toddler take its first steps and attempt to stand up on its own two feet.
>
> It seems comical, and almost funny to watch.
>
> To be honest, if we really wanted to move forward with such a "real-life" large-scale project (for civilian use), there's way too much we would need to do and accomplish.
>
> First off, I would need to redesign the current "Internet" as you know it (as civilians know it). The whole TCP/IP stack, and various other "dead end" technologies that you are currently using (as civilians), were abandoned almost 58+ years ago (as "dead end" technologies).
>
> There are reasons for this, and in the next 20-30 years, you (as civilians) will slowly "wake up" and realize that some/many of your efforts have been completely wasted, and then you'll be forced to take several hundred steps backwards and start completely over.
>
> Trust me on this one. I've studied the work (classified work) from the 1920's to present, and I can see all the "mistakes" we have made (along the way).
>
> Civilians are still rubbing rocks together, trying to "make fire". At this point, I see nothing but a whole lot of "smoke". No fire, just smoke.
>
> To be honest, even if I took 8 or 10 of the top universities in the world (and compiled ALL their talent and resources), I just don't think we could even "put a dent" in where we need to be (technology-wise) in the civilian sector.
>
> That is the first problem. It comes down to money and resources. I'm not saying we don't have smart people in the civilian sector (because we do, and I see plenty of them), but unfortunately it comes down to financial budgets.
>
> Civilians don't have $480 billion budgets, or $1 trillion budgets (over 5 years).
> These numbers and concepts are "foreign" to civilians. Civilians still can't understand how America was putting a man on the moon, or building the atomic bomb, or building spy satellites (again, I can't confirm nor deny whether they exist...), or landing Rovers on Mars.
>
> This is all "old dead-end technology". If I could tell you what I know (without spending the rest of my life in prison, or being put to death for treason), it would make you crap your pants.
>
> So when I say that we are currently "rubbing rocks together", I'm just being blunt and honest. The only way I see us moving forward (with a somewhat large leap forward) is to start modeling the civilian side after what the U.S. Government has been doing over the past 20-30 years.
>
> The technologies from 1991-2001 (even though they are well over 8-17+ years old, and considered "dead end" technologies by the government) would still give civilians A HUGE HUGE HUGE "leap forward" toward where they need to be, as civilians.
>
> Civilians just don't have the "financial resources" (for research & development) to "catch up" to the U.S. Government. So civilian research is lagging about 50+ years behind.
>
> I watch civilians scratch their heads like monkeys, and sometimes I listen to researchers say foolish things like "That is NOT possible!"
>
> (That's only because in "their eyes" the world is STILL FLAT. They haven't even discovered that the world is ROUND yet, so civilians really are clueless when it comes to virtual reality & advanced/secure communications systems.)
>
> First off, without saying too much... just think for one brief moment that a U.S. satellite (in theory) floats around in orbit, and captures about 3,800 images a second. (In theory of course, but I don't even want to speculate on true capabilities; just theory, of course.)
>
> Do you have any idea how large just ONE of those images is?
> Again, I can't even discuss capabilities, or speculate on current technologies, but just think for a moment (and use your brains) and just try to understand HOW LARGE just one single image is (understand the resolution of those "birds" and the imagery). Again, I can't speculate one way or another, but let's just say that a "chip" is about 500GB.
>
> A "chip" is just a very small portion of an actual high-resolution image. A standard image could be anywhere from 2TB (for a small image) to 80TB (for a decent-size image, like an airfield or naval base).
>
> Now just think that those "birds" are flying around rattling off about 3,800 images a second (in theory, of course). Do you understand the type of bandwidth involved?
>
> Again, 99.999% of civilians wouldn't believe it (or understand it) even if you showed it to them, or explained it to them (simply because the way they view & understand "the internet" is wrong).
>
> The "internet" (as you understand it) is broken. That's why it was "tossed in the garbage" by DARPA in the mid-to-late 1950's, and turned over for "civilian & scientific use".
>
> Don't think for a second that the U.S. Government would just turn over a technology without having a much more advanced replacement. ;-)
>
> Again, this is all "theory", of course. I can't confirm nor deny one way or another, but having seen, and worked on, the "replacement" (basically "version 2" and "version 3" of what you civilians would consider "the internet"), I can at least see the "shortcomings" and problems you would face (in the future) with your old legacy and dead-end technology.
>
> Sure, it's fine for static web pages (whoopti doo, civilians learned how to browse static web pages), but the point is, when we start to get into advanced photogrammetry techniques, there will be "bottlenecks" (limited by the bandwidth of the secure communications between the satellites and ground stations).
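Taking the figures quoted above entirely at face value (a 2TB-80TB image, 3,800 images a second — these are the email's own claims, not verified), the implied data rate works out as a simple back-of-envelope calculation:

```python
# Back-of-envelope check of the bandwidth claim above.
# All figures come from the email itself and are taken at face value:
# 2TB-80TB per image, 3,800 images per second.

TB = 10**12  # bytes per terabyte (decimal)

images_per_second = 3_800
small_image = 2 * TB    # bytes, "small image"
large_image = 80 * TB   # bytes, "airfield or naval base"

low = images_per_second * small_image    # bytes/second, lower bound
high = images_per_second * large_image   # bytes/second, upper bound

print(f"low:  {low // TB:,} TB/s")   # 7,600 TB/s
print(f"high: {high // TB:,} TB/s")  # 304,000 TB/s
```

Even the lower bound is thousands of terabytes per second, many orders of magnitude beyond any deployed link, which is presumably the point being made about bandwidth.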
> There will also be protocol problems. That's about all I can say. Civilians need to learn how to drastically "improve efficiency" in the way they handle large amounts of data and limited bandwidth. (Trust me on this one.)
>
> I can't say too much more than that, but that is one of the "key fundamental problems" that civilians will have.
>
> The next step is the problems with the way you are viewing/rendering graphics and rasterizing your images. Again, you are going about it ALL WRONG.
>
> I can't say too much (on an open thread), but I can throw a few "hints" out there, just for something to "chew on".
>
> The type of technology that you want to be using (on the civilian "browser" side) would need to have an advanced rendering engine that does not rasterize data (as you understand it now), and does NOT work by DOWNLOADING textures, but instead works by "streaming" textures (as needed).
>
> Again, I could sit down and work on a "proof of concept" (for civilian use) just to demonstrate what I am talking about, but this would give you an 80,000% increase in "efficiency" of bandwidth.
>
> Just as "streaming a movie" is much more efficient than trying to download a whole movie all at once (from one single server).
>
> The next step is bandwidth resources. Your civilian "ideology" of using servers is all wrong. It's completely backwards.
>
> It's futile, "dead-end" technology. Those servers act as "bottlenecks". Imagine if 35,000 people tried calling the same phone number all at once. What do you think would happen?
>
> Yes, same concept. So just throwing a little "hint" out there (to chew on): the real goal is to move more towards a "meshing" technology. No central server, no real "command center". No real "one focal point" or "top down" topology. That structure doesn't work. Stop trying to think like that (as civilians). It doesn't work.
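The "streaming textures instead of downloading" idea above can be illustrated with a toy generator: the client pulls only the chunks it currently needs, rather than fetching the whole texture before rendering. This is a minimal sketch, not anyone's actual engine; the chunk size, texture store, and `fetch_chunk` helper are all invented for the example:

```python
# Toy sketch of on-demand ("streamed") texture loading versus
# downloading the whole texture up front. Purely illustrative.

CHUNK = 64 * 1024  # 64 KiB per chunk (arbitrary choice)

# Pretend remote texture store: name -> raw bytes (256 KiB texture).
TEXTURES = {"terrain": bytes(range(256)) * 1024}

def fetch_chunk(name, index):
    """Fetch one chunk of a texture, as a streaming client would."""
    data = TEXTURES[name]
    return data[index * CHUNK:(index + 1) * CHUNK]

def stream_texture(name):
    """Yield chunks lazily; the renderer consumes only what it needs."""
    index = 0
    while True:
        chunk = fetch_chunk(name, index)
        if not chunk:
            return
        yield chunk
        index += 1

# A renderer that only needs a coarse version can stop early, so most
# of the texture is never transferred at all.
stream = stream_texture("terrain")
first = next(stream)
print(len(first))  # 65536 -- one chunk fetched, not the full 256 KiB
```

The bandwidth saving comes from laziness: chunks the camera never gets close to are simply never requested.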
> Think more along the lines of how the internet works (lots of different relays that just bounce information between each other, from one to the next, to the next, until the information reaches its intended target).
>
> Same concept. To move large images, it's much more effective to use something that breaks an image up into small microscopic pieces, and then streams those pieces (instead of a "download" type technology). Think "BitTorrent", because that's about the closest concept you have in civilian use right now.
>
> Same concept. Then the next hurdle is the actual storage and rendering. Data storage will be a major problem for civilians. It's just too costly and primitive at this point. You would need "deep pockets" like the U.S. Government to afford the storage capacity. Even Google can't afford (or even fathom) the storage capacities of the U.S. Government.
>
> So this is something we would need to consider and think about. We would have to somehow generate some revenue inside the "world/grid" to help offset the costs of a large-scale Grid storage center. These would be located (and mirrored/duplicated) in several locations throughout the world (as basic "caching servers").
>
> The main location would probably be stateside (in America), with maybe 8-10 mirrors located throughout the world (as local "caching servers") to reduce the bottlenecks caused by Transatlantic fiber connections.
>
> There would probably need to be 3 main sites in America. The first I would probably place in Chicago (the main global hub for the Internet backbone).
>
> 95% of the main Internet traffic gets routed through Chicago (between the East Coast/NY and the West Coast/CA).
>
> So the first datacenter would probably go in Chicago. Two additional data centers would later pop up (one in NYC, and one in Los Angeles, CA). The two additional data centers would serve as "caching servers".
> Those "Tier 1" caching servers would be used as "relays" over Transatlantic and Transpacific fiber connections, between Europe (on the Atlantic side) and Asia (on the Pacific side).
>
> The "Tier 2" caching servers would be housed in various countries (England, Germany, Finland, China, Russia, etc.).
>
> Then "Tier 3" caching servers would be housed in major cities throughout each country.
>
> Then "Tier 4" caching servers would be housed in smaller townships, villages, etc., and tied directly to the closest major city/Tier 3 server.
>
> I know this goes back to a "top down" topology, but understand that this is NOT how data is passed between users; this is simply more of a "DNS" type of topology (similar to how our current DNS structure works). You have ROOT servers, and then different tiers of servers.
>
> The DNS servers don't control anything; they simply cache DNS information for local users (and keep the root servers from getting overloaded).
>
> Same concept. Then each local "Tier 4" caching server would implement a "mesh" type technology (almost similar to a BitTorrent-like technology) with the local client servers, and each client workstation would "mesh" between local ad hoc users (at the ground level) and relay packets (ad hoc) between local mesh users. Eventually the information would get routed back to the main local caching server, and then make its way back towards the upper-tier servers (but the majority of the data would be passed ad hoc among the mesh, and not via the direct connections between root servers).
>
> Just like data on the internet is not passed between the DNS servers; instead it is passed and routed between various other servers (in the network path). Same concept.
>
> No client workstation would get overloaded, because it would just be small little bits being relayed (small pieces), kinda like a torrent technology.
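The DNS-style tiered caching described above boils down to a lookup that tries the nearest tier first, walks upward toward the origin on a miss, and populates each tier on the way back so later lookups stay local. A minimal sketch, with the tier list and `origin` store invented purely for illustration:

```python
# Minimal sketch of the DNS-style tiered cache lookup described above.
# tiers[0] is Tier 4 (closest to the user); "origin" stands in for the
# main datacenter. All names here are invented for illustration.

origin = {"region/42": b"asset bytes"}   # authoritative store
tiers = [dict() for _ in range(4)]       # Tier 4, 3, 2, 1 caches

def lookup(key):
    """Try the nearest tier first; on a miss, walk up toward the
    origin, then fill the lower caches on the way back down."""
    for depth, cache in enumerate(tiers):
        if key in cache:
            hit_depth, value = depth, cache[key]
            break
    else:
        hit_depth, value = len(tiers), origin[key]  # missed every tier
    # Populate every tier below the hit, so later lookups stay local
    # and the upper tiers (and origin) are shielded from load.
    for cache in tiers[:hit_depth]:
        cache[key] = value
    return value

lookup("region/42")             # first lookup goes all the way to origin
print("region/42" in tiers[0])  # True -- now cached at Tier 4
```

As with DNS, the hierarchy only answers "where is it"-style lookups cheaply; bulk data would still move peer-to-peer in the mesh the email describes.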
> The only difference is, it's not really a "download" per se (like a torrent); instead it's a "live stream".
>
> It's hard to explain, because it doesn't exist in civilian use right now, and a torrent is about the only thing civilians would understand or comprehend at this point. But it's the same/similar concept.
>
> The only difference is, it's almost like having a "streaming BitTorrent" server. How/why does this work? Because now you are NOT downloading all your information (all your textures) directly from ONE media server, but instead are pulling texture pieces (very, very high-resolution and large texture pieces) from various other users (peer to peer).
>
> This takes the load off of the texture servers and world servers, since everything is done P2P on the client side.
>
> The next advantage is, it's streamed (not downloaded). So a user doesn't have to wait till a full portion of the world is completely downloaded to begin walking around. The first few pieces are downloaded immediately, and then the additional pieces continue to come in (similar to the "video caching" that you see on a YouTube client/server: the viewer caches maybe 30 or 45 seconds of footage, so that it doesn't appear choppy to the user).
>
> Same concept. The textures are all streamed (and cached), and the advanced graphics engine actually uses STREAMING textures (not static file textures).
>
> Again, this all needs to be done on the browser side. These are the first few "baby steps" that we need to make when it comes to bringing decent VR technology to civilian use.
>
> These are things that I was doing back in 1986-1991, and it's stuff that we need to start bringing to the civilian sector in 2009-2011.
>
> I believe civilian computers are "mature enough", and a Mac Pro could easily handle OpenGL 2.1 graphics (and possibly even OpenGL 3.x graphics), so I think the civilians are at least ready for the technology.
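The "streaming BitTorrent" idea, combined with the YouTube-style buffer described above, amounts to: fetch fixed-size pieces of an asset from whichever peers hold them, and begin in-order playback once a small lead buffer is filled rather than waiting for the whole asset. A toy sketch — the peers, piece data, and buffer size are all invented for the example:

```python
from collections import deque

# Toy model of "streaming torrent" delivery: pieces of one asset are
# striped across several peers, fetched in order, and playback starts
# once a small buffer is filled. Peers and pieces are invented.

PIECES = [f"piece-{i}".encode() for i in range(10)]    # the full asset
PEERS = {f"peer{p}": {i: PIECES[i] for i in range(p, 10, 3)}
         for p in range(3)}                             # striped over 3 peers

def fetch(index):
    """Ask whichever peer holds this piece (simplified: scan them all)."""
    for store in PEERS.values():
        if index in store:
            return store[index]
    raise KeyError(index)

def stream(buffer_pieces=3):
    """Yield pieces in playback order, but only after a small lead
    buffer is filled -- the YouTube-style cache the text describes."""
    buf = deque()
    for i in range(len(PIECES)):
        buf.append(fetch(i))
        if len(buf) >= buffer_pieces:
            yield buf.popleft()
    while buf:          # drain the remaining buffered pieces
        yield buf.popleft()

received = list(stream())
print(received == PIECES)  # True: in-order playback from 3 peers
```

The buffer absorbs jitter from slow peers, while the peer striping keeps any single server from becoming the bottleneck the email complains about.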
> Now our next step is to begin bringing it to them. I know this is a LOT to "chew on", and there may be some skepticism that this technology even exists, or that it's even possible, but I can assure you... yes, it does exist (on the government/military side). But it's time to start bringing this technology to the masses (the civilians).
>
> And although it will be extremely "primitive" compared to the actual government side, it's still about 30-40 years ahead of what current civilians are doing and working on.
>
> So this, ideally, is where I would like to see "realXtend" head. I can't do all this myself, and it would take teams and teams of engineers willing to work as one cohesive unit to make this all happen.
>
> I believe it's best to keep it Open Source, and keep the whole community working together (as one unit). I can work on getting some of the older "dead end" technologies released as "Open Source" (once I can get things "cleaned up" and "sanitized", and get approvals from the government to declassify and "officially abandon" some of the older "mothballed" technology).
>
> id Tech 5 would be good for civilian use. We're currently using id Tech 7 and id Tech 8, but I know 5 is way long gone and way mothballed.
>
> So I'm pretty sure I can get that released as Open Source, or at least released for civilian use, over the next 12 months.
>
> I already have a good portion of it sanitized, and I have "leaked" bits and pieces (unclassified portions) for public use. I can't say too much, but just give us a little bit of time, and I think I could bring a "new advanced engine" to the table by November or December of 2009.
>
> Hopefully we can get this incorporated into the new browser builds for 2010 (as an advanced "high end" gaming engine).
>
> It's cross-platform, extremely light, and it would kick the pants off of anything that the XBOX 360, PlayStation 3, or anyone else could even come up with.
> At that point, we would probably "port" the browser as a "game". Then we would have to swallow the licensing fees and licensing costs (shoved up our backside by Microsoft and Sony) and then release the browsers as "games" that people can purchase.
>
> We could probably sell them for $29, and still cover our licensing costs, fees, and manufacturing/production costs, and turn a small profit (about 99 cents profit per DVD/game).
>
> It's not much, but at least it could be used to help fund further research and development. The next logical step would be to show developers how extremely easy it is to develop for the new "cross platform" advanced engine.
>
> Gaming developers could now design and build for ONE SINGLE SYSTEM, and it would be cross-platform (work on PS3, XBOX 360, Wii, etc.).
>
> No more having to develop for 3 or 4 different gaming platforms, and no more having to pay 3 or 4 different licensing fees.
>
> We could offer the platform to gaming developers FOR FREE.
>
> Just as you can use the Ogre engine; same concept. Developers can develop games: no licensing fees, nothing. Sure, it would destroy Microsoft's (and Sony's) business model, because the hardware platforms would just be used as hardware platforms (nothing more).
>
> The XBOX 360 and PS3 do have some decent graphics processors, and they are very nice pieces of hardware that we could use as platforms for our eventual future gaming engine (and gaming technology).
>
> Everything would be so realistic and life-like that I seriously doubt that ANYONE would use anything other than our gaming engine. That's how confident I am that we could build a TRUE industry-standard VR platform.
> Microsoft and Sony and Nintendo would just become hardware manufacturers. They might get upset about the whole situation and attempt to refuse to license us in the future (on a next-generation XBOX or PlayStation), simply because we would be hurting their licensing fee structure and licensing sales (since developers would all be developing games under our platform, instead of the Microsoft platform, and selling their games in our "Virtual Marketplace" instead of the Microsoft XBOX 360 Marketplace).
>
> That would "cripple" Microsoft's sales a bit. So I'm sure that Microsoft will get their panties in a bunch, and possibly not license us in the future. This is not a problem, because I can work with Dell. I know Jeff Clark (CEO of Dell?), and Carla and a few of the VPs over at Dell, and I'm fairly confident that I could get Dell (and even HP) to build hardware-specific gaming systems (an XBOX 360/PlayStation 3 type system), except with integrated Blu-ray players/recorders, and an integrated 1TB hard drive.
>
> Why? Because I would like to market the devices as "MeshBox" devices (that are a gaming device, a TiVo/DVR device, and a social networking VR platform, all in one).
>
> Plus, the device would be used as a replacement for the standard home computer. It would have HDMI outputs (and would connect directly to an HD flat-panel television, with 1080p), and would have integrated 7.1 digital surround sound.
>
> So users could use it to experience Blu-ray movies, online gaming, and also high-definition recording (of television, movies, etc.), similar to a TiVo device. I would also put TWO tuners inside of it, so you can watch one channel and record a second channel, or even record two channels at the same time.
> At first I would like to release a "System 1" version (with a 1TB hard drive and 2 tuners), but later we can release upgraded "System 2" and "System 3" versions (with 8-core, 12-core, or 16-core processors).
>
> The "System 1" would probably be similar in technology to a current 8-core Mac Pro. Very similar hardware/mainboard, just slightly modified so that it could fit in a smaller enclosure; the graphics chipsets would probably have to be embedded on the mainboard (similar to the iMac).
>
> It would have built-in USB 3.0, built-in FireWire 800, built-in Bluetooth, and built-in 1-gigabit ethernet (possibly even 10-gigabit ethernet, if we can get the Intel 82598EB chipset (10GbE controller) licensed fairly cheaply). I would need to hammer out the details with Intel, and see if they would be willing to work with us (on some decent pricing), because I would like to implement the newer Intel 10GbE controller, which is optimized for multi-core XEON-based hardware.
>
> The gaming platform would be perfect (and optimized) for technologies such as VoIP and real-time video on demand.
>
> It would be a very power-efficient solution, and it could even be used as a "home-based server" for storage of music, movies, and family pictures, and could even host a family website.
>
> I suppose it would even be powerful enough to be used as a small business server, and even replace a modern "XEON-based" small business server (running Server 2008/Exchange 2007/SQL 2008, etc.).
>
> We could come out with a "Small Business Version" that runs Windows Server 2008 Core (no internal GUI); system administrators could then log in (via remote desktop) to the Core MeshBox, and use a standard client-side GUI to control and administer the server.
>
> The "MeshBox" device would connect P2P (if you opened up ports on your firewall and allowed it).
> The "meshing technology" would be used for high-speed downloads (similar to BitTorrent technology), so people could download HD movies, HD games, music, pictures, or whatever they want.
>
> Sorry for "hijacking" the thread, but those are just my personal views and personal opinions. I know it's a LOT to swallow and digest, but I think eventually that's where we need to head.
>
> All I need is a small handful of bright people, and we could easily make this a reality. Just by looking at some of the people on this message list, I already know that there is some very "bright talent" here.
>
> I know all this probably wasn't on the "Original realXtend Road Map", and poor Jani is probably shaking her head in disbelief right now, but that's really where I would like to see this project eventually head.
>
> I have patents on a lot of technologies, and I've released a lot of technologies as "open source" projects (for civilian use); most of my technologies are still being used by the U.S. Government today, and some of my technologies have filtered their way down to civilian use. I did a project for an individual in Bali back in 2001-2002. We rolled out some "meshing technology" (that I had engineered back in 1986) and used it for a wireless mesh system in Bali and about 37 other remote islands (throughout rural parts of Indonesia, West Papua, etc.).
>
> It was used for missionary work and educational purposes. At the time, it was "bleeding edge" technology (as far as civilians were concerned), but it's dead-end stuff, just garbage I was throwing out there for the civilians to "chew on", just to prove that it could be done.
>
> The "big picture" that I have is very different. I want to use the same types of technology that I have designed and developed for the U.S. Government, and eventually "replace" the current "internet" infrastructure.
>
> I know it sounds like an "impossible task", but really it's not.
> I can probably get a $100 billion grant from the U.S. Government and start by laying a new infrastructure in the United States (for civilian use). Once that is complete (it may take 5-6 years), I can work with two researchers I know (and trust) on certain technologies that they are working on (for transatlantic transmission rates). I believe I could push about 2.8 Gbps over standard copper (phone lines).
>
> This would only be temporary, until I could raise the funds and equity necessary to lay the advanced underwater fiber needed to link the continents together. It's a different type of cabling, and the current civilian stuff simply won't work (or give me the bandwidth that I need) for a decent backbone infrastructure. I want 1 Tbps speeds (minimum) on the fiber backbone; this could later be upgraded to 10 Tbps or 100 Tbps if needed, but 1 Tbps should be plenty for now.
>
> The cabling would be sufficient to handle 100 Tbps (in the future) if needed. Ah, I'm getting way off topic here, but yes, I've seen and worked on a lot of stuff on the government side, and I'm just frustrated that these technologies are not currently in use on the civilian side.
>
> It could be another 50 or 100 years before civilians get off their lazy rumps and start doing this stuff, so as I finish up my work on the government side (at least for now), I believe this would be a great way to stimulate the global economy and bring together hundreds of thousands of engineers (from all over the world) who could then use a much more advanced high-speed network.
>
> Once I bring the network trans-Atlantic (no, I don't want to use satellite; there's just too much latency, I need fiber, don't ask...), we'll use it to connect all the main universities (globally) to the network backbone, similar to how the "civilian internet" spread.
> Then local internet service providers will tie into the local university fiber backbone and serve their local communities. At this point, it would probably move from fiber over to a secure advanced wireless technology, something much more advanced than WiMAX, something that can push 100 Gbps speeds on the backbone side while still delivering 500 Mbps on the client/customer side.
>
> I have a couple of advanced protocols that we could use, and I'll save all that for a much deeper discussion, but for now, that's just a general outline of what/where I would like to head.
>
> Who would "own" the advanced network? Well, I've mulled it over, and that's a very good and very tough question.
>
> I don't think it should be a commercial entity (like AT&T, MCI, etc.); I believe more in an independent non-profit organization, something that I would like to "found" that would be similar to an independent "civilian DARPA" research facility.
>
> Researchers from all over the world could come together there (similar to the U.S.-based DARPA), except it would be civilian-based (for civilian projects), and eventually it would be used to fund projects such as bringing clean water and free, clean/renewable energy to all of the world's population.
>
> Trust me, it can be done. This is NOT rocket science. If I can get a satellite off the ground and into space, certainly we can solve the "simple problems" of bringing this technology to civilians, helping to better the global economy and global socialism (and hopefully using this as a "bridge" for world peace). It will help "level the playing field" between poor third-world nations (in Africa or China) and the rest of the world, and at least make these individuals productive workers in the global economy.
> If I could have half or a third of Africa modeling virtual worlds (for pennies on the dollar), we could easily get the whole solar system "modeled" in maybe two or three lifetimes.
>
> I probably would never be alive to see it, nor would anyone on this forum/message list, but at least I could lay the groundwork. At least I could get the ball rolling, and just hope that the future "civilian DARPA" continues to carry the torch well after I am dead and long gone.
>
> There are always pioneers. Many of my ancestors (Orville & Wilbur Wright, and Frank Lloyd Wright) were well-known early engineers (and pioneers), and much of their early engineering (and thinking) set the "standard" for concepts such as "flight" and even "advanced building architectures".
>
> I have that same bloodline. I lie in bed at night thinking up crazy things, and unfortunately I just need a team of good engineers who are willing to "step forward" and "fight the good fight".
>
> The "MeshBox" platform is just a simple strategy to show the world what it's like to "think OUTSIDE the box".
>
> What will the "MeshBox" platform be used for? Curing cancer. That may sound stupid, or comical, but it's true.
>
> 99% of servers in this world are "underutilized". How much idle processing time does your current computer have? Go ahead, click on Task Manager and see what your current CPU utilization is.
>
> Those unused clock cycles are just wasted energy. By utilizing a "thin client" plugin embedded in each and every MeshBox device, that unused processing power could be used for extremely high-performance grid supercomputing (for civilian scientific purposes), such as DNA mapping, gene mapping, or even looking for a cure for cancer.
>
> With millions of console gaming devices globally, all those devices (and their unused clock cycles) could be "pooled", and then researchers could apply for "processing power" grants.
> We could give out grants (based on the project) and donate the processing power to support researchers' independent projects (on an "as needed" basis).
>
> Users could even get a monthly e-mail, almost as a kind thank-you, letting them know that their unused clock cycles are being used for global-warming research, geothermal research, cancer research, or whatever.
>
> I believe that in the far future we could even set up a web-based control in the browsing client, where users could go online, see a list of maybe 50 or 100 different scientific projects, and rank (in order from 1-10) where they would like their unused clock cycles and processing power to go.
>
> So clients could actually decide where to "donate" their unused processing power. Ultimately, it would help researchers (and cut down on their costs of having to run huge server farms and super-high-speed networks). It would really bring down the "overhead" costs of advanced civilian scientific research projects.
>
> We thank you very much for your patience, and hope that you can stick with us as we make this transition to a more sound long-term direction.
>
> No worries, I'm here to stay. I believe we're heading in the right direction. Now just give us a detailed road map with slides and timelines, then start dividing the tasks up, begin handing out "Team Lead" positions, and let's get cracking on the development. ;-)
>
> Hopefully we can evolve this into a great open-source project and develop a very advanced technology that we could eventually use as a platform for advanced gaming, social networking, and educational/professional use.
>
> Mark
>
> P.S. Ryan, I would like to lead the realXtend Browser/Viewer cross-platform development if at all possible. Thank-you!
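[Editor's note: the idle-clock-cycle pooling described above, where donors rank which scientific projects receive their spare processing power, is essentially volunteer computing; BOINC-style systems work this way, dividing a donor's idle time among projects by a resource-share weight. As a rough illustration only (this is not realXtend code, and all names here are invented), a donor's 1-to-10 ranking could be turned into a weighted split of idle CPU time like this:]

```python
# Hypothetical sketch: split a donor's idle CPU-hours across ranked projects.
# Rank 1 is most preferred; here each project is weighted by 1/rank and the
# weights are normalized -- one simple scheme, not the method from the thread.

def allocate_idle_hours(idle_hours, ranked_projects):
    """ranked_projects: project names, most preferred first.
    Returns {project: hours}, proportional to 1/rank."""
    weights = {name: 1.0 / rank
               for rank, name in enumerate(ranked_projects, start=1)}
    total = sum(weights.values())
    return {name: idle_hours * w / total for name, w in weights.items()}

# Example: a donor with 10 idle CPU-hours ranks three projects.
shares = allocate_idle_hours(10.0, ["cancer", "climate", "geothermal"])
```

A real system would of course meter actual idle cycles and dispatch work units over the network; this only shows how a ranked preference list can become a proportional donation.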
> On Mon, Dec 15, 2008 at 7:13 AM, Ryan McDougall <[email protected]> wrote:
>> Hello,
>>
>> Since arriving at realXtend I have been working hard to help define a strategy and plan for 2009.
>>
>> One of the main problems we discovered was that working with the forked LL viewer code was making our changes more and more difficult, and more and more unstable, as time went on. We felt we had no choice but to undertake writing our own viewer from scratch.
>>
>> Right now we are in the very initial design phase, and very little has been decided except that it will be Apache 2.0 licensed and contain no code whatsoever from LL. We are hopeful that this means OpenSim will allow you to work on reX and OpenSim at the same time (provided you haven't seen any LL copyrighted code).
>>
>> The legacy viewer will have another release, 0.5, to tidy up loose ends we feel we have left. After that we will concentrate all of 2009 on a more permanent solution: reX-Next Generation.
>>
>> Practically, this means that some of the bugs you have filed with us will be marked "won't fix". Realistically, however, the reason we are making reX-NG is that it was practically impossible to fix them in the current code-base anyway.
>>
>> We understand that some of you have been using reX in production environments, and we will continue to recommend reX 0.5 for real-world use. However, it is our intention that reX-NG would become usable by the end of 2009.
>>
>> We thank you very much for your patience, and hope that you can stick with us as we make this transition to a more sound long-term direction.
>>
>> Cheers,

this list: http://groups.google.com/group/realxtend
realXtend home page: http://www.realxtend.org/
