On 9 December 2014 at 14:10, Christian de Larrinaga <[email protected]> wrote:
> Hi Gord,
>
> http://www.artemis.com/pcell
> Be interested to learn if anybody has got under the veil of this one?

I haven't seen that before, but I can guess, having skimmed through the "tech" vid in a minute or less. What follows is a common rant, tailored to the "wow, technology!" being pushed|sold|waved-about|linked-to...

Instead of quite a lot of radio station sites oriented in cells, which works OK until capacity is reached in that service area, they will use lots of little radio stations, effectively forming smaller and smaller cells. Nowt new - some cells are tiny, like femtocells. Some users use P2P methods if they want the same things that others do. Nowt new - BitTorrent or VHS copy sharing are examples. To share effectively means proximity, awareness and similar needs.

Every single radio station of any form that had capacity or throughput issues has used a similar technique since day one. Not all had good marketing. Reduce the range and you improve co-channel interference (as well as the side effect of reduced adjacent-channel interference due to lower EIRPs, both wanted and spurious). As long as each new small, local site has not hit maximum capacity for the required users in that service area, everything is sweet. As soon as a limit is reached, you have to reduce the service areas, with reduced powers (which doesn't always mean reduced performance, as you might think), by adding more cells. The process repeats until someone has a good idea. Limit hit, add more sites or upgrade tech to faster or wider or $something_normal.

Every now and again, someone has a good idea. Some of these need marketing, some gain traction on their own merits, some are obvious as soon as you see them. Some are purely theoretical and may take decades and several generations of kit to become practical.

In the days of morse and trained telegraphists, a station could hit its capacity in various ways.
Staffing limits (morse training takes time and good fists take decades to develop), lack of telegraphy equipment and links, power and cooling constraints (water-cooled sub-megawatt stations existed in many countries), paperwork/procedural inefficiencies, weather and climatic problems. Originally, none of these were limitations of the "ether", as the transmission medium was called.

Eventually, there were so many stations that more and more frequencies, or groups of them known as "bands", needed to be used, as well as some form of planning to make sure they could co-exist without saturating the medium. Lower-power, shorter-range stations were introduced, in a similar manner to nano-cells. They were shorter range not really by design but by chance: unused bands that were less suitable for long-range working could be utilised for shorter-range use at certain times of the day.

At this point in history, we have frequency division (different frequencies and bands), time division (operators allowing other traffic on a time-sharing basis), range-limiting (where long-range working wasn't needed), backhaul limitations (onward transmission by telegraph cables), procedural optimisation (streamlining message handling), and beamforming (using one or more directional antennae and exploiting them actively, often by human intervention). At this point in the tale, we're looking at the late 1920s and early 1930s.

In the remaining 90 years, throughput speeds have increased, and equipment has become more portable, cheaper in relative terms for the end-user (arguably), all-pervasive and an expected part of everyday life. The problems of the '20s and '30s are the same today. User expectations, which should be high, are in effect destroyed by bad experience of mobile communications, especially in busy areas. Cells are large due to lack of investment and forward thinking. Backhaul is a problem, as it is everywhere, until you have enough.
If "enough" backhaul can't scale up when demand goes up, you hit the problems when the limit is reached. The actual radio medium (the "ether") hasn't changed; the laws of physics are the same as they were 100 years ago, and have probably always been similar to what we see (or think we see) today. We (RF people, as a subset of tech society) may understand them better, that's all.

The obvious solution to most of the perceived problems is to use diverse backhaul (install femtocells on xDSL all over the place), or massively diverse backhaul (use technologies like BitTorrent or similar P2P), to provide more and more sites (cells) to serve the users. Nearly 90 years ago the same thing happened: more radio stations were installed where needed, to provide capacity as needed, not just fill in gaps in coverage ...and... radio ops would share the weather and traffic lists, and even rebroadcast them to stations further away or with less advanced equipment, using diverse methods. That makes three assumptions - customers had to be willing to pay, companies wouldn't overspend or under-provide, and weather was wanted by many people.

The selling point of this pCell $product|concept seems to be based on the assumption that everyone wants the same Netflix movie at roughly the same time. It claims to be able to provide 100% capacity to every user. As long as they want the same Netflix movie. I'm not convinced that the speaker knows the physics or understands the No. 1 problem he's creating for the customer - if user 1 wants to watch porn and user 2 doesn't, then user 3 had better have the porn that user 2 wants, or the 100% claim is bullshit. You can only have 100% once, not twice, if the content is different, in a given spectrum and coverage area. Yes, the coverage areas are small (probably near-field). Yes, it's new to him. It's possibly new to the audience. There are many parallels to what he|they have claimed to invent|find|do.
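The "only one 100%" point is just back-of-envelope arithmetic. Here's a sketch of it with made-up figures (a notional 100 Mbps shared channel - nothing from the pCell material):

```python
# Back-of-envelope sketch of shared-spectrum capacity, with made-up numbers.
# In a given spectrum and coverage area there is one pool of capacity:
# identical content can be sent once and heard by everyone (broadcast),
# but each piece of *different* content needs its own share of the channel.

def per_user_mbps(channel_mbps: float, distinct_streams: int) -> float:
    """Throughput each user sees when the channel carries this many
    distinct streams; users wanting the same stream share one copy."""
    return channel_mbps / distinct_streams

channel = 100.0  # assumed shared-channel capacity, Mbps

same = per_user_mbps(channel, distinct_streams=1)   # everyone wants the same movie
diff = per_user_mbps(channel, distinct_streams=10)  # ten users, ten different movies

print(same)  # 100.0 - the "100% to every user" case
print(diff)  # 10.0  - 100% once, not ten times
```

Same channel, same physics: the 100%-to-everyone claim only survives while `distinct_streams` stays at 1.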
Hell, the POTS telephone system itself is a possible analogy, back to the days of dial-a-disc and the speaking clock. Limited content, in the case of dial-a-disc or the speaking clock. Shared information at the end of the link (you set your watch by it and could tell people who asked you the time thereafter). Backhaul and coverage were extended until almost every building in the country was served by a local dissemination station that provided the service. Humans provided the P2P bit at the end. Near-field local comms were by acoustic dissemination between people who wanted the information/message/content. Service area is limited by the laws of physics (transmission lines, EMC issues, power). You can optimise the exchanges to improve speed (STD).

For datacomms over RF, we have come so far as a society from those early days that the users are so far removed from the technology that they can't understand the physics or identify the limiting factors in their troubles.

It's an exciting technology in the same way that blue Smarties were exciting. They are still Smarties and are fucking chocolate inside. There are only so many in a box. If you want, you could cut them up and move your mouth closer to eat them. You could pop one in your mouth and share with your neighbour if they want one. Smarties can't be duplicated. You cannot break the laws of physics, or Smarties.

Forgive me if I chuckle at the audience at the end. Either it's a polite audience or he's completely baffled them. I give 10/10 to the heckler; the "tech", not so much. The team are obviously having fun and I don't want to piss on that for one minute, but it's hardly a breakthrough in historical terms, as well as in communications technology terms. I wish them well. I feel sorry for people who buy into that expecting to use it to solve a problem, unless it's a Netflix problem AND everyone wants the same.
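As a numeric aside, the cell-splitting arithmetic from earlier (smaller cells reusing the same spectrum means more aggregate capacity over the same area) can be sketched too. The radii and per-cell rates here are illustrative assumptions, not anyone's real figures:

```python
import math

# Sketch of cell splitting: every cell reuses the same spectrum, so the
# aggregate capacity over a fixed service area scales with the number of
# cells. Halving the cell radius quarters the cell area, so roughly 4x
# the cells fit - and 4x the capacity, at lower power per site.

def area_capacity_mbps(cell_radius_km: float, per_cell_mbps: float,
                       service_area_km2: float) -> float:
    """Aggregate capacity: (number of cells in the area) x (per-cell rate)."""
    cell_area_km2 = math.pi * cell_radius_km ** 2
    n_cells = service_area_km2 / cell_area_km2
    return n_cells * per_cell_mbps

# Illustrative: a 100 km^2 town, 50 Mbps per cell.
big_cells = area_capacity_mbps(2.0, 50.0, 100.0)    # 2 km radius cells
small_cells = area_capacity_mbps(0.5, 50.0, 100.0)  # 500 m radius cells

print(round(small_cells / big_cells))  # 16 - quarter the radius, ~16x capacity
```

Which is the whole game, from telegraphy to femtocells: the per-cell rate barely matters next to how many cells you can afford to build and backhaul.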
Enthusiasm for something like that is hard to muster when the only TV-style media I watch are MST3K shows from the nineties. I doubt anyone near me would benefit even if I did rebro them locally at low power. I accept that groups of people tend to want to watch the same things, sometimes at roughly the same times, and that they tend to group closely together. That's a software opportunity for delivery efficiency, certainly not an RF breakthrough.

There is only one 100%, even using "means diverse" as we call it. You cannot provide 101%, nor can nature, without some convenient assumptions. I wish he'd found an unlimited number of universes we could parallel up by using bonding - *that* would help. Actually, I wish he'd found a better compression algorithm for Netflix movies.

PS: Sorry, due to the ranting nature of this I haven't spell-checked it or read it for continuity. Several phone calls interrupted it, so read between the lines and paras to get the sentiment.

--
sent via Gmail web interface, so please excuse my gross neglect of Netiquette
