Whilst I only know what I have learned from watching a couple of the vids on 
the Artemis web site, I think we can say for sure that they aren’t proposing 
lots of smaller and smaller cells oriented around more and more transmitters; 
nor is there any suggestion of P2P.

In essence their technology seems to revolve around using software defined 
radios to dynamically create radio interference patterns by sending low power 
signals from multiple aerials. The interference pattern can be targeted so that 
there is a small area of constructive interference that exactly coincides with 
where the receiving radio is. This targeting is automatically tuned based on 
picking up the transmitted signal from the remote device – iPhones in their 
example videos. This whole system requires a bunch of real time processing on 
the back end using stacks of Linux servers; apparently this scales linearly, 
so they can increase capacity by simply adding more tin.
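For the curious, the interference-targeting trick can be sketched in a few 
lines of numpy. This is a toy 2-D model with invented positions and free-space 
phase only – nothing from Artemis's actual system: phase-conjugating each 
transmitter against its distance to the target makes all the carriers add up 
coherently at that one spot, while any other point just sees an incoherent 
jumble.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D model: 8 transmitters, one target receiver, one bystander point.
# All figures invented; unit amplitudes, free-space phase only.
wavelength = 0.12           # metres, roughly 2.5 GHz
k = 2 * np.pi / wavelength  # wavenumber
txs = rng.uniform(0, 50, size=(8, 2))  # transmitter positions (m)
target = np.array([25.0, 25.0])
bystander = np.array([10.0, 40.0])

def field_at(point, phases):
    """Magnitude of the summed unit-amplitude carriers at `point`."""
    d = np.linalg.norm(txs - point, axis=1)
    return np.abs(np.sum(np.exp(1j * (k * d + phases))))

# Phase-conjugate each transmitter against its distance to the target,
# so every carrier arrives there exactly in phase.
phases = -k * np.linalg.norm(txs - target, axis=1)

print(field_at(target, phases))     # 8.0: fully constructive
print(field_at(bystander, phases))  # much smaller: incoherent sum
```

The targeting loop in the real system presumably keeps those phases tuned 
from the handset's own uplink transmissions, as per their videos.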

What seems particularly interesting is that they say they can independently 
transmit to multiple devices at full bandwidth over the same piece of spectrum 
simultaneously using this effect. In addition, because individual devices 
don't contend for spectrum with each other, they can all talk different 
protocols without causing each other issues.
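That "full bandwidth per device over the same spectrum" claim is essentially 
spatial multiplexing. A toy sketch (random invented channel matrix, no noise, 
nothing specific to pCell): with more transmit antennas than users, a 
zero-forcing precoder can hand each user its own symbol with no cross-talk.

```python
import numpy as np

rng = np.random.default_rng(1)

n_tx, n_users = 8, 3
# Invented complex channel matrix: H[u, a] is the path gain from
# transmit antenna a to user u.
H = (rng.standard_normal((n_users, n_tx))
     + 1j * rng.standard_normal((n_users, n_tx))) / np.sqrt(2)

# Zero-forcing precoder: the right pseudo-inverse of H, so H @ W == I.
W = np.linalg.pinv(H)

symbols = np.array([1 + 1j, -1 + 1j, 1 - 1j])  # one symbol per user
received = H @ (W @ symbols)  # what each user hears; same band, same time

# Each user recovers only its own symbol -- no cross-talk.
print(np.round(received, 6))
```

Because each user's stream is spatially isolated, nothing stops stream 1 
being LTE and stream 2 being something else entirely, which matches their 
different-protocols claim.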

There isn’t much talk about how much bandwidth the devices get for talking back 
and how that works, so that may be a little more of a traditional problem, but 
there is a comment that the individual devices need less power to send and 
consequently get better battery life, which suggests that they are also able to 
dynamically focus the receiving antennas as part of the process.

So not breaking the laws of physics but certainly working around the 
limitations we are used to dealing with in innovative ways, and if all of the 
above is true then I think they are right when they say it is a revolutionary 
use of radio.

From: uknof [mailto:[email protected]] On Behalf Of Gord Slater
Sent: 09 December 2014 16:27
To: [email protected]
Cc: [email protected]; Richard Halfpenny
Subject: Re: [uknof] High Density Wifi



On 9 December 2014 at 14:10, Christian de Larrinaga <[email protected]> wrote:
Hi Gord,

http://www.artemis.com/pcell
Be interested to learn if anybody has got under the veil of this one?

I haven't seen that before, but I can guess, having skipped through the 
"tech" vid in a minute or less. What follows is a common rant, tailored to the 
"wow technology!" being pushed|sold|waved-about|linked-to ....


Instead of quite a lot of radio station sites oriented in cells, which works OK 
until capacity is reached in that service area, they will use lots of little 
radio stations, effectively forming smaller and smaller cells. Nowt new - some 
cells are tiny, like femtocells. Some users use P2P methods if they want the 
same things that others do. Nowt new - BitTorrent or VHS copy sharing are 
examples. To share effectively means proximity, awareness and similar needs.

Every single radio station of any form that had capacity or throughput issues 
has used a similar technique since day one. Not all had good marketing.

Reduce the range and you improve co-channel interference (as well as the side 
effect of reduced adjacent channel interference due to lower EIRPs, both wanted 
and spurious). As long as each new small, local site has not hit maximum 
capacity for the required users in that service area, everything is sweet.
As soon as a limit is reached, you have to reduce the service areas, with 
reduced powers (which doesn't always mean reduced performance as you might 
think), by adding more cells. The process repeats until someone has a good 
idea. Limit hit, add more sites or upgrade tech to faster or wider or 
$something_normal.
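The EIRP side of that can be put in rough numbers. A sketch using a generic 
power-law path-loss model (all figures invented, nothing site-specific): 
halving the cell radius lets you cut the EIRP needed to serve the cell edge by 
10·n·log10(2) dB, and the interference leaking into a distant co-channel cell 
drops by the same amount.

```python
import math

# Generic power-law path-loss model; all figures invented.
n = 3.5  # urban-ish path-loss exponent

def eirp_needed_dbm(radius_m, p_rx_edge_dbm=-90.0, pl_1m_db=40.0):
    """EIRP so a cell-edge user at radius_m still sees p_rx_edge_dbm."""
    path_loss_db = pl_1m_db + 10 * n * math.log10(radius_m)
    return p_rx_edge_dbm + path_loss_db

big, small = eirp_needed_dbm(500), eirp_needed_dbm(250)
print(round(big - small, 1))  # 10.5 dB less EIRP for half the radius
```

Classic cell-splitting arithmetic: more sites, less power each, same edge 
service, less muck sprayed at the neighbours.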

Every now and again, someone has a good idea. Some of these need marketing, 
some gain traction on their own merits, some are obvious as soon as you see 
them. Some are purely theoretical and may take decades and several generations 
of kit to be practical.

In the days of morse and trained telegraphists, a station could hit its 
capacity in various ways. Staffing limits (morse training takes time and good 
fists take decades to become good), lack of telegraphy equipment and links, 
power and cooling constraints (watercooled sub-megawatt stations existed in 
many countries), paperwork/procedural inefficiencies, weather and climatic 
problems. Originally, none of these were limitations of the "ether", as the 
transmission medium was called. Eventually, there were so many stations that 
more and more frequencies, or groups of them, known as "bands", needed to be 
used, as well as some form of planning to make sure that they could co-exist 
without saturating the medium. Lower power, shorter range stations were 
introduced, in a similar manner to nano-cells. They were shorter range not 
really by design but by chance: unused bands that were less suitable to long 
range working could be utilised for shorter range use at certain times of the 
day.

At this point in history, we have frequency division (different frequencies and 
bands), time division (operators allowing other traffic on a time-sharing 
basis), range-limiting (where long range working wasn't needed), backhaul 
limitations (onward transmission by telegraph cables), procedural optimisation 
(streamlining message handling), beamforming (using one or more directional 
antennae and exploiting them actively, often by human intervention).
At this point in the tale, we're looking at the late 1920s and early 1930s.

In the remaining 90 years, throughput speeds have increased, equipment has 
become more portable, cheaper in relative terms for the end-user (arguable), 
all-pervasive and an expected part of everyday life.

The problems of the '20s and '30s are the same today.
User expectations, which should be high, are in effect destroyed by bad 
experience of mobile communications especially in busy areas. Cells are large 
due to lack of investment and forward thinking. Backhaul is a problem, as it is 
everywhere, until you have enough. If "enough" backhaul can't scale up when 
demand goes up, you hit the problems when the limit is reached.

The actual radio medium (the "ether") hasn't changed; the laws of physics are 
the same as they were 100 years ago, and probably have always been similar 
to what we see today (or think we see). We (RF people as a subset of tech 
society) may understand them better, that's all. The obvious solution to most 
of the perceived problems is to use diverse backhaul (install femtocells on 
xDSL all over the place), or massively diverse backhaul (use technologies like 
BitTorrent or similar P2P), to provide more and more sites (cells) to provide 
the service to the users.

Nearly 90 years ago the same thing happened:-

More radio stations were installed where needed to provide capacity as needed, 
not just fill in gaps in coverage.
..and..
Radio ops would share the weather and traffic lists and even rebroadcast it to 
stations further away or with less advanced equipment, using diverse methods.


That makes 3 assumptions - customers had to be willing to pay, companies 
wouldn't overspend or under-provide and that weather was wanted by many people.

The selling point of this pCell $product|concept seems to be based on the 
assumption that everyone wants the same Netflix movie at roughly the same time. 
It claims to be able to provide 100% capacity to every user. As long as they 
want the same Netflix movie.

I'm not convinced that the speaker knows the physics or understands the No1 
problem he's creating for the customer - if user 1 wants to watch porn and user 
2 doesn't, then user 3 had better have the porn that user 2 wants or the 100% 
claim is bullshit. You can only have 100% once, not twice, if the content is 
different, in a given spectrum and coverage area. Yes, the coverage areas are 
small (probably near-field). Yes, it's new to him. It's possibly new to the 
audience. There are many parallels to what he|they have claimed to 
invent|find|do.
Hell, the POTS telephone system itself is a possible analogy back to the days 
of dial-a-disc and speaking clock. Limited content in the case of dial-a-disc 
or speaking clock. Shared information at the end of the link (you set your 
watch by it and could tell people who asked you the time thereafter). Backhaul 
and coverage were extended until almost every building in the country was 
served by a local dissemination station that provided the service. Humans 
provided the
P2P bit at the end. Near-field, local comms, were by acoustic dissemination 
between people who wanted the information/message/content. Service area is 
limited by the laws of physics (transmission lines, EMC issues, power). You can 
optimise the exchanges to improve speed (STD – subscriber trunk dialling).

For datacomms over RF we have come so far as a society from those early days 
that the users are so far removed from the technology that they can't 
understand the physics or identify the limiting factors in their troubles.
It's an exciting technology in the same way that blue smarties were exciting. 
They are still smarties and are fucking chocolate inside. There are only so 
many in a box. If you want, you could cut them up and move your mouth closer to 
eat them. You could pop one in your mouth and share with your neighbour if they 
want one. Smarties can't be duplicated. You cannot break the laws of physics, 
or smarties.

Forgive me if I chuckle at the audience at the end. Either it's a polite 
audience or he's completely baffled them. I give 10/10 to the heckler, the 
"tech" not so much.

The team are obviously having fun and I don't want to piss on that for one 
minute, but it's hardly a breakthrough in historical terms as well as 
communications technology terms. I wish them well. I feel sorry for people who 
buy into that expecting to use it to solve a problem, unless it's a Netflix 
problem AND everyone wants the same.
Enthusiasm for something like that is hard to muster when the only TV-style 
media I watch are MST3K shows from the nineties. I doubt anyone nearby me would 
benefit even if I did rebro them locally at low power. I accept that groups of 
people tend to want to watch the same things, sometimes at roughly the same 
times. They tend to group closely together. That's a software opportunity for 
delivery efficiency, certainly not an RF breakthrough.

There is only one 100%, even using "means diverse" as we call it. You cannot 
provide 101%, nor can nature, without some convenient assumptions. I wish he'd 
found an unlimited number of universes we could parallel up by using bonding, 
*that* would help. Actually, I wish he'd found a better compression algorithm 
for Netflix movies.


PS: Sorry, due to ranting nature of this I haven't spell-checked it or read for 
continuity. Several phone calls interrupted it so read between the lines and 
paras to get the sentiment.

--
sent via Gmail web interface, so please excuse my gross neglect of Netiquette
