Dear authors,

Your work is excellent; here are some comments.

Extending ALTO to cellular networks to support context-based services, as the 
draft proposes, is innovative and significant. In my view, Use Case 1 may face 
several challenges before it can be applied, but Use Case 2 makes a lot of 
sense for future mobile networks.

In Use Case 1, the challenges of bringing ALTO into the wireless world can be 
summarized as follows.
1. The RF cost is hard to measure for real-time services, because considering 
only the load is not enough. The wireless channel experiences both large-scale 
and small-scale fading; small-scale fading is caused by multi-path effects 
(which make the channel frequency-selective) and Doppler effects. Transmission 
performance varies across frequency points, which is why cellular networks 
adopt per-resource-block (per-RB) scheduling. For delay-sensitive data, 
cell-selection decisions based on the cost map may therefore not be effective.
2. With regard to the “unattended data” (UD), my understanding is that it is 
tailored for “context-aware” transmission, an important topic for the next 
generation of mobile networks to realize a “user-centric” network. That is 
good, but I doubt the amount of UD is as large as you think. Moreover, the data 
is not mandatory and is delay-tolerant, so using another cell to transmit it 
seems unnecessary.
3. The signaling overhead of applying ALTO seems considerable. Such data should 
belong to the control plane and be transmitted in the RRC (Radio Resource 
Control) container. Only RRC establishment and re-configuration trigger this 
transmission, and control-plane signals are designed to be as short and 
reliable as possible. But considering the data size of the network map 
request/response and the update interval, the overhead may be relatively high.
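To make point 3 concrete, here is a back-of-envelope estimate of the extra 
control-plane load from carrying ALTO map exchanges over RRC. All figures 
below (message sizes, refresh interval, UE count) are my own illustrative 
assumptions, not numbers from the draft:

```python
# Rough estimate of per-cell control-plane load if each UE periodically
# fetches an ALTO map over RRC. All constants are assumed values.

MAP_REQUEST_BYTES = 200      # assumed size of one map request
MAP_RESPONSE_BYTES = 2_000   # assumed size of one network/cost map response
UPDATE_INTERVAL_S = 10       # assumed map refresh interval per UE
UES_PER_CELL = 200           # assumed number of connected UEs in a cell

# Bits per second of signaling generated by a single UE's refresh cycle.
per_ue_bps = (MAP_REQUEST_BYTES + MAP_RESPONSE_BYTES) * 8 / UPDATE_INTERVAL_S
cell_bps = per_ue_bps * UES_PER_CELL

print(f"per-UE overhead:   {per_ue_bps:.0f} bit/s")
print(f"per-cell overhead: {cell_bps / 1000:.0f} kbit/s")
```

Even under these modest assumptions the per-cell signaling runs to hundreds of 
kbit/s, which is large for control-plane traffic that is meant to be short and 
reliable.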

In Use Case 2, the access-aware endpoint selection is more reasonable. The 
transport (backbone) network and the Core Network (CN) are more stable than 
the Radio Access Network (RAN), and they are more likely to be the bottleneck 
in today’s mobile networks. For example, with WiFi and LTE, when the RAN 
conditions are similar, the QoE largely depends on the selection of the access 
gateway to the PDN, especially for video-streaming services. The only 
challenge here is the technique for “Multi-RAT” coordination, which is also a 
major direction in 5G.
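The selection itself is simple once costs are available. Here is a minimal 
sketch in the style of the ALTO Endpoint Cost Service (RFC 7285), where the 
gateway endpoint addresses and cost values are invented for illustration:

```python
# Access-aware gateway selection: pick the PDN gateway endpoint with the
# lowest ALTO routing cost. Endpoint names and costs are hypothetical.

def pick_gateway(endpoint_costs: dict) -> str:
    """Return the endpoint (gateway) with the lowest numerical cost."""
    return min(endpoint_costs, key=endpoint_costs.get)

# Hypothetical numerical "routingcost" values an ALTO server might return
# for two candidate gateways (documentation IP ranges used here).
costs = {
    "ipv4:192.0.2.10": 12.0,   # gateway reached via the LTE path
    "ipv4:198.51.100.7": 5.0,  # gateway reached via the WiFi offload path
}

print(pick_gateway(costs))  # ipv4:198.51.100.7
```

The hard part is not this comparison but coordinating the two RATs around it, 
as noted above.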

Issues:
1. In the 1st paragraph of Section 2.1, when describing that a LAOC can be 
associated with several cells, it should be stated that the control plane of 
the LAOC is anchored at the serving cell, and that all signaling is 
transmitted via the serving cell.
2. The 1st paragraph of Section 2.1 easily brings to mind “Dual Connectivity”, 
which is supported in 3GPP and allows a UE to transmit data with two cells at 
the same time, specifically a macro cell and a small cell. But this draft 
seems to describe multi-connectivity, which I do not think 3GPP supports.
3. In the 2nd paragraph of Section 2.2, the words “serving PGW” may confuse 
readers with the other entity, the SGW. The SGW (Serving Gateway) terminates 
the interface towards the E-UTRAN; for each UE attached to the EPS, there is a 
single Serving GW at any given point in time. The PGW terminates the SGi 
interface towards the PDN; if a UE accesses multiple PDNs, there may be more 
than one PGW for that UE. I assume the authors meant PGW, but the phrase 
“serving PGW” may cause misunderstanding.
4. In Section 2.2, a UE can dynamically choose WiFi or LTE for transmission 
according to the endpoint cost. One important element the authors need to 
address is handover, i.e., how to execute the handover and how to ensure data 
is not dropped while it is in progress.
5. In the example of Use Case 1, I’m curious about the data size of the cost 
map, i.e., roughly how many bits it would be.
6. In the example of Use Case 2, I’m curious how the cost values of WiFi and 
LTE are calculated. They are two RATs, and each includes both a RAN and a 
backbone network.
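To illustrate what I am asking in points 5 and 6: an ALTO cost map is JSON 
(RFC 7285), so its wire size is easy to measure for a given number of PIDs, 
and one plausible (assumed, not from the draft) way to form a per-RAT cost is 
a weighted sum of a RAN component and a backbone component. PIDs, weights, and 
cost values below are invented:

```python
import json

# (a) Measure the serialized size of a small hypothetical cost map.
cost_map = {
    "meta": {"cost-type": {"cost-mode": "numerical",
                           "cost-metric": "routingcost"}},
    "cost-map": {
        "cell-1": {"cell-1": 1, "cell-2": 5, "cell-3": 10},
        "cell-2": {"cell-1": 5, "cell-2": 1, "cell-3": 6},
        "cell-3": {"cell-1": 10, "cell-2": 6, "cell-3": 1},
    },
}
wire_bytes = len(json.dumps(cost_map).encode())
print(f"3x3 cost map ~ {wire_bytes} bytes")  # grows O(n^2) with the PID count

# (b) One assumed model for a per-RAT cost: weighted RAN + backbone terms.
def rat_cost(ran_cost: float, backbone_cost: float,
             ran_weight: float = 0.5) -> float:
    """Weighted sum of RAN and backbone costs (an assumed model)."""
    return ran_weight * ran_cost + (1 - ran_weight) * backbone_cost

print(rat_cost(ran_cost=4.0, backbone_cost=8.0))  # 6.0
```

Even this toy map is a few hundred bytes; with many cells the full map grows 
quadratically, which connects back to my overhead concern in point 3 above.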

Suggestion:
The concept of the cost map is kept. However, based on the map, the decision 
is made by the base station or the ALTO server instead of the UE.
In the draft, the procedure can be described as:
UE-->base station (query for the map)
base station-->UE (return the map)
UE-->base station (make the association decision)
base station-->UE (resource allocation signal for scheduling in PDCCH/PUCCH)
UE<-->base station (transmit data, both uplink and downlink)
My suggestion is to move the decision-maker from the UE to the base station: 
the base station (or ALTO server) just returns the final association result, 
and then data transmission can start. The procedure becomes:
UE-->base station (query for the map)
base station-->UE (make decision according to the map, then resource allocation 
signal for scheduling in PDCCH/PUCCH)
UE<-->base station (transmit data, both uplink and downlink)
This has two advantages. First, it reduces the overhead and signaling needed 
to realize ALTO; second, it reduces the processing load on the UE.
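The shortened procedure above can be sketched as follows; the cost values, 
cell names, and function shape are my own illustration of the idea, not an 
interface from the draft:

```python
# Network-side decision sketch: the base station (or ALTO server) holds
# the cost map and returns only the association result, so the map itself
# never crosses the air interface. All names and values are illustrative.

COST_MAP = {"cell-1": 7.0, "cell-2": 3.0, "cell-3": 9.0}  # assumed cell costs

def handle_query(ue_id: str, candidate_cells: list) -> str:
    """Base-station side: decide the association instead of the UE."""
    return min(candidate_cells, key=lambda c: COST_MAP[c])

# UE --> base station: query (only the candidate list, no map download)
# base station --> UE: final association result + scheduling grant
print(handle_query("ue-42", ["cell-1", "cell-2"]))  # cell-2
```

Note that the UE sends only its candidate list upward; compared with the 
map-download procedure, the large map response and the UE-side decision step 
both disappear.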

Hope these are helpful.

Best, Geng
Computer Science, Yale


_______________________________________________
alto mailing list
[email protected]
https://www.ietf.org/mailman/listinfo/alto
