Seems like we have a similar goal, Amaury! Cool :)

> On Mon, Jul 14, 2014 at 03:52:42AM -0700, Amaury Hernández Águila wrote:
>> Yeah that would be nice. So, isn't that a good reason to have websockets
>> in
>> PocoLisp?
> I would not say so. In a video game you have so much continuous
> communication going on (most notably the stream of image frames), that
> you don't need an extra channel.
> --

Online games don't work by feeding single frames to clients. There are now
a few companies trying to make this work - video streaming for games -
but it doesn't run smoothly, as the amount of data to transfer when
rendering images on the server is just too big to achieve reasonable
latency for real-time games (as opposed to turn-based games).

The continuous communication in an online game is chat and game-logic
messages: user input going up to the server, and the results of those
inputs, together with the results of other agents' behaviour, going down
to the clients. Here performance is essential, as the reaction time (time
from input to result) for one client is not determined by that client's
own connection, but by the worst connection (highest latency) among all
the players in the same session/environment.
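To make that concrete, here is a minimal sketch (my own illustration, not anything from the thread) of a lockstep model, where the simulation tick cannot advance until every player's input has arrived, so the highest-latency connection dictates the delay for everyone:

```python
# Hypothetical lockstep illustration; latencies are in milliseconds.
# The server waits for *all* inputs before it can advance the tick,
# so one bad connection slows down the whole session.

def lockstep_tick_delay(latencies_ms):
    """Time until the server has every player's input for this tick."""
    return max(latencies_ms)

players = {"alice": 30, "bob": 45, "carol": 180}  # carol is on a bad link
print(lockstep_tick_delay(players.values()))  # 180 - everyone waits for carol
```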

--- non-essential game dev stuff ---

There are numerous ways to solve this, mostly boiling down to predicting
agents' moves on every client (and on the server, if you want to prevent
cheating) based on past agent behaviour ("Y started moving to the right,
so it is probably at position X by now"), and correcting when the actual
actions differ from the predicted model (e.g. the player's position gets
forcibly reset to the correct one; such corrections are experienced as
"lag").
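The predict-then-correct idea can be sketched roughly like this (a toy 1D example with hypothetical names, not an actual engine API):

```python
# Dead reckoning with server correction (illustrative sketch).
# The client extrapolates an agent's position from its last known
# velocity; when an authoritative server update disagrees too much,
# the position is snapped back, which the player experiences as "lag".

def predict(last_pos, velocity, dt):
    """Extrapolate position assuming the agent kept moving as before."""
    return last_pos + velocity * dt

def reconcile(predicted, authoritative, threshold=0.5):
    """Keep the prediction if it is close enough, else snap to the server."""
    if abs(predicted - authoritative) > threshold:
        return authoritative  # forcible reset: the visible "lag" jump
    return predicted

pos = predict(last_pos=10.0, velocity=2.0, dt=0.5)  # client guesses 11.0
pos = reconcile(pos, authoritative=11.3)            # close enough: keep 11.0
```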

Still, the basic statement holds: the worst client connection in an
online game influences the user experience of all users participating in
the same session. This is not a problem for every game, as not all have
game mechanics involving many players in the same area.

--- end of non-essential game dev stuff ---

The point is, it makes sense to optimize such things for certain cases.
This also works the other way around: one shouldn't apply such
optimization tools to cases that don't need them.
When different users have different limitations, the right answer is not
to limit all users to the most restrictive constraints. That amounts to
artificially minimizing quality.

I believe this is what is wrong with current web development: the "one
size fits all" approach is sometimes just stupid; see the rise of
"responsive design" client frameworks for websites. User agents usually
inform the server about their identity, so the server knows you are
connecting with the w3m text browser, and it should simply not attempt to
make you establish a websocket connection! It could even stop sending you
images, and instead serve you descriptions or send them as ASCII art;
that would save bandwidth, too.
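A rough sketch of that server-side idea (the header check and feature names are my own illustration, not a real framework API): inspect the User-Agent header and skip features a text browser cannot use.

```python
# Hypothetical content negotiation based on the User-Agent header.
# A text browser gets no websocket handshake and ASCII art instead
# of binary images; everyone else gets the full feature set.

TEXT_BROWSERS = ("w3m", "lynx", "links")

def features_for(user_agent):
    """Pick a feature set based on the announced user agent."""
    ua = user_agent.lower()
    if any(name in ua for name in TEXT_BROWSERS):
        return {"websockets": False, "images": "ascii-art"}
    return {"websockets": True, "images": "binary"}

print(features_for("w3m/0.5.3"))  # a text browser: no websockets
```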

One just has to be clear that developer time isn't what's being optimized here.

But user-experience-wise it might be worthwhile. The line blurs into
premature optimization here, but being able to scale, and even to work
over a bad mobile internet connection, could be a real advantage.
Also, saving information overhead saves energy, so in the end it's at
least beneficial for the environment. :P

And soon, when smartphones are rolled out across all of Africa, it would
be really cool to be able to interact with those guys on really bad
connections out in the wild, too.
