Re: [Opensim-users] looking for full viewer-server API and communication details.

2019-01-21 Thread Serendipity Seraph
Thanks.

Yeah. My idea concerning UDP was to substitute a websocket to the front
end, with the backend converting to/from whatever the server expects.
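The idea above can be sketched as a small relay: the browser speaks binary
WebSocket messages, the backend speaks SL/OpenSim UDP datagrams, and each
datagram maps to exactly one WebSocket message. This is an illustrative
sketch only; the class and callback names are hypothetical, and the real
network transports (a websocket library plus a UDP socket) are left out so
the forwarding logic stands alone.

```python
class UdpWebSocketBridge:
    """Relay each UDP datagram as one binary WebSocket message, and back.

    The SL protocol is datagram-oriented, so the natural mapping is one
    datagram <-> one binary WebSocket message; no extra length prefix is
    needed because WebSocket messages preserve boundaries.
    """

    def __init__(self, send_to_ws, send_to_sim):
        self._send_to_ws = send_to_ws    # callable(bytes): push to browser
        self._send_to_sim = send_to_sim  # callable(bytes): push to simulator

    def on_udp_datagram(self, data: bytes) -> None:
        # Simulator -> browser: forward the raw packet unchanged.
        self._send_to_ws(data)

    def on_ws_message(self, data: bytes) -> None:
        # Browser -> simulator: forward the raw packet unchanged.
        self._send_to_sim(data)


# Example wiring with plain lists standing in for the two transports:
to_ws, to_sim = [], []
bridge = UdpWebSocketBridge(to_ws.append, to_sim.append)
bridge.on_udp_datagram(b"\x01\x02")   # packet arriving from the simulator
bridge.on_ws_message(b"\x03")         # packet arriving from the browser
```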

On Thu, Jan 17, 2019 at 1:28 PM Dahlia Trimble wrote:

> There's a file that is part of the viewer distribution,
> message_template.msg, which defines the UDP packet layouts. There's some
> minimal documentation on the SL wiki at
> http://wiki.secondlife.com/wiki/Protocol
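As a rough illustration of working with message_template.msg, the sketch
below pulls message names, frequencies, and encodings out of template text.
The SAMPLE string is written from memory of the file's general shape and is
not copied from the real file; consult the copy shipped with the viewer for
the authoritative layout.

```python
import re

# Illustrative stand-in for a fragment of message_template.msg; the real
# file shipped with the viewer is the authoritative source.
SAMPLE = """\
version 2.0
{
\tPacketAck Fixed 0xFFFFFFFB NotTrusted Unencoded
\t{
\t\tPackets Variable
\t\t{\tID\tU32\t}
\t}
}
{
\tAgentUpdate High 4 NotTrusted Zerocoded
\t{
\t\tAgentData Single
\t\t{\tAgentID\tLLUUID\t}
\t}
}
"""

# Each message header line has the shape:
#   <Name> <Frequency> <Id> <Trust> <Encoding>
HEADER = re.compile(
    r"^\s*(\w+)\s+(Low|Medium|High|Fixed)\s+(\S+)\s+"
    r"(NotTrusted|Trusted)\s+(Unencoded|Zerocoded)\s*$",
    re.MULTILINE,
)

def list_messages(template_text):
    """Return (name, frequency, encoding) for each message definition."""
    return [(m.group(1), m.group(2), m.group(5))
            for m in HEADER.finditer(template_text)]
```

Running `list_messages(SAMPLE)` yields one tuple per message definition,
which is enough to see which packets are zerocoded before dealing with the
block and field layouts inside them.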
>
> libopenmetaverse has a protocol analysis tool: wingridproxy.exe which can
> intercept, display, and modify various messages sent over UDP and HTTP
> between the viewer and the server. It may not be fully updated to the
> latest protocols used in SL but may still work with OpenSimulator.
>
> If you're planning on creating a web-capable viewer, you won't be able to
> use much of these protocols as-is, because web browsers cannot use UDP.
> You'll need to use something else.
>
> On Thu, Jan 17, 2019 at 11:40 AM Adam Frisby wrote:
>
> > Yeah - I don't know if anyone has done it since my time, but yes - there
> > was very little documentation of it.
> >
> > If my memory serves me correctly, there was a packet description file
> > shipped with the viewer, which we then compiled into C# classes (either
> > automated or by hand, I forget) which went into libOMV.
> >
> > Unfortunately, I think a lot of the real knowledge was on IRC, which has
> > been lost to time. Broadly speaking, much of the protocol makes intuitive
> > sense; two things stick in my memory, though, as having been challenging
> > to discover.
> >
> > The first was the packet acknowledgement system, which relies on a lot
> > of arcane timing to work correctly; the second was their custom RLE
> > scheme called ZLE, which is RLE applied to zero values only.
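A standalone sketch of the zerocoding (ZLE) scheme described above:
run-length encoding applied only to zero bytes, where a run of zeros is
replaced by 0x00 followed by a count byte. This follows the description in
the thread and the general shape of the SL wire format, but it is
illustrative, not a verified reimplementation of the viewer's codec.

```python
def zero_encode(data: bytes) -> bytes:
    """Compress runs of zero bytes into 0x00 + count pairs."""
    out = bytearray()
    i = 0
    while i < len(data):
        if data[i] == 0:
            # Count the run of zeros, splitting runs longer than 255
            # into multiple 0x00 + count pairs.
            run = 0
            while i < len(data) and data[i] == 0 and run < 255:
                run += 1
                i += 1
            out += bytes([0, run])
        else:
            out.append(data[i])
            i += 1
    return bytes(out)


def zero_decode(data: bytes) -> bytes:
    """Expand 0x00 + count pairs back into runs of zero bytes."""
    out = bytearray()
    i = 0
    while i < len(data):
        if data[i] == 0:
            # 0x00 is always followed by the number of zeros to emit.
            out += b"\x00" * data[i + 1]
            i += 2
        else:
            out.append(data[i])
            i += 1
    return bytes(out)
```

The scheme pays off because many SL packet bodies are sparsely populated
structs, so long zero runs are common; the worst case (a lone zero byte)
doubles to two bytes.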
> >
> > That said, if you're really serious about redeveloping the protocol,
> > tweaking this one is not a good idea. There are a lot of good free
> > options - RakNet is now free and open source, for example, and is
> > behind a *lot* of games and MMOs.
> >
> > Netcode is hard. Flee in terror.
> >
> > Adam
> >
> > -----Original Message-----
> > From: opensim-users-boun...@opensimulator.org <
> > opensim-users-boun...@opensimulator.org> On Behalf Of Marcus Llewellyn
> > Sent: Thursday, 17 January 2019 12:00 PM
> > To: opensim-users@opensimulator.org
> > Subject: Re: [Opensim-users] looking for full viewer-server API and
> > communication details.
> >
> > To my knowledge, the protocol has never really been documented. Those
> > who knew it well have either moved on to other pursuits or (sadly)
> > passed away.
> >
> > In this case, a good place to look is at libopenmetaverse. It is a C#
> > implementation of the protocol (and other things), and in this case the
> > code and some samples might serve as documentation of a sort. You can
> find
> > it at the following link:
> > https://github.com/openmetaversefoundation/libopenmetaverse
> >
> > On Jan 16 2019, at 5:16 pm, Serendipity Seraph wrote:
> > > I looked briefly at code for the Singularity Viewer but it was not
> > > obvious what the API calls and information flows are in their
> > > fullness. What document defines the full API and communication details
> > > between opensim/SL clients and servers? I have looked via Google a
> > > few times without much certainty I have found the right stuff.
> > >
> > > Thanks!
> > > _______________________________________________
> > > Opensim-users mailing list
> > > Opensim-users@opensimulator.org
> > > http://opensimulator.org/cgi-bin/mailman/listinfo/opensim-users
> > >


[Opensim-users] CFP: Special issue on "Innovations in XR & Immersive Technologies for Learning"

2019-01-21 Thread Lee, Mark
+++

Call for Papers for a Special Issue of the Springer journal
*Virtual Reality* on

INNOVATIONS IN XR AND IMMERSIVE TECHNOLOGIES FOR LEARNING


Guest Editors: Mark J. W. Lee, Minjuan Wang, and Dennis E. Beck 
for the Immersive Learning Research Network

ISSN: 1359-4338 (Print), 1434-9957 (Online)

http://link.springer.com/journal/10055

SCI Impact Factor: 1.375 (2017)

+++

[A PDF version of this Call is available at https://bit.ly/2vdyKEL ]

XR refers to technology-mediated experiences that combine digital and 
biological realities. Technologies supporting the creation of XR encompass a 
wide range of hardware and software, including sensory interfaces, 
applications, and infrastructures, that enable content creation for virtual 
reality (VR), mixed reality (MR), augmented reality (AR), cinematic reality 
(CR), 360-degree video, and more. With these tools, users generate new forms of 
reality by bringing digital objects into the physical world and/or bringing 
physical world objects into the digital world. XR technologies have 
applications in all sectors of education and training, from early schooling 
through to higher education, workforce development, and lifelong learning.

This special issue of the Springer journal Virtual Reality is being created in 
conjunction with a special track on “XR and Immersive Learning Environments” at 
the IEEE TALE 2018 Conference (http://www.tale2018.org/xr), co-organized with 
the Immersive Learning Research Network (iLRN at http://www.immersivelrn.org/). 
Both the special issue and special track focus on the use of XR for creating 
environments and experiences that excite, inspire, and engage learners in 
immersive ways. Of interest are reports of both research studies and 
applications covering the entire spectrum of immersive platform types, 
including desktop, mobile, wearable and room-based (e.g., CAVE). 
Interdisciplinary contributions are especially welcome, and authors are 
encouraged to think creatively in terms of how they might frame their work to 
accommodate different conceptions of and perspectives on immersion.

Most importantly, in order to be considered for publication in the special 
issue, papers must demonstrate a potential to help advance research and/or 
practice in the field of XR from a technical, theoretical/conceptual, 
empirical, and/or methodological perspective. Papers that engage deeply with 
the implications for the broader XR field arising from the work will be given 
higher priority, while papers focusing largely on reporting applications of the 
technology within educational/learning contexts are also welcome but will 
receive lower priority.


POTENTIAL TOPIC AREAS

1. Pedagogy and learning design for XR and immersive environments
2. Technical infrastructure and standards for supporting XR and immersive 
learning
3. XR and immersive technologies in early childhood and K-12 education
4. XR and immersive technologies in higher education
5. XR and immersive technologies in vocational/workplace training
6. XR and immersive technologies in informal and lifelong learning
7. Collaborative learning (co-located or distributed) with XR and immersive 
technologies
8. Simulation-based learning with XR and immersive technologies
9. Intelligent, adaptive, and personalized learning in XR and immersive 
environments 
10. Serious games for learning based on XR and immersive technologies
11. Promoting access and equity in education through XR and immersive 
technologies


PAPER TYPES

- Original research paper*
- Theoretical/conceptual paper
- Position paper

*Submission of data with manuscripts is encouraged, but not required. Also, 
the inclusion of links to online locations from which the XR-based immersive 
environments may be accessed and/or downloaded is strongly encouraged.


MANUSCRIPT PREPARATION AND SUBMISSION

For author guidelines and submission instructions, please see the journal’s web 
site at https://www.springer.com/computer/image+processing/journal/10055 . 
Manuscripts will not be accepted via email.


IMPORTANT DATES

- Full manuscripts due: February 22, 2019 (EXTENDED)
- Notification of review outcomes: May 3, 2019
- Revised manuscripts due: June 28, 2019
- Anticipated publication of special issue: Late 2019

Though not mandatory, prospective authors for the special issue may wish to 
submit shorter, preliminary versions of their papers for presentation at TALE 
2018, with a view to further developing and expanding those papers for 
consideration for the special issue (subject to additional peer review in 
accordance with the journal’s policies). For those interested in this option, 
the Call for Papers for the TALE 2018 special track, including applicable 
guidelines and dates, can be found at https://www.tale2018.org/xr .


GUEST EDITORS

- Mark J. W. Lee – Adjunct Senior Lecturer, School of Education,