Hi guys,

Well, I'm having some trouble with the playback implementation. What I had
in mind was getting a copy of the deltas from the DeltaStoreBasedSnapshotStore
that WaveletContainerImpl uses, creating a new temporary wavelet to which the
deltas would be applied, and then rendering it gradually. This would give a
very simple playback that only supports a "forward approach": it would go from
the very start of the wave to the last change made, step by step (for testing
purposes, of course).
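To make the idea concrete, here is a minimal, self-contained sketch of that
forward-only replay. To be clear, ToyDelta and PlaybackWavelet are simplified
stand-ins I made up for illustration, not the real WIAB classes; the point is
only that the playback wavelet applies deltas in memory and never writes
anything back to a store.

```java
import java.util.ArrayList;
import java.util.List;

public class PlaybackSketch {

    /** A toy delta: insert `text` at `position`. Real deltas carry
     *  arbitrary document operations. */
    static final class ToyDelta {
        final int position;
        final String text;
        ToyDelta(int position, String text) {
            this.position = position;
            this.text = text;
        }
    }

    /** A throwaway wavelet state used only for playback. */
    static final class PlaybackWavelet {
        private final StringBuilder content = new StringBuilder();

        /** Apply one delta in memory. Unlike the real wavelet container,
         *  this never persists a DeltaRecord back to the store. */
        void apply(ToyDelta d) {
            content.insert(d.position, d.text);
        }

        String render() {
            return content.toString();
        }
    }

    /** Forward-only playback: replay the stored delta history step by
     *  step, returning the rendered state after each step. */
    static List<String> playForward(List<ToyDelta> history) {
        PlaybackWavelet wavelet = new PlaybackWavelet();
        List<String> frames = new ArrayList<>();
        for (ToyDelta d : history) {
            wavelet.apply(d);
            frames.add(wavelet.render());
        }
        return frames;
    }
}
```

Since playForward only reads the history it is given, this sidesteps the
persistence question below: nothing is ever stored, so no playback flag or
alternate non-storing methods would be needed for the temporary wavelet
itself.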
But what I found (or think I found) is that when deltas are applied to a newly
created wavelet, the changes "persist": they are stored in DeltaRecords and
added to the DeltaStoreBasedSnapshotStore. That would be redundant, since I
would already be getting the deltas from a store. So, the question is: would
you recommend this approach? If so, I guess I would have to create alternate
methods that don't store the deltas, or add a playback flag that controls
whether or not the deltas are saved to the store.

Another thing is that I still don't understand the rendering process well. The
deltas are stored in a "tree" of some sort, and from what I saw, when changes
are made the client doesn't render the changes, it renders documents; so, at
some point, the changes are applied to document structures and rendered
afterwards. When changes are made, are the deltas responsible for applying the
changes to the document structures? Or are the changes caught, applied, and
then stored in the deltas?

Another approach would be gradually storing snapshots of the wavelet's
documents and displaying them one by one, but perhaps this would create too
much overhead. It would, however, be easier to display things, since we
wouldn't be dealing with deltas in their raw form.

What do you guys think?

Thanks

On Thu, Feb 24, 2011 at 5:42 AM, Paul Thomas <[email protected]> wrote:

> Which is why I put "embedded" in quotes.
>
> You should be able to view them separately. You might have a recipient that
> isn't in the parent wavelet.
>
> Subwavelets are only one possible way of "relating" wavelets and other
> things together. This is a subject that I touched on before but isn't
> really in the remit of WIAB; it is an idea for the future of wave.
>
> How are things related on the web? One of the oldest forms of relation is a
> hyperlink.
> There is no inherent specific relationship between the link and its source;
> it is pseudo-graphable, because there isn't really a true inherent
> relationship there. Backlinking, trackback, pingback, tagging, hashing,
> search engine indexing, and semantics are all attempts to improve on aspects
> of that. Subwavelets are a way of relating wavelets, but they imply that
> there is a definite hierarchy: the wavelets are nested. However, there is no
> reason why you couldn't just relate wavelets in a non-nested way, in a way
> that is inherently graphable. They could be represented in a graph as well
> as in tabs, or whatever.
>
> On top of that, any open-social-type app-doc instance, like with gadgets,
> could be related in a non-embedded form, or represented as both tab and
> inline. Gadgets themselves might not be powerful or versatile enough for
> every type of app-doc instance you would want related. However, OpenSocial
> could still provide the app interface.
>
> At the moment there is only really the conversation model. However, this is
> only one possible model. In a project you communicate in many different
> ways and work in different mediums. All of these things could relate
> together more holistically, and improve project management.
>
> Herein lies a problem: how to distribute new models and the agent code that
> provides the behind-the-scenes functionality? Client-side optimistic
> distributable agents (for example, Caja) can help with general automation
> on existing infrastructure/models that allow it. It could be viable to have
> a declarative language for server-side distributable agents that could do
> the sort of filtering operations that some robots do, but less
> optimistically and faster, with more volume. This could make access
> control, "moderation", and other forms of automation more flexible. It
> doesn't, however, make new capabilities possible.
> However, non-declarative languages would be more of a problem, because most
> don't have the kind of sandboxing capabilities of the likes of App Engine.
>
>
> ----- Original Message ----
> From: Michael MacFadden <[email protected]>
> To: [email protected]
> Sent: Thu, 24 February, 2011 0:14:24
> Subject: Re: is wave playback a priority right now?
>
> Technically, wavelets aren't embedded in one another. All wavelets are
> equal, in that there is no hierarchy of wavelets. The conversational model
> simply puts them into the appropriate place during rendering.
>
> ~Michael
>
> On Feb 23, 2011, at 4:03 PM, Paul Thomas wrote:
>
> > Subwavelets are just wavelets, except they are "embedded" within another
> > wavelet.
> >
> > Like in Google Wave when you want to send a private message within the
> > context of a wave. You could do that with any recipients.
> >
> >
> > ----- Original Message ----
> > From: Thomas Wrobel <[email protected]>
> > To: [email protected]
> > Cc: Sean Wendt <[email protected]>
> > Sent: Wed, 23 February, 2011 20:30:51
> > Subject: Re: is wave playback a priority right now?
> >
> > Would a subwavelet just be, metaphorically, akin to an iFrame?
> >
> > I could see that being pretty useful, but also potentially pretty
> > complex. I mean, would all the members of the overall wave have to
> > also be invited to the subwave to see it?
> >
> > -Thomas
> > arwave.org
> >
> > ~~~~~~
> > Reviews of anything, by anyone;
> > www.rateoholic.co.uk
> > Please try out my new site and give feedback :)
> >
> >
> > On 23 February 2011 21:00, Sean Wendt <[email protected]> wrote:
> >> I haven't heard about subwavelets before. Is it in the whitepapers?
> >>
> >> On Wed, Feb 23, 2011 at 00:21, Paul Thomas <[email protected]> wrote:
> >>
> >>> The simplest would just be to have subwavelets. In that they are two
> >>> wavelets with distinct histories, but one is in the other.
> >>> But regardless, their histories are only interwoven after merging. If
> >>> you are copying over, then essentially that is a snapshot with no
> >>> history. Subwavelets are distinct entities: they exist both separately
> >>> and with one embedded in another.
> >>>
> >>>
> >>> ----- Original Message ----
> >>> From: Sean Wendt <[email protected]>
> >>> To: [email protected]
> >>> Sent: Tue, 22 February, 2011 22:55:35
> >>> Subject: Re: is wave playback a priority right now?
> >>>
> >>> I was contemplating the things said in the "What is Google Wave?" video
> >>> about the monster: email is bad because of the fragmentation of a
> >>> conversation thread, so wave unifies the history and you can invite
> >>> everyone and it is good.
> >>>
> >>> But what if you wanted to apply the same improvement to wave itself and
> >>> merge two waves?
> >>> Is crossposting currently the only option?
> >>> If we made merging possible, would the result have a history-'tree'
> >>> with its root in the present? Otherwise, how would the histories be
> >>> woven together? How would the separate contents of the two waves appear
> >>> after merging?
> >>> Or should merging be done using linking, with a blip as a mountpoint
> >>> inside the wave, which you can then also use to mount userspace
> >>> filesystems from your computer, on the moon, sitting on a pile of
> >>> pizza, made of kryptonite. Ignore that last part.
> >>>
> >>> On Tue, Feb 22, 2011 at 16:41, Juan Antonio Osorio
> >>> <[email protected]> wrote:
> >>>
> >>>> Hi all,
> >>>>
> >>>> We currently need the playback because we are developing a wave-based
> >>>> application with educational purposes. This application is part of
> >>>> research involving "context intelligence", which, in this particular
> >>>> case, is how students interact in a collaborative environment such as
> >>>> the Wave Client.
> >>>> In this application, playback is a key feature, because it gives
> >>>> researchers the possibility of looking at the students' progress
> >>>> throughout various tests, thus being a very important source of
> >>>> information.
> >>>>
> >>>> Students will be solving "database modelling"-like problems and
> >>>> creating ER diagrams (using a gadget that was previously programmed
> >>>> for GWave).
> >>>>
> >>>> What I was thinking for the playback was using the WaveletContainer
> >>>> that's stored locally and applying the deltas (that were stored in the
> >>>> DeltaStoreBasedWaveletState) to a temporary Document that would be
> >>>> created exclusively for playback purposes.
> >>>>
> >>>> We're still not sure how to implement this, but hopefully we'll get a
> >>>> clearer perspective of the code within these weeks. Thanks for the
> >>>> ideas, and any more suggestions are more than welcome.
> >>>>
> >>>> On Tue, Feb 22, 2011 at 8:22 AM, Thomas Wrobel <[email protected]>
> >>>> wrote:
> >>>>
> >>>>> Speaking as a gwave user, the most useful aspect of playback was when
> >>>>> something went wrong: accidental deletion or copying over of content
> >>>>> (usually with a crash too, if the wave was big...). Being able to
> >>>>> revert to a previous version via the playback was the easiest way to
> >>>>> solve the problem.
> >>>>>
> >>>>> Speaking as someone working on an Augmented Reality use case for
> >>>>> wfp... essentially not dealing with text at all, but 3D models placed
> >>>>> and positioned with data stored in blips. The idea of playing back
> >>>>> the whole creation of a 3D scene is very appealing (especially if the
> >>>>> scene was made by a large group collaboration).
> >>>>> So those are the two uses in my mind, and while I have no real
> >>>>> knowledge of the inner workings of the wave client or server beyond
> >>>>> an overview of the protocol, it seems having "key frames" might be
> >>>>> the best compromise between storage and loading speed.
> >>>>>
> >>>>> -Thomas
> >>>>> arwave.org
> >>>>>
> >>>>> ~~~~~~
> >>>>> Reviews of anything, by anyone;
> >>>>> www.rateoholic.co.uk
> >>>>> Please try out my new site and give feedback :)
> >>>>>
> >>>>>
> >>>>> On 22 February 2011 15:06, Paul Thomas <[email protected]> wrote:
> >>>>>> Thanks, that is interesting.
> >>>>>>
> >>>>>> One point of playback is to quickly get updated on what you have
> >>>>>> missed. So you don't really have to have every single change.
> >>>>>>
> >>>>>> It is kind of like flicking through the unread blips, except that
> >>>>>> doesn't have blip-level history. It would be good if you could flick
> >>>>>> through unread changes.
> >>>>>>
> >>>>>> Using history to revert or fork a wave might be used less often, so
> >>>>>> that sort of history doesn't need to be played back smoothly; it
> >>>>>> just needs to be usable.
> >>>>>>
> >>>>>>
> >>>>>> ----- Original Message ----
> >>>>>> From: David Hearnden <[email protected]>
> >>>>>> To: [email protected]
> >>>>>> Sent: Tue, 22 February, 2011 12:43:23
> >>>>>> Subject: Re: is wave playback a priority right now?
> >>>>>>
> >>>>>> Hi Gerardo,
> >>>>>>
> >>>>>> It depends on what kind of playback experience you would like.
> >>>>>>
> >>>>>> In Google Wave, playback does not necessarily play things
> >>>>>> chronologically, but instead can reorder things to make the history
> >>>>>> simpler.
> >>>>>> E.g., if two users A and B are concurrently adding their own
> >>>>>> replies, then the playback history can show A's complete reply as
> >>>>>> one history frame, then B's reply in a subsequent frame, even though
> >>>>>> there was no point in chronological history where A's reply was
> >>>>>> complete and B hadn't started replying... if that makes sense. So
> >>>>>> mild reordering of the operation history in order to make it simpler
> >>>>>> is one complex part of playback.
> >>>>>>
> >>>>>> Another part of playback is grouping segments of history into
> >>>>>> "frames", where the boundaries between frames are historically
> >>>>>> interesting events (starting editing, stopping editing, participants
> >>>>>> being added and removed, etc). Finding a good set of rules to group
> >>>>>> operations into useful frames is another complex part of playback.
> >>>>>>
> >>>>>> Being able to step backwards as well as forwards adds more
> >>>>>> complexity, because of the difference between "reversible" and
> >>>>>> "invertible" ops (the inverse of an invertible op is derivable from
> >>>>>> the op itself; the inverse of a reversible op, however, depends on
> >>>>>> the state to which it is applied).
> >>>>>>
> >>>>>> There are many other cases where adding some improvement to the
> >>>>>> feature can add significant complexity, e.g., efficiently moving
> >>>>>> wave state between two points in history, rather than applying all
> >>>>>> the operations one by one.
> >>>>>>
> >>>>>> So starting out with a simple goal of just playing back the
> >>>>>> operations individually, in order to play forwards through history,
> >>>>>> would be a good start.
> >>>>>> Perhaps adding in some simple framing (no reordering) to group ops
> >>>>>> based on timestamp, so that chunks of edits appear as a single
> >>>>>> frame? I think that would be the start of a reasonably usable
> >>>>>> playback feature. The web client can create a wave model on an empty
> >>>>>> state, stub out the incoming operation stream component
> >>>>>> (MuxConnector) with a new one that's hooked up to some play/pause UI
> >>>>>> control, and fetch the entire operation history from the server,
> >>>>>> putting those ops in the operation stream based on that UI control.
> >>>>>> It will probably be quite slow, and won't scale for waves with big
> >>>>>> history, but it's certainly a great start.
> >>>>>>
> >>>>>> Beyond that, you'd probably want to have a separate endpoint (maybe
> >>>>>> even a separate protocol, rather than the client/server operation
> >>>>>> protocol) for delivering a more compact representation of the
> >>>>>> history to the client. E.g., do some basic framing, and compose the
> >>>>>> ops in each frame together to only a few ops per frame. That will
> >>>>>> significantly reduce the client-side processing, and sounds
> >>>>>> reasonably doable right now.
> >>>>>>
> >>>>>> -Dave
> >>>>>>
> >>>>>> On Tue, Feb 22, 2011 at 6:31 AM, Gerardo Lozano
> >>>>>> <[email protected]> wrote:
> >>>>>>
> >>>>>>> What would be the best way to approach playback implementation?
> >>>>>>>
> >>>>>>> This is what we've got:
> >>>>>>>
> >>>>>>> We've been looking at the code for the past few days now, and we
> >>>>>>> think that a good approach is to somehow get a history of the
> >>>>>>> wavelet deltas (either from memory or from the store) and then
> >>>>>>> either apply each delta (done with or in an instance of
> >>>>>>> WaveletState) or append it (done with or in an instance of
> >>>>>>> WaveletProvider) each time the playback is requested.
> >>>>>>>
> >>>>>>> To us, it seems that the most viable way to implement playback is
> >>>>>>> to get the delta history from the store (with last week's
> >>>>>>> implementation) and then somehow build up from that.
> >>>>>>>
> >>>>>>> What would you guys recommend doing?
> >>>>>>>
> >>>>>>>
> >>>>>>> 2011/2/8 James Purser <[email protected]>
> >>>>>>>
> >>>>>>>> Not at the moment, but if anyone wants to pick it up and run with
> >>>>>>>> it, then please feel free :)
> >>>>>>>>
> >>>>>>>> James
> >>>>>>>>
> >>>>>>>> On Wed, Feb 9, 2011 at 5:17 AM, Yuri Z <[email protected]> wrote:
> >>>>>>>>
> >>>>>>>>> AFAIK, playback is not a priority at the moment and no one is
> >>>>>>>>> working on it. If someone is, please correct me.
> >>>>>>>>>
> >>>>>>>>> 2011/2/8 Gerardo Lozano <[email protected]>
> >>>>>>>>>
> >>>>>>>>>> Hi everybody!
> >>>>>>>>>>
> >>>>>>>>>> Is anybody planning on working on wave playback? This is on the
> >>>>>>>>>> WIAB roadmap, but it has a blank status.
> >>>>>>>>>>
> >>>>>>>>>> Thanks!
> >>>>>>>>>>
> >>>>>>>>>> --
> >>>>>>>>>>
> >>>>>>>>>> Gerardo L.
> >>>>>>>
> >>>>>>>
> >>>>>>> --
> >>>>>>>
> >>>>>>> Gerardo L.
> >>>>
> >>>>
> >>>> --
> >>>> Juan Antonio Osorio R.
> >>>> e-mail: [email protected]
> >>>>
> >>>> "All truly great thoughts are conceived by walking."
> >>>> - F.N.


--
Juan Antonio Osorio R.
e-mail: [email protected]

"All truly great thoughts are conceived by walking."
- F.N.
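As a postscript: David's suggestion, quoted above, of simple timestamp-based
framing (no reordering) can be sketched in a few lines. The Op type and the
gap-threshold rule here are my own illustrative assumptions, not WIAB code;
real deltas carry versioned operations, but the grouping logic would be
similar: start a new frame whenever the gap between consecutive ops exceeds
some threshold, so a burst of edits collapses into one playback frame.

```java
import java.util.ArrayList;
import java.util.List;

public class FramingSketch {

    /** A stand-in for an operation; only the timestamp matters here. */
    static final class Op {
        final long timestampMillis;
        Op(long timestampMillis) { this.timestampMillis = timestampMillis; }
    }

    /** Split a chronologically ordered op history into frames: a new
     *  frame starts whenever the gap since the previous op exceeds
     *  gapMillis. */
    static List<List<Op>> frame(List<Op> history, long gapMillis) {
        List<List<Op>> frames = new ArrayList<>();
        List<Op> current = new ArrayList<>();
        Op previous = null;
        for (Op op : history) {
            if (previous != null
                    && op.timestampMillis - previous.timestampMillis > gapMillis) {
                frames.add(current);      // close the current burst of edits
                current = new ArrayList<>();
            }
            current.add(op);
            previous = op;
        }
        if (!current.isEmpty()) {
            frames.add(current);
        }
        return frames;
    }
}
```

Composing the ops inside each frame into one or two ops, as David suggests
for the compact-history endpoint, would then be a separate step applied per
frame.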
