On Wednesday, March 12, 2014 10:01:27 PM UTC+1, Tom wrote:
>
> Hi Thomas, I know how to use "optimistic locking" when updating data, 
> but my question is not about updating, but about viewing data. For example, 
> say you build a question & answer system like Stackoverflow.com.
>
> Say user "A" creates the question "What is a car?", and user "B" sees 
> the "What is a car?" question and starts to answer "A car is ....". But 
> before user "B" submits the answer, user "A" modifies the question to 
> "What is a bicycle?". 
>
> So user "B" did not modify the question; he just submitted an answer 
> to it. However, in this case he submitted the answer to the OLD 
> question, not the latest updated one.
>
> What you & Jens mentioned is about many people trying to modify the same 
> question. I know how to deal with that, but my question is about "users 
> submitting data based on old info".
>
> This is Stackoverflow's problem. For example, many times I have posted a 
> question & later changed it a bit, but many people had already 
> answered the old question, not the latest one. That created some 
> confusion, as I had to explain to the people who answered my question 
> that I had changed it, so that they could change their answers. However, 
> most people don't change their answers even after I change my question. 
> Stackoverflow doesn't have a mechanism to lock data in that situation. 
>
> So, in your point of view, do you think what Stackoverflow does is good 
> enough, or do they need to manage things better? 
> That is what my question "Do we need to check the consistency of all data 
> on 1 page whenever we get new related data?" is about.
>

Well, SO “updates local data from the server as often as possible” (telling 
you “there's a new answer” and/or “the question (or answer) has been 
edited”, and allowing you to load the changes before you save yours) and 
“provides a history of changes so the user can easily revert to any 
previous value” (and see what the changes from other users were).
So yes, I think they do it right, about as well as it can possibly be done 
(note: I have no idea how they handle concurrent edits of the same 
question/answer; possibly they just save the latest on top of the previous 
one, with a warning that you should look at the history of changes; that's 
more or less how I'd do it, at least).
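To make the "answer submitted against an edited question" case concrete, here is a minimal sketch of the usual approach: give each question a version number, have the answer form remember the version it was loaded against, and let the server flag the mismatch on submit. All names here (`Question`, `submit_answer`, `StaleContextError`) are illustrative, not from any real Stackoverflow or GWT API.

```python
# Hypothetical sketch of "stale context" detection, assuming each question
# carries a monotonically increasing version number bumped on every edit.

class Question:
    def __init__(self, text):
        self.text = text
        self.version = 1

    def edit(self, new_text):
        self.text = new_text
        self.version += 1  # every edit bumps the version

class StaleContextError(Exception):
    pass

def submit_answer(question, answer_text, seen_version):
    """Reject (or flag) an answer written against an outdated question."""
    if seen_version != question.version:
        # The question changed since the answerer loaded it; the client
        # should show the edit and ask for confirmation before saving.
        raise StaleContextError(
            f"question edited (saw v{seen_version}, now v{question.version})")
    return {"answer": answer_text, "question_version": question.version}

q = Question("What is car?")
v_seen = q.version          # user "B" loads the question at version 1
q.edit("What is bicycle?")  # user "A" edits it in the meantime

try:
    submit_answer(q, "Car is ...", v_seen)
except StaleContextError as e:
    print(e)  # client would now show the edit and let "B" revise the answer
```

The same check works for optimistic locking of edits: replace "reject the answer" with "reject the update" when the stored version no longer matches the one the client saw.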
 

>
> On Thursday, March 13, 2014 2:13:16 AM UTC+11, Thomas Broyer wrote:
>>
>> I tend to think that robust webapps need to a) save data as early as 
>> possible (e.g. as soon as you exit the field), b) update local data from 
>> the server as often as possible, and if possible c) provide a means to undo 
>> the last auto-save, or provide a history of changes so the user can easily 
>> revert to any previous value.
>> This basically only applies to "live" data though; for data that needs to 
>> be frozen at some point (e.g. that goes through a workflow for processing), 
>> you'd either 1) first lock edits by other users, to be sure that what 
>> the user sees is what the server knows, and only then, after some 
>> last-minute edits, freeze the data; or 2) freeze the local data (still 
>> allowing other users to work on the actual data, but no longer updating 
>> the local data to reflect those changes) and then, after some last-minute 
>> edits, save a frozen *copy* of the local data.
>>
>> Oh, and it gets worse when you take into account bad connectivity 
>> (which happens more and more as people are “on the move”).
>>
>> The ideal scenario is "add-only" data, because there can never be a 
>> conflict; if you can't have that, then "save a copy on conflicts" (this is 
>> what GMail does, for example, when you edit a draft mail from 2 distinct 
>> devices: you'll end up with 2 drafts if the edits weren't all sequential; 
>> technically speaking, this is optimistic locking with an "automatically 
>> save a copy" conflict resolution strategy; note: this is what I think they 
>> do from what I've observed; I don't work for Google so I don't know the 
>> exact details). And if you can't do that either, then have a look at 
>> Operational Transformation and other similar techniques.
>> If you're short on money or time, then use (so-called “pessimistic”) 
>> locking as Jens explained; unless you can define a clear conflict 
>> resolution strategy, in which case go for optimistic locking.
>>
>> Note that the issue you point out does not only apply to Web 2.0 apps, 
>> but also to all the non-web apps (aka native apps) out there!
>> …and it's a real pain.
>>
>> On Wednesday, March 12, 2014 3:13:00 PM UTC+1, Tom wrote:
>>>
>>> In a Web 2.0 application (normally using Ajax-type techniques), we 
>>> normally don't need to download all the data for one page.
>>>
>>> We only need to download the necessary data (say the "begin data"), & 
>>> when we need other data (say the "other related data"), we just fetch it.
>>>
>>> However, there is a problem: at the time we downloaded the 
>>> "begin data" we got the latest up-to-date data. We then used that data 
>>> for many hours, but in the meantime the "begin data" in the database was 
>>> changed by another user. 
>>>
>>> Ok, now we have the old "begin data" on the page of our Web 2.0 app & 
>>> we also have the new (latest) "begin data" in the database.
>>>
>>> Then we fetch the "other related data". At this point, the user doesn't 
>>> know that the "begin data" got updated, so they continue to work on the 
>>> "other related data" with the assumption that the "other related data" is 
>>> in the context of the OLD "begin data", when in fact the "other related 
>>> data" is already in the context of the NEW "begin data".
>>>
>>> Let's see one simple example. We have 2 tables, BeginData (ID1 is the 
>>> primary key) & OtherRelatedData (ID2 is the primary key), related to 
>>> each other.
>>>
>>> At the beginning we have these 2 tables:
>>>
>>> BeginData
>>> ID1 - ID2 - Text
>>> 1   - 3   - begin data text 1
>>>
>>> OtherRelatedData
>>> ID2 - Text
>>> 3   - other related data text 1
>>>
>>> Now user "A" has downloaded the "begin data", so he is working on the 
>>> record "1 - 3 - begin data text 1" for many hours. At the same time, 
>>> user "B" modifies BeginData, & because they are related, the "other 
>>> related data" also gets modified:
>>>
>>> User "B" modified data:
>>>
>>> BeginData
>>> ID1 - ID2 - Text
>>> 1   - 3   - begin data text 2
>>>
>>> OtherRelatedData
>>> ID2 - Text
>>> 3   - other related data text 2
>>>
>>> Now user "A" has not downloaded the latest "begin data" & is still 
>>> using "1 - 3 - begin data text 1"; when he fetches the "other related 
>>> data", the system will give him "3 - other related data text 2". So he 
>>> thinks "3 - other related data text 2" is in the context of "1 - 3 - 
>>> begin data text 1", but actually it is not.
>>>
>>> This is very serious. What if we have many pieces of related data like 
>>> this? How can we manage them all?
>>>
>>> Do we need to check the consistency of the "begin data" & the "other 
>>> related data" on one page whenever we fetch the "other related data"?
>>>
>>

-- 
You received this message because you are subscribed to the Google Groups 
"Google Web Toolkit" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
Visit this group at http://groups.google.com/group/google-web-toolkit.
For more options, visit https://groups.google.com/d/optout.
