Hi France,

Some updates:

• I fixed the locking bug that caused a null pointer exception.

• As you probably know, the WebDAV locks were organized in an
additional ~webdav database on disk. I decided to change this quite
fundamentally: From now on, the locks will be kept in main memory.
Locks will be lost if BaseX is restarted (but I expect restarts to be
rare in production environments).

• The good news is that the oXygen WebDAV explorer will be much faster
now! I noticed that 50,000 internal lock checks were performed with
oXygen. This didn’t happen with other WebDAV clients.

I’d be pleased if you could check out the latest snapshot [1] and let
me know whether it works as expected. The actual problem you reported
has not been fixed yet, but I’m positive that things are clearing up.

Best,
Christian

[1] http://files.basex.org/releases/latest/



On Thu, May 17, 2018 at 5:42 PM, Christian Grün
<christian.gr...@gmail.com> wrote:
> Hi France,
>
> The delay for retrieving the file list seems to be oXygen-specific:
> BaseX itself requires approx. 1 second to create a list of the 50,000
> files, but it takes around 180 seconds until the resources are
> displayed in the oXygen WebDAV explorer. I tried another WebDAV
> implementation: With the WebDAV plugin of the Windows application
> TotalCommander, the files are listed after 3 seconds.
>
> But back to your original question: My troubles started when I tried
> to open and close a file with oXygen (version 20): If I open a single
> resource, a NullPointerException is output by BaseX (on the command line).
> If I close the file and try to reopen it, oXygen returns 500 (“Problem
> while trying to acquire lock”).
>
> Do you experience a similar behavior? Which versions of BaseX and
> oXygen are you currently working with?
>
> Unfortunately, the WebDAV protocol has been causing problems ever
> since we first implemented it. This is due on the one hand to the
> outdated library we use, and on the other hand to the protocol itself
> (each WebDAV client seems to use it differently). Maybe you could have
> a look at Axxepta’s Argon Author plugin for oXygen:
>
>   http://argon-author.com/
>
> Best,
> Christian
>
>
>
> On Thu, May 17, 2018 at 10:55 AM, France Baril
> <france.ba...@architextus.com> wrote:
>> Perfect, I'm happy we could finally find a way to get this issue replicated!
>>
>> We only have one DB with more than 1000 files, so display speed is not much
>> of an issue. We'd be happy to get the extra speed anyway :-). The 50,000 files were
>> just to make the bug easy to replicate.
>>
>> On Wed, May 16, 2018 at 4:04 PM, Christian Grün <christian.gr...@gmail.com>
>> wrote:
>>>
>>> Dear France,
>>>
>>> A first update:
>>>
>>> I noticed that accessing files from oXygen while the database is being
>>> updated causes various exceptions (which are written to the BaseX
>>> logs). As a result, I also get duplicate files in the database. I will
>>> try to find out if this is something we can resolve, or if it goes back
>>> to the Milton WebDAV library we use.
>>>
>>> A minor note: You can speed up the duplicates lookup by using group by:
>>>
>>>   let $duplicates := (
>>>     for $file-group in db:list('mydb')
>>>     group by $path := string($file-group)
>>>     let $count := count($file-group)
>>>     where $count > 1
>>>     return <li>There are { $count } instances of { $path }.</li>
>>>   )
>>>   return
>>>     if ($duplicates)
>>>     then <ul> { $duplicates }</ul>
>>>     else <p>All is good. No duplicates found.</p>
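>>>
>>> With the group by clause, the path list only needs to be traversed
>>> once to collect the counts for all paths.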
>>>
>>> Apart from that, I noticed that it takes a very long time to list the
>>> 50,000 files in oXygen. Yet another issue that may be due to the
>>> restrictions of WebDAV; but I’ll see if something can be done in BaseX
>>> to speed this up.
>>>
>>> Best,
>>> Christian
>>>
>>>
>>>
>>>
>>> On Sun, May 13, 2018 at 3:36 PM, France Baril
>>> <france.ba...@architextus.com> wrote:
>>> > Hi,
>>> >
>>> > Just wondering if this slipped through the cracks.
>>> >
>>> > On Wed, May 2, 2018 at 1:11 PM, France Baril
>>> > <france.ba...@architextus.com>
>>> > wrote:
>>> >>
>>> >> Hi,
>>> >>
>>> >> We've been having this issue for a while, and we think resolving it may
>>> >> be the key to an intermittent server 500 error that we've also been
>>> >> seeing.
>>> >>
>>> >> When a user tries to save a file while a batch process is running,
>>> >> BaseX
>>> >> saves duplicates of the file.
>>> >>
>>> >> How to reproduce:
>>> >>
>>> >> 1) Take a fresh BaseX 9.0.1 installation
>>> >> 2) Copy the attached .xqm into the webapp directory (a rough sketch
>>> >> of such a module is shown after these steps)
>>> >> 3) Create an empty DB called mydb
>>> >> 4) Access localhost:port-num/test/create-update-a-lot-of-files to
>>> >> populate your db.
>>> >> 5) In OxygenXML, set a WebDAV connection to the db and open a file; add
>>> >> a character in one of the elements, but don't save the file.
>>> >> 6) From the browser, access 'localhost:port-num/test/update-something'
>>> >> 7) While the process in the browser is still running, save the file in
>>> >> Oxygen. You'll get a message saying that the read timed out. Click OK
>>> >> and do not try saving the file again.
>>> >> 8) When the update-something process is done running, don't resave the
>>> >> file in Oxygen; instead, go to localhost:port-num/test/oups-duplicates.
>>> >> You'll get a message saying that some files are duplicated. If you
>>> >> don't, try again from step #4 a few times. You'll only get duplicates
>>> >> if you get the time-out message while the update-something process is
>>> >> still running. If you try to save the file many times, you'll get more
>>> >> duplicates (4 or 6).
>>> >>
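>>> >> In case the attachment doesn't come through, here is a rough sketch of
>>> >> what such a RESTXQ module could look like. Only the three paths and the
>>> >> database name are taken from the steps above; the file count, element
>>> >> names and function bodies are placeholders, not the content of the
>>> >> actual attachment:
>>> >>
>>> >>   module namespace test = 'http://www.example.com/test';
>>> >>
>>> >>   (: Placeholder: fill mydb with many small documents. :)
>>> >>   declare %updating %rest:path('/test/create-update-a-lot-of-files')
>>> >>   function test:create-update-a-lot-of-files() {
>>> >>     for $i in 1 to 50000
>>> >>     return db:replace('mydb', 'file-' || $i || '.xml', <file nr='{ $i }'/>)
>>> >>   };
>>> >>
>>> >>   (: Placeholder: long-running batch update over all documents. :)
>>> >>   declare %updating %rest:path('/test/update-something')
>>> >>   function test:update-something() {
>>> >>     for $doc in db:open('mydb')
>>> >>     return insert node <updated/> into $doc/*
>>> >>   };
>>> >>
>>> >>   (: Placeholder: report paths that are stored more than once. :)
>>> >>   declare %rest:path('/test/oups-duplicates')
>>> >>   function test:oups-duplicates() {
>>> >>     let $duplicates := (
>>> >>       for $path in distinct-values(db:list('mydb'))
>>> >>       let $count := count(db:list('mydb')[. = $path])
>>> >>       where $count > 1
>>> >>       return <li>There are { $count } instances of { $path }.</li>
>>> >>     )
>>> >>     return
>>> >>       if ($duplicates)
>>> >>       then <ul>{ $duplicates }</ul>
>>> >>       else <p>All is good. No duplicates found.</p>
>>> >>   };
>>> >>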
>>> >> We're not sure if it's a BaseX bug or if we are setting our user
>>> >> management and/or locking rules incorrectly.
>>> >>
>>> >> Do you have any suggestions?
>>> >>
>>> >> --
>>> >> France Baril
>>> >> Architecte documentaire / Documentation architect
>>> >> france.ba...@architextus.com
>>> >
>>> >
>>> >
>>> >
>>> > --
>>> > France Baril
>>> > Architecte documentaire / Documentation architect
>>> > france.ba...@architextus.com
>>
>>
>>
>>
>> --
>> France Baril
>> Architecte documentaire / Documentation architect
>> france.ba...@architextus.com
