Perfect, happy we could finally find a way to get this issue replicated!

We only have one DB with more than 1,000 files, so display speed is not much
of an issue. We'd be happy to get the speed-up anyway :-). The 50,000 files
were just there to make the bug easy to replicate.

On Wed, May 16, 2018 at 4:04 PM, Christian Grün <christian.gr...@gmail.com>
wrote:

> Dear France,
>
> A first update:
>
> I noticed that oXygen's file access during database updates causes
> various exceptions (which are written to the BaseX logs). As a
> result, I also get duplicate files in the database. I will try to
> find out whether this is something we can resolve, or whether it goes
> back to the Milton WebDAV library we use.
>
> A minor note: you can speed up the duplicates lookup by using group by:
>
>   let $duplicates := (
>     for $file-group in db:list('mydb')
>     group by $path := string($file-group)
>     let $count := count($file-group)
>     where $count > 1
>     return <li>There are { $count } instances of { $path }.</li>
>   )
>   return
>     if ($duplicates)
>     then <ul>{ $duplicates }</ul>
>     else <p>All is good. No duplicates found.</p>
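>
> (With group by, the lookup is a single pass over db:list instead of
> one count per distinct path. Saved as a file, e.g. duplicates.xq, the
> check can also be run on the command line with "basex duplicates.xq".)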
>
> Apart from that, I noticed that it takes a very long time to list the
> 50,000 files in oXygen. Yet another issue that may be due to the
> restrictions of WebDAV; but I’ll see if something can be done in BaseX
> to speed this up.
>
> Best,
> Christian
>
> On Sun, May 13, 2018 at 3:36 PM, France Baril
> <france.ba...@architextus.com> wrote:
> > Hi,
> >
> > Just wondering if this slipped through the cracks.
> >
> > On Wed, May 2, 2018 at 1:11 PM, France Baril
> > <france.ba...@architextus.com> wrote:
> >>
> >> Hi,
> >>
> >> We've been having this issue for a while, and we think resolving it
> >> may be the key to resolving an intermittent server 500 error that
> >> we've been having.
> >>
> >> When a user tries to save a file while a batch process is running, BaseX
> >> saves duplicates of the file.
> >>
> >> How to reproduce:
> >>
> >> 1) Take a fresh BaseX 9.0.1 installation.
> >> 2) Copy the attached .xqm into webapp (a sketch of what it might look
> >>    like follows these steps).
> >> 3) Create an empty DB called mydb.
> >> 4) Access localhost:port-num/test/create-update-a-lot-of-files to
> >>    populate your db.
> >> 5) In OxygenXML, set up a WebDAV connection to the db, open a file and
> >>    add a character in one of the elements, but don't save the file.
> >> 6) From the browser, access localhost:port-num/test/update-something.
> >> 7) While the process in the browser is still running, save the file in
> >>    Oxygen. You'll get a message saying that the read timed out. Click
> >>    OK and do not try saving the file again.
> >> 8) When the update-something process is done running, don't resave the
> >>    file in Oxygen; instead go to localhost:port-num/test/oups-duplicates.
> >>    You'll get a message saying that some files are duplicated. If you
> >>    don't, try again from step #4 a few times: you only get duplicates
> >>    if the time-out message appears while the update-something process
> >>    is still running. If you try saving the file several times, you'll
> >>    get more duplicates (4 or 6).
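> >>
> >> In case the attachment doesn't come through, here is a minimal sketch
> >> of what the .xqm could look like. Only the three /test/* paths come
> >> from the steps above; the file count, resource names and element
> >> names are placeholders, and the real attachment may differ:
> >>
> >>   module namespace test = 'test';
> >>
> >>   (: Step 4: fill mydb with many small files (50,000 assumed). :)
> >>   declare %updating %rest:path('/test/create-update-a-lot-of-files')
> >>   function test:create() {
> >>     for $i in 1 to 50000
> >>     return db:replace('mydb', 'file-' || $i || '.xml',
> >>       <file nr='{ $i }'/>)
> >>   };
> >>
> >>   (: Step 6: a long-running batch update for a WebDAV save to
> >>      overlap with. :)
> >>   declare %updating %rest:path('/test/update-something')
> >>   function test:update() {
> >>     for $file in db:open('mydb')//file
> >>     return replace value of node $file/@nr with 'updated'
> >>   };
> >>
> >>   (: Step 8: report resource paths that occur more than once. :)
> >>   declare %rest:path('/test/oups-duplicates')
> >>   function test:duplicates() {
> >>     let $paths := db:list('mydb')
> >>     let $duplicates := (
> >>       for $path in distinct-values($paths)
> >>       let $count := count($paths[. = $path])
> >>       where $count > 1
> >>       return <li>There are { $count } instances of { $path }.</li>
> >>     )
> >>     return
> >>       if ($duplicates)
> >>       then <ul>{ $duplicates }</ul>
> >>       else <p>All is good. No duplicates found.</p>
> >>   };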
> >>
> >> We're not sure if it's a BaseX bug or if we are setting our user
> >> management and/or locking rules incorrectly.
> >>
> >> Do you have any suggestions?
> >>
> >> --
> >> France Baril
> >> Architecte documentaire / Documentation architect
> >> france.ba...@architextus.com
> >
> > --
> > France Baril
> > Architecte documentaire / Documentation architect
> > france.ba...@architextus.com
>

-- 
France Baril
Architecte documentaire / Documentation architect
france.ba...@architextus.com
