[jira] [Commented] (JCR-3763) Import XML fails with jackrabbit webapp deployed

2014-04-24 Thread Shankar Pednekar (JIRA)

[ 
https://issues.apache.org/jira/browse/JCR-3763?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13979554#comment-13979554
 ] 

Shankar Pednekar commented on JCR-3763:
---

The issue is with the following piece of code in
org.apache.jackrabbit.webdav.jcr.version.report.JcrPrivilegeReport:

Element hrefElem = info.getContentElement(DavConstants.XML_HREF,
        DavConstants.NAMESPACE);
String href = DomUtil.getTextTrim(hrefElem);
href = obtainAbsolutePathFromUri(href);

This href refers to an absolute URI and contains the webapp name (e.g.
"jackrabbit"). So when
resourceLoc.getFactory().createResourceLocator(resourceLoc.getPrefix(),
href) is called, the code in
AbstractLocatorFactory.createResourceLocator(String prefix, String href) is
unable to extract the workspace path because of the webapp name. To fix
this, the following piece of code needs to be added to the
AbstractLocatorFactory.createResourceLocator(String prefix, String href)
method:

if (pathPrefix != null && pathPrefix.length() > 0) {
    if (!b.toString().endsWith(pathPrefix)) {
        b.append(pathPrefix);
    }
    if (href.startsWith(pathPrefix)) {
        href = href.substring(pathPrefix.length());
    }
    /* Added new code start */
    if (!href.startsWith(pathPrefix) && href.contains(pathPrefix)) {
        href = href.substring(href.indexOf(pathPrefix) + pathPrefix.length());
    }
    /* Added new code end */
}

This fixes the issue.
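For illustration, the prefix handling proposed above can be reduced to a small self-contained helper (hypothetical class and method names, not the actual Jackrabbit code):

```java
// Sketch of the proposed prefix stripping: drop the configured path prefix
// from the href, even when the href carries extra leading segments such as
// the webapp context path (hypothetical helper, for illustration only).
public class HrefPrefixSketch {

    public static String stripPathPrefix(String href, String pathPrefix) {
        if (pathPrefix == null || pathPrefix.length() == 0) {
            return href;
        }
        if (href.startsWith(pathPrefix)) {
            // standalone case: the href begins with the configured prefix
            return href.substring(pathPrefix.length());
        }
        int idx = href.indexOf(pathPrefix);
        if (idx >= 0) {
            // webapp case: the context path precedes the prefix, so drop
            // everything up to and including the prefix
            return href.substring(idx + pathPrefix.length());
        }
        return href;
    }

    public static void main(String[] args) {
        // standalone deployment
        System.out.println(stripPathPrefix(
                "/server/default/jcr:root", "/server"));
        // webapp deployment with context path "/jackrabbit-webapp-2.7.5":
        // both calls yield "/default/jcr:root"
        System.out.println(stripPathPrefix(
                "/jackrabbit-webapp-2.7.5/server/default/jcr:root", "/server"));
    }
}
```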

 Import XML fails with jackrabbit webapp deployed
 

 Key: JCR-3763
 URL: https://issues.apache.org/jira/browse/JCR-3763
 Project: Jackrabbit Content Repository
  Issue Type: Bug
  Components: jackrabbit-webapp, jackrabbit-webdav
Affects Versions: 2.6.2, 2.7.5
 Environment: Windows 7
Reporter: Chris
Priority: Blocker
  Labels: DavLocatorFactory, import, jackrabbit, webapp, webdav, 
 xml
 Attachments: TestEAVCoreRepository.java


 UPDATE:   this seems to be very similar to the issue reported in:
 https://issues.apache.org/jira/browse/JCR-3668
 To reproduce, simply deploy jackrabbit-webapp-2.6.2 or greater (I haven't
 tested earlier versions). I have also confirmed this issue persists with
 jackrabbit-webapp-2.7.5. The standalone versions do not have this issue.
 With the webapp deployed, an attempt to import serialized XML using
 session.importXML or workspace.importXML( ... ) fails. Note: this XML is
 initially serialized from existing Jackrabbit node data via
 session.exportSystem( ... ).
 When your session or workspace points to the jackrabbit-standalone 
 configuration the import is successful and the node structure is generated on 
 the server.   When your session points to the webapp config, the process 
 fails because:
 DavLocatorFactoryImpl.getRepositoryPath() is returning the incorrect 
 workspace path.  In this example, the workspace is default.   The trace:
 SEVERE: Servlet.service() for servlet [JCRWebdavServer] in context with path 
 [/jackrabbit-webapp-2.7.5] threw exception
 java.lang.IllegalArgumentException: Unexpected format of resource path: 
 /jackrabbit-webapp-2.7.5/server/default/jcr:root/8b65d019-719f-47ec-ae2b-675c7a33048c/RTRD
  (workspace: /jackrabbit-webapp-2.7.5)
 at 
 org.apache.jackrabbit.webdav.jcr.DavLocatorFactoryImpl.getRepositoryPath(DavLocatorFactoryImpl.java:65)
 at 
 org.apache.jackrabbit.webdav.AbstractLocatorFactory$DavResourceLocatorImpl.getRepositoryPath(AbstractLocatorFactory.java:356)
 at 
 org.apache.jackrabbit.webdav.jcr.version.report.JcrPrivilegeReport.addResponses(JcrPrivilegeReport.java:117)
 at 
 org.apache.jackrabbit.webdav.jcr.version.report.JcrPrivilegeReport.init(JcrPrivilegeReport.java:102)
 at 
 org.apache.jackrabbit.webdav.version.report.ReportType.createReport(ReportType.java:72)
 at 
 org.apache.jackrabbit.webdav.jcr.AbstractResource.getReport(AbstractResource.java:487)
 at 
 org.apache.jackrabbit.webdav.jcr.WorkspaceResourceImpl.getReport(WorkspaceResourceImpl.java:84)
 at 
 org.apache.jackrabbit.webdav.server.AbstractWebdavServlet.doReport(AbstractWebdavServlet.java:1096)
 at 
 org.apache.jackrabbit.webdav.server.AbstractWebdavServlet.execute(AbstractWebdavServlet.java:402)
 at 
 org.apache.jackrabbit.webdav.server.AbstractWebdavServlet.service(AbstractWebdavServlet.java:291)
 at javax.servlet.http.HttpServlet.service(HttpServlet.java:728)
 at 
 org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:305)
 at 
 org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
 at 
 

[jira] [Comment Edited] (JCR-3763) Import XML fails with jackrabbit webapp deployed

2014-04-24 Thread Shankar Pednekar (JIRA)

[ 
https://issues.apache.org/jira/browse/JCR-3763?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13979554#comment-13979554
 ] 

Shankar Pednekar edited comment on JCR-3763 at 4/24/14 10:30 AM:
-


[jira] [Created] (JCR-3773) Lucene ConsistencyCheck reports nodes under /jcr:system/jcr:nodeTypes as deleted

2014-04-24 Thread Manfred Baedke (JIRA)
Manfred Baedke created JCR-3773:
---

 Summary: Lucene ConsistencyCheck reports nodes under 
/jcr:system/jcr:nodeTypes as deleted
 Key: JCR-3773
 URL: https://issues.apache.org/jira/browse/JCR-3773
 Project: Jackrabbit Content Repository
  Issue Type: Bug
  Components: indexing
Affects Versions: 2.6.5
Reporter: Manfred Baedke
Assignee: Manfred Baedke
Priority: Minor


The virtual nodes under /jcr:system/jcr:nodeTypes are reported as deleted and 
not repairable by org.apache.jackrabbit.core.query.lucene.ConsistencyCheck. The 
ConsistencyCheck should instead just ignore the tree /jcr:system/jcr:nodeTypes.



--
This message was sent by Atlassian JIRA
(v6.2#6252)


support for sorting queries by document order

2014-04-24 Thread Ben Peter
Hi,


currently it is not possible to sort query results by document order
if respectDocumentOrder is left at the default of false in the workspace
configuration.
The need to support such sorting per query was discussed years back when
the default setting was changed to false in JCR-1237 (and postponed at the
time).

For applications that occasionally require sorting by document order,
setting respectDocumentOrder to true bears a huge performance impact, as
results are sorted in-memory.
An (ugly) workaround for such applications is to set respectDocumentOrder
to true and use order by @jcr:score on queries where order does not
matter (which is faster), but even disregarding its ugliness, this does not
work for mixed-scope applications (e.g. a specific project built on top of
Sling which would need to touch Sling code to implement this workaround).
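For concreteness, the workaround described above amounts to forcing an explicit ordering in the query statement; a sketch in XPath (illustrative statement only, not tied to a particular content model):

```
//element(*, nt:base) order by @jcr:score descending
```

With respectDocumentOrder=true, the explicit @jcr:score ordering lets the index return results in score order directly, while a query with no order-by clause falls back to the in-memory document-order sort.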

In that light, would it make sense to support something along the lines of
- order by @jcr:none (to explicitly request disregarding document order for
installations where respectDocumentOrder is true)
and/or
- order by @jcr:documentOrder (to explicitly request ordering by
document order for installations where respectDocumentOrder is false)?

(I guess the jcr namespace is not appropriate?)

The second option can probably boast a much larger number of use cases (I
have encountered two use cases in the last two days that would be covered
by it).

Cheers
Ben


Re: [VOTE] Release Apache Jackrabbit Filevault 3.1.6

2014-04-24 Thread Jukka Zitting
Hi,

On Tue, Apr 22, 2014 at 8:05 PM, Tobias Bocanegra tri...@apache.org wrote:
 Please vote on releasing this package as Apache Jackrabbit Filevault 3.1.6.

  [x] +1 Release this package as Apache Jackrabbit Filevault 3.1.6

BR,

Jukka Zitting


[jira] [Updated] (JCR-3772) Local File cache is not reduced to zero size after specifying it in the configuration

2014-04-24 Thread Chetan Mehrotra (JIRA)

 [ 
https://issues.apache.org/jira/browse/JCR-3772?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chetan Mehrotra updated JCR-3772:
-

Resolution: Fixed
  Assignee: Chetan Mehrotra
Status: Resolved  (was: Patch Available)

Applied the patch in http://svn.apache.org/r1589926

 Local File cache is not reduced to zero size after specifying it in the configuration
 --

 Key: JCR-3772
 URL: https://issues.apache.org/jira/browse/JCR-3772
 Project: Jackrabbit Content Repository
  Issue Type: Bug
Affects Versions: 2.7.5
Reporter: Shashank Gupta
Assignee: Chetan Mehrotra
Priority: Minor
 Fix For: 2.8

 Attachments: JCR-3772.patch


 The local cache size is specified in repository.xml:
 {noformat}
 <DataStore class="org.apache.jackrabbit.aws.ext.ds.S3DataStore">
   <param name="config" value="${rep.home}/aws.properties"/>
   <param name="secret" value="123456789"/>
   <param name="minRecordLength" value="16384"/>
   <param name="cacheSize" value="68719476736"/>
   <param name="cachePurgeTrigFactor" value="0.95d"/>
   <param name="cachePurgeResizeFactor" value="0.85d"/>
   <param name="continueOnAsyncUploadFailure" value="false"/>
   <param name="concurrentUploadsThreads" value="10"/>
   <param name="asyncUploadLimit" value="100"/>
   <param name="uploadRetries" value="3"/>
 </DataStore>
 {noformat}
 To disable the local cache, {{cacheSize}} is set to 0. Upon setting it to 0
 and restarting, the expectation is that all files in the cache are deleted
 and the local cache isn't used in any operation.
 The issue is that the local cache is not reset to size 0.
 *Workaround*: set it to 1.



--
This message was sent by Atlassian JIRA
(v6.2#6252)


Adding ProviderType and ConsumerType annotation to interfaces in exported packages

2014-04-24 Thread Chetan Mehrotra
As part of OAK-1741 I was changing the version of exported packages to
1.0.0. Looking at the interfaces that are part of exported packages, I
do not see any usage of the ConsumerType/ProviderType annotations [1].

In brief, the interfaces that are expected to be implemented by users of
the Oak API (like
org.apache.jackrabbit.oak.plugins.observation.EventHandler) should be
marked with the ConsumerType annotation. This enables the bnd tool to
generate package import instructions with the stricter range [1.0,1.1).

For all other interfaces, which are supposed to be provided by Oak, we
should mark them with ProviderType. This enables bnd to generate package
import instructions with the relaxed range [1.0,2) for our API
consumers, which would let us evolve the API more easily in the future.

Currently we have the following interfaces as part of exported
packages [2]. Looking at the list, I believe most are of ProviderType,
i.e. provided by Oak and not required to be implemented by Oak API
users.

Some, like org.apache.jackrabbit.oak.plugins.observation.EventHandler,
are of ConsumerType, as we require API users to implement them.
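The distinction can be sketched with local stand-in annotations (the real ones live in org.osgi.annotation.versioning, or aQute.bnd.annotation for older bnd versions; the interface shapes below are purely illustrative):

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Self-contained sketch: local stand-ins for the OSGi package-type
// annotations. RUNTIME retention is used here only so the demo in main()
// can inspect them reflectively.
public class PackageTypeSketch {

    @Retention(RetentionPolicy.RUNTIME) @Target(ElementType.TYPE)
    @interface ConsumerType {}

    @Retention(RetentionPolicy.RUNTIME) @Target(ElementType.TYPE)
    @interface ProviderType {}

    // Expected to be implemented by API users -> importers get the
    // strict package-import range [1.0,1.1)
    @ConsumerType
    interface EventHandler {
        void handle(String event);
    }

    // Provided by the API implementation (here: Oak) -> importers get the
    // relaxed package-import range [1.0,2)
    @ProviderType
    interface QueryIndex {
        double cost(String statement);
    }

    public static void main(String[] args) {
        System.out.println(EventHandler.class.isAnnotationPresent(ConsumerType.class)); // true
        System.out.println(QueryIndex.class.isAnnotationPresent(ProviderType.class));   // true
    }
}
```

The annotations carry no behavior themselves; bnd reads them at build time to choose the import version range for each consumer of the exported package.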

Should we add the required annotations for the 1.0 release?

If yes, can team members look into the list and set the right type?

Chetan Mehrotra
[1] 
https://github.com/osgi/design/raw/master/rfcs/rfc0197/rfc-0197-OSGiPackageTypeAnnotations.pdf

[2] 
https://issues.apache.org/jira/browse/OAK-1741?focusedCommentId=13979465#comment-13979465


Oak CI notifications not coming

2014-04-24 Thread Chetan Mehrotra
Hi,

I was checking the CI status for Oak trunk and it seems builds are not
getting built at [1] and [2].

Do we have to enable it somehow?

Chetan Mehrotra
[1] https://travis-ci.org/apache/jackrabbit-oak/builds
[2] http://ci.apache.org/builders/oak-trunk/


Re: Oak CI notifications not coming

2014-04-24 Thread Bertrand Delacretaz
Hi Chetan,

On Thu, Apr 24, 2014 at 12:52 PM, Chetan Mehrotra
chetan.mehro...@gmail.com wrote:
 ...I was checking the CI status for Oak trunk and it seems builds are not
 getting built at [1] and [2]...

I don't know about Travis, but ci.apache.org is currently not working;
AFAIK there was a multiple hardware failure that is taking time to fix.

-Bertrand


Unit tests on 1.0 branch failing

2014-04-24 Thread Alex Parvulescu
Hi,

I'm running the unit tests on the 1.0 branch and I saw the Clock test
failing twice:

Failed tests:   testClockDrift(org.apache.jackrabbit.oak.stats.ClockTest):
Clock.Fast unexpected drift ater 100ms: -9ms (estimated limit was 5ms,
measured granularity was 1.0ms)

Failed tests:   testClockDrift(org.apache.jackrabbit.oak.stats.ClockTest):
Clock.Fast unexpected drift ater 100ms: -11ms (estimated limit was 5ms,
measured granularity was 1.0ms)


I'll rerun the tests to see if this goes away.

best,
alex


Re: Adding ProviderType and ConsumerType annotation to interfaces in exported packages

2014-04-24 Thread Michael Dürig


+1, not sure whether we have enough resources to do this in time for 1.0 
though. Maybe we could do it where obvious?


Michael

On 24.4.14 10:48 , Chetan Mehrotra wrote:

As part of OAK-1741 I was changing the version of exported packages to
1.0.0. [...]

Should we add the required annotations for the 1.0 release?

If yes, can team members look into the list and set the right type?



Re: Unit tests on 1.0 branch failing

2014-04-24 Thread Alex Parvulescu
I'm seeing this test fail each time. Am I the only one experiencing this?


On Thu, Apr 24, 2014 at 2:19 PM, Michael Dürig mdue...@apache.org wrote:



 On 24.4.14 2:01 , Alex Parvulescu wrote:


 Failed tests:   testClockDrift(org.apache.jackrabbit.oak.stats.
 ClockTest):
 Clock.Fast unexpected drift ater 100ms: -9ms (estimated limit was 5ms,
 measured granularity was 1.0ms)


 Most likely caused by http://svn.apache.org/r1589660. The estimated
 granularity pertains to the system clock. However, it might not be correct for
 the fast clock, as there it can be set through -Dfast.clock.interval and
 defaults to 10ms for Oak 1.0.


 Michael



RE: Unit tests on 1.0 branch failing

2014-04-24 Thread Marcel Reutegger
I see it from time to time, but not always. Maybe one out of ten
times.

regards
 marcel

 -Original Message-
 From: Alex Parvulescu [mailto:alex.parvule...@gmail.com]
 Sent: Donnerstag, 24. April 2014 14:25
 To: oak-dev@jackrabbit.apache.org
 Subject: Re: Unit tests on 1.0 branch failing
 
 I'm seeing this test fail each time. Am I the only one experiencing this?
 
 
 On Thu, Apr 24, 2014 at 2:19 PM, Michael Dürig mdue...@apache.org
 wrote:
 
 
 
  On 24.4.14 2:01 , Alex Parvulescu wrote:
 
 
  Failed tests:   testClockDrift(org.apache.jackrabbit.oak.stats.
  ClockTest):
  Clock.Fast unexpected drift ater 100ms: -9ms (estimated limit was 5ms,
  measured granularity was 1.0ms)
 
 
  Most likely caused by http://svn.apache.org/r1589660. The estimated
  granularity pertains to the system clock. However, it might not be correct for
  the fast clock, as there it can be set through -Dfast.clock.interval and
  defaults to 10ms for Oak 1.0.
 
 
  Michael
 


Re: Unit tests on 1.0 branch failing

2014-04-24 Thread Michael Dürig



On 24.4.14 2:19 , Michael Dürig wrote:



On 24.4.14 2:01 , Alex Parvulescu wrote:


Failed tests:
testClockDrift(org.apache.jackrabbit.oak.stats.ClockTest):
Clock.Fast unexpected drift ater 100ms: -9ms (estimated limit was 5ms,
measured granularity was 1.0ms)


Most likely caused by http://svn.apache.org/r1589660. The estimated
granularity pertains to the system clock. However, it might not be correct
for the fast clock, as there it can be set through -Dfast.clock.interval
and defaults to 10ms for Oak 1.0.



Fixed in trunk and 1.0 now by adapting the test case to take the correct 
timer granularity into consideration.
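For readers following along, the fast-clock mechanism under discussion can be sketched as follows (a rough stand-in, not Oak's actual Clock.Fast implementation): a background task caches System.currentTimeMillis() every interval, so reads are cheap but may lag the system clock by up to roughly the interval plus the scheduler's timer granularity, which is exactly why a drift assertion must account for both.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Rough, self-contained sketch of a "fast" clock: a scheduled task refreshes
// a cached timestamp every INTERVAL_MS, so getTime() is a cheap volatile read
// that may lag the system clock by up to INTERVAL_MS plus timer granularity.
public class FastClockSketch {

    public static final long INTERVAL_MS = 10; // cf. -Dfast.clock.interval

    private volatile long cached = System.currentTimeMillis();
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor(r -> {
                Thread t = new Thread(r, "fast-clock");
                t.setDaemon(true);
                return t;
            });

    public FastClockSketch() {
        scheduler.scheduleAtFixedRate(
                () -> cached = System.currentTimeMillis(),
                INTERVAL_MS, INTERVAL_MS, TimeUnit.MILLISECONDS);
    }

    public long getTime() {
        return cached;      // cheap volatile read, possibly slightly stale
    }

    public void shutdown() {
        scheduler.shutdownNow();
    }

    public static void main(String[] args) throws InterruptedException {
        FastClockSketch clock = new FastClockSketch();
        long maxDrift = 0;
        for (int i = 0; i < 100; i++) {
            maxDrift = Math.max(maxDrift,
                    System.currentTimeMillis() - clock.getTime());
            Thread.sleep(1);
        }
        clock.shutdown();
        // any bound asserted on this value must allow for INTERVAL_MS plus
        // the platform's timer granularity, not just the interval alone
        System.out.println("max observed drift (ms): " + maxDrift);
    }
}
```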


Michael


Re: Adding ProviderType and ConsumerType annotation to interfaces in exported packages

2014-04-24 Thread Jukka Zitting
Hi,

On Thu, Apr 24, 2014 at 4:48 AM, Chetan Mehrotra
chetan.mehro...@gmail.com wrote:
 Currently we have the following interfaces as part of exported
 packages [2]. Looking at the list, I believe most are of ProviderType,
 i.e. provided by Oak and not required by Oak API users.

Not really. Almost all of the interfaces we expose are various kinds
of extension points that can be used for plugging in custom
functionality. The fact that we provide a number of default
implementations doesn't exclude anyone from adding their own
components.

I think only the Oak API itself is something for which custom
implementations wouldn't be expected.

BR,

Jukka Zitting