Re: build svtools with debug

2012-09-27 Thread Mathias Bauer
On 26.09.2012 18:19, Armin Le Grand wrote:
 Index: Library_hatchwindowfactory.mk
 ===
 --- Library_hatchwindowfactory.mk   (revision 1389804)
 +++ Library_hatchwindowfactory.mk   (working copy)
 @@ -43,6 +43,7 @@
  tk \
  tl \
  vcl \
 +   stl \
  $(gb_STDLIBS) \
   ))
 
 @Pedro: Is this a solution? Maybe when using debug, extra stuff using 
 stl to verify things (remember data in a vector?) gets compiled. I do 
 not know how bad it is to always link against stl for 
 hatchwindowfactory, even without a debug build. What do you think?

Your assumption is a good one, I remember seeing this in other cases in
the past.

But there are other possible reasons. Using a template library might be
fine without linking against its binary part, if the compiler decides to
instantiate the templates in the library using the template. But
sometimes the compiler decides not to do that and requires external
linkage, so you have to link against the library. This may depend on
compiler settings in a very subtle way, so using DEBUG might be able to
trigger such a difference.

That's the beauty of templates. :-)

Especially with a compiler like the MSVC compiler that has some serious
defects in this regard (I once wrote a blog post about the joy of linking
against symbols that contain templates on GullFOSS. Those were the days...).

Regards,
Mathias

(who currently works with Objective-C and would love to have problems
like this instead of using a programming language that neither has
templates nor namespaces. But hey, at least it has closures. :-))


Re: Help! JunitTest_framework_unoapi.mk:28: *** Malformed target-specific variable definition. Stop.

2011-12-20 Thread Mathias Bauer

Hi,

sorry, but I don't have the Windows machine anymore where I did my OOo 
builds. Nowadays I'm only building on Linux. But you can follow my build 
instructions for make in the OOo wiki: just download the 3.81 source 
tarball, unpack it, configure, apply the patch and build.


Regards,
Mathias

On 20.12.2011 02:38, Zhe Liu wrote:

Thank you for the comment.
Could you share your make? Juergen said that he can provide space to store it.
I would prefer to use it directly. :-)

2011/12/20 Mathias Bauer <mathias_ba...@gmx.net>:

On 19.12.2011 23:23, Andrew Rist wrote:




On 12/16/2011 9:24 AM, Mathias Bauer wrote:


On 16.12.2011 03:43, Zhe Liu wrote:


Hi All,
My build always breaks because of this error when building on Windows XP. I
mentioned it before, but nobody responded. I did a little search and
found someone else who also encountered the problem. I still have no clue
how to resolve it.

JunitTest_framework_unoapi.mk:28: *** Malformed target-specific
variable definition. Stop.

To continue my build, I have to remove the lines related to JunitTest.
There are several modules with the same error. It's annoying to
work around them all. Could anybody help me?

Thanks.



What version of GNU Make do you use? 3.82 has a bug that makes GNU Make
spit out this error even when the variable definition is OK (and is
parsed without problems by 3.81).


I have run into this issue on Win7 as well. Cygwin's current version is
3.82.90 (thus hitting the 3.82 bug).
What is the best way to deal with this? Is this something that is
considered a bug by gmake, or is this a regression that will continue?
(It involves 'make -r' not working correctly, and I guess it does look
like a regression - not a new feature.)
If we can move to a new syntax without breaking any other platforms,
but also using current Cygwin packages - would that be the best
solution?

For now I will switch the instructions for the buildbot and call out the
use of make 3.81-2 (which is also available in current Cygwin).
At this point, this seems to be a solution for the build.



I reported the problem to cygwin several months ago:

http://cygwin.com/ml/cygwin/2011-02/msg00398.html

See also

http://wiki.services.openoffice.org/wiki/Build_Environment_Effort/Status_And_Next_Steps

At the end of the page you can see my instructions to build a custom version
of make. This gave me the best make on cygwin so far.

This page mentions a performance bug in make 3.82 that we also found on
cygwin. It's possible that this was a problem of the HEAD version at that
time.

Regards,
Mathias


Re: Build break due to CRLF in some scripts on Windows?

2011-12-19 Thread Mathias Bauer
On 15.12.2011 03:43, Zhe Liu wrote:
 I built successfully on Windows XP, even if not smoothly.  I
 encountered several errors like
 /cygdrive/d/aoosrc/aoo/main/svtools/JunitTest_svtools_unoapi.mk:28:
 *** Malformed target-specific variable definition.  Stop.
 dmake:  Error code 2, while making 'all'
 
 I rebuilt the module without changing anything, and then it proceeded. Weird.
 I configured with --without-junit. Why is JunitTest still made?

I think it's not built, but of course make must read the makefiles
before it can decide that their content doesn't need to be built.
Unfortunately, make goes haywire before it reaches that state.

Regards,
Mathias


Re: Help! JunitTest_framework_unoapi.mk:28: *** Malformed target-specific variable definition. Stop.

2011-12-19 Thread Mathias Bauer

On 19.12.2011 23:23, Andrew Rist wrote:



On 12/16/2011 9:24 AM, Mathias Bauer wrote:

On 16.12.2011 03:43, Zhe Liu wrote:

Hi All,
My build always breaks because of this error when building on Windows XP. I
mentioned it before, but nobody responded. I did a little search and
found someone else who also encountered the problem. I still have no clue
how to resolve it.

JunitTest_framework_unoapi.mk:28: *** Malformed target-specific
variable definition. Stop.

To continue my build, I have to remove the lines related to JunitTest.
There are several modules with the same error. It's annoying to
work around them all. Could anybody help me?

Thanks.



What version of GNU Make do you use? 3.82 has a bug that makes GNU Make
spit out this error even when the variable definition is OK (and is
parsed without problems by 3.81).

I have run into this issue on Win7 as well. Cygwin's current version is
3.82.90 (thus hitting the 3.82 bug).
What is the best way to deal with this? Is this something that is
considered a bug by gmake, or is this a regression that will continue?
(It involves 'make -r' not working correctly, and I guess it does look
like a regression - not a new feature.)
If we can move to a new syntax without breaking any other platforms,
but also using current Cygwin packages - would that be the best
solution?

For now I will switch the instructions for the buildbot and call out the
use of make 3.81-2 (which is also available in current Cygwin).
At this point, this seems to be a solution for the build.


I reported the problem to cygwin several months ago:

http://cygwin.com/ml/cygwin/2011-02/msg00398.html

See also

http://wiki.services.openoffice.org/wiki/Build_Environment_Effort/Status_And_Next_Steps 

At the end of the page you can see my instructions to build a custom 
version of make. This gave me the best make on cygwin so far.


This page mentions a performance bug in make 3.82 that we also found on 
cygwin. It's possible that this was a problem of the HEAD version at 
that time.


Regards,
Mathias


Re: Help! JunitTest_framework_unoapi.mk:28: *** Malformed target-specific variable definition. Stop.

2011-12-19 Thread Mathias Bauer

On 19.12.2011 18:02, Jürgen Schmidt wrote:


On 12/19/11 12:27 AM, Mathias Bauer wrote:

On 19.12.2011 00:19, Michael Stahl wrote:


On 16/12/11 18:24, Mathias Bauer wrote:

On 16.12.2011 03:43, Zhe Liu wrote:

Hi All,
My build always breaks because of this error when building on Windows XP. I
mentioned it before, but nobody responded. I did a little search and
found someone else who also encountered the problem. I still have no clue
how to resolve it.

JunitTest_framework_unoapi.mk:28: *** Malformed target-specific
variable definition. Stop.



What version of GNU Make do you use? 3.82 has a bug that makes GNU Make
spit out this error even when the variable definition is OK (and is
parsed without problems by 3.81).


AFAIR this is a problem that only happens on Cygwin (I speculate the
problem is triggered by the CLASSPATH that contains jar files with mixed
path notation, containing a colon).


Yes, and I wrote it because Zhe Liu had this problem on Windows. ;-)
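For illustration, a hypothetical minimal makefile in the spirit of this speculation (not the real JunitTest makefile): a target-specific variable whose value contains a colon, as a CLASSPATH with mixed Cygwin/Windows path notation would. GNU make 3.81 parses such a definition without complaint; the thread reports that 3.82 on Cygwin can reject it with exactly the "Malformed target-specific variable definition" error:

```make
# Hypothetical reproducer - the colon inside the value is the suspect part.
check : CLASSPATH := /cygdrive/c/foo.jar:C:/bar/baz.jar

check :
	@echo $(CLASSPATH)
```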


getting a make that works well on Windows is surprisingly difficult :-/


Indeed. The best I could get is the one I built myself in the way I have
documented in the OOo Building Guide in the old Wiki.

maybe you can provide your build somewhere ;-) how about
http://people.apache.org/~mba/gnumake/...

Or somewhere else where people can easily download it. Or send a version
to me and I will provide it somewhere.


Please see my reply to Andrew.

Regards,
Mathias


Re: Help! JunitTest_framework_unoapi.mk:28: *** Malformed target-specific variable definition. Stop.

2011-12-18 Thread Mathias Bauer
On 19.12.2011 00:19, Michael Stahl wrote:

 On 16/12/11 18:24, Mathias Bauer wrote:
 On 16.12.2011 03:43, Zhe Liu wrote:
 Hi All,
 My build always breaks because of this error when building on Windows XP. I
 mentioned it before, but nobody responded. I did a little search and
 found someone else who also encountered the problem. I still have no clue
 how to resolve it.

 JunitTest_framework_unoapi.mk:28: *** Malformed target-specific
 variable definition.  Stop.

 
 What version of GNU Make do you use? 3.82 has a bug that makes GNU Make 
 spit out this error even when the variable definition is OK (and is parsed 
 without problems by 3.81).
 
 AFAIR this is a problem that only happens on Cygwin (I speculate the
 problem is triggered by the CLASSPATH that contains jar files with mixed
 path notation, containing a colon).

Yes, and I wrote it because Zhe Liu had this problem on Windows. ;-)

 getting a make that works well on Windows is surprisingly difficult :-/

Indeed. The best I could get is the one I built myself in the way I have
documented in the OOo Building Guide in the old Wiki.

Regards,
Mathias


Re: Help! JunitTest_framework_unoapi.mk:28: *** Malformed target-specific variable definition. Stop.

2011-12-16 Thread Mathias Bauer

On 16.12.2011 03:43, Zhe Liu wrote:

Hi All,
My build always breaks because of this error when building on Windows XP. I
mentioned it before, but nobody responded. I did a little search and
found someone else who also encountered the problem. I still have no clue
how to resolve it.

JunitTest_framework_unoapi.mk:28: *** Malformed target-specific
variable definition.  Stop.

To continue my build, I have to remove the lines related to JunitTest.
There are several modules with the same error. It's annoying to
work around them all. Could anybody help me?

Thanks.



What version of GNU Make do you use? 3.82 has a bug that makes GNU Make 
spit out this error even when the variable definition is OK (and is parsed 
without problems by 3.81).


Regards,
Mathias


Re: ucpp sal dependency (was Re: OpenOffice nightly)

2011-12-14 Thread Mathias Bauer
On 13.12.2011 17:01, Jürgen Schmidt wrote:

 BTW: another fix for the current problem would be converting ucpp to
 gbuild.
 
 I would love to do that. Do you know if it's already possible to use 
 gbuild to apply our patch process and trigger a build on the patched 
 sources?

No, that would be the part to implement. In another life ;-) I already
started working on that, but didn't get very far.

Regards,
Mathias


Re: ucpp sal dependency (was Re: OpenOffice nightly)

2011-12-14 Thread Mathias Bauer
On 14.12.2011 22:44, Michael Stahl wrote:

 On 13/12/11 17:01, Jürgen Schmidt wrote:
 
 BTW: another fix for the current problem would be converting ucpp to
 gbuild.
 
 I would love to do that. Do you know if it's already possible to use 
 gbuild to apply our patch process and trigger a build on the patched 
 sources?
 
 not yet :(
 
 such an effort was not started during OOo times.

I worked on it for a few days somewhere in summer.

 
 there has been some work on this in LO but we don't yet build any external
 by default with gbuild...
 
 well, you could always use a CustomTarget to do it, but i guess it is very
 cumbersome that way...

Whether or not you do it with a CustomTarget, the nastiest problem was
getting clean dependencies - the essence of the gbuild idea. The pure
command execution seemed rather trivial compared to that.

Regards,
Mathias


Re: ucpp sal dependency (was Re: OpenOffice nightly)

2011-12-13 Thread Mathias Bauer

On 12.12.2011 15:09, Ariel Constenla-Haile wrote:


now on Windows:

link
/MACHINE:IX86
/IGNORE:4102
/IGNORE:4197
@C:/cygwin/tmp/mkM2vC8q
-safeseh
-nxcompat
-dynamicbase
-NODEFAULTLIB
-DEBUG
/SUBSYSTEM:CONSOLE
/BASE:0x1b00
-out:../../../../wntmsci12/bin/ucpp.exe
-map:../../../../wntmsci12/misc/ucpp.map

../../../../wntmsci12/obj/assert.obj
../../../../wntmsci12/obj/cpp.obj
../../../../wntmsci12/obj/eval.obj
../../../../wntmsci12/obj/hash.obj
../../../../wntmsci12/obj/lexer.obj
../../../../wntmsci12/obj/macro.obj
../../../../wntmsci12/obj/mem.obj
../../../../wntmsci12/obj/nhash.obj
msvcrtd.lib
uwinapi.lib SAL dependency
kernel32.lib
user32.lib
oldnames.lib
stlport_vc71_stldebug.lib


LINK : fatal error LNK1104: no se puede abrir el archivo 'uwinapi.lib'
(i.e., cannot open file 'uwinapi.lib')

dmake:  Error code 80, while making '../../../../wntmsci12/bin/ucpp.exe'
dmake:  Error code 255, while making
'./wntmsci12/misc/build/so_built_ucpp'


the solution for both errors on Linux and Windows seems to be

UWINAPILIB=$(0)


Yes, that's the unfortunate magic I mentioned, applied by the old build
system: uwinapi.dll is part of the standard set of libraries that our
dmake-based build system links automatically.


This automatic linkage against uwinapi.dll is a desperate measure. The
purpose of this library is to fix bugs in the Windows API by providing a
wrapper with functions that have the same names as the buggy ones.
Linking against that library before any Windows library is seen prevents
the buggy code from being called.


uwinapi.dll was definitely needed on Win9x (where it might also have
served as a Unicode wrapper, I'm not sure), but AFAIR also for some
Windows API functions on Win2000 and WinXP. The problem with
non-automatic linking against uwinapi.dll was that developers needed to
know which Windows API functions they use and whether any of them is
buggy on at least one OS version. Obviously there wasn't enough trust
that developers would be able to track that, so uwinapi.dll was linked
into everything on Windows.


Perhaps it is worth the effort to check how much of uwinapi is still
needed, because the only remaining platform that could benefit from it
would be WinXP - and possibly some of the bugs have been fixed in
service packs anyway. Getting rid of uwinapi would be nice.


BTW: another fix for the current problem would be converting ucpp to gbuild.


LIBSALCPPRT=$(0)


Interesting, I didn't know that such magic is active on Linux also
(though I suspected it).


Good catch!

Regards,
Mathias


Re: Native support of the SVG graphic format in Apache OpenOffice.org

2011-12-06 Thread Mathias Bauer
On 04.12.2011 21:27, Ariel Constenla-Haile wrote:

 On Sun, Dec 04, 2011 at 07:14:26PM +0100, eric b wrote:
 P.S.: I got Mac Intel versions for testing purposes. Contact me in
 private if you are interested. Windows builds should follow, but
 somebody seriously broke the build (probably yet another gnumake4
 mess)
 
 nobody has integrated anything from cws gnumake4. You can look at
 the history of the solenv/gbuild files; the changes there have been rather
 small. So your build is broken for other reasons.
 
 The last build breaker I've seen was found by Andrew, in ucpp. You can
 see it in a clean build if you do build --all in ucpp.
 On the other hand, I've experienced some random crashes with Cygwin's
 make, again unrelated to source code changes.

The original Cygwin make is version 3.81 and has a concurrency problem.
This was fixed in make 3.82, but that version is not available on
Cygwin. Besides that, 3.82 unfortunately seems to have another bug that
is revealed by building OOo with it. So I took the 3.81 source code,
applied the bug fix for the 3.81 crash, and the result worked as well as
make can work on Cygwin. I documented that in the Building Guide in the
old OOo Wiki.

Regards,
Mathias


Re: OpenOffice nightly

2011-12-06 Thread Mathias Bauer
On 04.12.2011 21:50, Ariel Constenla-Haile wrote:

 On Sun, Dec 04, 2011 at 12:28:59AM -0300, Ariel Constenla-Haile wrote:
 
 Hi Andrew,
 
 On Sat, Dec 03, 2011 at 06:55:19PM -0800, Andrew Rist wrote:
  The nightly[1] has been updated and we have both RAT[2] and logs[3] now.
  Any help fixing the build would be great.
  Andrew
  
  [1] http://ci.apache.org/builders/openofficeorg-nightly/
  [2] http://ci.apache.org/projects/openoffice/rat-output.html
  [3] 
  http://ci.apache.org/projects/openoffice/buildlogs/log/unxlngx6.pro.build.html
  
 
 /bin/bash: 
 /home/buildslave19/slave19/openofficeorg-nightly/build/main/solver/340/unxlngx6.pro/bin/makedepend:
  No such file or directory
 
 This module is missing a dependency on soltools (where makedepend is
 built). Try the attached patch.
 
 on a clean build I noticed it also tries to link against salcpprt, so you
 need to add sal to the dependency list.
 
 Regards

This would be a bad idea. External sources (even patched ones) should
not have dependencies on OOo sources. So it would be desirable to
replace any sal-related code in external source code with some native
code. Besides that, I fail to see where ucpp might use something from
sal: the patch file only creates a dmake makefile and leaves the
original sources untouched. Why should it have any dependencies on sal?

Regards,
Mathias


Re: OpenOffice nightly

2011-12-06 Thread Mathias Bauer
On 06.12.2011 21:35, Ariel Constenla-Haile wrote:

 yes, this looked quite strange, and I didn't notice it was trying to link
 against salcpprt until I tested a clean build with only one process. It
 seems building with multiple processes is a bit tricky, because the build
 tries to go on as far as it can. So you may try to rebuild the broken
 module after build.pl stops, and it may seem to build fine; in this
 case, sal happens to be built in parallel with that broken module, so you
 don't notice the dependency.

You perfectly described why we started to work on the new build system
based on a single process make in the first place. :-)

 So, ucpp does try to link against salcpprt.
 The switch seems to come from solenv/inc/unxlng.mk#226
 
 LIBSALCPPRT*=-Wl,--whole-archive -lsalcpprt -Wl,--no-whole-archive

That alone isn't wrong - the question is why LIBSALCPPRT is used as a
linker input at all. I can't find that anywhere in the makefile. Perhaps
it's one of the clever automatic dependencies that the old build system
added on its own, e.g. the automatic dependency on uwinapi.dll in
every Windows library that I once removed in a painful procedure.

Instead of adding a dependency on sal, I would rather recommend looking
for the place where this obviously wrong dependency is added. Maybe I
can find some time in the next few days.

Regards,
Mathias


Re: License of used 3rd party component Berkeley DB - clarification needed

2011-12-02 Thread Mathias Bauer
On 02.12.2011 16:08, Pedro Giffuni wrote:

 For replacements you probably don't want to go back
 to the version in FreeBSD ... I don't know if maybe
 sqlite which is rather small fits the bill[1].

sqlite is an excellent small and fast database. It would definitely fit
the bill. Worth an investigation.

Nevertheless, I would only look for a replacement for BDB in our help
storage. Using a binary format like BDB for storing the user extension
repository was a bad decision and shouldn't be continued.

Regards,
Mathias


Re: License of used 3rd party component Berkeley DB - clarification needed

2011-12-01 Thread Mathias Bauer

Moin Oliver,

http://www.apache.org/legal/resolved.html#category-x

especially mentions the Sleepycat License (this is the license of BDB). 
So there should be no doubt that it is not AL compatible.


According to

http://www.gnu.org/licenses/license-list.html

it is GPL compatible.

Regards,
Mathias

On 01.12.2011 12:15, Oliver-Rainer Wittmann wrote:

Hi,

I think I was somewhat too fast with my reaction regarding the license of
the used 3rd party component Berkeley DB.
Below you find the license text.
To me this license looks very permissive with regard to Apache's 3rd
party licensing policy.

I will submit a corresponding JIRA-legal issue to get this license
categorized as category A.


Here is the license text, found in the used source tarball [1]. It is
more or less the same as the license text found at [2] for the recent
version:
license text
/*-
* $Id: LICENSE,v 12.9 2008/02/07 17:12:17 mark Exp $
*/

The following is the license that applies to this copy of the Berkeley DB
software. For a license to use the Berkeley DB software under conditions
other than those described here, or to purchase support for this software,
please contact Oracle at berkeleydb-info...@oracle.com.

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
/*
* Copyright (c) 1990,2008 Oracle. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions
* are met:
* 1. Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
* 2. Redistributions in binary form must reproduce the above copyright
* notice, this list of conditions and the following disclaimer in the
* documentation and/or other materials provided with the distribution.
* 3. Redistributions in any form must be accompanied by information on
* how to obtain complete source code for the DB software and any
* accompanying software that uses the DB software. The source code
* must either be included in the distribution or be available for no
* more than the cost of distribution plus a nominal fee, and must be
* freely redistributable under reasonable conditions. For an
* executable file, complete source code means the source code for all
* modules it contains. It does not include source code for modules or
* files that typically accompany the major components of the operating
* system on which the executable file runs.
*
* THIS SOFTWARE IS PROVIDED BY ORACLE ``AS IS'' AND ANY EXPRESS OR
* IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
* WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, OR
* NON-INFRINGEMENT, ARE DISCLAIMED. IN NO EVENT SHALL ORACLE BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
* CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
* SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
* BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
* WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
* OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
* IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
/*
* Copyright (c) 1990, 1993, 1994, 1995
* The Regents of the University of California. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions
* are met:
* 1. Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
* 2. Redistributions in binary form must reproduce the above copyright
* notice, this list of conditions and the following disclaimer in the
* documentation and/or other materials provided with the distribution.
* 3. Neither the name of the University nor the names of its contributors
* may be used to endorse or promote products derived from this software
* without specific prior written permission.
*
* THIS SOFTWARE IS PROVIDED BY THE REGENTS AND CONTRIBUTORS ``AS IS'' AND
* ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE REGENTS OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
* OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
* HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
* LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
* OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
* SUCH DAMAGE.
*/
/*
* Copyright (c) 1995, 1996
* The President and Fellows of Harvard University. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions

Re: Moving ext_src to Apache

2011-11-28 Thread Mathias Bauer
On 30.10.2011 19:49, Mathias Bauer wrote:

 Moin,
 
 thinking a bit about what would be the best to do I would like to sort
 all tarballs into several categories:
 
 (1) external source tarballs with an AL compatible license
 (2) external source tarballs with weak copyleft license
 (3) external source tarballs with strong copyleft license

I decided to make things simple. So I just checked in the source
tarballs for (1) and (2), means sources with category A and B license.
You will find them in the new folder

trunk/ext_sources

Basically we should be able to use this folder in the build once we have
solved the remaining problems (see below). Luckily, this is the default
location where the OOo build looks for external tarballs if no parameter
was set in configure.

So I made an attempt to build with it (Ubuntu 11.10 64 Bit) and nearly
got through the build:

(1) ./configure (no parameters!)
(2) *no* bootstrap (I took dmake from the system and skipped the
fetch_tarballs that way)
(3) build

I ran into problems in only three cases:

(1) Berkeley DB was missing in two places
(2) Rhino was missing swingExtSrc (this was discussed on the list already,
maybe we have a solution to fix that?)
(3) I then provided the missing tarballs temporarily and continued with
the build. Then I got a problem in instsetoo_native (building of deb
files didn't work, perhaps the system epm doesn't work here, I heard
that others were more successful). But the archive install set was
built and worked!

That looks as if we are close to a build without external stuff with
strong copyleft license.

We still have to define how to deal with the weak copyleft stuff. I
found the following Category B licenses: MPL, CPL, CDDL. As we still
haven't got a definitive statement on whether we are allowed to keep
them in svn, provided that they are not built by default and are not
part of our source releases, there seems to be a common agreement to
keep them for now and possibly remove them later. Even if we had to
remove them, we now have an intermediate step that moves us a bit
forward, IMHO.

At this time my checkin does not change anything for the build, but you
can try it by providing a dmake instance beforehand and not calling
bootstrap, as described above.

Now we can start from there and change our build so that it uses this
folder by default.

Regards,
Mathias



Linux Build breaks in comphelper (Ubuntu 11.10, gcc 4.6.1)

2011-11-26 Thread Mathias Bauer

Hi,

are there any recent changes in our build system that haven't been done 
for unxlngx6 in solenv/inc/gbuild?


When I try to build, comphelper can't link. There are so many symbols 
missing that I assume the libraries are just not found. I remember some 
changes around the library prefix/postfix stuff. Perhaps unxlngx6 was 
forgotten in gbuild?!


Regards,
Mathias


Re: GPL'd dictionaries (was Re: ftp.services.openoffice.org?)

2011-11-25 Thread Mathias Bauer

On 25.11.2011 10:38, Andre Fischer wrote:

Hi Mathias,


On 24.11.2011 18:04, Mathias Bauer wrote:

Just a dumb question: why do we think that the dicts are source code?
At least those without patches are distributed without any treatment.
We just package them. So where is the difference between an MPL
library and an MPL .dic file? Just the extension and the encoding of
its content.


When we are just packaging them, then why not just provide the ready-made
extensions and either bundle them or place them in the extension
repository? Which, by the way, already contains more spell-checking
extensions than the dictionaries module.


Mainly because we package some content from our side together with the 
dic files. That must be done somewhere, and until now it was desired 
that this somewhere is inside the OOo build process.


Treating MPL dics as source files would require doing the packaging of 
the dictionary extensions elsewhere, in the same way as building the MPL 
binaries must be done elsewhere.


I'm not convinced that this effort is needed; it is possible that we 
don't need to treat dic files as sources in the same way as e.g. C++ 
files. Again, it was just a suggestion, some food for thought.


Regards,
Mathias


Re: GPL'd dictionaries (was Re: ftp.services.openoffice.org?)

2011-11-25 Thread Mathias Bauer
On 25.11.2011 15:16, Andre Fischer wrote:

 On 25.11.2011 14:59, Mathias Bauer wrote:
 On 25.11.2011 10:38, Andre Fischer wrote:
 Hi Mathias,


 On 24.11.2011 18:04, Mathias Bauer wrote:
 Just a dumb question: why do we think that the dicts are source code?
 At least those without patches are distributed without any treatment.
 We just package them. So where is the difference between an MPL
 library and an MPL .dic file? Just the extension and the encoding of
 its content.

 When we are just packaging them, then why not just provide the ready
 made extensions and either bundle them or place them on the extension
 repository. Which, by the way, already contains more spell-checking
 extensions than the dictionaries module?

 Mainly because we package some content from our side together with the
 dic files. That must be done somewhere, and until now it was desired
 that this somewhere is inside the OOo build process.
 
 But what about the extensions in the repository that do not have a 
 counterpart in the dictionaries module? They have to come from 
 somewhere, too. I know that it is a frequently used pattern in OOo to 
 have at least two ways of doing things. Maybe we can use this occasion 
 to remove one?

If only it were that easy.

Having dictionaries in the code repository was a way to make sure that
we can provide them even if no one else maintains the dic files and
packages them as extensions.

I agree that if we had a safe home for all the dictionary extensions, it
would be easier to use the packages provided from there. The question
remains whether there is such a safe home.

Regards,
Mathias


Re: GPL'd dictionaries (was Re: ftp.services.openoffice.org?)

2011-11-24 Thread Mathias Bauer
Just a dumb question: why do we think that the dicts are source code? At least 
those without patches are distributed without any treatment. We just package 
them. So where is the difference between an MPL library and an MPL .dic file? 
Just the extension and the encoding of its content.

Regards 
Mathias

On 24.11.2011 at 15:57, Ariel Constenla-Haile <arie...@apache.org> wrote:

 On Thu, Nov 24, 2011 at 06:29:42AM -0800, Pedro Giffuni wrote:
 Hunspell is still the main spellchecker in AOO but we
 cannot ship the italian dictionary and even the MPL
 dictionaries have to be removed from the repository.
 
 Exactly, what do you mean by saying "You can go ahead and
 kill hunspell from the tree"?
 
 We are not allowed to ship copyleft (strong or weak) in
 source releases so the same rules about not download+patching
 copyleft apply to hunspell.
 
 Unless I misunderstood something?
 
 https://cwiki.apache.org/OOOUSERS/ipclearance.html
 Task 1: Clarify legal usage of Category B (eg MPL) libraries
 
 Binary builds of libraries can be shipped with binary release of AOO.
 Source code of libraries can remain on an Apache server (like
 the ext_sources of old OOo),
 BUT 
 *  source code of libraries is not shipped in a source release of AOO
 *  instead it can be downloaded and built during bootstrap, but only when
   developer uses a configure option that is off by default
 
 [end of quote]
 
 that's why rev. 1204995
 http://svn.apache.org/viewvc?view=revision&revision=1204995
 introduces:
 --enable-hunspell - off by default
 --enable-hyphen   - off by default
 
 
 * Category B sources are not included
 * Using system/building Category B libraries is off by default
 
 Regards
 -- 
 Ariel Constenla-Haile
 La Plata, Argentina


Re: GPL'd dictionaries (was Re: ftp.services.openoffice.org?)

2011-11-24 Thread Mathias Bauer

Am 24.11.2011 um 19:50 schrieb Pedro Giffuni p...@apache.org:

 Hi Mathias;
 
 I wondered the same when I suggested they should be treated
 as documentation, and still it classifies as weak copyleft.
 
 --- On Thu, 11/24/11, Mathias Bauer mathias_ba...@gmx.net wrote:
 
 Just a dumb question: why do we think
 that the dicts are source code? At least those without
 patches are distributed without any treatment. We just
 package them. So where is the difference between an MPL
 library and an MPL .dic file? Just the extension and the
 encoding of its content.
 
 
 You mean like just tar them all and put them with the binary
 release?

Yes. Even packaging as extension and deploying these packages as part of a 
binary release does not look fundamentally different than bundling a library.

Regards,
Mathias

 


Re: GPL'd dictionaries (was Re: ftp.services.openoffice.org?)

2011-11-24 Thread Mathias Bauer


Am 24.11.2011 um 20:46 schrieb Pedro Giffuni p...@apache.org:

 Hi Mathias;
 
 --- On Thu, 11/24/11, Mathias Bauer mathias_ba...@gmx.net wrote:
 
 
 You mean like just tar them all and put them with the
 binary
 release?
 
 Yes. Even packaging as extension and deploying these
 packages as part of a binary release does not look
 fundamentally different than bundling a library.
 
 
 I think this is a perfectly viable solution.
 
 The only issue is what type of maintenance are
 we planning to do on this. I had suggested
 Apache Extras as a point of encounter for
 contributors as we can't maintain this directly,
 but perhaps this is something that doesn't get
 updated very much or perhaps the real maintainers
 can handle this on their own (as extensions).
 
 Pedro.

We don't need to change the dic files; we just take them as they are.
 
 
 Regards,
 Mathias
 
 
 


Re: GPL'd dictionaries (was Re: ftp.services.openoffice.org?)

2011-11-24 Thread Mathias Bauer
Am 24.11.2011 21:27, schrieb Dennis E. Hamilton:

 Simple point: Something is category B because someone with the
 authority to do so put a category B license on it.  It doesn't matter
 what it is or how wrong-headed they were to do that.
 
 More complicated: It is important to understand the principle behind
 how category B material is handled the way it is in binaries.  It is
 about not having users commit errors with regard to the licensing of
 some material and making it difficult to innocently violate the
 applicable license.  
It seems that you don't get the point. I just wanted to mention that the
dictionary files we have in svn can be seen as an *end product* and so
probably(!) are more comparable to a binary file than to a source
file. This would leave the option that we could even ship them with a
source release. I don't say that this is a fact; I just wanted to point
out a possibility that is worth investigating.

Regards,
Mathias


Re: GPL'd dictionaries (was Re: ftp.services.openoffice.org?)

2011-11-24 Thread Mathias Bauer
Am 24.11.2011 22:43, schrieb Pedro Giffuni:

 That is exactly *your* point of confusion here. One
 of our mentors stated we cannot have infringing code
 in SVN at the time we graduate. (You had this pretty
 wrong with dmake which is GPL but it also applies to
 MPL). I really think you should add links to the mail
 archive in the Wiki, BTW.
 
 I don't see why you want to carry code that will not
 be in our code release but IP clearance is for
 everything in SVN.

As I see it, the situation wrt. MPL code in our svn repo is unclear. I
asked about that several times, but no one replied. So obviously no one
has the answer now.

I agree with you that keeping MPL code in our repo might be wrong. But
OTOH investing time into throwing it out now and discovering later on
that this wasn't necessary isn't a nice perspective either.

We don't plan to graduate tomorrow, so this leaves us time to check this
important point more carefully. In the meantime IMHO we don't create a
problem if we keep the MPL sources in svn for now and only make sure
that the process that creates a source release does not include them. We
can still remove them later; for the time being we need to identify them
and think about possible alternatives.

Regards,
Mathias



Re: [Code] strategy for child works spaces

2011-11-23 Thread Mathias Bauer
Am 23.11.2011 19:40, schrieb Ariel Constenla-Haile:

 HI eric, *
 
 On Wed, Nov 23, 2011 at 05:16:35PM +0100, eric b wrote:
 FWIW, Michael Stahl had these CWSs in the pipeline, I hope
 he or someone else finds the time to merge them into some
 branch:
 
 
 gnumake4
 
 
 I started working on this one. I took the apply-per-commit approach
 (I tried one big diff first, but it was very error-prone). There are
 ca. 180 commits;
 maybe I'll finish by next Monday dedicating 2 hrs per day.
 
 
 To be sure : is the plan to use gmake for the whole build ?
 
 the original plan is to replace the old build.pl/dmake build system 
 with a new GNU make based one (there was a blog post on GullFOSS, now
 dead, explaining that).
 
 So far, with the commits I've applied, gnumake4 converts the following 
 modules to gbuild:
 
 basebmp
 basegfx
 canvas
 cppcanvas
 idl
 linguistic
 sax
 starmath
 ucbhelper
 unotools
 wizards
 xmlreader
 xmlscript
 
 not yet applied, but seen in that cws too:
 
 dbaccess
 oox
 reportdesign
 writerfilter

IIRC a cws that brings writerfilter to gbuild was already merged into
gnumake4.

Regards,
Mathias


Re: [Code] strategy for child works spaces

2011-11-21 Thread Mathias Bauer
On 20.11.2011 12:22, Christian Lohmaier wrote:
 Hi Eric, *,

 On Sun, Nov 20, 2011 at 10:12 AM, eric beric.bach...@free.fr  wrote:
 Le 19 nov. 11 à 22:55, Mathias Bauer a écrit :
 Am 19.11.2011 15:22, schrieb Pedro Giffuni:

 I still prefer the conversion of a cws in single diffs, each one
 representing a single commit.

 Me too.  That's the most efficient way to integrate a cws.

 No, it is not. A cws can be long lived, could have undergone multiple
 rebases against the current tree, so there are lots of commits that refer
 to an old version of the code and thus won't apply anymore.
 Unless you want to do lots of detective work and redo all the merging
 work that the author of the cws already did over the course of
 time, trying to apply a cws by its individual commits is a waste of
 time, not to mention that it is

 The way to apply a cws to a different version-control system is to
 create a diff against the current milestone the cws is based on and try
 to apply that one (feel free to create a diff for each module). This
 will give way fewer conflicts and is much faster. You lose history,
 but that was your decision when converting the repo to begin with.

As I tried both methods on some cws already, I don't recommend the "one
diff per cws" method, except in one single case (see below).

You will get merge conflicts in both ways, but by turning a cws into
single diffs you will get smaller merges, and especially you will get
many merges that someone already did. Looking on the merge commits in
Mercurial will help you with the conflict resolution.

Applying huge diffs in code you don't know perfectly is hard. Applying
diffs that each represent a single change with better defined meaning
and intent should be easier and less error-prone. It might take a little
bit longer, but the result will be better and the developers carrying
out the integration can learn something about the code.

Of course, if a cws isn't well organized, with commits of manageable size
and at least somewhat helpful comments, working with single-commit
diffs would suffer from the method's disadvantages without its
advantages - so applying a single diff and trying hard might be more
efficient. But you will lose the cws history then, and I still think we
should preserve it when possible (it wasn't possible for the stuff
already integrated, unfortunately, but that's no reason to lose
everything else too).

Regards,
Mathias


Re: [CODE] issue 118576: Crash on close

2011-11-19 Thread Mathias Bauer
Am 08.11.2011 23:40, schrieb eric b:

 Hi Dennis,
 
 Le 8 nov. 11 à 22:45, Dennis E. Hamilton a écrit :
 
 I concur on reverting the patch.
 
 
 Me too : Andre explained everything and convinced me that the issue  
 needs to be fixed correctly.
 
 Though, to my eyes, there is one big issue remaining : I'd like to  
 study what has been implemented (the full diff) when the new  
 configmanager was introduced, and I still don't know how to proceed :  
 sb111 (and other associated cws's) are no longer available (any  
 suggestion is welcome).

Though it's a bit late, here's a suggestion:

sb111 was integrated some time ago, but you could try to get its changes
from the check-in comments. If Stephan did it as expected, the comments
should contain "sb111".

You could also run "hg serve" on a local Mercurial repo of ooo340 and
use the graphical visualization to identify the changes.
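
The check-in-comment approach can be sketched as a small filter over a
revision log. The log format and entries below are invented for
illustration; a real run would consume the output of "hg log" instead:

```python
def find_cws_commits(log_lines, cws_name):
    """Return (rev, comment) pairs whose check-in comment mentions cws_name."""
    hits = []
    for line in log_lines:
        # Each invented entry is "<rev>:<check-in comment>".
        rev, _, comment = line.partition(":")
        if cws_name in comment:
            hits.append((int(rev), comment.strip()))
    return hits

# Placeholder log entries, not real ooo340 history.
sample_log = [
    "1201:sb111: rework configmanager backend",
    "1202:fix typo in readme",
    "1203:sb111: remove obsolete registry code",
]

print(find_cws_commits(sample_log, "sb111"))
# [(1201, 'sb111: rework configmanager backend'), (1203, 'sb111: remove obsolete registry code')]
```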

Regards,
Mathias


Re: Anybody interested in TestTool? I wrote a Java lib to replace it.

2011-11-19 Thread Mathias Bauer
Am 18.11.2011 13:45, schrieb Joost Andrae:

 Hi,
 
 I prefer to have tests from several tiers: firstly, a UI test 
 framework that is written and used by testers who are not that 
 experienced in writing tests in whatever programming language (BASIC is 
 not a must-have, but it needs to be as easy as LOGO turtle graphics), 
 mostly to assure the quality of what has already been tested; secondly, 
 API-based tests to stress-test the applications' programming 
 interface; thirdly, unit tests that are instantiated directly 
 from within the C++ or Java source code to detect common coding bugs; 
 and lastly, manual tests, because UI-based programs often show 
 bugs when a real person uses the application (like the several bugs we 
 found in the past that would never have been found if we had used the 
 automation-based tests only).

The language is of secondary importance. The quality of the tool is what
matters, and here the vcl testtool leaves a lot to be desired. And, as I
wrote, we can't improve it, fix bugs or add code for new requirements
(e.g. parallel test execution to utilize multi-core computers, removal
of frustrating waits etc.) because we don't have the source code.

 For me the VCLTesttool is dead. We shouldn't invest time there if we had
 an alternative GUI testing option with available source code. So we
 definitely should give the Java lib a try, if it could be made available
 under a suitable license.
 
 For me VCL testtool is soumething from the past but it is still needed 
 until we have a replacement.

I beg to differ. The test tool is for finding bugs in the UI code; for
everything else it's just a waste of time. As long as we don't do heavy
UI rewrites, writing integration tests is far more effective.

The offer from Zhe Liu looks much more appealing than trying to revive
the vcl testtool zombie.

Regards,
Mathias


Re: [Code] strategy for child works spaces

2011-11-19 Thread Mathias Bauer
Am 18.11.2011 19:26, schrieb Pedro Giffuni:

 Hi Eric;
 
 --- On Fri, 11/18/11, eric b eric.bach...@free.fr wrote:
 
 Disclaimer:  this list is not
 easy to read, and if the topic was already discussed. In
 this case, thanks in advance to provide me the right link
 :-)
 
 Hi,
 
 I perfectly know the importance of the IP clearance, but in
 parallel, we'll need to work on the code, say partialy
 (e.g. vcl + sd + slideshow only). In OOo we used child work
 spaces in that purpose, but I'm not sure we defined
 something similar yet with Apache OpenOffice.org.

 
 I personally don't understand well how those CWSs worked
 or how they are integrated. I would personally prefer if
 just use SVN branches exclusively from now on.
 
 I think OOo has not really used this historically, but in
 other projects developers have their own branches and can
 do work without disrupting the main tree.

We have used them. At that time they were a PITA, as updating them from
the master or integrating them into it was very time-consuming and
required a lot of manual merging work. That worked much better with
Mercurial, but that's not an option now, I know.

We probably might want to reserve work on branches for larger code
changes that require weeks or months, but integrate smaller changes
directly into the master in a CI style with daily builds.

Regards,
Mathias


Re: [Code] strategy for child works spaces

2011-11-19 Thread Mathias Bauer
Am 19.11.2011 15:22, schrieb Pedro Giffuni:

 I think we could use a SVN branch as a buffer to integrate
 CWSs one by one; that way we don't interrupt current work
 and get to try the CWSs before the tree changes too much
 to make merging difficult.
 
 Creating branches is very easy and any committer can do it.
 Is there a way to get CWSs as diffs?

You mean the existing ones from hg?

We discussed that some weeks ago. I still prefer the conversion of a cws
in single diffs, each one representing a single commit. It seems to be
doable for most cws. For the ones we integrated, the diffs have been
applied to the trunk, there's nothing that prevents that they could be
applied to branches.

Regards,
Mathias


Re: Anybody interested in TestTool? I wrote a Java lib to replace it.

2011-11-18 Thread Mathias Bauer

On 16.11.2011 08:21, Raphael Bircher wrote:

Hi Liu

Am 16.11.11 03:29, schrieb Zhe Liu:


Hi all,

Vcl TestTool is used for automated GUI testing of OpenOffice.org, but it
has many drawbacks: too many errors, and it is difficult to debug,
maintain and execute in parallel. I found many people mentioned this
problem. LibreOffice has abandoned the tool. But I think automated GUI
testing is still valuable: it can test the product more like the actual
users do.

The problems with the VCLTestTool are well known. But writing a new one
is not trivial either. It needs a load of work to bring a new tool to
productive use. Well, we can maybe use the old test scripts as a guide
for new test scripts. But the VCLTestTool as is works fine, and you can
find errors really quickly - mainly non-working dialogs, crashes and
freezes. The VCLTestTool is not as buggy as many people say. But yes, it
needs additional work to become a productive tool for the community.


And here's the problem: to work on the testtool, you need the source 
code. And the source code is not available, at least not here at Apache. 
It is based on a very old version of OOo's source code that was written 
years ago. To build it, you will need large parts of the OOo source code 
in that old revision - not a good idea; moreover, it's totally unclear if 
this old revision is covered by Oracle's code grant. At least it would 
require an additional IP review.


Of course some testtool source code also exists in the current release, 
but it has been unused and untested for years. I remember that even the 
latest larger code rework in that area (help IDs becoming Strings 
instead of integer numbers) was not done for the testtool, only for the 
testtool library that is loaded in OOo when tests are executed. Instead, 
larger parts of the testtool source code were just commented out to 
please the compiler ("nobody needs that code, why invest time into 
it"). Sigh.


Besides that, the available Linux version of the testtool does not run 
on at least Ubuntu 64bit (with 32bit libs installed), and probably not 
on other Linux versions either. Sooner or later it won't run on more 
and more Linux and probably MacOS versions.


For me the VCLTesttool is dead. We shouldn't invest time there if we had 
an alternative GUI testing option with available source code. So we 
definitely should give the Java lib a try, if it could be made available 
under a suitable license.


Until then we should concentrate on writing integration tests in C++ or 
Java. We already have some of them (called complex tests) and they 
have proven to be the most effective tool for bug hunting that we have.


Some reasons why they are more helpful, at least ATM:

- they run much faster than the UI based tests
- they can be executed in parallel (as much as you want)
- the gbuild build system supports them, so they can be automated 
without additional tooling (no test launcher necessary)
- they are nearer to the code they test, so the root cause of a bug 
usually is found faster
- they don't require to learn a strange Basic dialect, developers can 
write them in the same language that they use daily
- it's common sense that testing through the UI should be done only if 
you want to test the UI itself, but not the code behind it


Regards,
Mathias


Re: Is it time for a build machine?

2011-11-15 Thread Mathias Bauer
As much as I liked the old build bots and used them a lot, the whole setup and 
their overall usage was far from what Hudson/Jenkins can deliver.

So I maintain my point that there is nothing we can use and starting with a new 
setup and new hardware is better. 

YMMV

Regards,
Mathias

Am 15.11.2011 um 22:12 schrieb Christian Lohmaier cl...@openoffice.org:

 Hi *,
 
 On Tue, Nov 15, 2011 at 6:17 PM, Mathias Bauer mathias_ba...@gmx.net wrote:
 On 15.11.2011 04:36, Pedro Giffuni wrote:
 
 Just wondering,
 
 Perhaps the older OOo at SUN/Oracle also had some
 setup for hudson/jenkins that we could reuse?
 
 No, unfortunately OOo never embraced continuous integration and all the
 other wonderful things you can build around it.
 
 There have been both tinderbox as well as buildbot available and in
 use in the OOo project.
 Tinderbox did keep track of commits, did flag build-results as dirty
 when there were commits after the last build started, and thus allowed
 rebuilding when a cws was touched, and (some) buildbots were
 autotriggered by watching the commit-mailinglist, so they as well did
 built whenever the code was changed.
 That the build-results have often been ignored by the corresponding
 developers is a different story. But stating that there was no such
 thing is, well, typical.
 
 ciao
 Christian


Re: [DISCUSS]: new home for pre-built unowinreg.dll

2011-11-10 Thread Mathias Bauer

On 09.11.2011 13:43, Jürgen Schmidt wrote:

On 11/5/11 1:30 PM, Mathias Bauer wrote:

Am 01.11.2011 14:15, schrieb Jürgen Schmidt:


Hi,

for all unix builds it is possible to use a pre-built unowinreg.dll that
is used in the SDK for Java client applications.

Background:
This dll contains some glue code that helps to find a default office
installation on windows. This is used to bootstrap an UNO environment
and establish a remote connection to an existing or new office instance
from the Java client application that triggers this code.

It is possible to cross-compile this dll with mingw in some way, but not
really necessary. It was always possible to download a pre-built version
and include it in the SDK on all platforms except Windows, where it
is always built.

I would suggest that we store this pre-built dll somewhere to ensure
that this mechanism can be used or will work in the future as well.

The URL to download the pre-built version is
http://tools.openoffice.org/unowinreg_prebuild/680/unowinreg.dll

The code is part of the odk module and is quite simple. That means it
can always be checked what's in the dll. We can apply an md5 checksum to
ensure that no manipulated dll is downloaded.

Any ideas where we can store this dll in the future?


In the build the unowinreg.dll is expected to be in external/unowinreg.
Usually the developer needs to copy it there. We could just check it in
there in case we wanted to stick with the binary.

I think it is not allowed to check in binaries in the source tree, at
least pre-built ones. I would be happy with this solution because it was
the solution we had at the beginning ;-)


IIRC the reason why we gave that up wasn't legal or technical; it was 
just the religious belief of a single contributor that everything must 
be built from source.


While I agree with that belief on a general level, I also think that 
allowing exceptions makes sense when the effort to follow it becomes too 
high.


As we already agreed on possibly keeping MPL binaries in our svn, why 
shouldn't we keep a binary that was built from our own source, just 
on a different OS?
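
As an aside, the md5 guard suggested earlier in the thread could be
sketched like this; the file content and digest below are throwaway
placeholders, not the real unowinreg.dll values:

```python
import hashlib
import tempfile

def verify_md5(path, expected_md5):
    """Return True if the file's md5 digest matches the expected hex digest."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        # Read in chunks so large binaries don't need to fit in memory.
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_md5

# Demo with a throwaway file; in a real build check, the path would be
# the downloaded dll and the digest a value recorded when it was produced.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"not the real dll")

print(verify_md5(tmp.name, hashlib.md5(b"not the real dll").hexdigest()))  # True
print(verify_md5(tmp.name, "0" * 32))  # False
```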


Regards,
Mathias


Re: Report Builder extension (was Re: [proposal] development for the first AOO release)

2011-11-10 Thread Mathias Bauer
Am 10.11.2011 16:22, schrieb Pedro Giffuni:

 Hi Ariel;
 
 I honestly have no idea how this works: can we turn this
 into an uno extension, and make it available in the
 extensions site?

The report builder *is* an extension.

Or are you talking about the jfree jars alone? I'm afraid that they are
not used via UNO but plain Java APIs, so there is no chance.

Regards,
Mathias


Re: agg and epm are still in svn repo.

2011-11-10 Thread Mathias Bauer
Am 10.11.2011 16:52, schrieb Jürgen Schmidt:

 ok, drop "counterproductive", but I still don't understand why you 
 checked it in at all. The update, if necessary, could have been done at a 
 later time as well.

You seem to misunderstand what Pedro did. agg was always part of the
source tree. IIRC Pedro tried to remove it, but then canvas couldn't be
built anymore. So he at least updated it as much as possible. IMHO a
good idea.

Regards,
Mathias


Re: [code] main/dmake

2011-11-09 Thread Mathias Bauer

Hi Ross,

I'm still a little bit confused about what "license incompatible code" 
means here.


In its exact wording MPL code *is* incompatible, as only the binaries 
are allowed to be in an Apache release. Does that mean that we must not 
have MPL source code in our svn?


The link

http://www.apache.org/legal/resolved.html

does not answer this question:

Software under the following licenses may be included in binary form 
within an Apache product if the inclusion is appropriately labeled:


As binary releases contain binaries anyway, I assume that "Software" 
means the source code. The cited statement leaves it open whether that 
means released source tarballs or svn.


Maybe the following link helps:

http://www.apache.org/dev/release.html#release-types

What Must Every ASF Release Contain?

Every ASF release must contain a source package, which must be 
sufficient for a user to build and test the release provided they have 
access to the appropriate platform and tools.


This rule can be followed by providing a source tarball as part of the 
binary release that contains the same MPL binaries as the product, but 
not the source code. It does not explicitly forbid MPL source code (that 
is used to build the binaries we deliver in source tarball and product) 
in our svn repo. But I failed to find a quote that explicitly *allows* it.


Can you shed some light on this?

Regards,
Mathias

On 05.11.2011 12:27, Ross Gardler wrote:

In order to graduate there can be no license incompatible code in
SVN. The solution below is ok only as an interim solution.

Sent from my mobile device (so please excuse typos)

On 4 Nov 2011, at 15:38, Oliver-Rainer
Wittmannorwittm...@googlemail.com  wrote:


Hi,

our build tool dmake is licensed under GPL. Thus, it can not be
part of our source releases. But, we can use it for building - as
we are using the gcc compiler.

Thus, I will move the dmake source folder from .../ooo/trunk/main/
to new folder .../ooo/buildtools/ in order to assure that
everything under .../ooo/trunk/ can become part of our source
release.

In order to keep our bootstrap process working it needs some
adaptation: I am planning to introduce a configure option to
manually provide the path to the source folder of the build tool
dmake - something like --with-dmake=<path to dmake folder>. If this
option is not used, the default path ../../buildtools/dmake/ -
relative to the folder main - will be taken. configure will then
check if this folder exists - the manually given one or the default.
The bootstrap process will then use this path to create the
build tool dmake.

Any objections?


Best regards, Oliver.






Re: GPL'd dictionaries (was Re: ftp.services.openoffice.org?)

2011-11-08 Thread Mathias Bauer
On 07.11.2011 20:10, RGB ES wrote:

 At this point, IMHO, the best solution will be to deliver an Apache
 binary *without* dictionaries and put on the download page a huge button:

 |Download dictionaries   |
 |(External site)  |

 A clear note telling users something like:

 OOo releases do not include dictionaries because of license problems, you
 will need to download them separately

 will be enough, I think. As a matter of fact it is not a new situation.
 Just my 2¢

On Linux we can even use the Hunspell dictionaries that are installed
(or installable) in the system. It only takes a single configuration
file in the build to make that happen (at least the last time I used
that feature).

We need to offer downloads only for the other platforms, which don't
allow installing Hunspell dictionaries into a defined location in the
system.

Regards,
Mathias


Re: Hunspell dictionaries are not just words lists (+ other matters)

2011-11-08 Thread Mathias Bauer
On 07.11.2011 12:37, Pedro Giffuni wrote:
 For the record,

 I respect that this type of work takes a *lot* of time and
 hard work, and that people do have the right to make their
 work copyleft.

 There is however, for practical purposes, a huge difference
 for us between MPL/LGPL (the french case) and GPL-only (the
 italian case).

More precisely (as the useless discussion started in this thread
distracted from the real topics):

Apache OOo could include even Hunspell dictionaries under (L)GPL from a
legal perspective, as according to the FSF packaging dictionaries into
an application does not make this a derivative work and so the
application that packs the dictionary does not need to follow the same
license as the dictionary. This allowed us to use GPLed dictionaries in
the past in our LGPLed office application.

But from the Apache perspective we can only package dictionaries
released under compatible licenses, including MPL. And in the latter
case we can't provide the sources for them in our svn repository. This
is not enforced by the copyright of the dictionaries, but by the Apache
rules, as far as I understood. But at the end that doesn't make a big
difference in practice.

Regards,
Mathias

(Who thinks that it doesn't matter if the copyright of a dictionary can
be enforced in court or just applies because we respect the will of the
author that he has expressed by choosing a particular license)


Re: GPL'd dictionaries (was Re: ftp.services.openoffice.org?)

2011-11-08 Thread Mathias Bauer
Am 08.11.2011 02:25, schrieb Ariel Constenla-Haile:

 OOo has three kind of linguistic components:
 * spell checker
 * hyphenator
 * thesaurus
 
 OOo provides a default implementation for the three by means of
 Hunspell:
 
 1) hunspell - LGPLv2+ or GPLv2+ or MPLv1.1
 
 http://hunspell.cvs.sourceforge.net/viewvc/hunspell/hunspell/COPYING?view=markup
 2) hyphen - LGPLv2+ or MPLv1.1
 
 http://hunspell.cvs.sourceforge.net/viewvc/hunspell/hyphen/COPYING?view=markup
 3) mythes - BSD
 
 http://hunspell.cvs.sourceforge.net/viewvc/hunspell/mythes/README?view=markup#l38
 
 (3) with the BSD license is compatible with the Apache License
 http://www.apache.org/legal/resolved.html#category-a
 
 (1) and (2) with the MPLv1.1 are weak copyleft, this means both
 libraries may be included in binary form within an Apache product if
 the inclusion is appropriately labeled
 http://www.apache.org/legal/resolved.html#category-b
 
 And this is what actually happens with the three libraries: ./bootstrap
 fetches the compressed files, and the libraries are built in
 
 trunk/main/hunspell
 trunk/main/hyphen
 trunk/main/mythes
 
 Look at the folders:
 http://svn.apache.org/viewvc/incubator/ooo/trunk/main/hunspell/
 http://svn.apache.org/viewvc/incubator/ooo/trunk/main/mythes/
 http://svn.apache.org/viewvc/incubator/ooo/trunk/main/hyphen/
 
 there are only patches and makefiles, no copyleft source.
 AOOo can keep Hunspell based linguistic components, there is no need to
 remove anything, it is already IP-clean.
 A source release will never include the sources for these components,
 as it never did if you downloaded an archived version from
 hg.services.openoffice.org

But you need to pull the source tarballs from somewhere - and this again
touches the topic of where to put external source tarballs of (weak)
copyleft modules. I'm going to continue with my "moving ext_src to
Apache" thread tomorrow, so we can follow up there, as this is not a
linguistic-only topic.

Regards,
Mathias




Re: Crystal and Oxygen images (was Re: GPL'd dictionaries)

2011-11-08 Thread Mathias Bauer

On 09.11.2011 03:20, Pedro Giffuni wrote:

Hello guys;

Ahem...

I am afraid the external_images directory only contains
copylefted content. I thought those corresponded to
KDE themes, but they look very specific to
OpenOffice.org. Is this also a big problem as it
seems?

Pedro.



No, we can just leave them out. I don't think that many users will miss 
them.


Regards,
Mathias


Re: Need a current build for WinXP 32bit

2011-11-07 Thread Mathias Bauer

On 07.11.2011 05:49, Dennis E. Hamilton wrote:

Thanks Mathias,

I found the ATL headers in the WinDDK/.../inc/atl71/, and libraries
too.  There is also the ATL Reference Guide and other materials
available at MSDN on-line, along with some books in a very dusty
corner of my office shelves.  That is one heck of a dependency.  I
wonder how much of it actually adds to OO.o to do a static binding
[;).

It would be interesting to see how much could be replaced by
plain-vanilla COM dependencies.  Not something I will be in any hurry
to dig into though.  Just something to nag my mind while I
concentrate on simpler things first.


Basically ATL is not needed at all - everything could be implemented 
without it. But at least some of the code in the very low level COM 
stuff was much easier to write with ATL.


In the former Framework team we have been working on replacing ATL code, 
but we only finished the task for one library. There's still a lot of 
work to do.


Regards,
Mathias


Re: Report Builder extension (was Re: [proposal] development for the first AOO release)

2011-11-07 Thread Mathias Bauer

On 07.11.2011 09:50, Oliver-Rainer Wittmann wrote:



On 03.11.2011 09:16, Mathias Bauer wrote:

On 02.11.2011 15:52, Oliver-Rainer Wittmann wrote:


I am not planning to remove the report builder extension.
I am planning to remove the 3rd party components which are used by the
report builder extension as they are licensed under LGPL.
This will have the effect that the report builder will not work anymore.
Thus, I have got in mind to disable its building without touching any
code of it.


You don't need to do that as by default the report builder isn't
built. :-)



I know ;-)

But I wanted to ensure that nobody uses the configure option to enable
the build and then fails.


Why prevent others from building Report Builder?

We don't build the JFree Report stuff anyway; to build the Report 
Builder you have to provide the pre-built jars. Currently they are 
pulled automatically from ext_src, but we could change that to pulling 
them from external like other stuff and throw errors in case the jars 
aren't there and the switch for building Report Builder is used. So 
someone who wants to build it just needs to copy the jars to 
external/jfreereport or so.


Regards,
Mathias


Re: Addressbook replacement: CardDAV

2011-11-06 Thread Mathias Bauer

On 01.11.2011 20:45, Pedro Giffuni wrote:

Hi;

I was looking at the IP_Clearance Wiki, and the known problem of
replacing Mozilla.

I have noticed that for many things Gecko is being replaced with
Apple's Webkit some further investigation shows that Apple
Released their Calendar and Contacts server under AL2:


We don't use Gecko anywhere in OOo. Besides that, Webkit is much more 
evil than Mozilla because on Windows it requires proprietary libraries 
from Apple (or the use of suboptimal cairo-based libraries).


Regards,
Mathias


Re: Addressbook replacement: CardDAV

2011-11-06 Thread Mathias Bauer

On 01.11.2011 21:11, Dennis E. Hamilton wrote:

Looks interesting.

I thought there was a dependency on the Mozilla address book with
regard to digital signature certificates as well.


Yes, OOo has a dependency on Mozilla code (nss library) for digital 
signing on all non-Windows platforms (including the UI for it). But the 
address book stuff is not located in nss, it's in the Seamonkey lib we 
have in the mozilla module. It's completely unrelated to the digital 
signing code.


Regards,
Mathias


Re: Need a current build for WinXP 32bit

2011-11-06 Thread Mathias Bauer

On 31.10.2011 20:18, Dennis E. Hamilton wrote:

Regina,

I would like to find an already built Win32 WinXP version too.  I
despair of ever succeeding in building one myself without extensive
practice with the tools that I am expected to operate to accomplish
that.

I don't know how to deal with the ATL dependencies. I thought that it
was going to be made available independently, but it is apparently
still tied to VC++ 200xy non-express editions.


As I already wrote on this list some weeks ago, you can get the ATL headers 
by installing the Windows driver SDK. It might require adding paths to 
your build environment, but basically it should work with VS Express.


It might be an idea to install the driver SDK, get the necessary stuff, 
move it into a suitable location, adapt the include and library paths of 
your build environment, and then uninstall the otherwise useless SDK again.


Regards,
Mathias


Re: Report Builder extension (was Re: [proposal] development for the first AOO release)

2011-11-06 Thread Mathias Bauer

On 02.11.2011 15:52, Oliver-Rainer Wittmann wrote:


I am not planning to remove the report builder extension.
I am planning to remove the 3rd party components which are used by the
report builder extension as they are licensed under LGPL.
This will have the effect that the report builder will not work anymore.
Thus, I have got in mind to disable its building without touching any
code of it.


You don't need to do that as by default the report builder isn't built. :-)

Regards,
Mathias


Re: [DISCUSS]: new home for pre-built unowinreg.dll

2011-11-05 Thread Mathias Bauer
Am 01.11.2011 14:15, schrieb Jürgen Schmidt:

 Hi,
 
 for all unix builds it is possible to use a pre-built unowinreg.dll that 
 is used in the SDK for Java client applications.
 
 Background:
 This dll contains some glue code that helps to find a default office 
 installation on windows. This is used to bootstrap an UNO environment 
 and establish a remote connection to an existing or new office instance 
 from the Java client application that triggers this code.
 
 It is possible to cross-compile this dll with mingw in some way, but that 
 is not really necessary. It was always possible to download a pre-built 
 version and include it in the SDK on all platforms except Windows, where 
 it is always built.
 
 I would suggest that we store this pre-built dll somewhere to ensure 
 that this mechanism can be used or will work in the future as well.
 
 The URL to download the pre-built version is
 http://tools.openoffice.org/unowinreg_prebuild/680/unowinreg.dll
 
 The code is part of the odk module and is quite simple. This means it can 
 always be checked what's in the dll. We can apply an md5 checksum to ensure 
 that no manipulated dll is downloaded.
 
 Any ideas where we can store this dll in the future?

In the build the unowinreg.dll is expected to be in external/unowinreg.
Usually the developer needs to copy it there. We could just check it in
there in case we wanted to stick with the binary.
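The checksum idea Jürgen mentions can be sketched with standard tools. The file below is a stand-in for the real unowinreg.dll, and in practice the expected sum would be published by the project next to the download location:

```shell
# Sketch of the suggested md5 verification. /tmp/unowinreg.dll is a
# stand-in; a real setup would publish unowinreg.dll.md5 alongside
# the download, not compute it locally.
printf 'pretend dll contents' > /tmp/unowinreg.dll
md5sum /tmp/unowinreg.dll > /tmp/unowinreg.dll.md5   # publisher: record the sum
md5sum -c /tmp/unowinreg.dll.md5                     # downloader: prints "/tmp/unowinreg.dll: OK"
```

Note that md5 only guards against accidental corruption or casual tampering; a stronger hash (sha256sum works the same way) would be preferable.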

Regards,
Mathias


Re: Willing help on Test

2011-11-05 Thread Mathias Bauer
Am 02.11.2011 14:34, schrieb Rob Weir:

 So what do we have?  What do we need?
 
 I have no idea how QA was done before for OpenOffice.org, but it makes
 sense that you have basic elements like:
 
 
 1) Unit tests that developers can execute before checking in code.  We
 already have those, right?  Are they working?  Do they have good
 coverage?  Would it be worth improving testing at that level?

Unit tests exist only for some low level libraries. We have some so
called complex tests and some simple API tests. All of them are
definitely worth improving; it's the best we could do.

We never investigated coverage, so no idea how much code is covered by
these tests.

Writing unit tests for most of the higher level OOo code is hard or
close to impossible, as the code refuses to be run in a test harness.
Too much code depends on too much other code.

 2) Manual scripted tests.  This could be based on written test cases
 and test documents.  These tests require some expertise to
 design/write, but once the test cases are written they can be tested
 by a much larger set of volunteers.  Even power users could be helpful
 here.  A good tester follows the test case, but also has skills in
 describing a bug in the defect report, with all necessary detail, but
 little extraneous detail.  They know how to think like a bug.

That might be comparable to what I called complex test cases. As
writing unit tests is hard for many components, as mentioned above, this
is the kind of testing that gives us the most bang for the buck. The
build system has support for building and running them, and basically all
of them could be run in parallel, if set up accordingly.

 4) Scripted/automated testing via the GUI.  Requires more effort and
 skill  to write and maintain, but once done, it requires less effort
 to execute.

That depends on what you mean by effort. The tests we do have run
awfully slowly - even the most basic tests together sum up to a run time
of approximately 8 hours. If you wanted to run all tests that have been
written (and why wouldn't you want to do that?) you would have to invest
several days for just one platform.

There are several reasons for that. It could be improved by running as
many tests in parallel as possible, using as many cores of the test
machine as possible. If the test infrastructure weren't so
byzantine and inflexible, we probably could have done that already.
The test tool and what we can still do with it might become a larger
topic, I will open a new thread for it.

Regards,
Mathias


Re: Greetings from Betsy

2011-11-02 Thread Mathias Bauer

On 01.11.2011 09:21, Jürgen Schmidt wrote:


the help content currently comes from xhp files that you can find in
the helpcontent2 module in the source. xhp files are XML files that can
be edited with the office (a special filter is required), and there
exists an extension that can help to edit these files. It's a
collection of macros, as far as I know, that provides some useful tooling
to manage help-ids etc. But I am not an expert here and I can't say where
to find this extension, but I will try to figure that out or hopefully
somebody else can help us.

I took the liberty of committing the help authoring extension in one of my 
last cws some months ago. It should be integrated into AOOo already 
(module helpauthoring).


It was a quick hack just to save it, I didn't invest a lot of time to 
fix the build of it. But as this is only packaging, it should be easy to 
accomplish that.


Regards,
Mathias


Re: svn commit: r1190021 - /incubator/ooo/trunk/main/configure.in

2011-10-29 Thread Mathias Bauer
Hi Pedro,

Am 28.10.2011 03:29, schrieb Pedro Giffuni:

 To further clarify..
 
 I thought disabling it would be a good midpoint between
 removing it and keeping it. In anycase I think we must
 keep the option alive in the forseeable future.
 
 I see no hurry to take a decision and the patch is pretty
 small so it's easy to undo. How about we keep it like this
 for a while and we re-enable it by the default (lazy
 consensus again) before the release?

Yes, let's do it like that.

Regards,
Mathias



Re: how can I Extensions_Integration_into_Installation_Set?

2011-10-23 Thread Mathias Bauer
Hi,

Am 18.10.2011 07:51, schrieb jianlizhao:

 I want do  Extensions Integration into Installation Set, I find web page
 bellow:
 
 http://wiki.services.openoffice.org/wiki/Extensions_Integration_into_Install
 ation_Set
 
 I read the article many times. I still can not solve the problem.
 
 My question is:
 
 I do not know in which directory the Extensions files are located under
 solver.
 
 for example: the Extension Dictionaries are located under solver's pck
 directory.

All extensions containing code should end up in the bin sub folder of
the solver.

Regards,
Mathias



Re: Python and other scripting framework

2011-10-23 Thread Mathias Bauer
Am 20.10.2011 17:47, schrieb Alexandro Colorado:

 Wonder what is the future of the UNO scripting framework, since there are
 many different languages like Python, BeanShell and other scripting
 languages that OOo ships. OOo builds have a full Python 2.6 version and
 also IDEs like Rhino and other applications that are strongly attached to the
 OpenOffice.org core.

Python is not related to the Scripting Framework; it has its own UNO
language binding. The Scripting Framework adds support for some
scripting languages with an interpreter written in Java.
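For context, remote UNO bindings like the Python one connect to a running office instance. A commonly documented way to start the office listening on a socket (single-dash option syntax as used by OOo 3.x; port number is illustrative) is:

```shell
# Start OOo so that UNO language bindings (Python, Java, ...) can
# connect remotely over the URP protocol. The port is arbitrary.
soffice "-accept=socket,host=localhost,port=2002;urp;"
```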

Besides that I would expect that the future of the Scripting Framework
will be defined by those who will work on it. Until developers show up
for that, it most probably will stay as it is.

Regards,
Mathias



Re: How start to build AOOo on WinXP

2011-10-15 Thread Mathias Bauer
Hi,

doesn't that call for a change in the wiki? I would have done it, but I
always get errors when I try to create a wiki account:

 Detected bug in an extension! Hook wfLanguageSelectorAbortNewAccount failed 
 to return a value; should return true to continue hook processing or false to 
 abort.
 
 Backtrace:
 
 #0 /x1/mediawiki-1.15.5live/includes/specials/SpecialUserlogin.php(333): 
 wfRunHooks('AbortNewAccount', Array)
 #1 /x1/mediawiki-1.15.5live/includes/specials/SpecialUserlogin.php(164): 
 LoginForm-addNewAccountInternal()
 #2 /x1/mediawiki-1.15.5live/includes/specials/SpecialUserlogin.php(106): 
 LoginForm-addNewAccount()
 #3 /x1/mediawiki-1.15.5live/includes/specials/SpecialUserlogin.php(17): 
 LoginForm-execute()
 #4 [internal function]: wfSpecialUserlogin(NULL, Object(SpecialPage))
 #5 /x1/mediawiki-1.15.5live/includes/SpecialPage.php(771): 
 call_user_func('wfSpecialUserlo...', NULL, Object(SpecialPage))
 #6 /x1/mediawiki-1.15.5live/includes/SpecialPage.php(559): 
 SpecialPage-execute(NULL)
 #7 /x1/mediawiki-1.15.5live/includes/Wiki.php(229): 
 SpecialPage::executePath(Object(Title))
 #8 /x1/mediawiki-1.15.5live/includes/Wiki.php(59): 
 MediaWiki-initializeSpecialCases(Object(Title), Object(OutputPage), 
 Object(WebRequest))
 #9 /x1/mediawiki-1.15.5live/index.php(116): 
 MediaWiki-initialize(Object(Title), NULL, Object(OutputPage), Object(User), 
 Object(WebRequest))
 #10 {main}


Regards,
Mathias

Am 14.10.2011 22:57, schrieb Pedro Giffuni:

 Hi;
 
 That has changed for everyone.. and has to be updated in
 the guides.
 
 You need GNU autoconf : it generates the configure
 script.
 
 cheers,
 
 Pedro.
 
 --- On Fri, 10/14/11, Regina Henschel rb.hensc...@t-online.de wrote:
 
 
 I know that guide, but 'configure' mentioned there, does
 not exist.
 
 $ ./configure --help
 
 results in
 
 bash: ./configure: No such file or directory
 
 Therefore I ask.
 
 Kind regards
 Regina
 
 
 



Re: How start to build AOOo on WinXP

2011-10-15 Thread Mathias Bauer
Am 15.10.2011 16:20, schrieb Ariel Constenla-Haile:

 Hello Mathias,
 
 On Sat, Oct 15, 2011 at 04:04:15PM +0200, Mathias Bauer wrote:
 Hi,
 
 doesn't that call for a change in the wiki? I would have done it, but I
 always get errors when I try to create a wiki account:
 
 are you trying at http://ooo-wiki.apache.org/ ?

Ah, yes.

 This instance is still read-only:
 http://mail-archives.apache.org/mod_mbox/incubator-ooo-dev/201109.mbox/%3c4e770880.2040...@gmx.ch%3E

Didn't notice that - I don't have time to read all the mails here. :-)

So I changed that in the old wiki:

http://wiki.services.openoffice.org/wiki/Documentation/Building_Guide/Building_on_Windows#configure
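The updated instructions boil down to one extra bootstrap step: configure is no longer checked in and has to be generated from configure.in with GNU autoconf first. A sketch, assuming autoconf is installed and the directory name matches the source layout:

```shell
# Bootstrap step: generate the configure script before running it.
# Without this, ./configure fails with "No such file or directory".
cd main            # the directory containing configure.in
autoconf           # generates ./configure from configure.in
./configure --help # now works; add the usual build switches
```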

Regards,
Mathias



Re: GStreamer avmedia plugin as copyleft?

2011-10-05 Thread Mathias Bauer

On 05.10.2011 02:46, Ariel Constenla-Haile wrote:

On Tue, Oct 04, 2011 at 03:12:09PM -0700, Pedro Giffuni wrote:

The GStreamer integration doesn't make sense without
GStreamer - so we
shouldn't build it when no GStreamer libs are available.



Agreed.


We decided that configure without switches should be the
default Apache build. OTOH we didn't agree on linking
against lgpl libraries in the system in this default
build, so I disabled GStreamer by default (as
well as some other copyleft components - there is no
difference in that regard between GStreamer and hunspell!).


There is a *huge* difference here: Hunspell is MPL so we can
use it. LGPL is category X: we can't use GStreamer at all.

We can keep the MPL stuff around but anything GPL we should
consider extensions and move to apache-extras. The glue code
that we can relicense to AL2 we will, of course.


but we are talking here about the GStreamer *plugin*. Just like the gtk
and kde VLC plugins, they have Oracle license, and they are in the
software grant, this means they (will) have Apache License.

If AOOo builds a vanilla Linux version it should be shipped with the
gtk and kde plugins (no desktop integration is a no-go!). Following this
argument, it should also ship the GStreamer integration (here going back
to the Java Media Framework is a no-go!).


The difference is that you can safely assume that one of the vcl plugins 
will work on the user's computer without urging him to install 
additional packages. That's not true for GStreamer.


The GStreamer plugin can't even be compiled without GStreamer libraries 
installed. Of course you can use them installed in your system, but - as 
I wrote - so far we didn't agree that this should be allowed in a 
Vanilla build. So the correct way to build with GStreamer from the 
system is using --with-system-GStreamer or so (don't know if this 
switch exists already). Without any switch we must assume that the 
GStreamer lib is not available and so we can't compile the plugin.


As soon as we have an agreement that building against system libs is OK 
for a Vanilla AOOo build, we can compile much more stuff in OOo on 
Linux, most probably nearly every copyleft component we had to disable 
otherwise.


Of course that would probably mean that our Linux version won't run out 
of the box on most Linux machines as the old OOo version did. It will 
run only on those computers that have compatible versions of all 
libraries installed in their system that were used in the build. That's 
the price you have to pay.


Regards,
Mathias


Re: GStreamer avmedia plugin as copyleft?

2011-10-05 Thread Mathias Bauer

On 05.10.2011 02:34, Ariel Constenla-Haile wrote:


It seems the question remains if vanilla AOOo can link against copy-left
system libraries.


Exactly. And moreover, which libraries can be named system libraries. 
IMHO gtk can, but librsvg can't. Don't know about GStreamer.


Regards,
Mathias



Re: GStreamer avmedia plugin as copyleft?

2011-10-04 Thread Mathias Bauer
Am 04.10.2011 17:27, schrieb Ariel Constenla-Haile:

 Hi there,
 
 The GStreamer avmedia plugin is disabled when copyleft is disabled (the
 default in AOOo). 
 http://svn.apache.org/viewvc/incubator/ooo/trunk/main/configure.in?view=markup#l1248
 I cannot understand why this is treated as copy-left code. The plugin
 was developed by Oracle and is on the software grant; building it only
 requires system headers and linking against system libraries. 
 In this sense, the code is just like the VCL GTK and KDE plugins, and the GTK
 and KDE file pickers... these are not disabled as copy-left code.
 
 So, is this code copy-left code? Does building and shipping the plugin
 break some Apache licensing rule?
 
 Regards

The GStreamer integration doesn't make sense without GStreamer - so we
shouldn't build it when no GStreamer libs are available.

We decided that configure without switches should be the default Apache
build. OTOH we didn't agree on linking against lgpl libraries in the
system in this default build, so I disabled GStreamer by default (as
well as some other copyleft components - there is no difference in that
regard between GStreamer and hunspell!). This is an open question that
IIRC I already asked here on that list (at least I wanted to do that ;-)).

If we decided that linking against LGPL libs installed in the local
system is OK for a vanilla Apache build, we could enable e.g.
GStreamer on Linux again, but then we also could enable Hunspell and the
other stuff that are disabled when copyleft stuff isn't enabled
explicitly. Until then you have to use --with-system... switches to
enable these copyleft components. If that doesn't work - you have found
a bug. :-)
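The two build flavours being discussed might look like this on the command line. The switch names here are illustrative only, not confirmed against configure.in; as noted above, whether a particular --with-system-* switch exists has to be checked with ./configure --help:

```shell
# Default Apache build: no switches, copyleft components stay disabled.
./configure

# Hypothetical opt-in build linking copyleft components from the system.
# Switch names are made up for illustration.
./configure --with-system-hunspell --with-system-gstreamer
```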

Does that make sense?

Regards,
Mathias


Re: handling of ext_sources - Juergen's suggestion [was: Re: A systematic approach to IP review?]

2011-10-01 Thread Mathias Bauer
Am 01.10.2011 00:17, schrieb Michael Stahl:

 On 30.09.2011 21:24, Mathias Bauer wrote:
 On 28.09.2011 17:32, Pedro F. Giffuni wrote:
 
 Another advantage of unpacking the tarballs: the patches will become
 *real* patches that just contain changes of the original source code.
 Often the patches nowadays contain additional files that we just need to
 build the stuff in OOo (e.g. dmake makefiles) - they could be checked in
 as regular files.
 
 Currently keeping them as regular files is awkward because then they
 need to be copied to the place the tarballs are unpacked to.
 
 but this is just because dmake can only build source files in the same
 directory; imagine a more flexible gbuild external build target where the
 makefiles are in the source tree while the tarball gets unpacked in the
 workdir...

Sure, but until we are there...

I didn't talk about the dmake makefiles that are used to unpack and
patch; I was talking about using dmake for building the external modules
that come with their own build system. The makefile.mk files in the root
directories of the external modules are not part of the patch, but some
patches contain makefile.mk files that are necessary to build the stuff,
either on all or only on some platforms.

Regards,
Mathias


Re: Not new but under a new hat

2011-09-30 Thread Mathias Bauer
On 29.09.2011 09:56, Ian Lynch wrote:

 On 28 September 2011 16:51, Rob Weirrobw...@apache.org  wrote:

 If everyone agreed that having a single project was best today, then
 we would have a single project tomorrow.

 Point is we have made little real effort to achieve any consensus on this.
 We have done a lot of bitching on both sides and posted stuff like this that
 almost guarantees it will never happen. 
   The question should be what
 can you, or I, or anyone else who wants that outcome, do today, to
 make it more likely to move closer to that outcome.


 I'd say stop posting reactionary and emotive stuff when someone makes a
 positive suggestion to get people working together.

You've hit the nail on the head!

Since the split of the OOo community we have had the strange situation
that many people on both sides declared an interest in getting together
(even if it is unclear today what that will look like), but when we had
discussions about that matter, most participants acted like they wanted
the opposite.

Looking for parts in each other's posts that could be *interpreted*
negatively got more interest than praising the positive statements in
them. I have a very philanthropic attitude, so I still believe that this
was caused by negative emotions of the past, not by malevolence. But
IMHO it's time to stop that and look forward, not backwards.

Nobody is perfect - so people make mistakes. Sometimes also people have
to do things for reasons that are not their own (I know this well
enough!). If you want cooperation, you have to remember this and focus
on the positive sides that can help to establish or foster cooperation,
but not enhance the negative sides and blame people for them.

And nobody should expect that anybody will come up with a plan and -
whoosh! - both projects will work together. That won't happen. Future
cooperation will require small steps and the willingness to overcome the
mostly psychological barriers.

Regards,
Mathias


Re: handling of ext_sources - Juergen's suggestion [was: Re: A systematic approach to IP review?]

2011-09-30 Thread Mathias Bauer
On 28.09.2011 17:32, Pedro F. Giffuni wrote:
 FWIW;

 I don't like the patches because I can't really examine the code
 well; besides, this is something the VCS handles acceptably:
 commit the original sourcecode and then apply the patches in a
 different commit. If we start with up to date versions there
 would not be much trouble.

I'm not against unpacking the tarballs and applying the patches, but we
should keep the patches somewhere so that updates could be done with the
same effort as today.

Another advantage of unpacking the tarballs: the patches will become
*real* patches that just contain changes of the original source code.
Often the patches nowadays contain additional files that we just need to
build the stuff in OOo (e.g. dmake makefiles) - they could be checked in
as regular files.

Currently keeping them as regular files is awkward because then they
need to be copied to the place the tarballs are unpacked to.

Regards,
Mathias


Re: a question for #i117804# differentiate between ENABLE_CAIRO and ENABLE_CAIRO_CANVA

2011-09-30 Thread Mathias Bauer
Am 22.09.2011 10:35, schrieb Shao Zhi Zhao:

 hi,
 
 In the change of vcl340fixes: #i117804# differentiate between ENABLE_CAIRO
 and ENABLE_CAIRO_CANVA…
 in file of set_soenv.in
 
 there are 4 new lines added
 +ToFile( DISABLE_SAXON,  @DISABLE_SAXON@, e );
 +ToFile( DISABLE_HUNSPELL,   @DISABLE_HUNSPELL@, e );
 +ToFile( DISABLE_HYPHEN, @DISABLE_HYPHEN@, e );
 +ToFile( DISABLE_LIBWPD, @DISABLE_LIBWPD@, e );
 +
 
 This change is aimed at disabling these 3rd party modules?
 And what is the reserve unit for these 3rd party modules, or will they be
 removed in AOOo?

We don't have replacements for them yet; we will need replacements for
hunspell and hyphen for sure, and that will be one of the hardest parts of
the copyleft replacements. We won't get a replacement for libwpd, and
we should look for a replacement for saxon; that shouldn't be so hard.

Regards,
Mathias



Re: Not new but under a new hat

2011-09-30 Thread Mathias Bauer
Am 30.09.2011 21:36, schrieb Alexandro Colorado:

 I dunno why this is such an issue really, we are both open source projects.
 Cooperating and working together doesn't really need much, just commit to
 both projects and move on. I mean, what are we looking for here, do you want
 an explicit thank-you note from both projects? Or are you only wanting to get
 commits and contribute to both.

I think that I have clearly stated what I would like to see. Or better,
what I don't like to see. Sorry, but I don't understand how your comment
is related to that.

Regards,
Mathias


Re: [patch] Removal of Windows build requirement on unicows.dll - issue 88652

2011-09-29 Thread Mathias Bauer

On 28.09.2011 00:49, Michael Stahl wrote:

On 27.09.2011 22:22, Rob Weir wrote:

On Tue, Sep 27, 2011 at 4:08 PM, Dennis E. Hamilton
dennis.hamil...@acm.org  wrote:

What is the oldest Windows OS version that Apache OOo 3.4(-dev) will
be supported on?  How does that compare with the oldest Windows OS
version that the last stable release (3.3.0?) of OpenOffice.org is
supported on?  (If there is a JRE dependency, that is another variant
to consider.)


AFAIK OOo 3.x Windows baseline is NT 5.0 (Windows 2000);
AFAIK this OS version is no longer supported by the vendor.


And AFAIR Win 2000 was already dropped as a supported platform in OOo 
3.3. All Win 9x platforms have been unsupported since OOo 3.0.


Besides that I still don't get why Windows versions are always discussed 
in the context of removing unicows.dll.


I already wrote it, but again: this library was needed only on Win9x, and 
that platform has not been supported since OOo 3.0. So unicows.dll 
can and should be removed.


Whether or not AOOo will support WinXP SP2 or only SP3 is totally 
irrelevant in the discussion about unicows.dll. It might become more 
interesting when uwinapi.dll gets the focus. Getting rid of that library 
also would be highly desirable.


Regards,
Mathias


Re: [patch] Removal of Windows build requirement on unicows.dll - issue 88652

2011-09-29 Thread Mathias Bauer

On 28.09.2011 21:03, Marcus (OOo) wrote:


Yes, and as long as there are no real technical problems I don't see a
need to drop the support.


Indeed, and AFAIR dropping the support for any OS versions always was 
technically motivated at OOo. Of course technical motivations are 
debatable. Maintaining compatibility layers for OS versions that are as 
old as the hills IMHO *is* a technical problem. If the maintenance or 
the presence of the library has a negative impact on build system or 
code quality, the motivation to remove it grows with the age of the OS 
version and the shrinking user base.


Regards,
Mathias


Re: odma.h [was: Re: my next (tiny) steps - clean up regarding stuff which is not conform to the Apache license]

2011-09-29 Thread Mathias Bauer
Am 29.09.2011 11:28, schrieb Oliver-Rainer Wittmann:

 Hi,
 
 thanks for the details on ODMA and the research done on the current
 source code.
 
 I have figured out that the complete ODMA stuff in our source code is
 not part of the build and thus not part of an installation set -
 the source folder /ucb/source/ucp/odma is not built. I did some
 research in OOo's hg repository to find out when the build of this
 folder was removed. Surprisingly, it does not seem that it has
 ever been built, because I did not find a revision of
 /ucb/prj/build.lst which contains /ucb/source/ucp/odma - maybe I
 overlooked something.

Indeed the ODMA content provider was never built in a regular build. The
project could be built manually, in case someone wanted to test or use
it. In fact it never was more than a nice try.

Regards,
Mathias



Re: [patch] Removal of Windows build requirement on unicows.dll - issue 88652

2011-09-29 Thread Mathias Bauer
Am 29.09.2011 20:17, schrieb Mathias Bauer:

 On 28.09.2011 00:49, Michael Stahl wrote:
 On 27.09.2011 22:22, Rob Weir wrote:
 On Tue, Sep 27, 2011 at 4:08 PM, Dennis E. Hamilton
 dennis.hamil...@acm.org  wrote:
 What is the oldest Windows OS version that Apache OOo 3.4(-dev) will
 be supported on?  How does that compare with the oldest Windows OS
 version that the last stable release (3.3.0?) of OpenOffice.org is
 supported on?  (If there is a JRE dependency, that is another variant
 to consider.)

 AFAIK OOo 3.x Windows baseline is NT 5.0 (Windows 2000);
 AFAIK this OS version is no longer supported by the vendor.
 
 And AFAIR Win 2000 was already dropped as a supported platform in OOo 
 3.3. All Win 9x Platforms already are not supported anymore since OOo 3.0.

Correction: for 3.3 we decided not to drop Win 2000 officially, but in
case problems should appear only on that platform we wouldn't fix them.
That's kind of possible, but unsupported - do it at your own risk.

Regards,
Mathias



Re: handling of ext_sources - Juergen's suggestion [was: Re: A systematic approach to IP review?]

2011-09-28 Thread Mathias Bauer

On 20.09.2011 16:36, Pavel Janík wrote:

Have we ever considered using version control to...uh...manage file
versions?

Just an idea.



Maybe Heiner will say more, but in the past, we have had the external
tarballs in the VCS, but then we moved them out and it worked very
well. There never was a reason to track external.tar.gz files in VCS,
because we do not change them.

What might be the best way to handle 3rd party code in AOOo will 
probably depend on the needs of the developers as well as on legal 
requirements.


We had these tarballs plus patches IIRC because Sun Legal required that 
all used 3rd party stuff should be preserved in our repos in its 
original form.


As a developer I always had preferred to have 3rd party code treated in 
the *build* like the internal source code.


So if there wasn't a requirement to have unpatched sources in the 
repository, the most natural way to keep 3rd party stuff would be to 
have a third sub-repo 3rdparty next to main and extras with the 
3rd party stuff checked in. Not the tarballs, just the unpacked content.


I wouldn't give up the patches, as they allow updates to be handled 
better. This would cause a problem: direct changes to the 3rd party 
stuff without additional authorization must be prevented (changing the 
source code must not happen accidentally, only when the 3rd party code 
gets an update from upstream), while it must still be possible to add, 
remove, or change patch files, just not the original source code. If 
that wasn't possible or was too cumbersome, checking in the tarballs in 
3rdparty would be better.


As svn users never download the complete history as DSCM users do, the 
pain of binary files in the repo isn't that bad. In case AOOo moved to 
a DSCM again later, the tarballs could be moved out again easily.


Regards,
Mathias


Re: A systematic approach to IP review?

2011-09-28 Thread Mathias Bauer

On 19.09.2011 02:27, Rob Weir wrote:


1) We need to get all files needed for the build into SVN.  Right now
there are some that are copied down from the OpenOffice.org website
during the build's bootstrap process.   Until we get the files all in
one place it is hard to get a comprehensive view of our dependencies.


If you want svn to be the place for the IP review, we have to do it in 
two steps. There are some cws for post-3.4 that bring in new files. 
Setting up a branch now to bring them to svn would create additional 
work that IMHO should better be done later.




2) Continue the CWS integrations.  Along with 1) this ensures that all
the code we need for the release is in SVN.


see above


e) (Hypothetically) files that are not under an OSS license at all.
E.g., a Microsoft header file.  These must be removed.


I assume that you are talking about header files with a MS copyright, 
not header files generated from e.g. Visual Studio. In my understanding 
these files should be considered as contributed under the rules of the 
OOo project and so now their copyright owner is Oracle.



5) We should to track the resolution of each file, and do this
publicly.  The audit trail is important.  Some ways we could do this
might be:

a) Track this in SVN properties.
IMHO this is the best solution. svn is the place of truth when it comes 
down to files.


The second best solution would be to have one text file per build unit 
(that would be a gbuild makefile in the new build system) or per module 
(that would be a sub folder of the sub-repos). The file should be 
checked in to svn.


Everything else (spreadsheets or whatsoever) could be generated from 
that, in case anyone had a need for a spreadsheet with 6 rows 
containing license information. ;-)


Regards,
Mathias


Re: [LINUX-BUILD] Details of Fedora 14 and 15 x68_64 build

2011-09-28 Thread Mathias Bauer

On 27.09.2011 04:36, Carl Marcum wrote:

As of Repo version 1175305 I can Build on Fedora 14 and 15 x86_64.

Thank you Ariel for helping me get the first one completed.

I found that there is a problem trying to to build hsqldb using java 1.7
due to the build.xml only having targets for java up to 1.6 so I
switched back to 1.6 for the complete build.

Starting with a Fedora basic desktop install.

I used yum to install the packages listed on the Fedora build
instructions [1].

I needed to add librsvg2-devel and junit4.


The first problem is a bug: librsvg shouldn't be needed in a non-copyleft 
build (as it's LGPL licensed). junit4 indeed is needed, but can be made 
unnecessary by using --without-junit in configure.


Thanks for reporting your results,
Mathias


Re: a LGPL v3 report tool

2011-09-25 Thread Mathias Bauer
Am 21.09.2011 05:13, schrieb Shao Zhi Zhao:

 
 
 hi,
 
 JFreeReport introduced several 3rd modules within LGPL.
 
 Here is a LGPL v3 report tool.
 http://jasperforge.org/projects/jasperreports
 
 
 thanks
 
 mail:zhaos...@cn.ibm.com
 tel:54747
 Address:2/F,Ring Bldg. No.28 Building, Zhong Guan Cun Software Park, No.8,
 Dong Bei Wang West Road, ShangDi, Haidian District, Beijing 100193,
 P.R.China

JFreeReport is not necessary for OOo, it is part of the ReportBuilder
Extension.

Regards,
Mathias


Re: consolidation of Windows Build software requirements

2011-09-23 Thread Mathias Bauer
Am 23.09.2011 09:55, schrieb Oliver-Rainer Wittmann:

 Maybe you mixed unicows.dll with the notorious uwinapi.dll that at least
 has some value on WinXP, though it's unclear how much.
 
 Hm...
 Why do you think I am mixing unicows.dll with uwinapi.dll?

I replied to Martin. :-)

Regards,
Mathias



Re: Introduction and start working

2011-09-22 Thread Mathias Bauer
Am 21.09.2011 12:07, schrieb Martin Hollmichel:

 Hi,
 
 Am 20.09.2011 12:26, schrieb Oliver-Rainer Wittmann:
 Hi,

 [...]
 I will start working on a consolidation of the Windows Build software 
 requirements as given on 
 http://ooo-wiki.apache.org/wiki/Documentation/Building_Guide/Building_on_Windows:
 - get rid of dependence on unicows.dll
 This will have some impact wrt system requirements ? Which Windows 
 version will be affected by this change ?
 -- take over issue 88652 
 (https://issues.apache.org/ooo/show_bug.cgi?id=88652) from Mathias and 
 perform the given tasks.
 - get rid of dependence on instmsiw.exe and instmsia.exe
 also this will iirc have some dependencies wrt system requirements, what 
 do  you consider as minimum Windows baseline ? I would be fine with a XP 
 System SP2,

Why not SP3?
Really, SP2 is a totally outdated system.

Besides that, we don't need unicows.dll on any Windows XP installation;
WinXP is Unicode enabled. unicows.dll is just for Win9x.

Maybe you mixed up unicows.dll with the notorious uwinapi.dll, which at
least has some value on WinXP, though it's unclear how much.

Regards,
Mathias



Re: AOOo can't save passwort protected file

2011-09-22 Thread Mathias Bauer
Am 22.09.2011 17:49, schrieb Michael Stahl:

 On 17.09.2011 22:32, Pedro F. Giffuni wrote:
 
 
 --- On Sat, 9/17/11, Rob Weir robw...@apache.org wrote:
 ...

 OpenSSL is a a validated module when run in FIPS mode:

 http://csrc.nist.gov/groups/STM/cmvp/documents/140-1/1401val2009.htm#

 But that would still apply to AES, not Blowfish.

 Think of it this way:  FIPS 140 defines what the
 acceptable algorithms are.  Then the actual modules,
 the actual libraries, are validated by 3rd party
 testing labs according to NIST criteria.   If we use
 validated modules implementing approved algorithms, then
 we're golden.

 
 Thanks for this point. NSS is not certified and given the
 
 where the heck did you get that idea?
 
 http://csrc.nist.gov/groups/STM/cmvp/documents/140-1/140val-all.htm#1280
 
 version OOo carries has known security issues I suggest
 we kill the configure option to avoid hazards to our users.
 
 indeed the version shipped by OOo is outdated (3.12.6); newest one on the
 FTP server is:
 
 https://ftp.mozilla.org/pub/mozilla.org/security/nss/releases/NSS_3_12_11_RTM/src/
 
 (of course the OOo internal OpenSSL is similarly out of date...)
 
 Without other options I prefer Blowfish to no security at all.
 Again, patches for OpenSSL or any other certified solution
 are welcome :).
 
 While here .. I also think we should kill mozilla:
 
 1) The version we carry also has serious security issues.
 2) Google Chromium has a better license.
 
 but can Google Chromium read Mozilla address books?
 
 AFAIK that is all that OOo uses Mozilla for...

AFAIR a genius bound our whole address book support code (not only
the code for the Mozilla address book) to Mozilla code. And we also use
the Mozilla stuff for LDAP. All other formerly Mozilla-based
functionality in OOo nowadays uses nss.

All just IIRC.

Regards,
Mathias


Re: automake insteed configure

2011-09-19 Thread Mathias Bauer
Am 19.09.2011 20:40, schrieb Raphael Bircher:

 Hi Herbert
 
 Am 19.09.11 13:18, schrieb Herbert Duerr:
 In the past, configure was only rebuild if needed by samone (releng?) in
 Hamburg. Now Mathias Bauer recommends to build configure.sh everytime
 and make automake to the default step by the buildprocess. I think this
 Step should be don ASAP.
 I think that's not a load of changes, but I have a question: How to run
 automake, and where it is located can sameone help me.

 You probably mean autoconf, right?
 Oh yes!

 If you are on windows install the autoconf package from cygwin. If you 
 are on Linux install it from your package repository.
 Thanks for the hint. looks like I can make a new configure.sh by running 
 the command:
 
 automake configure.in  configure.sh

It's easier than that: just call autoconf in the root folder where
configure.in is. This will create a new configure script in that place.
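The regeneration step can be sketched as follows; the snippet uses a
throwaway minimal configure.in so it can be tried outside an OOo tree
(the file contents and the "configure regenerated" message are
illustrative, not part of the OOo build):

```shell
# Demonstrate the "autoconf first, then configure" order on a
# minimal configure.in in a temporary directory.
set -e
src=$(mktemp -d)
cd "$src"
cat > configure.in <<'EOF'
AC_INIT([demo], [0.1])
AC_OUTPUT
EOF
if command -v autoconf >/dev/null 2>&1; then
    autoconf                    # reads configure.in, writes ./configure
    test -f configure && echo "configure regenerated"
else
    echo "autoconf not installed; skipping regeneration"
fi
```

In a real checkout the same two steps are simply `autoconf` followed by
`./configure <your options>` in the folder that contains configure.in.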

Regards,
Mathias


Re: AOOo can't save passwort protected file

2011-09-19 Thread Mathias Bauer
Am 18.09.2011 06:10, schrieb Pedro F. Giffuni:

  Ugh ... nevermind, we already carry xmlsec !
 
 I guess we have everything to get rid of nss but we are not using it right? 
 Apache Santuario is interesting though.
 
 Cheers, Pedro.

The reason why we went for nss when we needed AES encryption was code
quality; openssl was considered badly maintained.

Disclaimer: I just repeat what the engineer charged with the evaluation
reported. I didn't carry out this evaluation by myself.

Regards,
Mathias


Re: AOOo can't save passwort protected file

2011-09-19 Thread Mathias Bauer
Am 19.09.2011 23:07, schrieb Pedro Giffuni:

  Hi Matias;
 
  On Mon, 19 Sep 2011 22:06:56 +0200, Mathias Bauer 
  mathias_ba...@gmx.net wrote:
 Am 18.09.2011 06:10, schrieb Pedro F. Giffuni:

  Ugh ... nevermind, we already carry xmlsec !

 I guess we have everything to get rid of nss but we are not using it 
 right? Apache Santuario is interesting though.

 Cheers, Pedro.

 The reason why we went for nss when we needed AES enryption was code
 quality. openssl was considered as badly maintained.

 Disclaimer: I just repeat what the engineer charged with the 
 evaluation
 reported. I didn't carry out this evaluation by myself.

  Thanks for the explanation.
 
  That might have been a valid reason then. The latest version is dated
  from less than 2 weeks ago, so it looks pretty well maintained now :).
 
  Just a thought ... Perhaps we should try to make Apache OO *really*
  Apache. I am now seeing so many nice things that other Apache projects
  offer: Santuario, APR, pdfbox, Xerces/Xalan, Maven, etc. Just something
  to consider (after 3.4).

Whatever external components are added, we should avoid using Java
components for code that is loaded on startup or for loading normal
documents. If possible, Java should be used only for optional
components/features.

Regards,
Mathias


Re: VCL TestTool

2011-09-18 Thread Mathias Bauer
Am 18.09.2011 17:34, schrieb Raphael Bircher:

 Hi at all
 
 The VCL TestTool is a GUI Testtool for OpenOffice.org who was writen by 
 SUN and wildly used in by the Hamburg people. The VCL TestTool (short 
 TT) is designed to avoid regressions. I the past, the test was not a 
 load used by the community, and same people say, TT is only wasting time 
 that can be better used by manual testing. There are two disavantage 
 with the TT.

The testtool has some value if it is used with careful consideration and
not in blind adherence to some stupid rule (Thou shalt not integrate
code without running the testtool on it).

 So I will give the TT a chance, and started to testing with it, and I 
 was surprised. The first Test brings a error. Within five minutes I can 
 confirm it manualy. It was the save with password error. Well, maybe 
 this was also a bit luck, but even it takes 20 Minutes to analyse the 
 error, it's the time wrote.

This backs up my experience with the test tool: it is best used when
larger parts of the code have been replaced, rewritten or removed. Like
now. Though you should be prepared for a lot of red herrings, especially
on Linux (the Windows runs take longer but from my own experience are
much more reliable).

Using the testtool in the long run is problematic: the developers of
that tool didn't maintain its source code for years, so nowadays it's
impossible to create a running testtool from the current sources. OTOH
the sources of the currently available binary instance of the testtool
are several years old and sooner or later won't run anymore. The Linux
version already does not run on 64-bit Ubuntu, at least.

The quality of the test scripts is also average at best. And many tests
are superfluous; some of them are carried out up to 9 times, just
wasting time.

Regards,
Mathias


Re: [solved] Build braker on Mac OS X

2011-09-18 Thread Mathias Bauer
Moin,

it seems that configure wasn't updated in the repo and in fact I
remember that we agreed that this is the sane approach. So we should
make clear in the build instructions that our process has changed: the
first thing to do after getting the source or pulling a change of
configure.in is calling autoconf, not calling configure.

IMHO we also should remove configure from the repository so that it is
no longer tracked.

Regards,
Mathias

Am 18.09.2011 09:35, schrieb Raphael Bircher:

 Hi all.
 
 Pavel help me to solve the issue. We have to re run autoconf. I will 
 commit the new configure
 
 Greetings Raphael
 
 Am 18.09.11 09:01, schrieb Raphael Bircher:
 Hi at all

 since the last CWS integration from Michael Stahl, I have a build 
 braker on my mac. Normaly I succsfully build on this mashine. It fails 
 at the ./configure process

 My configure settings:
 ./configure --disable-odk --disable-pasf --disable-gtk 
 --disable-headless --disable-build-mozilla 
 --with-build-version=$OOVERSION-`date +%d-%m-%y` --disable-fontconfig 
 --without-nas 
 --with-jdk-home=/System/Library/Frameworks/JavaVM.framework/Home 
 --with-stlport=no --disable-mediawiki --enable-werror --disable-vba 
 --with-num-cpus=2 --with-max-jobs=2 --disable-copyleft

 The error message:
 Possible unintended interpolation of @ENABLE_CAIRO_CANVAS in string at 
 ./set_soenv line 1638.
 Global symbol @ENABLE_CAIRO_CANVAS requires explicit package name at 
 ./set_soenv line 1638.
 Execution of ./set_soenv aborted due to compilation errors.
 server3:main server3$ echo $ENABLE_CAIRO_CANVAS

 Looks like samething with env Variables is wrong.

 Greetings Raphael
 
 



Re: [code][repo] Integration of CWSs, HOW-TO with hg and git svn and stgit

2011-09-18 Thread Mathias Bauer
Am 18.09.2011 01:04, schrieb Michael Stahl:

 - mba34issues01: somebody said that he would do this one;
   Mathias, how long do we have to wait...  :)

Done.

Regards,
Mathias



Re: VCL TestTool

2011-09-18 Thread Mathias Bauer
Am 18.09.2011 20:41, schrieb Bjoern Michaelsen:

 Hi Rob, Hi Mathias,
 
 On Sun, 18 Sep 2011 13:42:54 -0400
 Rob Weir robw...@apache.org wrote:
 
 Is there a different tool for GUI test automation that we should be
 investing in going forward?
 
 you might be interested in:
 
 http://nabble.documentfoundation.org/subsequenttests-now-run-headless-td2750447.html
 
 for a few additional viewpoints on the test suites. (Not so thrilling
 for Mathias I guess, as he knows the arguments and the positions
 expressed, but still.)

While I agree with you wrt. the subsequenttests, I think that it
doesn't address Rob's question.

About GUI test automation:

If we want to continue GUI automation, IMHO there is no testtool
available that could replace the vcl testtool. As not all GUI elements
in OOo are native controls, there will always be some parts of OOo that
can't be tested with any other available tool. And I doubt that there
are any platform independent GUI tools at all that could be a candidate
for OOo testing.

So we have three options:

- do no automated GUI testing at all
- try to continue using the existing testtool that exists as a binary
and hopefully is still available on some people's hard disk (I have
versions for Windows and Linux 32 Bit)
- fix the testtool code in the automation module and try to create a
newer and better version of it.

Regards,
Mathias


Re: how to do with such ext modules in AOOo?

2011-09-17 Thread Mathias Bauer
Am 15.09.2011 08:01, schrieb Shao Zhi Zhao:

 
 
 hi,
 
 how to do with such ext modules in AOOo?
 +---------------------+------------------------------------------+
 | Ext Module          | License                                  |
 +---------------------+------------------------------------------+
 | glibc-2.1.3         | GPL 2                                    |
 | jfreereport         | LGPL                                     |
 | STLport-4.5         | Copyright (c) 1999, 2000 Boris Fomitchev |
 | epm-3.7             | GPL 2                                    |
 | bsh-2.0b1-src       | SPL and LGPL                             |
 | db-4.7.25.NC-custom | Copyright (c) 1990,2008 Oracle           |
 | boost_1_39_0        | Boost Software License                   |
 | cairo-1.8.0         | LGPL                                     |
 +---------------------+------------------------------------------+

glibc has been discussed elsewhere.

jfreereport is only needed for the ReportBuilder extension.

stlport is still needed until we decide to change the C++ UNO runtime
incompatibly; I don't think that the license will prevent us from using
it. If it did, we could make the planned change to the compiler's STL earlier.

epm is not needed for OOo, only for building on Linux. We can make it a
build requirement (like the compiler or linker), and in fact I have already
committed a patch for that.

bsh seems to be BeanShell. In the worst case we had to remove it. No
big deal, IMHO.

BerkeleyDB shouldn't be a problem license-wise.

The same is true for boost, IIRC.

cairo is not needed at all, it's just an optional component.

Regards,
Mathias


Re: AOOo can't save passwort protected file

2011-09-17 Thread Mathias Bauer
Am 17.09.2011 14:44, schrieb Rob Weir:

 When the competition for a new algorithm ended, the winner was the
 Advanced Encryption Standard (AES).  We really need to support that
 algorithm.  There is a reason why ODF 1.3 recommends it.  There are
 regulations in several countries that specify what cryptographic
 methods may be used for government work.  In the US this is called
 FIPS == Federal Information Processing Standards.  There are similar
 rules, for example, in Japan.  FIPS 140-2 recommends AES. It does not
 recommend Blowfish.  So this has great relevance for government users,
 government contractors, as well as other sectors like healthcare.

As you said, ODF *1.3* will *recommend* it. Does that require postponing
an AOOo 3.4 release until there is a code replacement for nss? Or do you
already have something to use? IIRC it took an engineer familiar with the
code roughly two weeks to implement and test the new AES code. I assume
that for a newbie it would take quite some time more.

IMHO getting 3.4 out fast is important. And of course having AES
encryption is important also - immediately after that.

YMMV.

Regards,
Mathias


Re: AOOo can't save passwort protected file

2011-09-17 Thread Mathias Bauer
Am 17.09.2011 18:47, schrieb Rob Weir:

 On 9/17/11, Mathias Bauer mathias_ba...@gmx.net wrote:
 Am 17.09.2011 14:44, schrieb Rob Weir:

 When the competition for a new algorithm ended, the winner was the
 Advanced Encryption Standard (AES).  We really need to support that
 algorithm.  There is a reason why ODF 1.3 recommends it.  There are
 regulations in several countries that specify what cryptographic
 methods may be used for government work.  In the US this is called
 FIPS == Federal Information Processing Standards.  There are similar
 rules, for example, in Japan.  FIPS 140-2 recommends AES. It does not
 recommend Blowfish.  So this has great relevance for government users,
 government contractors, as well as other sectors like healthcare.

 As you said, OOo *1.3* will *recommend* it. Does that require postponing
 an AOOo 3.4 release until there is a code replacement for nss? Or do you
 already have something to use? IIRC it took roughly two weeks to
 implement and test the new AES code for an engineer familiar with the
 code. I assume that for a newbie that would be quite some time more.

 
 Support for AES exists in the JCE and via the ODF Toolkit.  The later
 is Apache 2.0 licensed.
 
 IMHO getting 3.4 out fast is important. And of course having AES
 encryption is important also - immediately after that.

 
 I'm flexible on the staging of this.  Eventually we'll want to get to
 have full AES support.  I've seen Microsoft push OOo out of
 consideration for government accounts by arguing that the MS Office
 crypto is certified and ours is using an algorithm (Blowfish) that is
 not, that OOo uses a cipher that even the author recommends not using.
   We don't win that debate with a backwards compatibility argument.

Sure, I wasn't aiming at backwards compatibility. In fact I was one of
those who were responsible for adding AES encryption to OOo's ODF code,
for the same reasons as yours.

I just recommended giving the urgency of a 3.4 release a higher priority
than the usage of AES encryption for saving ODF 1.2 documents in that
release.

Regards,
Mathias



Re: AOOo can't save passwort protected file

2011-09-17 Thread Mathias Bauer
Am 17.09.2011 19:26, schrieb Pedro Giffuni:

  Hi;
 
  Despite the valid interest in higher encryption schemes, I
  prefer to set Blowfish as default now. That doesn't mean
  we won't consider patches later on, of course.

Ah, you used the magic word. :-)

So for those who want to have AES encryption in 3.4: send patches!

Regards,
Mathias


Re: AOOo can't save passwort protected file

2011-09-16 Thread Mathias Bauer
Hi,

AOOo can't use the nss libraries as easily as it was possible in the
old OOo, so perhaps a fix would be to revert the default encryption
algorithm in AOOo from AES to Blowfish in 3.4 until we find a
replacement for the AES encryption code from the nss libs.
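As an outside illustration (not OOo's encryption code): the question is
only which symmetric cipher protects the document streams. With the
openssl command line tool, for example, an AES round trip looks like
this (the password and file names are made up for the example):

```shell
# Illustration only: encrypt and decrypt a small file with AES-256-CBC
# using the openssl CLI.
set -e
dir=$(mktemp -d)
printf 'secret document body\n' > "$dir/doc.txt"

# encrypt with AES-256-CBC, deriving the key from a password
openssl enc -aes-256-cbc -pbkdf2 -pass pass:pw \
    -in "$dir/doc.txt" -out "$dir/doc.aes"

# decrypt again; the plaintext comes back on stdout
openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:pw -in "$dir/doc.aes"
```

Blowfish is exposed through the same interface in older OpenSSL versions
(e.g. `-bf-cbc`), which is what makes the proposed fallback cheap; the
point of the FIPS discussion elsewhere in this thread is that only the
AES variant is on the approved list.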

I know that MPL libs can be used in binary form in Apache projects,
here's the wording:

Software under the following licenses may be included in binary form
within an Apache product if the inclusion is appropriately labeled:
(...) (lists MPL 1.0 and 1.1)

As most 3rd party software is included in binary form in releases anyway,
I wonder what that means in practice. Perhaps somebody in the know can
explain that.

Regards,
Mathias

Am 16.09.2011 10:40, schrieb Chao Huang:

 hi, Jürgen
 
 Yes, I built AOOo with argument --disable-mozilla. I will try to
 rebuild AOOo without that arg.
 
 Do we have any alternative way to solve the problem quickly? For
 example, put mozilla library into someplace? Thanks!
 
 
 On Fri, 2011-09-16 at 10:30 +0200, Jürgen Schmidt wrote:
 Hi Raphael,
 
 i assume you have built your office with at least --disable-mozilla,
 correct? As far as i know the password encryption used some stuff from the
 mozilla code. So there will be a problem.
 
 Juergen
 
 
 On Fri, Sep 16, 2011 at 10:01 AM, Raphael Bircher r.birc...@gmx.ch wrote:
 
  Hi Dennis
 
  Thank you for the test too
 
  Am 16.09.11 03:19, schrieb Dennis E. Hamilton:
 
   I can't confirm with an AOOo Build, but I did check the OOO-dev 3.4 on
  win32 to see if the problem existed previously.  I was able to password
  protect (encrypt) a simple Writer document.  It saved and opened fine 
  (after
  I gave the password again.
 
  So this is maybe a regression
 
   What was interesting to me was that OO.o 2.4.1 (Novell Edition) failed to
  open the document and never got to recognizing that it was encrypted.  I 
  got
  a bad XML message, suggesting that an encrypted file was being mistakenly
  opened without decryption first.
 
  I think, that has nothing to do with it.
 
 
  Greetings Raphael
 
 
  --
  My private Homepage: http://www.raphaelbircher.ch/
 
 



Re: Fix me: Abnormalities during bootstrap

2011-09-15 Thread Mathias Bauer
Am 10.09.2011 14:13, schrieb Joost Andrae:

 Hi,
 
 whilst building wntmsci12.pro my first time ever I saw some quirks that 
 need healing hands:
 
 bootstrap aquires external files via hg.services.openoffice.org
 
 I SVN'ed trunk from Apache domain and I think these externals should not 
 use hg.services any more. Would it be possible to move them to an Apache 
 domain based server ?
 
 After starting the build process on my Win7 64bit notebook (dual core 
 Intel 2.8 Ghz with 8 GB memory) it quite often stopped but rebuilding 
 went always fine (until now; I'm still compiling binfilter). I thought 
 it might be useful to implement a watchdog process watching 
 stderr/stdout that restarts dmake as soon as the build breaks but it 
 needs to interpret the error thrown.
 
 Documentation at 
 http://ooo-wiki.apache.org/wiki/Documentation/Building_Guide/Building_on_Windows
  
 was really helpful despite I had to disable ATL

I have found out that we could get rid of the ATL problem. When I was
working on another project that forced me to build Chromium, I found in
their build instructions that the MS Windows Driver Development Kit
contains the necessary ATL headers and libs. The WDK can be downloaded
freely, so I assume that it's OK to use them, allowing us to build with
VS Express *and* ATL.

If someone wants to try it out, here are the instructions for Chromium:

http://www.chromium.org/developers/how-tos/build-instructions-windows

See the part 4. If you use Visual Studio 2008 Express.

Regards,
Mathias


Re: [code][repo] Integration of CWSs, calc67

2011-09-11 Thread Mathias Bauer
Moin Eike,

Am 09.09.2011 16:11, schrieb Eike Rathke:

 Hi,
 
 calc67  DEV300m106
 
 Doing that.
 
   Eike
 

Before you waste your time with it: I will take care of mba34issues01.

Regards,
Mathias


Re: Fix me: Abnormalities during bootstrap

2011-09-10 Thread Mathias Bauer
Am 10.09.2011 14:13, schrieb Joost Andrae:

 Hi,
 
 whilst building wntmsci12.pro my first time ever I saw some quirks that 
 need healing hands:
 
 bootstrap aquires external files via hg.services.openoffice.org
 
 I SVN'ed trunk from Apache domain and I think these externals should not 
 use hg.services any more. Would it be possible to move them to an Apache 
 domain based server ?

This was mentioned here several times (by myself a few days ago), but
without a discussion. We have to find a solution, yes.

 After starting the build process on my Win7 64bit notebook (dual core 
 Intel 2.8 Ghz with 8 GB memory) it quite often stopped but rebuilding 
 went always fine (until now; I'm still compiling binfilter). I thought 
 it might be useful to implement a watchdog process watching 
 stderr/stdout that restarts dmake as soon as the build breaks but it 
 needs to interpret the error thrown.

Probably you have been hit by the well-known bug in GNU Make 3.81 that
lets it core-dump at times. This needs to be solved; the developers of
LibreOffice have gained some experience with that.

 Documentation at 
 http://ooo-wiki.apache.org/wiki/Documentation/Building_Guide/Building_on_Windows
  
 was really helpful despite I had to disable ATL

If you used VC Express: IIRC the documentation mentions that this is
necessary in that case.

Regards,
Mathias


Re: Fix me: Abnormalities during bootstrap

2011-09-10 Thread Mathias Bauer
Am 10.09.2011 15:00, schrieb Joost Andrae:

 Hi Mathias,
 
 After starting the build process on my Win7 64bit notebook (dual core
 Intel 2.8 Ghz with 8 GB memory) it quite often stopped but rebuilding
 went always fine (until now; I'm still compiling binfilter). I thought
 it might be useful to implement a watchdog process watching
 stderr/stdout that restarts dmake as soon as the build breaks but it
 needs to interpret the error thrown.

 Probably you have been hit by the well-known bug in GNU Make 3.81 that
 lets it core-dump at times. This needs to be solved, the developers of
 Libre Office have gained some experience with that.
 
 GNU Make 3.8.1 is installed on my machine but I thought that dmake is 
 always used instead ?

You really did not notice our new build environment project?
Parts of OOo are already built with GNU Make; hopefully, at some point in
the future, everything will be.

 Documentation at
 http://ooo-wiki.apache.org/wiki/Documentation/Building_Guide/Building_on_Windows
 was really helpful despite I had to disable ATL

 If you used VC Express: IIRC the documentation mentions that this is
 necessary in that case.
 
 Yes it is MSVC++ Express 2008 using Cygwin.
 
 btw. I wonder about the low CPU load during compile job (max. 50%). Is 
 there a way to allow dmake to parallelize compile jobs ?

You can use build -Pn or build -- -Pn where n is the number of
parallel build jobs you want to use. As a rule of thumb you will get the
best performance with n = 2*number of (virtual) cores that you have.

The first variant, build -Pn, starts n parallel processes of dmake; the
second, build -- -Pn, tells dmake to do n parallel tasks. You can also use
build -Pn -- -Pn and divide your cores between both options. If you do
a full build, I recommend build -Pn -- -Pn with n = the square root of
the number of your cores (i.e. 2 or 3 for 4 or 8 cores).

This procedure is awkward and doesn't scale well. This was one of the
reasons why we wanted a new build system. If everything were built with
our new GNU Make based build system, the build would scale much better,
nearly linearly with the number of available cores.
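The rules of thumb above can be sketched as a small shell calculation;
the core count is fixed at 4 here for illustration (on a real machine
use e.g. `nproc`), and the echoed lines are the build driver invocations
one would type, not commands this snippet executes:

```shell
# Derive the -P values from the heuristics above for a 4-core machine.
cores=4
total=$((2 * cores))            # total parallel jobs ~ 2 * cores

# for a full build, n ~ square root of the core count (rounded down)
n=1
while [ $(( (n + 1) * (n + 1) )) -le "$cores" ]; do n=$((n + 1)); done

echo "build -P$total        # $total parallel dmake processes"
echo "build -- -P$total     # one dmake running $total parallel tasks"
echo "build -P$n -- -P$n    # full build: split cores between the two"
```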

Regards,
Mathias


Re: [repo] External sources, ICU

2011-09-10 Thread Mathias Bauer
Am 10.09.2011 16:54, schrieb Eike Rathke:

 Hi Mathias,
 
 On Saturday, 2011-09-10 00:11:55 +0200, Mathias Bauer wrote:
 
  If we can be sure about the IP situation, we should at least add
  copyright headers to the files, shouldn't we? Or put a license file into
  the repository.
  
  I'm confused now. The breakiterator data files _have_ copyright headers.
  Copyright IBM. Wasn't this what it is all about? Or do you mean we
  should add where they originated from, ICU?
 
 Not all files in the module i18npool have copyright headers. For
 example, see the data files in collator or indexentry. When we talked
 about that some weeks ago you mentioned all of them came from ICU. Did I
 misunderstand you?
 
 Apparently yes, only all breakiterator data is based on ICU files.
 
 The collator data files were contributed either by the Sun Globalization
 team (all CJK files) and have a Sun copyright header, or by OOo members,
 in which case a comment only states for which language/script the data
 is. Unfortunately hg log shows only the CWS integration commit comment,
 not the real commits that happened on the CWS, otherwise one could even
 dig out associated issue numbers and probably would find there the
 original contributors. Anyway, all submissions should have happened
 under JCA/SCA and thus are covered by the SGA.
 
 The indexentry data files were created by Karl Hong, working for the Sun
 Globalization team at that time. IIRC he used the CLDR main
 exemplarCharacters to generate them, see
 http://unicode.org/repos/cldr-tmp/trunk/diff/summary/root.html and
 individual language's charts, and added the index information. For CLDR
 data the applicable license is the Unicode Terms of Use, see
 http://cldr.unicode.org/index/downloads and
 http://unicode.org/copyright.html#Exhibit1
 The Unicode copyright notice is included in
 readlicense_oo/html/THIRDPARTYLICENSEREADME.html

Whatever the source of the data files in i18npool is, we should provide
them with proper license information.

Regards,
Mathias


Re: [code][repo] Integration of CWSs, HOW-TO with hg and git svn and stgit

2011-09-10 Thread Mathias Bauer
Am 10.09.2011 21:00, schrieb Pedro F. Giffuni:

 Hi,
 
 Excuse the newcomer ignorance ...
 
 --- On Sat, 9/10/11, Michael Stahl wrote:
 ...
 
 and i completely forgot to mention that i've got a linear
 MQ patch series applying against OOO340 that contains the
 following:
 
 ooo340fixes
 mingwport35
 
 ause131
 ause130
 writerfilter10
 gnumake4
 sd2gbuild
 
 (second group is for 3.5 so can't be committed to SVN
 now...)
 
 
 Does gnumake4 save us from dmake? I suspect no one
 here wants to maintain dmake in apache-extras if we
 can kill it.

gnumake4, as well as some other CWSs from this list, are steps towards the
end of dmake in OOo. But there will still be some more steps to go.
Regards,
Mathias


Re: [repo] External sources, ICU (was: Who wants to build OpenOffice?)

2011-09-09 Thread Mathias Bauer
Am 09.09.2011 12:06, schrieb Eike Rathke:

 Hi Mathias,
 
 On Thursday, 2011-09-08 19:33:59 +0200, Mathias Bauer wrote:
 
   I don't see why the current ICU 4.0.1 needed to be updated,
   what issues with headers are you referring?
  
  I am not sure exactly when the ICU license changed but
  the first ICU versions had a restrictive license.
  
  Well, ICU 1.8.1 and later don't, see
  http://userguide.icu-project.org/icufaq#TOC-How-is-the-ICU-licensed-
  
  We have some code from ICU in the tree, Matthias'
  ApacheMigration list has:
  
  - get new break iterator data from current ICU
  
  That wouldn't change anything, our break iterator data has the same
  IBM copyright text as the current ICU's data, for example see
  http://source.icu-project.org/repos/icu/icu/trunk/source/data/brkitr/char.txt
  
  In fact the entire ICU source is full of
  Copyright IBM ... - All Rights Reserved without mentioning any
  license, but ICU as a whole is licensed under that nonrestrictive
  license mentioned above. I don't see any issue with that.
 
 So YAAL? :-)
 
 I'm glad I'm not ;-)
 
 Honestly, we shouldn't have files in our repository without a copyright
 header. And if these files are under IBM's copyright, we should fix that.
 
 Please read again: the breakiterator data files were copied from ICU's
 source and modified and contain an IBM copyright. All source files of
 ICU contain an IBM copyright. Just pick any file under
 icu/$INPATH/misc/build/icu/source/ and see. ICU's license is
 http://source.icu-project.org/repos/icu/icu/trunk/license.html
 What exactly would not be permissible with the breakiterator files given
 that the license grants you the rights to use, copy, modify, merge,
 publish, distribute?

If we can be sure about the IP situation, we should at least add
copyright headers to the files, shouldn't we? Or put a license file into
the repository.

Regards,
Mathias


Re: [repo] External sources, ICU

2011-09-09 Thread Mathias Bauer
Am 09.09.2011 21:11, schrieb Eike Rathke:

 Hi Mathias,
 
 On Friday, 2011-09-09 20:21:19 +0200, Mathias Bauer wrote:
 
 If we can be sure about the IP situation, we should at least add
 copyright headers to the files, shouldn't we? Or put a license file into
 the repository.
 
 I'm confused now. The breakiterator data files _have_ copyright headers.
 Copyright IBM. Wasn't this what it is all about? Or do you mean we
 should add where they originated from, ICU?

Not all files in the module i18npool have copyright headers. For
example, see the data files in collator or indexentry. When we talked
about that some weeks ago you mentioned all of them came from ICU. Did I
misunderstand you?

Regards,
Mathias



Re: threading

2011-09-09 Thread Mathias Bauer
Am 06.09.2011 21:38, schrieb Eike Rathke:

 Hi Dennis,
 
 On Monday, 2011-09-05 09:34:48 -0700, Dennis E. Hamilton wrote:
 
 Daniel,
 
 I don't generally have any way of knowing what does and does not
 influence threading.  I don't have threading in my mail client
 
 I would be surprised if Outlook didn't have threading (on the other
 hand.. it's Outlock ;-)  you just might not be aware of it. Are the
 mails you see on the list really not grouped by topics and you don't see
 who answers whom respectively which mail?

Outlook is able to preserve threading for other mail clients by
providing the necessary headers, but IIRC it's a configuration option
(at least it was so in older versions of Outlook).

But if Dennis always uses the same instance of Outlook with the same
user configuration, I don't understand why some of his mails break the
threading and others don't.
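
The grouping the clients do here rests on two RFC 5322 headers: a reply carries "In-Reply-To" (the parent's Message-ID) and "References" (the whole ancestor chain). A minimal sketch of thread reconstruction from those headers, with hypothetical message data:

```python
# Sketch: rebuilding mail threads from the References header, the same
# information clients like Thunderbird use for threaded views. The
# message IDs and subjects below are made up for illustration.
from collections import defaultdict

def build_threads(messages):
    """messages: list of dicts with 'id' and 'references' (ancestor IDs,
    oldest first). Returns (root ids, parent-id -> child-ids map)."""
    children = defaultdict(list)
    roots = []
    for msg in messages:
        if msg["references"]:
            # The last entry in References is the immediate parent.
            children[msg["references"][-1]].append(msg["id"])
        else:
            roots.append(msg["id"])
    return roots, dict(children)

msgs = [
    {"id": "<1@ml>", "references": []},
    {"id": "<2@ml>", "references": ["<1@ml>"]},
    {"id": "<3@ml>", "references": ["<1@ml>", "<2@ml>"]},
]
roots, children = build_threads(msgs)
print(roots)               # ['<1@ml>']
print(children["<2@ml>"])  # ['<3@ml>']
```

A client that omits these headers (or strips them, as some Outlook configurations did) produces exactly the symptom described above: its mails start new threads instead of nesting under the mail they answer.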

Regards,
Mathias


Re: [LINUX-BUILD] Details on my Ubuntu 11.04 experience

2011-09-08 Thread Mathias Bauer
Am 08.09.2011 01:27, schrieb Rob Weir:

 On Wed, Sep 7, 2011 at 5:57 PM, Mathias Bauer mathias_ba...@gmx.net wrote:
 Am 07.09.2011 19:27, schrieb Rob Weir:

 I did this a couple of days ago, but ran into some issues that others
 had not come across, so I decided to repeat it. I wiped out the
 machine, and started fresh with new 11.04 Ubuntu machine.  I tried to
 follow the build instructions literally and copied errors messages and
 remediations when I found them.

 == prep ==

 Fresh install of Ubuntu 11.04 on x86, 1 GB RAM, 240 GB HD

 Ran update manager, installed all patches, rebooted

 == getting the code ==

 We're on Subversion now, not Mercurial, so obviously that part of the
 instructions changes.

 svn co https://svn.apache.org/repos/asf/incubator/ooo/trunk ooo

 That didn't work.  Needed to apt-get install subversion first.

 Then code downloaded fine

 == build-dep ==

 Instructions in the guide say to do:

 sudo apt-get build-dep openoffice.org

 Not the instructions that I have pointed to several times:

 http://wiki.services.openoffice.org/wiki/Documentation/Building_Guide/Building_on_Linux

 and

 http://wiki.services.openoffice.org/wiki/Ubuntu_Build_Instructions

 
 Those are exactly the instructions I used, Matthias.  Take a look at
 them. They say:
 
 To make sure that all packages are installed you could just simply run 
 command:
 
 sudo apt-get build-dep openoffice.org

So that's wrong. The right thing is what I once added to that wiki
page:

 sudo apt-get install g++ gcc bison flex libarchive-zip-perl libcups2-dev \
 libpam0g-dev sun-java6-jdk gperf libfreetype6-dev libxaw7-dev \
 libfontconfig1-dev libxrandr-dev patch libgconf2-dev libgnomevfs2-dev \
 ant libgtk2.0-dev junit junit4
That worked, last time I checked, with the exception of librsvg, because
of the bug in configure. Sorry for the misunderstanding.

Regards,
Mathias


Re: [repo] External sources, ICU (was: Who wants to build OpenOffice?)

2011-09-08 Thread Mathias Bauer
Am 08.09.2011 02:41, schrieb Eike Rathke:

 Hi Pedro,
 
 On Wednesday, 2011-09-07 14:19:50 -0700, Pedro F. Giffuni wrote:
 
  I don't see why the current ICU 4.0.1 needed to be updated,
  what issues with headers are you referring?
 
 I am not sure exactly when the ICU license changed but
 the first ICU versions had a restrictive license.
 
 Well, ICU 1.8.1 and later don't, see
 http://userguide.icu-project.org/icufaq#TOC-How-is-the-ICU-licensed-
 
 We have some code from ICU in the tree, Matthias'
 ApacheMigration list has:
 
 - get new break iterator data from current ICU
 
 That wouldn't change anything, our break iterator data has the same
 IBM copyright text as the current ICU's data, for example see
 http://source.icu-project.org/repos/icu/icu/trunk/source/data/brkitr/char.txt
 
 In fact the entire ICU source is full of
 Copyright IBM ... - All Rights Reserved without mentioning any
 license, but ICU as a whole is licensed under that nonrestrictive
 license mentioned above. I don't see any issue with that.

So YAAL? :-)

Honestly, we shouldn't have files in our repository without a copyright
header. And if these files are under IBM's copyright, we should fix that.

Regards,
Mathias


Re: QUASTE working?

2011-09-08 Thread Mathias Bauer
Am 08.09.2011 00:52, schrieb Raphael Bircher:

 Hi Alexandro
 
 Am 08.09.11 00:32, schrieb Alexandro Colorado:
 I can't access QUASTe, I hope this is not the lost box that was in the
 planet.
 Regards.

 I'm actually not sure if we really want to use QUASTE. QUASTE was a 
 special tool for our automated tests, but they never worked well outside 
 SUN. Automated tests are a nice idea, but their results have to be 
 reliable. And if you know how much manpower SUN and Oracle invested 
 to maintain these tests... I'm not sure if we will do the same.

It's not true that the tests never worked well outside SUN. I used these
tests on my private machines quite often and had no problems, at least
not on Windows and only rarely on Linux (the testtool doesn't want to
run on 64-bit).

These automatic tests have some value, but not in the way they were
used. But with or without the tests, QUASTE was and is dispensable. It
never did anything that you couldn't do much easier, faster and cheaper
in a working community with accountable members.

Regards,
Mathias



Re: [repo] External sources, ICU (was: Who wants to build OpenOffice?)

2011-09-08 Thread Mathias Bauer
Am 08.09.2011 13:30, schrieb Eike Rathke:

 Hi Pedro,
 
 On Wednesday, 2011-09-07 20:32:38 -0700, Pedro F. Giffuni wrote:
 
   There is also interest in using icu-regex for
   replacing the copyleft regex and the latest
   versions seem to have improved a lot.
  
  Of course, but upgrading ICU at this stage for a 3.4
  release IMHO is not an option.
 
 Hmm.. the release.
 
 Apart from the SGA, that doesn't really depend on us,
 we need:
 - new regex code (where I thought and ICU upgrade would
 actually help)
 
 ICU 4.0.1 we use also has regex. Sure, newest ICU probably improved
 there, but upgrading ICU in the past gave some surprises, especially
 regarding break iterators, that made it necessary to adapt some things.
 That should be avoided.

Which version of ICU is used in LibreOffice? If they use a newer one,
this probably can tell us about possible surprises.

Regards,
Mathias


Re: [LINUX-BUILD] Developer Education -- Building on Linux event starts now

2011-09-08 Thread Mathias Bauer
Am 08.09.2011 11:22, schrieb Pavel Janík:

 
 On Sep 7, 2011, at 9:32 PM, Pavel Janík wrote:
 
 AFAIK source_config is the only way to get build.pl to look outside the
 main repository/directory, so it is necessary.
 
 given that we now have a more fixed directory layout with the SVN repo,
 i guess it should be possible to hardcode the path to the extras
 directory and have configure put something in the environment or just
 make build.pl respect gb_REPOS which already exists, then we can finally
 get rid of this source_config nonsense.
 
 Yes, this is my goal after the clean build is finished.
 
 
 Hmm, I forgot to send this yesterday.
 
 I modified build.pl to automatically add extras repository:
 
 --- /Users/pavel/.ooo/ooo/trunk/main/solenv/bin/build.pl  2011-08-28 
 20:19:41.0 +0200
 +++ solenv/bin/build.pl   2011-09-07 22:22:48.0 +0200
 @@ -1617,6 +1617,13 @@
  
  sub get_module_and_buildlist_paths {
  if ($build_all_parents || $checkparents) {
 +
 + # Perl hackery: add the extras repository explicitly
 + my $extras;
 + $extras = $source_config->get_module_path('vcl');
 + $extras =~ s|vcl|../extras|g;
 + $source_config->add_repository($extras);
 +
  $source_config_file = $source_config->get_config_file_path();
  $active_modules{$_}++ foreach ($source_config->get_active_modules());
  my %active_modules_copy = %active_modules;
 
 
 This change is incorrect, should not be integrated. But shows what is needed 
 to be changed somewhere. I can only read Perl, so do not ask me why I used 
 this approach...
 
 With this change, you do not need source_config with localized builds.

Thanks for the heads-up, I already discovered that the build doesn't
work. Does that mean that we should just revert that change or does
somebody already have a better idea? For me perl also is a read-only
language (better: read only if I can't avoid it ;-)).

What's the status with source_config? Do we still need it, do we
*want* to use it? The new build system doesn't need it, so probably
ditching source_config and hard coding main and extras somewhere in
the perl ball of mud would be an option.
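
For anyone weighing that option: source_config is essentially an ini-style file naming the active repositories and modules, so reading it (or replacing it with hardcoded values) is a small job in any language. A sketch of the parsing side, where the section and key names are assumptions for illustration rather than the exact format build.pl expects:

```python
# Sketch: reading an ini-style source_config to get the list of active
# repositories, similar in spirit to what build.pl's SourceConfig does.
# The [repositories] layout and "active" values are assumptions.
import configparser

SAMPLE = """
[repositories]
main=active
extras=active

[modules]
sw=active
"""

def active_repositories(text):
    cfg = configparser.ConfigParser()
    cfg.read_string(text)
    return [name for name, state in cfg.items("repositories")
            if state == "active"]

print(active_repositories(SAMPLE))  # ['main', 'extras']
```

Hardcoding main and extras would amount to replacing this lookup with a fixed two-element list derived from the checkout root, which is exactly why the fixed SVN directory layout makes the file dispensable.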

Regards,
Mathias

