Re: [Bugzilla] Proposal: Further value for field Version - namely pre 3.4.0

2014-01-22 Thread Oliver-Rainer Wittmann

Hi,

On 18.01.2014 01:20, Andrea Pescetti wrote:

Oliver-Rainer Wittmann wrote:

I would like to tag issues which occur in pre-3.4.0 versions
appropriately.


This would be helpful for me too. Probably two new categories:
- 3.3.0 or earlier
- 3.4.0-beta

This would allow all bugs to be filed appropriately. The latter category
would be useful mainly for QA purposes; I believe 3.4.0-beta does not
have a significant number of users.

We have the Version field for (in theory) the first version containing
the bug (in practice, most people set it to the version where they
observed the bug). Then we have the "Latest Confirmation On" field to
check whether the bug still applies to the latest release.



No objections have been raised in the last few days.
Thus, could someone with corresponding karma create the following new 
values for field Version in Bugzilla:

- 3.3.0 or earlier
- 3.4.0-beta

Thx in advance.

Best regards, Oliver.

-
To unsubscribe, e-mail: dev-unsubscr...@openoffice.apache.org
For additional commands, e-mail: dev-h...@openoffice.apache.org



Re: Contribute code for OOXML export

2014-01-22 Thread Steve Yin
Great news!


On Wed, Jan 22, 2014 at 3:25 PM, Clarence GUO clarence.guo...@gmail.com wrote:

 Hi~ All,
 Since Office 2007, Microsoft has defaulted to saving files in OOXML format.
 And soon, in April, Microsoft will stop supporting Office 2003, the last
 version of Office to write the binary format by default. So it becomes more
 and more important for AOO to have the capability to support the OOXML file
 format in order to help those users who need to work with OOXML files. AOO
 already has some capabilities for OOXML file import, but it needs many
 improvements. We have some pilot code for enabling OOXML export, developed
 by De Bin, Jian Yuan, Sun Ying, Jin Long... Although it still has some way
 to go before it is ready for production, we'd like to contribute it first to
 AOO for further development so that more developers can work on the framework
 and continuously contribute their work. Since it still has many feature
 gaps, we propose to put it on a branch first, continue to enhance it,
 and integrate it into a release only when we see it is ready.

 Clarence




-- 
Best Regards,

Steve Yin


Building sw/ with ninja

2014-01-22 Thread Andre Fischer
Not quite a week ago I wrote about an idea to use XML files to store the 
declarative part of our makefiles: dependencies of libraries on source 
files, which resources are to be created, and so on.  In the meantime I 
have found the time to conduct an experiment.  I am now able to build 
module sw from the XML files with the help of the ninja build 
'system' [3].  Most of the work of converting the XML files into one 
single build.ninja file was done in one weekend.  You can see the source 
code at [1] ([2] contains everything zipped together).


The results are promising.  It runs faster and the build.ninja generator 
looks more maintainable than our solenv/gbuild/... makefiles.  But I am 
certainly biased.
Before I give you some numbers, I should say that I have collected the 
numbers totally unscientifically and it may be necessary to add some 
missing steps to the ninja build.  To the best of my knowledge all C++ 
files are compiled, libraries linked, resource files built, XML files 
copied.  Only the single sw.component file somehow escaped.


I ran my experiments on an i7 2.2GHz, 8GB notebook.

Complete build of a clean module:
  gbuild  about 9m30s   (make -sr -j8)
  ninja   about 7m15s   (ninja)

Cleaning up:
  gbuild  about 40s     (make clean)
  ninja   less than 1s  (ninja -t clean)

Rebuild after touching one single header (sw/inc/section.hxx):
  gbuild  about 1m10s   (make -sr -j8)
  ninja   about 50s     (ninja)

Building an already built module (nothing to do): depends very much on 
whether the disk cache is warm or cold.  Best times:
  gbuild  more than 3s  (make -sr -j8)
  ninja   about 0.4s    (ninja)


Why is ninja faster than make/gbuild?
- Make runs each recipe in its own shell (bash); ninja executes the 
command directly.
- Ninja understands the header dependencies created by g++/clang and 
MSVC and stores them in a compact format that can be read back very 
quickly on startup.
- I avoided some build steps that are unnecessary with ninja:
  = Ninja creates directories for the targets it makes.  Gbuild creates 
them explicitly.
  = Gbuild first creates empty dependency files and later, in a second 
step, fills them with the actual dependency information created by one 
of the C/C++ compilers.
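For orientation, the depfile mechanism mentioned above looks roughly like this in ninja's file syntax. This fragment is hand-written for illustration, not taken from the generated sw build.ninja; the file names and flags are invented:

```ninja
# Illustrative fragment, not from the actual generated build.ninja.
cxxflags = -O2

rule cxx
  command = g++ $cxxflags -MMD -MF $out.d -c $in -o $out
  # g++/clang write header dependencies to $out.d; ninja parses the
  # depfile once and stores it in its compact .ninja_deps database.
  depfile = $out.d
  deps = gcc

rule link
  command = g++ -shared -o $out $in

build obj/docnode.o: cxx source/core/docnode.cxx
build lib/libsw.so: link obj/docnode.o
```

Because the dependency information is kept in a binary database rather than re-parsed from thousands of .d files, ninja's startup stays fast even for large modules.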



But, for me, these numbers are just a welcome side effect.  More 
important to me is maintainability.
Ninja follows a very different approach from (GNU) make.  Its lack of 
even the simplest control structures, such as if/then/else or foreach, 
requires that the main makefile (by default called build.ninja) be 
generated by a program or script.  This leads to my current approach:
- Use XML to represent the static data (C++ files, libraries, resource 
files, XML files).

- Use a Perl script to translate the XML files into the build.ninja file.
The best tool for each job (XML: data representation, Perl: data 
processing).  Instead of Perl we could use any language that is part of 
our current build requirements (Java, C/C++, Python (we would have to 
compile that first, though)).  Look at the Perl files in [1] or [2] 
(build/source/ninja/*pm) and compare them to solenv/gbuild/*mk and see 
which you can understand better.
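To make the XML-to-build.ninja idea concrete, here is a minimal sketch in Python (the author's implementation is in Perl). The XML schema, file names, and compiler flags below are invented for illustration and are not the format used in [1]/[2]:

```python
# Sketch of translating a declarative module description into ninja
# build statements.  Schema and paths are hypothetical.
import xml.etree.ElementTree as ET

MODULE_XML = """
<module name="sw">
  <library name="sw">
    <cxx>source/core/doc/docnew.cxx</cxx>
    <cxx>source/core/docnode/section.cxx</cxx>
  </library>
</module>
"""

def to_ninja(xml_text):
    """Emit ninja rules plus one build edge per source file and library."""
    root = ET.fromstring(xml_text)
    lines = [
        "rule cxx",
        "  command = g++ -c $in -o $out",
        "rule link",
        "  command = g++ -shared -o $out $in",
    ]
    for lib in root.iter("library"):
        objs = []
        for src in lib.iter("cxx"):
            # Flatten the source path into a unique object file name.
            obj = "obj/" + src.text.replace("/", "_").replace(".cxx", ".o")
            lines.append("build %s: cxx %s" % (obj, src.text))
            objs.append(obj)
        lines.append("build lib/lib%s.so: link %s"
                     % (lib.get("name"), " ".join(objs)))
    return "\n".join(lines) + "\n"

print(to_ninja(MODULE_XML))
```

The whole generator is a straight-line traversal of the data, which is the maintainability argument: the declarative part lives in the XML, and the procedural part is ordinary code instead of make macro expansion.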



I think this could be one way to set up a more maintainable build 
system that is even slightly faster than what we currently have.


Best regards,
Andre


[1] http://people.apache.org/~af/build/
[2] http://people.apache.org/build.zip
[3] http://martine.github.io/ninja/manual.html





Re: Building sw/ with ninja

2014-01-22 Thread Rob Weir
On Wed, Jan 22, 2014 at 8:32 AM, Andre Fischer awf@gmail.com wrote:
 [Andre's original message quoted in full; trimmed]


Do you get a sense of how well-maintained Ninja is?  Are there many
contributors?  Many users?  Are we confident it will be around in 5
years?  I worry (but only a little) about another DMake.

-Rob





Re: Building sw/ with ninja

2014-01-22 Thread Andre Fischer

On 22.01.2014 14:45, Rob Weir wrote:

On Wed, Jan 22, 2014 at 8:32 AM, Andre Fischer awf@gmail.com wrote:

[Andre's original message quoted in full; trimmed]


Do you get a sense for how well-maintained Ninja is?  Are there many
contributors?  Many users?


I only know that it was developed by/for the Chrome project [4] and that 
CMake has support for ninja as a back end.



Are we confident it will be around in 5
years?   I worry (but only a little) of another DMake.


Then I probably should not tell you that I may make a similar experiment 
with tup [5] as back end.


-Andre


[4] http://www.aosabook.org/en/posa/ninja.html
[5] http://gittup.org/tup/










Re: Building sw/ with ninja

2014-01-22 Thread Andre Fischer

On 22.01.2014 14:58, Andre Fischer wrote:

On 22.01.2014 14:45, Rob Weir wrote:
On Wed, Jan 22, 2014 at 8:32 AM, Andre Fischer awf@gmail.com wrote:
[earlier messages quoted in full; trimmed]

There are two different and independent ideas:

- Use an easy-to-read data format for expressing module data (C++ files, 
libraries, etc.) that is independent of the actual build tool.

- Use ninja as a back end.

The first part is much easier to accomplish and enables us to experiment 
with different back ends.
Our current gbuild system would be a natural choice for the first back 
end.  Pure make might be the second.  And maybe ninja would be the third.
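That separation can be sketched as one declarative module description with several interchangeable back-end emitters. All names here are hypothetical, not from the actual prototype:

```python
# One declarative data model, pluggable back ends.  Names are invented.
class Module:
    def __init__(self, name, sources):
        self.name = name
        self.sources = sources

def emit_make(mod):
    # make/gbuild-style dependency line
    objs = [s.replace(".cxx", ".o") for s in mod.sources]
    return "lib%s.so: %s\n" % (mod.name, " ".join(objs))

def emit_ninja(mod):
    # ninja-style build statement
    objs = [s.replace(".cxx", ".o") for s in mod.sources]
    return "build lib%s.so: link %s\n" % (mod.name, " ".join(objs))

# The module data stays the same; only the chosen emitter changes.
BACKENDS = {"make": emit_make, "ninja": emit_ninja}

def generate(mod, backend):
    return BACKENDS[backend](mod)

print(generate(Module("sw", ["docnew.cxx"]), "ninja"))
# -> build libsw.so: link docnew.o
```

Keeping the emitters behind one function boundary is what makes gbuild, pure make, and ninja swappable experiments rather than rewrites.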


-Andre




Re: Building sw/ with ninja

2014-01-22 Thread Rob Weir
On Wed, Jan 22, 2014 at 9:06 AM, Andre Fischer awf@gmail.com wrote:
 [earlier messages quoted in full; trimmed]
 There are two different and independent ideas:

 - Use an easy-to-read data format for expressing module data (C++ files,
 libraries, etc.) that is independent of the actual build tool.

 - Use ninja as a back end.

 The first part is much easier to accomplish and enables us to experiment
 with different back ends.

That's a fair point.  Good design is always in style.

-Rob


RE: EXTERNAL: Re: loadComponentFromURL - Solaris 11 OO 3.3

2014-01-22 Thread Steele, Raymond
Andrew, 

Thanks again for taking the time to look at this. I hope someone else can also 
provide some input.

Thanks,

Raymond



-Original Message-
From: Andrew Douglas Pitonyak [mailto:and...@pitonyak.org] 
Sent: Tuesday, January 21, 2014 8:16 PM
To: Steele, Raymond; a...@openoffice.apache.org
Subject: Re: EXTERNAL: Re: loadComponentFromURL - Solaris 11 OO 3.3


No worries regarding the confusion, language is generally imprecise :-)

Disclaimer: I have never used Java to manipulate OO.

That said, your explanation sounds plausible, and my mostly uninformed opinion 
is that you are correct. Hopefully another more familiar person can provide a 
workable solution.


On 01/21/2014 02:03 PM, Steele, Raymond wrote:
 Thanks for the reply and I apologize about the confusion.  If I use 
 OpenOffice as a regular user, the applications work just fine (i.e. writer, 
 database, etc.), but if I attempt to run any code that I've written in Java 
 that uses the loadComponentFromURL method, the application crashes with the 
 below stack trace.  Also, compiling and running FirstLoadComponent.java 
 located in sdk/examples/DevelopersGuide/FirstSteps results in the same.  If 
 I install OpenOffice 3.3 on Solaris 10, not Solaris 11 as in this case, the 
 applications run fine. I presume there is a conflict with the libraries. 
 Specifically, the libxml2.so.2 libraries, which already exist in my /usr/lib 
 (installed by other Solaris applications). It appears that OpenOffice 
 requires the libxml2.so.2 located within the /opt/openoffice.org directories 
 instead of the one installed by the Solaris apps. If I change my 
 LD_LIBRARY_PATH to include the one OpenOffice requires, I then get other 
 library references out of sync.
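One small, illustrative way to confirm the suspected shadowing is to list which directories on a colon-separated library search path carry their own copy of the library. The example path is hypothetical, not the actual Solaris layout:

```python
# Diagnostic sketch: find every directory on a search path that provides
# its own copy of a given shared library.  Example paths are invented.
import os

def find_library(search_path, name):
    """Return the directories on `search_path` containing a file whose
    name starts with `name` (e.g. 'libxml2.so')."""
    hits = []
    for d in search_path.split(":"):
        if os.path.isdir(d) and any(f.startswith(name) for f in os.listdir(d)):
            hits.append(d)
    return hits

# The dynamic linker uses the first hit, so if /usr/lib precedes the
# OpenOffice program directory, the system libxml2 shadows the bundled one.
path = "/usr/lib:/opt/openoffice.org/basis3.3/program"  # hypothetical
print(find_library(path, "libxml2.so"))
```

If two directories show up, the crash pattern in the stack trace (malloc corruption inside libxml2 calls) is consistent with the process mixing symbols from two incompatible copies.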

 Raymond
 -Original Message-
 From: Andrew Douglas Pitonyak [mailto:and...@pitonyak.org]
 Sent: Tuesday, January 21, 2014 6:49 AM
 To: a...@openoffice.apache.org; Steele, Raymond
 Subject: EXTERNAL: Re: loadComponentFromURL - Solaris 11 OO 3.3

 Given all of the lists that you copied, I assumed that you were not 
 subscribed so I copied you directly. Please respond to the list rather than 
 me directly so that others will see your response...

 On 01/15/2014 07:00 PM, Steele, Raymond wrote:
 Hello,

 I have OpenOffice 3.3 installed and running on my Solaris x86 system.
 Things run under normal user experience, but if I run a custom 
 application it fails on

 loadComponentFromURL("private:factory/scalc", "_blank", 0, loadProps).

 This code works fine on a Solaris 10 system running OO 3.3.  Here is some of 
 the stack trace that java produces.
 I am unclear on what you are saying, since you say things like "this code 
 works fine" and "here is a stack trace". Are you saying:

 (1) You have code that works on one computer but not on another

 (2) this code always fails

 If (1), then there is a problem with your installation or a bug in OOo.
 If (2), your code is probably wrong.

 What language did you use? Basic, Java, C++? My guess is Basic, but your 
 claimed error seems wrong for that.

 If (1) then there is probably a problem in Java version, linked libraries, or 
 similar.

 If (2), have you tried a simple version written in Basic? Can you show a few 
 lines before and after the one line that causes the problem. This is not even 
 an entire line of code (since loadComponentFromURL is a method on the desktop 
 object). Would be nice to see how loadProps is declared and what it contains.



 Register to memory mapping:

 EAX=0x090bbe40 is an unknown value
 EBX=0xfe762000: _GLOBAL_OFFSET_TABLE_+0 in /libc.so.1 at 0xfe60

 Stack [0xddc51, 0xddd4f000], sp=0xddd4c520, free space=1005k 
 Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native 
 code)
 C [libc.so.1]  t_delete+0x41
 C [libc.so.1] realfree+0x5e
 C [libc.so.1]   _malloc_unlocked+0x1d2
 C [libc.so.1]   malloc+0x38
 C [libxml2.so.2]   xmlXPathNewParserContext+0x28
 C [libxml2.so.2]   xmlXPathEval+0x92
 C [libunoxml.so+0x81131]


 Any help would be greatly appreciated. Also, I had to type this stack trace 
 so I only provided what I thought was pertinent. Please let me know if you 
 need more.

 Thanks,
 Raymond Steele


 --
 Andrew Pitonyak
 My Macro Document: http://www.pitonyak.org/AndrewMacro.odt
 Info:  http://www.pitonyak.org/oo.php



--
Andrew Pitonyak
My Macro Document: http://www.pitonyak.org/AndrewMacro.odt
Info:  http://www.pitonyak.org/oo.php






Re: config error on Windows 7

2014-01-22 Thread jonasalfreds...@gmail.com
The searched registry key does not exist, but registry key 
HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\VisualStudio\9.0\Setup\VC 
exists, which is somehow found by oowintool - see sub 
reg_get_value($) in oowintool. I observed this situation on three 
Windows 7 machines (virtual ones and non-virtual ones) on which I have 
set up an AOO build environment. 

Did you find the same registry key on your system? 

I found the key on my system as well, but in my case oowintool does not
find it.
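The behavior being discussed (a 32-bit tool on 64-bit Windows finding keys under Wow6432Node) can be sketched as a lookup that tries the key as given and then the redirected spelling. A dict stands in for the Windows registry here; the key and value are illustrative, and this is not oowintool's actual code:

```python
# Sketch of the lookup order a reg_get_value-style helper needs on
# 64-bit Windows: try the key as given, then the Wow6432Node variant
# that the 32-bit registry redirector uses.  Data is invented.
REGISTRY = {
    r"HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\VisualStudio\9.0\Setup\VC\ProductDir":
        r"C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC",
}

def reg_get_value(key, registry=REGISTRY):
    if key in registry:
        return registry[key]
    # Fall back to the 32-bit-redirected location of the same key.
    redirected = key.replace("SOFTWARE\\", "SOFTWARE\\Wow6432Node\\", 1)
    return registry.get(redirected)

print(reg_get_value(
    r"HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\VisualStudio\9.0\Setup\VC\ProductDir"))
```

A tool that only checks one of the two spellings will behave differently from machine to machine, which would explain why the key is found on some Windows 7 setups and not others.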




--
View this message in context: 
http://openoffice.2283327.n4.nabble.com/config-error-on-Windows-7-tp4652383p4658198.html
Sent from the Development mailing list archive at Nabble.com.




Re: Call for Comments: Apache OpenOffice Distributor Best Practices

2014-01-22 Thread Donald Whytock
On Mon, Jan 20, 2014 at 2:14 PM, Rob Weir robw...@apache.org wrote:

 On Thu, Dec 5, 2013 at 9:47 AM, Rob Weir robw...@apache.org wrote:
  Details are here:
 
  https://blogs.apache.org/OOo/entry/call_for_comments_apache_openoffice
 

 It has been over a month since we put out this call for comments.  You
 can see some of them in this thread, as well as with the blog post:


 https://blogs.apache.org/OOo/entry/call_for_comments_apache_openoffice#comments

 The response was generally positive.  However, the volume of responses
 was rather low.  So I do wonder whether there is a large unmet need
 for this.  For example, we have not (to my knowledge) received
 requests for a CD on the mailing list in months now.

 Another data point:  the webpage that is #1 in Google search results
 for the query "openoffice cd" is:

 http://www.openoffice.org/distribution/cdrom/

 It receives around 7 visits per day.  Any proposal we came up with
 would be findable to users mainly through that same mechanism --
 searching Google.  Is it worth setting something up for 7 users per
 day?

 Note:  if we removed our web pages that discuss OpenOffice CDs, the
 top link would be a vendor on Amazon.com selling an OpenOffice CD.  So
 in a sense, if we just get out of the way, it would tend to work.
 The risk would be if we see vendors starting to scam users.

 Next steps?  If anyone really wants to have a CD distributor listing,
 I can help.  But it is not sufficiently high on my priority list to
 carry this by myself.  Someone else would need to take the lead.

 Regards,

 -Rob


If you want to take a "get out of the way" approach, would you nevertheless
want to put up signature files for official releases, such that anything
one does buy can at least be verified before it's installed?

Might be too late at that point to get one's money back, but it could save
the buyer some grief with his machine.  And give the buyer grounds to out a
fraudulent purveyor.

Don


Re: Call for Comments: Apache OpenOffice Distributor Best Practices

2014-01-22 Thread Rob Weir
On Wed, Jan 22, 2014 at 11:05 AM, Donald Whytock dwhyt...@apache.org wrote:
 On Mon, Jan 20, 2014 at 2:14 PM, Rob Weir robw...@apache.org wrote:

 On Thu, Dec 5, 2013 at 9:47 AM, Rob Weir robw...@apache.org wrote:
  Details are here:
 
  https://blogs.apache.org/OOo/entry/call_for_comments_apache_openoffice
 

 It has been over a month since we put out this call for comments.  You
 can see some of them in this thread, as well as with the blog post:


 https://blogs.apache.org/OOo/entry/call_for_comments_apache_openoffice#comments

 The response was generally positive.  However, the volume of responses
 was rather low.  So I do wonder whether there is a large unmet need
 for this.  For example, we have not (to my knowledge) received
 requests for a CD on the mailing list in months now.

 Another data point:  the webpage that is #1 in Google search results
 for the query "openoffice cd" is:

 http://www.openoffice.org/distribution/cdrom/

 It receives around 7 visits per day.  Any proposal we came up with
 would be findable to users mainly through that same mechanism --
 searching Google.   Is it worth setting something up for 7 users per
 day?

 Note:  if we removed our web pages that discuss OpenOffice CD's, the
 top link would be a vendor on Amazon.com selling an OpenOffice CD.  So
 in a sense, if we just "get out of the way", it would tend to work.
 The risk would be if we see vendors starting to scam users.

 Next steps?  If anyone really wants to have a CD distributor listing,
 I can help.  But it is not sufficiently high on my priority list to
 carry this by myself.  Someone else would need to take the lead.

 Regards,

 -Rob


 If you want to take a "get out of the way" approach, would you nevertheless
 want to put up signature files for official releases, such that anything
 one does buy can at least be verified before it's installed?


I think getting our installers digitally signed is important for many
reasons.  At the very least it reduces user confusion during the
download and install process.

However it won't prevent the most common kinds of abuses.  We're not
really seeing people modify the AOO installer and putting malware into
the AOO installer.  What we see is someone creating a new installer
or downloader and advertising that for the users to download.  This
program installs the malware and then as the last step it downloads
and installs the original, unmodified AOO installer.

So even if we are digitally signed, it doesn't help in this case.  The
damage is already done before the real AOO installer is even launched.

One idea, and maybe this would cause users to panic more than we want
to, would be this:

As the first screen of the install program, have a screen that says:

"Important:  If you did not download this program from a known safe
website then you may be at risk from viruses, etc.  Apache OpenOffice
is free for all users.  You should not need to pay for it.  If
immediately before this screen you were asked to install other
software applications, or asked to authorize payment for OpenOffice,
then you may have been scammed.  Read here for more information..."

Of course, we have nothing officially against selling OpenOffice, etc.
So we would want to make it easier for a real programmer to disable
this screen in the installer.  Still, it might have some value, even
though it comes one step too late to really prevent the problem.

Regards,

-Rob


 Might be too late at that point to get one's money back, but it could save
 the buyer some grief with his machine.  And give the buyer grounds to out a
 fraudulent purveyor.

 Don

-
To unsubscribe, e-mail: dev-unsubscr...@openoffice.apache.org
For additional commands, e-mail: dev-h...@openoffice.apache.org



Fw: Email Capable?

2014-01-22 Thread george.zurek

- Original Message - 
From: george.zu...@att.net 
To: contac...@openoffice.us.com 
Sent: Tuesday, January 21, 2014 11:33 AM
Subject: Email Capable?


My laptop's OS is Windows 8.
I migrated my email address from the UK. It is a POP3 account that has been 
configured with a one-month trial of Microsoft Office 365. I am told that I need 
to purchase this product to be able to continue to use my 
email (george.zu...@btconnect.com). Would OpenOffice enable me to use my email?
Kind regards,
George Zurek.

Re: Building sw/ with ninja

2014-01-22 Thread Armin Le Grand

Hi Andre,

On 22.01.2014 07:06, Andre Fischer wrote:

On 22.01.2014 14:58, Andre Fischer wrote:

8<--------8<--------
I only know that it was developed by/for the chrome project [4] and 
that cmake has support for ninja as back end.



Are we confident it will be around in 5
years?   I worry (but only a little) about another DMake.


Then I probably should not tell you that I may make a similar 
experiment with tup as backend.




This is wonderful news, the build system is one of the 'blocking' 
factors for further development. Thank you very much for doing this 
experiments and driving this forward.



There are two different and independent ideas:

- Use an easy to read data format for expressing module data (C++ 
files, libraries, etc.) that is independent from the actual build tool.


- Use ninja as a back end.

The first part is much easier to accomplish and enables us to make 
experiments regarding different back ends.
Our current gbuild system would be a natural choice for the first back 
end.  Pure make might be the second.  And maybe ninja would be the third.
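[Editorial illustration: such a build-tool-independent module description might look like the fragment below; the element and attribute names are invented, as the actual format was still to be designed.]

```xml
<!-- Hypothetical module description; element and attribute names
     are invented for illustration. -->
<library name="sw">
  <cxxobject path="sw/source/core/doc/docnew"/>
  <cxxobject path="sw/source/core/txtnode/ndtxt"/>
  <linked-library name="svt"/>
</library>
```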


I think that separation is a very good approach. Backends could be the 
numbered ones, but also script-created stuff for eclipse and msdev 
(probably?).


We will have to make sure then that - when different ways exist to build 
the office - we all still build the same. We already have issues 
when comparing stuff, e.g. when looking into bugfixes on different 
systems; we do not really want this on the same system for different 
ways of building the office.


This also means - if I get you right - that the creation of the needed 
build info from the xml description is an integral part of the build 
system, probably the first step. Thus, changing it (e.g. adding a new 
file to the AOO build tree) will (and has to) always be done in the xml 
description. Then to rebuild the needed stuff - dependent on the build 
method, e.g. ninja - the build system needs to have a dependency that 
leads to recreation of the needed build data itself, then to the build 
of AOO.


When (just dreaming) applying this to something like Eclipse or msdev 
this would mean that the created data for these 'helpers' would need to 
be somehow automatically recreated/reloaded on the fly. Would something 
like that be possible...?


I also see making the build faster as a nice side effect, better 
readability/maintainability is the biggest plus.

And reliable global dependencies would be a big plus, too, of course...

Let's drive this forward. I'm ready to help converting modules once it's 
working in principle and the transition step is defined!


Sincerely,
Armin



-Andre



-Andre


[4] http://www.aosabook.org/en/posa/ninja.html
[5] http://gittup.org/tup/



8<--------8<--------






Re: Install error on nightly Windows build (r1560073)

2014-01-22 Thread Herbert Duerr

Hi Rob,

On 22.01.2014 17:29, Rob Weir wrote:

Installing on a clean Windows 7 64-bit image.

I get an error when configuring Microsoft Visual C++ 2008 Redistributable

Error 1935 An error occurred during the installation of assembly
'policy.9.0.Microsoft.VC90.CRT,version=9.0.30729.6161
HRESULT:0x800736B3

Anyone else seeing this?


Apparently, as there is a knowledge base article for that [1]. They 
recommend running a tool [2] to fix it.


[1] http://support.microsoft.com/kb/970652
[2] http://support.microsoft.com/default.aspx?scid=kb;EN-US;946414

Herbert




Re: Install error on nightly Windows build (r1560073)

2014-01-22 Thread Rob Weir
On Wed, Jan 22, 2014 at 11:39 AM, Herbert Duerr h...@apache.org wrote:
 Hi Rob,


 On 22.01.2014 17:29, Rob Weir wrote:

 Installing on a clean Windows 7 64-bit image.

 I get an error when configuring Microsoft Visual C++ 2008 Redistributable

 Error 1935 An error occurred during the installation of assembly
 'policy.9.0.Microsoft.VC90.CRT,version=9.0.30729.6161
 HRESULT:0x800736B3

 Anyone else seeing this?


 Apparently, as there is a knowledge base article for that [1]. They recommend
 running a tool [2] to fix it.


This is odd, since this is a fresh VM image of Windows 7, with no
other applications ever installed.  Only Windows and Windows updates.
So it would be odd for there to be registry corruption.  Unless it
came from Windows updates...

Do you know if we used the same version of the  MSVCRT package with 4.0.1?

-Rob


 [1] http://support.microsoft.com/kb/970652
 [2] http://support.microsoft.com/default.aspx?scid=kb;EN-US;946414

 Herbert




Re: Building sw/ with ninja

2014-01-22 Thread Andre Fischer

On 22.01.2014 17:28, Armin Le Grand wrote:

Hi Andre,

On 22.01.2014 07:06, Andre Fischer wrote:

On 22.01.2014 14:58, Andre Fischer wrote:

8<--------8<--------
I only know that it was developed by/for the chrome project [4] and 
that cmake has support for ninja as back end.



Are we confident it will be around in 5
years?   I worry (but only a little) about another DMake.


Then I probably should not tell you that I may make a similar 
experiment with tup as backend.




This is wonderful news, the build system is one of the 'blocking' 
factors for further development. Thank you very much for doing this 
experiments and driving this forward.



There are two different and independent ideas:

- Use an easy to read data format for expressing module data (C++ 
files, libraries, etc.) that is independent from the actual build tool.


- Use ninja as a back end.

The first part is much easier to accomplish and enables us to make 
experiments regarding different back ends.
Our current gbuild system would be a natural choice for the first 
back end.  Pure make might be the second.  And maybe ninja would be 
the third.


I think that separation is a very good approach. Backends could be the 
numbered ones, but also script-created stuff for eclipse and msdev 
(probably?).


Jan is working on a similar approach for msdev.  Eclipse has its own 
idea of how building a project works.  I can not say if that can, 
eventually, be mapped to our build system.
But I am currently working on a small Eclipse addon that provides a few 
buttons (or menu entries, etc.) that start a build of the current 
module, directory or file.  That might be enough for the time being.




We will have to make sure then that - when different ways exist to 
build the office - we all still build the same. We already have 
issues when comparing stuff, e.g. when looking into bugfixes on 
different systems; we do not really want this on the same system for 
different ways of building the office.


Good point.  As the final outcome we should have only one build system; 
the ability to have different back ends is primarily interesting for 
evaluating different replacements of the current system.  At best there 
could be a secondary build system for the integration into IDEs.  But 
for building releases and reporting bugs we should use only one build 
system.




This also means - if I get you right - that the creation of the needed 
build info from the xml description is an integral part of the build 
system, probably the first step. Thus, changing it (e.g. adding a new 
file to the AOO build tree) will (and has to) always be done in the xml 
description. Then to rebuild the needed stuff - dependent on the build 
method, e.g. ninja - the build system needs to have a dependency that 
leads to recreation of the needed build data itself, then to the build 
of AOO.


Think of the XML files as a replacement, with a different syntax, for 
the makefiles like Library_sw.mk.  The only difference is that make can 
not include these files directly but has to translate them first into 
makefiles.  This could be done with a simple make rule like


Library_sw.mk : Library_sw.xml
	xml2mk $< > $@

and then include Library_sw.mk
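[Editorial illustration: the xml2mk translation step could be sketched as below; the XML element names and the generated gbuild calls are illustrative assumptions, not the actual format.]

```python
# Sketch of an xml2mk translator: read a module description such as
# Library_sw.xml and emit gbuild-style makefile text.
# The XML layout (library/cxxobject) is an invented example format.
import xml.etree.ElementTree as ET

def xml_to_mk(xml_text: str) -> str:
    root = ET.fromstring(xml_text)
    name = root.get("name")
    lines = [f"$(eval $(call gb_Library_Library,{name}))"]
    objects = [e.get("path") for e in root.findall("cxxobject")]
    if objects:
        lines.append(f"$(eval $(call gb_Library_add_exception_objects,{name},\\")
        lines.extend(f"\t{path} \\" for path in objects)
        lines.append("))")
    return "\n".join(lines) + "\n"

example = '<library name="sw"><cxxobject path="sw/source/core/doc/docnew"/></library>'
print(xml_to_mk(example))
```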

For ninja this is already working in my experiment.  Change one of the 
xml files and build.ninja is rebuilt, included, and the updated build 
rules executed.
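[Editorial illustration: this regeneration dependency maps naturally onto ninja's generator rules; the xml2ninja tool name is hypothetical.]

```ninja
# Rules marked as generators re-create the build file itself; ninja
# reloads build.ninja automatically when this target rebuilds it.
rule regen
  command = xml2ninja Library_sw.xml > build.ninja
  description = Regenerating build.ninja
  generator = 1

build build.ninja: regen Library_sw.xml
```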




When (just dreaming) applying this to something like Eclipse or msdev 
this would mean that the created data for these 'helpers' would need 
to be somehow automatically recreated/reloaded on the fly. Would 
something like that be possible...?


It depends on how much work one wants to invest.  It would be possible 
to write an addon for editing the xml files of the build system.  It is 
also possible, but more difficult, to add some hook that adds a new C++ 
file to the corresponding XML file when that C++ file is created via an 
Eclipse wizard.




I also see making the build faster as a nice side effect, better 
readability/maintainability is the biggest plus.

And reliable global dependencies would be a big plus, too, of course...

Let's drive this forward. I'm ready to help converting modules once 
it's working in principle and the transition step is defined!


Thank you, that is great to hear.

-Andre



Sincerely,
Armin



-Andre



-Andre


[4] http://www.aosabook.org/en/posa/ninja.html
[5] http://gittup.org/tup/



8<--------8<--------








unoinfo-bug in latest MacOSX snapshot still present (Re: [RELEASE]: snapshot build for Mac and Windows based on revision

2014-01-22 Thread Rony G. Flatscher

On 13.01.2014 14:01, Jürgen Schmidt wrote:
 On 1/13/14 2:00 PM, Jürgen Schmidt wrote:
 Hi,

 I have uploaded a new snapshot for Mac and Windows based on revision
 1521921. I also updated the related wiki page under [1] (quite painful
 after the confluence update).

 An overview of changes/fixes in this snapshot since AOO 4.0.1 can be
 found under [2].

 Mac is still 32 bit but my plan is that the next snapshot and future
 versions will be 64 bit.

 Linux is not yet available via the snapshot page because of some
 problems with the build machines. I recommend the builds from the Apache
 builds bots. The difference is only that the Apache bots have a newer
 baseline but will work on newer Linux systems.

 I plan to provide patch sets for Windows in the next days together with
 additional information how to test and use them.

 Further languages and updates of existing languages will be integrated
 in the next snapshot.

 Juergen

 [1]
 https://cwiki.apache.org/confluence/display/OOOUSERS/Development+Snapshot+Builds

 [2]
 http://people.apache.org/~jsc/developer-snapshots/snapshot/AOO4.1.0_Snapshot_fixes_1524958_1556251.html

 -

Just downloaded the latest snapshot build for MacOSX (en-us, rev. 1556251) and 
found that the
unoinfo-bug reported in https://issues.apache.org/ooo/show_bug.cgi?id=123475 
is still present,
preventing Java programs that use "unoinfo java" for setting the classpath 
from being able to interact with AOO.

As the issue might not be too visible, yet the bug effectively inhibits Java 
from using AOO via unoinfo, it seems that it should be fixed before the final 
release of 4.1.0.

---rony






Re: First look at www.openoffice.org accesibility

2014-01-22 Thread Marcus (OOo)

On 01/22/2014 03:50 AM, Nancy K wrote:

I don't know if any of these links will help -

Invisible content just for screen reader users:
http://webaim.org/techniques/css/invisiblecontent/#techniques


Thanks for this. The H1 text is now styled with display: none;.
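[Editorial caveat, from the WebAIM article linked above: "display: none" typically hides content from screen readers as well, so an off-screen style keeps a heading accessible. The class name below is illustrative.]

```css
/* Off-screen positioning keeps text available to screen readers,
   unlike display: none, which most screen readers also skip. */
.hidden-heading {
  position: absolute;
  left: -10000px;
  top: auto;
  width: 1px;
  height: 1px;
  overflow: hidden;
}
```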


Cynthia Says for Section 508/WCAG2.0 (A thru AAA) accessibility (enter the url 
online):
http://www.cynthiasays.com/?

Colorblind tests (enter url online):
http://colorfilter.wickline.org/

html5 validator:
html5.validator.nu


I've not yet looked into these webpages but I fear we will have much to 
do. ;-)


Marcus





  From: Rob Weir apa...@robweir.com
To: dev@openoffice.apache.org
Sent: Tuesday, January 21, 2014 1:39 PM
Subject: Re: First look at www.openoffice.org accesibility


On Tue, Jan 21, 2014 at 4:09 PM, Marcus (OOo) marcus.m...@wtnet.de wrote:




In the meantime it's already online on www.oo.o. I've added a h1 tag and
fixed the double-link problem.

@Rob:
Please can you test if this is now OK in the screen reader?

Thanks




Here's the tool I used to check:

http://wave.webaim.org

The duplicate links problem is gone.  That's good news.

The error about the missing h1 is gone.  But now it gives an error
for the h1 with no content.

I wonder whether the real solution here is to make those main
options into h1's and update the CSS accordingly?  If we use a
specific class for those headers we won't conflict with the h1's on
other pages, which are styled differently.

The only other error we have on the home page (and the other templated
pages) is the lack of the language identifier, and it sounds like Dave
had a good solution there.

Regards,

-Rob





Re: Building sw/ with ninja

2014-01-22 Thread Pedro Giffuni
Excuse the intermission as I have been very busy lately on my other pet 
project ...


+1 Ninja

We have seen it speed up things in some of FreeBSD's ports plus it is 
under an Apache License (which is probably not as relevant ... but is 
good).


Cheers,

Pedro.




Re: First look at www.openoffice.org accesibility

2014-01-22 Thread Marcus (OOo)

[Top post to give you the latest update]

I've made fixes on the homepage [1] and the download webpage [2].

The remaining things to fix are a missing title in the search box (top 
right) and the missing lang ID. Both apply to the entire website and 
not only to these two pages.


To be continued ...

[1] http://www.openoffice.org
[2] http://www.openoffice.org/download/

Marcus



On 01/21/2014 10:39 PM, Rob Weir wrote:

On Tue, Jan 21, 2014 at 4:09 PM, Marcus (OOo) marcus.m...@wtnet.de wrote:




In the meantime it's already online on www.oo.o. I've added a h1 tag and
fixed the double-link problem.

@Rob:
Please can you test if this is now OK in the screen reader?

Thanks




Here's the tool I used to check:

http://wave.webaim.org

The duplicate links problem is gone.  That's good news.

The error about the missing h1 is gone.  But now it gives an error
for the h1 with no content.

I wonder whether the real solution here is to make those main
options into h1's and update the CSS accordingly?  If we use a
specific class for those headers we won't conflict with the h1's on
other pages, which are styled differently.

The only other error we have on the home page (and the other templated
pages) is the lack of the language identifier, and it sounds like Dave
had a good solution there.

Regards,

-Rob





Re: Building sw/ with ninja

2014-01-22 Thread Andre Fischer

On 22.01.2014 23:13, Pedro Giffuni wrote:
Excuse the intermission as I have been very busy lately on my other 
pet project ...


You are always welcome.



+1 Ninja

We have seen it speed up things in some of FreeBSD's ports plus it is 
under an Apache License (which is probably not as relevant ... but is 
good).


Do you know if anyone has written down their experience with porting to 
ninja?


-Andre



Cheers,

Pedro.




Re: Install error on nightly Windows build (r1560073)

2014-01-22 Thread Herbert Duerr

Hi Rob,

On 22.01.2014 17:56, Rob Weir wrote:

On Wed, Jan 22, 2014 at 11:39 AM, Herbert Duerr h...@apache.org wrote:

[...]
Apparently, as there is a knowledge base article for that [1]. They 
recommend running a tool [2] to fix it.



This is odd, since this is a fresh VM image of Windows 7, with no
other applications ever installed.  Only Windows and Windows updates.



So it would be odd for there to be registry corruption.  Unless it
came from Windows updates...


Bingo! The knowledge base article mentions that "An unsuccessful install 
of prior Windows Updates OR Custom MSI packages" could have left over 
faulty registry keys under 
HKEY_LOCAL_MACHINE\COMPONENTS.


So the Windows updates probably caused the problem. It's hard to 
believe, but even Windows has bugs ;-)



Do you know if we used the same version of the  MSVCRT package with 4.0.1?


The deliver log [1] of the nightly build showed that everything seemed 
normal.


[1] 
http://ci.apache.org/projects/openoffice/buildlogs/win/main/external/wntmsci12.pro/misc/logs/external_deliver.txt


Herbert
