[dev] Window title not displayed in OOo 3.1.1

2009-12-14 Thread true801
Hi

The following macro creates a simple window with a title. OOo 2.4.1 shows the
title normally, while OOo 3.1.1 seems to ignore it. Does anyone know what's
wrong?


Thanks


Option Explicit

Sub Main
Dim oNewFrame As Object
Dim oNewWindow As Object
  oNewWindow = CreateWindow( GetToolkit(), MakeRect(50, 50, 250, 150) )
  oNewFrame = CreateNewFrame( oNewWindow, "New Frame" )
  oNewFrame.setPropertyValue( "Title", "title" )
  StarDesktop.getFrames().append(oNewFrame)
End Sub


Function GetToolkit()
  GetToolkit = GetProcessServiceManager. _
createInstanceWithContext( _
"com.sun.star.awt.Toolkit", GetDefaultContext )
End Function


Function CreateNewFrame( oLocWindow As Object, _
sLocFrameName As String ) As Object
Dim oLocNewFrame As Object
  oLocNewFrame = _
GetProcessServiceManager.createInstance("com.sun.star.frame.Frame")
  With oLocNewFrame
.initialize(oLocWindow)
.setCreator(StarDesktop)
.setName(sLocFrameName)
  End With
  CreateNewFrame = oLocNewFrame
End Function


Function CreateWindow( oLocToolkit As Object, _
  aLocRect As com.sun.star.awt.Rectangle ) As Object
Dim oLocWinDesc
  oLocWinDesc = createUnoStruct("com.sun.star.awt.WindowDescriptor")
  With oLocWinDesc
.Type = com.sun.star.awt.WindowClass.TOP
.Bounds = aLocRect
  End With
  With com.sun.star.awt.WindowAttribute
oLocWinDesc.WindowAttributes = _
  .MOVEABLE + .CLOSEABLE + .BORDER + .SHOW + .SIZEABLE
  End With
  CreateWindow = oLocToolkit.createWindow(oLocWinDesc)
End Function


Function MakeRect( nX As Long, nY As Long, _
  nWidth As Long, nHeight As Long ) As com.sun.star.awt.Rectangle
Dim oLocRect
  oLocRect = createUnoStruct("com.sun.star.awt.Rectangle")
  With oLocRect
.X = nX
.Y = nY
.Width = nWidth
.Height = nHeight
  End With
  MakeRect = oLocRect
End Function 
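Not an authoritative fix, but one thing that changed in the 3.x line is title handling: from OOo 3.0 on, frames implement the com.sun.star.frame.XTitle interface, and setting the title through it may behave differently from the Title property. A minimal sketch (the helper name is made up for illustration; that the frame actually supports XTitle is an assumption to verify):

```basic
Sub SetFrameTitle( oFrame As Object, sTitle As String )
  ' Hypothetical helper: from OOo 3.0 on, frames implement
  ' com.sun.star.frame.XTitle; setTitle() may work where the
  ' Title property appears to be ignored.
  oFrame.setTitle( sTitle )
End Sub
```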


-
To unsubscribe, e-mail: dev-unsubscr...@openoffice.org
For additional commands, e-mail: dev-h...@openoffice.org



[dev] Libraries moved from svx module

2009-12-14 Thread Mathias Bauer
Hi,

starting with the DEV300m68 build the libraries cui and msfilter are
not built in svx anymore.

The code for the cui library has been moved to a module of its own called
(surprise!) cui. It also got its own resource file, including its own
resource IDs and Help IDs. As a result, the svx resource file shrank
by a little more than 50%, and the build time of the svx module is also
reduced considerably.

The code for the msfilter library has been moved into a sub folder in
the filter module (filter/source/msfilter and filter/inc/filter/msfilter).

Please don't put new files into svx/source/cui and svx/source/msfilter
anymore. Instead, please resync to m68 once it is ready, and then add
your files to the appropriate folder in cui or to filter/source/msfilter.

If you want to add resources to CUI and for whatever reason want to make
them publicly known, please add them to the appropriate section in
svx/inc/svx/dialogs.hrc, where starting with DEV300m68 comments explain
where these shared IDs are used.

Regards,
Mathias

-- 
Mathias Bauer (mba) - Project Lead OpenOffice.org Writer
OpenOffice.org Engineering at Sun: http://blogs.sun.com/GullFOSS
Please don't reply to nospamfor...@gmx.de.
I use it for the OOo lists and only rarely read other mails sent to it.




[dev] New module svl

2009-12-14 Thread Mathias Bauer
Hi,

starting with DEV300m68 a new module svl will be part of our build.
The code in this module isn't just a 1:1 copy of the former svl
library built in the svtools module; it is a library that contains as
much code as possible from svtools that does not have any vcl or toolkit
dependency. This allowed linking the following modules without sfx2,
svtools, vcl and toolkit (after some more or less massive code massage):

- connectivity
- xmloff
- linguistic
- xmlhelp
- lingucomponent
- embedserv

embeddedobj will follow (it has remaining svtools and vcl dependencies
that will be fixed in cws svxsplit, issue 107449).

Some code from svtools has been moved into unotools (character
conversion, locale settings) as that allowed to use it in vcl also.

Please refrain from reintroducing dependencies on svtools, toolkit,
goodies, or vcl into these libraries or any other libraries that do not
already depend on one of them. Such code is best put into svtools
itself (or libraries located higher than it).

Regards,
Mathias

-- 
Mathias Bauer (mba) - Project Lead OpenOffice.org Writer
OpenOffice.org Engineering at Sun: http://blogs.sun.com/GullFOSS
Please don't reply to nospamfor...@gmx.de.
I use it for the OOo lists and only rarely read other mails sent to it.





[dev] Test Cleanup

2009-12-14 Thread Stephan Bergmann

Hi all,

I just embarked on a new project, namely to clean up and consolidate the 
various test frameworks and corresponding tests available in the OOo 
build environment.  These include at least:


- C++ unit tests, based on CppUnit and/or testshl2, and Java unit tests, 
based on JUnit and/or qadevOOo.  These are scattered across the code 
base (*/qa, */test, */workben, testtools/, ...), some are executed 
during a regular build (o3tl/qa, basegfx/test, basebmp/test) but most 
are not even compiled during a regular build (and thus rot over time). 
Some of the tests need no special environment, while others require a 
working UNO runtime environment.


- The so-called UNO-API and Complex tests.  These are located in 
*/qa/unoapi and */qa/complex, use OOoRunner from qadevOOo, and can be 
executed via cwscheckapi or checkapi.  They require an installed OOo, 
which cwscheckapi takes care of.  They are not compiled or executed 
during a regular build (they clearly cannot be executed, as they require 
an installed OOo), but it is expected that cwscheckapi is manually 
executed for each CWS.


- The smoke test in smoketestoo_native.  It requires an installed OOo, 
which smoketest.pl takes care of.  It is executed at the end of a 
regular build.


- The ConvWatch and Performance tests, that can be started from the EIS 
page of a CWS.  They require an installed OOo (and also the installation 
of a corresponding master-workspace OOo, for result comparison), which 
they take care of.  They are not executed during a regular build, but it 
is rather expected that they are manually triggered from EIS for each 
CWS (where they are executed asynchronously on dedicated machines, and 
their results made available in EIS).


What is *not* covered (for now?) are the automatic QA tests based on 
testtool, as well as the portability tests (so to speak) of building 
OOo on a wide range of platforms via buildbots and tinderboxes.


The two main problems with the tests listed above appear to be that (a) 
many of them require an OOo installation, and they all invented their 
own ways of providing one, and all those ways are brittle and start to 
fail sooner or later, and (b) the tests that are not compiled or 
executed during each build (CWS as well as master) start to rot sooner 
or later.  A third problem probably is that the tests and test 
frameworks are often poorly documented and do things in non-standard 
ways (e.g., testshl2 vs. plain CppUnit), so that it is not easy to 
maintain existing tests and write additional ones.


I would like to address these problems.  My guiding vision in doing so 
is the following perfect world:  There is one OOo installation in the 
solver.  (Ideally, it would automatically emerge from delivering the 
various files directly to the appropriate locations in the solver.)  All 
the tests that require an OOo installation use that one installation. 
(They do not modify it.  Each test probably has its own, throw-away 
UserInstallation directory, and soffice is started with appropriate 
switches to not show unwanted first start wizards etc.)  All the tests 
are written using standard tools (the xUnit frameworks: CppUnit and 
JUnit, respectively).  For tests that have specific requirements on their 
environment (i.e., require a working UNO runtime environment, or an OOo 
installation), there are library routines available to set up/tear down 
such environments, to be called from the xUnit setUp/tearDown methods. 
Generally, tests are compiled and executed during every regular build. 
For tests which absolutely cannot be executed during every regular build 
(maybe because they are too expensive, or require a dedicated machine, 
like could be the case for the performance test), the main way to 
execute them is still to have some (manual) makefile target for them. 
(There may be additional convenience mechanisms, like buttons in EIS, 
but they are strictly secondary.)
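The throw-away UserInstallation idea above can be sketched as a shell fragment. The switch names are the ones OOo-era builds understand; treat the exact flags as assumptions and check your build's soffice -h output:

```shell
# Sketch: compose (but do not launch) a soffice invocation that uses a
# throw-away user profile, so a test can neither dirty nor depend on the
# real installation's profile.
USERDIR=$(mktemp -d)
SOFFICE_CMD="soffice -env:UserInstallation=file://$USERDIR -norestore -nofirststartwizard -headless"
echo "$SOFFICE_CMD"
```

Each test would start soffice with its own such $USERDIR and delete the directory in tearDown.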


I know that this picture is not perfectly realistic, and that there will 
be obstacles along the way that require pragmatic workarounds.  Still, I 
think it is important to know what the ideal should look like, even if 
you have to deviate from it.


As a first step, I set up CWS sb118 to experiment, as a showcase and to 
gain further insight, with turning smoketestoo_native into such an ideal 
test.  As it turned out, the first thing I had to do on that CWS was to 
replace the heavily modified CppUnit 1.8 currently used by OOo with a 
plain unmodified latest version of CppUnit 1.12.1.


Comments on all of this are, of course, very welcome.

-Stephan




Re: [dev] Test Cleanup

2009-12-14 Thread Steffen Grund

Hello,

see my 2 cents below.

Stephan Bergmann wrote:

> Hi all,
>
> I just embarked on a new project, namely to clean up and consolidate the
> various test frameworks and corresponding tests available in the OOo
> build environment.  These include at least:
> [...]
> - The so-called UNO-API and Complex tests.  These are located in
> */qa/unoapi and */qa/complex, use OOoRunner from qadevOOo, and can be
> executed via cwscheckapi or checkapi.  They require an installed OOo,
> which cwscheckapi takes care of.  They are not compiled or executed
> during a regular build (they clearly cannot be executed, as they require
> an installed OOo), but it is expected that cwscheckapi is manually
> executed for each CWS.

Not directly connected, but I would like to change something: as far as
possible (from a build-order perspective), I plan to include all Java
tests in the build, where this has not been done already. Tests can then
be executed with dmake or cwscheckapi.
Execution while building is not planned - nearly all tests need an
installed, runnable office to execute.

When no Java environment is set, the tests will of course not be compiled.

-Steffen




Re: [dev] Test Cleanup

2009-12-14 Thread bjoern michaelsen - Sun Microsystems - Hamburg Germany
On Mon, 14 Dec 2009 15:06:45 +0100
Stephan Bergmann stephan.bergm...@sun.com wrote:

> Hi all,
>
> I just embarked on a new project, namely to clean up and consolidate
> the various test frameworks and corresponding tests available in the
> OOo build environment.
> ...
> Comments on all of this are, of course, very welcome.

Yay! Sounds like another great step forward for the development
environment.

Best Regards,

Bjoern Michaelsen
-- 
===
 Sitz der Gesellschaft:
 Sun Microsystems GmbH, Sonnenallee 1, D-85551 Kirchheim-Heimstetten
 Amtsgericht Muenchen: HRB 161028
 Geschaeftsfuehrer: Thomas Schroeder, Wolfgang Engels, Wolf Frenkel
 Vorsitzender des Aufsichtsrates: Martin Haering
===





Re: [dev] Test Cleanup

2009-12-14 Thread Juergen Schmidt

Stephan Bergmann wrote:

> Hi all,
>
> I just embarked on a new project, namely to clean up and consolidate the
> various test frameworks and corresponding tests available in the OOo
> build environment.  These include at least:
> [...]
> I know that this picture is not perfectly realistic, and that there will
> be obstacles along the way that require pragmatic workarounds.  Still, I
> think it is important to know what the ideal should look like, even if
> you have to deviate from it.
Thanks for sharing your vision of the ideal world; I agree that it is
important and of course very useful to know where to go ...




> As a first step, I set up CWS sb118 to experiment, as a showcase and to
> gain further insight, with turning smoketestoo_native into such an ideal
> test.  As it turned out, the first thing I had to do on that CWS was to
> replace the heavily modified CppUnit 1.8 currently used by OOo with a
> plain unmodified latest version of CppUnit 1.12.1.
>
> Comments on all of this are, of course, very welcome.
A cleanup and consolidation of the different available and used test
frameworks etc. sounds very useful. And once we have reached a state
where we have working tests, a working framework and some documentation
in place that describes how to write new

Re: [dev] Test Cleanup

2009-12-14 Thread Frank Schoenheit, Sun Microsystems Germany
Hi Stephan,

> I just embarked on a new project, namely to clean up and consolidate the
> various test frameworks and corresponding tests available in the OOo
> build environment.

That's highly appreciated!

> - The so-called UNO-API and Complex tests.  These are located in
> */qa/unoapi and */qa/complex, use OOoRunner from qadevOOo, and can be
> executed via cwscheckapi or checkapi.

Not sure whether you mix things up here, or whether I am simply not
up-to-date: to my knowledge, the complex test cases in */qa/complex are
not (read: cannot be) executed by (cws)checkapi. At least in all modules
I know, they're accompanied by some makefile which allows invoking them
via "dmake run" or some such.

> ...
> I know that this picture is not perfectly realistic, and that there will
> be obstacles along the way that require pragmatic workarounds.  Still, I
> think it is important to know what the ideal should look like, even if
> you have to deviate from it.

Agreed. I'd already be very happy if only some parts of this could be
achieved.

For the record, since you didn't mention it explicitly, though I think
it's on your list: (un)reliability of the tests is another major blocker
for their acceptance currently. Of course, where this is due to the
concrete test, not the test framework, it can only be solved case by
case. But we shouldn't forget this important goal: if tests do not run
reliably, then the best test framework in the world won't get us anywhere.

Ciao
Frank

-- 
- Frank Schönheit, Software Engineer frank.schoenh...@sun.com -
- Sun Microsystems  http://www.sun.com/staroffice -
- OpenOffice.org Base   http://dba.openoffice.org -
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -




Re: [dev] StarOffice3.1 executing macros from the command line

2009-12-14 Thread Markus Daniel

Hiho,

Mathias Bauer wrote:
> Markus Daniel wrote:
>
>> I tried to load the documents with OpenOffice, this works fine, but the
>> layout was not like the original sdw files, and the PDF export in early
>> OpenOffice versions is not as good as needed.
>
> Perhaps you can find an SO5 copy somewhere (as this was available as a
> free download at some point in time) and - if you are lucky - this
> version handles your SO3 documents properly.
>
>> I will try this approach, thank you for the tip.
>>
>> Can you tell me if the command prompt call of a macro works or not?
>> soffice3.exe macro:///lib.module.macro1
>
> Sorry, I don't remember. My recommendation would be to use SO5 (if you
> can find one) for conversion to the latest sdw format (that OOo can
> hopefully import better) and then use OOo1 and your macro to export to PDF.

I found SO5, installed it and tried to open the SO3 documents in SO5, but
the layout of the documents in SO5 is broken - they look like the
documents look in OpenOffice103.

> In case that doesn't work, please ask again.

It does not work. Or better: it works, but SO5.2 does not interpret the
layout like SO3.1 does. E.g. page breaks are different, and where SO3.1
shows ?, SO5.2 and OpenOffice103 show DM.

Any ideas?

Thanks in advance
Markus

--
/**
 * Markus Daniel
 * Bachelor of Science
 *
 * Synyx GmbH  Co. KG
 * OpenSource Solutions
 * Karlstr. 68
 * 76137 Karlsruhe
 *
 * phone +49(0)721 66 48 79 31
 * fax   +49(0)721 66 48 877
 * eMail markus.dan...@synyx.de
 * www   http://www.synyx.de
 * skype synyx_daniel
 * irc   irc.synyx.de
 *
 * Sitz der Gesellschaft: Karlsruhe
 * Registergericht: Mannheim
 * Handelsregisternummer: HRA 4793
 * USt-IdNr.: DE249264296
 *
 * Komplementärin: Elatech Verwaltungs GmbH
 * Sitz der Gesellschaft: Karlsruhe
 * Geschäftsführer: Markus Daniel
 * Registergericht: Mannheim
 * Handelsregisternummer: HRB 7250
 */




Re: [dev] Test Cleanup

2009-12-14 Thread Stephan Bergmann

On 12/14/09 16:21, Frank Schoenheit, Sun Microsystems Germany wrote:
>> I just embarked on a new project, namely to clean up and consolidate the
>> various test frameworks and corresponding tests available in the OOo
>> build environment.
>
> That's highly appreciated!

>> - The so-called UNO-API and Complex tests.  These are located in
>> */qa/unoapi and */qa/complex, use OOoRunner from qadevOOo, and can be
>> executed via cwscheckapi or checkapi.
>
> Not sure whether you mix things up here, or whether I am simply not
> up-to-date: to my knowledge, the complex test cases in */qa/complex are
> not (read: cannot be) executed by (cws)checkapi. At least in all modules
> I know, they're accompanied by some makefile which allows invoking them
> via "dmake run" or some such.


You are right.  As Steffen already wrote, he is currently (and somewhat 
independently) looking into treating the complex tests more like the 
unoapi tests, so I took the liberty of discussing those two kinds of 
tests here as if they were more or less the same sort of thing.


>> I know that this picture is not perfectly realistic, and that there will
>> be obstacles along the way that require pragmatic workarounds.  Still, I
>> think it is important to know what the ideal should look like, even if
>> you have to deviate from it.
>
> Agreed. I'd already be very happy if only some parts of this could be
> achieved.
>
> For the record, since you didn't mention it explicitly, though I think
> it's on your list: (un)reliability of the tests is another major blocker
> for their acceptance currently. Of course, where this is due to the
> concrete test, not the test framework, it can only be solved case by
> case. But we shouldn't forget this important goal: if tests do not run
> reliably, then the best test framework in the world won't get us anywhere.


Yes, thanks for mentioning it here.  Unreliable tests are a waste of 
time (as are unreliable test frameworks).  We have to get rid of them 
(by fixing them or by dumping them).  It's on the list.


-Stephan
