On Apr 7, 9:56 am, smr <[email protected]> wrote:
> So, to summarise that, as well as no longer using Ubuntu you don't
> actually use OpenOffice or 8GB of RAM to edit text files either?  Why
> did you say that's what you needed then?  I'm going to point out that
> you don't need to be a software developer to understand when someone
> keeps changing their story, you need to be a psychic just to keep up.
> I said I couldn't see how you'd ever use 8GB to edit text files and,
> well, you've just said you don't, so that's fine.
>
Have I physically used 8Gig of RAM?  No.  I've never seen a "free"
report using much over 6Gig, but I had a lot of interruptions that
day, so more documents and terminal sessions open than normal.


> I would assume a software developer would understand quite how much
> data 8GB actually is and understand why I was dubious of your claim.
> A text document which needed 8GB to hold it (working from a decimal
> megabyte, so even rounding down) would be something like 2,080,000 A4
> pages long, and that's only if you insisted on having every last page
> in memory at the same time.  Are your fingers alright after all that
> typing?  Basically are you writing a programming book or copying out
> the Library of Congress?  8GB is a ridiculous amount of memory to say
> you require to write a book and I called you on it, so I don't see a
> need to get defensive about my occupation.
>
> Soo, I'm not /saying/ you're making stuff up here, but...

LOL.

OK, 4 and 10 escaped you, so let me shed some more light.  I'm working
on 4 books right now:  A second edition of "The Minimum You Need to
Know to Be an OpenVMS Application Developer" (source of many OO
problems), "Twenty of Two/The Infamous They", "The Minimum You Need to
Know about Qt and Databases", and I have just released "Infinite
Exposure" which seems to always have some eBook conversion work or
marketing response to write.

The application book source ODT was created by some tool I purchased
which was the only tool that could extract the content from the
original source into something OO could load.  That WordPerfect import
filter/functionality in OO, fugedaboutit.  It's just there so its
developers can point and laugh at you every time you click on it.  The
resulting ODT wasn't in good shape, but at least OO could open it.
Since it needed a lot of work, new fonts, and a lot of items tagged
for TOC and Index, it made sense to selectively cut and paste from the
converted doc to a new document.  This meant keeping two documents
open at the same time.

Everything was fine until I got to around 240-260 pages in the
destination document under Ubuntu.  Occasionally OO would stumble and
fail, but I had timed saves going on, so I never lost _that_ much work.
Then Canonical released a well and truly busted kernel which affected
thousands of machines and chose not to back it out.  My destination
document was still under 400 pages, but after the kernel update I was
having to physically reboot the machine 3-5 times per day.  There was
no help coming from Canonical.  They have a "March or Die" mentality
when it comes to things they release.  I searched all over for an
alternative and/or a workaround.  The Symphony product was
unavailable for 64-bit Ubuntu.  KWord and AbiWord both have even less
support for the ODT standard than OO.  I tried everything to avoid
jumping distros due to some databases I have which must be recreated
each time.  Converting my desktop is a 4-day task.

Post openSUSE 11.1 install, I could keep both documents open in OO and
had things working pretty well until I got to around 420 pages in the
destination document.  OO would crash, but it wouldn't take out the
entire system.  I could easily restart OO and begin figuring out what
was missing.  I started watching the free report and noticed that I
was using some swap space just before the OO crash would happen.  I
ordered 4Gig of RAM from Smile & Tango on eBay.  When it hadn't shown
up after 4 days I did what I should have done in the first place:
ordered from Tiger Direct, had the RAM the next day, but had to play
the cat-and-mouse game of sending in for a "rebate".  The day after I
installed this RAM, the order from Smile & Tango showed up.  No
sense letting it go to waste, so I installed it.
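
A quick way to automate that kind of watching (a sketch in plain
shell; it reads /proc/meminfo directly instead of parsing the free
report, and the MiB conversion is my own choice):

```shell
# One-shot check of swap usage -- the same signal I was watching for
# in the free report.  /proc/meminfo values are in KiB.
swap_total=$(awk '/^SwapTotal:/ {print $2}' /proc/meminfo)
swap_free=$(awk '/^SwapFree:/ {print $2}' /proc/meminfo)
swap_used_mib=$(( (swap_total - swap_free) / 1024 ))
echo "swap in use: ${swap_used_mib} MiB"
```

Wrapped in a loop with a sleep, that makes a decent early warning to
save everything before the next crash.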

OO worked OK for a little while, then started crashing and burning
again.  Memory was no longer an issue; free memory had dipped below
2Gig on only one occasion.  I searched around for a solution.  I
found it in Symphony.  Yes, Symphony is capable of working on both
documents without going down in flames.  I would like to say I have my
choice of which word processor to use, but that would be a fantasy.
Symphony implements a much more complete ODT spec than OO.  Quite a
few things didn't transfer over... most notably page breaks, since the
two appear to use different rendering engines when calculating how
much of a page is used.

I can keep the massive converted ODT open in Symphony and paste into
the new document in OO.  If I wasn't already this far into the book,
you would be correct: I would simply use Symphony for all of it.  I
did add roughly 100 pages via Symphony, but it was far less work to
fix those 100 pages in OO than to go back into Symphony and fix the
previous 400+.  Not worth the pain.  ODT != ODT != ODT.  Sad, but it's
true.  I have to get the port of this book done so I can completely
abandon my last Windows partition.  The second edition won't come out
until the end of 2011 or mid 2012, but I need it in a form I can use.
I'm updating it while making the port because if the current first edition
stock sells out early, I will send it off to editing, then printing
ahead of schedule.

Would I consider abandoning OO for my new books?  Definitely, if
Symphony gets the additional plug-ins like LanguageTool, the PML
plug-in for converting to Palm Markup Language (needed to make one
type of eBook), a Weblog publisher plug-in, and several others I use
for eBook generation but don't remember the names of now.

For now, I have a solution which lets me move forward.  Things are
behind schedule, but that is normal for books.  I no longer have a
kernel bug which lets OO take down the entire machine when you exceed
its design limitations.  That's a very good thing.  I now have a
method of keeping both documents open so I can keep making progress.
This method was not available to me under Ubuntu; it is under openSUSE
11.1.

Canonical should never have released a kernel which allowed OO to
shoot it out of the saddle.  When it did get released, they should
have backed it out.  They failed on both counts.  I moved on.

The Java developers working on OO appear to have a significant lack of
real world experience.  I find this pretty common in the Java world.
They chose to launch a new "instance" of OO for each document rather
than operate in MDI fashion.  They chose to load the entire document
rather than just a few pages.  They chose to make memory devouring
objects out of everything rather than following traditional word
processor development paths.

The last of those is a very slippery slope, but a "memory-
is-someone-else's-problem" mindset lets them commit many sins.

Of all their design flaws, the most critical was launching a new
"instance" of OO without launching a new JVM to own it.  I believe
part of that design flaw comes from the "quick launch" (or whatever
it's called) feature which runs in the background.  If each instance
of OO were actually
running in its own JVM then each instance would be able to use the
full resources configured for the JVM.  Any single instance which
exceeded its resource allocation would crash, but it wouldn't take out
the other instances.
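
The same isolation argument can be sketched without Java at all (a
hypothetical illustration in plain shell, nothing to do with how OO
actually launches: ulimit -v stands in for the JVM's -Xmx heap cap,
python3 allocations stand in for two document "instances", and the
1Gig cap is my own number):

```shell
# Each subshell gets its own address-space cap (ulimit -v, in KiB),
# the same idea as each OO instance owning its own JVM with its own
# heap limit.  The first "instance" tries to grab 2Gig, exceeds its
# allocation, and dies alone; the second is unaffected.
( ulimit -v 1000000; python3 -c 'bytearray(2 * 1024**3)' ) 2>/dev/null \
  || echo "first instance: crashed by itself"
( ulimit -v 1000000; python3 -c 'bytearray(16 * 1024**2)' ) \
  && echo "second instance: still fine"
```

That is exactly the behavior described above: any single instance
which blows its budget crashes without taking the others with it.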

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Ubuntu Linux" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to 
[email protected]
For more options, visit this group at 
http://groups.google.com/group/ubuntulinux?hl=en
-~----------~----~----~----~------~----~------~--~---
