> OK, 4 and 10 escaped you so let me shed some more light.

Ah, that's unfortunate; I think I see how your mind is working.  You've
noticed my comment that I hadn't put any thought into how version
numbers in Ubuntu work and, because of that, you believe that my
criticism of your hyperbole is invalid.  That's not how the real world
works.  You don't have to wait until someone omniscient comes along to
call you on it when you exaggerate; regular people can do that too.

While the policy behind version numbers in any particular open source
project is potentially quite interesting (and hey, we're all
interested in computers here, so that's cool), it's not exactly going
to shake the world or destroy my credibility if I happen not to be
that bothered, since I'm not involved with any part of Ubuntu and the
numbers are not my concern.  That said, I also like to learn new
facts.  I thank people who tell me things I don't know, and tell them
if I think it's interesting; it's how my mother raised me, it's just
politeness.

You said earlier:
"I need to be able to address 8Gig of RAM because OpenOffice and other
applications don't work when parts of them are in the swap file...at
least on Ubuntu. "

But when the gargantuan figure of 8GB for text editing was questioned,
you admitted (which is big of you, and kudos for doing it):
"Have I physically used 8Gig of RAM?  No."

I feel I can rest my case with that, really.  If you don't mind me
saying so, picking up on my "that completely escaped me, that's very
interesting" comment to another person on the group comes across as
somewhat petty, and I don't know if you want that associated with your
"professional" persona and your books, if you understand what I mean
by that.

Your criticism of OO.o has some merit, but I do think you're selling
short the marathon effort that has gone into it.  It may not be good
for you, but it is extremely useful for millions of people who cannot
easily afford modern commercial word processing, spreadsheet, and
presentation software.  As you say yourself, it ably handles a 20-page
term paper, and it's crucial, particularly for authors selling their
work I suppose, to remember that not everybody is writing a technical,
illustrated book.  If you need to type a 20-page term paper, you've
got a bigger concern right there than the structural design decisions
which shaped your word processor.  "Good enough is", to quote a joke
from Reader's Digest.  It performs a really crucial job of attracting
people to free and open source software, concerns about its
(potentially) monolithic structure aside.  It is a huge program now,
though, and it receives a lot of criticism for being sluggish on some
systems.  That's a worry, because it's one of the poster-child
projects for open source.

On 7 Apr, 18:02, yyyc186 <[email protected]> wrote:
> On Apr 7, 9:56 am, smr <[email protected]> wrote:
> > So, to summarise that, as well as no longer using Ubuntu, you don't
> > actually use OpenOffice or 8GB of RAM to edit text files either?  Why
> > did you say that's what you needed then?  I'm going to point out that
> > you don't need to be a software developer to understand when someone
> > keeps changing their story, you need to be a psychic just to keep up.
> > I said I couldn't see how you'd ever use 8GB to edit text files and,
> > well, you've just said you don't, so that's fine.
>
> Have I physically used 8Gig of RAM?  No.  I've never seen a "free"
> report using much over 6Gig, but I had a lot of interruptions that
> day, so more documents and terminal sessions open than normal.
>
> > I would assume a software developer would understand quite how much
> > data 8GB actually is and understand why I was dubious of your claim.
> > A text document which needed 8GB to hold it (working from a decimal
> > megabyte, so even rounding down) would be something like 2,080,000 A4
> > pages long, and that's only if you insisted on having every last page
> > in memory at the same time.  Are your fingers alright after all that
> > typing?  Basically are you writing a programming book or copying out
> > the Library of Congress?  8GB is a ridiculous amount of memory to say
> > you require to write a book and I called you on it, so I don't see a
> > need to get defensive about my occupation.
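Just to show the page count above isn't plucked from thin air, here is the back-of-envelope version in Python.  The characters-per-page figure is my own assumption for a densely filled A4 page of plain one-byte text, not a standard number; real densities vary.

```python
# Back-of-envelope: pages of plain text in 8 decimal gigabytes.
# CHARS_PER_PAGE is an assumption, not a standard figure.
GIG = 1_000_000_000           # decimal gigabyte, as in the post
CHARS_PER_PAGE = 3_850        # assumed one-byte chars on a full A4 page

pages = (8 * GIG) // CHARS_PER_PAGE
print(f"{pages:,} A4 pages")  # → 2,077,922 A4 pages
```

Roughly two million pages, which is the order of magnitude being argued about.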
>
> > Soo, I'm not /saying/ you're making stuff up here, but...
>
> LOL.
>
> OK, 4 and 10 escaped you so let me shed some more light.  I'm working
> on 4 books right now:  A second edition of "The Minimum You Need to
> Know to Be an OpenVMS Application Developer" (source of many OO
> problems), "Twenty of Two/The Infamous They", "The Minimum You Need to
> Know about Qt and Databases", and I have just released "Infinite
> Exposure" which seems to always have some eBook conversion work or
> marketing response to write.
>
> The application book source ODT was created by some tool I purchased
> which was the only tool that could extract the content from the
> original source into something OO could load.  That WordPerfect import
> filter/functionality in OO, fugedaboutit.  It's just there so its
> developers can point and laugh at you every time you click on it.  The
> resulting ODT wasn't in good shape, but at least OO could open it.
> Since it needed a lot of work, new fonts, and a lot of items tagged
> for TOC and Index, it made sense to selectively cut and paste from the
> converted doc to a new document.  This meant keeping two documents
> open at the same time.
>
> Everything was fine until I got to around 240-260 pages in the
> destination document under Ubuntu.  Occasionally OO would stumble and
> fail, but I had timed saves going on, so never lost _that_ much work.
> Then Canonical released a well and truly busted kernel which affected
> thousands of machines and chose not to back it out.  My destination
> document was still under 400 pages, but after the kernel update I was
> having to physically reboot the machine 3-5 times per day.  There was
> no help coming from Canonical.  They have a "March or Die" mentality
> when it comes to things they release.  I searched all over for an
> alternative and/or a work around.  The Symphony product was
> unavailable for 64-bit Ubuntu.  KWord and AbiWord both have even less
> support for the ODT standard than OO.  I tried everything to avoid
> jumping distros due to some databases I have which must be recreated
> each time.  Converting my desktop is a 4 day task.
>
> Post OpenSuse 11.1 install, I could keep both documents open in OO and
> had things working pretty well until I got to around 420 pages in the
> destination document.  OO would crash, but it wouldn't take out the
> entire system.  I could easily restart OO and begin figuring out what
> was missing.  I started watching the free report and noticed that I
> was using some swap space just before the OO crash would happen.  I
> ordered 4Gig of RAM from Smile & Tango on eBay.  When it hadn't shown
> up in over 4 days, I did what I should have done in the first place:
> ordered from Tiger Direct, had the RAM the next day, but had to play
> the cat-and-mouse game of sending in for a "rebate".  The day after I
> installed this RAM, the order from Smile & Tango showed up.  No
> sense letting it go to waste, so I installed it.
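For anyone wanting to reproduce that kind of monitoring, a minimal sketch of watching free memory and swap is below.  It reads /proc/meminfo directly (Linux-only), which is where the `free` report gets its numbers; the one-second interval is just to keep the example quick, in practice you'd log every 30-60 seconds.

```python
# Minimal memory/swap watcher (Linux-only: reads /proc/meminfo, the
# same source the `free` command uses).  Short interval for brevity;
# a real watch before an OO crash would log every 30-60 seconds.
import time

def snapshot(fields=("MemFree", "SwapFree")):
    """Return the selected /proc/meminfo fields, values in kB."""
    out = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, rest = line.split(":", 1)
            if key in fields:
                out[key] = int(rest.split()[0])  # second column is kB
    return out

samples = [snapshot()]
for _ in range(2):
    time.sleep(1)
    samples.append(snapshot())
print(samples[-1])
```

Swap usage climbing just before a crash is exactly the pattern described above.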
>
> OO worked OK for a little while, then started crashing and burning
> again.  Memory was no longer an issue; free memory had dropped below
> 2Gig on only one occasion.  I searched around for a solution.  I
> found it in Symphony.  Yes, Symphony is capable of working on both
> documents without going down in flames.  I would like to say I have my
> choice of which word processor to use, but that would be a fantasy.
> Symphony implements a much more complete ODT spec than OO.  Quite a
> few things didn't transfer over...most notably page breaks since it
> appears they both have a different rendering engine when it comes to
> calculating how much of a page is used.
>
> I can keep the massive converted ODT open in Symphony and paste into
> the new document in OO.  If I wasn't already this far into the book,
> you would be correct, I would simply use Symphony for all of it.  I
> did add roughly 100 pages via Symphony, but it was far less work to
> fix those 100 pages in OO than to go back in Symphony and fix the
> previous 400+.  Not worth the pain.  ODT != ODT != ODT.  Sad, but it's
> true.  I have to get the port of this book done so I can completely
> abandon my last Windows partition.  The second edition won't come out
> until the end of 2011 or mid 2012, but I need it in a form I can use.
> I'm updating it while making the port because if current first edition
> stock sells out early, I will send it off to editing, then printing
> ahead of schedule.
>
> Would I consider abandoning OO for my new books?  Definitely, if
> Symphony gets the additional plug-ins like LT (Language Tool), the PML
> plug-in for converting to Palm Markup Language (needed to make one
> type of eBook), a Weblog publisher plug-in, and several others I use
> for eBook generation but don't remember the names of now.
>
> For now, I have a solution which lets me move forward.  Things are
> behind schedule, but that is normal for books.  I no longer have a
> kernel bug which lets OO take down the entire machine when you exceed
> its design limitations.  That's a very good thing.  I now have a
> method of keeping both documents open so I can keep moving forward.
> This method was not available to me under Ubuntu; it is under OpenSuSE
> 11.1.
>
> Canonical should never have released a kernel which allowed OO to
> shoot it out of the saddle.  When it did get released, they should
> have backed it out.  They failed on both counts.  I moved on.
>
> The Java developers working on OO appear to have a significant lack of
> real world experience.  I find this pretty common in the Java world.
> They chose to launch a new "instance" of OO for each document rather
> than operate in MDI fashion.  They chose to load the entire document
> rather than just a few pages.  They chose to make memory devouring
> objects out of everything rather than following traditional word
> processor development paths.
>
> The last mentioned of the above is a very slippery slope, but "memory-
> is-someone-else's-problem" lets them commit many sins.
>
> Of all their design flaws, the most critical was launching a new
> "instance" of OO without launching a new JVM to own it.  I believe part
> of that design flaw comes from the "quick launch" or whatever it is
> feature which runs in background.  If each instance of OO was actually
> running in its own JVM then each instance would be able to use the
> full resources configured for the JVM.  Any single instance which
> exceeded its resource allocation would crash, but it wouldn't take out
> the other instances.
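The isolation argument above can be sketched in a few lines of Python.  This stands in for the one-JVM-per-instance idea: each "document" runs in its own OS process with its own address space, so one going down doesn't touch the others.  The document names and the crash condition are purely illustrative.

```python
# Sketch of process-per-document isolation (a stand-in for the
# one-JVM-per-instance design argued for above).  Names are made up.
import subprocess
import sys

docs = ["doc_that_crashes", "doc_that_survives"]
results = {}
for doc in docs:
    # Each child is a separate OS process with its own address space,
    # just as a per-instance JVM would have its own private heap limit.
    child = "import sys; sys.exit(1 if 'crashes' in sys.argv[1] else 0)"
    proc = subprocess.run([sys.executable, "-c", child, doc])
    results[doc] = proc.returncode

# The "crashed" document exits non-zero without affecting the other.
print(results)  # {'doc_that_crashes': 1, 'doc_that_survives': 0}
```

With a shared process (or a shared JVM), the first failure would have taken both down; that is the whole point of the complaint.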
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Ubuntu Linux" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to 
[email protected]
For more options, visit this group at 
http://groups.google.com/group/ubuntulinux?hl=en
-~----------~----~----~----~------~----~------~--~---
