Although on the projects we've done to date I have generally used other
(non-metadata) methods of associating keywords with content items, I do know
that optimizing the collections is definitely important for keeping
performance good.

One strategy is to batch the collection updates until off-hours. At that point
they can be applied and the collections optimized, so during working hours
users are only going against optimized collections.
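A minimal sketch of that off-hours approach in CFML (the task name, template
URL, collection name, and times below are placeholders, and the
cfa_metadataindexupdate call stands in for whatever update call you already
use):

```cfml
<!--- One-time setup: register a nightly task that requests a
      maintenance template during off-hours. --->
<cfschedule action="update"
            task="NightlyCollectionMaintenance"
            operation="HTTPRequest"
            url="http://localhost/maintenance/updateCollections.cfm"
            startDate="08/03/2000"
            startTime="2:00 AM"
            interval="daily">

<!--- updateCollections.cfm: apply the batched metadata updates
      (your existing cfa_metadataindexupdate call goes here), then
      optimize so daytime searches only hit a clean collection. --->
<cfa_metadataindexupdate>
<cfcollection action="optimize" collection="metadata_keywords">
```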

When adding content to collections, we have found that you don't have to add
many individual articles before you need to optimize again. Perhaps you're
running into something similar with the metadata collections.

David

-----Original Message-----
From: Petersen, Jeremy [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, August 02, 2000 8:53 AM
To: '[EMAIL PROTECTED]'
Subject: RE: Metadata Keywords


Everyone seems to agree RAM is king when it comes to Verity performance. I'm
sure that will help.

Be sure you are keeping the collections optimized. This helps both the
updates and the searches.

I have never seen this 'officially' verified, but even with strict
optimization of your collections, the wear and tear of consistently adding to
them will eventually force you to delete the collections and rebuild them from
the ground up. In fact, it would be interesting to see what that would do for
your performance times and collection sizes.
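For a plain Verity collection, that rebuild might look like the following
sketch (the collection name, path, datasource, and column names are all
placeholders; a Spectra metadata collection may need to be rebuilt through
Spectra's own tags instead of raw cfcollection/cfindex calls):

```cfml
<!--- Drop the worn collection and recreate it empty. --->
<cfcollection action="delete" collection="content">
<cfcollection action="create" collection="content"
              path="c:\cfusion\verity\collections\">

<!--- Repopulate in one pass from the database, then optimize. --->
<cfquery name="qContent" datasource="myDSN">
  SELECT id, title, body FROM content
</cfquery>
<cfindex action="update" collection="content" type="custom"
         query="qContent" key="id" title="title" body="body">
<cfcollection action="optimize" collection="content">
```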

-----Original Message-----
From: Steve Jaeger [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, August 01, 2000 7:40 PM
To: Spectra-Talk
Subject: Metadata Keywords


Hi list,
I am having some issues with my execution times when working with metadata
keywords.

I am using Metadata Keywords to relate a variety of different content
together. When all of the content is entered, we will have well over 1000
content objects, plus 200 to 300 keywords total across all of the content. As
the database fills with content, we are experiencing increasingly high
execution times, between 15 and 35 seconds for a single "listings"-style page
to load.

Our development environment is not ideal.  We are using a P2 400 with 128MB
of RAM.

What I need to know is: what can I do to help this situation? We have a new
server on order with 1GB of RAM, which I am sure will help. Even so, is there
some kind of optimization that I can do to help this situation? How can I
check whether my metadata collections are out of whack? We use a number of
cfa_metadataKeywordObjectFind tags to retrieve the content; is there a better
way to do this?

On the back end, I have created an edit handler for my content that
automatically assigns some of the keywords during content creation. The last
step of the edit handler is to update the Metadata Index using
cfa_metadataindexupdate. This tag takes 20 to 30 seconds to execute. Has
anyone determined a workaround for this? Should I skip updating the Index
during creation and instead make it a scheduled process that runs at night?

Any answers to these issues would be greatly appreciated.

Thanks,
Steve Jaeger
-----------------------------
email: [EMAIL PROTECTED]
  web: www.spectraconsulting.net

------------------------------------------------------------------------------
To Unsubscribe visit
http://www.houseoffusion.com/index.cfm?sidebar=lists&body=lists/spectra_talk or
send a message to [EMAIL PROTECTED] with 'unsubscribe' in
the body.