Xizhen Wang writes:

> so, you mean both client and server need to store session data?

     Each request from the browser to the server comes in with no
identification or continuity.  To the server, each request is a brand
new browser showing up and asking for something.  This is the
"stateless" aspect of HTTP, which made it very flexible and easy to
implement, and probably was partially responsible for it being so
popular and successful.

     However, now that we're developing complex applications to run on
web servers and browsers, the stateless nature of the web is a pain in
the ass.  Cookies were added to browsers to remedy this; an HTTP
cookie is a small piece of information the server sends to the
browser.  The browser includes the cookie information in any requests
it sends to the server after that.

     There are some optional cookie settings that can determine how
long the browser keeps the cookie (default is to keep it only until
the user shuts down the browser, but you can ask the browser to keep
it for days, weeks, or longer).  You can also determine whether the
browser sends the cookie along with all requests to that domain, or
with requests for pages from a specific directory, or subdomain, etc.
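
     Those settings travel as attributes on the Set-Cookie header.  A
rough sketch of building one by hand (Max-Age and Path are real HTTP
attributes; the cookie name and values are just examples):

```java
public class SetCookieBuilder {
    // Build a Set-Cookie header value.  A negative maxAgeSeconds
    // means a session cookie (discarded when the browser shuts
    // down); a positive value asks the browser to keep it that
    // long.  The path restricts which requests the cookie is
    // sent along with.
    static String setCookie(String name, String value,
                            int maxAgeSeconds, String path) {
        StringBuilder sb = new StringBuilder(name + "=" + value);
        if (maxAgeSeconds >= 0) {
            sb.append("; Max-Age=").append(maxAgeSeconds);
        }
        if (path != null) {
            sb.append("; Path=").append(path);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Keep for two weeks, send only with requests under /search:
        System.out.println(
            setCookie("sessionid", "abc123", 14 * 24 * 3600, "/search"));
        // -> sessionid=abc123; Max-Age=1209600; Path=/search
    }
}
```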

> so if I want to store a large amount of data in session, will it
> make the response very slow? (because I think the server needs to
> send the session information to the client). but i feel it is
> quicker than querying the database.

     The cookie can only hold a relatively small amount of
information.  Besides which, the information has to travel across the
network and possibly across a slow modem connection every time. So if
you want to store a lot of information, the standard approach is to
only store some sort of identification (even simply a unique serial
number) and then store the information on the server, and look it up
according to the identification.
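
     The server side of that can be as simple as a map from the serial
number to the bulky data; only the small identifier ever crosses the
network.  A sketch (the class and method names here are mine, not a
standard API):

```java
import java.util.HashMap;
import java.util.Map;

public class SessionStore {
    // All the bulky data lives here on the server; only the
    // serial number travels in the cookie.
    private final Map<String, Map<String, Object>> sessions = new HashMap<>();
    private long nextSerial = 1;

    // Create a session and return the small identifier
    // that goes into the cookie.
    String createSession() {
        String id = Long.toString(nextSerial++);
        sessions.put(id, new HashMap<>());
        return id;
    }

    // Look the data back up using the identifier from the cookie.
    Map<String, Object> lookup(String id) {
        return sessions.get(id);
    }

    public static void main(String[] args) {
        SessionStore store = new SessionStore();
        String id = store.createSession();
        store.lookup(id).put("results", "lots of search results...");
        System.out.println(store.lookup(id).get("results"));
    }
}
```

     This is essentially what a servlet engine's HttpSession does for
you behind the scenes.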

     Some browsers don't store cookies.  An alternative approach for
them is to generate a URL with a serial number embedded in it.  If you
built the cookie-based approach using a server database, then you can
just write a method that checks both for a cookie and for an ID
embedded in the URL, gets the identification either way, and pulls the
data out of the server database.
I believe some servlet engines provide facilities to do all of this
for you.
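
     That check-both method could look something like this (the cookie
name "sessionid" and URL parameter name "sid" are arbitrary choices
for the example, not a convention):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SessionIdFinder {
    // Try the Cookie header first; if the browser didn't send
    // one, fall back to a "sid" parameter embedded in the URL's
    // query string.  Returns null if neither is present.
    static String findSessionId(String cookieHeader, String queryString) {
        if (cookieHeader != null) {
            Matcher m = Pattern.compile("sessionid=([^;\\s]+)")
                               .matcher(cookieHeader);
            if (m.find()) return m.group(1);
        }
        if (queryString != null) {
            Matcher m = Pattern.compile("(?:^|&)sid=([^&]+)")
                               .matcher(queryString);
            if (m.find()) return m.group(1);
        }
        return null;
    }

    public static void main(String[] args) {
        System.out.println(findSessionId("sessionid=abc123", null));   // abc123
        System.out.println(findSessionId(null, "q=money&sid=abc123")); // abc123
    }
}
```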

     Since your project is a search engine, you really don't need to
use cookies to store the search results.

     You could just return the search results, and make the "next" and
"previous" links include all of the search arguments in a "GET" style
URL:

     http://www.mysite.com/search?term1=money&term2=cash&term3=yen&results=30-40

     Add an argument (in this example it's the "results" argument)
indicating which range of the search results should be returned.  To
your search engine, each request would look like "Search for money,
cash, and yen, and show me results 30 to 40".  This might be simple
enough.  If the burden on your search engine is too great, you could
implement some sort of caching based on keeping around the results of
every search that is run, comparing the search arguments, and
returning the appropriate range of results; I think this is how most
of the big web search engines work.  I suspect many databases come
with something built in to do this.
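
     The slicing step is trivial; a sketch of parsing a "30-40" style
range and returning just that chunk of the full result list (names are
mine, for illustration):

```java
import java.util.ArrayList;
import java.util.List;

public class SearchPager {
    // Given the full result list and a "results=30-40" style
    // range, return just that slice, so "next"/"previous" links
    // simply re-run the search with a different range.
    static List<String> page(List<String> all, String range) {
        String[] bounds = range.split("-");
        int from = Integer.parseInt(bounds[0]);
        int to = Integer.parseInt(bounds[1]);
        // Clamp to what we actually have.
        from = Math.max(0, from);
        to = Math.min(all.size(), to);
        return from < to ? new ArrayList<>(all.subList(from, to))
                         : new ArrayList<>();
    }

    public static void main(String[] args) {
        List<String> results = new ArrayList<>();
        for (int i = 0; i < 100; i++) results.add("result " + i);
        List<String> pageResults = page(results, "30-40");
        System.out.println(pageResults.get(0));  // result 30
        System.out.println(pageResults.size());  // 10
    }
}
```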

Steven J. Owens
[EMAIL PROTECTED]
[EMAIL PROTECTED]

___________________________________________________________________________
To unsubscribe, send email to [EMAIL PROTECTED] and include in the body
of the message "signoff SERVLET-INTEREST".

Archives: http://archives.java.sun.com/archives/servlet-interest.html
Resources: http://java.sun.com/products/servlet/external-resources.html
LISTSERV Help: http://www.lsoft.com/manuals/user/user.html
