One thing I do specifically for search engines is to put my main page into a frameset, even if the whole page is in a single frame.

Now you can make an entirely separate page for Google to spider and index. Google will read and score the noframes section of the page, so you have content you can set up specifically to work the way you want your site searched, without having to worry about the user ever seeing it. You can also set up special links to your other pages that carry just the arguments you want, and not have to worry about expired userReference arguments.
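A minimal sketch of the idea (the file names and copy here are made up for illustration):

```html
<html>
<head><title>My Site</title></head>
<!-- The real, dynamic site lives inside a single full-size frame -->
<frameset rows="100%">
  <frame src="main.taf" name="main">
  <!-- Agents that don't render frames (spiders) read this instead -->
  <noframes>
    <body>
      <h1>My Site - Widgets and More</h1>
      <p>Keyword-rich copy written just for the search engines.</p>
      <!-- Links carrying only the arguments you want indexed -->
      <a href="products.taf?_function=list">Product list</a>
    </body>
  </noframes>
</frameset>
</html>
```

Regular visitors only ever see the frame content; the noframes section is read by spiders and by browsers with frames turned off.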

You can do things like create virtual directories that point to .taf files in the main directory using arguments, and a link through the virtual directory won't display the arguments.

 

I would also recommend purchasing Web Position Pro to help set up your noframes section and put the right keywords and text in there so that it will score high for your desired keywords.

Troy

 

 


From: Stefan Gonick [mailto:[EMAIL PROTECTED]]
Sent: Monday, March 13, 2006 9:13 AM
To: [email protected]
Subject: RE: Witango-Talk: Tml pages and search engines

 

Hi John,

I agree that Google doesn't like the userref argument. That one is particularly bad.
Sometimes G will tolerate 2 fixed arguments and sometimes not, and it tends
to have problems with more than 2. I have a client who is a search
engine expert. He told me that one argument is always safe, 2 may work, and
more than 2 is much riskier.

Also, G does not execute JavaScript. If you have sites with JavaScript
menus where the links are created by JavaScript, those links won't
work. If you have JavaScript menus where the links are still present as
text within the JavaScript, those will still work. The basic message is that
G will only use what it can see in the text of the HTML returned by the server.
That last part is important. TML files are processed by the Witango server
and ultimately generate HTML that the web server sends out. That's what
G will see, and that applies to includes as well. Whatever HTML the web server
sends is what G will see and process.
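To illustrate the JavaScript point (file names here are invented for the example):

```html
<!-- The URL is assembled by script and never appears as literal
     text in the source, so a spider that doesn't run JavaScript
     has nothing to follow -->
<script type="text/javascript">
  var base = "page";
  document.write('<a href="' + base + '.taf">Hidden from spiders</a>');
</script>

<!-- The URL is written out by script but sits in the source as
     plain text, so a spider can still pick it up -->
<script type="text/javascript">
  document.write('<a href="page.taf">Visible to spiders</a>');
</script>

<!-- Safest of all: an ordinary link in the HTML the server returns -->
<a href="page.taf">Always visible</a>
```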

Stefan

P.S. Congratulations on your top 5 positions in Google!

At 10:04 AM 3/13/2006, you wrote:

Mmmmm...

Sorry Stefan, but I have no problem with crawls on pages with links that
have more than one argument. In fact, I have a forum with over 5000 pages,
all listed in G, and all have multiple arguments. What Google does not
like is an ID tag, or any argument that identifies a user as being unique. So
a userreference passed in the URL would be bad.

Another thing: if you are using includes of any type in tml, taf, and
even shtml or shtm files, the spiders don't even know there is a call to an
external file. The include file is just that: included. If you view your
source where there is an include file, you don't see the include
tag; you see the source of the included file where the tag would be. This is
what the spiders see too.
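A quick sketch of what that looks like with a server-side include (the file names are illustrative):

```html
<!-- What you author in your .shtml page: -->
<!--#include virtual="/includes/header.html" -->

<!-- What the web server actually sends, and what the spider
     indexes: the directive is gone, replaced by the contents
     of the included file -->
<div class="header">
  <a href="index.taf">Home</a> | <a href="products.taf">Products</a>
</div>
```

The same applies to Witango includes in tml/taf files: only the merged output ever leaves the server.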

I have several sites that use multiple arguments along with includes for
headers, footers, menus, etc. All these sites are listed within
the top 5 in Google. One of these sites is #1 for several very
competitive phrases, and has been for several years. All pages are taf and
all links to product pages pass multiple arguments in the URL and as post
arguments.

As for JavaScript menus, I have 2 sites that get crawled regularly that
use these. All pages are listed in G. In fact, G will crawl just about
anything that has an a href or http in front of it, even if it is not a
link (just text).

The hardest part of optimizing for G is to make sure that the URLs on
your site remain static even though the pages are dynamic. I know this
because it was hard for me to wrap my brain around this. For example...

One of these sites had a picture section with pictures of the latest
submissions by users. Of course, these pages changed almost daily. I had
a taf that searched the db, counted everything marked as new, and
displayed it with the latest first. The result was a list of pictures
that would pop up in a window and let you scroll through them, latest
first. Well, every time G came and crawled the site, the links to these
dynamically added pages, along with the titles, descriptions, etc., were
different! So when you click on those links, the pages are different
from what the title shows in G. Truly dynamic, huh? Maybe a little bit
too dynamic.

In other words, dynamic URLs (URLs that change over time) are bad. A URL
that references one page one day, and another page on another day, will
totally hose what you want displayed in Google or any of the other
engines. Okay, which page is it supposed to be? The green widget that was
here yesterday? Or the blue widget that's here today?
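The fix, as a sketch (the _function and id argument names are invented for the example):

```html
<!-- Bad: the same URL shows whatever is "newest" today, so the
     page Google indexed is not the page the visitor lands on -->
<a href="pictures.taf?_function=latest">Newest picture</a>

<!-- Better: each record gets a URL that always returns that same
     record, and an index page links to all of them; the index
     content changes, but every indexed URL stays stable -->
<a href="pictures.taf?_function=detail&amp;id=101">Green widget</a>
<a href="pictures.taf?_function=detail&amp;id=102">Blue widget</a>
```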


 

-----Original Message-----
From: Stefan Gonick [mailto:[EMAIL PROTECTED]]
Sent: Sunday, March 12, 2006 6:25 PM
To: [email protected]
Subject: Re: Witango-Talk: Tml pages and search engines

Hi Dan,

Spiders will crawl tml files with no problem. They will crawl any file
accessible by an HTML link. However, Google doesn't like links with more
than one argument.

The following is good for Google:

http://www.mydomain.com/myprogram.taf?_function=list

The following is NOT good:

http://www.mydomain.com/myprogram.taf?_function=list&id=3

Spiders also can't run JavaScript, so links that are generated by
JavaScript don't work. There are many dynamic menu systems that use
JavaScript, and those won't be spidered.

Stefan


At 07:14 PM 3/12/2006, you wrote:
>If we make a page a TML page instead of HTML so we can put in some
>include files and have it handled by the Witango server will the robots

>and spiders still crawl through those pages?
>
>What is the latest thinking on optimization of Witango sites for the
>search engines?
>
>--
>Dan Stein
>FileMaker 7 Certified Developer
>Digital Software Solutions
>799 Evergreen Circle
>Telford PA 18969
>Land: 215-799-0192
>Cell: 610-256-2843
>Fax 413-410-9682
>FMP, WiTango, EDI,SQL 2000, MySQL, CWP
>[EMAIL PROTECTED]
>www.dss-db.com
>
>
>     "It's very hard to grow, because it's difficult to let go of
>models of ourselves in which we've invested so heavily."
>
>
>_______________________________________________________________________
>_ TO UNSUBSCRIBE: Go to http://www.witango.com/developer/maillist.taf

=====================================================
Database WebWorks: Dynamic web sites through database integration
http://www.DatabaseWebWorks.com


________________________________________________________________________
TO UNSUBSCRIBE: Go to http://www.witango.com/developer/maillist.taf