Hi Ruben,

I like that idea a lot and have been thinking about it as well. At the same 
time, we need to prevent spiders from indexing content that we don't want 
them to index. From what I've read, robots.txt can address both problems at 
once: you can set up rules for crawlers and point them to a sitemap. Feeds 
are certainly another option. There is also OpenSearch, a standard for giving 
search engines hints about a site's data and structure.
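As a sketch of what I mean (the paths and hostname here are just placeholders, 
not our actual setup), a robots.txt could both restrict crawlers and advertise 
the sitemap:

```
# Keep crawlers out of areas we don't want indexed (hypothetical path)
User-agent: *
Disallow: /admin/

# Advertise the sitemap location to crawlers
Sitemap: http://example.org/sitemap.xml
```

The Sitemap directive is part of the sitemaps.org protocol and is understood 
by Google and the other major engines.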

Tobias

On 22.05.2012, at 16:25, Rubén Pérez <[email protected]> wrote:

> Dear list,
> 
> We have noticed that Google does not index our public videos correctly, and 
> we thought we could generate a "sitemap"[1] file to describe the contents of 
> the site better, so that individual videos show up in Google searches. What 
> do you think about this idea? It seems the XML feeds can be configured to 
> serve as this "sitemap" file. Do you think this is feasible, or can you 
> think of other alternatives?
> 
> Any comments or ideas are appreciated.
> 
> Best regards
> Rubén
> 
> 
> [1] http://support.google.com/webmasters/bin/answer.py?answer=156184
> _______________________________________________
> Matterhorn mailing list
> [email protected]
> http://lists.opencastproject.org/mailman/listinfo/matterhorn
> 
> 
> To unsubscribe please email
> [email protected]
> _______________________________________________
