Many people who have just entered the SEO industry arrive with plenty of questions, such as: why doesn't my site rank well? Why are so few of my pages indexed?
When you consult experienced SEOers, they will suggest a number of measures; one item on the list, concerning how your site gets indexed by Google, is the Google Sitemap.
Google Sitemaps is a service that Google launched in June 2005. Google describes the Sitemap protocol as follows:
The Sitemap Protocol allows you to inform search engine crawlers about URLs on your Web sites that are available for crawling. A Sitemap consists of a list of URLs and may also contain additional information about those URLs, such as when they were last modified, how frequently they change, etc.
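To make the protocol concrete, here is a minimal sketch in Python that writes a sitemap.xml containing the fields mentioned above (loc, lastmod, changefreq) plus the optional priority tag. The URLs, change frequencies and priorities are placeholders rather than recommendations, and the namespace used is the one published at sitemaps.org.

# A minimal sketch: build sitemap.xml with the optional fields the protocol describes.
# Every URL, changefreq and priority below is a placeholder, not a recommendation.
import xml.etree.ElementTree as ET
from datetime import date

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")

pages = [
    ("http://www.example.com/", "daily", "1.0"),
    ("http://www.example.com/products/", "weekly", "0.8"),
    ("http://www.example.com/about.html", "monthly", "0.3"),
]

for loc, changefreq, priority in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = date.today().isoformat()
    ET.SubElement(url, "changefreq").text = changefreq
    ET.SubElement(url, "priority").text = priority

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)

Submitting the resulting sitemap.xml through your Google Sitemaps account is how the crawler learns about these URLs and the hints attached to them.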
I read a post on a foreign blog about why one should use a Google Sitemap, titled "Why use a Google Sitemap"; it puts it this way:
Why use a Google Sitemap
Does Google misjudge the importance of pages on your website?
Perhaps you have a product page in your catalogue that is linked to from lots of sites on the web and has a high pagerank - this page may often appear above your home page in the search engine results page even for searches that are for your company generally, not that specific page. The sitemap protocol enables you to indicate the relative importance of the pages on your site.
Do you have dynamic content that is not indexed by search engines?
There are many ways of tackling this problem; the sitemap protocol is a new tool to help ensure that your site is indexed in depth. As the sitemap protocol is new (and still in beta) it should not be relied upon to ensure deep indexing.
Do search engines crawl your site too much?
The sitemap protocol enables webmasters to suggest to search engine robots how often particular pages should be indexed. This could potentially reduce the bandwidth used by search engine robots on dynamic sites.
Here is how 枫林 (Fenglin) sees Google Sitemaps: there are now plenty of online sitemap generators, as well as various Google Sitemap generator programs. These tools all share one trait: they are entirely automated. That makes them convenient and saves time, but it also introduces problems.
Online generators tend to add URLs indiscriminately; some of the URLs they include no longer even exist on the site. What is the consequence? The robot cannot fetch them. That is exactly why so many people see large numbers of unreachable URLs when they check the robot crawl log in their Google Sitemaps account.
In 枫林's view, if a site exposes a large number of pages the robot cannot fetch, this does not increase the number of indexed pages; it actually interferes with normal indexing. A Google Sitemap is meant to give the robot a convenient path through the site so that more of it gets indexed. When an online generator fills the file with unfetchable pages, it effectively deceives the robot and may well hurt the site's future indexing. This is one of the reasons why some people build an XML sitemap and then watch their indexed-page count drop.
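One practical safeguard is to verify every URL in a generated sitemap before submitting it. The sketch below (Python; the file name sitemap.xml is an assumption) requests each loc entry and reports the ones the robot would fail to fetch.

# A rough check of an existing sitemap.xml: request every <loc> URL
# and report the ones a robot would fail to fetch.
import urllib.request
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
dead = []

for loc in ET.parse("sitemap.xml").findall(".//sm:loc", NS):
    url = loc.text.strip()
    try:
        with urllib.request.urlopen(url, timeout=10):
            pass               # reachable; nothing to record
    except Exception as err:   # 404, DNS failure, timeout, ...
        dead.append((url, err))

for url, reason in dead:
    print("cannot fetch:", url, "-", reason)

Anything printed here is a URL you would be asking the robot to crawl in vain; remove or fix it before submitting the file.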
Here is how 枫林 approaches Google Sitemaps:
1. Never dump every page into the sitemap indiscriminately (and leave out anything disallowed in robots.txt).
2. Carefully review and correct any sitemap produced by an online generator; trust your own eyes.
3. If you know how to build a sitemap yourself, or the site has relatively little content, hand-crafting it is the best choice (a sketch of this follows the list).
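As a companion to points 1 and 3, here is a short Python sketch of a hand-maintained sitemap for a small site. The host name, the URL list and the Googlebot user agent string are assumptions; the script simply skips anything robots.txt disallows and writes the rest.

# A hand-maintained sitemap for a small site: the URL list stays under
# your control, and anything disallowed in robots.txt is skipped.
# SITE and HAND_PICKED are hypothetical.
import urllib.robotparser
import xml.etree.ElementTree as ET

SITE = "http://www.example.com"
HAND_PICKED = [
    "/",
    "/articles/google-sitemap.html",
    "/admin/stats.html",   # if robots.txt disallows this, it is skipped
]

robots = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
robots.read()

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for path in HAND_PICKED:
    full = SITE + path
    if not robots.can_fetch("Googlebot", full):
        continue           # point 1: keep disallowed pages out of the sitemap
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = full

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)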