Getting indexed
The leading search engines, such as Google and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other pages already in a search engine's index do not need to be submitted because they are found automatically. Some search engines, notably Yahoo!, operate a paid submission service that guarantees crawling for either a set fee or a cost per click.[29] Such programs usually guarantee inclusion in the database, but do not guarantee a specific ranking within the search results.[30] Two major directories, the Yahoo! Directory and the Open Directory Project, both require manual submission and human editorial review.[31] Google offers Google Webmaster Tools, through which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links.[32]
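As an illustration, a minimal sitemap in the sitemaps.org XML format is simply a list of URLs with optional hints such as the last modification date; the URLs and dates below are hypothetical placeholders.

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page the crawler should know about -->
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2010-01-01</lastmod>
        <changefreq>weekly</changefreq>
      </url>
      <!-- Pages not reachable by following links can still be listed here -->
      <url>
        <loc>http://www.example.com/archive/old-press-release.html</loc>
        <lastmod>2009-06-15</lastmod>
      </url>
    </urlset>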
Search engine crawlers may look at a number of different factors when crawling a site, and not every page is indexed by the search engines. The distance of a page from the root directory of a site may also be a factor in whether or not it gets crawled.[33]
Preventing crawling
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt file located in the root directory is the first file crawled. The robots.txt file is then parsed, which tells the robot which pages are not to be crawled. Because a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish to have crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[34]
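For example, a robots.txt file that keeps well-behaved crawlers out of a shopping cart, internal search results, and a login area might look like the sketch below; the directory paths are hypothetical. A single page can also opt out on its own by carrying a robots meta tag in its head section.

    # robots.txt placed at the root of the domain (hypothetical paths)
    User-agent: *
    Disallow: /cart/
    Disallow: /search/
    Disallow: /login/

    <!-- Placed in the <head> of an individual page to keep it out of the index -->
    <meta name="robots" content="noindex, nofollow">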
Increasing prominence
A variety of methods can increase the prominence of a webpage within the search results. Cross linking between pages of the same website to provide more links to the most important pages may improve its visibility.[35] Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic.[35] Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL normalization of web pages accessible via multiple URLs, using the "canonical" meta tag[36] or via 301 redirects, can help make sure that links to different versions of the URL all count towards the page's link popularity score.
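As a sketch, the canonical tag (technically a link element) is placed in the head of each duplicate URL and points to the preferred version, while a 301 redirect performs the same consolidation at the server level. The Apache mod_rewrite rule shown here is one common approach, and the domain is a placeholder.

    <!-- In the <head> of every variant of the page -->
    <link rel="canonical" href="http://www.example.com/product">

    # Apache .htaccess: permanently (301) redirect the non-www host to the www version
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]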
File names
Search engine algorithms favor web page file names that are descriptive of, and relevant to, the information displayed on the page. For a search engine to interpret a page properly, it is best to use keywords rather than arbitrary characters and numbers. Each page should be optimized for a specific keyword or keyword phrase, which should appear not only in the H1 tag but also in the page's file name. Since spaces are not allowed in file names, hyphens or underscores are preferred as word separators.
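As a hypothetical illustration, a page targeting the phrase "blue widget reviews" is better served by a descriptive, hyphenated file name than by a generated identifier:

    Poor:    /products/item.php?id=4862
    Better:  /products/blue-widget-reviews.html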
Google SEO
Google holds over 60% of the total search market.[37] Its algorithm is likewise unique, so ranking on Google carries its own considerations. Although Google uses over 200 criteria to rank sites, they can be grouped into two main categories: on-site and off-site factors.
Google values sites that deliver quality content, relevance, easy navigation, and overall user-friendliness to the site’s visitors (on-site factors). However, a site’s popularity is heavily weighted when Google ranks sites (off-site factors).[38] Google was originally designed to rank sites mostly on the number of inbound links they receive from other sites; in other words, the more often site A is used as a “reference,” the higher it ranks. The anchor text of the links pointing to site A is also very important, as are the popularity and relevance of the sites referencing site A.