Friday, August 6, 2010

Search Engine Optimization Techniques

Believe it or not, SEO is all about basic common sense and simplicity. The purpose of search engine optimization is to make a website as search-engine-friendly as possible. It's really not that difficult. SEO 101 does not require special knowledge of algorithms, programming, or taxonomy, but it does require an understanding of the basic principles of search engines.

In the interest of brevity, this piece starts with a few assumptions. The first assumption is that a small business website is up and running. The second is that the site is written in a fairly standard markup language, such as HTML, or generated with PHP. The last is that some form of keyword research has already taken place and the webmaster is confident in the choice of keyword targets.

There are two aspects of search engines to consider before jumping in. The first is how spiders work. The second is how search engines figure out which pages relate to which keywords and phrases.

In simplified form, search engines collect data about a unique site by sending an electronic spider to visit the site and copy its contents, which are then stored in the search engine's database. Widely known as "bots", these spiders are designed to follow links from one page to the next. As they copy and absorb the content of one page, they record its links and send other bots to copy the content of those linked pages. This process continues indefinitely. By sending out spiders and collecting information 24/7, the major search engines have built databases whose sizes are measured in the tens of billions of pages.

Knowing how spiders find and read information is the technical end of basic SEO. Spiders are designed to read site content much like you and I read a newspaper. Starting in the top left corner, a spider reads a page's content line by line, left to right. If columns are used (as on most sites), a spider will follow the left-hand column to its conclusion before moving on to the center and right-hand columns. If a spider finds a link it can follow, it records that link and sends another bot to copy and record the data on the linked page. The spider proceeds through the entire site this way, recording everything it can possibly find there.

As spiders follow links and record everything in their paths, it is safe to assume that if a link to your site exists, a spider will find it. There is no need to manually or electronically submit your site to the major search engines. Search spiders are quite capable of finding it on their own, provided a link to your site exists somewhere on the Internet. Search engines also have an uncanny ability to judge the topic or theme of the pages they examine, and they use this ability to assess the topical relationship between pages that link to each other. The most valuable incoming links come from sites that share topical themes.

When a search spider finds your site, helping it get around is the first priority. One of the most important basic SEO tips is to provide clear paths for spiders to follow from point A to point Z on your site. This is easily accomplished by providing easy-to-follow text links to the site's most important pages in the navigation menu, or simply at the bottom of each page. One of those text links should lead to a text-based sitemap, which lists and provides a text link to every page on the site. The sitemap can be the most basic page on the site; its purpose is to help spiders more than lost site visitors, though designers should keep visitors in mind when creating it. Google also accepts a more modern, XML-based sitemap format, which is documented in its Webmaster Help pages.
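As a minimal sketch of this idea, a footer block of plain text links like the one below gives spiders a crawlable path from any page (the page names here are hypothetical examples, not from the original article):

```html
<!-- Simple footer text links a spider can follow from every page.
     The page names below are placeholder examples. -->
<div id="footer-nav">
  <a href="/index.html">Home</a> |
  <a href="/products.html">Products</a> |
  <a href="/about.html">About Us</a> |
  <a href="/contact.html">Contact</a> |
  <a href="/sitemap.html">Site Map</a>
</div>
```

Because these are ordinary text links rather than script- or image-based navigation, every spider can read and follow them.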

There will be cases where allowing spiders free access to every page on a site is not desirable. You therefore need to know how to tell spiders that certain parts of the site are off-limits and should not be added to their databases, using a robots.txt file. (To learn more about creating one, start with Jennifer Laycock's article on robots.txt foundations.)
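As a rough illustration (the directory names here are hypothetical), a robots.txt file placed in the site's root directory might look like this:

```
# Tell all spiders to skip private areas; everything else may be crawled.
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/
```

The `User-agent: *` line applies the rules to all well-behaved bots, and each `Disallow` line names a path spiders should not fetch or index.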

Managing spider access to the areas of the site you want them to reach is only half the job. The other half lies in the content of the site. Search engines must provide their users with lists of pages that relate to the words people type into the search box. That means a search engine has to determine which of billions of pages relate to a small number of specific words. To rank for those words, your site must show the search engine how it relates to them.

To begin with, there are several elements a search engine looks at when evaluating a page. After the URL of the page, the search spider records the site's title. It also notes the description meta tag. Both elements are found in the "head" section of the source code.

Titles should be written with your strongest keyword targets as their basis. Some titles are written by combining two or three basic two- or three-keyword phrases. The key to writing a good title is remembering that human readers will see it as the linked headline on a search engine results page. Do not overload your title with keyword phrases. Focus on the strongest keywords that best describe the topic of the page content.

The description meta tag is also quite important. Search engines generally use it to gather information about the subject or theme of the page. A well-written description is formulated in two or three complete sentences, with the strongest keyword phrases woven into each sentence. As with the title tag, some search engines display the description on results pages, typically using it, in whole or in part, as the text that appears under the clickable link.
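Putting the two elements together, the "head" section of a page might look like the sketch below. This reuses the article's "Blue Widgets" scenario; the company name and wording are illustrative, not prescribed:

```html
<!-- Hypothetical head section for a "Blue Widgets" product page. -->
<head>
  <title>Blue Widgets - Construction Widgets for Builders and Contractors</title>
  <meta name="description" content="Blue Widgets by Smith and Co. are the
    strongest construction widgets available. Leading builders and
    contractors trust Blue Widgets for durable construction.">
</head>
```

Note how the title leads with the strongest phrase ("Blue Widgets") and the description uses complete sentences with keyword phrases woven in naturally.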

Due to abuse by webmasters, such as stuffing it with irrelevant terms, search engines now place minor (if any) weight on the meta keywords tag. Thus, there is no need to spend much time worrying about it.

After reading the information found in the "head" section of the source code, spiders continue on to examine the site's content. It helps to remember that spiders read the same way we do, left to right and following columns.

Good content is the most important aspect of search engine optimization. The simplest, most basic rule of SEO is that search engine spiders can be counted on to read basic body text 100% of the time. By putting your main content in plain body text, you offer the engines information in the format that is easiest for them to read. While some search engines can strip text and link content out of Flash files, nothing compares to basic body text when it comes to feeding information to the spiders. There is almost always a way to work basic body text into a site without compromising the look, feel, and functionality the designer intended.

Content should be thematically focused. In other words, keep it simple. Some sites cram many topics onto each page, which is confusing for spiders. The basic SEO rule here is: if you want to cover more than one topic, you need more pages. Fortunately, creating new pages, each devoted to a unique topic, is one of the basic methods of SEO, and it makes the site simpler for both live visitors and electronic spiders.

When writing page content, try to use a strong targeted keyword near the beginning of the copy. For example, a website selling "Blue Widgets" might use the following as its lead sentence:

"Blue Widgets by Smith and Co. are the strongest construction widgets available and are trusted by leading builders and contractors."

The primary target, obviously, is "blue widgets". Placing the keyword phrases "blue widgets" and "construction widgets" alongside other related words such as "strongest", "trusted", "builders", and "contractors" in the sentence is designed to help search engines see the relationship between those words. Subsequent sentences would also have keywords and phrases woven into them. One thing to keep in mind when writing page copy is that unnecessary repetition of keywords (keyword stuffing) is often considered spam by search engines. Another is that, ultimately, written copy is meant for human eyes as well as search engine spiders. Read your copy aloud. Does it make sense and sound natural? If not, you have over-used your keyword phrases and need to make adjustments.

Another important element a spider examines when reading a site (and later relating its content to user queries) is the anchor text used in internal links. Using relevant keyword phrases in anchor text is a basic SEO method aimed at solidifying the search engine's perception of the relationship between pages and the words used in the links. For example, we also have a popular series of articles on SEO basics written by Stoney deGeyter. In that context, the phrase "SEO basics" is an example of using a keyword phrase in anchor text. Phrases such as "SEO 101" or "SEO for beginners" could also be used.
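To sketch the difference (the URL here is hypothetical), compare a generic link with one whose anchor text describes the target page:

```html
<!-- Weak: the anchor text tells the spider nothing about the target page. -->
<a href="/seo-basics.html">Click here</a> to read our articles.

<!-- Better: the anchor text itself names the linked page's topic. -->
Read our series on <a href="/seo-basics.html">SEO basics</a> for beginners.
```

In the second version, the words inside the link reinforce the topical connection between the linking page and the page being linked to.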

Remember that the basis of successful site optimization is simplicity. The goal is a site that is easy to find, easy to follow, and easy to read for both search spiders and live visitors, with well-written, relevant content and topical incoming links. Although basic SEO can be time-consuming in the early stages, the results are worth the effort and create a foundation for more advanced work down the road.
