Tuesday, October 25, 2005

SEO - The Biggest Mistakes

By Christoph Puetz

Search Engine Optimization is critical to the success of a website. Ignoring search engines and how they apply their rules can cost your website significant traffic. Plenty of words have been written about what to do for search engine optimization, but far fewer about which mistakes to avoid. This article covers a few of the biggest mistakes to avoid when it comes to SEO.

1) Duplicate Content

Do not build websites that offer only duplicate content. Make sure your website has unique character and content. If you use content that has already been published on other websites, add material of your own to make it more unique. In general, your website should have at least 60% unique content to pass the duplicate content filter. The more unique content, the better off the site will be.
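As a rough illustration of what "duplicate" means here, the overlap between two pieces of text can be estimated programmatically. The sketch below uses Python's standard difflib module as one crude proxy; real duplicate-content filters are proprietary to each search engine, and the sample sentences are invented for illustration.

```python
# Estimate how much two pieces of text overlap using difflib.
# This is only a rough proxy for a search engine's duplicate filter.
from difflib import SequenceMatcher

def overlap_ratio(text_a: str, text_b: str) -> float:
    """Return a similarity score between 0.0 (distinct) and 1.0 (identical)."""
    return SequenceMatcher(None, text_a.lower(), text_b.lower()).ratio()

original = "Search engine optimization is critical for the success of a website."
rewritten = "Search engine optimization matters greatly if you want your site to succeed."

score = overlap_ratio(original, rewritten)
print(f"similarity: {score:.2f}")  # the higher, the more duplicated
```

The exact threshold a search engine uses is unknown; a score like this only helps compare one rewrite against another.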

2) Outgoing links

A single page should not carry too many outgoing links to other websites. The danger is high that a website will be considered a link farm if too many outgoing links are present. If all a website has to offer are outgoing links, search engines might not see any value in it and therefore give it a low ranking or even remove it from the index completely. A safe number is ten or fewer outgoing links per page (a "rather safe than sorry" approach).
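To see where a page stands against that ten-link rule of thumb, the outgoing links can simply be counted. A minimal sketch with Python's built-in HTML parser follows; the domain name and sample page are made up for illustration.

```python
# Count outgoing (off-site) links on a page using only the standard library.
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    """Counts <a href> links that point to a different domain."""
    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.outgoing = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        if host and host != self.own_domain:  # absolute link to another site
            self.outgoing += 1

page = '''<a href="/about.html">About</a>
          <a href="http://example.org/seo">SEO tips</a>
          <a href="http://example.net/">Partner site</a>'''

counter = LinkCounter("example.com")
counter.feed(page)
print(counter.outgoing)        # 2 outgoing links
print(counter.outgoing <= 10)  # within the "rather safe than sorry" limit
```

Relative links (like /about.html above) stay on the same site and are not counted as outgoing.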

3) Search Engine Submission Services

Do not use one of those search engine submission services that promise to register your site with thousands of search engines. This is considered search engine spamming and will result in a penalty. If you do want to submit your website to search engines - do it manually. Even better - just link to your site from another one that is already frequently spidered and indexed by search engines. The search bots will then find your website anyway.

Mr. Christoph Puetz is a successful entrepreneur and also an international book author. Websites of Christoph Puetz can be found at The Breast Pump Store and at Highlands Ranch Colorado.

Article Source: http://EzineArticles.com/

Monday, October 24, 2005

Basic Terminology of SEO

By Kellin Keller

The basic terminology of SEO. Here is a compiled list of
search engine and promotion related terms.

Adwords: This is the Pay Per Click advertising program
offered by Google.

Adsense: Contextual advertising by Google. Website
publishers earn a portion of the advertising revenue for
placing Google sponsored links on their site.

Algorithm: The search engine program that ranks sites
based on certain criteria. Google has over 100 different
ranking criteria that sites need to meet.

Automated Submissions: Services that use a web-based tool
or software to submit sites to search engines, free-for-all
pages, and directories. CM SEO does not recommend using
these services, because the submissions can be viewed by the
receiving party as spam.

Bid for click: A search engine model in which you bid for your
site to be ranked under a keyword. See Pay Per Click (PPC)
and Paid Placement (PP).

Code: The background code that runs a web site. As well as
HTML, this can include JavaScript, ASP, PHP, JSP,
ColdFusion and more.

Click popularity: a measure of how often a listing presented
by a search engine is clicked. Some search engines and
directories will rank a site higher on their results pages
if the site proves to be popular among searchers.

Cloaking: also known as "stealth," involves serving a specific
page to each search engine spider and a different one to human
visitors. In most cases, frowned upon by search engines.

Comment tag: html code that describes or documents content.
Most search engines ignore the content of comment tags.

Dead link: An Internet link which does not lead to a page or
site. This usually occurs when a server is down, the page has
moved, or it no longer exists.

Domain: A sub-set of internet addresses. Domains are hierarchical, and lower-level domains often refer to particular web sites within a top-level domain. The most significant part of the address comes at the end - typical top-level domains are .com, .net, .edu, .gov, .org.

Directory: directories are built from submissions made by website
owners, and generally arrange site listings hierarchically. Yahoo! is the best known example.

Doorway page: a web page created solely to achieve high ranking
in search engines for particular keywords, and perhaps for a specific engine. Today's doorway pages should contain valuable and useful content related to your site, and be fully linked to the site, and so are often referred to as "information pages."

Dynamic html: web pages generated on demand by data in databases or using similar technology. Can create ranking problems because a search engine's spider may not retrieve relevant content.

FFA Site: A so-called "free for all links" page, which is created for the sole purpose of compiling links. Submission software or companies that claim to submit your site to hundreds or thousands of "search engines" actually use these for most of that number. FFA sites are essentially worthless in terms of generating traffic, and links from them will count nothing towards your site's link popularity. Basically, they're a waste of time.

Frames: Some sites have pages that are made up of multiple HTML pages. Typically the navigation will be on one page and the content on another; you can tell if you scroll down the page and the navigation remains static. Frames are bad for a site's search engine promotion.

Hand Submissions: A service to develop appropriate titles and
descriptions for submission to directories, search engines, and
advertising sites. The submission is often unique for each directory, being sure to be compliant with all guidelines. CM SEO always offers hand submissions and has great success in getting relevant listings in appropriate directories.

Hidden text: Text that is visible to the search engine spiders but not to site visitors. Used to add extra keywords in the page without actually adding content to a site. Most search engines will penalize Web sites which use hidden text.

Hit: In the context of visitors to web pages, a hit (or site hit) is a single access request made to the server for either a text file or a graphic. If, for example, a web page contains ten buttons constructed from separate images, a single visit from someone using a web browser with graphics switched on (a "page view") will involve eleven hits on the server. In the context of a search engine query, a hit is a measure of the number of web pages matching a query returned by a search engine or directory.

Home page: The main page of a Web site.

HTML (HyperText Markup Language): The coding language that all Web sites use to exist on the Internet.

Hyperlinks: Hyperlinks are used to link one or more documents together.

Impression: A single display of an online advertisement.

Inbound link: A link from another site that directs users to your Web site. When a user arrives at your site from another site, that link is known as an inbound link.

Informational page: a content-filled web page created to focus on
particular keywords. Differs from a "doorway page" in that is wholly integrated into the site and is useful to human visitors, while a traditional "doorway page" is aimed only at search engines.

Invisible text: using a font color the same or close to the color of the background of a page, in an attempt to allow the content to be indexed by search engines while not being visible to humans. To search engines, this is spam.

IP delivery or IP-based delivery: the technique of serving a particular page in response to a page request from a specific IP address. Used in cloaking; a search engine is identified by the IP address it is using, and a page customized for that search engine is served.

Keyword: A word used to find pages when conducting a search.

Keywords meta tag: Due to abuse by many Web sites in the past, search engines have reduced the importance of the keywords meta tag when ranking a Web page for keyword relevance. Many have actually decided not to consider the keywords tag at all. While it has declined in significance, it is still an important meta tag to include in your Web pages.

Keyword density: Keyword density is the ratio of a keyword or key
phrases to the total number of words on that page. Keyword density is one of the most critical aspects of successful search engine optimization.

Keyword phrase: A phrase used to find pages when conducting a search.

Keyword frequency: Keyword frequency is the number of times keywords occur in the text on a given page. Search engines want to see more than one repetition of a keyword in your text to make sure it's not an isolated case.

Keyword prominence: The general location of a keyword or phrase in relation to the overall text on that page. You'll want to make sure your important keywords appear early in your Web site copy and that they draw attention to themselves.

Keyword research: Researching the most relevant and popular keywords for a given site.

Keyword Spamming: Deliberate repetition of keywords in a page by using invisible or tiny text to increase keyword density. This is banned by search engines.

Link popularity: Search engines often use link popularity as part of their ranking criterion. In simple terms, link popularity is the measurement of the number of other Web sites that include a link to your Web site on theirs. Each search engine, depending on their specific algorithms, determines it differently.

Link analysis: a measure of the quality and relevance of the set of links pointing to a given site; contrast with link popularity.

Link Farms: sites created and maintained solely for the purpose of constructing links between member sites. Should be avoided as a violation of most search engines' policies; their use won't build your site's link popularity, and may result in a ranking penalty.

META refresh tag: automatically replaces the current page with a different one within the website, or possibly offsite. In general, use of refresh tags is discouraged or penalized by search engines.

META tag: html tag in the header section of a web page, intended to offer content to search engines. Among them are the keyword and description tags, but these days most true search engines de-emphasize or completely ignore META tags.

Mirror sites: Sites designed as duplicates of an original site, but hosted on a different server. Like cloaking and doorway pages, the creation of mirror sites is a recognized spam tactic, and violators will be penalized by many of the major search engines.

ODP - Open Directory Project: The largest human-edited directory on the Internet. The Open Directory provides listings for free, but only for qualified sites, and because editors are volunteers, wait times can be lengthy.

Outbound link: A link to a site outside of your own.

Page Rank: See also Link Popularity. A numerical rating of a site developed by Google as part of its algorithms for determining search engine listings. Viewing page rank requires installing the Google toolbar in your browser. Yahoo also utilizes Page Rank calculations.

PPC: Pay Per Click. This is an advertising option in which the advertiser has typically a small textual ad on a search engine site and pays only if a user clicks on the link in the ad.

PFI: Pay for Inclusion. This is a fee charged by a search engine to be spidered on a periodic basis to be included in the search engine results. Yahoo has a service that is a combination of PFI and PPC.

Reciprocal link: An exchange of links between two sites.

Relevancy: how closely related a particular page is to the search term requested.

Re-index: How often a search engine updates its index. Google updates its index once a month.

Reputation: related to link popularity, a page will score highest for reputation when it is linked to by pages from other sites which themselves are highly ranked. Well-known sites recognized as "authoritative" are given high reputation scores on their own; it's for this reason that a link to your site from something like cnn.com would be very valuable.

Search engine: A search engine is a database system designed to index and categorize internet addresses, otherwise known as URLs
(for example, http://www.yourdomain.com).

Search engine marketing: encompasses several forms of marketing products and services on the internet through management of information presented by search engines and directories. Included are such elements as site optimization, and the purchase and placement of advertisements.

Search engine optimization (SEO): These are the techniques used to improve a Web page's results in a search.

Search engine positioning: the process of managing a page or site's positioning in the search engines.

Selective delivery: the technique of answering a browser's page request with a specific page selected via an automated process, based on some piece of information gained from the browser. For example, reading the browser's language setting may allow a page in that language to be served. Similar to IP-based delivery.

SERP: A "search engine results page," the page of site listings that a search engine returns in response to a user's entry of a search query. Often used in discussion of the way such a page is laid out, for example: "Overture listings are the first sites presented on Yahoo's SERPs."

Spam: as it applies to search engines, any attempt to submit or place deceptive information, or to "trick" the search engine into placing a page in an inaccurate position.

Spider: A software program used by search engines to crawl the Web, storing URLs and indexing the keywords and text of pages. Spiders are also referred to as crawlers or robots.

Stop word: common words, or words considered by search engines to be irrelevant, are left out by search algorithms. Examples are "and," "the," etc. Generally, a stop word in a query is treated as a "wild card;" that is, the returned results usually won't be exactly the same as if the word had been left out of
the query entirely.

Theme: a relatively recent change in search engine ranking algorithms, theme-based engines essentially try to determine what a page is "about" — and to compare it to other pages that seem to be related to the same topic — and rank it highly for certain keywords that are determined to be related to that page theme.

Traffic: The actual visitors to a Web page or Web site.

Unique Visitor: A real visitor to a web site. Web servers record the IP addresses of each visitor, and this is used to determine the number of real people who have visited a web site. If, for example, someone visits twenty pages within a web site, the server will count only one unique visitor.
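The IP-based counting described above can be sketched in a few lines. The toy log format here (one "ip path" pair per line) is just an assumption for illustration; real server logs are more elaborate.

```python
# Distinct IP addresses approximate "unique visitors" per the definition above.
log_lines = [
    "10.0.0.1 /index.html",
    "10.0.0.1 /about.html",
    "10.0.0.2 /index.html",
]
unique_ips = {line.split()[0] for line in log_lines}
print(len(unique_ips))  # 2 unique visitors across 3 page requests
```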

URL: The Uniform Resource Locator is used to specify the address of Web sites and Web pages.

Author: Kellin Keller

This data was collected from different website glossaries.

Website: WebTraffic provides more targeted traffic.

Have a look at the OneWay linking package.



Article Source: http://EzineArticles.com/

Saturday, October 22, 2005

The Ten Commandments of Search Engine Optimization

By Bhaskar Thakur

Most of the time when we pitch to a new client we are asked for SEO guarantees. “Your competition has guaranteed top results and submission to 100,000 Search Engines and Directories”. We go all out educating clients that Search Engine Optimization is all about smart work, not just adding random keywords and submitting to every directory possible. I’m writing this article to reach out to SEO buyers and help them distinguish the crooks from the genuine SEOs. I’ve compiled my Search Marketing experience over the years into this article. I hope it helps you in selecting your Search Marketing initiative.

Commandment 1: There are no Rank Guarantees. (Period)

Search Engines alone control their indexing and ranking algorithm. Do not try to trick Search Engines. The only way to improve your Search Engine Ranks is by playing by the rules. And the rule is very simple: make it logical. Web content is primarily for the site visitor and not crawlers.

If your Search Engine Optimizer sold you magic (“top rank on Google in 10 days flat”), forget it. There are no shortcuts. Top ranking in Search Engines' natural results will take time. Hard work is imperative, especially on the content of your website and the links to your site.

Commandment 2: Ranking is not the end, it’s the means.

Ask yourself what Top Search Engine Ranks will get you. Most businesses are interested in increasing sales on the website, or at the least in driving qualified traffic. Ranking for the right keywords (keywords used by your target audience) is important. There are SEOs who would try to showcase results for keywords that occur only on your website. Beware of such gimmicks.

Commandment 3: Know your competition.

“Rank” is relative position, and more so in the Search Engine’s natural results. How well you do in the Search Engine Results is a function of how much hard work you have put in compared to the competition. Analyze the competition’s keywords, links, keyword density and spread. But be sure never to copy your competition.

Commandment 4: Use Search Engine Friendly design.

A search- and visitor-friendly design is a must for any successful website. Your website should be compelling enough for repeat visits by search engines and potential customers. Make sure you have Search Engine friendly URLs and avoid long URLs with query strings (http://mywebsite.com/index.php?PHPSESSID=5&a=z&f=g). You should also make sure that your web designer follows global coding standards like W3C (http://www.w3.org).

Commandment 5: Select keywords that are worth targeting.

You must research keywords before targeting them. There are tools that give you a good idea of a keyword's search potential, for example http://www.wordtracker.com/, http://inventory.overture.com/d/searchinventory/suggestion/, and https://adwords.google.com/select/KeywordSandbox. It is important to know the number of searches for a keyword in the last month, the last 6 months and the last year. You should also find out the number of web pages that are targeting the keyword. It is advisable to start a campaign with keywords that have moderate competition and a high number of searches.

Commandment 6: Write Great content.

Even if your website is technically perfect for search engine robots, it won't do you any good unless you also fill it with great content. Great content is contextual and has editorial value. Great content brings repeat visits and increases the chance of conversion. Great content is factual and appeals to the target audience. The web page should have the desired action embedded in the content.
You must also ensure that the content is fresh: keep adding and editing content regularly.

Commandment 7: Use good hyper linking strategy.

Hyperlinks make the content accessible and contextual. You must hyperlink in the right context within the website and to the other websites. Good links are appreciated by the Search Engines and by the visitors. No one likes to be taken to a mall selling “Macintosh” when shopping for “apples”.

Commandment 8: Write relevant and original Meta content.

Meta content is like a business card. Just as your business card tells who you are and what you do, Meta content tells the Search Engines the relevance and context of a web page. Resist the temptation to include everything in the Meta content, but make it detailed. Confused? The idea is to include only what is relevant to the page in the Meta content, but to include everything that is relevant.

Commandment 9: Acquire Relevant Links.

The links you acquire are the roads to your web page for Search Engine bots and visitors. Good links increase your web page's equity on the World Wide Web, and bad links make a dent in that equity and credibility. Be selective in reciprocal linking. Both reciprocal and one-way links work if you are prudent in selecting them. Submit your website to the relevant sections of relevant directories.

Commandment 10: Consult experts, if you need to.

If you have the competence, there are two ways to learn: learning from your own mistakes and learning from others' experience. You could choose either. If you have the time and can wait for the online dollars, do it yourself. If you want to get started now, it may be useful to consult the experts.

The author is an expert in Search Marketing with over 10 years Online Marketing experience. He heads http://www.rankuno.com, the specialist in online marketing and Search Engine Optimization. RankUno empowers its clients around the world with high ROI online marketing programs. He may be reached at bhaskar@rankuno.com .

Article Source: http://EzineArticles.com/

Thursday, October 20, 2005

Online Marketing Made Easy With SEO

By Paul Jesse

When it comes to marketing your business online, you want to reach as many people as possible. For this reason, search engine optimization is very important to your business and making sure you can reach as many potential customers as possible.

There are two ways to go about this. One, you can hire a firm to submit your URL and web site information to hundreds of search engines every month for a year. This takes the effort out of your hands and you still get the desired results. However, this method costs money, and if you are marketing on a tight budget it might not be an option for you. Fortunately, this type of work does not need a professional and you can do it yourself. All you need to do is research all the search engines on the web and then submit your web page URL and information to them. By doing this you are making your web site part of the search process of all the different search engines. This is important because most people use one search engine for all their searching needs and that is it. Because of this, you want to target as many search engines as possible in order to reach as many potential customers as possible.

In addition to listing your URL with as many search engines as possible, you need to stay up to date on new search engines that other potential customers might use. By constantly getting your URL on these search engines you are increasing your chances of marketing your product or service.

Another important aspect to consider when it comes to search engine optimization is where your Web page ranks in the results. If your page comes back as result number 100, it is unlikely people will continue searching that far down the list. You want to make sure your page is returned at the top of the list, and there are several ways to work toward this. First, you will want to make sure your Web site contains a variety of keywords that people might search for when looking for your particular service or business. You will want to have relevant information about these keywords so that when people visit your Web page you have the information they were looking for. This is very important and should be given considerable attention. Finally, once you have your Web page listed and full of useful information, you will want to check regularly to make sure your web page is in the top search results. If not, you will need to modify your information to make sure it is. The best recommendation for search engine optimization is to maintain your Web pages and your standing in the search engines as frequently as possible. By doing so, you have everything to gain.

Paul Jesse is owner of Shea Marketing, published author, retired government employee, private pilot and lifetime student of Internet Marketing. He created SheaMarketing.com to help those interested in starting an online internet business. http://www.sheamarketing.com

Article Source: http://EzineArticles.com/

Wednesday, October 19, 2005

Link Building - Still Good for Search Engine Optimization?

By Christoph Puetz

Different strategies for search engine optimization have been developed over the last few years. One strategy that always comes back as a response when asking what works is to build additional back links to a website that needs to be promoted. This advice is old (old in Internet days) but still very valid. Until Google.com finds a different way of calculating the search engine results, back links will still play a big role and no webmaster should ignore them. The assumption is still that if a site has valuable content or services to offer, other webmasters will link to that particular site. The higher the number of links becomes, the more value Google allocates to that website. Google pretty much allows the Internet community to decide, or rather to help decide, what is good and should show up high in search results.

Since this fact became public knowledge, people have been eager to follow these 'unwritten rules' of Google's search index. Link farms, online link directories and other link-collection sites have been populating the Internet in increased numbers ever since. The value of these sites needs to be determined. Plain link farms do not carry any real value for the user; all they really do is help increase the number of backlinks to a site, and nothing else. There is no value content-wise whatsoever. Link directories carry a little more value, but overall the usage of link directories is steadily going down. Search engines are stepping up and provide better value and more convenience for the user at the same time.

Other websites use their value to sell text links. Users buy these links and basically pay for the link and the hope of gaining a higher Page Rank from it. This can work up to a certain limit. It becomes problematic if the seller of the text links stretches the limits on how many outgoing links a single page will carry. 5 - 10 links are probably most common. However, sometimes you run into websites and web pages that carry 20-30 of these outgoing links. The value of these kinds of links is questionable. In 95% of cases there is no technical reason for a page to carry that many outgoing links. It's not exactly clear at this point whether search engines already degrade the value of inbound links coming from those pages. If they don't, the day that they will put less value on those links is probably very near. So, users buying text links should verify that the pages the link will be placed on do not carry too many outgoing links, to keep the value of the link alive.

About the Author

Christoph Puetz is a successful entrepreneur and international book author. Websites of Christoph Puetz can be found at Pregnancy Announcements and at Highlands Ranch Colorado.

This article can freely be distributed and re-printed as long as the links in the author's section remain active and clickable. This comment about the re-printing permissions does not have to be published.

Article Source: http://EzineArticles.com/

Monday, October 17, 2005

Search Engine Optimization (SEO) – Boost Your Website Traffic

By Rory Canyons

Search engines bring more than 80 percent of the traffic for small to medium websites. This tells you exactly how important it is for small and medium websites to optimize their web structure and pages for search engines. Optimization of your website for search engines includes many aspects: website content, keywords, URL, meta tags, back links, etc. Let’s go through them one by one.

Select the right keywords. You can pay a visit to your competitors’ websites (only those with top-ranked search engine placements). By analyzing their web content and meta tags, you can easily find out the keywords they are using. Overture’s keyword selection tool can also provide you with valuable information. Open your favorite browser, go to http://inventory.overture.com/d/searchinventory/suggestion/, type your keywords or phrases into the search box and see how many times they were searched last month. Select those with high search frequency. Google also has a similar keyword selection tool at https://adwords.google.com/select/KeywordSandbox. You can try both of them and balance the results.

Target your web site content at the selected keywords. After finalizing the keywords, you can now build up your web pages with them. But be careful: don’t overuse any keywords on your web pages. Overuse of keywords may make search engine spiders think you are spamming and get your website banned. How can you judge whether you are overusing them? Go to http://www.gorank.com/seotools/ to check your keyword density and make sure it is within a reasonable range.
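Keyword density - the ratio of keyword occurrences to total words, as defined in the glossary earlier on this blog - can also be computed locally. A minimal sketch follows; the sample copy is invented, and what counts as a "reasonable range" remains a judgment call.

```python
# Keyword density: occurrences of the keyword divided by total words, as a %.
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

copy = ("Search engine optimization helps a website rank. "
        "Good optimization starts with content, and optimization "
        "alone cannot replace useful content.")

print(f"{keyword_density(copy, 'optimization'):.1f}%")  # 3 of 19 words
```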

Make search engine friendly URLs. Although some search engines can follow all dynamic URLs, like http://www.scriptmenu.com/detail.php?id=25257, some of them still prefer static URLs ending with html, htm, etc. To make search engine friendly URLs, you can create real static pages, but you don’t have to. The web server’s URL rewrite engine can make this job much easier by reinterpreting the URLs before fetching the actual pages. If you need more help or tips on how to implement URL rewriting, follow the link http://www.scriptmenu.com/detail_24379.html to get a tutorial.
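In practice the rewriting happens in the web server's configuration (Apache's mod_rewrite is the classic example), but the mapping itself is simple. Here is a Python sketch of the idea, using a hypothetical rule modeled on the URLs quoted above:

```python
# Translate a static-looking URL back to the dynamic script it really serves.
import re

def rewrite(static_path):
    match = re.fullmatch(r"/detail_(\d+)\.html", static_path)
    if match:
        return f"/detail.php?id={match.group(1)}"
    return static_path  # no rule matched; serve the path as-is

print(rewrite("/detail_24379.html"))  # -> /detail.php?id=24379
print(rewrite("/index.html"))         # unchanged
```

The visitor and the search engine only ever see the static-looking address; the rewrite happens server-side before the page is generated.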

Get quality backward links to your page. Although keyword optimization of your web pages can significantly improve your search engine placement, it is still far from sufficient to get your pages top ranked. You have to get some quality backward links to your website. You need at least 35 quality back links to make Google come to your web site and take a look at you. You can get these quality links by submitting your site to high-ranked web site directories or by writing quality articles and submitting them to high-ranked online article archives. Many other ways exist, but remember: only backward links from quality web sites count. Websites poorly indexed or with very low search engine rankings have no value to you.

Keep on improving your website. With keyword optimization, URL optimization and quality backward links, your website should have gotten a remarkable search engine placement. However, the placement is not static; your competitors are optimizing their websites and trying to kick you out of your current position. To maintain a good search engine ranking, you have to keep on improving your website. Keep on optimizing your website navigation, content and structure. Keep on getting more quality links from top-rated sites. The battle for top search engine positions will never end. Good luck, :-)

Rory Canyons is the founder of ScriptMenu.com. For more information, visit http://www.scriptmenu.com

Article Source: http://EzineArticles.com/

Thursday, October 13, 2005

The Power Of Article Marketing With SEO...

By Kenneth Doyle

There are many ways to market your business on the Internet, and using search engine optimized articles has to be one of the "keeper" strategies for getting pre-qualified 'natural' search engine traffic to your web site.

Now, I'm sure you've heard all this before: "people search the Internet looking for information (aka articles) on how to solve a particular problem". People just don't search the web to buy your 'stuff'!

If you truly understand this search "path", then you understand the meat and potatoes of how the web works: it's an information resource for people.

To completely understand the above concept 'follow the money'. Search engines need content to be able to rank web sites, so that they continually get searchers BACK to their engines. The more searchers they have searching for relevant content on a search engine means more revenue for that search engine (e.g. the success of Google adsense).

This is why blogs and RSS have become the "buzz". Blogs and RSS make the content (copy, words) "fluid"... meaning that the content changes often. This means that the search engine bots love these technologies, visit more often, and then rank the content well (if the blog articles are keyword optimized).

To see how this works, go to any news site. News sites have a volume of ever-changing content. For example, CNN gets spidered (visited) by the Googlebot something like 28,000 times a day.

Why? Because of CNN's ever changing content, and I guess the 'bots' are 'lazy'? They go to where the good, and changing content is more often.

The SEO aspect of article writing.

Think of SEO as "filing". What I mean by this is that good search engine optimization strategies help the search engine to "file" your content appropriately. Good keyword analysis gives you the information to enable the search engine bot with the 'right' filing.

There are no smoke and mirror tactics here. Your Mother was right when she told you that 'telling the truth is always better in the long run'. This principle especially applies to good SEO practice.

Now, how do we take this concept of "fluid", search engine optimized content and turn it into a Links IN bonanza for your own web site (thus sending your rankings through the roof over time, so to speak)?

These are the core principles...

1. Ensure that the link for your article is on a web page hosted on a unique IP address.

2. Ensure that the link for your article is on a web page that has a unique, wholly independent set of backlinks.

3. Ensure that the link is on a web page that is at least loosely relevant to the topic of your own web site.
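The first of these principles is easy to check mechanically. As a rough illustration (not the author's method), here is a small Python sketch that resolves a list of article-hosting domains to IP addresses and counts how many are unique; the host names are hypothetical placeholders.

```python
import socket

# Hypothetical list of sites hosting your articles (illustrative only).
article_hosts = ["directory-one.example", "directory-two.example"]

def resolve_ips(hosts):
    """Map each hostname to its IP address (None if it doesn't resolve)."""
    ips = {}
    for host in hosts:
        try:
            ips[host] = socket.gethostbyname(host)
        except socket.gaierror:
            ips[host] = None
    return ips

ips = resolve_ips(article_hosts)
unique = {ip for ip in ips.values() if ip}
print(f"{len(unique)} unique IPs across {len(article_hosts)} hosts")
```

If most of your article links resolve to the same IP, they are likely hosted on the same server and carry less independent weight.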

You see, it's not just a matter of having your articles on just any article directory. Your articles need to be in a category (and preferably a specialist directory) that is within "theme". In other words, you would never post your article on "looking after your cattle dog" on a directory that was focused on Financial Management.

In addition to placing your articles on article directories, you can also run them on press release service sites, and of course there are RSS strategies (but that's a whole other story, and another article).

Kenneth Doyle Is A Writer And Internet Marketing Consultant,
*Find Out How His [Keyword Optimized] Article Writing And Submission Service Gets Thousands Of Prospects To Read YOUR Web Site Offers, Here... http://www.feedyourhungrymind.com/articlesam4

Article Source: http://EzineArticles.com/

Wednesday, October 12, 2005

The Good and the Bad of SEO – From Google's Mouth!

The Good and the Bad of SEO – From Google's Mouth!
By Rob Sullivan

In this article I highlight some of the points made during the call so you know what Google thinks.

You know it’s bad when you take time from your holidays to come into work to attend a conference call. But that’s what I did a few weeks ago. You see, I had to, because I was going to have the opportunity to ask some Google employees specific questions on things that I’d been pretty sure about, but wanted to hear right from the horse’s mouth.

The call lasted less than an hour, but in that time I found that many things I had figured were indeed true. So let’s start with the most obvious:

Is PageRank still important?

The short answer is yes – PageRank has always been important to Google. Naturally they couldn’t go into details but it is as I suspected. Google still uses the algorithm to help determine rankings. Where it falls in the algo mix, though, is up for speculation. My feeling however is that they’ve simply moved where the PageRank value is applied in the grand scheme of things. If you want to know what I think, be sure to read this article.

Are dynamic URLs bad?

Google says that a dynamic URL with 2 parameters “should” get indexed. When we pressed a bit on the issue we also found that URLs themselves don’t contribute too much to the overall ranking algorithms. In other words, a page named Page1.asp will likely perform as well as Keyword.asp.

The whole variable thing shouldn’t come as a surprise. It is true that Google will indeed index dynamic URLs and I’ve seen sites with as many as 4 variables get indexed. The difference however is that in almost all cases I’ve seen the static URLs outrank the dynamic URLs especially in highly competitive or even moderately competitive keyword spaces.
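Counting a URL's parameters is straightforward with Python's standard library. This is a small sketch of a checker based on the "2 parameters" figure mentioned above; the URLs and the `likely_indexable` helper are illustrative, not an official Google rule.

```python
from urllib.parse import urlparse, parse_qs

def parameter_count(url):
    """Count the distinct query-string parameters in a URL."""
    return len(parse_qs(urlparse(url).query))

# Per the call, dynamic URLs with up to 2 parameters "should" get indexed.
def likely_indexable(url, limit=2):
    return parameter_count(url) <= limit

print(likely_indexable("http://example.com/page.asp?cat=5&id=12"))   # True
print(likely_indexable("http://example.com/p.asp?a=1&b=2&c=3&d=4"))  # False
```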

Is URL rewriting OK in Google’s eyes?

Again, the answer is yes, provided the URLs aren’t too long. While the length of the URL isn’t necessarily an issue, if they get extremely long they can cause problems.

In my experience, long rewritten URLs perform just fine. The important thing is the content on the page.
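For readers unfamiliar with URL rewriting, the idea is simply to map a short, static-looking path to the dynamic script that actually serves it. This Python sketch mimics what a web server rewrite rule does; the paths and script names are hypothetical.

```python
import re

# Each rule maps a pretty, static-looking path to an internal dynamic URL.
REWRITE_RULES = [
    # /widgets/42 -> /product.asp?id=42
    (re.compile(r"^/widgets/(\d+)$"), r"/product.asp?id=\1"),
    # /articles/seo-basics -> /article.asp?slug=seo-basics
    (re.compile(r"^/articles/([\w-]+)$"), r"/article.asp?slug=\1"),
]

def rewrite(path):
    """Return the internal URL for a pretty path, or the path unchanged."""
    for pattern, target in REWRITE_RULES:
        if pattern.match(path):
            return pattern.sub(target, path)
    return path

print(rewrite("/widgets/42"))   # /product.asp?id=42
```

In practice this mapping lives in your server configuration (e.g. Apache's mod_rewrite), not in application code, but the logic is the same.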

That was a common theme throughout the call – content is king. Sure optimized meta tags, effective interlinking and externalizing JavaScript all help, but in the end if the content isn’t there the site won’t do well.

Do you need to use the Google Sitemap tool?

If your site is already getting crawled effectively by Google you do not need to use the Google sitemap submission tool.

The sitemap submission tool was created by Google to provide a way for sites which normally do not get crawled effectively to now become indexed by Google.

My feeling here is that if you MUST use the Google sitemap to get your site indexed then you have some serious architectural issues to solve.

In other words, just because your pages get indexed via the sitemap doesn’t mean they will rank. In fact I’d bet you that they won’t rank because of those technical issues I mentioned above.

Here I’d recommend getting a free tool like Xenu and spidering your site yourself. If Xenu has problems, you can be almost assured that Googlebot will have crawling problems too. The nice thing about Xenu is that it can help you find those problems, such as broken links, so that you can fix them.

Once your site becomes fully crawlable by Xenu I can almost guarantee you that it will be crawlable and indexable by the major search engine spiders.
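To give a flavour of what a crawler like Xenu does on each page, here is a minimal Python sketch that extracts the internal links a spider would follow from a page's HTML. The page snippet and host names are hypothetical; a real link checker would also fetch each link and report broken ones.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collect href targets from anchor tags, resolved against the page URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def internal_links(html, base_url):
    """Return only links on the same host -- the ones a site crawler follows."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    host = urlparse(base_url).netloc
    return [l for l in parser.links if urlparse(l).netloc == host]

page = '<a href="/about.html">About</a> <a href="http://other.example/">Out</a>'
print(internal_links(page, "http://mysite.example/"))
```

If a menu renders its links only through script, no anchors appear in the HTML and a parser like this finds nothing to follow — which is exactly the JavaScript-navigation problem discussed later in the article.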

Does clean code make that much of a difference?

Again, the answer is yes. By externalizing whatever code you can, and cleaning up things like tables, you can greatly improve your site.

First, externalizing JavaScript and CSS helps reduce code bloat which makes the visible text more important. Your keyword density goes up which makes the page more authoritative.

Similarly, minimizing the use of tables also helps reduce the HTML to text ratio, making the text that much more important.
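The HTML-to-text ratio mentioned here is easy to estimate yourself. A rough Python sketch, assuming a crude tag-stripping approach (a real tool would use a proper HTML parser):

```python
import re

def text_to_html_ratio(html):
    """Fraction of the page source that is visible text vs. total markup."""
    # Strip tags crudely; good enough for a rough comparison.
    text = re.sub(r"<[^>]+>", "", html)
    text = re.sub(r"\s+", " ", text).strip()
    return len(text) / len(html) if html else 0.0

bloated = "<table><tr><td><b>Hi</b></td></tr></table>"
lean = "<p>Hi there, welcome to the widget guide.</p>"
print(round(text_to_html_ratio(bloated), 2))
print(round(text_to_html_ratio(lean), 2))
```

The table-heavy snippet scores far lower than the lean paragraph: the same visible text is buried under much more markup.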

Also, as a tip, your visible text should appear as close to the top of your HTML code as possible. Sometimes this is difficult, however, as elements like top and left navigation appear first in the HTML. If this is the case, consider using CSS to reposition the text and those elements appropriately.

Do Keywords in the domain name harm or help you?

The short answer is neither. However too many keywords in a domain can set off flags for review. In other words blue-widgets.com won’t hurt you but discount-and-cheap-blue-and-red-widgets.com will likely raise flags and trigger a review.

Page naming follows similar rules – while you can use keywords as page names, it doesn’t necessarily help (as I mentioned above). Further, long names can trigger reviews, which will delay indexing.

How many links should you have on your sitemap?

Google recommends no more than 100 links per page.

While I’ve seen pages with more links get indexed, it appears that it takes much longer. In other words, the first 100 links will get indexed right away, but it can take a few more months for Google to identify and follow any links beyond 100.

If your site is larger than 100 pages (as many are today) consider splitting up your sitemap into multiple pages which interlink with each other, or create a directory structure within your sitemap. This way you can have multiple sitemaps that are logically organized and will allow for complete indexing of your site.
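Splitting a sitemap along those lines is a one-liner. A quick Python sketch, using a hypothetical 250-page site:

```python
def split_sitemap(urls, per_page=100):
    """Split a flat URL list into sitemap pages of at most `per_page` links."""
    return [urls[i:i + per_page] for i in range(0, len(urls), per_page)]

# Hypothetical site with 250 pages -> three sitemap pages (100, 100, 50).
urls = [f"http://example.com/page{n}.html" for n in range(250)]
pages = split_sitemap(urls)
print([len(p) for p in pages])   # [100, 100, 50]
```

Each resulting page would then link to the next (and back to the first), so crawlers can walk the whole set.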

Can Googlebot follow links in Flash or JavaScript?

While Googlebot can identify links in JavaScript, it cannot follow those links. Nor can it follow links in Flash.

Therefore I recommend having your links elsewhere on the page as well. It is OK to have links in Flash or JavaScript, but you need to account for the crawlers not finding them. The use of a sitemap can help get those links found and crawled.

As an alternative, I know there are menus which use JavaScript and CSS to output a navigation system that looks very similar to common JavaScript navigation, yet uses static hyperlinks which crawlers can follow. Do a little research and you should be able to find a spiderable alternative to whatever type of navigation your site currently has.

Overall, while I didn’t learn anything earth-shattering, it was good to get validation “from the horse’s mouth”, so to speak.

I guess it just goes to show that there is enough information out there on the forums and blogs. The question becomes determining which of that information is valid and which isn’t. But that, I’m afraid, usually comes with time and experience.

About the author:
Rob Sullivan - SEO Specialist and Internet Marketing Consultant. Any reproduction of this article needs to have an html link pointing to http://www.textlinkbrokers.com

Article Source: http://EzineArticles.com/