SEO Guide for Beginners from Google

Is Google getting into the SEO business? Such a move from Google would create a potential conflict of interest. But even Google is telling us that there’s nothing wrong with SEO, and Google’s official Webmaster Central Blog is now formally offering a free “SEO Starter Guide” with practical search engine optimization tips for new webmasters on how to improve their website’s performance in Google.
Google SEO Guide
This search engine optimization guide comes as a 22-page PDF. Google uses the same principles internally for optimizing its own sites, such as YouTube. This starter SEO guide from Google covers dozens of topics, such as:
• Title tag
• Description Meta Tag
• URL Structure
• Site Navigation
• Content Creation
• Anchor Text
• Heading Tags (H1, H2, etc.)
• Optimizing Images
• Effective Use of Robots.txt
• Rel=”nofollow”
• Website Promotion
• Webmaster Tools
• Web Analytics
• More Helpful Resources
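Most of these topics are markup and configuration concerns. As a quick illustration of a few of them (the title tag, description meta tag, heading tags, and image alt text), here is a small Python sketch that audits a page for those elements; the URL is a placeholder and the script is my own illustration, not part of Google's guide.

    # Minimal on-page audit sketch: checks a page for a title tag, a description
    # meta tag, heading tags, and images missing alt text. Standard library only.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class OnPageAudit(HTMLParser):
        def __init__(self):
            super().__init__()
            self.in_title = False
            self.title = ""
            self.meta_description = None
            self.headings = []            # e.g. ["h1", "h2", ...]
            self.images_missing_alt = 0   # images with no alt or an empty alt

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "title":
                self.in_title = True
            elif tag == "meta" and attrs.get("name", "").lower() == "description":
                self.meta_description = attrs.get("content")
            elif tag in ("h1", "h2", "h3"):
                self.headings.append(tag)
            elif tag == "img" and not attrs.get("alt"):
                self.images_missing_alt += 1

        def handle_endtag(self, tag):
            if tag == "title":
                self.in_title = False

        def handle_data(self, data):
            if self.in_title:
                self.title += data

    # Example usage (example.com is just a placeholder URL):
    html = urlopen("http://www.example.com/").read().decode("utf-8", "ignore")
    audit = OnPageAudit()
    audit.feed(html)
    print("Title:", audit.title.strip())
    print("Meta description present:", audit.meta_description is not None)
    print("Headings found:", audit.headings)
    print("Images without alt text:", audit.images_missing_alt)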
The guide is well written and is aimed at new webmasters and website owners who are looking for basic training in SEO techniques. I feel it is a good guide for newcomers to cover the basics of SEO before they move on to more advanced material.

Caffeine – Google’s answer to Bing

Microsoft launched its new search engine Bing in June, positioning it as a “decision engine” that delivers more than just page results. Microsoft also partnered with Yahoo recently, gaining a license to Yahoo’s search technology, which will allow it to extend Bing to all of Yahoo’s Web properties. Microsoft has already started to take some search engine market share from Google. Facebook is also challenging Google with a new real-time search engine.
Google is, however, not taking these developments lightly. For the last several months, Google has been secretly working on a new project, codenamed Caffeine, the next generation of the Google search engine. This is not going to be a minor upgrade, but an entirely new architecture for the Google search engine. In short, it will be a completely new version of Google.
The next generation search architecture dubbed Caffeine is still under development, but Google has released a beta version of Caffeine to the public. Though very little is known about Caffeine’s algorithms and inner workings, its launch comes at the right time, just as Microsoft is making an all-out effort to grab a share of the search engine marketplace, where it currently sits in third position.
According to a statement released by Google, the objective of the new version of Google Search is to improve its size, indexing speed, accuracy, and comprehensiveness. There is no confirmation from Google about when it will formally launch the new Caffeine architecture.
We tested Caffeine with a few keyword phrases and found that it is much faster and has more indexed pages. Search results are, as expected, a little different from the ones you get in the current version of Google. Why don’t you take a test drive yourself to see how the new Google search engine works?

Hiring the right SEO expert

As an online business owner, you need to understand how to drive targeted traffic to your website. Without traffic, you can’t get business, and you certainly can’t make any sales. There are many different traffic generation methods, and search engine optimization (SEO) is one of the proven ways to get targeted traffic to your website. If you are new to the Web and just starting out, you will either need to learn SEO or hire an SEO expert to promote your website.
SEO changes at a rapid pace, and you would need to spend a lot of time keeping up with the latest techniques. SEO can be a time-consuming process. As a business owner, you need to spend your time developing and growing your business. That’s why it makes sense to hire an SEO consultant to do the work for you. Outsourcing SEO work to an SEO company in countries like India is also a cost-effective way to get the job done. However, there are many self-proclaimed SEO experts who make tall claims but end up doing your site more harm than good, so you need to take care in selecting the right SEO firm.

Here are a few tips for hiring an SEO expert:

  1. First, go through some SEO books and get familiar with search engine optimization.
  2. Subscribe to Google’s YouTube channels and watch some informative videos about search engine optimization.
  3. Ask for referrals for SEO experts in your professional network. Testimonials and recommendations from your network will filter out the fly-by-night SEO experts.
A good SEO expert will have a proven track record. Before you hire any SEO professional, ask about previous projects done on other client websites. Ask what keywords those sites are ranking for, and verify everything yourself. You should also ask for proof that the work was done by him and not by some other company. Another good test is to check the Google PageRank of the SEO company’s own website. Evaluate each candidate on these criteria of proven rankings and previous results. Find the right SEO expert to work for you, and you will be closer to success.
If you decide to do SEO yourself for your own site, then you must understand that it is a very time-consuming process, especially when you are new to the field. And if SEO is done improperly, your website can actually get penalized by the search engines.
The main goal of any SEO strategy should be to get your website ranked for the keywords your customers will be using to find your products and services, driving traffic to your website and increasing sales. An SEO expert ensures that your website will continue to rank high in spite of the constant updates to search engine algorithms.
Read this interesting post if you wish to know more about how fake SEO experts cheat people!

Google Algorithm Changes

There is a nice article about the latest changes in the Google algorithm by Per Strandberg.
After Google’s latest algorithm update, nicknamed “Florida”, many webmasters discovered that their traffic plummeted. So what exactly happened? More importantly, what can you do about it as a Google SEO? And what will Google do in the next updates? What happened was that Google made an algorithm change in how it rates web pages.
Every time you make a search, Google tries to show the most relevant web pages that match your search term. To stay ahead of its competitors, Google has to constantly adjust and improve how it judges web pages. By being able to give the most relevant results for queries, it has become the most used search engine in the world.
Because this judgment is done automatically by software, many webmasters have been modifying their websites in order to improve their ranking in the search results. To do this, they have exploited various shortcuts and loopholes made possible by shortcomings in the software algorithm. Google makes periodic changes to stop some webmasters from gaining unfair advantages, plugging one or two of the loopholes each time.
This is what happened during the Florida update. With this update Google introduced new algorithms intended to stop the overuse of certain search engine optimization techniques.
More specifically, they seem to have targeted search terms found in text links, also called anchor text. Web pages with good positions in the search results, which had a disproportionate number of inbound links from other web pages with the exact same search term in the anchor text that the page was optimized for, suddenly disappeared from the listings. The pages did not disappear altogether, just for the search term the page was optimized for.
For Google, a high proportion of anchor texts with the same wording indicates that the links were put there for one purpose only: to boost ranking.
One suggestion is to spread out the anchor text with a mix of different wordings to keep your page in the search results. We don’t know if your pages will come back after some time if you do this, but it is likely.
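To make the idea concrete, here is a small sketch (using made-up anchor texts and an arbitrary threshold, not anything Google has published) of how you might measure how concentrated your inbound anchor text is:

    # Measure how concentrated a page's inbound anchor text is. A very high share
    # for one exact phrase is the pattern the Florida update appeared to target.
    from collections import Counter

    anchors = [
        "cheap web hosting", "cheap web hosting", "cheap web hosting",
        "cheap web hosting", "hosting reviews", "Example Hosting Co",
    ]

    counts = Counter(a.lower() for a in anchors)
    text, hits = counts.most_common(1)[0]
    share = hits / len(anchors)

    print(f"Most common anchor: '{text}' used in {share:.0%} of inbound links")
    if share > 0.5:  # arbitrary illustrative threshold
        print("Anchor text looks over-concentrated; consider varying it.")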
Apparently, the search results generated after the latest update have been of a lower quality than before.
What seems to have happened is that a large percentage of web sites have traded links with one another. This link trading has been done with the same search term in the anchor text that they optimized their pages for.
The victims more often than not have been commercial web sites that relied too heavily on search engine optimization techniques.
The search results have instead been taken over by web sites composed of low-quality directories and link farms.
Now, what will Google do next? I don’t know, but TRY TO THINK like Google! This is what I would do if I were responsible for this at Google.
First, I think they will modify and adjust the new algorithm introduced in the latest update: changing the threshold, or not letting the “over-optimized” pages drop out of the search results so easily, but rather penalizing them and pushing them below the threshold point.
I think Google has a problem! You see, many “over-optimized” sites are of higher quality than those that are not. To simply drop them out and say that there are enough pages for the same search term is not always right.
There is a thin line between optimization and spamming, and it is not obvious where that boundary should be drawn.
After this, what will Google do next? It is clear to me that the many low-quality directory sites found in Google’s search results are a nuisance to Google and to the average web user.
It is in this area that, I think, they will make the next modifications.
Google rates web pages according to relevance. The level of relevance is judged based on the web page content and/or how popular the web page is in the view of Google.
To get a page popular you need to have links from other pages. This can come from pages on your own site or from other sites.
Ideally these links should be many, come from pages dealing with similar or identical subject or come from pages that themselves are popular. The best is to have many links from pages dealing with the same subject that themselves are popular.
This has led to intense link exchange activity among webmasters. The primary reason has been to achieve better ratings, not to increase the value of the visitor’s experience.
This goes against Google’s principles.
To quote Google webmaster guidelines:
  • Make pages for users, not for search engines.
  • Avoid tricks intended to improve search engine rankings.
  • Don’t participate in link schemes designed to increase your site’s ranking or Page Rank.
To counter this, I think Google will target several popularity-increasing schemes, such as:
- Low-value directory sites which have been created automatically by robots. These sites contain extracts taken from search engines and directories.
Google can easily spot these sites.
- The building of link directories attached to web sites. They are built with link-partner extraction software and services, with which you can upload directory structures directly into your site. This way you can build up a massive number of link partners and also identify link partners with high PageRank values.
Of course, one can argue that by doing this you add to your visitors’ experience, as the directories make it easy for them to find similar web sites.
However this is an argument that Google most likely would disagree with.
Web sites using tactics like this are easily identifiable by Google. The directory pages are composed of outgoing links which have the title, meta description, or other content taken directly from the web pages they link to.
Google just has to compare the text in the directory with the text on the linked web pages and look for matches.
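As a toy illustration of that kind of matching, a simple word-overlap check between a directory entry and the page it links to already catches verbatim copying. This is only a sketch of the general idea with made-up text, not Google's actual method.

    # Toy sketch: compare the words in a directory entry with the words on the page
    # it links to. Near-total overlap suggests the entry was copied verbatim from
    # the target page (title, meta description, etc.).
    import re

    def words(text):
        return set(re.findall(r"[a-z0-9']+", text.lower()))

    directory_entry = "Example Widgets - quality widgets and widget accessories at low prices"
    target_page_text = ("Example Widgets | Quality widgets and widget accessories "
                        "at low prices. Order online today.")

    overlap = words(directory_entry) & words(target_page_text)
    similarity = len(overlap) / len(words(directory_entry))
    print(f"{similarity:.0%} of the directory entry's words appear on the target page")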
Using products or services for this purpose risks getting you banned, or at least penalized, by Google.
Will this happen? I think so!
When?
I don’t know! Any time soon, next month, or next year! Nobody knows; only Google can tell!
I think Google also will look into reciprocal linking as a whole.
Maybe they will start to identify pages with outgoing links to other web sites and check which links come back from those domains.
What they like to see is spontaneous linking to your site from site owners who regard you as a valuable resource, without you linking back. I believe that they will limit the impact of reciprocal linking somewhat!
What can you do to improve your web traffic from Google without violating its guidelines?
Build web sites that give value to your visitors. Make them popular, so that others want to link to your site. Build niche, information-rich sites, either as mini sites or as larger information sites. Larger sites within a niche are given a higher popularity rating than smaller sites by Google.
If you do this, your web site will not be affected the next time Google makes a change. Unless, of course, your competitor drops out of Google, in which case your traffic will get a boost.
 
 

Google adds Tagging Feature

Google has recently introduced a new ‘Tagging’ feature which enables users to tag and comment on any webpage in their ‘My Search History’ page. This feature, which has been available in other engines for a while now, could later allow Google to apply tagging patterns to real search results. If Google opens this up in the future, letting users share their bookmarks and see bookmarking data in searches, we could see something very useful and popular.
To bookmark a page, just visit it through Google Search, find it in your Search History, and click the star icon. Then click “edit” and type in any tags under the “Labels” heading. You can even add some notes in the box underneath that. Once you’ve saved a bunch of sites, you can view them by clicking the Bookmarks heading on the left.
While some may think that tagging your own pages can be a good SEO technique, you would need many, many accounts with natural search patterns in order for such search spam to work. This prompts the question: should you take the time to open so many accounts and develop search patterns on them, or should you use the same time to create a better site that will get tagged regardless?

Tips for Higher Google Ranking

Apart from the many standard techniques for good HTML accessibility, which are essential to scoring high on Google, there is an underlying concept you can follow to greatly improve your Google ranking. But first, use HTML title tags, meta tags, and title attributes on href tags, and avoid putting important information in images (use alt attributes on image tags if you really have to). Most importantly: avoid Flash, Shockwave, Real, or MP3 (unless these formats are in addition to the HTML version of the information). Refrain from using frames or pop-up browser windows, or if you do, read the Google FAQ about how that affects your scores.
The most important rule for scoring higher on Google search engine is to make your site useful. Simply put, the best way to get your site to score higher on Google is to put some useful content there — have your site bring some benefit to someone. It is difficult to stress this enough.
Remember that the value of the web lies in the interconnection of it all. Each connection has a pointer (link) and a page that is being pointed to. A link that points to another site is only half the equation. If you point to other relevant sites, it doesn’t really help you score higher in Google. However, if other sites point to your site, those links are called “back links.” Back links will help your site score higher.
Therefore, seek to bring true benefit; not the propaganda or advertisement your marketing people have told you consumers should believe are the benefits of your product, and not the slick-looking design your CEO will brag about to his golfing buddies. When will other people link to your site on their own? Only when your site brings true benefit to the people who are surfing the web. In a nutshell: be of help to someone. It is that simple, and it is the single most important thing you can do to improve your ranking in Google.
Google determines the benefit or usefulness of a site in part by the number of “back links”. In other words, Google gauges the benefit of your site by the pages that are pointing to or linking to it. If you put something useful on your site, other sites will link to yours (especially if you ask them). And if you find this article useful, please link to it :)
Furthermore, Google scores your web pages not just by the number of links pointing to them; it also takes into consideration the scores or ranks of the sites that are pointing to your website. So your goal, as a site designer who wants to increase your Google ranking, is to persuade other high-ranking (i.e. helpful) web sites to link to your pages. And what’s the best way to do that? Put something helpful or beneficial there. Google PageRank (PR) is a good indication of the usefulness of a site: a site with a high PR on the scale of 1 to 10 has higher benefit, according to Google.
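This weighting of links by the importance of the linking page is exactly what the original PageRank calculation captures. Below is a tiny, self-contained sketch of that calculation on a made-up four-page web; the 0.85 damping factor comes from the original PageRank paper, while the graph and iteration count are purely illustrative.

    # A tiny power-iteration sketch of PageRank on a toy four-page web.
    # links[p] lists the pages that p links out to; the graph itself is made up.
    links = {
        "A": ["B", "C"],
        "B": ["C"],
        "C": ["A"],
        "D": ["C"],
    }
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    damping = 0.85  # value used in the original PageRank paper

    for _ in range(50):
        new_rank = {}
        for p in pages:
            incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new_rank[p] = (1 - damping) / len(pages) + damping * incoming
        rank = new_rank

    for p, r in sorted(rank.items(), key=lambda x: -x[1]):
        print(f"{p}: {r:.3f}")
    # C ends up highest because every other page links to it.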
So trying to scam Googlebot with simple link exchange tricks doesn’t always work. Google has methods of detecting links that are intended to trick it. Remember, Google was designed by Stanford University graduate students, so your energy is better spent simply publishing useful information than attempting to cheat their band of super Googlebots.
Other Things That Go Against the Conventional (Lack of) Wisdom of Web Design
Navigation on a web site should be made with text links and not with fancy-looking images. This is why they call it “hypertext”. It’s not called hyper-gif, hyper-Flash or hyper-image; it’s hyper-text. The information in hypertext links is crucial: Google ranks a page based on the text and information in its back links. An image which links to a page – like image navigation bars and banner advertisements – scores more poorly in Google. Why? Because information within an image is not available to Googlebot.
For instance, it’s far better to have a link pointing to my site that says something like “Google Ranking – Free Tips and Tutorials” than to have a link that says “For Google Ranking – Free Tips and Tutorials, click here.” And both of those are far better than having a .jpg image which points to that page. So when you make links, make them in text and describe in words what is on the page you are pointing to. This will help you score higher in Google’s algorithm.
That is related to another practice that goes against the prevailing (lack of) wisdom in website design. Most companies would never put links to their competitors’ websites on their own site; it is rarely done. Well, try to have a page on your website which links to all of your competitors, and make sure the page offers some benefit to the end user. (Read more about this strategy for higher ranking in Google here.) Ultimately, people searching for your competitors will also find your page. Every corporate site should therefore seek to be the leader of relevant information in its field. Why? Because people will link to the website and it will score higher in Google.
Imagine showing your boss that when you search for one of your products on the web, your competitor’s site comes back as the only result! It would freak him or her right out.
Buying Results On Google
Google has made space available for clearly marked advertisements that appear when people search for certain words. If you have the budget, you may want to purchase sponsored links for certain search words using Google AdWords. You only pay when someone actually clicks through to your site. But you should be careful to do proper keyword research first. We’ll discuss that in a future article on AdWords.
 

How to make your site Google Spider Friendly ?

Google and other search engines extract a great deal of information about a web page by examining the terms and phrases it contains. Using that data, they can learn a lot about a single page and, from a group of pages, about a site as a whole. Of course, they learn something from the frequency of certain terms and words, but that’s not the only thing they weigh.
Writing well for search engines is both an art and a science. SEOs can only make very educated guesses, since search engines guard their algorithms closely. In general, however, writing well for the search engines is very similar to writing well for your site visitors, so there is a definite relation between making a site friendly for search engine spiders and making it friendly for your visitors.
If you want to optimize your on-page text for higher search engine rankings, here are some basic rules to remember:
Make sure the main keyword or phrase for which you wish to rank higher is featured prominently on your web page. Don’t bother too much about measuring your keyword density; its importance is arguable at best. However, the general frequency of the term can help boost your rankings.
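If you want a rough sanity check that a phrase actually appears in your copy, without chasing any exact density figure, something as simple as this will do (the sample text and phrase are made up):

    # Quick sketch: count how often a key phrase appears in page copy, as a sanity
    # check that it is featured at all, rather than a precise density measurement.
    page_copy = """Search engine optimization helps visitors find your pages.
    Good search engine optimization starts with useful, well-written content."""

    phrase = "search engine optimization"
    occurrences = page_copy.lower().count(phrase)
    total_words = len(page_copy.split())
    print(f"'{phrase}' appears {occurrences} times in {total_words} words")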
Keep all of the text on your page of high quality and related to the topic. Yes, search engines like Google really do look for high-quality writing; they are capable of performing sophisticated analysis of the words on your web pages. You don’t just have to please the artificial intelligences, but also the human ones, as search engines have teams of researchers who work on identifying and describing the common elements of high-quality writing. You don’t have to be a literary genius, but remember that good writing is one of those things that both your visitors and the search engines will appreciate.
Structure your document so that it flows from broad to narrow topics. This is very important for making your site search engine as well as visitor friendly. You may start with a description of the content, so that both the spiders and your human visitors know what to expect. It improves the readability of the entire web page. There may be situations in which this would not be an appropriate way to structure the page; in such cases, you can disregard this advice.
It is also important to keep the text of your document together. Many SEO experts say that it is better to use cascading style sheets (CSS) rather than table layouts for this reason. CSS makes it easier to keep the text flow of the document together and prevents the text from being broken up by coding. You can achieve this with tables too; just ensure that text sections (i.e. content, ads, navigation, and so forth) flow together inside one table or row. You should also refrain from having too many “nested” tables that break up sentences and paragraphs.
There was a time when text layout and keyword usage and its frequency in a document were very important, but that is no longer true. Do they still make a difference? Yes, to some extent; but there is no reason to obsess over keyword placement or text layout any longer.
 

Google’s Penalty for German BMW site

Google has dropped BMW Germany from its search engine after realizing the top car manufacturer’s German website (bmw.de) was breaching its guidelines by artificially boosting its popularity ranking.
Investigations by Google found that BMW’s German website manipulated search engine results to ensure a top ranking when users searched for the keyword “used car.” Redirects using JavaScript were the reason BMW’s website was dropped from Google’s search engine. Google highlighted that this was in violation of its Webmaster Quality Guidelines, which clearly cover the issue of deceiving users or search engines by showing different content to each, also called cloaking.
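A coarse way to spot-check for this kind of behavior on your own pages is to fetch the same URL with a crawler-style User-Agent and with a browser-style one and compare the responses. This is only a rough sketch using the Python standard library and a placeholder URL; legitimate differences such as mobile redirects or personalization can also make the responses differ.

    # Coarse sketch: fetch the same URL pretending to be Googlebot and then as a
    # normal browser, and compare the responses. Large differences can hint at
    # cloaking, though legitimate personalization can also explain them.
    from urllib.request import Request, urlopen

    def fetch(url, user_agent):
        req = Request(url, headers={"User-Agent": user_agent})
        return urlopen(req).read()

    url = "http://www.example.com/"  # placeholder URL
    as_googlebot = fetch(url, "Googlebot/2.1 (+http://www.google.com/bot.html)")
    as_browser = fetch(url, "Mozilla/5.0 (Windows NT 10.0; Win64; x64)")

    print("Bytes served to Googlebot:", len(as_googlebot))
    print("Bytes served to a browser:", len(as_browser))
    if as_googlebot != as_browser:
        print("Responses differ - worth a closer manual look.")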
Google has now reduced BMW’s German PageRank to zero, so the company website no longer appears at the top. It means BMW will need to start again and build up its ranking from scratch, a very costly exercise not only in re-optimization efforts but also in lost revenue and exposure. BMW admitted using so-called “doorway pages” to boost search engine rankings, but denied any attempt to mislead users. BMW’s activities and the penalties were revealed in a blog post by Google software engineer Matt Cutts.
This should be a clear warning that, whether you are undertaking your own optimization efforts or employing a company to conduct your search engine marketing for you, you had better confirm that the techniques being used are ethical and in line with the guidelines of Google and other search engines. The message from Google is very clear: ‘Do not deceive’, and do not resort to black hat techniques such as cloaking or doorway pages.

Great Site Ranking in Google – The Secret is out !

Google recently filed a US patent which reveals a great deal of information about how it ranks your web site. Some of it may come as a surprise to you.
How many years did you register your domain name for?
If it was only one year then Google could hold that against you.
Why?
Because the majority of spam websites only register a domain name for one year. A domain name registered for a period longer than one year implies that the owner is more likely to be serious about the web site and that it is a legitimate one.
This is just one of the unusual factors possibly considered by Google when indexing and ranking a website. Factors you could never even have guessed at in some cases.
How do I know this? Google recently made public the contents of its filing of United States Patent Application 20050071741, in which many of the search giant’s secret ranking criteria are revealed, and it makes very interesting reading. You must read this article if you are serious about ranking well in Google. The days of spamming Google are almost drawing to a close; with this patent Google reveals just how hard it is coming down on spam sites. Take note of these facts if you do not want to get caught out. Listed below you will find the hard facts. You will need to refer to these each time you optimize a new site.
・Links.
It’s common knowledge that Google relies heavily on inbound relevant links to rank a site. Now they explain exactly how it works.
As well as the number, quality and anchor text of a link, Google seems to also consider historical factors. Apparently, the Google ’sandbox’ or aging delay begins counting down the minute links to a new site are discovered.
Google records the discovery of a link and how links change over time, the speed at which a site gains links, and the life span of each link. So you’d better stay away from SEO tricks which guarantee you hundreds of backlinks overnight.
With this in mind, fast link acquisition may be a strong indicator of potential search engine Spam.
So gone are the days of pages and pages full of links. You must grow your links slowly to stay below the radar, and be careful who you exchange links with. That means no more buying hundreds of links at once or other underhand tactics.
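If you want to keep an eye on your own link growth, even something as simple as logging your backlink count at regular intervals and flagging sudden jumps will do. The counts and the flag threshold below are invented purely for illustration:

    # Sketch: log your backlink count periodically and flag sudden jumps, since
    # very fast link growth can look like spam. The counts here are made up.
    weekly_backlink_counts = [40, 46, 51, 58, 240, 255]

    pairs = zip(weekly_backlink_counts, weekly_backlink_counts[1:])
    for week, (prev, curr) in enumerate(pairs, start=2):
        growth = (curr - prev) / prev
        flag = "  <- unusually fast growth" if growth > 1.0 else ""
        print(f"Week {week}: {curr} links ({growth:+.0%}){flag}")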
PR is now very valuable.
Your link anchor text should vary but remain consistent with your site content. No more using your main keywords on every link exchange you gain; that’s ‘anchor spam’. Instead, it is a good idea to vary the text around your top five to ten keywords.
Link exchanges are still very important, but you must work and utilize them ethically. If you don’t and you get caught, recovery from a Google ban can take months, and your host and IP may also be recorded.
The fact is that fewer but better-quality links will benefit you more, and they are much more likely to last over the long term, which is good too.
・Site click through rates (CTR)
Site CTR may now be monitored through the cache, bookmarks, temporary files and favorites via the Google Toolbar or desktop tools. Many have suspected for some time that sites are rewarded for good CTR with a rise in ranking, similar to how AdWords works.
CTR is monitored to see if fresh or stale content is preferred for a search result. CTR is also analyzed for increases or decreases relating to trends or seasons.
・The traffic to a web page is recorded and monitored over time.
・Web page rankings are recorded and monitored for changes.
・Sites can be ranked seasonally. A ski site may rank higher in the winter than in the summer. Google can monitor and rank pages by recording CTR changes by season.
・User behavior in general could be monitored.
・Bookmarks and favorites could be monitored for changes, additions or deletions.
As Google is capable of tracking traffic to your site you should closely monitor the small amount of copy returned in search results. Ideally you will want to integrate a call to action in there to increase your listings CTR.
Clicks away from your site back to the search results are also monitored. Make your site as sticky as possible to keep visitors there longer. As mentioned above it may also help if you could get your visitors to bookmark your site.
・The frequency and size of page updates are monitored and recorded, as is the number of pages.
Mass updates of hundreds of files will make you pop up on the Google radar.
On the other hand, few or small updates to your site could see your rankings slide, unless your CTR is good. A stale page that receives good traffic may hold its own and not require an update, so don’t update it just for the sake of it.
Depending on your market, fresh content may not be a requirement. If the information your pages contain does not go out of date, then updating may not be necessary. For example, if your market is more news-based, then regular changes are a must. In general, changes do not necessarily have to mean fresh content; they could involve simple edits to current content.
A further indicator that Google is really cracking down on Spam is made clear in the following extract from the Patent. Reference is made to changing the focus of multiple pages at once.
Here’s the quote -
“A significant change over time in the set of topics associated with a document may indicate that the document has changed owners and previous document indicators, such as score, anchor text, etc., are no longer reliable.
Similarly, a spike in the number of topics could indicate Spam. For example, if a particular document is associated with a set of one or more topics over what may be considered a stable period of time and then a (sudden) spike occurs in the number of topics associated with the document, this may be an indication that the document has been taken over as a doorway document.
Another indication may include the sudden disappearance of the original topics associated with the document. If one or more of these situations are detected, then Google may reduce the relative score of such documents and/or the links, anchor text, or other data associated the document.”
There’s still more to look out for:-
・The domain name owner’s address is considered, most likely to help in a local search result.
・Changes in keyword density are monitored and recorded, as are changes to anchor text.
・The technical and admin contact details are checked for consistency. These are often falsified for Spam domains.
・Your host’s IP address. If you are on a shared server, it’s possible somebody else on that server is using dirty tactics or spamming. If so, your site may suffer since you share the same IP.
The impression I get here is that Google has learned from the Spam ‘attack’ they suffered in early 2004 and they are determined to eradicate it from their listing results.
So what do you do?
There’s a lot to take on board and consider here. But you can’t go far wrong with your SEO if you try to grow your site as organically as possible.
If you know what you are doing you can take shortcuts. Carry on with link exchanges, but consider each site carefully and slow down in your gathering of them. Vary your anchor text. Add small amounts of good-quality content to your site regularly. Check your search engine listings and edit your site to include a call to action in them if possible. Make your site more ’sticky’ to encourage visitors to stay a while. Encourage visitors to bookmark your site. Oh, and register new domain names for at least two years.
Before you do anything remember to reference the above info first. It may just save you months of misery as your site gets banned and ‘Sand boxed’.
Overall, keep it ethical and you can’t go far wrong. Do not be tempted to spam. Stick to the guidelines above and you are much more likely to outlast and outrank your competition.

Google is updating Page Rank

In the search engine world, the Google Page Rank (PR) measure is closely watched by all those who want to get into top positions in Google’s search results. PR is one of the most important factors that Google uses to determine its ranking results.
The theory behind Google PR is quite simple. The more relevant back links a website has, the higher its PR, and accordingly the more importance Google places on its content leading to higher rankings.
However, remember that PR and back links alone don’t determine the rankings. There are many other factors that the search engine algorithms use to decide where your website will appear in the SERPs, but these two have been recognized as having a significant impact.
It appears that Google, the search titan, is updating its dreaded PageRank and has also exported the latest back link counts. Google tends to do these updates and exports periodically. Whether this is a quality-control measure or not, it often means that Google’s publicly visible figures remain unchanged for months, even though your site’s back links might be increasing (or decreasing) within their datacenters.
On discovering the latest Google PR update, we did a quick check on our Google back link count and discovered that both had been updated simultaneously. While our PageRank improved, and we even saw some nice improvements on individual pages, we noticed a drop in our exported back link count.
Does a drop in back link count mean that alarm bells should be ringing? Well, not necessarily. It does indicate that you should continue to work on your linking strategy as a continued drop in quality backlinks could eventually lead to a decline in Page Rank. But remember that you should look at not only the quantity, but also the quality of your backlinks.
The drop in back link count we experienced may have been caused by many factors beyond our control, such as linking websites closing down, or webmasters changing their sites and links. It has definitely motivated us to start supplementing the losses with new relevant links to ensure we maintain and improve our PageRank.

Importance of Sitemaps for Google Success

There are several SEO tips and tricks that help in optimizing a site, but one whose importance is sometimes underestimated is the sitemap. A sitemap, as the name implies, is just a map of your website: on one single page you describe the structure of your site, its sections, the links between them, and so on. Sitemaps make navigating your site easier, and having an updated sitemap on your site is good both for your users and for search engines. Sitemaps are thus an important channel of communication with search engines. While in robots.txt you tell search engines which parts of your site to exclude from indexing, in your sitemap you tell search engines where you’d like them to go. Sitemaps are not a new thing. They have always been part of best Web design practices, but with their adoption by search engines they have become even more important. However, it is necessary to clarify that if you are interested in sitemaps mainly from an SEO point of view, you can’t get by with only the conventional sitemap (though currently Yahoo! and MSN still keep to the standard HTML format). Google Sitemaps, for instance, uses a special XML format that is different from the ordinary HTML sitemap for human visitors.
One might ask why two different sitemaps are necessary. The answer is obvious: one is for humans, the other is for search engine spiders (for now mainly Googlebot, but it is reasonable to expect that other crawlers will join soon). In this context, it is necessary to clarify that having two sitemaps is not regarded as duplicate content. In ‘Introduction to Sitemaps’, Google explicitly states that using a sitemap will never lead to a penalty for your site.

Why Use a Sitemap ?

Using sitemaps has many advantages, not only easier navigation and better visibility by search engines. Sitemaps offer the opportunity to inform search engines immediately about any changes on your site. Of course, you cannot expect search engines to rush right away to index the changed pages, but the changes will certainly be indexed faster than if you didn’t have a sitemap.
Also, when you have a sitemap and submit it to the search engines, you rely less on external links to bring search engines to your site. Sitemaps can even help with messy internal links, for instance if you accidentally have orphaned pages or broken internal links that cannot be reached in any other way (though it is always better to fix your errors than to rely on a sitemap).
If your site is new, or if you have a significant number of new (or recently updated pages), then using a sitemap can be vital to your success. Although you can still go without a sitemap, it is very likely that soon sitemaps will become the standard way of submitting a web site to search engines. Though it is certain that spiders will continue to index the Web and sitemaps will not make the standard crawling procedures obsolete, it is logical to say that the importance of sitemaps will continue to increase.
Sitemaps also help in classifying your web site content, though search engines are by no means obliged to classify a page as belonging to a particular category or as matching a particular keyword only because you have told them so.
Having in mind that the sitemap programs of major search engines like Google are still in beta, using a sitemap might not generate huge advantages in the short run, but as search engines improve their sitemap indexing algorithms, it is expected that more and more sites will be indexed faster via sitemaps.

Generating and Submitting the Sitemap

The steps you need to perform in order to have a sitemap for your web site are simple. First, you need to generate it, then you upload it to your web site, and finally you notify Google about it.
Depending on your technical skills, there are broadly two ways to generate a sitemap: download and install a sitemap generator, or use an online sitemap generation tool. The first is more difficult, but you have more control over the output. You can download the Google sitemap generator from here. After you download the package, just follow the installation and configuration instructions in it. This generator is based on a Python script, so your Web server must have Python 2.2 or later installed in order to run it.
The second way to generate a sitemap is easier. There are many free online tools that can do the job for you. For instance, have a look at this collection of third-party Sitemap tools. Although Google says explicitly that it has neither tested nor verified them, this list is useful because it includes links to downloadable sitemap generators, online sitemap generators, sitemap plugins for popular content management systems, and so on, so you will be able to find exactly what you need.
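If you’d rather see what the file itself looks like, here is a minimal Python sketch that writes a handful of placeholder URLs in the XML sitemap format (the namespace shown is the standard sitemaps.org one; a real sitemap would list every indexable page on the site):

    # Minimal sketch of writing a sitemap.xml in the Sitemap XML format.
    # The URLs and dates are placeholders.
    from xml.sax.saxutils import escape

    urls = [
        ("http://www.example.com/", "2006-09-01"),
        ("http://www.example.com/about.html", "2006-08-15"),
        ("http://www.example.com/articles/seo-basics.html", "2006-08-20"),
    ]

    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for loc, lastmod in urls:
        lines.append("  <url>")
        lines.append(f"    <loc>{escape(loc)}</loc>")
        lines.append(f"    <lastmod>{lastmod}</lastmod>")
        lines.append("  </url>")
    lines.append("</urlset>")

    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write("\n".join(lines))
    print("Wrote sitemap.xml with", len(urls), "URLs")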
After you have created the sitemap, you need to upload it to your site (if it is not already there) and notify Google about its existence. Notifying Google includes adding the site to your Google Sitemaps account, so if you do not have an account with Google, it is high time to open one. It is also useful to know in advance that in order to add the sitemap to your account, you need to verify that you are the legitimate owner of the site.
Currently Yahoo and MSN do not support sitemaps, or at least not in the XML format used by Google. Yahoo allows webmasters to submit “a text file with a list of URLs” (which can actually be a stripped-down version of a sitemap), while MSN does not offer even that, though there are rumors that it indexes sitemaps when they are available on a site. Most likely this situation will change in the near future, and both Yahoo and MSN will catch up with Google, because user-submitted sitemaps are simply too powerful an SEO tool to be ignored.
So sitemaps are becoming increasingly important for your success in Google as well as other search engines.

What is Google Sandbox effect?

The Google Sandbox effect is an alleged restriction placed on new websites. The result is that a new site does not receive good rankings for its most important keywords and keyword phrases for a few months. Even with good content and many inbound links, a new website may still be adversely affected by the Sandbox effect. The Google Sandbox acts as a probation for new sites, probably to discourage spam sites from rising quickly, getting banned, and repeating the process. Thus the Google Sandbox is very similar to a new website being placed on probation, with its rank kept lower than expected in searches before it is given full value for its inbound links and content.
Why did Google create the Sandbox?
It is thought Google created the Sandbox filter to stop new spam sites that purchase numerous inbound links and rank highly for their keywords from the date of launch. Google apparently considers a high number of links pointing to a site from the very beginning to be suspicious, so such inbound links are not considered natural. Another possibility is that spam sites would use various tactics to rise to the top of the search results and gain heavy sales prior to being banned for violating Google’s Terms of Service, and then repeat the process continually. As a result, new sites are put on a probation period, and this effect is usually referred to as the Google Sandbox.
Is there really a Google Sandbox?
Not all SEO experts agree that Google Sandbox exists as a separate filter from other alleged Google filters. Many of them do not even agree that Google uses a system of filters at all. Skeptics believe that the phenomenon merely echoes already existing Google algorithm calculations, and the Sandbox effect is an illusion. Note that Google has all but admitted recently that the Sandbox filter is real.
Which sites are placed in the Sandbox?
While all types of new sites can be placed in the Google Sandbox, the problem appears much more frequently for new websites seeking rankings for highly competitive keywords and keyword phrases. All new sites are likely to be given a term in the Sandbox, but websites seeking rankings for keywords in high demand are probably in for a much longer stay.
Why do some sites never seem to have been in the Google Sandbox? A site can avoid the Google Sandbox for several reasons. Sites targeting non-competitive keywords and phrases are often left out of the Google Sandbox, as there is little point in applying the filter. Keep in mind, however, that even less competitive search terms can be Sandboxed, but their much shorter stay in the Sandbox often goes entirely unnoticed.
How long does a site stay in the Google Sandbox?
Stays in the Google Sandbox can vary from one to six months, with three to four months being the average time frame. Less competitive searches will be given a much shorter stay in the Sandbox, while hyper-competitive keywords will often spend six months there. The filter is gradually decreased over time and loses most of its dampening effect in about three months. However, for the most competitive search keyword phrases, the Sandbox filter might remain in full force for six months.
How do I know if a site is in the Sandbox?
If your site has a good Google PageRank and incoming links, and it shows up in search results for some secondary search phrases, but the site is nowhere to be found for the most important searches; then it is very likely the site has been placed in the Google Sandbox.
How to know if it’s the Sandbox and not a Google penalty?
If a site is suffering from a Google penalty, it would not appear in the Google search engine results pages (SERPs) even for the less important keyword searches. The site would also show no PageRank, or even a grey bar, on the Google Toolbar in the case of a Google penalty. The alleged Google Sandbox filter, by contrast, appears designed to concern itself with the more competitive keywords, as they are more likely to attract spam sites, purchased inbound links that Google deems unnatural, and more manipulation attempts in general.
Will joining Google AdWords or Google AdSense prevent being placed in the Sandbox?
Joining paid programs like Google AdWords and Google AdSense will have no effect on your site’s duration in the Google Sandbox. Those programs can provide much-needed traffic while your site remains in the Sandbox, but participation in the various Google paid advertising programs will not keep your site out of the Sandbox or shorten your stay, despite what some myths would have you believe.
Are there any other Google filters like the Sandbox?
The alleged dampening filter on new incoming links is often mistaken for the Sandbox. Many search engine optimization experts think that new incoming links are not given immediate full credit. The purpose of that gradual passing along of Google PageRank and link popularity is to discourage the purchasing of incoming links and the various linking schemes designed only to increase a site’s standing in the Google search rankings.
How to get out of Google Sandbox ?
Time is the only real escape from the Sandbox. Depending on the competitiveness of your most important keywords, that time can vary from one to six months, with three to four months being the average duration. In the meantime, continue to improve your site contents, and be prepared to make a rapid rise once the Sandbox probation ends.
What should I do while my site is still stuck in the Sandbox?
While your site is stuck in the Sandbox, it’s best to continue adding fresh keyword-rich content and new incoming links. Adding inbound links now means that any new-link dampening filter that might be in effect will have worn off later: the links will be well aged and ready to pass along their full value of PageRank and link popularity as the site comes out of the Google Sandbox.
So it is better to concentrate on adding more keyword-rich pages, and don’t forget both on-page and off-page factors. On the page, make sure your title tags match the most important keywords for that page. It is a good idea to add a site map, and be sure that all of your pages link properly to one another with appropriate link anchor text containing the keywords for the target page. Off-page link anchor text should be set up to include keywords for the receiving page as well. This will ensure that when the filter is lifted, your improved site rises rapidly to its proper place at the top of the search rankings.
Should I keep getting new links to my website while in sandbox?
The Sandbox period is an ideal time to start adding incoming links to your site. Because of the alleged new-link dampening filter, adding links while in the Sandbox addresses two filters at once: if the newly added links are indeed dampened by a filter, their full value should take effect just as your site comes out of the Sandbox. Be sure to use strong keyword-rich anchor text for your incoming links, and vary it to include several keyword combinations.
How long does it take to appear in the SERPs after leaving the Sandbox?
The length of time required to achieve your site’s proper ranking is difficult to quantify, as so many variables are taken into consideration. If you have been adding inbound links with well-chosen anchor text from relevant websites, your rise will be much faster than that of someone who has not continued to add incoming links. Constantly adding keyword-rich content will also assist your site’s ascent to search prominence. Of course, the more competitive the keywords you are targeting, the longer and harder the climb.
How can I avoid being placed in the Sandbox ?
The Sandbox can be avoided to a degree by putting the site live before it is fully ready for prime time. While the site will initially receive low rankings, it will start the clock ticking on its Sandbox duration. Be sure to add as many incoming links as possible to get past the alleged new-link dampening filter, and keep adding keyword-rich content to your site. Anything that can be done to speed up your site’s appearance on the internet, including the purchase of an already existing domain, should be considered. With proper time management, a site can avoid the Google Sandbox entirely.

Matt Cutts answers Google questions

Google’s Matt Cutts discussed some Google and SEO myths and facts via Google Video. You can check Matt Cutts’ blog for his posts answering questions about Google.
You can also watch them directly by going to Google Video. You can navigate to all the videos from the menu at the right.
Transcripts of these videos will be posted on the Google Success site soon for the benefit of viewers from countries with slower internet connections.

Google’s New Patent

There is always great interest within the search engine marketing community whenever a search engine files for a patent. Search engine giant Google was granted a new patent last Tuesday, August 22, 2006 . The title is “System and method for supporting editorial opinion in the ranking of search results“. And here is a brief summary of the patent.
US patent 7096214 : For each web page/site identified as favored and non-favored, the editors may determine an editorial opinion parameter for that site… For each web page in the result set that is associated with one of the web sites in the set of affected web sites, the server may determine an updated score using an editorial opinion parameter for that web site.
Some of Google’s patents in the past have given us insight into the influence of anchor text, fresh content, themes, data history, link popularity, user behavior, and domain-related information. However, Google’s most recent patent application shows a shift from focusing on algorithm-based process to the integration of a human editorial process. Ultimately, Google is striving to create the best possible search results for their visitors. This patent proposes one possible method for doing that.
As of now, search engine algorithms have reached their peak. We’ve known for quite some time that an algorithm-based search engine can never deliver excellent results on its own. Why, you might ask? Simply because there will always be people out there trying to outdo the system by reverse-engineering it.
Due to this problem, a number of solutions have evolved. One of these is social search engines, which rank their results based on the wisdom of people. Another solution to arise from this problem is a human editorial process.
And now, Google in its new patent application has proposed a hybrid mechanism which combines algorithmic search with a human based editorial process. By integrating editorial opinion, they are looking to enhance the quality of their search results.
The new Google patent describes the process of identifying favored and non-favored sources in order to improve search results.
Favored Sources: Websites that are identified as being useful or containing authoritative content on the desired topic.
Non-Favored Sources: Websites that are identified as sources of misinformation or over-promotion on that particular topic.
To summarize, Google is trying to patent a system for identifying good sites and bad sites in order to rank them accordingly in the SERPs. They have proposed a semi-automatic system for determining favored and non-favored sources.
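Read that way, the patent summary amounts to a per-source adjustment applied on top of the normal query score. The toy sketch below illustrates the idea; the multipliers, scores, and domains are my own invented numbers, not anything disclosed in the patent.

    # Toy illustration of the idea in the patent summary: boost results from
    # favored sources and demote non-favored ones. All values are invented.
    algorithmic_scores = {
        "example-authority.org/widgets": 0.72,
        "example-blog.com/widgets-review": 0.70,
        "spammy-directory.example/widgets": 0.75,
    }
    editorial_opinion = {
        "example-authority.org": 1.2,      # favored source
        "spammy-directory.example": 0.5,   # non-favored source
    }

    def adjusted(url, score):
        domain = url.split("/")[0]
        return score * editorial_opinion.get(domain, 1.0)

    ranked = sorted(algorithmic_scores.items(), key=lambda kv: -adjusted(*kv))
    for url, score in ranked:
        print(f"{adjusted(url, score):.2f}  {url}")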
Thus an online marketer should also put more thought into the quality of the pages his site delivers as a whole. This is one of the very few patents that refers to a site as a whole rather than to individual pages.
For the smart Google SEO, this should not change his methods. As always, quality content will be the key. If you are providing your visitors with relevant, quality content, then the search engines will reward you.
It will be interesting to see how Google evolves over the next few years. Algorithm based search results will continue to be problematic because there will always be those who try to beat the system. Implementing some sort of human editorial opinion into the ranking process seems inevitable.
This is also true for all of the major search engines. Yahoo, Google, MSN, and Ask Jeeves must all provide quality search results to compete in this industry. To be truly successful, they will have to go beyond algorithm-based results to deliver the most value for their visitors.
Source : Site Pro News