SEO is Not About Tricking Google

Let me repeat that: SEO (search engine optimization) is not about tricking Google. In fact, trying to trick Google is the easiest and fastest way to de-optimize a site.

If you or your developer have come up with a clever trick that you think will get Google to rank you higher than you would normally deserve, think very carefully before using it, and ask yourself why you are doing it.

If the answer to that question is, “because I want to trick Google by intentionally feeding it information that I believe will give me a higher ranking in the search results”, then do not do it.

No doubt others have already thought of that idea, tried it, and failed. We rarely see a website with low relevancy listed higher in the search results than it deserves because of manipulations and tricks.

And this is the way Google and users want it. Google does not want websites intentionally deceiving its ranking algorithm; it wants to rank a site on the merits of its content.

Techniques that intentionally try to deceive Google are known as black hat techniques. The term “black hat” comes from old cowboy movies, in which the villain always wears a black hat.

Conversely, “white hat” techniques, like those used by Market Pro Media in West Palm Beach, are legitimate ways to optimize (but not trick) Google’s ranking of your site. These methods tend to enhance the user experience as well.

Google attracts some of the top software engineers, because, well, it’s Google and people want to work there. It is safe to assume they can detect the use of black hat techniques. Some of these unethical black hat methods include “keyword stuffing”, “hidden text”, “cloaking”, and others.

Keyword stuffing is packing an unnatural excess of keywords into the content and meta tags. Hidden text is text set in the same color as the background so that it is invisible to users but visible to search engines; it is an intentional attempt to feed specific terms to the search engine. Cloaking is similar, except that it feeds entire pages to the search engine: depending on whether the page is requested by a human or a bot, a different page is served.
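For illustration only (the stuffed keywords are made up), the hidden-text trick can be as crude as the snippet below: the paragraph is styled in the same color as the page background, so visitors never see it, but a crawler reading the HTML still does.

```html
<!-- Illustration of the black hat "hidden text" trick described above; do not use this. -->
<body style="background-color: #ffffff;">
  <p style="color: #ffffff;">olive oil cheap olive oil best olive oil deals buy olive oil</p>
</body>
```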

In February 2006, Google detected that Germany’s BMW.de website had used cloaking. The penalty was severe: the site was completely removed from the Google index and its PageRank dropped to zero. The company apologized and corrected the problem, and Google reinstated it.

This is not the only case of Google banning a site; there have surely been many others. It is simply one of the most famous cases talked about in SEO circles.

In short, do not try to deceive Google. It is not worth it. And that is not what SEO is about.

Bing SEO Tips, Bing SEO Strategies

When you run the same search terms in Bing and then in Google, you will get different results. The number one site on Google is not the number one site on Bing and vice versa. That indicates that the two search engines run slightly different search algorithms.

In this article we are going to talk about search engine optimization (SEO) tactics that are specific to Bing. We compare Bing to Google because Google is the benchmark: it is the leading search engine, handling about 65% of the world’s queries. We want to learn how optimizing for Bing differs from optimizing for Google.

Although both engines look at the same major SEO factors when determining rank in the search results page, each may weigh those factors differently. And certain minor SEO factors may be considered by one engine but not the other.

The problem is that since neither search engine reveals its algorithm, it is difficult to determine how the SEO factors are treated differently. Most hypotheses come from experiments: running the same search terms in both engines and analyzing the top results of each.

Here is what some people have surmised is going on…

We ran the term “SEO Services” on both engines and found that the top result in Bing is an older site; whereas the top result on Google is younger but has more link popularity and diversity of inbound links.

Now consider the type of pages that are linking to the top result. In Bing, the pages that are linking to its top result are pages where the keywords are in the title. Whereas in Google, the pages that are linking to its top result are pages where the keywords are more in the body of the page.

Pandia.com suggests several SEO factors that are favorable in the eyes of Bing:

Domain age

Original content

Keyword rich titles

Outbound links

And it appears that Bing looks at backlinks more than at the page content.

It is interesting to note that both SEOWizz and Pandia found that Bing values older domains.

These experiments may be suggestive of differences between Bing and Google, but more tests are needed before we can say definitively that the differences are real. Many people question how significant these factors really are; some believe not much. They suggest it is fine to stick with the strategy of optimizing for Google and let the rest take care of itself.

Currently, Bing is not a game changer. It takes in only 13.6% of searches, behind Google and Yahoo, although its share continues to rise slowly.

The SEO techniques we have already learned (such as obtaining incoming links of good quality, quantity, and diversity) still apply to both engines. Putting keywords in your title and body still applies to both engines. And all the major SEO factors we are familiar with still apply to both.

Backlinks – Get Quality Backlinks

A web page will rank better if a greater number of backlinks point to it. When a search engine sees that a lot of sites are linking to your web page, it concludes that your page is valued and is therefore likely what web searchers are looking for. Loosely interpreted, each backlink is a vote for your web page. Some people make the analogy that a backlink is like “web currency”: the more, the better.

A link from a site whose content is similar to your web page will be more valuable. For example, if your web page is about the health benefits of olive oil, then a link from a health website is more valuable than a link from a pet store website (all other factors being the same).

The location of the link is also important. If the backlink is embedded in the main text content, it is more valuable. The backlink is less valuable when it sits in a sidebar or footer amid a bunch of random, unrelated links.

The text of the link can also be a factor. Keeping with the example, suppose the backlink points to your web page about olive oils. If the link text is “olive oils”, that is better than if the link text is “click here” or “yummy food”, for example.

It is important to note that an anchor tag in HTML can carry a “rel” attribute. If that attribute is set to “nofollow”, the search engine is instructed not to follow the link and will ignore it in its ranking algorithm, making the link worthless for SEO. Therefore, when obtaining links from other webmasters, make sure they are not putting rel="nofollow" in their links.
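For example (example.com stands in for your site, reusing the olive oil page from earlier), the two links below differ only in the rel attribute:

```html
<!-- Ignored by the ranking algorithm: rel="nofollow" tells the crawler not to follow or count it. -->
<a href="https://www.example.com/olive-oil" rel="nofollow">olive oils</a>

<!-- Can pass ranking value, and the anchor text describes the target page. -->
<a href="https://www.example.com/olive-oil">olive oils</a>
```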

In short, if you want a web page to rank well in a search engine results page, you want a large number of high-quality backlinks from other sites to your page. The best backlinks come from high-ranking, authoritative sites that are relevant to your page, with the link placed within a sentence in the main text content area and with link text relevant to your page.

Some Ways to Get Backlinks

You can get quality backlinks through directory submissions. This may take a long time, as submissions may have to be approved based on the quality of the site and its relevance.

You can promote your blog via RSS feed directories (Technorati, Feedage, MyBlogLog, etc.).

Answer related questions on Yahoo Answers and provide your link along with the answer.

Create a blog on Blogger or WordPress (or both!) about your site’s topic and link back to relevant pages of your main site.

Things You Should Know Before Making Your SEO Plan

Google has been updating its algorithm frequently over the last year. A few weeks ago it announced the Panda 4.1 update, which has already affected some 30% of queries, and on 17 October 2014 it rolled out the new Penguin 3.0 update, which it describes as a refresh of Penguin.

These updates may look like a hard nut to crack, but in fact they will not be a problem for you if you know SEO properly. Below are some indispensable points you should know before making your SEO strategy.

Quality Content

Content is the most vital factor and has controlled search engine ranking from the beginning. Five or six years ago, duplicate content was widely used to promote websites, which troubled Google a great deal because spam content was rising rapidly. As a result, users did not get the results they wanted from the search engine. To give users better search results and kick out the spam, Google turned its attention to quality content and began to distinguish quality content from ordinary content.

In the recent Google Panda 4.1 update, Google has indicated that good content is unique content that includes things like an infographic, a video, an image, or genuinely useful information. It has also said that thin, poor content will be treated as spam.

User-Friendly Design

This is another matter Google has been emphasizing for the past year, and it is now turning it into a ranking factor. At Search Marketing Expo East, Google engineer Gary Illyes stressed making websites friendly for desktop, tablet, and mobile, because Google is trying to add user experience (UX) signals to its ranking factors.
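As a minimal sketch of what this means in markup (mobile friendliness involves far more than one tag), a responsive page typically declares a viewport in its head section:

```html
<!-- A common first step toward a mobile-friendly page: let the layout scale to the device width. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```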

If your website is compatible with all devices, Google will give you a better ranking than competitors who lack this compatibility.

XML Sitemap and RSS Atom Feed

An XML sitemap is very important for a website, as Google makes clear in its guidelines, because it helps Google and other search engine crawlers crawl your website.
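For reference, a minimal sitemap following the sitemaps.org protocol looks like the sketch below; the URL and date are placeholders.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap.xml; list one <url> entry per page you want crawled. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/olive-oil</loc>
    <lastmod>2014-10-17</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```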

On the other hand, an RSS or Atom feed also helps search engine crawlers index all the pages of a website, and Google has recently announced that a webmaster can use both an XML sitemap and an RSS/Atom feed together for optimal indexing.
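As a hedged illustration (titles, URLs, and dates are invented), a bare-bones Atom feed announcing a new page might look like this:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Minimal Atom feed; crawlers can poll it to discover recently updated pages. -->
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Example Site Updates</title>
  <link href="https://www.example.com/"/>
  <id>https://www.example.com/</id>
  <updated>2014-10-17T00:00:00Z</updated>
  <entry>
    <title>Health Benefits of Olive Oil</title>
    <link href="https://www.example.com/olive-oil"/>
    <id>https://www.example.com/olive-oil</id>
    <updated>2014-10-17T00:00:00Z</updated>
  </entry>
</feed>
```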

Title Tag and Meta Description Still Important

The title tag and meta description are still important for SEO, even though some experts think their significance has decreased a little. These two tags are the basics of SEO. The meta keywords tag is not important to Google, but some other search engines still treat it as a ranking factor. The title tag and meta description matter for both click-through rate (CTR) and SEO, because they tell visitors and search engines what your web page is about, which attracts both to your website.
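As a small sketch (the page name reuses the olive oil example and example.com is a placeholder), both tags live in the page's head section:

```html
<head>
  <!-- Shown as the clickable headline in search results; keep it descriptive and relevant. -->
  <title>Health Benefits of Olive Oil | Example Site</title>
  <!-- Often shown as the snippet under the headline; write it for searchers, not just crawlers. -->
  <meta name="description" content="A practical look at the health benefits of olive oil and how to choose a good one.">
</head>
```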

Rich Snippet

A rich snippet is a markup process used to give search engines additional information about a page's content. When you search Google for a topic, you will see some results that include an image, a video, or ratings. This markup also increases the value of your content, because it gives the search engine a clearer idea of what the content is about, and research has shown that using rich snippet markup increases a page's click-through rate.
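As a sketch of what such markup can look like (schema.org structured data is one of the formats Google reads; the product name, rating, and review count are invented), a product page might embed:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Extra Virgin Olive Oil",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "27"
  }
}
</script>
```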

Many webmasters have started spamming through rich snippets. Google has noticed this and has already decided that excessive use or manipulation of rich snippets will bring a penalty.

PageRank is Dead?

Many SEO professionals believe that PageRank is dead, even though Google invented the system. Over the last couple of years, trading in high-PageRank backlinks grew surprisingly fast, which fueled a surge in spammy content, links, and websites.

To prevent this, Google has not updated PageRank for the last year, which leads many SEO experts to believe that Google Toolbar PageRank is dead. I had a website, onlinemarketingideas.info, whose PR was 2 in June 2014 and which had plenty of online marketing content, even though I had not updated it for the previous eight months. I sold it in July 2014 and the new owner removed all the content, but its PageRank is still unchanged.

I believe Google is holding off on PageRank updates and pretending to have no interest in this factor in order to stop the trading of high-PageRank backlinks, but in reality it will announce a refreshed PageRank update to control SEO ranking. At this moment you do not need to chase PageRank, but you should always be prepared for it.

Webmaster Tools Is Not a Big Deal

Many website owners believe that creating an account in Google Webmaster Tools is essential. That is simply wrong: Google's bots will crawl your website whether or not you create a Webmaster Tools account.

It has no effect on your ranking; it only shows you internal issues with your website, such as design problems and broken links where the bots fail to crawl. You can find the same issues using other tools without giving Google access to your website. So why share that data with Google if you can do the research with other tools and keep your privacy?