Old SEO Methods You Need to Stop

If the word "EzineArticles" falls out of your mouth when you start talking about your SEO, you should just stop right there. Even before the Panda update, this really wasn't the best use of your time and essentially was just a way to get some quick links back to your site.

Article Submissions
Press Releases without News
Reciprocal Linking & Link Exchanges
Ignoring Design

I have one last bone to pick with something that a lot of people don't think of when it comes to SEO, but it fits and I'm tired of running into it. You know that website that you have? Oh, you helped design it? Yes, I see how that menu looks all cool. Flash, you don't say?

Your website can say a lot about you and your brand. I know a lot of people pay good money for a site, only to have it turn out ugly as sin, non-functional, or not search engine friendly. This has been going on since I started working online and it's not bound to stop anytime soon, so I'm pleading with you to save yourself the time, money, and headache:

The next time you go to build a site, hire a designer. Shop around. But when you find one that's good, trust in them and their design work. They do this for a living and you don't.
Guide your designer, but don't do the designing. You didn't hire them just to operate tools you don't understand; give them the freedom to make something great for you.
At the same time, you should also hire a great SEO. This person will work with the designer and the developer to ensure that what you end up with not only meets your needs but also does everything it needs to do for the search engines; a quick sanity check for one common pitfall is sketched below.
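To make the "search engine friendly" point concrete: a navigation menu built entirely in Flash (or injected only by script) leaves no plain links in the HTML a crawler fetches, so crawlers can struggle to discover your pages. Here is a minimal sketch of that check, assuming the third-party requests and beautifulsoup4 packages and a placeholder URL:

```python
# A quick sanity check (not an official tool): count the plain <a href> links
# present in the HTML the server returns, i.e. what a crawler sees before any
# Flash or JavaScript runs. Assumes the third-party "requests" and
# "beautifulsoup4" packages; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

def count_crawlable_links(url: str) -> int:
    """Return the number of ordinary anchor links in the served HTML."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return len(soup.find_all("a", href=True))

if __name__ == "__main__":
    print(count_crawlable_links("https://example.com"))  # placeholder URL
```

If the count comes back near zero while your visible menu is full of items, the navigation probably isn't being served as crawlable HTML.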

Google's New Guidance on Building High-Quality Sites

In recent months we’ve been especially focused on helping people find high-quality sites in Google’s search results. The “Panda” algorithm change has improved rankings for a large number of high-quality websites, so most of you reading have nothing to be concerned about. However, for the sites that may have been affected by Panda we wanted to provide additional guidance on how Google searches for high-quality sites.

Our advice for publishers continues to be to focus on delivering the best possible user experience on your websites and not to focus too much on what you think are Google's current ranking algorithms or signals. Some publishers have fixated on our prior Panda algorithm change, but Panda was just one of roughly 500 search improvements we expect to roll out to search this year. In fact, since we launched Panda, we've rolled out over a dozen additional tweaks to our ranking algorithms, and some sites have incorrectly assumed that changes in their rankings were related to Panda. Search is a complicated and evolving art and science, so rather than focusing on specific algorithmic tweaks, we encourage you to focus on delivering the best possible experience for users.

What counts as a high-quality site?

Our site quality algorithms are aimed at helping people find "high-quality" sites by reducing the rankings of low-quality content. The recent "Panda" change tackles the difficult task of algorithmically assessing website quality. Taking a step back, we wanted to explain some of the ideas and research that drive the development of our algorithms.

Below are some questions that one could use to assess the "quality" of a page or an article. These are the kinds of questions we ask ourselves as we write algorithms that attempt to assess site quality. Think of it as our take at encoding what we think our users want.

Of course, we aren't disclosing the actual ranking signals used in our algorithms because we don't want folks to game our search results; but if you want to step into Google's mindset, the questions below provide some guidance on how we've been looking at the issue:

•    Would you trust the information presented in this article?
•    Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
•    Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
•    Would you be comfortable giving your credit card information to this site?
•    Does this article have spelling, stylistic, or factual errors?
•    Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
•    Does the article provide original content or information, original reporting, original research, or original analysis?
•    Does the page provide substantial value when compared to other pages in search results?
•    How much quality control is done on content?
•    Does the article describe both sides of a story?
•    Is the site a recognized authority on its topic?
•    Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
•    Was the article edited well, or does it appear sloppy or hastily produced?
•    For a health related query, would you trust information from this site?
•    Would you recognize this site as an authoritative source when mentioned by name?
•    Does this article provide a complete or comprehensive description of the topic?
•    Does this article contain insightful analysis or interesting information that is beyond obvious?
•    Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
•    Does this article have an excessive amount of ads that distract from or interfere with the main content?
•    Would you expect to see this article in a printed magazine, encyclopedia or book?
•    Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
•    Are the pages produced with great care and attention to detail vs. less attention to detail?
•    Would users complain when they see pages from this site?

Writing an algorithm to assess page or site quality is a much harder task, but we hope the questions above give some insight into how we try to write algorithms that distinguish higher-quality sites from lower-quality sites.

What you can do

We've been hearing from many of you that you want more guidance on what you can do to improve your rankings on Google, particularly if you think you've been impacted by the Panda update. We encourage you to keep questions like the ones above in mind as you focus on developing high-quality content rather than trying to optimize for any particular Google algorithm.

One other specific piece of guidance we've offered is that low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low quality pages to a different domain could eventually help the rankings of your higher-quality content.
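As a rough illustration of that cleanup, the sketch below flags unusually short pages as candidates to remove, merge, or improve. The 300-word threshold, the URL list, and the use of the requests and beautifulsoup4 packages are all assumptions for the example, not anything Google prescribes.

```python
# A rough content-audit sketch, not anything Google prescribes: flag unusually
# short pages as candidates to remove, merge, or improve. The 300-word
# threshold and URL list are illustrative assumptions. Assumes the third-party
# "requests" and "beautifulsoup4" packages.
import requests
from bs4 import BeautifulSoup

THIN_WORD_COUNT = 300  # arbitrary cutoff; tune it for your own site

def visible_word_count(url: str) -> int:
    """Count the words a reader actually sees, ignoring scripts and chrome."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "nav", "footer"]):
        tag.decompose()  # strip non-content markup before counting
    return len(soup.get_text(separator=" ").split())

pages = ["https://example.com/page-1", "https://example.com/page-2"]  # placeholders
for url in pages:
    words = visible_word_count(url)
    if words < THIN_WORD_COUNT:
        print(f"Review (thin, {words} words): {url}")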

We're continuing to work on additional algorithmic iterations to help webmasters operating high-quality sites get more traffic from search. As you continue to improve your sites, rather than focusing on one particular algorithmic tweak, we encourage you to ask yourself the same sorts of questions we ask when looking at the big picture. This way your site will be more likely to rank well for the long-term. In the meantime, if you have feedback, please tell us through our Webmaster Forum. We continue to monitor threads on the forum and pass site info on to the search quality team as we work on future iterations of our ranking algorithms.

Google-Suggested Keyword Mapping for Designers, Developers, and SEOs

If you’re planning or managing search engine optimization (SEO) then you’re going to have a list of target keywords. Each target keyword needs to be 'mapped' to its own page on your site. Mark Nunney shows you how to use keyword mapping to display a wealth of information to help you plan, monitor and report on your SEO.

Google's search engine results list pages and not websites.

And your site's pages have a better chance of ranking highly for your target keywords if each page focuses its SEO on one primary keyword.

So you must map each of your target keywords to a page on your site.

You need to decide which page should be used for which target keyword.

The image below illustrates a simplified example of target keywords being mapped to different types of pages on a site:

The pages might already exist or they might be new pages that need to be created.

And of course you need to keep a record of that: a list of your target keywords and their target pages' URLs.



On that record, why not add some useful information for each keyword, such as the following (a simple sketch of such a record appears after this list):

•  How many visits they brought in the last month and year
•  Conversion rates for your site's wanted responses
•  Your site’s rank on Google’s search engine results page (SERP)
•  Google’s estimate of the number of searches made on search engines
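
As promised above, here is a minimal sketch of what that keyword-map record might look like, kept as a CSV with one row per target keyword. Every column name and figure in it is a made-up placeholder.

```python
# A minimal sketch of a keyword map kept as a CSV: one row per target keyword,
# one target page per row, plus the tracking columns listed above. Every
# column name and figure here is a made-up placeholder.
import csv

FIELDS = [
    "keyword", "target_url", "visits_last_month", "visits_last_year",
    "conversion_rate", "google_rank", "estimated_monthly_searches",
]

rows = [
    {
        "keyword": "trail running shoes",
        "target_url": "https://example.com/trail-running-shoes/",
        "visits_last_month": 420,
        "visits_last_year": 5100,
        "conversion_rate": 0.021,
        "google_rank": 8,
        "estimated_monthly_searches": 12000,
    },
]

with open("keyword_map.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
```

A spreadsheet works just as well; the point is one row per keyword, one target URL per row, plus whatever tracking columns you care about.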

Google's Page Layout Algorithm Update - 2012

In our ongoing effort to help you find more high-quality websites in search results, today we’re launching an algorithmic change that looks at the layout of a webpage and the amount of content you see on the page once you click on a result.

As we’ve mentioned previously, we’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away. So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience. Such sites may not rank as highly going forward.

We understand that placing ads above-the-fold is quite common for many websites; these ads often perform well and help publishers monetize online content. This algorithmic change does not affect sites that place ads above-the-fold to a normal degree, but affects sites that go much further and load the top of the page with ads to an excessive degree, or that make it hard to find the actual original content on the page. This new algorithmic improvement tends to impact sites where there is only a small amount of visible content above-the-fold or relevant content is persistently pushed down by large blocks of ads.

This algorithmic change noticeably affects less than 1% of searches globally. That means that in less than one in 100 searches, a typical user might notice a reordering of results on the search page. If you believe that your website has been affected by the page layout algorithm change, consider how your web pages use the area above-the-fold and whether the content on the page is obscured or otherwise hard for users to discern quickly. You can use our Browser Size tool, among many others, to see how your website would look under different screen resolutions.
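
If you want a rough, do-it-yourself measurement rather than an eyeball check, the sketch below estimates how much of the first screenful is covered by iframes, a crude stand-in for ad slots since most display ads render in iframes. It assumes the third-party playwright package, a 1366x768 viewport as the "fold", and a placeholder URL; it is not Google's page layout algorithm.

```python
# A rough do-it-yourself estimate, not Google's page layout algorithm: measure
# how much of the first screenful is covered by iframes, a crude stand-in for
# ad slots. The 1366x768 "fold", the iframe heuristic, and the URL are all
# assumptions. Assumes the third-party "playwright" package (plus a one-off
# "playwright install" to download a browser).
from playwright.sync_api import sync_playwright

VIEWPORT = {"width": 1366, "height": 768}  # treated as the fold for this check

def above_fold_iframe_fraction(url: str) -> float:
    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page(viewport=VIEWPORT)
        page.goto(url, wait_until="load")
        covered = 0.0
        for frame in page.locator("iframe").all():
            box = frame.bounding_box()
            if box is None or box["y"] >= VIEWPORT["height"]:
                continue  # not rendered, or entirely below the fold
            width = min(box["x"] + box["width"], VIEWPORT["width"]) - max(box["x"], 0)
            height = min(box["y"] + box["height"], VIEWPORT["height"]) - max(box["y"], 0)
            covered += max(width, 0) * max(height, 0)
        browser.close()
    # Overlapping frames are double-counted, so treat this as a rough signal.
    return covered / (VIEWPORT["width"] * VIEWPORT["height"])

if __name__ == "__main__":
    share = above_fold_iframe_fraction("https://example.com")  # placeholder URL
    print(f"About {share:.0%} of the first screen is covered by iframes")
```

Not every iframe is an ad and not every ad is an iframe, so treat the number as a prompt to look at the page, not a verdict.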

If you decide to update your page layout, the page layout algorithm will automatically reflect the changes as we re-crawl and process enough pages from your site to assess the changes. How long that takes will depend on several factors, including the number of pages on your site and how efficiently Googlebot can crawl the content. On a typical website, it can take several weeks for Googlebot to crawl and process enough pages to reflect layout changes on the site.

Overall, our advice for publishers continues to be to focus on delivering the best possible user experience on your websites and not to focus on specific algorithm tweaks. This change is just one of the over 500 improvements we expect to roll out to search this year.

Matt Cutts