
Every year there’s a special gathering of SEO experts to discuss the latest and greatest in optimization techniques. The most recent one, “SMX Advanced 2013,” was held in June, and this year’s keynote speaker, Google’s search expert Matt Cutts, gave some interesting insight into what Google looks for in a website so that it will rank better in its search engine.

During past public appearances at SEO events, Cutts’s usual advice for webmasters looking to rank better in Google was always the same: provide compelling content and design your site for your visitors, not the search engines. However, when questioned further about what makes a great website, Cutts made four interesting recommendations:

  1. Make a great site.
    A great site is one that’s been optimized for the user experience: it’s fast, it has a good mobile version, it offers useful information, and it’s conversion optimized and easy to use. These were the themes repeated over and over again throughout the conference, and it’s the topic covered in detail in this month’s Back to Basics article, SMX Advanced Non-Negotiables: Social Signals, Mobile Search and User Experience.
  2. Don’t take shortcuts.
    Google continues to test and roll out new algorithm signals to detect spam. One example: the “pay day loans” query space in the UK got a jolt the day of Cutts’s keynote, as the Google web spam team deployed a means of detecting and demoting spam from payday loan businesses in search results. Penguin (link spam) and Panda (low-quality content) updates continue to roll out, but don’t expect announcements of refreshes. Google no longer feels it necessary to announce every update to its algorithm, or to name updates, for that matter. During the keynote, Cutts said he wanted to avoid “update name inflation.” When updates and refreshes work toward the goal of elevating relevant, quality results and devaluing manipulative, low-quality results, does it really matter what the algorithm tweaks are called?
  3. Pay attention to what Google tells you.
    During the keynote, Cutts announced that Google now gives example URLs to webmasters who are notified of a manual action in Google Webmaster Tools. For several years now, Cutts has said that Google is making open communication a priority. In this case, website owners who have received a manual action, meaning their sites were hand-flagged for violating Google’s guidelines, will be given a few examples of the bad behavior Google sees. In a Webmaster Help video, Updated messages for manual webspam actions, Cutts’s example suggests a page that has been hacked, perhaps with something like a link injection.
  4. Tell everyone.
    Once you have a great site, tell everyone about it. A great site is one that users love, bookmark, tell their friends about, and come back to over and over again: all the things that make a site compelling (Cutts’s favorite word for describing great content).

If you promote your business or personal website via the search engines, your most immediate marketing goal should be to work hard for your website users by creating compelling Web content. Do this, and you can expect Google to be your ally in that pursuit, working hard alongside you to show your high-quality content to users.

“Good SEO is usually just good usability”
I recently read this sentence on a blog – and yeah, it holds a lot of truth!
I have seen websites stuffed with keywords and loaded with all kinds of sneaky tricks. Besides the fact that most attempts to fool Google just don’t work any more, you must not forget that your site is for real human beings, not for search engines. Your visitors have to like your site! If they love it, they’ll link to it! Don’t make usability sacrifices for SEO; usability and SEO go hand in hand. Following basic accessibility standards ensures that your site can be crawled and indexed by search engines, but as we know, that isn’t enough. Applying usability principles such as good page markup, proper use of headers and titles, and sound information architecture can give you a boost in search visibility. In fact, a lot of the things search engines look for, even off-site, can be traced back to end user needs.

Let’s take another example: say there are two sites, both with the same keywords and the same number of inlinks. Which will rank higher? Chances are good that the page that loads faster will be preferred. Use tools like Yahoo! YSlow or Google Page Speed to check whether you could optimize the page loading speed.
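If you just want a rough number before reaching for those tools, you can time a full page fetch yourself. Here’s a minimal sketch in Python, using only the standard library; the URL is a placeholder, and this is only a crude proxy for what YSlow and Page Speed actually measure:

```python
import time
import urllib.request

def average_fetch_time(url, runs=3):
    """Fetch a URL several times and return the average time in seconds."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as response:
            response.read()  # download the full body, not just the headers
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings)

if __name__ == "__main__":
    # Placeholder URL; point this at your own pages.
    avg = average_fetch_time("https://example.com/")
    print(f"Average fetch time over 3 runs: {avg:.2f}s")
```

A consistently slow average is a hint worth chasing with the full tools, which also grade caching headers, image sizes, script placement and more.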

And last but not least, validate your site’s HTML or XHTML. Browsers can render pages with errors, and a normal visitor might not notice anything, except perhaps that the page loads a little more slowly than others. But for search engine bots, some validation errors can cause the bot to “see” only half the page and miss the rest of the content. So use the w3.org Validator and improve your SEO usability.
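If you’d rather script the check than paste pages into the web form, the validator can also return machine-readable results. Below is a minimal Python sketch against the W3C Nu checker’s JSON output; the endpoint and parameters reflect its public interface, but treat the exact details as assumptions and confirm them in the service’s documentation:

```python
import json
import urllib.request

def html_validation_errors(html_bytes):
    """Send raw HTML to the W3C Nu checker and return the reported errors."""
    request = urllib.request.Request(
        "https://validator.w3.org/nu/?out=json",  # assumed public JSON endpoint
        data=html_bytes,
        headers={
            "Content-Type": "text/html; charset=utf-8",
            # Some public services reject the default library User-Agent,
            # so send an identifiable one.
            "User-Agent": "seo-usability-check/0.1",
        },
    )
    with urllib.request.urlopen(request) as response:
        report = json.load(response)
    # Each message carries a "type" ("error" or "info") and a description.
    return [m for m in report.get("messages", []) if m.get("type") == "error"]

if __name__ == "__main__":
    page = (b'<!DOCTYPE html><html lang="en"><head><title>Test</title></head>'
            b'<body><img src="logo.png"></body></html>')  # img is missing alt
    for error in html_validation_errors(page):
        print(error.get("lastLine"), error.get("message"))
```

Running something like this against your key templates after every deploy catches the kind of markup breakage a human visitor never sees but a crawler trips over.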

First came Boston, then Cassandra, Dominic, Esmeralda, Fritz, Florida, Bourbon, Big Daddy, Buffy, Vince, Caffeine, Panda, and finally Penguin. These are all names of updates to the Google algorithm, which helps determine its SERP (a.k.a. search engine results page) listings. Few people outside the SEO world knew or cared about these updates, but perhaps more should have, because they’re all attempts by Google to improve how we access information in our ultra-connected 21st-century world.

Google’s Vision For SERPs
When Google started back in the fall of 1998, cofounders Larry Page and Sergey Brin had a vision of sorts to “organize the world’s information and make it universally accessible and useful”, or so they say. Hyperbole aside, what was truly astounding was not their vision but rather their practice of making Google the first search engine to use links to determine how well a given website ranks in its SERPs, rather than relying on the older method of on-page criteria such as meta tags, body text, headlines, etc. In the late 1990s, this was somewhat ‘revolutionary’, but 14 years later, it’s standard operating procedure among all the search engines.

“SERP-BRIN-Time Sergey; SERP-BRIN-Time”
As someone who’s had his nose to the grindstone since the beginning of the Web, I can personally attest to the vast improvement in the results search engines are returning these days, and Google is the main reason for it. Why? Because WWW stands for World Wide Web, and Google took the decidedly ‘revolutionary’ stance that since the World Wide Web is an interlinking network of websites, those websites should be organized in the SERPs based upon how many quality websites link to them. At the time, this represented a ‘revolution’ of sorts in search engine technology.

What’s In a Name
Which brings us back to why Google updates have names: because they’re important enough to get one. And the latest update, called Penguin, is no less important than its predecessors, because it seeks to reduce the number of spammy websites that turn up when someone ‘Googles’. The effect is to force webmasters to spend less time on optimization tricks and more time creating unique and interesting content if they want to rank well in Google’s SERPs.

How utterly revolutionary. George Washington would’ve been proud.