"Ending the Management Illusion"
I was reading a book by a friend, Professor Hersh Shefrin of the Leavey School of Business at Santa Clara University, on how managers can apply principles of behavioral finance to make better decisions, and came across a *great* table. In it, Prof. Shefrin lists the common sources of behavioral bias that people find themselves subject to. Much of the book is about forecasting, which is a critical part of the SEO process; forecasting a website's potential gives an indication of how valuable SEO efforts will be, and puts some boundaries around what constitutes a reasonable amount of effort.
With his permission, I’ve reproduced the table below with an additional column that suggests how these biases are relevant to our industry. Overall, it makes for a cautionary tale – before diagnosing a website, perhaps we should diagnose ourselves… Full article»
Cluck Cluck... I Hope the "Entrail" Reference Is Just a Fowl Joke!
In Part 1, we came up with a standard decay curve by position that can be used to estimate click-through rate, provided you know how many clicks are available. In this posting we’ll extend that to provide two ways of estimating organic search traffic potential, complete with a downloadable spreadsheet. One approach is for keywords with no history; the second can be used for keywords with a performance history, to project the effects of moving up by various amounts.
Before we start, a number of commenters on Part 1 pointed out that the CTR decay curve will vary Full article»
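The decay-curve idea above can be sketched in a few lines of code. This is only an illustration: the power-law shape and the parameter values below are assumptions for demonstration, not the curve actually derived in Part 1.

```python
# Hypothetical CTR decay curve by SERP position.
# The top-position CTR (40%) and decay exponent (-1.1) are
# illustrative assumptions, not the fitted values from Part 1.
def ctr_by_position(position, top_ctr=0.40, decay=-1.1):
    """Approximate organic click-through rate at a given SERP position,
    using a simple power-law decay (an assumed functional form)."""
    return top_ctr * position ** decay

for pos in (1, 2, 3, 5, 10):
    print(f"position {pos:>2}: {ctr_by_position(pos):.1%}")
```

Given an estimate of the total clicks available for a query, multiplying by this curve gives the share you could expect at each position.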
Jockeying for Position. *click to enlarge*
As an SEO practitioner, you will often find yourself in the position of having to estimate traffic potential. If you’re like me, you’ve probably used the AdWords Keyword Research Tool to do this. Most people do this with “Exact Match” turned on and use the “Local Searches” number, then make assumptions about what average position they might be able to obtain, then assume an average click-through rate based on the various studies that have been done on click-through rate versus position. For individual keywords, the actual click-through rate will vary widely from the average, but if you’re doing estimates for hundreds or thousands of keywords this should “all come out in the wash”.
I’ve done this exercise numerous times but have always had the sense that I’m probably overestimating, because the traffic amounts predicted didn’t materialize, even when positions were achieved.
These two postings will identify the reasons for this, and we’ll construct Full article»
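The estimation exercise described above boils down to multiplying each keyword’s search volume by an assumed CTR for the target position and summing. A minimal sketch follows; the keyword volumes and the CTR table are made-up illustrative numbers, not figures from any study.

```python
# Assumed average CTR by SERP position (illustrative values only).
ASSUMED_CTR = {1: 0.40, 2: 0.15, 3: 0.10, 5: 0.05, 10: 0.02}

def estimate_monthly_clicks(keyword_volumes, target_position):
    """Sum estimated clicks across keywords, assuming each ranks at
    target_position. Individual keywords will vary widely, but across
    hundreds of keywords the errors should average out."""
    ctr = ASSUMED_CTR[target_position]
    return sum(volume * ctr for volume in keyword_volumes.values())

# Hypothetical exact-match "Local Searches" volumes per month.
keywords = {"red widgets": 5400, "buy widgets online": 1300, "widget repair": 880}
print(estimate_monthly_clicks(keywords, target_position=3))  # -> 758.0
```

As the posting argues, estimates built this way tend to run high; Part 1 and Part 2 dig into why.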
This guy had some pretty good advanced tricks too
Important note – before reading this article, read the previous posting “SEO for Local Search: The Complete Guide“.
Before going into the list of advanced Local Search ranking techniques I’ve uncovered while reviewing a myriad of other folks’ postings, let’s first talk about what proof is available.
David Mihm does a *great* yearly survey of local search experts on what variables they believe are important for Local Search, very similar to the one SEOMoz runs for overall organic search.
It’s important to note that it’s well established (read the book “Supercrunchers” if you’re not convinced) that experts are great at determining which variables matter, but figuring out importance or weightings is best left to machine learning or statistical reverse-engineering exercises.
Fortunately, the folks at SEOMoz have done some correlation analysis of some of the Local Search ranking factors Full article»
Location of a Business Called "123kjkjad9 -" *click to enlarge*
“Local search” is a term used for searching on websites that let you find different types of businesses in a geographic area. These can be map sites, search engines, yellow pages sites, or local directories. Unlike traditional organic search optimization, there are numerous places you must go to create, correct, and optimize your business’s listing, and this can be a daunting task. Also, this portion of the SEO industry uses intimidating terms like “7-pack,” “NAP,” and “Citations” that can be a little off-putting.
This posting will de-mystify all of this and simplify the process by breaking it down into eleven steps Full article»
Rankings Explained. Any Questions? *click to enlarge*
Last year, some people from the academic community who hadn’t yet been snatched up by Google or Bing did a really interesting study. Rather than simply researching factor correlations to rankings, as SEOMoz does a great job of doing every so often, they used machine learning techniques to create their own search engine, and trained it to reproduce results similar to Google’s. After the training process, they extracted the ranking factors from their trained engine, published them, and presented them at an industry conference. They were able, for the queries they trained on, to correctly predict Full article»
The U.S. Patent and Trademark Office Website *click to enlarge*
Google and Bing are in an arms race. The two weapons they are deploying are datacenters and Ph.D.s, which are substitutable for each other. If a Ph.D. comes up with a way to store or index data that is 5% more efficient, this can result in hundreds of *millions* of dollars of savings in datacenter costs.
Often these companies will file patents when they have some really good ideas, or simply to hold patents in a particular area where they worry they may eventually face a lawsuit. For instance Full article»
Bruce Clay's SEOToolset *click to enlarge*
In our recent post, High-End SEO on a Low-End Budget, we briefly mentioned Bruce Clay’s SEOToolSet®. This is a great subscription service that has been around for a long time and has recently gone through a major upgrade – here we’ll take some more time to run through the service in detail.
Bruce is one of the original SEOs from back in the day, and his company provides a wide variety of training classes, SEO services, and even access to its proprietary optimization toolset. In a nutshell, this subscription service contains Full article»
Black Hat SEOs of Yesteryear Working on a Meta-Description
Meta-descriptions are critical for two reasons: they are used by Google in the ranking process, and they are ultimately responsible for the click-through rate your page will experience.
A properly written meta-description will stand out and draw more clicks than its position in the SERPs alone would warrant; a poorly written one may garner as few as zero clicks. Here we’ll detail best practices for writing your Full article»
Monk-eying around with a PDF file
PDF files, just like web pages, can be optimized to rank highly on Google. Many SEOs recommend steering away from PDF files as much as possible, but they are ranking all over the place on Google, so I wouldn’t particularly avoid using them. In fact, if you’ve gone to some effort to make a professionally formatted PDF file, one might argue it’s likely to be higher-“quality” content than the average run-of-the-mill web page. I would not rule out Google even slightly favoring PDF files for this reason Full article»