When you think of keyword research, Dan Thies’ name often comes to mind. Dan’s expertise, however, goes far beyond keyword research: he has become one of the most sought-after SEO coaches and educators out there, as well as one of the most well-known and respected voices in SEO. Dan was gracious enough to let me interview him and get his thoughts on what’s important, and what’s not, in terms of SEO.
1) Aside from SEO education programs you provide, you’re known as a keyword research expert. What do you feel are the most important aspects of researching and selecting keywords for an optimization campaign?
You’re looking for terms that are going to be used by the target audience. The approach we take is to look at relevance as the key factor in determining what the most important search terms might be.
We use a formula called weighted popularity – if you believe that there are 100 searches per day for a given search term, and that 50% of the folks using that search term would be interested in what you have to offer, then we multiply 100 searches by 50% and say that there are 50 targeted searches per day.
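The weighted-popularity calculation can be sketched in a few lines of Python. This is just an illustration of the arithmetic Dan describes; the function name and the numbers are taken from his hypothetical example, not from any actual tool.

```python
def weighted_popularity(searches_per_day, relevance):
    """Targeted searches per day: raw search volume times the
    estimated fraction of searchers who would be interested."""
    return searches_per_day * relevance

# Dan's example: 100 searches/day, 50% of searchers are a fit
targeted = weighted_popularity(100, 0.50)
print(targeted)  # 50.0 targeted searches per day
```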
So I like to start with the same basic research, to come up with a list of “core” search terms. Everything else revolves around these core terms.
In addition to these core terms, there are what I call “modifiers” – those words that will be used by searchers along with the core terms. So if “website hosting” is a core term, you have modifiers like PHP, small business, cheap, unlimited bandwidth, etc. that you can find by using tools like Wordtracker’s multisearch tools or Keyword Discovery.
Some of those modifiers will be relevant, others won’t. Rackspace does great dedicated hosting, but they wouldn’t want to use “cheap” as part of their keyword strategy. Many terms will also have alternative spellings and misspellings. Website or web site? You have to target both, so these have to be taken into account.
2) Do you feel there are any differences between research/selection for optimization vs. PPC campaigns?
I don’t think the basic research is much different. The difference is in how you target the terms.
Over the lifetime of a campaign, SEO results and PPC results can create a feedback loop. So a broad match on a general search term in the PPC campaign can give you lots of specific search terms that you can feed back into PPC as well as apply to your SEO strategy. Terms that come in via SEO may end up in your PPC campaign. That’s one more reason why analytics is so important.
In terms of targeting, again, that’s where the main difference comes in. It costs next to nothing to add search terms to a PPC campaign, you just paste them into your list. With SEO, you have to add content so it’s harder to target the marginal or low volume search terms, even if they may be very targeted and relevant.
With SEO, you generally build pages around the core terms and work the modifiers into the copy and your internal links. This does broaden the profile of the page considerably. You have to target the important terms first though. Because content has a cost, you can’t necessarily build a useful web page to directly target every variation, so you have to use links and copywriting to broaden the profile of your pages. This keeps the site’s content compact and enhances usability.
The advantage you have with PPC campaigns, especially with Google where you can dump in a huge list of terms and use all the different matching options, is the ability to cover your bases more speculatively. Tor Crockatt from Miva gives a great presentation at SES on how they spin up keyword lists. Basically it’s the same idea as mixing core terms with modifiers, but with PPC you can add in all the possible synonyms, spellings, adjectives, etc. You build a big list by doing all the permutations and combinations.
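The permutations-and-combinations approach Dan describes can be sketched as a few lines of Python. The core terms and modifiers below are hypothetical, echoing the “website hosting” example from earlier; a real campaign list would also fold in synonyms and misspellings.

```python
from itertools import product

# Hypothetical core terms and modifiers for illustration
core_terms = ["website hosting", "web hosting"]
modifiers = ["cheap", "php", "small business"]

# Build every core/modifier combination, with the modifier
# placed before or after the core term, plus the bare core terms.
keywords = set(core_terms)
for core, mod in product(core_terms, modifiers):
    keywords.add(f"{mod} {core}")
    keywords.add(f"{core} {mod}")

print(len(keywords))  # 2 core terms + 2*3*2 combinations = 14
```

The resulting list can be pasted straight into a PPC campaign, which is exactly the low-cost coverage Dan is pointing at.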
I have a free video on keyword strategy that goes into a lot more detail on this, the archive also includes my most recent presentation from SES. The video runs 75 minutes and goes into a lot more detail on many aspects of keyword strategy, how to leverage PPC to inform SEO, and how you can take a more scientific approach to the question of relevance.
[editor’s note: In his answer to question #2 above, Dan used the phrase “core terms” in the same context as I use “keyword themes”. Targeting core terms and their “modifiers” is what I refer to as “keyword theming” in question #3 below, so the intent of question #3 has already been answered above.]
3) How important do you think keyword theming is when implementing keywords into a site for optimization?
I don’t know about themes and all that. We had some ideas about that a long time ago, but it was largely misinterpreted, and Mike Grehan among others pointed out the absurdity of many interpretations. Basically, I will say that having related content linked together is a good thing, period, even if search engines didn’t exist.
I took a deep look into the idea of Topic Sensitive PageRank a few years ago, and that paper is still on my site. You can see that whatever ideas Google has about themes and topics, it’s pretty coarse. For example, the personalized search and site-flavored search tools are based on topics, but there are only like 70 topics altogether. So any idea of strongly “theming” within a site is likely to be misplaced and overkill. What’s the theme of Amazon.com, and why do their pages show up in so many search results that aren’t about books?
To me, the information strategy of the site is far more important. Who are the members of your target audience, how can they be segmented into meaningful groups or personas, and what are the information needs of those groups? If you build your content around answering their questions and helping them get what they want, then you are doing it the right way.
4) Different SEOs implement keywords into a site in different ways. Some will give clients 25 keywords per page, while others will target only one, two, or three. What do you feel is the best practice and why?
Well, there are a couple ways to look at this. I’ll start by repeating a point that I beat to death in my classes, which is that your goal as a search engine marketer is to bring the visitor in to a page that is as close as possible to their goal. The searcher’s goal may be obvious from very specific search terms, or it may not be obvious with more general terms.
The type of page that you build for a search term, therefore, depends on the search term. In some cases, you want pages that give the visitor a set of choices, so that you can move them toward their goal. In other cases, you want to ask for the money on the entry/landing page.
Keyword strategy for web pages is all about getting the visitor to the right page. So on any given page, you target the terms that are going to fit the requirement of getting the visitor as close as possible to their goal.
Sometimes, this means that you have one search term on a page. Sometimes you may have many more. In practical terms, you have 10 words to play with in the TITLE tag, because the engines won’t index more than that. This puts an upper bound on the number of “core” search terms you use on a page. I would certainly think that most pages would have closer to 25 keywords in the copy, if you include modifiers and that sort of thing.
As I already mentioned, you can broaden the profile of a page, and have it appear on more different searches, by using internal and external links with modifiers and other terms. If I have a web page with car rental outlets in the Dallas-Ft. Worth area, I can also target the names of other towns with copy and links. We can target a lot of local terms on the same page. The case for building separate pages involves the possibility of a higher click-through rate – if I search for “car rental Plano TX” then it does help a bit if Plano appears in the title.
I have many individual web pages where the search term referrals each month include hundreds of different terms. Which ones did I specifically target? Probably fewer than 5 core terms; the rest are part of that “long tail” we’re always hearing about.
5) Over the years I’ve seen quite a few SEOs and SEO firms move away from providing actual optimization services and move toward education or providing software or reporting services. What’s your take on that?
If that’s what they want to do, then that’s what they should do. The larger business world needs more consulting and education for their in-house resources, and less reliance on someone who comes in and just changes the site. So if individual SEOs get tired of rewriting client pages themselves, and they’d rather teach someone how to do it, and make their long-term revenue off stuff like analytics and real consultation, then that’s very positive. It really just reflects that individual’s professional growth.
There are some areas where I think the client should exercise a lot more control, like link building, since it can affect their reputation in the long run. Relying on an SEO firm to do it right, and do it for you, is unrealistic. Not that everyone out there will rip you off or screw up your site, but you ought to know what’s being done and understand the implications.
Personally, I don’t have SEO clients because I provide so many services to SEO firms and it would just be too much of a conflict of interest. What if I got a keyword report order for an existing client? Would I call them to try to win them back, or be loyal to my “other client” who was about to win their business away from me?
6) I believe that SEO is dying as a stand-alone entity and that SEO companies need to provide additional marketing oriented services as part of their strategies in order to provide real success for their clients. What do you feel is the importance of the marketing aspects within SEO?
Well, if you’re going to add value, you have to focus on business results. If you define “SEO” as getting web pages ranked, then yes, it’s already dead and was never really alive in the first place.
If you define SEO as optimizing a web site to achieve business results through organic search engine listings, then it’s very much alive. But it has to include analytics and everything else that makes it possible to measure and improve business results.
The problem with SEO by itself is that you can’t get the best results without touching a lot of other areas – paid advertising including PPC, usability, analytics, conversion, etc. So a one-trick pony is just that – it may be a cute trick to make your client #1 for some search term. If it’s done in a vacuum then success from a business standpoint, if it happens, would be more like a happy accident than the result of some intentional activity.
7) If you were to develop a search engine, what are the five key elements you would consider most important to analyze?
I’d put a heavy emphasis on linking relationships, link quality, topical relevance, and that sort of thing. I’d look at user feedback that you get from the browser toolbars too. How much time people spend on a site, how many pages they view, and how diverse the audience is. I’d definitely make an effort to determine what the real content of a page was, vs. navigation and other clutter.
8) Do you see any paths the major search engines are taking that you feel are wrong?
I think they don’t do enough to disclose the paid advertisements. Yahoo in particular, where “organic” listings may be pay-per-click, and where we’ve seen some pretty compelling evidence that some search results are hand coded. Matt Cutts showed a great example at Consumer Reports’ conference last spring in Berkeley – how many of those “perfect search results” were paid listings? We don’t know, there is no disclosure.
I think that there is an inherent conflict between paid contextual advertising and good organic search. Google and Yahoo pay a lot out through Adsense and YPN, and they have a lot of revenue tied up in that. If they really did something about the splogs and keyword driftnets that are built to siphon off traffic from “long tail” searches, they’d lose that revenue source.
So with all the search engines now, the quality of results for general searches is pretty good, but for very specific searches you may get a lot more spam than information when you search. That’s a huge problem. I’d take some percentage of the revenues and use it to do site reviews. Take a look at the page that’s generating the click and dump ‘em if they’re spamming. Requiring a small deposit on new Adsense or YPN accounts would stop people from opening multiple accounts to spam.
Yahoo buying Delicious, well, if they actually try to use tagging in search results, that would be about 180 degrees away from the right path, but they probably have other plans for deli. When you think about what Yahoo or MSN is doing, you think about how it will bring users into their portals. When you think about what Google is doing, ask how they’re going to turn it into an advertising channel.
A lot of this, though, is that the engines are up against a million hackers who want to game the system, and that’s just tough no matter what you do.
9) In regards to the SEO industry, what would you say is 1) the best thing to happen to the industry, 2) the most important advancement within the industry, and 3) the worst thing to happen to the industry recently?
1) The best thing that’s happened is that people are starting to focus more on results and less on rankings. This is evident when you go to SES or other conferences. Along with this, there are more people actually learning how the search engines work, not just how they can be manipulated at the present moment.
2) The most important advancements involve synergy between paid and organic, and getting new tools that help manage it all, but the best is yet to come. We’ll see real professional certification coming soon, and that’s going to be huge.
3) The worst thing is just bad press, and inaccurate reporting. Newsweek did OK with that article on Rand, but not great. When we’re portrayed in the light of (that company from Las Vegas) or Google bombing, it’s not so great.
10) Would you make a prediction as to how or when Google might lose its dominance, or at least what might cause such a fall?
First, someone else will have to take organic search seriously. I don’t think MSN and Yahoo really have their hearts in search. They are getting search users from their portals and services, and they don’t really try to win searchers from Google. Ask has a nice new interface, but the search results are so-so because they don’t really crawl the web very well.
If Google does lose their dominance, it will be because they didn’t innovate, or because someone else buys market share. As in, Microsoft buys AOL. Start the rumor. 😀
Bonus Question: I had previously asked Dan to explain the process of checking to see if a site passes link reputation. Below is his answer in full:
The basic drill on checking if a page passes reputation: It’s possible to get an affirmative result, it’s possible to get a negative result, but sometimes you can’t really get either… and it’s really subjective in terms of the degree of confidence you have in the result, because we are relying on some assumptions.
What you need is the text from a link on the linking site, to use in an exact phrase search. So for example, the site you’re checking might have a text link that says “big boats small boats medium boats – the boat guy” linking to www.the-boat-guy-who-buys-links.com – as long as this doesn’t appear as a phrase on the target URL, then you can run a test by doing an exact phrase search on Google for “big boats small boats medium boats – the boat guy”.
If you see exactly one result (the linking URL), then you would say that the link, at this point, has not passed text reputation to the target URL. If we get such a negative result, we look to repeat this result with other links, because it’s possible that the link we checked is too new to have passed reputation; it’s possible that links on a given site must “age” before reputation can be passed, but that they will eventually; and it’s even conceivable that the links to a specific site might undergo “aging” before they can pass reputation. For a while, we suspected this could explain the “sandbox”, but it’s definitely more complicated than that.
If you see exactly two results (the linking URL and the target URL), then you would say that the link has passed text reputation to the target URL. You can validate by checking the cached version of the target URL, and look for “the following only appear in links pointing to this page.” Again, we look to repeat the test, because it’s possible that there are other links which are passing reputation to the page, but have dropped from Google’s index for some reason, that the target URL has changed, that the indexed version doesn’t match the cached version, etc.
If you see more than two results, well, I’d normally just go look for another link to play with, but if it’s a handful you can check to see why they’re showing up in the SERP. Usually this happens because the phrase is too common, or because they’ve used the same anchor text in more than one link. So if the boat guy has used the same anchor text with several link buys, then you wouldn’t know which links are passing reputation. However, if the target URL is not among the results, then you have probably found several links that aren’t passing reputation. We occasionally find cases where there are hundreds of results for the phrase search (may be multiple sites, may be run of site on one site, etc.), but the target URL isn’t included among them, and this gives us several URLs at once that aren’t passing reputation.
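The decision rules from the three cases above can be summarized in a short Python helper. This is purely an illustration of the interpretation logic; the result URLs would come from running the exact-phrase search by hand, and the function name and messages are my own, not part of Dan’s process.

```python
def classify_reputation_test(result_urls, linking_url, target_url):
    """Interpret an exact-phrase search result set per the rules above.

    result_urls: list of URLs returned for the exact-phrase query
                 (gathered manually from the search engine).
    """
    n = len(result_urls)
    if n == 1 and linking_url in result_urls:
        # Only the linking page ranks for the phrase
        return "not passing (repeat with other links to confirm)"
    if n == 2 and linking_url in result_urls and target_url in result_urls:
        # Linking page plus target page: reputation appears to flow
        return "passing (validate via the target URL's cached copy)"
    if n > 2 and target_url not in result_urls:
        # Phrase reused across many links, none reaching the target
        return "several links likely not passing"
    return "inconclusive - look for another phrase to test"

verdict = classify_reputation_test(
    ["http://linker.example/page", "http://target.example/"],
    "http://linker.example/page",
    "http://target.example/",
)
print(verdict)  # passing (validate via the target URL's cached copy)
```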
It can take some time picking over a web site or page to find good candidates for the phrase search – we look for long strings of keywords because they’re unlikely to appear as a phrase anywhere. It usually takes 5-10 minutes on a given site to find something, or to give up. Crappy directories tend to let sites load up their titles with keywords, so it’s usually pretty easy to find some text you can do some tests with – just look for long strings of keywords.
A useful exercise, if you have control of the linking site, is to do these kinds of controlled tests by putting up links and monitoring the SERP for the phrase search. We’ve been able to find that some websites in fact DO pass reputation, but that it can take several weeks for it to show up. Many sites (Aaron Wall’s search-marketing.info site, for example) are able to pass reputation within a couple days, and affect fairly competitive SERPs pretty quickly. If you just happened to have a link farm or network of your own, you might be able to use this information profitably for your clients.
When we checked the O’Reilly site (that Matt Cutts had commented on), we were even able to find links within blog posts (content) that clearly weren’t passing reputation, even months after the content was created and presumably indexed. That’s one heck of a harsh sanction to hand out, in my opinion.
Yes, I probably need to spend less time thinking about this stuff and get some sleep or something, but in a world where the other 999 experts are so busy “doing” SEO that their understanding begins to diverge, I think there’s room for one geek who just plays with it, assuming I can continue to find enough professionals who are willing to sponsor my efforts. 😀
Dan has another link building clinic starting up in about 3 weeks. I’ve gone through this class and found it not only extremely helpful but fun as well.