Posted by – August 12, 2008
Research firms Nielsen Online and Hitwise have released traffic information showing strong Google growth in the US market. Google’s audience across its various web properties grew by a million from June to July, reaching 129 million (the largest audience in the US, anchored by properties like Google.com and YouTube.com). But even more damning for its rivals (namely Microsoft and Yahoo), and arguably for the webmaster community, is the information Hitwise released today about the most coveted traffic of all: search traffic. Hitwise announced that Google has crossed the 70% mark (they cite 70.77%), up 10% from July of last year and 2% from June of this year. Here’s the rest of the search landscape that Google dominates:
- Google – 70.77%
- Yahoo – 18.65%
- MSN/Live – 5.36%
- Ask – 3.53%
And here’s a graph from Hitwise:
Posted by – August 5, 2008
Today Google Insights for Search launched, a tool developed for AdWords advertisers to better understand trends in search terms. You can use it to compare traffic for keywords or phrases and filter by vertical (Category) and region. This is useful for search marketing professionals in both PPC and natural results marketing; in both cases search volume is one of the most important strategic variables. After all, why spend time and money on terms with less traffic than other terms you could work on instead?
In the past, Overture was the most reliable way to get free query volume information from one of the major search engines, but that tool has been discontinued. Google, meanwhile, has been releasing more search volume data around its AdWords PPC product and now offers several of the most important keyword research tools in the webmaster arsenal.
Read more about it on the AdWords blog.
Posted by – August 5, 2008
One of the very first things I do when installing a WordPress blog is hack the titles. By default, the page titles WordPress generates are not SEO-friendly, and on individual post pages the post’s title comes after the blog name and archive wording.
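For reference, here’s the kind of hack I mean: a minimal sketch of the title block in a theme’s header.php that puts the post title first. The template tags (is_single, is_page, wp_title, bloginfo) are standard WordPress functions, but the separator and ordering here are just my preference:

```php
<?php // header.php excerpt: post title first, blog name last ?>
<title><?php
if ( is_single() || is_page() ) {
    wp_title( '' );       // print just the post/page title, no separator
    echo ' | ';
    bloginfo( 'name' );   // blog name demoted to the end of the title
} else {
    bloginfo( 'name' );
    wp_title( ' | ' );    // archives etc. get "Blog Name | Page Title"
}
?></title>
```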
The All in One SEO Pack plugin lets you modify the blog’s meta tags through the WordPress admin panel, as well as on individual posts through the post editor. Its defaults are sensible, and thanks to the abstraction of the WordPress plugin architecture it’s a cleaner solution than hacking at the code yourself.
It’s now part of my standard WordPress install and should really be a part of the core software.
Posted by – August 4, 2008
Quick heads-up for the SEO crowd: Yahoo search is being updated with new indexing and ranking algorithms today. You should see significant changes in their results as the update rolls out.
Posted by – August 2, 2008
Aaron Wall noted on his blog that Google is testing related search terms under individual sites in the results pages. Have a look at his screenshot here for an example:
Posted by – July 30, 2008
Google has been collecting searchers’ browsing and searching habits for years. They have their search logs, clickstream information from every site serving AdSense ads and every site using their free web analytics program, the browsing history of toolbar users who do not opt out, and the traffic of everyone using their Web Accelerator proxy. And unlike many other companies that collect user data, Google actually uses its data in fundamental ways. So it comes as no surprise that they’d want to find ways to employ user information in their search algorithms. Clickstream data and folksonomy are two of the big areas search algorithms are expected to draw on. Right now all the major search engines use ranking algorithms primarily based on the PageRank concept Google introduced and became famous for: they all use links on the web to establish authority, and no fundamental change has taken place in search algorithms in many years. They get better at filtering malicious manipulation and at tweaks that eke out relevancy, but nothing groundbreaking.
So authority based on clickstream analysis and social indexing seemed like good ways to diversify how authority is allocated to web pages. What Google learned early is that they needed scale, and their initial data efforts (things like site ratings by toolbar users) didn’t end up in the search algorithm. Folksonomy and social indexing don’t yet have enough scale to rely on and have potential for abuse, but the clickstream has scale and is harder to game: traffic essentially is the authority, and the people gaming authority are after traffic. If you need traffic to rank well, and need to rank well to get traffic, there’s a significant obstacle for those manipulating rankings, because they need the end for their means.
But Google is cautious with their core search product and tweaks their algorithm very conservatively. It has been hard to tell just how much of a role clickstream data plays in their search results, and it will stay that way as long as it’s such a minor supplement to the algorithm. Today Google posted a bit of information about this on their official blog as part of their effort to shine more light on how they use private data. You can read it here in full, but the basics hold no big surprises:
- Location – Using geolocation or your input settings, they customize results slightly to your location. My estimate is that they are mainly targeting searches with more local relevance. An example of such a search would be “pizza”: “pizza” is more local than “hosting” and can benefit greatly from localization. Hosting, not so much.
- Recent Searches – Because many users refine their searches when they don’t find what they’re looking for, this session state is very relevant data. A user who’s been searching for “laptops” for the last few minutes is probably not looking for fruit when they type in “apple” next. They reveal that they store this information client-side and that it’s gone when you close your browser, but since they mean cookies, anyone who’s seriously looked under the hood already knows this.
- Web History – If you have allowed them to track your web history through a Google account, they use it to personalize your results. They don’t say much about what they’re really doing, but this is where the most can be done, and there are far too many good ideas to list here. One example is knowing which sites you prefer: do you always click the Wikipedia link when it’s near the top of the search results, even when higher-ranked pages are there? Then they know you like Wikipedia and may promote its pages in your personalized results. Do you always search for home decor? Then maybe they’ll take that into consideration when you search for “design” and not give you so many results about web design. There are a lot of ways they can use this data, and this is probably the area they’ll explore furthest; a toy version of the preference idea is sketched after this list.
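To make the “preferred sites” idea concrete, here’s a toy sketch of that kind of preference boost. This is pure speculation on my part, not anything Google has disclosed; the result structure, the 0.5 weight, and the function names are all invented for illustration:

```php
<?php
// Speculative sketch: slightly boost results from hosts the user has
// clicked often before, then re-sort. Not Google's actual method.
function personalize($results, $clickHistory)
{
    // Count past clicks per host, e.g. 'en.wikipedia.org' => 14.
    $clicks = array();
    foreach ($clickHistory as $url) {
        $host = parse_url($url, PHP_URL_HOST);
        $clicks[$host] = isset($clicks[$host]) ? $clicks[$host] + 1 : 1;
    }
    $total = max(1, array_sum($clicks));

    // Scale each result's engine score by a small, bounded boost
    // proportional to the user's affinity for that result's host.
    foreach ($results as $i => $r) {
        $host  = parse_url($r['url'], PHP_URL_HOST);
        $share = isset($clicks[$host]) ? $clicks[$host] / $total : 0;
        $results[$i]['score'] = $r['score'] * (1 + 0.5 * $share);
    }

    usort($results, 'compare_by_score');   // highest score first
    return $results;
}

function compare_by_score($a, $b)
{
    if ($a['score'] == $b['score']) return 0;
    return ($a['score'] < $b['score']) ? 1 : -1;
}
```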
In summary, I’d say they’re mainly going with simple personalizations right now, not really employing aggregate data and aggregate personalization to make aggressive changes. They’re careful with their brand and will use user history with caution. After all, if personalization makes your results less relevant, they fail; and because personalization can be unpredictable (there must be some seriously weird browsing histories out there), they’re going to be cautious and subtle with this.
Posted by – July 29, 2008
It often seems that everyone working online claims to have expertise in Search Engine Optimization. That isn’t surprising given the returns (traffic and money), but because SEO is a winner-take-most game, knowing a little bit of SEO is about as useful as being a “little bit” pregnant. And no matter how much someone might know about SEO, if they don’t execute better than their competitors, they lose.
And the thing about losing in SEO is that it’s a loser-take-none game. The returns from SEO only begin after you achieve top rankings. Moving from, say, 50th place to 20th represents virtually no traffic gain (seriously, it can be as little as a dozen extra visitors a month).
So SEO is winner-take-most: only the top spots get any significant traffic, and in a competitive SEO market the secret is execution. You are competing directly against people who understand SEO (remember, everyone’s an “expert” here), and the insight that matters isn’t which industry names you can drop or how well you can argue the pedantry of on-page optimization. It comes down to focusing your efforts more accurately and executing better than your opponents. Does your team execute more swiftly, efficiently, and accurately than others? Does it do so without running an increased risk of penalization? Does it cope with imitation and sabotage?
If not, please remember that there is no silver bullet except having the best people in a fast-paced team who can outmaneuver the competition and the search engines’ improving algorithms. If you really want to be competitive in SEO (and given the nature of the game, you shouldn’t bother unless you plan to be), you need good people. And if things get competitive and the other guys have good people, you need better people.
So the secret to SEO is finding the technical rock stars to lead the critical portions of your online efforts. Businesses can be built around the right online marketing gurus, and when you find one, we’ll not only tell you how you can conquer your niche; unlike the other “experts”, we’ll actually do it.
Posted by – July 25, 2008
CNET broke a story about Microsoft’s BrowseRank (PDF link), an authority-ranking algorithm proposal coming out of Microsoft’s Chinese R&D labs that proposes “letting web users vote for page importance”. There isn’t much new to this other than the term “BrowseRank”: Microsoft has long viewed clickstream data as a potential way to outdo Google’s search algorithm, which, like every other major search engine’s, revolves around the link-based page ranking system Google introduced to determine authority.
Using user traffic has potential if you can aggregate enough scale, but much of the data sits behind walled gardens. You can easily look at public web pages to count links, but access to clickstream data is not as simple. Your options are to buy ISP data, to sample traffic and extrapolate, or to collect as much as you can on your own properties (and only a few companies have the scale for that to yield useful data).
So ultimately this kind of algorithm is unlikely to make a groundbreaking difference and seems destined to be a supplemental part of the general ranking algorithms. We’ll see more of it, and it shows promise in smoothing out link anomalies like link farms, but it isn’t likely to become the core of a major search engine any time soon.
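To illustrate the general idea anyway, here’s a stripped-down sketch of clickstream-based ranking: power iteration over a graph of observed click hops, so pages are scored by where surfers actually go rather than by who links to whom. This is not the paper’s actual model (BrowseRank uses a continuous-time Markov process that also weighs time spent on pages); the sample data and damping factor are invented:

```php
<?php
// Toy clickstream ranking: stationary distribution of a surfer who
// follows observed clicks. Not the real BrowseRank formulation.

// Observed hops: from-page => array of (to-page => click count).
$clicks = array(
    'a.com' => array('b.com' => 30, 'c.com' => 10),
    'b.com' => array('a.com' => 20, 'c.com' => 20),
    'c.com' => array('a.com' => 5),
);
$pages = array('a.com', 'b.com', 'c.com');
$n     = count($pages);
$damp  = 0.85;   // chance of following a click vs. jumping at random

$rank = array_fill_keys($pages, 1 / $n);   // start uniform
for ($iter = 0; $iter < 50; $iter++) {
    $next = array_fill_keys($pages, (1 - $damp) / $n);
    foreach ($pages as $from) {
        $out = isset($clicks[$from]) ? array_sum($clicks[$from]) : 0;
        if ($out == 0) {
            // Dead end: redistribute this page's score uniformly.
            foreach ($pages as $p) {
                $next[$p] += $damp * $rank[$from] / $n;
            }
            continue;
        }
        // Pass score along in proportion to observed click traffic.
        foreach ($clicks[$from] as $to => $count) {
            $next[$to] += $damp * $rank[$from] * $count / $out;
        }
    }
    $rank = $next;
}

arsort($rank);
print_r($rank);   // pages ordered by clickstream authority
```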
Posted by – November 16, 2003
I get so many questions about phpBB search engine optimization that I am finally writing up the definitive mod for this.
Read more at the original phpbb SEO Mod thread on able2know or download the mod.