Local.com was a new player in the local search category and had climbed to the #3 ranking in the space. However, paid search was running at roughly a negative 65% ROI, and there was no organic traffic to speak of. I was recruited to turn this around.
Upon arrival, I found several challenges. The first was that the home page was simply a form, similar to Google's home page, which meant no content could be spidered or listed by search engines.
The second lay in the site's design itself: built around form posts, cookies, and sessions, it didn't lend itself to bookmarking or linking. Even if I solved the first problem, there was still no way to link to viable results.
On the paid search front, the site made a significant portion of its revenue through arbitrage. That meant buying traffic from search engines and selling that same traffic back to the search engines at a marginally higher rate. As you can imagine, this is not an easy feat. This was compounded by the fact that the current keyword set was largely comprised of combinations of dictionary words and locations that were not necessarily monetizable.
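The arbitrage model above comes down to margin arithmetic: the spread between what a click costs to buy and what it earns when resold. A minimal sketch with invented numbers (these are not Local.com's actual figures) shows why the margins are so thin:

```python
# Hypothetical arbitrage economics -- all figures are illustrative,
# not actual Local.com numbers.
cost_per_click = 0.10     # price paid to acquire a visitor from a search engine
revenue_per_click = 0.12  # price earned reselling that visitor's click-out
clicks = 100_000

cost = cost_per_click * clicks
revenue = revenue_per_click * clicks
roi = (revenue - cost) / cost  # a few cents of spread per click decides profitability

print(f"cost=${cost:,.0f} revenue=${revenue:,.0f} ROI={roi:.0%}")
```

With a two-cent spread the ROI is positive but small; buy keywords that don't monetize and the spread flips negative, which is exactly how a portfolio of non-monetizable dictionary-word combinations produces a deeply negative blended ROI.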
Finally, to combat arbitrage, the search engines would not let us track revenue uniquely per page, and we were limited to 1,000 tracking codes to cover the millions of keywords we were buying.
I met with the technology team and had them change the site to accept URL parameters in place of cookies, sessions, and form parameters. I also had them institute URL rewrites that allowed for friendlier URL structures. I then did keyword research to find the most popular and profitable business types, and built a site map joining each of these terms with a location.
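The build-out described above is essentially a cross product of business types and locations, each pair rendered as a static, crawlable URL. A minimal sketch follows; the specific business types, locations, and URL pattern are my own illustrations, not Local.com's actual scheme:

```python
# Sketch of the crawlable-URL build-out: every (business type, location)
# pair becomes a linkable path. All values and the URL pattern are
# illustrative assumptions, not the site's real taxonomy.
business_types = ["plumbers", "dentists", "pizza"]
locations = [("los-angeles", "ca"), ("austin", "tx")]

def friendly_url(business: str, city: str, state: str) -> str:
    # Rewritten path a spider can follow -- no form posts, cookies, or sessions.
    return f"https://www.local.com/{business}/{city}-{state}/"

sitemap = [
    friendly_url(b, city, st)
    for b in business_types
    for (city, st) in locations
]
for url in sitemap:
    print(url)
```

The point of the cross product is coverage: even a modest list of profitable categories multiplied by every locale yields a very large set of indexable, bookmarkable landing pages.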
Working directly with Google, I blew up our keyword list and started from scratch. I expanded my list of profitable business categories, then overlaid all local geographies to build out the complete list. From this I built a custom taxonomy based on a subset of SIC codes and locales, and mapped that taxonomy onto my 1,000 tracking codes. By tracking inbound links, matching them to the taxonomy, and evaluating the revenue in the search engine reports, I could allocate revenue to specific keywords fairly efficiently despite having no direct per-keyword tracking.
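The mechanism above is a many-to-few mapping: millions of keywords collapse onto 1,000 tracking codes via their (category, locale) taxonomy entry, and reported revenue per code is then spread back over the keywords sharing that code. Here is a minimal sketch under stated assumptions; the bucketing function, SIC-style category IDs, and revenue figures are all invented for illustration:

```python
# Sketch of many-to-few revenue tracking: keywords are bucketed by their
# (category, locale) taxonomy entry into a fixed pool of 1,000 codes, and
# per-code revenue from search engine reports is allocated back across the
# keywords in each bucket. All IDs and dollar amounts are hypothetical.
from collections import defaultdict

TRACKING_CODES = 1000

def bucket(category_id: int, locale_id: int) -> int:
    # Deterministically map a (category, locale) pair to one of 1,000 codes.
    return (category_id * 131 + locale_id) % TRACKING_CODES

# Keyword -> (SIC-style category, locale) taxonomy entries (illustrative).
taxonomy = {
    "plumbers los angeles": (1711, 10),
    "plumbers austin":      (1711, 22),
    "dentists austin":      (8021, 22),
}

# Revenue the search engine reports per tracking code (illustrative figures).
revenue_by_code = {
    bucket(1711, 10): 120.0,
    bucket(1711, 22): 80.0,
    bucket(8021, 22): 45.0,
}

# Group keywords by their tracking code, then split each code's reported
# revenue evenly across the keywords that share it.
keywords_by_code = defaultdict(list)
for kw, (cat, loc) in taxonomy.items():
    keywords_by_code[bucket(cat, loc)].append(kw)

allocation = {}
for code, kws in keywords_by_code.items():
    share = revenue_by_code.get(code, 0.0) / len(kws)
    for kw in kws:
        allocation[kw] = share
```

The allocation is approximate wherever many keywords collide in one bucket, but because the buckets follow the taxonomy (not arbitrary hashing), keywords sharing a code also share a category and locale, so the per-keyword estimates stay economically meaningful.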
"We made considerable progress in local search during 2006, ending the year as a Top 100 website. Our organic traffic nearly doubled, and visits per user grew."
Revenue per 1,000 pages more than tripled, from $8 to $29; repeat visitors increased 50%; PPC traffic grew tenfold; and organic traffic doubled. ROI from paid search went positive for the first time, reaching 35%.