Last week I attended DrupalCon San Francisco. I recapped Days 1, 2, and 3 already, but I thought I’d spend some time focusing on SEO for Drupal. We’ve talked about Drupal SEO here before, but I’d like to add to that with some of the new things we learned at DrupalCon.
Jen Lampton and Rob Bertholf from Chapter Three gave a basic rundown of SEO for Drupal sites at DrupalCon. You can watch their presentation on the session page if you’d like. Along with the session, they wrote a quick blog post to share their Holy Grail for Drupal SEO (PDF) document. It reviews best practices for common SEO actions and lists the modules needed for each. I’d like to dive a bit deeper into some of my favorites from that list.
One of the best features of the document is that it separates out what benefits Humans (like you) from what benefits Robots (like Google). For instance, alt text for images benefits the Robots by giving some context to images, but it was really designed to benefit Humans who use screen readers or other accessibility aids.
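As a quick sketch of what that looks like in practice (the filename and description here are made up, but the pattern is plain HTML):

```html
<!-- Descriptive alt text serves screen-reader users first, search engines second. -->
<!-- The image and wording below are hypothetical examples. -->
<img src="/sites/default/files/drupalcon-sf-keynote.jpg"
     alt="Dries Buytaert delivering the keynote at DrupalCon San Francisco"
     title="DrupalCon San Francisco keynote" />
```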
Rob discussed this early in the presentation and pushed the audience to only make changes that benefit Humans. If a change only benefits the Robots, it’s probably a black-hat tactic. I could not agree more, and I’m thankful he highlighted this distinction. Aside from nofollow links, robots meta tags, and the robots.txt file, everything else on their list benefits Humans. So Human-only isn’t a hard and fast rule, but it should be the majority of your focus.
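For reference, those Robot-only pieces typically look like the snippets below. The link and the robots.txt paths are placeholder examples, not recommendations for your site (though Drupal core does ship a robots.txt that disallows paths like these):

```html
<!-- Keep a page out of the index while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow" />

<!-- Tell crawlers not to pass ranking credit through a specific link -->
<a href="http://example.com/" rel="nofollow">Untrusted link</a>
```

```
# robots.txt applies to all crawlers; the paths below are examples only
User-agent: *
Disallow: /admin/
Disallow: /user/
```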
Most of what you optimize on a site should make things easier for the user. This includes:
- A consistent site structure (in code and visually), plus both XML and HTML sitemaps
- Descriptive headings and titles for your pages, articles, and other types of content (see the markup sketch after this list)
- Good internal site search and helpful error pages
- Alt and title text for images, and link title text, to add more information
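To make the headings and titles point concrete, here’s a minimal sketch of descriptive markup (the page and site names are invented):

```html
<head>
  <!-- A title that describes this specific page, not just the site -->
  <title>Drupal SEO Basics from DrupalCon San Francisco | Example Blog</title>
</head>
<body>
  <!-- One descriptive top-level heading per page, with subheadings nested logically -->
  <h1>Drupal SEO Basics from DrupalCon San Francisco</h1>
  <h2>Optimize for Humans first</h2>
  <p>Page content goes here.</p>
</body>
```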
There are a few others on the list, but those above are the most important for the Humans. Those are the basics, and they should be in place before any other tweaks are even considered.
After the basics are covered, you can start to get more focused on special things like keyword selection, descriptive link text (e.g., not “read more” or “click here”), and setting up patterns and defaults for your URLs, title tags, and meta descriptions. Much of this can be set up to happen automatically whenever new content is added, and overridden when you need to make a specific change to something. Finally, you can use the redirect modules (Path Redirect and Global Redirect) to make sure each piece of content lives at one specific URL.
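As a sketch of the automation piece: the Pathauto module lets you define a URL pattern per content type, built from tokens. The patterns below are hypothetical examples in Drupal 7 token syntax (Drupal 6 used tokens like [title] instead of [node:title]):

```
# Hypothetical Pathauto patterns, one per content type
Blog post:   blog/[node:created:custom:Y]/[node:title]   ->  /blog/2010/drupalcon-recap
Basic page:  [node:title]                                 ->  /about-us
```

Global Redirect then 301-redirects the internal path (e.g., node/123) to its alias, so each piece of content answers at a single URL.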
If you’ve accomplished everything above, you are far ahead of the pack. Remember again that you should be optimizing for the Humans, and not just the Robots. Another recommendation from the Chapter Three team is to use social media sharing embed tools to allow for easy community sharing. This, like many things, is good in moderation. Use your analytics reports (you installed analytics tracking, right?) to find some of your top referring sites, and show those share buttons on your content. We use a ShareThis tool, which has several options but does not display all of them everywhere. It’s not the greatest tool, but it brings some balance.
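On the tracking point: in Drupal the Google Analytics module handles this for you, but if you were adding tracking by hand, the asynchronous snippet (current as of this writing) looks like this (UA-XXXXX-X is a placeholder for your own property ID):

```html
<script type="text/javascript">
  // Asynchronous Google Analytics snippet (ga.js).
  // Replace UA-XXXXX-X with your own web property ID.
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-X']);
  _gaq.push(['_trackPageview']);
  (function() {
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```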
Finally, I want to discuss the notion of active vs. passive content. Active content is stuff that may change often, and will hopefully be shared and earn linkbacks. A great example of this is a blog. Blog posts give you little bits of content that other people can link to and send traffic to your site. Passive content is generally made up of pages like your About or Contact page. They won’t change often, and while they may get some traffic, they probably won’t get many linkbacks. You will hopefully have a mix of passive and active content on your site, which will give you different types of traffic as well as linkbacks.
Thanks again, Jen and Rob, for sharing your knowledge and experiences.