MuraCon Notes - SEO Best Practices, Tom Rusling, Reflexive Media

February 13, 2017

SEO mostly seen as a marketing exercise currently
In the early 2000s, that was heavily debated
used to be considered more of a technical exercise
though it still REQUIRES both
but these days it’s all about marketing

starts w/ audience
understand how people are searching for information related to your products/services

once you have your audience strategy, it’s 3 pieces

1. technical - how is the site structured, how is content rendered, is it effective / efficient for the bots, etc. nothing really difficult here. just that there are 100’s of things you have to do right, and one landmine can sabotage everything.

2. content - do we have keywords and content that support the strategy we defined. some tech aspects - “how do i render the content and mark it up so that the search engines know which things are important”, things that help search engines, which pages/sites are more trustworthy and authoritative. (That’s what Google was founded on — which sites are more trustworthy than others).

3. Links

SEO is straightforward but difficult for larger orgs
relies on people from different divisions/groups
have to unite the tech team with content and product decisions; UI/UX decisions, PR, MarCom, etc. can all have SEO ramifications

do best when you have a top-down mandate to appreciate the importance of doing these things. when there isn’t top down buy-in support, that’s when larger orgs start to have trouble

people who are SEO practitioners come into this from 1 of 2 angles: technical/programming, or marketing - coming up thinking about keywords, content, digital PR, link building, etc.
want both sides coming in to help your SEO program

WHEN is the right time to bring in SEO consultation?
You don’t “do” SEO, you INTEGRATE it into every decision and process
Don’t do it “after” picking a domain, CMS, taxonomy, site architecture, etc.
Do it ALONG WITH those things.
You don’t call a structural engineer AFTER you’ve built a building - same idea here.

Want an ‘SEO lens’ helping during all the decisions

You don’t SEO your WEBSITE, you SEO your BUSINESS

building a list of keywords
want 1 page on your site DEDICATED supporting 1 keyword or group of keywords (keyword categories).
Don’t want a keyword with no page for content about it
often a site will have multiple pages optimized around the SAME keyword — oops
once you have keywords, the next important thing — go through a drafting process
match each keyword to a URL - either a page you have or a page you need to create
the keywords that still need somewhere to map ARE your content strategy
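A minimal sketch of that drafting pass (the keywords and URLs here are invented placeholders, not from the talk):

```python
# One keyword (or keyword group) -> exactly one URL.
# None means "no page exists yet" - that gap IS the content strategy.
keyword_map = {
    "small business accounting": "/accounting/",
    "quickbooks tips": "/blog/tags/quickbooks-tips/",
    "self-employed tax deadlines": None,  # page needs to be created
}

# Keywords with no page become the content to-do list.
gaps = [kw for kw, url in keyword_map.items() if url is None]

# Because each keyword maps to exactly ONE url, the structure itself
# discourages two pages being optimized for the same keyword.
print(gaps)
```

Keeping the map as a single dict makes the "every keyword gets exactly one page" rule a structural property rather than a convention someone has to remember.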

content to keyword mapping

Targeting
Do have 1 clear page crafted to support a keyword. Don’t blur keyword targets between pages (it confuses google)

Intent
Do understand user intent for visiting
don’t forget to note the persona and consumer stage for each keyword

Page Type
Do use taxonomy to your advantage

Google: for generic terms your domain will get ONE listing.
need to tell search engines “what is the primary page for this term?”
don’t want to create confusion for other pages also optimized for this term
“keyword blurring”

What pages you allow google to index/rank, treat that as an opt-in experience. If a page doesn’t have a marketing purpose (or it’s too similar to other pages), don’t be afraid to block those pages from search engines.  Your site might have more pages about a topic, but you don’t have to show all of those to Google. Various ways to block SE from indexing/caching a page.
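One sketch of that opt-in mindset, assuming a hypothetical per-page record with `marketing_purpose` and `near_duplicate` flags (the field names are mine, not the speaker’s):

```python
def robots_meta(page):
    """Index only pages that earn it; everything else opts out."""
    if page.get("marketing_purpose") and not page.get("near_duplicate"):
        return '<meta name="robots" content="index, follow">'
    # "noindex, follow": keep this page out of search results, but still
    # let crawlers follow its links so the pages you DO want indexed benefit
    return '<meta name="robots" content="noindex, follow">'

print(robots_meta({"marketing_purpose": True}))
print(robots_meta({"marketing_purpose": True, "near_duplicate": True}))
```

The same decision could live in robots.txt or an X-Robots-Tag header; the point is that indexability is a deliberate per-page choice, not a default.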

Firmofthefuture.com case study

Latent Semantic Indexing
what are the words driven 90% by this niche audience that we want to talk to?

How can we bring in a relevant audience and answer the questions they might be looking for, and put other valuable ideas/info in front of them.
“come in, we’ll tell you what you THINK you need to know, then we’ll tell you what you REALLY need to know”

Okay experience: “got what i needed, then left”
Better experience: “get someone in, provide what they’re looking for, then create a great content experience as you discover more info about the person - customize the content experience based on what you learn about the visitor”

Push / Pull
Search is all “pull”. can’t convince someone to do it, just have the opportunity to show up

How do we make sure we have content that answers their question, and build the “post-click experience” to educate them on the things we think they REALLY need to know.

keyword discovery -
its own conversation
which keywords have the highest level of the core audience we’re looking for

get your list of keywords, could be 1000’s
then map ALL those keywords to a page (painstaking process)

categories

themed tag pages
“we have 5 articles about Quickbooks tips and tricks”
if we have lots of articles at a post level about the same topic, which one will Google choose? creating a TAG PAGE is an important middle ground — a curated list of results for a term, so Google isn’t forced to pick ONE article to rank, and multiple similar articles don’t confuse it
On each page, add a “tag page” link in the footer; that link goes to the curated list all about that subject matter. now we have this “themed page” - a very healthy way to help search engines understand things that aren’t SO broad as a category, but not SO niche as a post
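The tag-page idea can be sketched as a simple grouping step; the posts and tags below are invented examples:

```python
from collections import defaultdict

# Toy post records - in practice these come from the CMS.
posts = [
    {"url": "/blog/qb-shortcuts", "tags": ["quickbooks tips"]},
    {"url": "/blog/qb-reports",   "tags": ["quickbooks tips", "reporting"]},
    {"url": "/blog/vat-basics",   "tags": ["tax"]},
]

# Group posts by tag: each tag becomes one curated "themed page".
tag_pages = defaultdict(list)
for post in posts:
    for tag in post["tags"]:
        tag_pages[tag].append(post["url"])

# The tag page (not any single post) is the page you point Google at
# for that theme.
for tag, urls in tag_pages.items():
    print(f"/tags/{tag.replace(' ', '-')}/ -> {urls}")
```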

posts

keep every post on a site within “3 crawl levels” of a home page
pagination - older content gets found less often. what if it’s still relevant?
tag theme pages - this helps “flatten” the architecture of a site better than just a sitemap or pagination does.
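The "3 crawl levels" and "flattening" ideas can be made concrete with a breadth-first walk over a toy internal-link graph (the URLs are invented):

```python
from collections import deque

# Toy internal-link graph: page -> pages it links to.
links = {
    "/": ["/blog/", "/tags/quickbooks-tips/"],
    "/blog/": ["/blog/page-2/"],
    "/blog/page-2/": ["/blog/old-post/"],
    "/tags/quickbooks-tips/": ["/blog/old-post/"],
}

def crawl_depths(start="/"):
    """BFS from the home page: depth = minimum clicks to reach a page."""
    depths, queue = {start: 0}, deque([start])
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, []):
            if nxt not in depths:
                depths[nxt] = depths[page] + 1
                queue.append(nxt)
    return depths

# Via pagination alone the old post would sit at depth 3; the tag page
# linking to it directly pulls it up to depth 2 - the "flattening" effect.
print(crawl_depths())
```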

canonical tags - the better version of this page is over at this other URL now (going from 2015 product to the 2016 model and wanting to direct traffic there instead)

when you retire pages/products, you need a takedown strategy
301 redirects
assign the authority/trust of the old URL over to this new one

if other people have linked to a page, that’s VERY valuable. don’t want to lose “link credits”
301 redirects maintain that credit
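A minimal sketch of a takedown redirect map (placeholder URLs; a real site would configure this in the web server or CDN rather than application code):

```python
# Retired URL -> its replacement.
redirects = {
    "/products/widget-2015": "/products/widget-2016",
}

def respond(path):
    """Return (status, location) for a requested path."""
    if path in redirects:
        # 301 (permanent) tells search engines to transfer the old URL's
        # link credit to the new one; a 302 (temporary) would not.
        return 301, redirects[path]
    return 200, path

print(respond("/products/widget-2015"))
```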

canonical tag
still a newer phenomenon
came out 5-ish years ago
basically says “the definitive version of this page is [here]”
you’re selling fast running shoes - all this content for them, but a param version of “green, red, orange”, etc. on different pages. the only things different are those colors. allowing 8 different versions to be indexed creates a “blurring problem”. you still want the pages to exist so when people click they can see all the versions
for search engines, you want “all these colors point back to this same version of the page” - engines see “multiple SLIGHT variations of the page exist, but the main one you should assign credit to is the one the canonical tag points to”
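A sketch of that pattern: every parameter variant declares the parameter-free URL as canonical (the domain and paths are made-up):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_link(url):
    """Point every ?color=... variant back at the parameter-free page."""
    parts = urlsplit(url)
    base = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    return f'<link rel="canonical" href="{base}">'

# All three color variants emit the same canonical tag:
for color in ("green", "red", "orange"):
    print(canonical_link(f"https://example.com/shoes/fast-runner?color={color}"))
```

In practice you would be more selective about which parameters to strip - some query params (pagination, filters) change the content meaningfully and deserve their own canonical.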

canonical tag - the first method people usually use
it’s a recommendation, not a directive - search engines don’t always respect it yet. not ideal, but that’s usually how people do it
more aggressive approach - take the old page down, put up the new one, 301 redirect the old URLs over to the new, and republish the old docs at a new URL that search engines don’t already trust
helps force popularity onto your new pages
can block the new “old product” pages from being crawled via robots.txt, or a robots meta tag in the page (noindex, follow) — means “don’t index this page but DO follow all the links on it, so the better pages get crawled”

duplicate content
1 of the main issues that SEO experts battle
site has same content on multiple URLs (intentionally or otherwise). run into the same issue: which page is the right page for Google
canonical tag that works across domains - “we’re publishing this here but the ORIGINAL version is over on this other URL”

in Google
site:[domain name] “some special phrase for this site”
can see which pages are ranked on a particular site for this phrase, can find what Google considers duplicate content, etc.
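Besides the `site:` operator, a crude first pass you can run yourself is hashing page bodies to find exact duplicates (the page bodies below are invented; real HTML would need normalizing first):

```python
import hashlib

# Invented page bodies keyed by URL.
pages = {
    "/a": "Ten tips for faster bookkeeping.",
    "/b": "Ten tips for faster bookkeeping.",   # exact duplicate of /a
    "/c": "A totally different article.",
}

seen = {}
duplicates = []
for url, body in pages.items():
    digest = hashlib.sha256(body.encode()).hexdigest()
    if digest in seen:
        duplicates.append((url, seen[digest]))   # (dupe, original)
    else:
        seen[digest] = url

print(duplicates)
```

This only catches byte-identical content; near-duplicates (boilerplate swapped, paragraphs reordered) need fuzzier techniques like shingling.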

page scrapers
google has gotten better at understanding these but at the end of the day it can really hurt
can go the legal route - cease-and-desist letter, but good luck
elsewhere in the world, hard to control those

think of it like an oil change
have to just refresh your content every now and then as you monitor it to see if it’s being scraped and republished
cheaper to write new content on the main pages than it is to go the legal route