Thursday, June 30, 2016

SearchCap: Dynamic search ads, Google Keyword Planner & e-commerce SEO

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Dynamic search ads, Google Keyword Planner & e-commerce SEO appeared first on Search Engine Land.



from
http://searchengineland.com/searchcap-dynamic-search-ads-google-keyword-planner-e-commerce-seo-253121

Google says Dynamic Search Ad targeting will soon get better

Query matching set to improve over several months.




from
http://searchengineland.com/google-says-dynamic-search-ad-targeting-will-get-better-253117

Learn how to drive more conversions with “The Digital Marketer’s Guide to Call Attribution”

Thanks to smartphones and click-to-call, consumers are responding to digital marketing by calling businesses by the billions. This guide from DialogTech will explain why call conversions have become so important to the success of digital marketing and introduce you to call attribution software – what it is, how it works, and its benefits for digital […]




from
http://searchengineland.com/learn-drive-conversions-digital-marketers-guide-call-attribution-253105

What the heck is going on with Google Keyword Planner?

Technical glitches and close variants (sometimes) come to Keyword Planner




from
http://searchengineland.com/heck-going-google-keyword-planner-253062

SMX Advanced recap: Using Paid Search & Social Together to Deliver the Ultimate Knockout Punch

Columnist Amy Bishop shares details from a session at SMX Advanced in which panelists discussed how to best collaborate with cross-functional teams to build integrated paid search and paid social strategies.




from
http://searchengineland.com/smx-advanced-recap-using-paid-search-social-together-deliver-ultimate-knockout-punch-252910

How e-commerce SEO matters in strategic redesign of web shops

Looking to (re)launch your e-commerce website? Columnist Trond Lyngbø urges you to involve an SEO expert before you make any decisions.




from
http://searchengineland.com/e-commerce-seo-matters-strategic-redesign-web-shops-252431

Google Maps Android users get multi-stop directions & new Your Timeline features

Currently available only on Android, Google says the new features will be coming soon to iOS.




from
http://searchengineland.com/google-maps-lets-android-users-get-directions-multiple-stops-one-route-253085

SMX Advanced recap: Dr. Pete’s Guide To The Changing Google SERPs

How have the SERPs changed in recent years, and what's on the horizon? Contributor Dan Leibson summarizes Dr. Pete Meyers' presentation from SMX Advanced 2016.




from
http://searchengineland.com/smx-advanced-recap-dr-petes-guide-changing-serps-252999

Leverage the power of IBM Watson in your AdWords campaigns

Columnist Russell Savage shows how you can integrate information from IBM Watson's API into your AdWords scripts to get keyword information from webpages.




from
http://searchengineland.com/leverage-power-ibm-watson-adwords-campaigns-252591

Google now offers real time earthquake information in search results

Google announced they are now showing richer earthquake-related information right at the top of the search results for earthquake-related queries. They want to give people who feel a tremor quick and authoritative information about what they just felt. Google said the “information will include a summary of the size of the quake, a […]




from
http://searchengineland.com/google-now-offers-real-time-earthquake-information-search-results-253072

Spotted in Google PLAs: “Special Offers” filter & an ad that links directly to a new Google Shopping page

A new landing page on Google Shopping flows a lot like a product page on Amazon with product details, seller details, reviews, links to related items and more on a single page.




from
http://searchengineland.com/google-shopping-special-offers-link-direct-google-shopping-253022

Search for Okay Google commands available in Google Search App

Kristijan Ristovski put together a list of 150 commands you can use with Google voice search at ok-google.io.




from
http://searchengineland.com/search-okay-google-commands-available-google-search-app-253068

The Functional Content Masterplan – Own the Knowledge Graph Goldrush with this On-Page Plan

Posted by SimonPenson

[Estimated read time: 17 minutes]

On-page content is certainly not one of the sexier topics in digital marketing.

Lost in the flashing lights of "cool digital marketing trends" and things to be seen talking about, it's become the poor relative of many a hyped "game-changer."

I’m here to argue that, in being distracted by the topics that may be more "cutting-edge," we're leaving our most valuable assets unloved and at the mercy of underperformance.

This post is designed not only to make it clear what good on-page content looks like, but also how you should go about prioritizing which pages to tackle first based on commercial opportunity, creating truly customer-focused on-page experiences.

What is "static" or "functional" content?

So how am I defining static/functional content, and why is it so important to nurture in 2016? The answer lies in the recent refocus on audience-centric marketing and Google’s development of the Knowledge Graph.

Whether you call your on-page content "functional," "static," or simply "on-page" content, they're all flavors of the same thing: content that sits on key landing pages. These may be category pages or other key conversion pages. The text is designed to help Google understand the relevance of the page and/or help customers with their buying decisions.

Functional content has other uses as well, but today we're focusing on its use as a customer-focused conversion enhancement and discovery tactic.

And while several years ago it would have been produced simply to aid a relatively immature Google to "find" and "understand," the focus is now squarely back on creating valuable user experiences for your targeted audience.

Google’s ability to better understand and measure what "quality content" really looks like — alongside an overall increase in web usage and ease-of-use expectation among audiences — has made investment in key pages critical to success on many levels.

We should now be looking to craft on-page content to improve conversion, search visibility, user experience, and relevance — and yes, even as a technique to steal Knowledge Graph real estate.

The question, however, is "how do I even begin to tackle that mountain?"

Auditing what you have

For those with large sites, the task of even beginning to understand where to start with your static content improvement program can be daunting. Even if you have a small site of a couple of hundred pages, the thought of writing content for all of them can be enough to put you off even starting.

As with any project, the key is gathering the data to inform your decision-making before simply "starting." That’s where my latest process can help.

Introducing COAT: The Content Optimization and Auditing Tool

To help the process along, we’ve been using a tool internally for months — for the first time today, there's now a version that anyone can use.

This link will take you to the new Content Optimisation and Auditing Tool (COAT), and below I’ll walk through exactly how we use it to understand the current site and prioritize areas for content improvement. I'll also walk you through the manual step-by-step process, should you wish to take the scenic route.

The manual process

If you enjoy taking the long road — maybe you feel an extra sense of achievement in doing so — then let's take a look at how to pull the data together to make data-informed decisions around your functional content.

As with any solid piece of analysis, we begin with an empty Excel doc and, in this case, a list of keywords you feel are relevant to and important for your business and site.

In this example, we'll take a couple of keywords and our own site:

Keywords:

Content Marketing Agency
Digital PR

Site:

www.zazzlemedia.co.uk

Running this process manually is labor-intensive (hence the need to automate it!), and adding dozens more keywords creates a lot of work for little extra knowledge gain. By focusing on a couple, though, you can see how to build the fuller picture.

Stage one

We start by adding our keywords to our spreadsheet, alongside the search volume for those terms and the URL that currently ranks, as shown below (NOTE: all data is for google.co.uk).

Next we add in ranking position...

We then look to the page itself and give each of the key on-page elements a score based on our understanding of best practice. If you want to be really smart, you can score the most important factors out of 20 and the lesser ones out of 10.

In building our COAT tool to enable this to be carried out at scale across sites with thousands of pages, we made a list of many of the key on-page factors we know to affect rank and indeed conversion. They include:

  • URL optimization
  • Title tag optimization and clickability
  • Meta description optimization and clickability
  • H1, H2, and H3 optimization and clickability (as individual scores)
  • Occurrences of keyword phrases within body copy
  • Word count
  • Keyword density
  • Readability (as measured by the Flesch-Kincaid readability score)

This is far from an exhaustive list, but it's a great place to start your analysis. The example below shows an element of this scored:

Once you have calculated a score for every key factor, your job is then to turn this into a weighted average score out of 100. In this case, you can see I've done this across the listed factors and have a final score for each keyword and URL:
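As a rough illustration of that weighting step, here's a short Python sketch. The factor names, point maxima, and example scores are invented for the example; they aren't COAT's actual values.

```python
# Hypothetical weighting sketch: the most important factors are scored out
# of 20, lesser ones out of 10, then normalized to a score out of 100.
FACTOR_MAX = {
    "title_tag": 20,
    "h1": 20,
    "body_copy": 20,
    "url": 10,
    "meta_description": 10,
    "readability": 10,
}

def page_score(scores):
    """Normalize raw per-factor scores into a single score out of 100."""
    earned = sum(scores.get(factor, 0) for factor in FACTOR_MAX)
    possible = sum(FACTOR_MAX.values())
    return round(100 * earned / possible, 1)

scores = {"title_tag": 15, "h1": 18, "body_copy": 10,
          "url": 8, "meta_description": 4, "readability": 7}
print(page_score(scores))  # → 68.9
```

A page scoring full marks on every factor comes out at exactly 100, which keeps the scale comparable across keywords and URLs.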

Stage two

Once you have scores for a larger number of pages and keywords, it's then possible to begin organizing your data in a way that helps prioritize action.

You can do this simply enough by using filters and organizing the table by any number of combinations.

You may want to sort by highest search volume and then by those pages ranking between, say, 5th and 10th position.

Doing this enables you to focus on the pages that may yield the most potential traffic increase from Google, if that is indeed your aim.

Working this way makes it much easier to deliver the largest positive net impact fastest.
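That filter-and-sort step is only a few lines in Python; the keyword rows below are made up purely to show the mechanics.

```python
# Invented example data: keyword, monthly search volume, current rank, content score.
rows = [
    {"keyword": "content marketing agency", "volume": 1900, "rank": 7, "score": 41},
    {"keyword": "digital pr",               "volume": 880,  "rank": 3, "score": 66},
    {"keyword": "seo audit tool",           "volume": 2400, "rank": 9, "score": 35},
    {"keyword": "keyword research guide",   "volume": 320,  "rank": 6, "score": 52},
]

# Keep pages ranking 5th-10th, then sort by volume so the biggest
# potential traffic wins come first.
priority = sorted(
    (r for r in rows if 5 <= r["rank"] <= 10),
    key=lambda r: r["volume"],
    reverse=True,
)

for r in priority:
    print(r["keyword"], r["volume"], r["rank"])
# → seo audit tool 2400 9
# → content marketing agency 1900 7
# → keyword research guide 320 6
```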

Doing it at scale

Of course, if you have a large site with tens (or even hundreds) of thousands of pages, the manual option is almost impossible — which is why we scratched our heads and looked for a more effective option. The result was the creation of our Content Optimisation and Auditing Tool. Here's how you can make use of it to paint a fuller picture of your entire site.

Here's how it works

When it comes to using COAT, you follow a basic process:

  • Head over to the tool.
  • Enter your domain, or a sub-directory of the site if you'd like to focus on a particular section.
  • Add the keywords you want to analyze in a comma-separated list.
  • Click "Get Report," making sure you've chosen the right country.

Next comes the smart bit: adding target keywords to the system before it crawls enables the algorithm to cross-reference all pages against those phrases and then score each combination against a list of critical attributes you'd expect the "perfect page" to have.

Let’s take an example:

You run a site that sells laptops. You enter a URL for a specific model, such as /apple-15in-macbook/, and a bunch of related keywords, such as "Apple 15-inch MacBook" and "Apple MacBook Pro."

The system works out the best page for those terms and measures the existing content against a large number of known ranking signals and measures, covering everything from title tags and H1s to readability tests such as the Flesch-Kincaid system.
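For readability, the Flesch reading-ease formula is simple enough to sketch directly. The syllable counter below is a naive vowel-group heuristic for illustration; real tools count syllables more carefully.

```python
import re

def count_syllables(word):
    # Naive heuristic: count runs of consecutive vowels (minimum one).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    # Flesch formula: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / sentences)
            - 84.6 * (syllables / len(words)))

simple = "The cat sat on the mat. It was warm."
dense = ("Comprehensive optimization necessitates systematically "
         "evaluating multidimensional readability characteristics.")
print(flesch_reading_ease(simple) > flesch_reading_ease(dense))  # → True
```

Higher scores mean easier reading; plain, short-sentence copy scores far above jargon-heavy prose.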

This outputs a spreadsheet that scores each URL or even categories of URLs (to allow you to see how well-optimized the site is generally for a specific area of business, such as Apple laptops), enabling you to sort the data, discover the pages most in need of improvement, and identify where content gaps may exist.

In a nutshell, it'll provide:

  • What the most relevant target page for each keyword is
  • How well-optimized individual pages are for their target keywords
  • Where content gaps exist within the site’s functional content
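The first bullet, matching each keyword to its most relevant page, can be illustrated with a toy token-overlap scorer. This is a deliberately crude stand-in; the tool's real matching considers far more signals.

```python
def overlap_score(url, keyword):
    # Count how many keyword tokens appear in the URL slug.
    url_tokens = set(url.strip("/").replace("-", " ").split())
    return len(url_tokens & set(keyword.lower().split()))

urls = ["/apple-15in-macbook/", "/apple-imac/", "/dell-xps-13/"]
keywords = ["apple macbook pro", "dell xps"]

# Pick the highest-overlap URL for each keyword.
best = {k: max(urls, key=lambda u: overlap_score(u, k)) for k in keywords}
print(best)
# → {'apple macbook pro': '/apple-15in-macbook/', 'dell xps': '/dell-xps-13/'}
```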

It also presents the top-level data in an actionable way. An example of the report landing page can be seen below (raw CSV downloads are also available — more on that in a moment).

You can see the overall page score and simple ways to improve it. This is for our "Digital PR" keyword:

The output

As we've already covered in the manual process example, in addition to pulling the "content quality scores" for each URL, you can also take the data to the next level by adding in other data sources to the mix.

The standard CSV download includes data such as keyword, URL, and scores for the key elements (such as H1, meta, canonical use and static content quality).

This level of detail makes it possible to create a priority order for fixes based on lowest-scoring pages easily enough, but there are ways you can supercharge this process even more.

The first thing to do is run a simple rankings check using your favorite rank tracker for those keywords and add them into a new column in your CSV. It'll look a little like this (I've added some basic styling for clarity):

I also try to group keywords by adding a third column using a handful of grouped terms. In this example, you can see I'm grouping car model keywords with brand terms manually.

Below, you'll see how we can then group these terms together in an averaged cluster table to give us a better understanding of where the keyword volume might be from a car brand perspective. I've blurred the keyword grouping column here to protect existing client strategy data.

As you can see from the snapshot above, we now have a spreadsheet with keyword, keyword group, search volume, URL, rank, and the overall content score pulled in from the base Excel sheet we have worked through. From this, we can do some clever chart visualization to help us understand the data.
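The clustering step reduces to a group-by and average. Everything in this sketch (keywords, groups, volumes, ranks, and scores) is invented to show the mechanics.

```python
from collections import defaultdict

# Invented rows: keyword, manually assigned group, volume, rank, content score.
rows = [
    ("subaru impreza deals",  "Subaru", 720,  28, 47),
    ("subaru outback review", "Subaru", 480,  36, 53),
    ("ford focus deals",      "Ford",   1600, 18, 61),
    ("ford fiesta review",    "Ford",   880,  22, 55),
]

groups = defaultdict(list)
for keyword, group, volume, rank, score in rows:
    groups[group].append((volume, rank, score))

# Total volume plus average rank and score per keyword group.
for group, vals in groups.items():
    total_volume = sum(v for v, _, _ in vals)
    avg_rank = sum(r for _, r, _ in vals) / len(vals)
    avg_score = sum(s for _, _, s in vals) / len(vals)
    print(group, total_volume, avg_rank, avg_score)
# → Subaru 1200 32.0 50.0
# → Ford 2480 20.0 58.0
```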

Visualizing the numbers

To really understand where the opportunity lies and to take this process past a simple I’ll-work-on-the-worst-pages-first approach, we need to bring it to life.

This means turning our table into a chart. We'll utilize the chart functionality within Excel itself.

Here's an example of the corresponding chart for the table shown above, showing performance by category and ranking correlation. We're using dummy data here, but you can look at the overall optimization score for each car brand section alongside how well they rank (the purple line is average rank for that category):

If we focus on the chart above, we can begin to see a pattern between those categories that are better optimized and generally have better rankings. Correlation does not always equal causation, as we know, but it's useful information.

Take the very first column, the Subaru category. We can see that it's one of the better-optimized categories (at 49%), and average rank is at 34.1. Now, these are hardly record-breaking positions, but it does point towards the value of well-worked static pages.

Making the categories as granular as possible can be very valuable here, as you can quickly build up a focused picture of where to put your effort to move the needle quickly. The process for doing so is an entirely subjective one, often based on your knowledge of your industry or your site information architecture.

Add keyword volume data into the mix and you know exactly where to build your static content creation to-do list.

Adding in context

Like any data set, however, it requires a level of benchmarking and context to give you the fullest picture possible before you commit time and effort to the content improvement process.

It’s for this reason that I always look to run the same process on key competitors, too. An example of the resulting comparison charts can be seen below.

The process is relatively straightforward: take an average of all the individual URL content scores, which will give you a "whole domain" score. Add competitors by repeating the process for their domain.
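That benchmarking calculation is just a mean of the per-URL scores, repeated per domain; the URLs and scores here are invented.

```python
def domain_score(url_scores):
    # "Whole domain" score: the average of individual URL content scores.
    return round(sum(url_scores.values()) / len(url_scores), 1)

ours = {"/digital-pr/": 66, "/content-marketing-agency/": 41, "/blog/": 58}
competitor = {"/services/pr/": 72, "/services/content/": 64}

print(domain_score(ours), domain_score(competitor))  # → 55.0 68.0
```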

You can take a more granular view manually by following the same process for the grouped keywords and tabulating the result. Below, we can see how our domain sizes up against those same two competitors for all nine of our example keyword groups, such as the car brands example we looked at earlier.

With that benchmark data in place, you can move on to the proactive improvement part of the process.

The perfect page structure

Having identified your priority pages, the next step is to ensure you edit (or create) them in the right way to maximize impact.

Whereas a few years ago it was all about creating a few paragraphs almost solely for the sake of helping Google understand the page, now we MUST be focused on usability and improving the experience for the right visitor.

This means adding value to the page. To do that, you need to stand back and really focus in on the visitor: how they get to the page and what they expect from it.

This will almost always involve what I call "making the visitor smarter": creating content that ensures they make better and more informed buying decisions.

To do that requires a structured approach to delivering key information succinctly and in a way that enhances — rather than hinders — the user journey.

The best way of working through what that should look like is to share a few examples of those doing it well:

1. Tredz Top 5 Reviews

Tredz is a UK cycling ecommerce business. They do a great job of understanding what their audience is looking for and ensuring they're set up to make them smarter. The "Top 5" pages are certainly not classic landing pages, but they're brilliant examples of how you can sell and add value at the same time.

Below is the page for the "Top 5 hybrids for under £500." You can clearly see how the URL (http://www.tredz.co.uk/top-5-hybrids-under-500), meta, H tags, and body copy all support this focus and are consistently aligned:

2. Read it for me

This is a really cool business concept and they also do great landing pages. You get three clear reasons to try them out — presented clearly and utilizing several different content types — all in one package.

3. On Stride Financial

Finance may not be where you'd expect to see amazing landing pages, but this is a great example. Not only is it an easy-to-use experience, it answers all the user's key questions succinctly, starting with "What is an installment loan?" It's also structured in a way to capture Knowledge Graph opportunity — something we'll come to shortly.

Outside of examples like these and supporting content, you should be aiming to create impactful headlines, testimonials (where appropriate), directional cues (so it's clear where to "go next"), and high-quality images to reflect the quality of your product or services.

Claiming Knowledge Graph

There is, of course, one final reason to work hard on your static pages. That reason? To claim a massively important piece of digital real estate: Google Featured Snippets.

Snippets form part of the wider Knowledge Graph, the tangible visualization of Google’s semantic search knowledge base that's designed to better understand the associations and entities behind words, phrases, and descriptions of things.

The Knowledge Graph comes in a multitude of formats, but one of the most valuable (and attainable from a commercial perspective) is the Featured Snippet, which sits at the top of the organic SERP. An example can be seen below from a search for "How do I register to vote" in google.co.uk:

In recent months, Zazzle Media has done a lot of work on landing page design to capture featured snippets with some interesting findings, most notably the level of extra traffic such a position can achieve.

Having now measured dozens of these snippets, we see an average of 15–20% extra traffic from them versus a traditional position 1. That’s a definite bonus, and makes the task of claiming them extremely worthwhile.

You don’t have to be first

The best news? You don’t even have to be in first position to be considered for a snippet. Our own research shows that almost 75% of the examples we track have been claimed by pages ranked between 2nd and 10th position. It's far from robust enough yet for us to formalize a full report, but early indications across more than 900 claimed snippets (heavily weighted to the finance sector at present) support these findings.

Similar research by search data specialists STAT has also supported this theory, revealing that objective terms are more likely to trigger a featured snippet. General question and definition words (like "does," "cause," and "definition") as well as financial words (like "salary," "average," and "cost") are likely triggers. Conversely, the word "best" triggered zero featured snippets in over 20,000 instances.

This suggests that writing in a factual way is more likely to help you claim featured results.

Measuring what you already have

Before you run into this two-footed, you must first audit what you may (or may not) already have. If you run a larger site, you may already have claimed a few snippets by chance, and with any major project it's important to benchmark before you begin.

Luckily, there are a handful of tools out there to help you discover what you already rank for. My favorite is SEMrush.

The paid-for tool makes it easy to find out if you rank for any featured snippets already. I'd suggest using it to benchmark and then measure the effect of any optimization and content reworking you do as a result of the auditing process.

Claiming Featured Snippets

Claiming your own Featured Snippet then requires a focus on content structure and on answering key questions in a logical order. This also means paying close attention to on-page HTML structure to ensure that Google can easily and cleanly pick out specific answers.

Let’s look at a few examples showing that Google can pick up different types of content for different types of questions.

1. The list

One of the most prevalent examples of Featured Snippets is the list.

As you can see, Media Temple has claimed this incredibly visual piece of real estate simply by creating an article with a well-structured, step-by-step guide to answer the question:

"How do I set up an email account on my iPhone?"

If we look at how the page is formatted, we can see that the URL matches the search almost exactly, while the H1 tag serves to reinforce the relevance still further.

As we scroll down we find a user-friendly approach to the content, with short sentences and paragraphs broken up succinctly into sections.

This allows Google to quickly understand relevance and extract the most useful information to present in search; in this case, the step-by-step how-to process to complete the task.
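To see why clean structure matters, here's a standard-library sketch that pulls the step list out of some hypothetical how-to markup. The markup and the extractor class are illustrative only; real search-engine extraction is far more sophisticated.

```python
from html.parser import HTMLParser

# Hypothetical, cleanly structured how-to markup.
PAGE = """
<h1>How do I set up an email account on my iPhone?</h1>
<ol>
  <li>Open the Settings app.</li>
  <li>Tap Mail, then Accounts.</li>
  <li>Tap Add Account and choose your provider.</li>
</ol>
"""

class StepExtractor(HTMLParser):
    """Collect the text of each <li> in document order."""
    def __init__(self):
        super().__init__()
        self.in_li = False
        self.steps = []

    def handle_starttag(self, tag, attrs):
        if tag == "li":
            self.in_li = True
            self.steps.append("")

    def handle_endtag(self, tag):
        if tag == "li":
            self.in_li = False

    def handle_data(self, data):
        if self.in_li:
            self.steps[-1] += data.strip()

parser = StepExtractor()
parser.feed(PAGE)
print(parser.steps)
# → ['Open the Settings app.', 'Tap Mail, then Accounts.', 'Tap Add Account and choose your provider.']
```

Because each step lives in its own list item, a parser (or a crawler) can lift the whole sequence without guessing at sentence boundaries.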

Here are the first few paragraphs of the article, highlighting key structural elements. Below this is the list itself that's captured in the above Featured Snippet:

2. The table

Google LOVES to present tables; clearly there's something about the logical nature of how the data is presented that resonates with its team of left-brained engineers!

In the example below, we see a site listing countries by size. Historically, this page may well not have ranked so highly (it isn’t usually the page in position one that claims the snippet result). Because of the way it has structured the information, however, Geohive will be enjoying a sizable spike in traffic to the page.

The page itself looks like this — clear, concise and well-structured:

3. The definition

The final example is the description, or definition snippet; it's possibly the hardest to claim consistently.

It's difficult for two key reasons:

  • There will be lots of competition both for the space and for answering the search query in prose format.
  • It requires a focus on HTML structure and brilliantly crafted content to win.

In the example below, we can see a very good example of how you should be structuring content pages.

We start with a perfect URL (/what-is-a-mortgage-broker/) and this follows through to the H1 (What is a Mortgage Broker). The author then cleverly uses subheadings to extend the rest of the post into a thorough piece on the subject area. Subheadings cover the key How, What, Where, and When areas of focus that any good journalism tutor will lecture you on using in any good article or story. Examples include:
  • So how does this whole mortgage broker thing work?
  • Mortgage brokers can shop the rate for you
  • Mortgage brokers are your loan guide
  • Mortgage broker FAQ

The result is a piece that leaves no stone unturned. Because of this, it's been shared plenty of times — a surefire signal that the article is positively viewed by readers.

Featured Snippet Cheatsheet

Not being one to leave you alone to figure this out, though, I have created this simple Featured Snippet Cheatsheet, designed to take the guesswork out of creating pages worthy of being selected for the Knowledge Graph.

Do it today!

Thanks for making it this far. My one hope is for you to go off and put this plan into action for your own site. Doing so will quickly transform your approach to both landing pages and to your ongoing content creation plan (but that’s a post for another day!).

And if you do have a go, remember to use the free COAT tool and guides associated with this article to make the process as simple as possible.

Content Optimization and Auditing Tool: Click to access


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!



from
http://tracking.feedpress.it/link/9375/3754336

Wednesday, June 29, 2016

SearchCap: Shopping campaigns, common SEO mistakes & a product search survey

Below is what happened in search today, as reported on Search Engine Land and from other places across the web. From Search Engine Land: SEO? SEM? SMX East has you covered. Jun 29, 2016 by Search Engine Land One thing is certain: profound changes are coming to your profession, whether you’re an SEO or SEM. […]




from
http://searchengineland.com/searchcap-shopping-campaigns-common-seo-mistakes-product-search-survey-253045

SEO? SEM? SMX East has you covered.

One thing is certain: profound changes are coming to your profession, whether you’re an SEO or SEM. This year’s SMX East agenda was created to ensure that you’d succeed in spite of the dramatic developments in organic search marketing and paid search advertising. Get the details on this year’s agenda. […]




from
http://searchengineland.com/seo-sem-smx-east-covered-253042

Shopping campaigns: Play like every day is a holiday

What's ahead for shopping ads this holiday season and beyond? Columnist Alexander Paluch recaps a session from SMX Advanced focusing on what search marketers need to know.




from
http://searchengineland.com/shopping-ads-buy-buttons-social-commerce-remarketing-smx-advanced-recap-252598

The definitive SEO audit part 2 of 3: Content and on-site

In part two of his three-part series on conducting a thorough SEO audit, columnist Dave Davies explains how to ensure that you have the right content optimized the right way.




from
http://searchengineland.com/definitive-seo-audit-part-2-3-content-site-252492

5 more super-common SEO mistakes content marketers make

In a follow-up to last month's column, Stephan Spencer addresses some more common but avoidable SEO problems for content marketers to be aware of.




from
http://searchengineland.com/5-super-common-seo-mistakes-content-marketers-make-2-252901

AMP: Above and beyond

Want to know more about Accelerated Mobile Pages (AMP)? Columnist Max Prin recaps a session from SMX Advanced 2016 featuring John Shehata of Condé Nast and Google's Rudy Galfi, product manager for the AMP Project.




from
http://searchengineland.com/smx-advanced-session-recap-amp-beyond-252780

The Balanced Digital Scorecard: A Simpler Way to Evaluate Prospects

Posted by EmilySmith

[Estimated read time: 10 minutes]

As anyone who's contributed to business development at an agency knows, it can be challenging to establish exactly what a given prospect needs. What projects, services, or campaigns would actually move the needle for this organization? While some clients come to an agency with specific requests, others are looking for guidance — help establishing where to focus resources. This can be especially difficult, as answering these questions often requires large amounts of information to be analyzed in a small period of time.

To address the challenge of evaluating prospective clients and prioritizing proposed work, we’ve developed the Balanced Digital Scorecard framework. This post is the first in a two-part series. Today, we'll look at:

  • Why we developed this framework,
  • Where the concept came from, and
  • Specific areas to review when evaluating prospects

Part two will cover how to use the inputs from the evaluation process to prioritize proposed work — stay tuned!

Evaluating potential clients

Working with new clients, establishing what strategies will be most impactful to their goals... this is what makes working at an agency awesome. But it can also be some of the most challenging work. Contributing to business development and pitching prospects tends to amplify this with time constraints and limited access to internal data. While some clients have a clear idea of the work they want help with, this doesn’t always equal the most impactful work from a consultant's standpoint. Balancing these needs and wants takes experience and skill, but can be made easier with the right framework.

The use of a framework in this setting helps narrow down the questions you need to answer and the areas to investigate. This is crucial to working smarter, not harder — words which we at Distilled take very seriously. Often when putting together proposals and pitches, consultants must quickly establish the past and present status of a site from many different perspectives.

  • What type of business is this and what are their overall goals?
  • What purpose does the site serve and how does it align with these goals?
  • What campaigns have they run and were they successful?
  • What does the internal team look like and how efficiently can they get things done?
  • What is the experience of the user when they arrive on the site?

The list goes on and on, often becoming a vast amount of information that, if not digested and organized, can make putting the right pitch together burdensome.

To help our consultants understand both what questions to ask and how they fit together, we've adapted the Balanced Scorecard framework to meet our needs. But before I talk more about our version, I want to briefly touch on the original framework to make sure we’re all on the same page.


The Balanced Scorecard

For anyone not familiar with this concept, the Balanced Scorecard was created by Robert Kaplan and David Norton in 1992. First published in the Harvard Business Review, it was Kaplan and Norton's attempt to create a management system, as opposed to a measurement system (which was more common at that time).

Kaplan and Norton argued that "the traditional financial performance measures worked well for the industrial era, but they are out of step with the skills and competencies companies are trying to master today." They felt the information age would require a different approach, one that guided and evaluated the journey companies undertook. This would allow them to better create "future value through investment in customers, suppliers, employees, processes, technology, and innovation."

The concept suggests that businesses be viewed through four distinct perspectives:

  • Innovation and learning – Can we continue to improve and create value?
  • Internal business – What must we excel at?
  • Customer – How do customers see us?
  • Financial – How do we look to shareholders?

Narrowing the focus to these four perspectives reduces information overload. “Companies rarely suffer from having too few measures,” wrote Kaplan and Norton. “More commonly, they keep adding new measures whenever an employee or a consultant makes a worthwhile suggestion.” By limiting the perspectives and associated measurements, management is forced to focus on only the most critical areas of the business.

The image below shows how the four perspectives relate to one another:

[Image: Balanced Scorecard diagram linking the four perspectives]

And now, with it filled out as an example:

[Image: example Balanced Scorecard with goals and measures filled in]

As you can see, this gives the company clear goals and corresponding measurements.

Kaplan and Norton found that companies solely driven by financial goals and departments were unable to implement the scorecard, because it required all teams and departments to work toward central visions — which often weren’t financial goals.

“The balanced scorecard, on the other hand, is well suited to the kind of organization many companies are trying to become... put[ting] strategy and vision, not control, at the center,” wrote Kaplan and Norton. This would inevitably bring teams together, helping management understand the connectivity within the organization. Ultimately, they felt that “this understanding can help managers transcend traditional notions about functional barriers and ultimately lead to improved decision-making and problem-solving.”

At this point, you’re probably wondering why this framework matters to a digital marketing consultant. While it's more directly suited for evaluating companies from the inside, so much of this approach is really about breaking down the evaluation process into meaningful metrics with forward-looking goals. And this happens to be very similar to evaluating prospects.

Our digital version

As I mentioned before, evaluating prospective clients can be a very challenging task. It’s crucial to limit the areas of investigation during this process to avoid getting lost in the weeds, instead focusing only on the most critical data points.

Since our framework is built for evaluating clients in the digital world, we have appropriately named it the Balanced Digital Scorecard. Our scorecard likewise views the client through a set of main perspectives, five in our case:

  1. Platform – Does their platform support publishing, discovery, and discoverability from a technical standpoint?
  2. Content – Are they publishing content that strikes an appropriate blend of effective, informative, entertaining, and compelling?
  3. Audience – Are they building visibility through owned, earned, and paid media?
  4. Conversions – Do they have a deep understanding of the needs of the market, and are they creating assets, resources, and journeys that drive profitable customer action?
  5. Measurement – Are they measuring all relevant aspects of their approach and their prospects’ activities to enable testing, improvement, and appropriate investment?

These perspectives make up the five areas of analysis to work through when evaluating most prospective clients.
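To make the five perspectives concrete, here's a minimal sketch of recording a score per perspective and surfacing the weakest areas to prioritize. This is my own illustration, not Distilled's internal tooling; the 0-10 scale and the threshold of 5 are assumptions:

```python
# Illustrative only: score each Balanced Digital Scorecard perspective
# on a 0-10 scale and return the ones lagging behind, weakest first.
# (Scale and threshold are assumptions, not part of the framework itself.)

PERSPECTIVES = ["platform", "content", "audience", "conversions", "measurement"]

def weakest_perspectives(scores, threshold=5):
    """Return (name, score) pairs below threshold, sorted weakest first."""
    missing = set(PERSPECTIVES) - set(scores)
    if missing:
        raise ValueError(f"missing scores for: {sorted(missing)}")
    weak = [(name, scores[name]) for name in PERSPECTIVES if scores[name] < threshold]
    return sorted(weak, key=lambda pair: pair[1])

prospect = {"platform": 7, "content": 4, "audience": 6, "conversions": 3, "measurement": 5}
# weakest_perspectives(prospect) -> [("conversions", 3), ("content", 4)]
```

The output feeds directly into the prioritization step covered in part two: the lowest-scoring perspectives are where proposed work is likely to move the needle most.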

1. Platform

Most consultants or SEO experts have a good understanding of the technical elements to review in a standard site audit. A great list of these can be found on our Technical Audit Checklist, created by my fellow Distiller, Ben Estes. The goal of reviewing these factors is of course to “ensure site implementation won’t hurt rankings” says Ben. While you should definitely evaluate these elements (at a high level), there is more to look into when using this framework.

Evaluating a prospect’s platform does include standard technical SEO factors but also more internal questions, like:

  • How effective and/or differentiated is their CMS?
  • How easy is it for them to publish content?
  • How differentiated are their template levels?
  • What elements are under the control of each team?

Additionally, you should look into areas like social sharing, overall mobile-friendliness, and site speed.

If you’re thinking this seems like quite the undertaking, because technical audits take time and some prospects won’t be open about platform constraints, you’re right (to an extent). Take a high-level approach and look for massive weaknesses instead of every single limitation. This will give you enough information to understand where to prioritize this perspective in the pitch.
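As one example of what a high-level, weakness-spotting pass might look like, here's a sketch that parses a single page's HTML and flags a few obvious gaps. The specific checks (title tag, canonical link, viewport meta for mobile-friendliness) are my assumptions for illustration, not the full Technical Audit Checklist:

```python
# Illustrative sketch of a high-level platform check: parse one page's
# HTML and flag obvious gaps. Not a substitute for a real technical audit.
from html.parser import HTMLParser

class PlatformCheck(HTMLParser):
    def __init__(self):
        super().__init__()
        self.has_title = False
        self.has_canonical = False
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.has_title = True
        elif tag == "link" and attrs.get("rel") == "canonical":
            self.has_canonical = True
        elif tag == "meta" and attrs.get("name") == "viewport":
            self.has_viewport = True

def weaknesses(html):
    """Return a list of human-readable gaps found in the page's <head>."""
    checker = PlatformCheck()
    checker.feed(html)
    issues = []
    if not checker.has_title:
        issues.append("missing <title>")
    if not checker.has_canonical:
        issues.append("missing canonical link")
    if not checker.has_viewport:
        issues.append("missing viewport meta (mobile-friendliness)")
    return issues
```

Run against a handful of representative templates, a pass like this surfaces the "massive weaknesses" quickly without committing to a full crawl during the pitch.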

2. Content

As with the technical section, evaluating content resembles a lightweight version of a full content audit. What content do they have, which pieces are awesome, and what is missing? Also look to competitors to understand who is creating content in the space and where the bar is set.

Beyond looking at these elements through a search lens, aim to understand what content is being shared and why. Is this taking place largely on social channels, or are publications picking these pieces up? Evaluating content on multiple levels helps to understand what they've created in the past and their audience’s response to it.

3. Audience

Looking into a prospect’s audience can be challenging, depending on how much access they grant you during the pitch process. If you’re able to get access to analytics, this task is much easier; without it, there are many tools you can leverage to get some of the same insights.

In this section, you’re looking at the traffic the site is receiving and from where. Are they building visibility through owned, earned, and paid media outlets? How effective are those efforts? Look at metrics like Search Visibility from SearchMetrics, social reach, and email stats.

A large amount of this research will depend on what information is available or accessible to you. As with previous perspectives, you're just aiming to spot large weaknesses.

4. Conversions

Increased conversions are often a main goal stated by prospects, but without transparency from them, this can be very difficult to evaluate during a pitch. This means you’re often left to speculate or use basic approaches. How difficult or simple is it to buy something, contact them, or complete a conversion in general? Are there good calls to action for micro-conversions, such as joining an email list? How different is the mobile experience of this process?

Look at the path to these conversions. Is there a clear funnel, and does it make sense from a user’s perspective? Understanding the journey a user takes (which you can generally experience first-hand) can tell you a lot about expected conversion metrics.

Lastly, many companies’ financials are available to the public and offer a general idea of how the company is doing. If you can establish how much of their business takes place online, you can start to speculate about the success of their web presence.

5. Measurement

Evaluating a prospect’s measurement capabilities is (not surprisingly) vastly more accurate with analytics access. If you’re granted access, evaluate each platform not just for validity but also accessibility. Are there useful dashboards, management data, or other data sources that teams can use to monitor and make decisions?

Without access, you’re left to simply check for the presence of analytics and a data layer. While this doesn’t tell you much, you can often deduce from conversations how much data figures into the internal team’s thought process. If people are monitoring, engaging with, and interested in analytics data, changes and prioritization might be an easier undertaking.


Final thoughts

Working with prospective clients is something all agency consultants will have to do at some point in their career. This process is incredibly interesting — it challenges you to leverage a variety of skills and a range of knowledge to evaluate new clients and industries. It's also a daunting task. Often your position outside the organization or unfamiliarity with a given industry can make it difficult to know where to start.

Frameworks like the original Balanced Scorecard created by Kaplan and Norton were designed to help a business evaluate itself from a more modern and holistic perspective. This approach turns the focus to future goals and action, not just evaluation of the past.

This notion is crucial at an agency needing to establish the best path forward for prospective clients. We developed our own framework, the Balanced Digital Scorecard, to help our consultants do just that. By limiting the questions you’re looking to answer, you can work smarter and focus your attention on five perspectives to evaluate a given client. Once you've reviewed these, you’re able to identify which ones are lagging behind and prioritize proposed work accordingly.

Next time, we’ll cover the second part: how to use the Balanced Digital Scorecard to prioritize your work.

If you use a framework to evaluate prospects or have thoughts on the Balanced Digital Scorecard, I’d love to hear from you. I welcome any feedback and/or questions!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!



from
http://tracking.feedpress.it/link/9375/3744615

Tuesday, June 28, 2016

Survey: Amazon beats Google as starting point for product search

Thirty-eight percent of shoppers start with Amazon; 35 percent start with Google.

The post Survey: Amazon beats Google as starting point for product search appeared first on Search Engine Land.



from
http://searchengineland.com/survey-amazon-beats-google-starting-point-product-search-252980

SearchCap: Google’s internet speed testing tool, Landy Awards call for entries & more

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Google’s internet speed testing tool, Landy Awards call for entries & more appeared first on Search Engine Land.



from
http://searchengineland.com/searchcap-google-internet-speed-tool-landy-awards-252967

SMX Advanced 2016 recap: Harnessing SEM analytics for smarter automation

Columnist Christi Olson provides tips and advice from PPC experts on how search marketers can make SEM automation work for them.

The post SMX Advanced 2016 recap: Harnessing SEM analytics for smarter automation appeared first on Search Engine Land.



from
http://searchengineland.com/smx-advanced-2016-recap-harnessing-sem-analytics-smarter-automation-252574

How “Do It With Me” marketing services can help SMBs access the latest marketing practices

Small and medium-sized businesses (SMBs) are the new frontier for digital services. The enterprise is armed with excellent digital marketing tools that need to be right-sized to SMB needs and budgets before a small business will enjoy the same digital advantages. This BIA/Kelsey report, sponsored by Vendasta, introduces an emerging online services category which will […]

The post How “Do It With Me” marketing services can help SMBs access the latest marketing practices appeared first on Search Engine Land.



from
http://searchengineland.com/marketing-services-can-help-smbs-access-latest-marketing-practices-252948

7 types of keywords to boost your SEO strategy

Wondering what kinds of keywords you should be targeting? Columnist Ryan Shelley shows how to categorize keywords using a personalized and industry-focused approach.

The post 7 types of keywords to boost your SEO strategy appeared first on Search Engine Land.



from
http://searchengineland.com/7-types-keywords-boost-seo-strategy-252249

Ask The SEOs at SMX Advanced

Unable to attend SMX Advanced this year? Columnist Andrew Shotland recaps the "Ask The SEOs" session for those who missed it.

The post Ask The SEOs at SMX Advanced appeared first on Search Engine Land.



from
http://searchengineland.com/ask-seos-smx-advanced-2-252563

Google testing new tool to check internet speeds directly in search results

Powered by Measurement Labs, Google's latest widget was surfaced via a search for "check internet speed."

The post Google testing new tool to check internet speeds directly in search results appeared first on Search Engine Land.



from
http://searchengineland.com/google-testing-internet-speed-tool-directly-search-results-252902

Why Google is mining local business attributes

What are business attributes, and why should local businesses care? Columnist Adam Dorfman explores.

The post Why Google is mining local business attributes appeared first on Search Engine Land.



from
http://searchengineland.com/google-mining-local-business-attributes-252283

Final call for entries: 2016 Landy Awards

Time is running out to enter the second annual Search Engine Land Awards! Entries close Thursday, June 30th at 11:59 PM PST. Complete your submissions today.

The post Final call for entries: 2016 Landy Awards appeared first on Search Engine Land.



from
http://searchengineland.com/final-call-entries-2016-landyawards-252801

Chinese government cracks down on search ads and “banned content”

New rules to protect public but tighten government control over search results.

The post Chinese government cracks down on search ads and “banned content” appeared first on Search Engine Land.



from
http://searchengineland.com/chinese-government-cracks-search-ads-banned-content-252876

How to Measure the True Performance of Your Content

We’ve all heard the stats: This year, marketers plan to produce 70 percent more content than they did in 2015. Content marketing has become the most significant part of the marketing mix and is showing no signs of slowing down. However, brands and agencies often still struggle to really define their ROI. One primary […]

The post How to Measure the True Performance of Your Content appeared first on Search Engine Land.



from
http://searchengineland.com/measure-true-performance-content-252795

Google opens Customer Match to Shopping Campaigns

Advertisers will be able to retarget customers with product listing ads.

The post Google opens Customer Match to Shopping Campaigns appeared first on Search Engine Land.



from
http://searchengineland.com/google-adwords-customer-match-shopping-campaigns-252873

10 Illustrations of How Fresh Content May Influence Google Rankings (Updated)

Monday, June 27, 2016

SearchCap: Google EU charges, Bing Ads changes & keyword planner bugs

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Google EU charges, Bing Ads changes & keyword planner bugs appeared first on Search Engine Land.



from
http://searchengineland.com/searchcap-google-eu-charges-bing-ads-changes-keyword-planner-bugs-252804

Bing Ads’ Lynne Kjolso talks growth after Yahoo, LinkedIn, voice search & quick wins

In a keynote discussion at SMX Advanced, Lynne Kjolso, General Manager, Global Search Sales and Service at Microsoft, talked about how Bing Ads is evolving for mobile and more.

The post Bing Ads’ Lynne Kjolso talks growth after Yahoo, LinkedIn, voice search & quick wins appeared first on Search Engine Land.



from
http://searchengineland.com/bing-ads-smx-advanced-keynote-lynne-kjolso-252691

Google clarifies Keyword Planner doesn’t require an active campaign

Google says a technical glitch is causing some users to receive error messages when trying to access Keyword Planner from AdWords.

The post Google clarifies Keyword Planner doesn’t require an active campaign appeared first on Search Engine Land.



from
http://searchengineland.com/google-clarifies-keyword-planner-doesnt-require-active-campaign-252767

The right way to get dynamic with Google AdWords

Want to create more personalized, more effective search ads? Columnist Todd Saunders discusses four dynamic ad varieties and how to make them work for you.

The post The right way to get dynamic with Google AdWords appeared first on Search Engine Land.



from
http://searchengineland.com/right-way-get-dynamic-google-adwords-252286

What’s the best attribution model For PPC?

Columnist Aaron Levy explores some common attribution models used by digital marketers. Which one is right for your business or client?

The post What’s the best attribution model For PPC? appeared first on Search Engine Land.



from
http://searchengineland.com/whats-best-attribution-model-ppc-252374

Google facing likelihood of third antitrust complaint in Europe

AdWords and search-box contracts the focus of potentially impending third Statement of Objections from European Commission.

The post Google facing likelihood of third antitrust complaint in Europe appeared first on Search Engine Land.



from
http://searchengineland.com/google-facing-likelihood-third-antitrust-complaint-europe-252746

“Enterprise Local Marketing Automation Platforms: A Marketer’s Guide” – Updated for 2016

Marketing Land and Digital Marketing Depot have published the 3rd edition of “Enterprise Local Marketing Automation Platforms: A Marketer’s Guide.” This MarTech Intelligence Report examines the market for local marketing automation software platforms and the considerations involved in implementation. This free, 50-page report reviews the growing market for local marketing automation platforms, plus the latest […]

The post “Enterprise Local Marketing Automation Platforms: A Marketer’s Guide” – Updated for 2016 appeared first on Search Engine Land.



from
http://searchengineland.com/enterprise-local-marketing-automation-platforms-marketers-guide-updated-2016-252657

Beware of shady link schemes from black-hat SEOs

Have you received an offer for a link that seems too good to be true? According to columnist Tony Edward, it probably is.

The post Beware of shady link schemes from black-hat SEOs appeared first on Search Engine Land.



from
http://searchengineland.com/beware-shady-link-building-black-hat-seos-252151

Google’s Sundeep Jain on the Expanded Text Ad roll out, device bidding, similar audiences & more

During a keynote discussion at SMX Advanced, Jain shared insights on how Expanded Text Ads will roll out and what advertisers should be working on ahead of the holidays.

The post Google’s Sundeep Jain on the Expanded Text Ad roll out, device bidding, similar audiences & more appeared first on Search Engine Land.



from
http://searchengineland.com/google-sundeep-jain-smx-keynote-252680

Is selling SEO services getting harder or are SEOs just not good at it?

In light of recent survey data from BrightLocal, columnist Myles Anderson shares tips for local search marketers looking to acquire new business.

The post Is selling SEO services getting harder or are SEOs just not good at it? appeared first on Search Engine Land.



from
http://searchengineland.com/selling-seo-services-getting-harder-seos-just-not-good-252279

A brief evolution of Search: out of the search box and into our lives

We live in a mobile-first, cloud-first world powered by technology that changes by the nanosecond. And search is no different. Search is changing in look, form and function to become part of the fabric of our everyday lives, barely recognizable from its inception as a text box. By 2020, 50 percent of searches will be […]

The post A brief evolution of Search: out of the search box and into our lives appeared first on Search Engine Land.



from
http://searchengineland.com/brief-evolution-search-search-box-lives-252622

Predicting Intent: What Unnatural Outbound Link Penalties Could Mean for the Future of SEO

Posted by Angular

This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author's views are entirely his or her own and may not reflect the views of Moz, Inc.

[Estimated read time: 8 minutes]

As SEOs, we often find ourselves facing new changes implemented by search engines that impact how our clients' websites perform in the SERPs. With each change, it's important that we look beyond its immediate impact and think about its future implications so that we can try to answer this question: "If I were Google, why would I do that?"

Recently, Google implemented a series of manual penalties that affected sites deemed to have unnatural outbound links. Webmasters of affected sites received messages like this in Google Search Console:

[Image: Google Search Console notice of an unnatural outbound links manual action]

Webmasters were notified in an email that Google had detected a pattern of "unnatural artificial, deceptive, or manipulative outbound links." The manual action itself described the link as being either "unnatural or irrelevant."

The responses from webmasters varied in their usual extreme fashion, with recommendations ranging from "do nothing" to "nofollow every outbound link on your site."

Google's John Mueller posted in product forums that you don't need to nofollow every link on your site, but you should focus on nofollowing links that point to a product, sales, or social media page as the result of an exchange.
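To illustrate the kind of review Mueller describes, here's a sketch, my own and not a Google or Moz tool, that lists the external links on a page that don't carry rel="nofollow", as candidates for a webmaster to examine:

```python
# Illustrative sketch: surface external links that pass equity
# (no rel="nofollow") so a webmaster can review whether any were
# placed as the result of an exchange.
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkAudit(HTMLParser):
    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.followed_external = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        host = urlparse(href).netloc
        rel = attrs.get("rel", "")
        # External host and no nofollow token in rel -> worth reviewing.
        if host and host != self.own_host and "nofollow" not in rel.split():
            self.followed_external.append(href)

def followed_external_links(html, own_host):
    auditor = LinkAudit(own_host)
    auditor.feed(html)
    return auditor.followed_external
```

Note this deliberately does not nofollow anything automatically; per Mueller's advice, the judgment about which links resulted from an exchange stays with the webmaster.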

Now, on to the fun part of being an SEO: looking at a problem and trying to reverse-engineer Google's intentions to decipher the implications this could have on our industry, clients, and strategy.

The intent of this post is not to decry those opinions that this was specifically focused on bloggers who placed dofollow links on product/business reviews, but to present a few ideas to incite discussion as to the potential big-picture strategy that could be at play here.

A few concepts that influenced my thought process are as follows:

  • Penguin has repeatedly missed its "launch date," which indicates that Google engineers don't feel it's accurate enough to release into the wild.
  • The growth of negative SEO makes it even more difficult for Google to identify/penalize sites for tactics that are not implemented on their own websites.
  • Penguin temporarily impacted link-building markets in a way Google would want. The decline reached its plateau in July 2015, as shown in this graph from Google Trends:
    [Image: Google Trends graph for "link building"]

If I were Google, I would expect webmasters impacted by the unnatural outbound links penalty to respond in one of these ways:

  1. Do nothing. The penalty is specifically stated to "discount the trust in links on your site." As a webmaster, do you really care if Google trusts the outbound links on your site or not? What about if the penalty does not impact your traffic, rankings, visibility, etc.? What incentive do you have to take any action? Even if you sell links, if the information is not publicly displayed, this does nothing to harm your link-selling business.
  2. Innocent site cleanup effort. A legitimate site that has not exchanged goods for links (or wants to pretend they haven't) would simply go through their site and remove any links that they feel may have triggered the issue and then maybe file a bland reconsideration request stating as much.
  3. Guilty site cleanup effort. A site that has participated in link schemes would know exactly which links are the offenders and remove them. Now, depending on the business owner, some might then file a reconsideration request saying, "I'm sorry, so-and-so paid me to do it, and I'll never do it again." Others may simply state, "Yes, we have identified the problem and corrected it."

In scenario No. 1, Google wins because this helps further the fear, uncertainty, and doubt (FUD) campaigns around link development. It is suddenly impossible to know if a site's outbound links have value because they may possibly have a penalty preventing them from passing value. So link building not only continues to carry the risk of creating a penalty on your site, but it suddenly becomes more obvious that you could exchange goods/money/services for a link that has no value despite its MozRank or any other external "ranking" metric.

In scenarios No. 2 and No. 3, Google wins because they can monitor the links that have been nofollowed/removed and add potential link scheme violators to training data.

In scenario No. 3, they may be able to procure evidence of sites participating in link schemes through admissions by webmasters who sold the links.

If I were Google, I would really love to have a control group of known sites participating in link schemes to further develop my machine-learned algorithm for detecting link profile manipulation. I would take the unnatural outbound link data from scenario No. 3 above and run those sites as a data set against Penguin to attempt 100% confidence, knowing that all those sites definitely participated in link schemes. Then I would tweak Penguin with this training dataset and issue manual actions against the linked sites.

This wouldn't be the first time SEOs have predicted a Google subtext of leveraging webmasters and their data to help them further develop their algorithms for link penalties. In 2012, the SEO industry was skeptical regarding the use of the disavow tool and whether or not Google was crowdsourcing webmasters for their spam team.


"Clearly there are link schemes that cannot be caught through the standard algorithm. That's one of the reasons why there are manual actions. It's within the realm of possibilities that disavow data can be used to confirm how well they're catching spam, as well as identifying spam they couldn't catch automatically. For example, when web publishers disavow sites that were not caught by the algorithm, this can suggest a new area for quality control to look into." — Roger Montti, Martinibuster.com


What objectives could the unnatural outbound links penalties accomplish?

  1. Legit webmasters could become more afraid to sell/place links because they get "penalized."
  2. Spammy webmasters could continue selling links from their penalized sites, which would add to the confusion and devaluation of link markets.
  3. Webmasters could become afraid to buy/exchange links because they could get scammed by penalized sites and be more likely to be outed by the legitimate sites.
  4. The Penguin algorithm could have increased confidence scoring and become ready for real-time implementation.

"There was a time when Google would devalue the PR of a site that was caught selling links. With that signal gone, and Google going after outbound links, it is now more difficult than ever to know whether a link acquired is really of value." — Russ Jones, Principal Search Scientist at Moz


Again, if I were Google, the next generation of Penguin would likely heavily weight irrelevantly placed links, not just commercial keyword-specific anchor text. Testing this first on the sites I think are guilty of providing the links, and simply devaluing those links, seems much smarter. Of course, at this point, there is no specific evidence to indicate that Google intended the unnatural outbound links penalties as a final testing phase for Penguin and a way to further devalue the manipulated link market. But if I were Google, that's exactly what I would be doing.



"Gone are the days of easily repeatable link building strategies. Acquiring links shouldn’t be easy, and Penguin will continue to change the search marketing landscape whether we like it or not. I, for one, welcome our artificially intelligent overlords. Future iterations of the Penguin algorithm will further solidify the “difficulty level” of link acquisition, making spam less popular and forcing businesses toward legitimate marketing strategies." — Tripp Hamilton, Product Manager at Removeem.com

Google's webmaster guidelines show that link schemes are interpreted by intent. I wonder what happens if I start nofollowing links from my site with the intent of devaluing a site's rankings? The intent is manipulation. Am I at risk of being considered a participant in link schemes? If I do link building as part of an SEO campaign, am I inherently conducting a link scheme?


So, since I'm an SEO, not Google, I have to ask myself and my colleagues, "What does this do to change or reinforce my SEO efforts?" I immediately think back to a Whiteboard Friday from a few years ago that discusses the Rules of Link Building.


"At its best, good link building is indistinguishable from good marketing." — Cyrus Shepard, former Content Astronaut at Moz



When asked what type of impact SEOs should expect from this, Garret French from Citation Labs shared:



"Clearly this new effort by Google will start to dry up the dofollow sponsored post, sponsored review marketplace. Watch for prices to drop over the next few months and then go back and test reviews with nofollowed links to see which ones actually drive converting traffic! If you can't stomach paying for nofollowed links then it's time to get creative and return to old-fashioned, story-driven blog PR. It doesn't scale well, but it works well for natural links."

In conclusion, as SEOs, we are responsible for predicting the future of our industry; we do not simply act in the present. Google does not wish for its results to be gamed, and it has departments full of data scientists dedicated to building algorithms to identify and devalue manipulative practices. If you are incapable of legitimately building links, then you must mimic legitimate links in all aspects (or consider a new career).

Takeaways

Most importantly, any links that we try to build must provide value. If a URL links to a landing page that is not contextually relevant to its source page, then this irrelevant link is likely to be flagged and devalued. Remember, Google can do topical analysis, too.

In link cleanup mode or Penguin recovery, we've typically treated links as obviously unnatural when they use a commercial keyword (e.g. "insurance quotes") as anchor text, because natural links tend to use the URL, brand, or navigational labels instead. It's also safe to assume that natural links tend to occur in content about the destination they point to, so link relevance should be considered as well.

Finally, we should continue to identify and present clients with methods for naturally building authority by providing value in what they offer and working to build real relationships and brand advocates.

What are your thoughts? Do you agree? Disagree?


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!



from
http://tracking.feedpress.it/link/9375/3725716

Friday, June 24, 2016

SearchCap: Google My Business API, challenges faced by agencies & more

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Google My Business API, challenges faced by agencies & more appeared first on Search Engine Land.



from
http://searchengineland.com/searchcap-google-business-api-challenges-faced-agencies-252600

Bridging the gaps in the Google My Business API

Each update to the Google My Business API makes life easier for those of us managing locations at scale, but columnist Brian Smith notes that gaps in the automation process have yet to be addressed.

The post Bridging the gaps in the Google My Business API appeared first on Search Engine Land.



from
http://searchengineland.com/bridging-gaps-google-business-api-251977

Search in Pics: Google Father’s Day gift, Gary Illyes’s iPhone & Google Jedi

In this week’s Search In Pictures, here are the latest images culled from the web, showing what people eat at the search engine companies, how they play, who they meet, where they speak, what toys they have and more. Gary Illyes iPhone: Source: SER Google dog: Source: Google+ Google Jedi Selfie: Source: Google+ Google Father’s […]

The post Search in Pics: Google Father’s Day gift, Gary Illyes’s iPhone & Google Jedi appeared first on Search Engine Land.



from
http://searchengineland.com/search-pics-google-fathers-day-gift-gary-illyes-iphone-google-jedi-252578

Long Tail SEO: When & How to Target Low-Volume Keywords - Whiteboard Friday

Posted by randfish

The long tail of search can be a mysterious place to explore, often lacking the volume data that we usually rely on to guide us. But the keyword phrases you can uncover there are worth their weight in gold, often driving highly valuable traffic to your site. In this edition of Whiteboard Friday, Rand delves into core strategies you can use to make long tail keywords work in your favor, from niche-specific SEO to a bigger content strategy that catches many long tail searches in its net.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're going to chat about long tail SEO.

Now, for those of you who might not be familiar, there's basically a demand curve in the search engine world. Lots and lots of searchers are searching for very popular keywords in the NBA world like "NBA finals." Then we have a smaller number of folks who are searching for "basketball hoops," but it's still pretty substantial, right? Probably hundreds to thousands per month. Then maybe there are only a few dozen searches a month for something like "Miami Heat box ticket prices."

Then we get into the very long tail, where there are one, two, maybe three searches a month, or maybe not even. Maybe it's only a few searches per year for something like "retro Super Sonics customizable jersey Seattle."

Now, it's pretty tough to do keyword research anywhere in this long tail region. The long tail region is almost a mystery to us because the search engines themselves don't get enough volume to show these terms in a tool like AdWords or in Bing's research. Even Search Suggest or related searches will often not surface these kinds of terms and phrases. They just don't get enough volume. But for many businesses, and yours may be one of them, these keywords are actually quite valuable.

2 ways to think about long tail keyword targeting

#1: I think that there's this small set of hyper-targeted, specific keyword terms and phrases that are very high value to my business. I know they're not searched for very much, maybe only a couple of times a month, maybe not even that. But when they are, if I can drive that search traffic to my website, it's hugely valuable to me, and therefore it's worth pursuing a handful of these. A handful could be half a dozen, or it could be in the small hundreds of terms that you decide are worth going after even though they have a very small number of searches. Remember, if we were to build 50 landing pages targeting terms that only get one or two searches a month, we still might get a hundred or a couple hundred searches every year coming to our site that are super valuable to the business. So when we're doing this hyper-specific targeting, these terms in general need to be...

  • Conversion-likely, meaning that we know we're going to convert those searchers into buyers if we can get them or searchers into whatever we need them to do.
  • They should be very low competition, because not a lot of people know about these keywords. There's not a bunch of sites targeting them already. There are no keyword research tools out there that are showing this data.
  • It should be a relatively small number of terms that we're targeting. Like I said, maybe a few dozen, maybe a couple hundred, generally not more than that.
  • We're going to try and build specifically optimized pages to turn those searchers into customers or to serve them in whatever way we need.

#2: The second way is to have a large-scale sort of blast approach, where we're less targeted with our content, but we're covering a very wide range of keyword targets. This is what a lot of user-generated content sites, large blogs, and large content sites are doing with their work. Maybe they're doing some specific keyword targeting, but they're also kind of trying to reach this broad group of long tail keywords that might be in their niche. It tends to be the case that there's...

  • A ton of content being produced.
  • It's less conversion-focused in general, because we don't know the intent of all these searchers, particularly on the long tail terms.
  • We are going to be targeting a large number of terms here.
  • There are no specific keyword targets available. So, in general, we're focused more on the content itself and less on the specificity of that keyword targeting.

Niche + specific long tail SEO

Now, let's start with the niche and specific. The way I'm going to think about this is I might want to build these pages — my retro Super Sonics jerseys that are customizable — with my:

  • Standard on-page SEO best practices.
  • I'm going to do my smart internal linking.
  • I really don't need very many external links. One or two will probably do it. In fact, a lot of times, when it comes to long tail, you can rank with no external links at all, internal links only.
  • Quality content investment is still essential. I need to make sure that this page gets indexed by Google, and it has to do a great job of converting visitors. So it's got to serve the searcher intent. It can't look like automated content, it can't look low quality, and it certainly can't dissuade visitors from coming, because then I've wasted all the investment that I've made getting that searcher to my page. Especially since there are so few of them, I better make sure this page does a great job.

A) PPC is a great way to go. You can do a broad-term PPC buy in AdWords or in Bing, and then discover these hyper-specific opportunities. So if I'm buying keywords like "customizable jerseys," I might see that, sure, most of them are for teams and sports that I've heard of, but there might be some that come to me that are very, very long tail. This is actually a reason why you might want to do those broad PPC buys for discovery purposes, even if the ROI isn't paying off inside your AdWords campaign. You look and you go, "Hey, it doesn't pay to do this broad buy, but every week we're discovering new keywords for our long tail targeting that does make it worthwhile." That can be something to pay attention to.
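The discovery workflow described above can be sketched as a simple filter over a search terms report. This is a minimal illustration, not a definitive implementation; the report rows here are hypothetical sample data, where in practice they would be exported from the AdWords or Bing search terms report.

```python
# Sketch: mining a broad-match search query report for long tail discoveries.
# The (query, clicks) rows are hypothetical placeholders for a real export.

def discover_long_tail(query_report, min_words=4):
    """Surface the long, specific queries that triggered the broad buy."""
    return sorted(
        query for query, clicks in query_report
        if len(query.split()) >= min_words  # longer phrases = long tail
    )

report = [
    ("customizable jerseys", 120),
    ("custom nba jerseys", 45),
    ("retro super sonics customizable jersey seattle", 1),
]

print(discover_long_tail(report))
```

Each surfaced query then becomes a candidate for its own specifically optimized landing page, even if the broad buy itself isn't paying off.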

B) You can use some keyword research tools, just not AdWords itself, because AdWords is biased toward showing you more commercial terms, and toward terms and phrases that actually have search volume. What you want to do is find keyword research tools that can show you keywords with zero searches, no search volume at all. So you could use something like Moz's Keyword Explorer. You could use KeywordTool.io. You could use Übersuggest. You could use some of the keyword research tools from the other providers out there, like a Searchmetrics or what have you. Across all of these tools, what you want to find are those 0–10 searches keywords, because those are going to be the ones that have very, very little volume but are potentially super high-value for your specific website or business.
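Finding that 0–10 searches band is just a volume filter over a keyword export. A minimal sketch, assuming a hypothetical list of (keyword, monthly volume) pairs such as you might get from a CSV export of one of the tools above:

```python
# Sketch: filtering a keyword export down to near-zero-volume long tail terms.
# The sample data below is made up for illustration.

def long_tail_candidates(keywords, max_volume=10):
    """Keep only keywords in the 0-10 monthly-searches band."""
    return [kw for kw, volume in keywords if volume <= max_volume]

sample = [
    ("nba finals", 550000),
    ("basketball hoops", 22000),
    ("miami heat box ticket prices", 40),
    ("retro super sonics customizable jersey seattle", 2),
]

print(long_tail_candidates(sample))
```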

C) Be aware that the keyword difficulty scores may not actually be that useful in these cases. Keyword difficulty scores — this is true for Moz's keyword difficulty score and for all the other tools that do keyword difficulty — what they tend to do is they look at a search result and then they say, "How many links or how high is the domain authority and page authority or all the link metrics that point to these 10 pages?" The problem is in a set where there are very few people doing very specific keyword targeting, you could have powerful pages that are not actually optimized at all for these keywords that aren't really relevant, and therefore it might be much easier than it looks like from a keyword difficulty score to rank for those pages. So my advice is to look at the keyword targeting to spot that opportunity. If you see that none of the 10 pages actually includes all the keywords, or only one of them seems to actually serve the searcher intent for these long tail keywords, you've probably found yourself a great long tail SEO opportunity.
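The "look at the keyword targeting" check can be made concrete with a rough heuristic. This is a sketch under assumptions: the SERP titles are hypothetical, and a real check would also look at on-page text pulled from a rank-tracking tool, not titles alone.

```python
# Sketch: a rough check of whether the current top results actually target
# a long tail phrase. Titles below are hypothetical examples.

def targeting_gap(keyword, top_titles):
    """Count how many of the top results include every word of the phrase."""
    words = keyword.lower().split()
    return sum(
        1 for title in top_titles
        if all(word in title.lower() for word in words)
    )

serp_titles = [
    "Seattle SuperSonics - Wikipedia",
    "NBA jerseys on sale",
    "Customizable retro jerseys for every Seattle team",
]

matches = targeting_gap("retro customizable jersey seattle", serp_titles)
if matches <= 1:
    print("Likely a long tail opportunity despite the difficulty score")
```

If few or none of the ranking pages include all the words, the difficulty score is probably overstating how hard the term is to win.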

Large-scale, untargeted long tail SEO

This is very, very different in approach. It's going to be for a different kind of website, different application. We are not targeting specific terms and phrases that we've identified. We're instead saying, "You know what? We want to have a big content strategy to own all types of long tail searches in a particular niche." That could be educational content. It could be discussion content. It could be product content, where you're supporting user-generated content, those kinds of things.

  • I want to bias toward the uniqueness of the content itself and real searcher value, which means I do need content that is useful to searchers, useful to real people. It can't be completely auto-generated.
  • I'm worrying less about the particular keyword targeting. I know that I don't know which terms and phrases I'm going to be going after. So instead, I'm biasing to other things, like usefulness, amount of uniqueness of content, the quality of it, the value that it provides, the engagement metrics that I can look at in my analytics, all that kind of stuff.
  • You want to be careful here. Anytime you're doing broad-scale content creation or enabling content creation on a platform, you've got to keep low-value, low-uniqueness content pages out of Google's index. That can be done in two ways. One, you limit the system so that a certain amount of content is required before a page can even be published. Or you look at the quantity of content that's being created, or the engagement metrics from your analytics, and you block — via robots.txt or via the meta robots tag — any of the pages that look like low-value, low-uniqueness content.
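The second option, blocking thin pages via the meta robots tag, can be sketched as a simple rule in the templating layer. The thresholds and the page fields here are assumptions for illustration; tune them against your own analytics.

```python
# Sketch: deciding a page's meta robots tag from content and engagement
# signals, to keep thin pages out of the index. Thresholds are placeholders.

def robots_meta(word_count, monthly_engaged_visits,
                min_words=150, min_visits=5):
    """Return a meta robots tag: noindex thin or low-engagement pages."""
    if word_count < min_words or monthly_engaged_visits < min_visits:
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'

# A 40-word page with one engaged visit a month gets noindexed.
print(robots_meta(word_count=40, monthly_engaged_visits=1))
```

Using `noindex, follow` rather than robots.txt lets Google still crawl through the page to discover links, while keeping the thin page itself out of the index.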

A) This approach requires a lot of scalability, and so you need something like a:

  • Discussion forum
  • Q&A-style content
  • User-posted product, service, or business listings. Think of something like an Etsy or a GitHub or a Moz Q&A, or discussion forums like Reddit. These all support user-generated content.
  • You can also go with non-UGC if it's editorially created. Something like a frequently updated blog or news content can also work, particularly if you have enough of a staff to create that content on a regular basis so that you're pumping out good stuff consistently. It's generally not as scalable, but you have to worry less about the uniqueness and quality of the content.

B) You don't want to fully automate this system. The worst thing you can possibly do is to take a site that has been doing well, pump out hundreds, thousands, or tens of thousands of low-quality, low-uniqueness pages, and throw them up on the site; Google can hit you with something like the Panda penalty, which has happened to a lot of sites that we've seen over the years. Google continues to iterate and refine Panda, so be very cautious. You need some human curation in order to make sure the uniqueness of content and value remain above the level you need.

C) If you're going to be doing this large-scale content creation, I highly advise you to make the content management system or the UGC submission system work in your favor. Make it do some of that hard SEO legwork for you, things like...

  • Nudge users to give more descriptive, more useful content when they're creating it for you.
  • Require some minimum level of content in order to even be able to post it.
  • Use spam-detection software to catch and evaluate content before it goes into your system. If it has lots of links, or if it contains poison keywords or spam keywords, kick it out.
  • Encourage and reward high-quality contributions. If you see users or content consistently doing well in your engagement metrics, go find out who those users are and reward them. Promote that content and push it to higher visibility. You want to make this a system that rewards the best stuff and keeps the bad stuff out. A great UGC content management system can do this for you if you build it right.
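The submission-side checks above (minimum content, link limits, poison-keyword screening) can be sketched as one gate function. The thresholds and the word list are placeholders, not recommendations.

```python
# Sketch: a UGC submission gate combining a minimum-length check, a link
# limit, and a poison-keyword screen. All thresholds are example values.
import re

POISON_WORDS = {"viagra", "casino", "payday"}  # example spam terms only
MIN_WORDS = 50
MAX_LINKS = 2

def accept_submission(text):
    """Return True if a post passes the basic quality and spam checks."""
    words = text.lower().split()
    if len(words) < MIN_WORDS:
        return False  # too thin to be worth indexing
    if len(re.findall(r"https?://", text)) > MAX_LINKS:
        return False  # link-stuffed submissions are likely spam
    if POISON_WORDS & set(words):
        return False  # contains a known spam keyword
    return True
```

In a real CMS this gate would run before publication, with rejected posts routed to a moderation queue rather than silently dropped.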

All right, everyone, look forward to your thoughts on long tail SEO, and we'll see you again next week for another edition of Whiteboard Friday. Take care.


Video transcription by Speechpad.com





from
http://tracking.feedpress.it/link/9375/3700107