Partner Bootcamp

SEO Site Audit Training: Close Your Next SEO Client

For your convenience, we transcribed our Partner Bootcamp – SEO Site Audit Training: Close Your Next SEO Client

Read the summary and key takeaways or watch any of our previous webinars.

Talk to our Project Managers

Webinar Transcript

[Downloadable versions of this transcript, the presentation deck, and other materials are also available in the Resource Center].

Bernard:

Good morning and welcome. Thanks for joining us again for the second Partner Bootcamp, and this time, we’re talking about SEO Site Audit Training: Close Your Next SEO Client.

[slide: Meet the Expert]

As usual, it’s me; I’m Bernard. I’m an 18-year web professional. I’ve started four companies and taken three web-based companies to multimillion-dollar status. I’ve launched over 300 websites, and we literally do over a thousand marketing campaigns every month. That’s my email address, short and sweet, and that’s where you guys can find me on LinkedIn.

I’m doing the Partner Bootcamp series; you know, the only sad thing about the Partner Bootcamp series is that it’s a one-man show. I miss having a partner in this conversation. But I’m not going to dwell on the negative. Most of these conversations are positive. This is about unearthing opportunities. That’s really what the majority of this conversation is about.

[slide: What Happened During the Previous Webinar]

Now, before I get started, I want to go back to what we talked about previously. So, previously on Partner Bootcamp, we talked about simplifying SEO by breaking down over 200 core metrics and then grouping them into six metric groups. And we talked about, you know, the significance of these six core metric groups in today’s SEO, and the importance of having search engine metrics in groups because you don’t really work with each metric individually in order to achieve significant results.

SEO Reseller’s six core metric groups, which are authority, on-page, off-page, behavior, quality, and trust, together cover over a hundred search metrics. We work with metrics in groups, and when we make recommendations, we always base them on which groups can be leveraged today to achieve the short- and midterm goals of the SEO campaign. And then we talked about SEO best practices for leveraging each metric.

[slide: Upcoming Product]

Now, before I jump into the meat of our conversation, it’s shameless plug time. This is the part where I wish I had a song playing in the background: “Shameless plug!” But this plug is about the best way to maximize conversions, and what we’re releasing is our integrated search and social retargeting solutions. What these try to do for you is maximize your conversions of high-intent visitors from your website and social media pages. They’re designed to follow your customers so that you’re there when they’re ready to buy, and then—ideally—bring them back to your website as referral visits and convert them.

[slide: Discussion Overview]

Let’s move on to why you guys are really here. This webinar, we’re going to be talking about how to effectively use each part of your SEO site audit to win new clients, and then of course recommend the right solutions and begin a successful SEO campaign. And that really is the gist of this conversation. It’s about starting your campaign on the right foot.

I think everybody that’s listening to this webinar already understands, or at least I hope we’ve mentioned it enough, the benefits of SEO, right? An ROI factor of 22 per dollar spent. That’s pretty impressive. And, you know, perpetual free traffic if you do it right. So, in this webinar, we want you to be able to go through a site audit with an existing or potential client and explain what the important findings are and what they mean to your clients, in the simplest terms, terms that they can understand. Remember that if you don’t translate your activity into the business goals that your clients have, you might as well be pitching to a brick wall.

Now, pro tip: don’t try to maven your way through an SEO site audit meeting. Don’t try to maven your way through it. Most people that try to prematurely become experts wind up painting themselves into a corner, and that’s not what you want to do.

Now, in the office, we know when someone is ready to be a senior SEO specialist or analyst based on whether they’re able to do this intuitively: keyword selection, checking accessibility and the indexing status of a website, the offpage and onpage penalties and warnings. A senior person can look at a website and intuitively do this in about five minutes flat. The less-experienced ones, we put a time limit on: specifically, 35 minutes. They’re not supposed to take longer than that to go through all these metrics. And you know, with practice, we’ve got a hundred percent batting average at getting them to become intuitive about doing these.

Now, of course, at the end of the conversation we’re going to run a Q&A session, but don’t wait until the end; feel free to use the chat box on the upper right side of the screen to drop your questions as soon as you think of them.

[slide: What is a Site Audit?]

Now, let’s talk about a site audit. What is it really? A site audit is simply a process of identifying elements that prevent websites from ranking. So in short, they’re problems: problem areas that your potential client’s website may currently have. Why bring this up as a first conversation, right? One tactic in sales is to be the solution to your client’s problem. As the saying goes, “How you see the problem is the problem.” That means every time you find a problem, you have a choice: you can look at it as a problem, or you can see it as an opportunity. And we are talking about, you know, unearthing opportunities.

What you’re seeing on the right is a sample of the automated site audit that you can generate yourself from within the dashboard, for free. So when do we do site audits? We do site audits before a campaign. Essentially, you generate the report that we’re seeing on the right before you even ask us to do an assessment of the site. This is something that’s very self-service. You guys can execute this yourselves. If you’re fairly technical, you’ll understand the value of the page you’re looking at on the right.

Now, if you’re not technical and you’ve pitched to a client, or if it’s the first time you’re working with us, feel free to give us a call so your project manager can walk you through what the automated audit does. It’s actually not rocket science, but it will take a bit of practice.

Now under the assumption you’ve done your pitch, you did a good job, you followed everything that we talked about on the Boost Your Business webinars, we execute site audits, or what we call pre-intake audits, before a campaign starts.

Now why would we do that? Because we want to know that you and we, as partners, are going to a battle that we can win. We want to make sure that the client’s goals are realistic, that there’s something that we can achieve, that the goals of the business are met. Because we don’t like taking money from people when we don’t achieve the results that they want. So before a campaign starts, an audit is done.

Upon the intake of the campaign (because when a campaign starts, that’s usually when we gain access to analytics, webmaster tools, and all the other credentials of the client) we review them because they provide more insight. The Search Console is your oracle into how Google sees your website. It’s usually not possible to see that data until the campaign starts.

Now, another thing that we don’t tell you—but we do this out of habit—is, at the fifth month of an SEO campaign, we assess its performance and we look for other opportunities. Just because you’ve achieved an amount of success—meaning you did successfully manage to rank 60 to 80 percent of the keywords on the first page—that doesn’t mean that a.) those keywords can’t be pushed further up above the fold, positions 1 to 3, or in the snack pack; or b.) that there aren’t any keywords that you can push further.

Now, the other time this is done is when the client requests it, asking, “Are there other opportunities?” Or you may have surprising wins: what if the campaign ranks in a hundred days or less, which actually does happen? The client can then say, “I want more bang for my buck; let’s try to target more keywords,” which is sensible. And when they make this request, all you have to do is talk to your project manager.

[slide: Purpose of a Site Audit]

Now, why do a site audit? Because in SEO, all problems are opportunities, right? And a site audit spots opportunities and recommends the right solutions. It also tells you, like I mentioned earlier: is this a battle you can win? Is this a client that you want to build a relationship with? Is this a website you can drive free traffic to? And then of course it gives you the ability to establish expertise, which is a sales tactic. One of the most popular sales tactics is to prove expertise.

One of the things that usually gets ignored when you try to maven your way, or prematurely expert your way, through a conversation is this: the most important thing a site audit does is allow you to set expectations. When a pre-intake audit is done, once you guys agree on keywords and we see the content of the website, its authority, its digital footprint, an experienced SEO can roughly say how quickly they can win. And in our case, we will actually tell you: it looks like a three-, six-, or 12-month battle before the keywords we want hit the first page. And these conversations happen pretty frequently.

Let’s start with “Starting off on the right foot.”

[slide: Keyword Selection]

Now, again, before I continue, feel free to send your questions through the chat box on the upper right side and I’ll do my best to answer them during the Q&A portion of the conversation.

Let’s talk about Keyword Selection.

This is the make or break of the campaign. Seventy percent of the success of the campaign, and sometimes even 100 percent of the success of the campaign will depend on the keyword selection.

So what’s keyword selection? This is just about you, the project manager, and the client determining which keywords define the success of the campaign. As an analogy: AdWords has conversions, and in AdWords, a conversion is simply whatever you measure success against. In SEO, keywords determine whether you’re successful or not. After all, what we try to do is rank 80% of campaigns and at least 60% of keywords on the first page. If you’re not picking the right keywords, you may not drive the keywords you need onto the first page. But more important than that, what if the keywords selected are not the ones users execute a query against, right? So the yardstick of the campaign will be measured against the keyword selection.

Now, there are two portions to the keywords. The first one is keyword research, and it’s easy-peasy. It’s very easy to execute keyword research, and there are a lot of tools, by the way, that can help you do this. Google is your oracle here, and if you’ve got Search Console access, even more so.

Inside the Search Console are metrics called impressions. And impressions are simply the keywords that were queried where Google gave your site as one of the results of the search. That’s what an impression is. It’s the number of times Google presented your site’s meta title and meta description to users that were executing a query. What does this tell you? This tells you that Google has a contextual understanding of the content of your site and the term that was queried.
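To make that concrete, here’s a minimal sketch of how you might put impressions data to work. This is not the Search Console API; the query rows below are made-up sample data standing in for an export. The idea is that queries with plenty of impressions but a low click-through rate are terms Google already associates with the site, i.e., ranking opportunities.

```python
# Illustrative sketch, not the Search Console API: given query rows
# exported from Search Console, surface queries with plenty of
# impressions but a low click-through rate. The rows are made up.
rows = [
    ("gibson guitars", 1200, 18),   # (query, impressions, clicks)
    ("electric guitars", 640, 52),
    ("guitar strings", 90, 4),
]

def underclicked(rows, min_impressions=100, max_ctr=0.05):
    """Return (query, impressions, ctr) rows worth optimizing for."""
    out = []
    for query, impressions, clicks in rows:
        ctr = clicks / impressions
        if impressions >= min_impressions and ctr < max_ctr:
            out.append((query, impressions, round(ctr, 3)))
    return out

print(underclicked(rows))  # [('gibson guitars', 1200, 0.015)]
```

With the sample rows, only “gibson guitars” is surfaced: 1,200 impressions but a 1.5% click-through rate.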

Keyword research is easy. You can use the AdWords Keyword Planner, Moz’s Keyword Explorer, and so on and so forth.

Now, the responsibility for keyword selection, though, will not lie solely with your SEO. The SEO can tell you whether they can win the keyword battle in a three- to six-month period, but this is one of the most collaborative parts of the campaign. It is a responsibility shared among the project manager, the SEO, you, and the client. Because everyone has to agree that these are the yardsticks that drive success for this campaign.

Now, bear in mind, most SEOs, they’ll want to choose keywords that they can rank for. Why? Because SEOs get a kick out of ranking keywords; that’s what we do. We like ranking stuff; it’s in our nature. And clients will want keywords that correspond to tactical or strategic goals. Now what are tactical or strategic goals? If I’m going to take this from an agency perspective, strategic goals are goals that drive conversions, goals that drive visits—or, you know, achieving goals.

Tactical goals are what we refer to as vanity goals. And vanity goals are when the client says, “My competitor is number 2 for that keyword. I want to be number 1.” And it doesn’t matter how much traffic that search or query will drive; they want it. That’s an example of a tactical goal or a vanity goal. However, one of the things that everyone in the conversation will agree with is that strategic keywords always take priority. The keywords that drive traffic always have to take priority. And then it’s a negotiation over how many strategic and how many tactical keywords you will have inside the campaign.

[slide: Keyword Selection: What You Need to Know]

Now, what do you need to know? When executing keyword research, having access to Google Webmaster Tools or the Search Console creates a tremendous advantage. Google hides nothing from you. They’ll tell you that these are the queries where Google gave your website as a result. More than that, Google also tells you what it thinks of your website, what it sees in your website. It tells you if it can find all pages, can’t find all the pages, is giving the wrong page as a search result for a specific query, and so on and so forth.

Another tool that can be used is the AdWords Keyword Planner. This is a staple because most clients like to pick keywords that have high search volumes. But one of the things I always remind people is to take data from the Keyword Planner with a grain of salt. Remember that it’s an AdWords tool and its purpose for being there is to entice people to bid on more and more keywords. That isn’t necessarily your SEO goal. Your SEO goal is to primarily drive free, relevant traffic to the site.

Now, also, in several instances I’ve personally seen keywords in the Keyword Planner where Google says the search is less than a hundred every month, and yet the client website received over a thousand impressions for the same query. When you have to weigh the accuracy of the number of queries between the Keyword Planner and the Search Console, trust the Search Console.

Now, when executing keyword selection, it’s very important that you understand the client’s business. What service do they offer? What Unique Selling Proposition do they have? What kind of queries would a customer type in order to look for those services? So, you sort of have to wear a couple of hats in this situation. You have to put yourself in the shoes of the entrepreneur, walk a mile in their shoes and understand what their business means to them and what they perceive to be relevant users, relevant customers, potential leads.

But you also have to put yourself in the shoes of a search engine user: what query would they type? If you were a customer looking for the same services, what would you have searched for in Google in order to find relevant producers of that service or product? And then set expectations with your client in case they want keywords that can’t be won. This is why our packages have built-in keyword and search volume limits. I’ll discuss the keyword limits with you on the next slide.

[slide: Keyword Limits – Organic SEO Packages]

This is a list of our packages, and these are the limitations in terms of exact match searches. Let me try to explain what you’re looking at. The Silver, Gold, Platinum, Platinum Plus—these are the levels of the packages inside the store. If you guys have been to our store and have taken a look at the services in Organic SEO, this is what you would have seen.

The number of keywords was selected because we try to limit the number of keywords we optimize per page to between three and five. Therefore, ten keywords make only two contextual keyword groups. And there’s a limit of about 4,000 exact searches across all ten keywords, or 2,000 exact searches per keyword group, that is, 2,000 per five keywords. Then you just scale that up: a Gold has 20 keywords, therefore four keyword groups, and 8,000 exact matches or exact searches.

Now, why put these limitations in? Because when we built our methodology, we wanted to offer a service that worked most of the time. I don’t believe that there’s an SEO that works 100 percent of the time. The beauty of working in SEO and why I’ve worked in it so long is no two days are the same. A campaign that you thought you were doing great on will surprise you three months later. And then you have to figure it out again. It’s like an endless cycle of a detective game.

Now, the keyword limitations are there not because we want to hit only low-hanging fruit and multiple little grails. They’re there because adhering to these limitations allows us to rank 80% of the keywords with 60% of them on the first page, 80% of the time. That’s why they’re there. Now, mind you, it could be one or the other: we could hit either the total number of keywords or the total monthly search volumes. But the amount of work placed in every package is meant to rank that much search volume.

I hope that made sense, and if it didn’t, feel free to use the chat box so we can discuss it later inside the conversation.

[slide: Accessibility and Indexability]

Let’s talk about the next metric. This is Accessibility and Indexability.

Now, for those of you that are not technical, that are getting into this fairly new, something you need to know: when people conduct a search in Google, Google does not give you results from the World Wide Web. Google gives you results from Google. Therefore, Google is able to give you a search result in 0.36 seconds because it didn’t check the World Wide Web for your results; it checked its mainframe.

Now, check whether a website that you’re targeting or a website that you’re pitching has crawled and indexed pages. What do I mean? When we say indexed and when we say crawled, crawling is just a function of a search engine bot and it goes through your site. When the bot makes a copy of your website inside the search engine’s mainframe, the site is said to be indexed. When you conduct a search, again, you’re not searching the Web; you’re searching Google’s index, or it could be Yahoo!’s index, Bing’s index—whichever search engine it is. But essentially, the data that’s presented to you is the data from a copy of the web, and not data directly from the web.

Now, why is this important?

[slide: How to Spot Site Indexing Issues or Ranking Opportunities]

Because you can’t rank a website that does not exist in the index. Google only ranks things that are present in the index. Therefore, if Google has no copy of your page inside the index, the page cannot rank.

Now, how do you spot site indexing issues or ranking opportunities? First, execute a site search. When you execute a site search, what Google will immediately tell you right at the top of the search is, “201 pages found.” When you do a site: search, that is essentially you telling the search engine, “Give me all relevant results only within this domain or only within this site.” And an example is the one below, site:www.domain.com.

Now, another thing that you can do—and this one we do when a client has already selected keywords—is a site: search and then add the keyword. So we do site:www.domain.com “exact match”.

Doing that means we are able to find out whether the keyword is actually mentioned within the domain. I’ll try to simplify what I just said. The keyword you want to rank for: is it in the content of your site? This is the best way to find out, and that’s check number two. And then the last one is, of course, checking for crawl errors. Do any of the target pages return a 404? Can we even find the page? Can users find the page? Can the search engine find the page? The easiest way to find all crawl errors is to check with Google Webmaster Tools.
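The two search operators described above are easy to mistype, so here is a tiny helper that builds them. The domain and keyword are placeholders, not real targets.

```python
# Tiny helper that builds the two Google search operators described
# above. The domain and keyword are placeholders, not real targets.
def site_query(domain, keyword=None):
    """Build a site: query, optionally with an exact-match keyword."""
    query = f"site:{domain}"
    if keyword:
        query += f' "{keyword}"'
    return query

print(site_query("www.domain.com"))
print(site_query("www.guitars.com", "Gibson guitars"))
```

The first call prints the plain index check, site:www.domain.com; the second prints the keyword-presence check, site:www.guitars.com "Gibson guitars".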

Now, the other reason we need Google Webmaster Tools is the best way to identify whether a page is a great target for SEO, is whether the domain vouches for that page as well. Remember, Google tells you how many internal links you receive from within your own domain. Let’s say the average page receives 40 internal links. You want to target something that’s pretty close to receiving 40 internal links versus one that receives one, two, three links site-wide. If you’re not vouching for that page, why should Google?

Now, let’s continue with Accessibility and Indexability. Spotting site indexing issues also involves looking at the XML sitemap. And an XML sitemap is simple: it’s a file that lists all the pages on your site that should be accessible.
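As an illustration, here is how you might list the URLs in an XML sitemap using only Python’s standard library; the sitemap content is a made-up two-page sample, not a real site.

```python
# Illustration: list the URLs in an XML sitemap with the standard
# library alone. The sitemap content is a made-up two-page sample.
import xml.etree.ElementTree as ET

SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.domain.com/</loc></url>
  <url><loc>https://www.domain.com/services</loc></url>
</urlset>"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SITEMAP)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
print(urls)
```

Comparing a list like this against what actually turns up in a site: search is a quick way to see what should be indexed but isn’t.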

The robots.txt, what is this? The robots.txt is the Robots Exclusion Protocol file, and it is simply a set of command lines that tells search engine crawlers—“bots”—whether they are allowed access to specific pages, specific folders, and whatnot. It tells them what to make copies of and what not to make copies of, and it’s pretty much how you keep your secure pages secure: you command the bots not to make copies of secure pages, singular instances of page loads, login pages, and so on and so forth.

Now, just a little technical tidbit, or just a bit of trivia: if you’re fairly technical and you’re used to looking at robots.txt files, I always get surprised by the fact that people don’t seem to understand what happens when there are conflicting commands inside robots.txt files.

I’ll simplify that. In the event there are two conflicting rules, Google follows the most specific rule, the one with the longest matching path, and if the conflicting rules are equally specific, the least restrictive one wins. And that’s just a little SEO trivia.
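Rather than eyeballing the file, you can sanity-check robots.txt rules with Python’s standard library parser. The rules below are a made-up sample that keeps bots out of the secure and login areas of a hypothetical domain.

```python
# Instead of eyeballing the file, sanity-check robots.txt rules with
# Python's standard library. The rules are a made-up sample that
# keeps bots out of the secure and login areas.
import urllib.robotparser

RULES = """User-agent: *
Disallow: /secure/
Disallow: /login
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(RULES.splitlines())

print(rp.can_fetch("*", "https://www.domain.com/secure/account"))  # False
print(rp.can_fetch("*", "https://www.domain.com/services"))        # True
```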

Now, moving on. Orphan pages; what are these? Orphan pages are typically PPC landing pages or AdWords landing pages. They’re pages that can’t be followed from other pages in your client’s website. They’re typically not indexed because crawler bots find your pages by following a link from one page in order to find another page. So unless they’re linked externally or there’s a backlink pointing to them or whatnot, orphan pages typically cannot be found by navigating through your site. And they’re typically set up for PPC landing pages.

Now, note: don’t try to target orphan pages. They receive no authority; they receive no power. They inherit no authority from the entire domain. And so it’s a bad idea.
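A quick way to surface orphan-page candidates is a set difference between the sitemap and the pages a link-following crawl actually reached. Both sets below are made-up stand-ins for real crawl output.

```python
# Spotting orphan-page candidates with a set difference: pages the
# sitemap lists but the link-following crawl never reached. Both
# sets are made-up stand-ins for real crawl output.
sitemap_pages = {"/", "/services", "/about", "/ppc-landing-page"}
internally_linked = {"/", "/services", "/about"}

orphans = sitemap_pages - internally_linked
print(orphans)  # {'/ppc-landing-page'}
```

As expected, the only candidate is the PPC landing page that nothing links to.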

[slide: On-Page]

So, unless you guys have questions—and feel free to use the chat box—I’ll move on to the next element in the site audit. After we’ve looked at the keyword selection and agreed that it’s viable, and we’ve taken a look at the indexing of the site and agreed that there are enough quality pages in the index for us to rank for the desired keywords, the SEO then proceeds to look at the onpage. And the onpage is simple: it’s the content of the site.

And there are two types of onpage elements: the first one is the content, and this is your text, your images, your videos, your rich media—all of that is content. Even the music playing in the background is content. And then the next one is your meta content. Content is still THE single most powerful ranking factor. It is THE single most powerful ranking factor. Now, you need to find the optimum amount of content for a page to rank, and that depends on the kind of keyword. For non-commercial keywords, you don’t really need a lot of words in terms of creating onpage content.

I remember a couple of months back, our CEO, Itamar, ran a case study and published it on our blog. What we did was check the top three ranking pages for informational keywords. And what we found was that Google tends to rank informational pages that contain 1,600 words or more. Now, for commercial pages, you don’t need anything bonkers like 1,600 words. Remember, informational pages, or hub content, are meant to educate users, and they are typically commercially neutral. Commercial pages, on the other hand, have a purpose, and that primary purpose isn’t to educate; it is to build desire. And you need enough content to justify your case.

When it comes to targeting commercial keywords, what we recommend is about 300-500 words. Now, we’re not saying 300-500 words because most SEOs say 300-500 words. We say so because that’s enough content to match the purpose of that keyword. And then of course, for transactional pages, you don’t need 300 words for that. Fifty words, less than a hundred words, are sufficient for you to build enough desire to get through a shopping cart, a conversion, and whatnot.
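The word-count guidance above can be turned into a rough checker. The bands here are just this discussion’s figures (1,600-plus for informational pages, about 300 to 500 for commercial pages, under 100 for transactional pages), not hard rules.

```python
# Rough checker for the word-count guidance above: 1,600+ words for
# informational pages, about 300-500 for commercial pages, under 100
# for transactional pages. The bands are this talk's figures only.
TARGETS = {
    "informational": (1600, None),  # (minimum, maximum or None)
    "commercial": (300, 500),
    "transactional": (0, 100),
}

def meets_target(text, page_type):
    words = len(text.split())
    low, high = TARGETS[page_type]
    return words >= low and (high is None or words <= high)

sample = "Gibson guitars deliver classic tone. " * 80  # 400 words
print(meets_target(sample, "commercial"))     # True
print(meets_target(sample, "transactional"))  # False
```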

Now, let’s talk about proper keyword targeting. The keyword must be present on the page in order for you to rank for that term. Let me give you guys an example. Let’s say there’s a website called guitars.com and it wanted to rank for Gibson guitars. This is the keyword they want to rank for: Gibson guitars. But they wanted to rank for “Gibson guitars” with the URL that says guitars.com/electric-guitars. Now, take note, the word Gibson does not appear in the URL. What the SEO will then proceed to do is type in site:guitars.com “Gibson guitars”. And if no pages are found, that means the URL we just cited does not have that term present. In order to rank for Gibson guitars, content about Gibson guitars must be present within the guitars.com/electric-guitars page. And if it isn’t, you can’t rank for that, right?

Now, put yourself in Google’s shoes here. Your users are looking for Gibson guitars. What you want to do, in order to keep them using your search engine, is offer them a page that adds value to that query. Users want to find information about Gibson guitars. If the words Gibson guitars don’t exist on the page, then there is a very low likelihood that content relevant to Gibson guitars exists on the page. Content is still the single most powerful ranking factor.
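A barebones version of that presence check looks like this: strip the tags from a page and look for the phrase in the visible text. The HTML snippet is a stand-in for a fetched page on the hypothetical guitars.com.

```python
# Barebones check that a keyword appears in a page's visible text,
# using only the standard library. The HTML snippet is a stand-in
# for a fetched page on the hypothetical guitars.com.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the text nodes of an HTML document."""
    def __init__(self):
        super().__init__()
        self.chunks = []
    def handle_data(self, data):
        self.chunks.append(data)

def contains_keyword(html, keyword):
    parser = TextExtractor()
    parser.feed(html)
    return keyword.lower() in " ".join(parser.chunks).lower()

page = "<h1>Electric Guitars</h1><p>Browse our Gibson guitars range.</p>"
print(contains_keyword(page, "Gibson guitars"))  # True
```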

[slide: Meta Content]

Now, what else do you need to know? You need to check for the meta content: the title tags, the meta description, meta keywords (not powerful anymore), and the alt tags. And then we also check for the headers. Why do we check for the headers? Because, like we talked about in the previous Partner Bootcamp, the core of the algorithm is still the Latent Dirichlet Allocation, which is an algorithm that says, when words appear to the top and to the left, they are more powerful. And where does the H1 tag exist? Usually, to the top and to the left, therefore making it one of the most important pieces of content in your page.

[slide: Meta Content: What You Need to Know]

Now, what do you need to know in terms of links? When it comes to creating external links, we follow a rule of thumb. Now feel free to agree or disagree, because this is the portion where we get into SEO opinion. Whenever we check a page that we want to rank for SEO, we try to limit the number of external links coming from that specific page. And for us the critical value is about 40. If we can limit it to even less than that, wonderful.

Now, when I say external links, I mean external links that are dofollow. For those of you guys that are technical, dofollow is the default status of a link. And a nofollow is essentially a link that does not pass pagerank or does not pass authority. Why do we look for pages with limited external links? Because whenever you pass a link onto an external domain, you shed some of the authority of that page. The less authority it has, the less likelihood of ranking. So you want to try to consolidate as much authority as you can have inside your page.
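As a sketch of that external-link check, the snippet below counts dofollow links that point off the domain, so a page over the roughly-40 rule of thumb can be flagged. The page markup and domain are made-up examples.

```python
# Sketch of the external-link check: count dofollow links that point
# off the domain, so pages over the ~40 rule of thumb can be flagged.
# The page snippet and domain are made-up examples.
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.external_dofollow = 0
    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        host = urlparse(attrs.get("href", "")).netloc
        nofollow = "nofollow" in (attrs.get("rel") or "")
        # Count only links that leave the domain and pass authority.
        if host and host != self.own_domain and not nofollow:
            self.external_dofollow += 1

page = ('<a href="/services">internal</a>'
        '<a href="https://other.com/a">external dofollow</a>'
        '<a href="https://other.com/b" rel="nofollow">external nofollow</a>')
counter = LinkCounter("www.domain.com")
counter.feed(page)
print(counter.external_dofollow)  # 1
```

Only the plain external link is counted; the internal link and the nofollowed one are ignored, matching the distinction drawn above.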

Now, what about internal linking—links to other pages inside your own website? Like I mentioned earlier, inside the Search Console you’re able to see how many internal links a page receives from within your own domain. If you don’t vouch for a specific page inside your own domain, why should Google vouch for that page for you, right?

So let’s go back to the Gibson guitars example. If we were on guitars.com/electric-guitars, and the links for Gibson guitars point to another page, then the likelihood of the electric guitars page ranking for the term Gibson guitars becomes less likely. Because you yourself, in your own domain, don’t vouch that that is the most important page for Gibson guitars.

Now, one more thing, and this is a fairly new trick that not a lot of people execute. Let’s talk about site architecture and siloing. Siloing is the process of organizing your pages so that your target pages receive contextual support from the pages below them and inherit authority from the pages above them.

What do I mean? If I say Bat Signal, Gotham City, Barbara Gordon, Robin, what is the primary topic? Most of you guys are probably saying Batman, and if you’re not, I’m very worried. But most of you should be saying Batman. Now, why do we know that Batman is the core term? Because we’re able to create contextual relationships between the terms I just mentioned. The search engines have become just as sophisticated. If any of you guys have been keeping tabs on SEO, you’ve probably heard of the term LSI, or Latent Semantic Indexing. And what that is, is the search engine understanding synonyms, contextual relationships, and so on and so forth. You are actually able to imply a keyword and make your page relevant for a keyword that’s only implied.

[slide: Common On-Page Problems to Watch Out For]

Now, let’s cover the most common onpage problems to watch out for. And the reality is, duplicate content is still a primary culprit when it comes to ranking problems.

Why is this a problem? In 2011, Google rolled out the Panda algorithm, and the Panda algorithm’s primary purpose was to find weak websites, websites that were farms, websites that had no value-adding content, websites that copied content or stole content from another domain, or websites that were mass produced, therefore adding no unique value to users. And it penalized them.

Why did Google penalize them? Because if you value your users, you would have invested the time to create great content for them to experience inside the website in order to engage them. Therefore, just stealing content from somewhere else and placing it on your site violates Google Rule No. 3, which is Build for the User, and we discussed Google’s Four Golden Rules on the last webinar.

So here’s an example. Watch out for websites that might be older than five years and were built by niche providers: mass-produced dentist websites, mass-produced chiropractor websites, mass-produced food and beverage websites. In the old days, a lot of people made a lot of money by selling the same website to multiple people; they just changed the color schemes, changed the photos, and changed the branding, but kept the content similar across dozens and dozens of websites. Make sure that you at least grab a paragraph’s worth of the content, plug it into a Google search, and see if you only get results from within the same domain. Because if you don’t, you potentially have a duplicate content problem, and duplicate content is actually one of those factors that can single-handedly get you penalized, or prevent you outright from achieving any rankings.
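The paragraph-in-Google check above is the manual version of duplicate detection. An illustrative automated version breaks two passages into five-word shingles and measures their overlap; high overlap suggests copied text. The passages here are invented samples, not real site content.

```python
# Illustrative duplicate-content check: break two passages into
# five-word shingles and measure overlap. High overlap suggests
# copied text. The passages are invented samples.
def shingles(text, n=5):
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b):
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

original = "our dentists provide gentle family dental care for all ages"
copied = "our dentists provide gentle family dental care for every patient"
print(similarity(original, copied))  # 0.5
```

The two sentences share four of their eight distinct shingles, so the score is 0.5; a cosmetic rewrite of a mass-produced template would score high like this even though the wording isn’t identical.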

Now, let’s do a practical application here.

[slide: Algorithmic Compatibility vs Marketing Voice]

You’re talking to the client and then you’re talking about on-page. There is one thing that’s very important that you need to do. In order for a webpage to rank, the content has to be what we call very algorithmically compatible. But we all know that most website owners and most webmasters don’t craft content for algorithmic compatibility. They craft it for marketing voice, they craft it for identity, they craft it for Unique Selling Propositions. But content is rarely crafted for algorithmic compatibility, and this is why, when we begin work with you guys, we set your expectations that when we create content, it will be algorithmically compatible. It will be capable of ranking for the keywords.

But—here’s the but—the clients don’t always like it, because the content is designed to leverage elements of LSI and elements of LDA (Latent Dirichlet Allocation); it’s crafted so that we’re very conservative when it comes to keyword density; we make sure the keyword is as close to the top and to the left as we can get it. This is algorithmic content. And sometimes, the clients prioritize the way their marketing voice represents their brand over algorithmic compatibility.

Now, what does this put you into? It puts you in a situation where rankings that could be achieved now cannot be, because the marketing voice takes precedence. And so in situations like this you have to be comfortable sitting down with the client and having an intelligent conversation about where their priorities are. Is it a priority to drive more clients to the website today and earn more keywords on the first page of search engines? Or can it wait? Because we can help keywords get to the first page without algorithmically compatible content; it just takes more time. Having that conversation does not diminish your ability or your image in the eyes of your client. It actually makes you look more competent in their eyes.

So, this is one practical application of this portion of the conversation.

Now, moving on to off-page.

[slide: Off-Page]

This one is still important because even according to Moz, SEOs still believe that 70% of rankings are influenced by backlinks. What do we look at when we review backlinks? When we review a site’s backlinks and we don’t have access to the Search Console, we use third-party tools, such as Majestic (they wouldn’t do the webinar with us, but we still love them; shameless plug), and we also use Ahrefs.

Now, one of the things to take note of, though, when you use these tools: remember that Ahrefs is an index, and Majestic is an index. They have their own indices. Majestic does not have an insider look into Google’s index, and neither does Ahrefs. Each of them has its own independent copy of the World Wide Web, and therefore each independently tracks the backlinks going to your website.

Now, why take a look at the backlinks? Because—you need to know—SEO is a competitive industry. In order for you to achieve first-page rankings, you have to be better than the guys on the first page. And that goes across all of these: you have to index better, your content has to be better, your technical implementation has to be better, but more than that, your digital footprint has to be better, and your digital footprint is your backlinks.

Experienced SEOs know how to spot non-value-adding links, or deprecated links, or dangerous links, or old-school SEO links. The best way to really spot them is with Google Search Console (formerly Google Webmaster Tools): under Search Traffic, we download all of the backlinks. Experienced SEOs, ones that have more than four years of experience—you know, pre-Panda, pre-Penguin babies—ones that would have done blog commenting and social bookmarking, know the domains that they used to use. They will spot these, just like that.

Now, the first thing that we look at in terms of backlinks is the anchor text distribution. The easiest way to spot an unnatural digital footprint is when a generic money term has more references to the site than the name of the domain. I’ll repeat that: the easiest way to spot a fake digital footprint is when a generic money keyword has greater density, is mentioned more on the World Wide Web, than the name of the brand. Your brand name must be your primary anchor text. In the same way that you’re talked about more by your name than by your job title, your role in the family, or the car you drive—it’s the same thing. Most people are referred to by their name; the algorithm mimics behavior, so most websites are referred to by their name.
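To make this concrete, here is a small sketch of an anchor text distribution check. The brand name, money terms, and anchor list below are all made up for illustration; in practice you would feed it the anchor column from a Search Console, Majestic, or Ahrefs export.

```python
# Sketch: check whether the brand name is the dominant anchor text in a
# backlink export. The sample anchors and brand are fictional.

from collections import Counter

def anchor_distribution(anchors):
    """Count each normalized anchor text and return (counts, total)."""
    counts = Counter(a.strip().lower() for a in anchors)
    return counts, sum(counts.values())

def looks_unnatural(anchors, brand, money_terms):
    """Flag the profile if any money term outnumbers the brand anchor."""
    counts, _ = anchor_distribution(anchors)
    brand_count = counts[brand.lower()]
    return any(counts[t.lower()] > brand_count for t in money_terms)

backlinks = (["Acme Dental"] * 40 + ["acme-dental.com"] * 25
             + ["dentist san francisco"] * 12 + ["click here"] * 5)
print(looks_unnatural(backlinks, "Acme Dental", ["dentist san francisco"]))
```

Here the brand anchor dominates, so the profile passes; flip the counts so the money term wins, and the check fires.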

[slide: Off-Page Elements: What to Watch Out For]

Now, what else do we watch out for when we take a look at off-page elements? We look at the branding term: is it the dominant link back to the site? We take a look at bad backlinks. What are bad backlinks? Bad backlinks are links from low-quality websites, sites that negatively affect your client’s website. We’re also talking about blog commenting from four, five, six years ago, or old-school social bookmarks, old-school directories. Those are examples of bad backlinks. The worst type of backlinks, by the way, are links from domains that Google has chosen to de-index, and then of course, you know, the classic spam niches: porn links, gambling links, and controlled-substances links. Google does not appreciate any of those.

The other thing that we look for is an unnatural link velocity. What is link velocity? It’s simply the rate at which your website acquires backlinks. Now, why do we look for this? Because Google looks for this. The easiest way for Google to spot paid links is when the site’s digital footprint suddenly and unnaturally enlarges. And when I say “unnaturally enlarges”, I mean you suddenly received hundreds of backlinks to your site, especially when they’re coming from only one C class or only one domain. In that case, Google will review you for unnatural link building, and I’ll show you what that looks like in a bit.
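As a rough sketch of what that check can look like, here is one way to flag a velocity spike against the site’s own history. The monthly counts and the 5x threshold are assumptions for illustration; Google does not publish how it measures this.

```python
# Sketch: flag an unnatural spike in link velocity. Monthly new-backlink
# counts are invented; a real audit would pull them from a backlink index.

def velocity_spikes(monthly_new_links, factor=5):
    """Return the 0-based month indices where new links jumped to more
    than `factor` times the average of all previous months."""
    spikes = []
    for i in range(1, len(monthly_new_links)):
        baseline = sum(monthly_new_links[:i]) / i
        if baseline > 0 and monthly_new_links[i] > factor * baseline:
            spikes.append(i)
    return spikes

# A steady profile, then a sudden burst of 400 links in one month.
history = [12, 15, 9, 14, 11, 400, 20]
print(velocity_spikes(history))  # the burst month stands out
```

A steady profile returns an empty list; the sudden burst of paid-looking links is exactly what gets flagged.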

[slide: Penalties and Warnings]

Now, again, feel free to send me your questions, use the chat box if you guys have any questions so far.

Let’s talk about the last things we look at, and we look for penalties and warnings.

[slide: Penalties]

Under the assumption that we have access to the Search Console, we will proactively look for webmaster messages. When websites are hit with a penalty—Panda or Penguin—you cannot proceed with optimization until the trust issue has been addressed. I’ll repeat that because it’s important: When websites are hit with a penalty—whether it’s the cute Panda on the left, or the cute Penguin on the right—you can’t rank. You cannot rank because you have established a trust issue with the search engine.

Search engines—when they develop a trust issue with you—will not offer your site as a result to users. They must be able to trust your content because they only want the best content to go in front of users. So you have to address the trust issues with the search engine first, before SEO activities can have any positive impact. I mean, can you optimize your meta, can you optimize your on-page content, can you add multiple content formats? Yes, you can. Will you benefit? Not enough to get to the first page.

Now, there are two ways to detect penalties. The first, of course, is the Google Search Console. But without the Search Console, not all hope is lost. There are a couple of tools that we use. One is DEJAN SEO’s Algoroo (thanks, Dan, for building a great tool). Algoroo tells you whether there was a drop in traffic on the date of an algorithm release.

If you don’t have access to Algoroo, you’ve never heard of it before, you’ve never played with it, another tool that we use is called FE International’s Penalty Indicator. Just Google it and you’ll be able to find the tool.

Both are great. But remember that using FE International and Algoroo only allows you to guess at the presence of a penalty, and the only absolute way to know if a penalty exists is with the Search Console.

How do you know you’ve got a penalty? Or what do penalties look like?

[slide: Panda – Site-Wide Match – Unnatural Links]

They look like that. This is a sample message of an Unnatural Links Violation from your website, and this is a Panda warning, and that’s why I’ve got an angry panda to the right.

[slide: Panda – Site-Wide Match – Thin Content]

The other one is this. This is the Thin Content Warning, and it is also a Panda warning, therefore the angry panda to the right. And this warning will prevent you from ranking.

[slide: Panda – Duplicate Content]

This is one of the most dangerous ones to have. This is the Duplicate Content Warning. This is what it will look like on your Search Console. And this will tank all your rankings.

[slide: Panda – Site-Wide Match – Pure Spam]

Now, let’s move on to another one, right. This one: the Pure Spam Warning. There’s only really one way to deal with this; forgive me for being a little aggressive, but if you get this warning, oh my Lordie, pack up your domain and build a new website. Pack up your domain and build a new website. This one is incredibly, incredibly difficult to overcome.

[slide: Penguin – Partial Matches – Unnatural Links]

Now, what about an angry penguin? I know that’s an Angry Bird to the right, but I thought it would be appropriate. Here are a couple of messages that you can get inside your Search Console. The first one is a partial match warning: Unnatural Links going to your site. Whether you bought links or not, you may receive this warning if Google has cause to suspect any link buying. So, I’ll repeat that: you may get this warning whether or not you’re doing any active link building for your website. Now, is this a penalty that will affect your whole site or all your rankings? Actually, it won’t.

The beauty of the Partial Match Warning is that it only affects the specific URLs in question. This is actually the first warning you’ll get if your anchor text distribution is unnatural. Google will detect, “Wow, you have an unusually high number of anchor texts for payday loans that point to this page! We’re investigating it; you don’t necessarily have to do anything.”

Now, what does Google mean when they say they’re investigating it? It means they’re going to try to figure out whether or not it looks like you contrived to acquire those links. If they find that you didn’t, no skin off your back. But if they find that you did, it will progress to another warning.

[slide: Penguin Penalty]

This is another Unnatural Link Warning. And this is when Google detects or suspects that you have purchased links. What is the easiest way for Google to identify purchased links? An unnatural link velocity. When it detects that you gained an unnaturally high number of backlinks for a money term, usually coming from a single powerful domain, Google will suspect link buying. And if it confirms link buying, it will send you this warning.

[slide: Link Clean-Up]

Now, in the presence of penalties or warnings, because they can stop a website from ranking, you need to be ready to go into a conversation asking the client for additional investment in cleaning up the backlinks. The challenge for you is to convince the client to invest in these activities. This is not a question of “should they do it?” or “can they do it later?” It’s a question of when. If the client wants to acquire leads, customers, and visitors from free, organic sources, they will never achieve that unless they invest in cleaning up their backlinks first.

Now, why invest in backlink cleanup? Because SEO ultimately leads to free traffic, and that’s why they need to do this. So when a penalty exists inside your Google Search Console, it’s typically a foregone conclusion. You should be able to win that conversation because it’s not an “if” conversation, it’s a “when” conversation. They will need to do it. Why not do it with you? Because if you work with us, we’ve got a hundred percent batting average in getting rid of those nasty messages.

[slide: Synthesis]

Now, I’m going to try to wrap up to make room for your questions. So here’s the gist of everything we just talked about.

The SEO Site Audit. It’s simply a process of identifying elements or opportunities that allow you to recommend the right solutions for your clients and set expectations.

Keyword Selection. The entire campaign will be built off which keywords you pick to rank. Whether the campaign ranks or not depends on the keywords that you choose. What kind of content needs to be written for the website will depend on the keywords you choose; the kind of Alt tags, the kind of recommendations, the pages to target all depend on the keyword selection. Balance out your strategic and your tactical keywords whenever you can.

The next one, Accessibility and Indexability. What is this? It’s how accurate the search engine’s copy of your website is.

Onpage. Content is king. The keyword must be present on the page in order for that page to rank for that term.

Off-page. Your digital footprint. Does it look natural? Is your domain name used more than any money term? Do you have bad backlinks coming from either SEO legacy tactics or bad domains? Is your link velocity acquisition natural? Did you gain a steady number of backlinks over a given period of time?

And then of course Penalties and Warnings. Do you have messages inside Search Console? Address your trust issues before proceeding to any SEO activity.

[slide: For Our Partners]

Bernard: Now, for our partners. We’ve created a downloadable PDF training guide against this webinar so you guys can download it through this short URL bit.ly link. Or you guys can go into the dashboard and download it from there.

[slide: Q & A Session]

Now, I’m down to ten minutes. I’ve got ten minutes left to have a quick discussion with you guys and I’ll start going through the first few questions.

The first question is, “If you need keywords that are location-specific, do you create keyword+location or is it advisable to use locations alone as keywords if the client has several locations they want to target that can bump up the number of keywords considerably?”

OK, fair question, and that’s true, so let me try to hit the question fragment by fragment. If you need keywords that are location-specific, do you create keyword+location? Yes, you do. Yes, you need to. Google still responds to the presence of a keyword plus the presence of a location. Why? Because, let’s say you just moved to San Francisco and you don’t have a physician, a dentist, a whatnot. How do you do a query, right? You look for dentist in San Francisco, physician in San Francisco, San Francisco physician, dentist San Francisco. People actually execute queries as keyword+location, so it is valid to optimize in this way.

Now, the other way to optimize for this is to use local Schema: microformats in the code that tell search engines that this is where you’re physically located. That is the other way to approach it. If the client has several locations they want to target, that can bump up the keywords considerably—yes, it can—but you don’t have to hit all goal keywords in the same 30 days, right? If the client has one, two, three locations, depending on the package that you take, you may be able to rank one or two locations in any given 60- to 90-day period and then move on. That would be the right way to approach it.
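The keyword+location combinatorics above can be sketched in a few lines. The keywords and cities here are placeholders, just to show how quickly the list grows:

```python
# Sketch: generate keyword+location combinations for a multi-location
# client. Keywords and cities are illustrative placeholders.

from itertools import product

def location_keywords(keywords, locations):
    """Build both 'keyword location' and 'location keyword' variants."""
    combos = []
    for kw, loc in product(keywords, locations):
        combos.append(f"{kw} {loc}")
        combos.append(f"{loc} {kw}")
    return combos

terms = location_keywords(["dentist", "teeth whitening"],
                          ["San Francisco", "Oakland"])
print(len(terms))  # 2 keywords x 2 locations x 2 orders = 8
```

Two keywords and two locations already give eight candidate terms, which is why you stagger locations across 60- to 90-day periods instead of chasing everything at once.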

The next question says, “My client has two sites. Some of the content is duplicated on the second site which is described as a division of the company named on the first site. In that case, would there be a penalty?”

Great question. So the qualifying question here is: what are the C classes of both domains? Are they hosted on the same physical server? That’s one. Two: that credit noting the content originally belongs to the other site, is it visible to users? And three: Google says the appropriate way to create content like this is to attribute authority to the original website that owns the content. In the simplest of terms, Google says use a rel=canonical. Add the canonical attribute to the pages on the second site that copied the content. That will prevent the site from being penalized. However, canonicalized pages carry no authority of their own; therefore, they do not rank well. I hope that made sense. If it didn’t, feel free to send a follow-up question. We still have about seven minutes.

“How do you get backlinks for a brand new site that sells a product? They don’t have a blog.”

Great question. So, I’ll start off this way: first, do some content marketing. And then, second, create your own backlinks, right? YouTube is a C class IP that you don’t own. And inside YouTube, when you create your own YouTube channel, inside the description you can point the backlink back to your website. More than that, the video has the ability to embed a card in it with a link. That’s a backlink. It’s a unique C class IP that points a backlink back to you.

Next, build a Facebook Fan Page. You have the ability to point a backlink from your social profile back to your site.

Next, go to reliable business directories. Post your website on it.

Next, Google+ Profile. Build one, point it to your site.

And then there are all the reliable pages where you can list your business: all the authoritative URLs, all the authoritative domains where you can post your website.

You have the ability to create your own backlinks. You just don’t have the ability to create a ton of them.

And then execute some content marketing. Now, I’ll just add a bit to that. If you want to gain backlinks badly, nothing stops you from paying for the visibility of your content. Blog about something, share it on your Facebook, boost it for $5. Cheap, right? Do a quick YouTube ad that fits within your budget. Cheap. It will allow new users—people that have never heard of you—to go to your website. Create great content that they will want to share on their websites in turn.

Next question: “Can I get a prospect website audited with you? Do you charge?”

Great question. Yes: (a) there’s a self-service audit tool inside the dashboard; all you have to do is log in and use it. It’s free to use. Next, “do you charge?” We will do a pre-intake audit for any campaign that we try to work on. So, no, we don’t charge, unless it’s your tenth request and you haven’t closed anybody; then the project manager will begin to charge. But for your first ten attempts, no, nobody will charge you for a pre-intake audit.

“Can I present your site audits to my client, and how do I explain it? Should I explain everything or just a brief overview?”

OK, I’ll answer that one by one. “Can I present your site audits to my client?” Yes, you can. If you’ve set up your dashboard, it will actually publish itself with your branding on it. If it’s a pre-intake audit, your project manager will be mindful enough to post your brand on it.

How do you explain it? This one’s a little tough. Call your project manager and request a walk-through of how to drive that conversation. They have done this over a hundred times, and they’ll teach you how to do it with ease.

“Should I explain everything or just a brief overview?” Stick to the brief overview. Sales conversations are exactly that—they’re sales conversations. If you don’t maximize them by asking questions so you can understand what your client’s needs are, and you use all 30 minutes to one hour trying to be an expert, you will have wasted opportunities to find ways to help your client. So keep the sales conversations less pitching, more qualifying, more need-unearthing.

Next one is: “Does your audit come with SEO recommendations? And who implements the recommendations—you, us, or the website owner?”

OK, great question. Does the audit come with recommendations? The automated audit inside the dashboard does not; the pre-intake audit will. Who implements the recommendations: you, us, or the website owner? If the website is built in WordPress, we will actually implement the recommendations for free, if you give us access. If you guys prefer to do it yourselves, that’s also fine. And if the site is built on a technology we can’t touch, for example Microsoft server-driven websites, we can’t implement it, because we run on Apache servers. So, PHP-based websites, yay; Microsoft server-based websites, nay.

“How does Schema affect ranking for a new site?”

Wow, I don’t know who asked this question but whoever you are it’s a great question. How does Schema affect ranking for a new website? And the answer is, tremendously. It depends on which Schema, though. Let’s take local Schema as an example.

For those of you guys who are not technical, what is Schema? Schema is a microformat specification in HTML that allows search engines to not just read content but understand content. I’ll repeat that and I’ll try to make it simple. What is Schema? It’s a way to write HTML so the search engine can understand what that content is. It knows that this text is an Address Line 1. It knows that the other text is a phone number. It knows that this portion of the website’s HTML refers to a longitude and latitude that can be located on a Google map, and so on and so forth. In short, it’s a way to format your HTML so that search engines easily understand what you are. What do you benefit from? You benefit from map listings. You become very Pigeon compatible; Pigeon is Google’s local search algorithm.

Have you seen Google display content from a website directly in search results? If Schema is implemented via JSON-LD, your potential for appearing in the Knowledge Graph becomes significantly stronger. It allows Google to associate your brand with that Facebook page, that Google My Business page, that YouTube channel. So, Schema has a tremendous ability to help rank a new website; especially when the campaign is local, sometimes adding Schema allows you to appear in the snack pack in about 30-90 days. Now, that’s not a promise. I’m just telling you that based on my experience and my observations, that happens. It’s a very powerful markup.
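For those who want to see what that markup actually looks like, here is a minimal LocalBusiness JSON-LD sketch, built in Python purely for illustration. The business details are fictional; in production the resulting JSON would be embedded in a script tag of type application/ld+json.

```python
# Sketch: a minimal LocalBusiness JSON-LD block. The business details
# are fictional, used only to show the shape of the markup.

import json

def local_business_jsonld(name, phone, street, city, region, postal):
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": name,
        "telephone": phone,
        "address": {
            "@type": "PostalAddress",
            "streetAddress": street,
            "addressLocality": city,
            "addressRegion": region,
            "postalCode": postal,
        },
    }, indent=2)

markup = local_business_jsonld("Acme Dental", "+1-415-555-0100",
                               "123 Market St", "San Francisco", "CA", "94105")
print(markup)
```

This is the kind of structured address, phone, and location data that makes a page legible to local algorithms like Pigeon.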

“What are some good tools for backlink search?”

I’ll give you guys my top 3. My top 3 are: the Search Console, Majestic, and Ahrefs.

I’ve run out of time. I’m going to move us on.

[slide: For Our Partners]

Because I’m out of time, I’m sharing with you guys again the bit.ly link for the site audit guide. And if you guys don’t grab the bit.ly link, feel free to log into your dashboard.

[slide: Schedule a Call]

And again, these are our contact details. You can schedule a call with us by going to bit.ly/getsked, or you can dial these numbers. And most of you guys will already be familiar with it. Feel free to call one of our project managers; feel free to ask for education; feel free to ask for material. Download the stuff in the dashboard. Our amazing Marketing Team put a lot of time, effort, and creativity into putting those there for you. If you guys need assistance, hit us up at info@reseller.com. There are actually six people in the office that look into that mailbox, including me. I may not personally respond to it, but I will make sure that everything that goes to the mailbox is responded to the same day.

I’m out of time. You guys have been a great audience. I’m signing off and I hope to see you on the next Partner Bootcamp. Thank you very much, and let’s keep selling.