Partner Bootcamp

Simplifying SEO: The 6 Core Metrics You Need To Know

For your convenience, we transcribed our Partner Bootcamp: Simplifying SEO: The 6 Core Metrics You Need To Know webinar below.

Read the summary and key takeaways or watch any of our previous webinars.

Talk to our Project Managers

Webinar Transcript

[Downloadable version of the transcript, the presentation deck, and other materials are available in the Resource Center].

Bernard:

Hello, hello, hello, and welcome, everyone, to the Partner Bootcamp series. For everyone who's joined us in the Boost Your Business series, I just want to make it clear this is an altogether different webinar series. SEOReseller wants to run you through the more technical elements of SEO.

[slide: Simplifying SEO: The 6 Core Metrics You Need to Know]

So today, we are talking about the core metrics – the six core metrics that you need to know and only the six that you need to know.

[slide: Meet The Expert]

I’m Bernard San Juan III and if you guys have worked with us before then you’ll know who I am. And I’m running this webinar today all by my lonesome and I’m an 18-year web professional, which makes me really old.

And yeah, it’s exactly what the writing says, I’ve launched over 300 websites – actually, way more than that by now. I’ve managed over a thousand marketing campaigns. I’m also very easy to hit up by the way, I’m bernard@seoreseller.com – short and sweet – and you can find me on LinkedIn by going to linkedin.com/bernardsanjuaniii.

[slide: Mockup Builder & Proposal Creator]

So today, I’m going to be talking to you about the six core metrics on SEO. But true to our tradition, this is the shameless plug time, ‘cause it’s always slide number 3 that’s shameless plug time. So I’m going to talk about the mockup builder and the proposal creator.

If you haven’t joined us on the Boost Your Business webinars before, you may not be aware but we made teaching and selling web design easier.

We built a mockup builder and a proposal creator right in the comfort of your own dashboard. It allows you to create high-quality designs for your prospects in minutes. You can then attach them to professionally made documents like SLAs, contracts, and quotes, and launch your website in no time. All you've got to do is get them to sign on the dotted line. You can learn more about our mockup builder and proposal creator by following the bit.ly link, which is really just bit.ly/mockup-builder.

[slide: Discussion Overview]

Now, let's get on to an introduction of what we're covering in this conversation. We're talking about the six core metrics of SEO and the best practices for effective optimization. If you're not new to SEO, you're probably aware that there are over 200 search metrics in Google's algorithm.

And you need to know the importance of having search engine metric groups, because it's not practical for anybody to go through each metric on each page of their website. In fact, when most people talk to us, we tell them there is no true 100% optimized website. But thinking of the metrics in groups gives you a more practical way to leverage them to your website's advantage, and makes it easier for the bots and the users – specifically, the user – to get a better idea of what the content on your website is about.

You’ll discover what our six core metric groups are and the best SEO practice to leverage each metric. Of course, at the end of this session, I’m going to try to make it short so that we’ve got more time for Q&A because I can really talk SEO, but feel free to send your questions in advance by using the WebEx chat, which should be the little box on the top right of your screens.

[slide: Significance of the 6 Core Metrics in Today’s SEO]

So, let’s get started. What is the significance of having six core metrics in today’s SEO? So, we started the conversation because according to Marketing Sherpa, SEO is a 16-billion-dollar industry. It’s a huge industry and yet, that 16-billion-dollar industry is only made up of the, actually not ten, but nine percent of websites that have mature optimization characteristics.

Ninety percent, if not more, of the SEOs that you'll manage to talk to aren't institutionalized. They don't have formal research capabilities. They don't have a laboratory to play with. They don't perform their own research. Most of them will have learned their SEO techniques from blogs or other professionals or other sources – for example, Moz's Whiteboard Fridays or LinkResearchTools, which by the way are great resources. But we always encourage everyone to test and verify their own information.

Now, because they haven’t done their own bouts of testing, a lot of people that claim to be great SEOs can’t really predict how their work will affect the website that they’re trying to work on and this is very dangerous, because if you can’t predict it, you can’t prevent yourself from harming the website that you’re trying to help.

And even at the point of speaking to you guys during this webinar, I’m sad to say that we still see work reports coming from potential clients, potential partners showing us outdated practices that endanger their clients’ websites. I mean, we’re talking about eighty percent blog comments, still automated social bookmarks, outdated business listings online, and whatnot.

So that's why we're talking. We're talking because you need to understand what professional SEO services are about. Most established organizations – not just us but, you know, mostly us – have a methodology in place in terms of what they're trying to do with their website and why they're doing it that way. They're not just copying somebody else's methodology; they build their own.

So again, before I talk about why deal with search engine metrics in groups, a reminder to send in your questions for the Q&A portion.

[slide: Why Talk About Search Engine Metrics in Groups?]

So, why group the search metrics? In reality, there are over 200 metrics to rank websites. If you guys have visited Moz, Search Engine Journal, Search Engine Land, all these great resources online, they will list over a hundred metrics and what their estimated effect on search engine rankings are for each metric.

For us, in-house, we have found a hundred twenty-three metrics that have a measurable influence on rankings. We've tested them, and we understand how they can affect your rankings and how they can make you visible to your customers.

One of the things that most people fail to understand is that the algorithm is a filter and if you’re a marketer, I strongly recommend you guys read up on Seth Godin. Seth Godin wrote a milestone book called Permission Marketing and the search engine algorithm is the ultimate in permission marketing. It only makes sure that information that’s relevant to the user gets to the user. So only the best content and the most user-friendly information goes to a potential customer. The algorithm sees all the content but will not allow all the content to reach the users.

Several SEOs will comb through your site using, you know, the one to two dozen metrics they know off the top of their head – and these will be the old-school SEO metrics – and what they'll tell you is your site is not ranking because your meta title's not optimized. Your site is not ranking because you have no meta descriptions, because you have no schema, because your site is slow, because your banner's too big, because there's no content above the fold, or because of insufficient content. Whatever it is, they'll tell you that that's why you're not ranking, and this is very dangerous advice.

What we like to tell our partners is there is no such thing as a single metric capable of ranking or penalizing you. There's just no single metric out there that can single-handedly rank you or single-handedly penalize you.

All search metrics work synchronously. Like iron filings to a magnet, none of the metrics work alone. The quality metrics all work in sync and the quality metrics are the newer ones. All content metrics or on-page metrics, they all work together – the keyword density, the word count, the originality of the content, the presence of the keyword above the fold, the keyword in the H1 tag – all of these work synchronously with each other.

Now, remembering the metrics in groups makes it easier to leverage the correct group of metrics to gain results for your site. When you take a look at the search engine metrics, SEO is a very competitive industry and by practice, you have to execute it competitively.

What do I mean? If you want to be on number one, you have to be as good as or better than whoever is on number one. And when you take a look at where you’re better or where they’re better, you have to compare yourselves by search metric group. By quality metrics, are they performing better than you? By off-page metrics, are they performing better than you? By on-page metrics, are they performing better than you? Leverage the ones you perform better at and try to compensate for the ones that you are naturally weak at.

[slide: SEOReseller’s Metric Groups]

So, we're going to talk about our metric groups, and this is really more of a story of Google, so you know, we're not taking credit for it entirely. But I'll go through this as if it were Google's story. So let's talk about Google first. This is for all of you guys that have been netizens for a long time and are over the age of 30 because, you know, we'll forgive the Millennials for not knowing this story. They probably weren't online back then.

But Google was built in 1997 by Larry Page and Sergey Brin as their post-grad paper. It used to be called Backrub, and they built it on the mainframe of Stanford University. In 1997, before Google was rolled out to the public in 1998, Larry Page was sort of wondering what kind of paper he should write. Should he write about better search engines or should he write about self-driving cars? You know, which is why Google remains obsessed with self-driving cars; it's just the vision of the founder.

[slide: Authority]

So when Google rolled out in 1998, the third metric that it had – and I’ll get to the first two in a bit, but I wanted to talk about this one briefly. The third metric that Google launched is a metric called the authority group of metrics. Internally to Google, this is known as PageRank and this is the primary authority algorithm.

What’s the best way to define PageRank? PageRank is a numeric representation of how powerful your website is. It is the difference in the likelihood of a truth being true based on who you hear it from. In short, you know, it’s Aunt May versus Ellen versus Oprah Winfrey. The authority levels on each one are all different. Now, some of you guys might tell me, but oh, Google is not updating PageRank anymore.

So I'll correct that: Google is still updating PageRank internally. It's not a signal they can remove anytime soon. However, they have prevented SEOs from being able to use toolbars to pick up the data, and as early as February of 2014, they stopped updating the publicly accessible PageRank database.

PageRank was released in 1998 and Sergey Brin was the one that coined that term. He named it after Larry Page. You know it sort of makes sense, but in reality, it wasn’t named because it’s the rank of your page. It’s named because the PageRank algorithm was developed by one of the founders.
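The PageRank idea being described here – a vouch from a powerful page counts for more than a vouch from a weak one – can be sketched in a few lines. This is an illustrative toy only: the damping factor, iteration count, and three-page link graph below are assumptions for demonstration, not Google's production setup.

```python
# Toy power-iteration sketch of the classic PageRank formulation.
# Illustrative only; real-world implementations are far more involved.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                     # dangling page: spread evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:                                # split this page's vouch
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

toy_web = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
ranks = pagerank(toy_web)
# Total rank stays at 1.0; "home" collects the most vouches.
```

Notice that "home" ends up with the highest score simply because the other pages link to it – the metric rewards being vouched for, not being optimized.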

Now, PageRank is one of the most difficult ones to influence, and in fact, we tell people don't try to optimize for PageRank, because you can't. You can't manipulate it. But you know, some more advanced early SEOs will tell you there's a way, and there are a couple of technical ways to do it. However, if you're God-fearing, ethical, and law-abiding, don't do it.

This is difficult to influence primarily because some of the metrics are natural; they happen with time. Age is a metric of authority, and authority has to do with how powerful the vouch from each website is. Now, like I was saying, authority was born in 1998, but when Google was born in 1997 as Backrub, it already had four metrics – double the number of its primary competitor, which back then was Yahoo Search Engine. Not Yahoo, but Yahoo Search Engine.

[slide: Authority (Best Practices)]

So, you don’t try to optimize for authority. So if you can’t optimize for it, why are we even talking about it? Because if you got it, you should leverage it. If you’ve got it, you should leverage it. So I’ll give you guys a quick case study on this.

In 2013, we worked with a bank with over a hundred branches that was trying to fight two much larger competitors. The bank – one of the largest in the region – actually had a PageRank of 5. That is tremendous PageRank.

They were trying to beat websites and domains that were younger than them. They had held their domain for about 14, 15 years, which allowed it to earn a lot of authority over time. The sites they were trying to beat were 1-2 PageRanks below them, which means they could leverage their PageRank. So if you've got it, use it against the competition – because we did, and they wound up outranking their competition for about 60% of similar terms.

[slide: On-Page]

Now let's talk about on-page. On-page is the granddaddy of all metrics. Before Google, before Yahoo, the W3 Catalog – all of them were built foundationally on an algorithm called the LDA, or Latent Dirichlet Allocation. You guys don't need to remember that or write it down. It will not make you, you know, richer in the future, but suffice it to say the LDA is what we'd refer to in layman's terms as the keyword density algorithm.

Now, the LDA is a very complex algorithm, but it works in a very elegant way. It tries to understand what the content is about based on a mathematical breakdown of what words appear where, in what frequency. The reason keyword density and keyword presence still matter across all content on a website is the LDA, and it still sits there at the core of the algorithm. It's currently complemented by more sophisticated algorithms – Caffeine mitigates it, LSI mitigates it – but the LDA is there.

It relies on the presence of terms in order to identify relevance and this is why content will always be king. Now, one important thing to realize when it comes to talking about the LDA or the keyword density algorithm, it isn’t all about the frequency of what words appear where. In fact, the “where” is very important.

According to the LDA, a word is more powerful – a keyword, a phrase is more powerful – the closer it is to the top and to the left. And that sort of makes sense, because most of the time when we talk about topics, the topic is the subject and everything else is a predicate, right? So it follows natural human behavior. The LDA attributes more weight to content that’s present above the fold in the top hundred words to the top and to the left of the page.
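The "what words appear where, in what frequency" idea can be illustrated with a small sketch. The positional weighting below – counting occurrences inside the top hundred words separately – is our own simplification for illustration, not the actual LDA or Google's scoring.

```python
# Toy sketch: keyword frequency plus an "early occurrence" signal,
# loosely mirroring the top-and-to-the-left weighting described above.

import re

def keyword_signal(text, phrase, early_cutoff=100):
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    span = len(phrase_words)
    hits, early_hits = 0, 0
    for i in range(len(words) - span + 1):
        if words[i:i + span] == phrase_words:
            hits += 1
            if i < early_cutoff:       # occurrence within the top 100 words
                early_hits += 1
    density = hits * span / max(len(words), 1)
    return {"hits": hits, "early_hits": early_hits, "density": density}

page = "Dark chocolate gifts. Our dark chocolate is handmade. " * 3
signal = keyword_signal(page, "dark chocolate")
```

Two pages with the same raw density can still differ on the early-occurrence count, which is the intuition behind putting the key phrase above the fold.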

[slide: On-Page (Best Practices)]

So, how do we optimize for on-page? There are really a lot of ways by the way to optimize for on-page and I’ll briefly mention some of them but I’ll get into the bullets first.

So first of all, optimize your meta titles, right? This is very, very easy to do. Limit it to 68 characters, not 512 pixels, because we're not robots. We can't see pixels; we can optimize for character counts. In order for your meta title not to get cut off with an ellipsis, limit it to 68 characters. Whenever you can, try to make sure that your keyword and your brand are present in the meta title.

For best practices, what we like to tell people is: if you are not a household name, if you're still trying to prove yourselves as a brand, follow a formula that says key phrase, pipe, brand. If you're a household name by now, do brand, pipe, key phrase – so it's relatively simple. And try to make your meta title as clear and as attractive as possible to your customers and, whenever possible, don't duplicate them. Do not duplicate your meta titles.
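The title formula is mechanical enough to sketch as a helper. The function name, example brand, and truncation strategy below are our own assumptions; the 68-character cap and the keyphrase-pipe-brand ordering come from the talk.

```python
# Minimal sketch of the meta title formula: "keyphrase | brand" for
# newer brands, "brand | keyphrase" for household names, capped at 68
# characters so the title is not cut off with an ellipsis in the SERP.

def build_meta_title(keyphrase, brand, household_name=False, limit=68):
    parts = [brand, keyphrase] if household_name else [keyphrase, brand]
    title = " | ".join(parts)
    if len(title) > limit:
        # Drop whole words from the end rather than cutting mid-word.
        title = title[:limit].rsplit(" ", 1)[0].rstrip(" |")
    return title

title = build_meta_title("Handmade Dark Chocolate Gifts", "CocoaCraft")
# "Handmade Dark Chocolate Gifts | CocoaCraft" -- 42 characters, under the cap
```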

One SEO best practice is to support your content with supporting content. What do I mean? If I'm creating a page about chocolate, I would support that page with dark chocolate, milk chocolate, chocolate for kids, chocolate with nuts, and so on and so forth. Those would be supporting topics on separate pages. Those separate pages cannot carry the same meta title as the original page, or else I'm confusing the bot and I'm confusing the users – so make them unique and make them valuable to a user.

Now let's talk meta descriptions. There's a lot of debate about whether the meta description holds value as an optimization element. For us, it does, because the meta description is the single most powerful click-through driver for your site. It may not be a powerful ranking driver, but it is a significantly powerful click-through driver, and click-through, as you will discover later, is a ranking metric.

Optimize your headers and your content. One of the things that we like to preach to everyone: content is the single most powerful metric on your site. Content is the single most powerful metric on your site. Your headers are important because again, according to the LDA, if a word appears to the top and to the left, it carries more power and headers typically appear to the top and to the left. And therefore, your H1 tag, which is the one that appears most to the top and most to the left becomes the most important header tag. Insert the keywords where possible and where readable.

Insert the keywords in the top hundred words and above the fold where you can. Where you can’t put them, let’s say you have to compromise between readability and algorithmic optimization. Please prioritize readability, because one of Google’s rules is building content for the user.

Now, insert multiple content formats – and I’ve got a great case study to show you guys on this.

Why should you insert multiple content formats? A lot of you may have heard the current web being referred to as Web 2.0 and most of the time when you talk to an SEO, you tell them Web 2.0 they’re thinking Tumblr, StumbleUpon, Blogger, Blogspot, Weebly, and all these places that offer free blogs. It’s true that these sites are a great representation of Web 2.0, but they are not Web 2.0.

Web 2.0 is the user's web. It's enabled by HTML5, and it allows non-programmers to interface with a website and change the face of the website without writing a single line of code. This is what Web 2.0 is. It allows you to drag and drop images and files onto a web page. It allows you to extract files and drag and drop elements from a web page onto your desktop. All those capabilities are Web 2.0 – the web becoming more friendly to users.

Now, why should you build multiple content formats and take advantage of Web 2.0? Because Google considers multiple content formats as a sign of trust, and trust is easily measured by how frequently Google decides to index your website. Let’s go to the case study.

[slide: On-Page (Case Study)]

So, this is what we call the 70-site test. In 2011, 2012 – I would guess probably more 2012 – we created about 70 websites. Out of the 70 websites, we kept 10 for control, then took the remaining 60 and segregated them into Group A and Group B.

Group A was given optimized content, and Group B had optimized content plus images with alt attributes. Group A – with varying PageRank, by the way, which was still publicly readable way back in 2012 – had a mean indexing time of greater than 10 days, while Group B, with two content formats and optimized content, was indexing in 7-10 days. Group B was what we were looking for.

So we proceeded further with the experiment: we took Group B and segregated it into two. Group C wound up as the control group, and Group D was optimized for content and images, and then we inserted videos. When three content formats existed on the page, all the sites – and I mean all the sites – began indexing with a mean of four to seven days. Not a single one of them took longer than 7 days to index, even the PageRank N/A sites.

Now, of course that's what we were running after, so we took Group D and segregated it some more. Group X became the control group from D, and Group Y was optimized for content, images, social badges, and a map integration – and the indexing time dropped to less than three days. Now, less than three days' indexing for a PR N/A site was tremendously amazing, especially in 2011-2012.

So, as a last-ditch effort, we created one more test, and what we tried to do was see how ads influence content on a site. Will it cause Google to trust the page more or trust the page less? Just to stay on the safe side, we used Google Ad placements, but unfortunately, Top Heavy came out at the time, and the websites started indexing in more than four days – so we scrapped the last-ditch idea. The final conclusion: websites with multiple content formats index faster, in one to three days. And one to three days is a lot of trust from the algorithm.

Now, I'll move on to off-page, and again I want to remind everyone: feel free to send in your questions at any point in time.

[slide: Off-Page]

Let’s talk about off-page. So, remember 1997 Backrub. In 1997, almost all search engines were created equal – Lycos, Yahoo, AltaVista, if you guys are old enough to remember that, AskJeeves, because it wasn’t Ask, it was AskJeeves back then.

They all made use of the LDA, but when Google rolled out in 1998, it had one more group of metrics than all of its competitors: the off-page group of metrics. The off-page group of metrics was actually invented by Google. In fact, the term backlink was coined by Larry Page. What he realized was that the most important pages – the pages that were most valuable to him as a human being evaluating a website – seemed to also be vouched for by other websites on the World Wide Web. So he thought of writing an algorithm that looked at how many vouches you had across the World Wide Web – how many websites were talking about you – and turned it into a metric. So, the off-page group of metrics is really just how many people talk about you online, and the more chatter, the more popular you are. Now, this does not preclude who's talking about you.

Now, let's take two examples. While I can't see you, you guys are gonna have to be honest with yourselves. Who among you has read Twilight? Raise your hands. Okay, I can't see you, but I'm hoping you're honest with yourselves. I'm actually raising my hand. I read Twilight under duress from a female friend. So, I'm familiar with Stephenie Meyer's writing and, in my personal opinion – and unfortunately a lot of people in academia agree with me – Stephenie Meyer's writing isn't very good. But there's a lot of disagreement there, because while I may disagree that she's a great writer, she does have not one, not two, but three bestsellers. Therefore, the popularity metric works in her favor. It doesn't matter what I think or what the on-page thinks about her. She has a lot of vouches, because a lot of people bought her book.

Now, the other example is Michael Bay. And again, if you guys are Transformers fans, please raise your hands and be honest with yourselves. I’m raising my hands; I’m a big, big, big, big, big, big, big, big Transformers fan and I’ve never witnessed such horrible butchering of a brand I love when I was 9 years old. Now, while I disagree with how Michael Bay interprets the Transformers franchise, you guys know what disagrees with me? Not one, not two, not four, but five blockbusters, right? So the guy has off-page on his side. It doesn’t matter what I think. And even if let’s say I was an academic, even if I was an academic, cinematographer, or I took up film, it doesn’t matter what my opinion is. Public opinion stands with him and therefore, Michael Bay is a significant person in the film industry, because he’s got proven results and people, you know, watch his movies.

[slide: Off-Page (Best Practices)]

So, let’s talk off-page best practices. The best practices for off-page really are about you getting mentioned online. Whether we like it or not, that’s how the Google algorithm works and that’s how most search engines work now. They look for vouches of you online.

Now, off-page is not this simple. It isn't just three things to do to improve your digital footprint, because the unique C-classes matter, the unique physical servers matter, and the power of the website that vouches for you matters. Whether your link is site-wide or unique to a page matters; whether you're a sidebar link or in the body content matters.

So, off-page has a lot of metrics, but the most actionable items for you to do is first off, improve your content marketing strategy. Create compelling content that resonates with your audience. Create compelling content that resonates with your audience.

A lot of you are on this webinar because you're part of our newsletter list. Some of you may have noticed that we've been sending out newsletters on case studies that we've done. It is not the first time we've done newsletter marketing, or created content, or blogged about it. But when we started writing case studies, it resonated with our audience. It produced more traffic in a single day than we'd ever seen, and it produces more unique visits than any other marketing practice we've tried.

So it was amazing; it was a big eye-opener for us, because for us, that was an a-ha moment. A-ha, our clients like case studies. They like to see how the practical application of SEO works on a business. So, create compelling content that resonates with your audience.

Next is get your client's business on relevant business listings. This one is so easy, and I'm always surprised at how many people don't do it. Whenever we take a look at a website, we see how many vouches it has online. But more than that, we specifically look for business listings, because we look for name, address, and phone number consistency. And 90% of the time, on the sites we work on, there are no relevant business listings – either that, or unclaimed business listings, meaning ones the listing sites created themselves and the customer never acted on to optimize their presence online. But this is such an easy win. Yelp, Yellow Pages, Google My Business, even a Facebook fan page all allow unique C-class links to point to your site. It is link building in its most legit way. Right?

The other is if you’re really good at optimizing a YouTube channel, for example, optimize for that. Make sure that you build on your clients’ online presence. Put them on social media. Make sure they’ve got a fan page and make sure that the fan page is linked to the site. Remember: Google can’t see activities that happen in Facebook. They, uh, don’t talk well to each other because, you know, they’re sort of competing on search. But, any social activity that is present on the website or visible on the website is visible to Google. So, put them on social media. You are able to earn a link for yourself.

The next one is: does your brand, your customer, your client have a YouTube channel? It's such a shame because, you know, in the description you're actually able to point a link back to yourself, which is a vouch – nobody owns the websites on YouTube's servers except YouTube, so it's another unique C-class referral. And more than that, a lot of people don't utilize the cards. When a YouTube video plays, you have the ability to insert these little popups, these little snippets that appear – those are the YouTube cards. You are able to put a link in those, and a lot of people have not leveraged that.

And then of course, improve your client’s brand mention not just as a keyword. Help them market themselves, help them present themselves as leaders in their industry or as thought leaders and help them get their brand recognized. Peddle creative content, sophisticated content, funny content on their behalf and get them mentioned everywhere else.

So I'll give you guys an example of this. A couple of weeks ago, Dan Petrovic was here in Manila and he joined us for the SEO summit, and he did this really great study. He ran a Google survey about what characteristics women like about men, and in the survey, what won was the eyes, the smile, the chest, and the arms. And just to make it more interesting, right after you filled out the survey, you went to a page with a local celebrity with a heat map on it.

Meaning, he was tracking whether people were telling the truth in terms of what they voted for on their surveys. It was controversial. Dan got published locally on local print and he got published online by the local media. Excellent way to get yourselves known, a great way to earn a link. So, improve your content marketing strategy, create compelling content that resonates with your audience, get your clients’ business on relevant business listings, and establish your client’s online presence – excellent ways to work on off-page.

[slide: Behavior/Best Practices]

Now let’s talk about the behavior metric. You can’t directly control this metric, but you can see what happens inside analytics. What you do have is the ability to intervene in this metric. So, this is about how users interact with your clients’ websites when they’re searching.

In 2001-2002, we started working with the Urchin company, which pioneered web analytics. Google has since bought Urchin, and it is now what you know as Google Analytics. In Google Analytics, your site is measured against several metrics: visits, bounces, exits, sessions, repeats, uniques, and – actually, yeah, you do see average CTR in analytics. Out of all the behavioral metrics, the most important one is CTR, because CTR happens at the SERP level. It happens at the search level; it doesn't even happen inside the site.

So, what do you do to mitigate – to earn more visitors, avoid bounces, avoid exits, create more sessions, get more repeat customers, and get them to refer new unique visits to your site? Plan your site architecture. Make your site as shallow as you can. A lot of people don't seem to know that you lose 74% of users past the third click. Make your sites as shallow as you can, and make all the most pertinent information available in three clicks or less. Optimize your meta descriptions and showcase your client's brand name and value proposition clearly. Attract them to come to the site. The meta description is a CTR device with no equal. And then of course, listen to your users.

Google rule #3: build for the user. Apply layout changes based on how users behave on your site. If you realize they go cold past the halfway scroll point of your page, put all valuable information at the top. Let's say you're selling flooring – hardwood, laminate, engineered wood, and tiles – and by installing a heat map on your page you realize that most people who visit your site click on tiles. Do not insist on putting the tiles to the right. Make them accessible to your users by adjusting to their preferences. If the majority of your users need tiles, move it to the left. This is what users are looking for.

So, plan your site architecture, make your clicks shallow, and don’t lose 74% of your users. The same is true for your shopping cart: don’t make it difficult to use by requiring more than three clicks to check out, or you will lose a lot of sales. Optimize your meta descriptions, because they are a significantly powerful CTR tool. Apply layout changes to your site based on user behavior and remember: build for your user.

Now, one more tip: if you’ve got Search Console or Google Webmaster Tools installed on your site – and you should, if you don’t – and you want to know what to optimize for, Google tells you flat out what it is. Google tells you: hey site owner, hey webmaster, I presented you as a search result x thousand times. Out of the x thousand times I presented you, here are the queries I presented you for and here are the number of times people clicked on you.

If you’re seeing several hundred impressions for a specific term that’s relevant to you but no one is clicking, you may want to optimize your meta description to encourage better click-through. A meta description is like the trailer to a movie, you know – it can create commitment on the spot if it’s really, really good. So, this is the behavior group of metrics and, you know, it’s a mouthful. I could talk to you guys for eight hours on this metric alone.
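As a quick illustration of that triage – the queries, numbers, and thresholds below are hypothetical, not real Search Console data – a few lines of Python can flag high-impression, low-click queries whose meta descriptions may deserve a rewrite:

```python
# Flag queries with many impressions but a poor click-through rate.
# The rows are hypothetical examples of Search Console performance data:
# (query, impressions, clicks).
rows = [
    ("hardwood flooring installers", 4200, 210),
    ("engineered wood vs laminate", 1800, 9),
    ("tile flooring near me", 950, 57),
]

MIN_IMPRESSIONS = 500   # ignore low-volume queries
CTR_THRESHOLD = 0.02    # flag anything under a 2% click-through rate

def low_ctr_queries(rows, min_impressions=MIN_IMPRESSIONS, threshold=CTR_THRESHOLD):
    """Return (query, ctr) pairs that get seen a lot but rarely get clicked."""
    flagged = []
    for query, impressions, clicks in rows:
        if impressions < min_impressions:
            continue
        ctr = clicks / impressions
        if ctr < threshold:
            flagged.append((query, round(ctr, 4)))
    return flagged

print(low_ctr_queries(rows))
# e.g. [('engineered wood vs laminate', 0.005)]
```

Run something like this against an actual Search Console performance export and the flagged queries are your meta description rewrite candidates.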

[slide: Quality/Best Practices]

Let’s move forward to the quality group of metrics. These are the most modern of all the metrics. The first algorithm to introduce this in a major way was the Panda, which rolled out in 2011.

Now, unlike all the other metrics that we’ve talked about so far, the quality metric is an on-page regulating algorithm. It doesn’t necessarily give you an additional score. It decides whether you deserve your entire on-page score, whether it should deduct half a percent, two percent, or three percent, or whether it should augment it.

Now the beauty of the on-page algorithm is it tells us how smart the web is becoming. This is about content originality, the mobile compatibility of your site, and what kind of mobile compatibility you created – are you dynamically serving, are you M-dot, or are you a responsive website? They all impact search differently.

What about sociability? Sociability is your website’s ability to engage users on a social level: the integration of the like button, the follow button, the ability to share content. One very powerful piece of content that can sit on your website is user-generated content, and that’s when users engage with your content. They drop comments, they agree, they disagree, they form an opinion, they engage you. Those are very powerful pieces of content and all of that goes into the sociability metric.

What about the security of your site? Do you have an SSL certificate tied to your website? Do you redirect properly to https? And the user-friendliness of your website: are you accessible across multiple platforms? Do users see elements where the eye looks for them?
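On the https point, here is a minimal sketch of the redirect – this assumes an Apache server with mod_rewrite enabled, and the exact rules depend on your host:

```apache
# .htaccess: send every plain-http request to the https version of the same URL
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```

The 301 tells search engines the move is permanent, so the https URLs inherit the old URLs’ equity.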

Also, content quality. Is your content original or did you just rip it off from somewhere? Is it valuable? Does it exist in multiple content formats on your web page? And then there are trust signals on your website – for example, a BBB badge that is linked to your BBB page.

So, one of the things I like to remind people is: when you put your clients on your site, when you put your trust badges on your site – where you are certified, which associations you’re with – they don’t really matter if they’re just images; Google establishes primary relationships via the existence of a link. Don’t be afraid to point a nofollow link to BBB, or to your Google My Business page where you get reviewed. If you’ve got a Shopify page, point a link there. If you’ve got a Bizrate page, point it there. Wherever you are reviewed, point a link there.
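To make that concrete – the URLs and file names here are placeholders – a badge only signals a relationship when the image is wrapped in a link, for example:

```html
<!-- A trust badge Google can read as a relationship: image wrapped in a nofollow link -->
<a href="https://www.bbb.org/your-business-profile" rel="nofollow">
  <img src="/images/bbb-accredited.png" alt="BBB Accredited Business">
</a>
```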

So, what do you do to apply quality metrics to your site? A couple of quick wins is, you know, apply an SSL certificate to your site, therefore making it https. Build your site using a mobile-friendly framework.

Don’t try to reinvent the wheel. A lot of WordPress themes that cost forty to forty-nine dollars are already mobile responsive and a lot of them are fast. Use modern code. Make sure that your websites express themselves in HTML5, not HTML4 – you know, the `font color` tag is really, really old-school HTML. You don’t need that code anymore. Allow for social signals and interaction; don’t be selfish with your website. Google appreciates non-self-serving websites more than it does self-serving websites.

Make your content original, worthwhile, and value-adding. Make sure that neurons do not commit suicide when they read your content. Implement rich media on your website. Leverage Web 2.0 and HTML5 – wherever you have an opportunity to include rich media on your page, include rich media.

I mean, just take a look at the case study we showed you, right? Again, 70 websites that had three to four more content formats indexed in one to three days. That is tremendous. Apply trust signals and badges, and remember: trust signals and badges must have an accompanying link. And then link to your reviews. Don’t try to cheat the algorithm by putting a review schema on your site and then just making up reviews. It doesn’t work like that.

Search engines are smart enough that review content doesn’t matter unless it can be verified on a third-party site that isn’t yours, where users can freely comment. Okay, now, one more thing to mention about the quality group of metrics. Unlike all the other metrics we’ve talked about so far, the quality group of metrics can actually be leveraged to give you a boost every two to three weeks. If you guys go to Moz.com – and we agree with them – Moz says the Panda crawls the web every month.

Based on our experience, we see the quality group of metrics rescore you about every two to three weeks. If you guys need a little boost in visibility, a little boost in rankings, optimize for one of these elements and you will see gradual increases every two to three weeks post-implementation against your website.

[slide: Trust/Best Practices]

Now, let’s move on to trust. This one is also a regulating algorithm, but it regulates off-page. Trust is generally known as the Penguin. Now, one of the things I need to mention about the Panda and the Penguin: these two algorithms are very unique. Never before in the history of Google had an algorithm been developed to penalize websites. All the previous algorithms – Mayday, Caffeine, Brandy, Florida – were meant to reward websites that comply with the metrics they introduce to the algorithm.

But in 2011 and 2012, Google changed all that by introducing the Panda and the Penguin. The Panda will actually penalize you and leave a nice love letter inside your search console. The Penguin, though, is equally unique. Aside from the fact that it will penalize you, the Penguin will actually penalize you for retroactive work. Meaning, if at one point in time you hired a really bad SEO and didn’t bother to correct his work, and the Penguin sees an attempt to manipulate your digital footprint – your backlinks, bought links – you will get penalized.

So, trust is an off-page regulating algorithm and it happens naturally over time. It takes a look at the C-class distributions of your links, the PageRank distribution of your links, the contextual distribution of your links. All of these are trust signals. Don’t try to manipulate this, don’t try to optimize for trust.

What you can do is avoid the practices that get you in trouble against this metric. It’s easy to spot. If Google issues a partial match warning against your site, it’s telling you the backlinks that point to your site don’t follow a natural distribution. And so they leave a love letter, and sometimes Google will tell you: you don’t need to do anything yet, we’re investigating whether these links look contrived. Or they’ll tell you: a-ha, we caught you buying links, you’re penalized.

Don’t hire cheap SEO services. Trust only experts, people that can talk to you at the level I’m talking to you now.

Build thought leadership, this is where you leverage your expertise. Get people to naturally ask for your content. Don’t buy links. Most people that sell links still do it in a very bad way that’s very easy for Google and other search engines to spot. Don’t buy your links; earn them and then balance your digital footprint.

I’ll give you guys a quick story on this: we had a client that had trouble trying to optimize for his brand because, if you took a look at his backlink distribution, his exact match anchor text was extremely high. When we took a look at why, it’s because he formed a partnership with another company, and that company decided to point a link to him using an exact match term, dofollow, in the footer.

Now of course we didn’t tell him “sever your relationship with this partner.” But we told them: if that’s the way you’re going to point the link, can we just get a body link instead, or can you set those footer links that point to us to nofollow – especially since the link was earned through a commercial relationship.

Remember: Google doesn’t say it’s forbidden to earn a link with a commercial relationship. They just say that it is unethical to earn the PageRank or the power from that link if it was acquired commercially. So, balance your digital footprint, don’t break the trust signals. That way, you don’t receive penalties inside your search console.

[slide: Google’s 4 Golden Rules]

Now before I wrap up and get on to the question and answer, four of Google’s golden rules.

Google’s mantra is “don’t be evil.” I’m not sure if you guys have seen enough Google literature out there, but the Google mantra, the Google motto, is don’t be evil. Right? How much more can you say? That’s pretty ambitious.

Google rule number two, though, is: you searched and you found what you were looking for. This is Google’s model. This is their mission; this is the core of their business. Google’s core business is search. Therefore, they only stay on top of that game by giving relevant results over and over again, consistently, to their users – because Google rule number 2 is you search and you find what you’re looking for as a user.

Rule number 3: build for the user. Google has been preaching this for a long time, and until they rolled out algorithms that penalize websites, no one would build for the user. Build for your users; don’t build for bots, don’t optimize for search engine bots. The search engines are getting very, very smart. They are beginning to understand the content on your site.

One of the metrics we didn’t touch on earlier, which is part of the on-page group of metrics, is LSI or latent semantic indexing. What this means is the search engine attempts to read and comprehend the content on your site without you stuffing it with keywords. It doesn’t need a lot of help figuring out whether content is unique and valuable or not. Build for your users – navigation, design, content, site architecture, everything.

Now, Google rule number 4: don’t game the algorithm. Do not attempt to build a link just for the sake of passing PageRank. And if you game the algorithm, well we all sort of know what happens, but don’t game the algorithm.

All of the rules in the Google Webmaster quality guidelines are covered by these four sacred rules. If you guys comply with them, if you follow them, then you’re already on your way to becoming a more relevant brand.

[slide: For Our Partners]

Now, for our partners, the guys that already have access to our dashboard, we built Simplifying SEO: The Six Core Metrics You Need to Know. This is now in a downloadable PDF training guide inside the dashboard, just follow the bit.ly link, which is bit.ly/coremetricsguide and we will now proceed to the Q&A session.

[slide: Q&A Session]

So if you guys have questions, feel free to type them inside the question box. You could also schedule a call with us by following the bit.ly link here. Ask for Tim or William. These guys are terrific. They will sound nearly as awesome as I do – they know almost everything I do. Of course they don’t have 18 years of experience; you know, they’re relatively young and I’m relatively old. Those are our numbers, feel free to call us. And by the way, if you guys email info@seoreseller.com, I may not respond to you personally, but I promise you I see each and every email that goes into that mailbox.

So we’ve got the first question and it says: I noticed in the dashboard SEOReseller will build links each month for a client. How were they able to get content and links posted on the third-party site that links back to my client’s website?

Okay, so great question. When we build links, when we acquire links for your clients’ websites, we produce content for them and then we build partnerships with publishing partners – and there are three of them. Before we decided to partner with these groups, we measured their websites for trust flow, citation flow, whether they had ever been penalized before, whether they actually carry any PageRank, and so on and so forth. It’s a whole slew of metrics, but they need to pass very stringent guidelines in order for us to treat them as a partner.

Now, part of the arrangement with these partners is that when we produce content, they place it on their niche properties that are relevant to where that content is needed. And there are also very stringent rules – we can’t point oodles of links. Every now and then, some of the publications will not allow dofollow links, and the distribution of dofollow and nofollow links is perfectly natural. So that’s how we formed those partnerships, that’s how we acquire those links, and that’s how we publish them. I hope that answers the question. If not, feel free to send a follow-up.

Next one, how do I know when my client’s website has authority? Is there a tool I can use to analyze authority?

So the answer is yes and yes. In spite of the fact that the last public update of PageRank was done in 2014, you should be able to see if there is any residual PageRank on your client’s website by just searching, you know, “PageRank [domain]” – you’ll find third-party tools that scraped the last public PageRank report for your site. That’s one.

Now, you can’t always trust that, because Google updates PageRank frequently. So, this is when you start using DA and PA, MozTrust, Cemper Power and Cemper Trust, and Majestic’s Trust Flow and Citation Flow. What these metrics try to do is give you a good guesstimate – each tool’s own estimation of the authority of your website, its own algorithm and metric that tries to measure the authoritativeness of the site.

From what we’ve studied, it seems that the trust metrics across all these different tools align very closely with PageRank. So you can use those tools to figure out whether your PageRank most likely increased or decreased in the eyes of Google. You can use Moz, Cemper’s LinkResearchTools, or Majestic to figure out if you are becoming more or less authoritative.

Next question: do you think it’s good to do SEO and display advertising?

Yes, amen, yes I do. I’m a big believer in this. SEO is very momentum-driven. If your site is particularly new, there’s no way you will rule the first page above the fold. Unless you live in the Philippines, because here it’s very easy to optimize for stuff and nobody has mature optimization characteristics – but that’s beside the point.

SEO is momentum-driven, but that doesn’t mean your website doesn’t need traffic now. It does. And I wouldn’t just say display advertising – if you’re going to do Google AdWords, run the search network and the display network at the same time. The purposes are entirely different. Inside the search network, when someone is doing a search, you can become more visible and drive instant traffic to your site. With the display network, you’re able to target specific demographics to see your ad, but what I really love about the display network is the remarketing capability: your ability to leave a sticky message and follow a user around wherever they go until they are ready to buy. Now, we don’t hound users that come to SEOReseller for more than 30 days, and we also like to change the ads. But I think it’s an extremely good idea to exercise organic optimization practices on your site and make use of Google AdWords at the same time.

The next question is: what’s the difference between indexing and crawling?

Wow, okay, great question. This one is relatively simple. Crawling is a function of the bot and indexing is a function of the search engine. Crawling is when the search engine scans your website; indexing is when a copy of your site is made in the search engine’s index. A great way to see crawling versus indexing in practice: type in site, colon, and then your domain – site:yourdomain.com – and, if you know the actual number of pages on your site, take a look at how many results Google finds and whether it matches the actual number of pages on your site. They do not always match.

Now, where there’s a dofollow link, the bots are compelled to follow it. Meaning, if all your pages are interconnected, Google’s bot will follow all of them, and therefore you can get completely crawled. But Google may not consider all your pages valuable, and it will only index – only keep a copy of – the most valuable pages on your site. And that’s why it’ll never be one-to-one, even if you force fetch-and-render your pages.

Google determines the quality of a page and whether it deserves to be seen by a user. Until it does, Google keeps it off the index, because, I mean, let’s face it, in spite of Google becoming this powerful, there are still physical limitations in terms of resources, right? So they have to keep the most valuable pieces of information and present those to users.
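A toy sketch of that distinction – the site graph and the “valuable page” rule below are entirely made up for illustration – crawling discovers every page a dofollow link reaches, while indexing keeps only the pages the engine judges worth serving:

```python
# Toy model: crawling discovers pages by following links;
# indexing keeps only the subset the engine judges valuable.
site = {
    "/": ["/products", "/about"],
    "/products": ["/products/tiles", "/products/hardwood"],
    "/about": [],
    "/products/tiles": [],
    "/products/hardwood": [],
}
# Pretend "value" is just having substantial content (a made-up rule).
word_counts = {"/": 900, "/products": 650, "/about": 40,
               "/products/tiles": 1200, "/products/hardwood": 30}

def crawl(start="/"):
    """Follow every link from the start page; return all discovered pages."""
    seen, stack = set(), [start]
    while stack:
        page = stack.pop()
        if page in seen:
            continue
        seen.add(page)
        stack.extend(site.get(page, []))
    return seen

def index(crawled, min_words=300):
    """Keep only crawled pages that pass the (hypothetical) value threshold."""
    return {p for p in crawled if word_counts.get(p, 0) >= min_words}

crawled = crawl()
indexed = index(crawled)
print(len(crawled), len(indexed))  # crawled 5 pages, indexed only 3
```

On a real site the “value” judgment is Google’s, not a word count, but the crawled-versus-indexed gap behaves the same way.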

The next one is a relatively easy question, and it says: how long does it take to see results from SEO?

And this one is like, really, wow – there are a lot of ways to answer that, because I could equally ask: are you trying to do local SEO or purely national-level SEO? Where are you located – the Philippines, New Zealand, Australia, the US, the UK? It sort of depends on where you are.

Because we’re talking to a Western audience, I’m going to say what we usually say: about six months, roughly. You could see some great results in three months by the way. In fact, for us, the most compelling results are seen within the first 60 days.

Under the assumption the partner and the customer implement all our recommendations, the most drastic results happen two to three weeks after the on-page is implemented. That’s when we see the strongest results for SEO, but maximum results can typically be seen somewhere within the sixth month.

Now, if you’re in the Philippines, you’ll see it in about 30 days – again, because you’re competing with, you know, not-so-mature websites. Same thing with New Zealand; we’ve got a great partner in New Zealand and we work on hundreds of campaigns for them also. They are also immature in terms of optimization characteristics. Therefore, if you optimize a site, chances are there’s a huge chance you’re the first person in that vertical, in that niche, optimizing a website for that audience, and therefore it’s pretty easy to earn first page rankings in those regions. The Middle East is also still easy in the English language. We have not tried to optimize in Arabic. When we do, I’ll let you guys know – it’ll probably be an entire webinar.

Moving on: what’s the biggest change you’ve seen in SEO during 2016? Oh wow. Uh, this one is a little tough, so it might be a mouthful – it’s a story.

At the start of the year, we observed that older school metrics seemed to be particularly powerful, and when I say older school metrics, I’m talking about keyword density and authority – how the age of the domain seems to affect searches a lot.

In 2014, LSI was particularly powerful, almost all throughout the year. But in 2016, it seems that the older school metrics are very pervasive. Backlinks don’t seem to be as powerful through 2016, and in the last two months, actually, it’s been crazy. Sites that we didn’t expect to appear in blended results started appearing in blended results, and sites that were deeply entrenched inside the local snack pack started disappearing. These are big changes, because the signals Google used to put you in the snack pack have obviously changed – we had sites that weren’t appearing in the snack pack appear in it, and sites that dominated the snack pack disappear from it. In fact, this is still a big topic of research internally in the office, but there have been a few changes in SEO in 2016.

What is the next question? Okay, so next one is we have two similar questions: what do you think is the future of SEO and do you think there will be a job in that business? Wow, so whoever asked this question, you should fly over, we should sit down for coffee, because you and I could sit in Starbucks for like four days, eight hours a day and I still wouldn’t be done.

So let me try to answer this in sequence. What do I think the future of SEO is? The future of SEO is the search engines anticipating what you need. When Siri came out, I personally thought Apple had taken the lead in the search race, because Siri is a learning AI. She’s not an algorithm, she’s an AI – or he, because now Siri can be British – and it’s extremely intelligent: it learns the nuances of your speech, learns your patterns, and it just got better and better. But then Siri got relegated from being strong AI to weak AI, and now Google has a significantly stronger AI.

I’m not sure if you guys have played with the Google app, the one you can dictate to. It’s amazing. I’ve tried it in different accents and it picks me up every time. It knows what I’m saying, it understands the context of what I’m saying. But more than that, I think the more users use search engines, the more the search engine learns to understand your behavior and then learns to anticipate what you need.

I think the future of search is very anticipatory in nature. Part of it would be the presence of more virtual-assistive devices out there. And for SEO – I got asked this at the SEO Summit this year: when do I think SEO will become obsolete? And my answer was: when the web becomes perfect, because this is really what SEOs do. Now I’ll pull a quote from Sam Nam. If you guys don’t know him, Sam Nam was the vice president of marketing for the Digital Room. Awesome, awesome guy – I’m actually having a conversation with him tomorrow. And Sam said: a good SEO’s job is to make the web a better place. In my opinion, that’s what SEOs do – they make the web a better place. So do I think there will be a job in that business? Yes, because as long as the web continues to evolve, it will always move away from being perfect, and there will always be a job for the people responsible for making the web a better place – for website owners, for website users, and for Internet users in general.

The next one is: does SEO have a future? It seems like everything above the fold is ad-driven. What do I tell clients who ask me this question? So, yes, SEO does have a future. And if your clients tell you it seems like everything above the fold is ad-driven, what do you tell them? You have to tell them that SEO is an investment in free traffic, because that is essentially what it is.

According to MarketingSherpa – and we will personally vouch for this because it is what we’ve seen with our end clients – SEO has a 20-factor return on investment. It has that capability because you invest in your on-page once, you optimize your website once, and that on-page score is yours forever. Right? It takes a while for you to lose above-the-fold rankings on page one, and while you’re there, it is free traffic.

Now, what I strongly discourage people from, when they sell SEO, is pitching SEO as just, you know, link building forever. It’s not link building forever. In fact, we don’t link build forever. Some of our clients start with a consultation phase for the first 30 to 60 days, and then we always go back to on-page on a periodic basis. Every three to six months, we revisit to see: do we have opportunities? What does the search console tell us? So SEO is highly consultative, highly on-page, and highly quality-driven in nature.

So, will your clients get seen on the first page given that there are so many Google ads? Yes, they will. Google is extremely smart, by the way. They have no interest in making the first page entirely ad-driven, because the day they do that is the day they give all their market share to Bing.

Google is all about providing great value to users and they’re intelligent about this. Google specifically segregates what’s a user and what’s a customer and they always try to balance out the agenda between those two. Google is relevant not because of the customers. They’re profitable because of the customers. But they’re relevant because of the sheer volume of users that use them as a resource. One hundred billion searches every month.

So the next one is – oh, this is interesting: what’s your take on infographics? How can they impact an SEO campaign strategy? Infographics are great, especially if you post them in the right places with a link coming back to you, because you can get either direct visits or referral visits from them, depending on where you publish them. They’re another type of link, beyond just another anchor text link pointing back to your website. And from what I’ve seen, when infographics are controversial, or awesome at distilling complex information into something simple, they’re great at driving traffic to a site. So that’s how they can impact your SEO campaign strategy: you don’t only look at them as a link resource, you also look at them as a referral traffic driver.

Okay, next one. I’ll take the last two questions because we’re actually out of time, so I thank everybody for keeping up with me for the minute that I overextended.

How do you keep up with the ever-changing algorithm which Google serves? We do research. We own over 700 websites and we run tests on them. We encourage our employees to run their own websites and their own tests. We spend thousands of dollars trying to figure out what the algorithm likes and dislikes, and we change our methodology accordingly – that’s how we keep up with it. That’s why we say stay away from cheap SEOs. I know we’re not the cheapest provider, but there’s a reason for that, right? That research costs money. Trial and error, testing – all of these take resources. And applying a safe methodology, one that doesn’t get your clients in trouble but does drive results, is not something that happens for free. We don’t just have, you know, six webmasters that crawl the web looking for blogs that say this or that is valuable. We only vouch for methods that we’ve tested ourselves.

The last one, or the last question I’ll take: my client’s rankings have decreased. How do I see what’s the problem? My provider is being very evasive. Oh wow, awesome.

A: talk to us. No commitments. We won’t bill you for it; we always perform a pre-intake audit to find out if a website is in trouble. Talk to Tim. Give him the URL and have him perform a pre-intake audit for you. If it’s a simple problem, we’ll find it very quickly. The pre-intake audit takes 35 minutes to do, but that’s an entirely different webinar – maybe I should introduce it.

So how do you find out why your rankings are dropping? When our employees do an audit, they do it from five perspectives. In the same way that we have six metric groups, we have five audit groups. The audit groups are: what does your site look like inside the index? What does your on-page look like? What does your digital footprint look like? Do you have any trust violations? And the last one: do you have mature on-page characteristics? We filter through those one by one, and by doing it a lot, our SEOs are actually able to figure out what’s wrong with a website.

It’s intuitive for them, because they see it hundreds of times every month. They take a look at the site and most of the time – about 80% of the time – figure out what’s wrong in about 15 minutes, and then it takes them about an hour to create a document. So they take about 15 minutes to figure out what’s wrong with a website. If you’re interested in getting help but there’s no professional in house – we’re not a charity, but we do offer value first. And if you want to see what we’re made of, we can do a pre-intake audit against the site. Give us a ring and ask for Tim. Give us the URL and we can do a pre-intake audit.

In my opinion, no provider should be evasive. Transparency is not one of our core values, it’s like integrity. It has to be a given. So your provider cannot be evasive. Nobody should do a song and dance number on SEO.

And with that, I thank everybody. These were awesome, whoo! Challenging questions, but I really, really, really enjoyed it. If any of you guys happen to be in the Philippines, pop into the office, give us a ring, ask for our address. But you know, pop into the office and I’ll take you out for coffee and I can talk about SEO to the wazoo.

But thank you very much, you guys have our numbers and again I look at that mailbox every day. This is Bernard, I’m signing off. I’m signing off the webinar and I appreciate you being here.