Organizing your website architecture the right way is essential to the success of your SEO campaign. This can be tricky, given the algorithmic updates Google has released over the past year. Strong website architecture also makes it easier for visitors to find the information they need on your site, improving their browsing experience. Today, I want to discuss how you can come up with the best SEO-friendly site architecture.
Start with the Keywords
Your keyword research serves as the foundation for your site. The website you’re creating will be more SEO-friendly if its structure revolves around the language people commonly use to search for your industry. Consider what products and services you will feature on the site, and create a list of terms commonly used for each. Run the list through Google’s keyword tool to see which terms generally receive the most queries, which are competitive locally, and which you can realistically rank for.
Take these terms and arrange them in a spreadsheet according to their category. This will serve as your initial structure. Remember to target just one keyword on each page. Targeting multiple keywords on one page can confuse algorithms; as a result, you may not rank for any keyword at all. The homepage may be more difficult, because you can’t avoid mentioning multiple keywords – the trick is to distribute them naturally throughout the body text and the metadata when you create the content later.
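To make the mapping concrete, here’s a minimal sketch of turning categorized keyword research into a one-keyword-per-page spreadsheet. The categories, terms, and filename are hypothetical placeholders, not real query data:

```python
import csv

# Hypothetical keyword research results: category -> candidate terms,
# ordered by search volume (illustrative data, not real query counts).
keyword_map = {
    "web-design": ["web design services", "custom website design"],
    "seo": ["seo services", "local seo company"],
}

# One primary keyword per page: the top term in each category becomes
# that category page's target; the rest each get a supporting page.
with open("site_structure.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["category", "page", "primary_keyword"])
    for category, terms in keyword_map.items():
        for i, term in enumerate(terms):
            page = category if i == 0 else term.replace(" ", "-")
            writer.writerow([category, page, term])
```

Each row then corresponds to exactly one page and one target keyword, which keeps the one-keyword-per-page rule visible as the site grows.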
Establish your Linking Strategy
Once you have the keywords, you need to figure out how to arrange them to optimize crawling and browsing. Keeping your website shallow makes it easier for crawlers to find, index, and rank your pages. It also makes browsing through pages easier for your visitors. While having only two levels sounds ideal for crawlers, it can make finding information difficult for your visitors, which in turn could affect your rankings and conversions. Keep your structure three to four levels deep.
Silo linking may be a good idea if you are building a static website whose content you will not change often. It gives your site structure by grouping pages according to categories, and it results in a stronger link profile for your homepage. It is not an advisable tactic for dynamic websites like blogs and ecommerce stores, because it would require you to change the links on each page every time you add or remove a product.
Structure your website in a way that maximizes your anticipated incoming traffic. You can do this by applying the “nofollow” attribute to links to pages with less traffic, getting crawlers to focus on the pages you expect to generate high traffic. This process is called PageRank sculpting. It’s easier to sculpt your internal links (compared to your inbound links later on) because you have full control over them.
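As a quick illustration of the tactic described above (the paths and link labels are placeholders), the nofollow attribute sits directly on the anchor tags in your markup:

```html
<!-- Hypothetical footer links: low-priority pages marked nofollow
     so crawlers concentrate on the high-value service page. -->
<a href="/services/seo-services">SEO Services</a>
<a href="/terms-of-use" rel="nofollow">Terms of Use</a>
<a href="/privacy-policy" rel="nofollow">Privacy Policy</a>
```

The first link is followed normally; the other two tell crawlers not to pass authority through them.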
Build Your Structure Around Content
Content is at the core of your structure; naturally, you should build your structure around your content. Based on your keyword research, you can decide what categories you need to create on your site and how to approach your content strategy. Use each page’s primary keyword in its URL, and mention it enough times in the body text. There’s no real formula for how many times you need to mention a keyword, but it’s best to keep its density under 5% of your total word count.
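As a rough sketch of what “under 5%” means, the hypothetical helper below counts a keyword’s share of total words. Real SEO tools handle stemming, stop words, and multi-word phrases more carefully:

```python
def keyword_density(text: str, keyword: str) -> float:
    """Rough keyword density: keyword occurrences as a share of total words.

    Simplified for illustration -- it only matches single words,
    ignoring case and trailing punctuation.
    """
    words = text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w.strip(".,!?") == keyword.lower())
    return hits / len(words)

body = ("We build fast, accessible websites for small businesses. "
        "Our seo team researches the terms your customers actually use, "
        "then writes clear, useful content around each one.")
print(f"{keyword_density(body, 'SEO'):.1%}")  # prints 3.8%, under the 5% ceiling
```

One mention in 26 words comes out to roughly 3.8%, comfortably inside the guideline.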
Simplify your URLs with your target audience in mind. Shorter URLs are easier to remember, more to the point, and therefore more likely to attract traffic. When deciding on your URL structure, keep your categories in mind and create subfolders for them to make crawling easier. The best structure for your URL is: [Homepage]/[Category]/[Product or Service Keyword].
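A small sketch of that URL pattern in code – `slugify`, `page_url`, and example.com are illustrative names, not a standard API:

```python
import re

def slugify(text: str) -> str:
    # Lowercase, replace runs of non-alphanumerics with hyphens, trim extras.
    slug = re.sub(r"[^a-z0-9]+", "-", text.lower())
    return slug.strip("-")

def page_url(domain: str, category: str, keyword: str) -> str:
    # [Homepage]/[Category]/[Product or Service Keyword], as above.
    return f"https://{domain}/{slugify(category)}/{slugify(keyword)}"

print(page_url("example.com", "SEO Services", "Local SEO Audit"))
# prints https://example.com/seo-services/local-seo-audit
```

Generating URLs this way keeps every page slug short, readable, and nested inside its category subfolder.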
Quality is what search engines credit the most today. Keep your content relevant to the keywords you are targeting, and you’ll be creating a strong foundation for your website. Depending on which linking strategy you choose, you may or may not need to use multiple keywords; either way, make your primary keyword the dominant one. If you’re going to use silo linking, find a way to relate your main topic to the page you will be linking to, and make your content flow naturally so Google won’t flag you.
Strengthen Your Support Structure
Finally, you need to reinforce your foundations by making your support structure – your server, your code, and your CMS – just as strong. Make your code as clean and organized as possible to help bots crawl quickly, and follow W3C standards so spiders can parse it easily. Check that your server is in good health and provides enough memory to accommodate your expected content; Google and other search engines factor site speed into rankings, and this is where your server plays an important part. Finally, choose an SEO-friendly CMS: make sure it can be crawled by bots, allows customization and flexibility, and is easy to use.
We’re happy to help you answer your web development questions. Contact your account manager today for assistance or check out our services to get full support. Sign up for a free account and become our partner today to get started!
By Clayton Wood