As we start to turn the page to 2021, a change I'm sure most people are eager to make in light of this year's world events, it's time to look at what B2B SEO practitioners should be planning for to stay current and take advantage of emerging trends. As we evaluate the changes that took place in 2020 and the expectations for 2021 in order to craft a winning SEO strategy, it helps to segment activities into the following categories to put a framework around our planning:
In this first of six articles corresponding to each of these categories, we will look at some of the 2021 SEO priorities that should be top of mind for B2B companies that want to remain competitive and stay ahead of their competition.
If you haven’t heard yet, Google intends to make Core Web Vitals a ranking factor in May of 2021 and has given webmasters and site owners notice to make the necessary changes now or suffer the consequences later. This is true of Core Web Vitals and some other usability factors that are all being put under the umbrella of Page Experience. Here is the recent Google blog post that references the upcoming changes and all of the issues that need to be addressed.
Some of the issues are straightforward, like ensuring your site is secure. For some sites, ensuring that the site is mobile friendly and serves the same content to mobile and desktop users is also straightforward, though others will have challenges to work out to comply with that directive. In many more cases, however, the big obstacle to Page Experience success will be addressing Core Web Vitals and page speed.
The following definition of Core Web Vitals is taken directly from this page on Google’s site:
Core Web Vitals
Core Web Vitals are the subset of Web Vitals that apply to all web pages, should be measured by all site owners, and will be surfaced across all Google tools. Each of the Core Web Vitals represents a distinct facet of the user experience, is measurable in the field, and reflects the real-world experience of a critical user-centric outcome.
The metrics that make up Core Web Vitals will evolve over time. The current set for 2020 focuses on three aspects of the user experience—loading, interactivity, and visual stability—and includes the following metrics (and their respective thresholds):
[Image: threshold recommendation charts for Largest Contentful Paint, First Input Delay, and Cumulative Layout Shift]
Largest Contentful Paint (LCP): measures loading performance. To provide a good user experience, LCP should occur within 2.5 seconds of when the page first starts loading.
First Input Delay (FID): measures interactivity. To provide a good user experience, pages should have a FID of less than 100 milliseconds.
Cumulative Layout Shift (CLS): measures visual stability. To provide a good user experience, pages should maintain a CLS of less than 0.1.
For each of the above metrics, to ensure you’re hitting the recommended target for most of your users, a good threshold to measure is the 75th percentile of page loads, segmented across mobile and desktop devices.
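The 75th-percentile rule above is easy to apply to your own field data. Here is a minimal sketch, assuming you have already collected raw per-pageview measurements (the sample values below are hypothetical):

```python
import math

# Google's published "good" thresholds for the current Core Web Vitals:
# LCP in seconds, FID in milliseconds, CLS unitless.
GOOD_THRESHOLDS = {"LCP": 2.5, "FID": 100, "CLS": 0.1}

def percentile_75(samples):
    """Return the 75th percentile of field measurements (nearest-rank method)."""
    ordered = sorted(samples)
    rank = math.ceil(0.75 * len(ordered))  # 1-indexed rank
    return ordered[rank - 1]

def passes_good_threshold(metric, samples):
    """True if the 75th percentile of samples meets the 'good' bar."""
    return percentile_75(samples) <= GOOD_THRESHOLDS[metric]

# Hypothetical LCP field data (seconds) from mobile page loads:
mobile_lcp = [1.9, 2.1, 2.4, 2.6, 3.8]
print(percentile_75(mobile_lcp))                  # 2.6
print(passes_good_threshold("LCP", mobile_lcp))   # False: 2.6s > 2.5s
```

In practice you would segment samples by device (mobile vs. desktop) before applying the check, as the quoted guidance recommends.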
Improving the scores for each of these new metrics can be a very complex endeavor that requires a deep technical understanding of your code, especially for enterprise-level sites with multiple CMS platforms and page templates. It will undoubtedly require IT resources, along with ongoing attention and education, as these metrics are subject to change and evolve.
This is going to be a resource-intensive initiative for many companies and should be planned and budgeted for immediately as a priority for your 2021 SEO program.
Not just your actual URLs, although that is obviously the top priority: also ensure that all the content of your pages is being indexed and that you aren’t being affected by partial indexing. Partial indexing can happen at the block level, meaning that certain areas of a page are not indexed, either because they are hidden from Google by the way they are implemented or, in some cases, because of low quality.
Rather than the cache: command, use a combination of a unique string of text from your page and the site: command:
“Unique string of text in quotes” + site:www.yoursite.com
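If you run this check across many pages, it can help to generate the queries programmatically. A small sketch (the snippet and domain below are placeholders):

```python
def indexing_check_query(snippet, domain):
    """Build a Google query that checks whether a unique on-page
    snippet has been indexed for a given site."""
    return f'"{snippet}" site:{domain}'

# Hypothetical snippet from your homepage:
query = indexing_check_query(
    "data-driven digital marketing agency",
    "www.yoursite.com",
)
print(query)  # "data-driven digital marketing agency" site:www.yoursite.com
```

Paste the resulting query into Google; if no result comes back, that block of text has likely not been indexed.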
Here is an example for BusinessOnline:
It’s good to see that our homepage content is correctly indexed 🙂
For a more accurate assessment, which is a bit more complicated, you can use another methodology that relies on Google Search Console to render the page:
Use the inspect URL function for the URL you wish to evaluate:
Assuming your URL is indexed, you will see the following message:
Click on the Test Live URL button and you should get the following message:
Keep in mind this gives you up-to-date results that may or may not reflect what Google currently has in its index. If you have recently updated your page, check the crawl date to see when Google last crawled it:
Click the View Tested Page link. The right-hand panel will populate with the rendered HTML code:
Do a Ctrl+F in this section and search for the string of content you are testing. If you find the text, it has been indexed.
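If you save the rendered HTML to a file, a small script can replace the manual find-in-page step, which is handy when checking many snippets at once. A rough sketch, using a deliberately crude tag-stripping approach (a real pipeline might use a proper HTML parser):

```python
import re
from html import unescape

def rendered_text(html):
    """Crude text extraction: drop script/style blocks, then all tags."""
    html = re.sub(r"(?is)<(script|style)\b.*?</\1>", " ", html)
    text = re.sub(r"(?s)<[^>]+>", " ", html)
    return re.sub(r"\s+", " ", unescape(text)).strip()

def snippet_is_rendered(html, snippet):
    """True if the snippet survives into the visible rendered text."""
    return snippet.lower() in rendered_text(html).lower()

# Hypothetical rendered HTML saved from Search Console:
page = """<html><body>
  <script>var hidden = "not content";</script>
  <h1>B2B SEO Planning for 2021</h1>
</body></html>"""
print(snippet_is_rendered(page, "SEO Planning for 2021"))  # True
```

Text that only exists inside scripts, or that never makes it into the rendered HTML at all, will fail this check and deserves a closer look.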
Fixing these issues is a long and complicated topic that exceeds the available space in this post, but suffice it to say that it requires a knowledgeable and experienced professional.
In terms of 2021 planning and priorities, you should ensure that all of your current content is being correctly indexed. Prioritize by importance and evaluate all of the different templates that exist within your site. Set aside time throughout the year to continually evaluate your content and how it is being indexed, and to confirm that all of your URLs have been indexed as well.
This is another potentially complex area of focus. Most of the duplicate content issues that I see are a result of redundant URL paths to the same content. In many cases, this situation has been created inadvertently. In general, you should ensure that the following best practices have been observed:
You should consistently review Google Search Console to monitor which URLs Google is indexing and which it is excluding, in order to identify any unintended duplicate content issues.
This is especially true for the Excluded section of the Coverage report, within the following categories:
The Alternate page with proper canonical section should show you pages that have correctly been assigned different canonical URLs; you will want to validate that that is the case. In the case of Blocked by robots.txt and Excluded by noindex tag, you are checking whether you have intentionally excluded URLs to avoid duplicate content, which should typically be a last resort. Keep in mind that just because something is blocked by the robots.txt file does not mean it is invisible to Google; it only means it will not be discovered via a standard crawl of your site. Google can still find those pages through external links. The noindex tag is the better option to ensure that Google will not index a page.
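To make the distinction concrete: a robots.txt rule blocks crawling but not necessarily indexing, while a noindex directive blocks indexing. A minimal sketch (the path is a placeholder):

```text
# robots.txt — blocks crawling, but the URL can still end up indexed
# if Google discovers it through external links:
User-agent: *
Disallow: /internal-search/

# On-page alternative — allows crawling but blocks indexing:
<meta name="robots" content="noindex">
```

Note that for the noindex directive to be seen, the page must not also be blocked in robots.txt; Google has to be able to crawl the page to read the tag.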
The Duplicate without user-selected canonical and the Duplicate, Google chose different canonical than user categories are especially important to evaluate and correct in order to prevent duplicate content issues.
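For the duplicate categories, the usual remedy is to have every URL variant declare the canonical version you intend. A hypothetical example for a parameterized URL:

```text
<!-- On https://www.example.com/products/?sort=price -->
<link rel="canonical" href="https://www.example.com/products/">
```

Once each variant points at a single preferred URL, the user-selected and Google-selected canonicals should converge over subsequent crawls.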
In terms of 2021 planning, we recommend an end-of-year technical audit to identify any duplicate content issues so they can be put on the radar for next year. Consistent time should be allotted on a monthly basis to evaluate the state of the site with respect to these issues. The larger and more complex the site, the more often it should be monitored.
Here is a list of things you can do to clean up broken elements on your site. These issues should be evaluated at least quarterly.
In terms of 2021 planning, you should budget for ongoing monitoring and resolution of these issues on a consistent basis. The larger and more complex your site, the more often you will want to be on the lookout for problems of this nature. This is especially true for enterprise B2B companies that change frequently or have multiple CMS systems and/or microsites. It is very common for blogs to run on a separate CMS; in those cases, a separate technical audit should be done specifically for the blog.
While this topic is not specific to technical SEO, it has the potential to be very impactful, and of the six articles in this series, this is the most relevant place to include it.
Let me preface this section with the following excerpt from Search Engine Journal, which summarizes John Mueller, one of Google’s chief search evangelists:
“In Mueller’s response, he stated that the number of slashes in a URL does not matter.
What does matter is how many clicks it takes to get to a page from the home page.
If it takes one click to get to a page from the home page, then Google would consider the page more important. Therefore it would be given more weight in search results.
On the other hand, Google would see a page as being less important if it takes several clicks to visit after landing on the site’s home page.”
This quote is taken from the following page:
Optimizing your internal link structure is predicated on ensuring that your most important content is as close to your home page as possible in terms of the number of clicks it takes to find it. This requires properly leveraging links in your navigation menu, in your content, in your footer, in breadcrumbs, and anywhere else it makes sense for users, to create link connectivity.
Additionally, optimizing your internal link structure requires implementing descriptive anchor text that contains some relevant keyword specificity. Anchor text is still used by Google as a ranking factor:
It’s important to use keyword-focused anchor text, but only in a way that is relevant within the context of the page, and you should vary the way you link to any one page in your content links. It’s also important to recognize that Google understands semantic relationships better than it ever has before, so it isn’t necessary to use the same keyword for every link that points to the same page. Making things as relevant as possible for the end user is the best recommendation I can give; that includes optimizing the text that surrounds your internal links as well.
You should also ensure that every page of your site has at least one internal link that points to it. This is referred to as eliminating orphan pages.
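The click-depth idea from Mueller’s comments and the orphan-page check can be combined: given a map of your internal links, a breadth-first walk from the home page yields each page’s click depth, and any page that is never reached is an orphan. A minimal sketch with a hypothetical link graph (a real audit would build this graph from a crawl):

```python
from collections import deque

# Hypothetical internal link graph: page -> pages it links to.
links = {
    "/": ["/products", "/blog"],
    "/products": ["/products/widget"],
    "/blog": ["/blog/seo-2021"],
    "/blog/seo-2021": [],
    "/products/widget": [],
    "/old-landing-page": [],  # nothing links to this page
}

def click_depths(links, home="/"):
    """Breadth-first search from the home page; returns {page: depth}."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

depths = click_depths(links)
orphans = set(links) - set(depths)

print(depths["/products/widget"])  # 2 clicks from the home page
print(orphans)                     # {'/old-landing-page'}
```

Pages with a large depth are candidates for stronger internal linking; pages in the orphan set need at least one internal link pointing to them.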
As previously mentioned, most links should not carry a nofollow attribute, with the following exceptions, for which Google has expanded its vocabulary of link attributes:
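For illustration, Google’s expanded rel values can be applied like this (the URLs are placeholders):

```text
<!-- Paid or sponsored placement -->
<a href="https://example.com" rel="sponsored">Partner offer</a>

<!-- User-generated content, e.g. blog comments -->
<a href="https://example.com" rel="ugc">Commenter's site</a>

<!-- General "don't vouch for this link" hint -->
<a href="https://example.com" rel="nofollow">Untrusted link</a>
```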
More information can be found here:
Additionally, for a deeper dive into maximizing your internal link structure, I recommend this excellent article from Ahrefs:
In terms of 2021 planning, that will depend a lot on the opportunities that already exist on your site and what is feasible to change. We recommend an internal link audit by a qualified professional to identify all opportunities, then prioritizing them based on potential impact and ease of implementation. Again, many of these potential improvements will require IT support, so the sooner you get them in the queue, the sooner you are likely to see results.
I hope this first article in our 2021 B2B SEO Planning series has provided valuable food for thought on what to focus on and prioritize from a technical SEO standpoint going into the new year. SEO is always changing, and so are websites and their technical implementations. It’s important to set aside resources to continually monitor and resolve technical SEO issues as you plan and budget for the new year, especially in 2021, when the launch of Core Web Vitals as a ranking factor is liable to require significant resources.