Keyword Data and Best Practices for SEO According to Jon Gettle
Self-proclaimed "SEO geek" Jon Gettle, Principal Inbound Marketing Consultant of HubSpot's Services Team, has strong feelings about Google Search Console. He made that evident during the latest Twin Cities HubSpot User Group meeting on May 4, wrapping up his presentation on keyword data and best practices for SEO with these words:
"Search Console gives you a direct way to understand how Google interprets your site. Full stop."
Keyword Research & Rankings
Some of the most useful functionality, he explained, is the ability to analyze search queries page by page on your own website. The process of finding keywords that work (terms people are actually typing into Google), then positioning your website for success, is evolving, he added.
Using the Search Console tool lets you "dig into the data and start asking questions," said Gettle. Most businesses find that their top-ranking pages are the home page and the careers page. Is that really helping your business generate leads? he asked.
If you want to drive people to your site for specific information (the keywords and questions you want to rank for), then you need to draw them to a page on your site that addresses those concerns, he added.
By analyzing your website in Search Console, you can identify and adjust pages that are underperforming in the rankings. You can drill down into which search queries drive a page to "float your position" in the list.
However, he warned, you also need to evaluate the intent behind your keywords: organic results aligned with your content will produce quality traffic rather than simply more traffic.
He suggested that businesses monitor Search Console data on a quarterly basis because the tool only reports results for the previous 90 days. Gettle sets calendar reminders to capture data before it rolls off, so he can compare results over time.
Robots & Crawl Statistics
In addition to monitoring keywords, Gettle explained other aspects of Search Console to maximize how Google reads your site.
Users can view and test their site's robots.txt file, which implements the Robots Exclusion Protocol (REP). The file, said Gettle, is essentially a way to communicate with Google: it points the bots in the right direction and suggests spots on your site to ignore. Often, he added, the mistakes are clumsy little errors that can affect rankings.
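The kind of directives Gettle is describing look like the following, a minimal robots.txt sketch (the disallowed paths and sitemap URL are hypothetical examples, not recommendations from the talk):

```
# Applies to all crawlers
User-agent: *

# Suggest spots on the site for bots to ignore (hypothetical paths)
Disallow: /admin/
Disallow: /thank-you/

# Point crawlers to the sitemap
Sitemap: https://www.example.com/sitemap.xml
```

A stray character or a too-broad Disallow rule in a file like this is exactly the sort of "clumsy little error" that can quietly block pages from being indexed.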
Because Google penalizes poor user experience, it is crucial to clean up any broken links that trigger 404 errors, said Gettle. He suggested monitoring errors on both desktop and mobile in Search Console, as crawl statistics can be an early warning sign of a coming drop in rankings.
Fetch as Google & HTML Improvements
Another way to boost rankings is to view your website as both a user and as Google. Gettle recommended using the Fetch as Google tool to see exactly how Google reads your pages. It's a side-by-side comparison of your actual site and what the robots see behind the scenes.
"What we don't want," said Gettle, "is for Google to see one thing, and a visitor to see another."
In answer to the common question, "Why am I not ranking?", Gettle directs clients to view their pages' raw HTML.
"Google thinks big to small, top to bottom," he said.
Once your website is stripped of its visual components, does the remaining text align with your keyword strategy? Use keywords in a way that Google can easily read, he suggested.
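Gettle's "big to small, top to bottom" point maps naturally onto HTML's heading hierarchy. A minimal sketch of what a crawler sees once the visuals are stripped away (the page topic and headings here are hypothetical):

```html
<!-- Google reads top to bottom, big to small -->
<h1>Keyword Research for Small Businesses</h1>   <!-- one H1 carrying the primary keyword -->
<h2>Finding Keywords People Actually Search</h2> <!-- H2s covering supporting topics -->
<p>Body copy that uses the keyword naturally, not stuffed.</p>
<h2>Positioning Pages to Rank</h2>
<p>Further supporting content.</p>
```

If the text left over after this stripping doesn't reflect your keyword strategy, the visual design is doing work the crawler can't see.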
Rich Snippets & the Data Highlighter
Rich snippets are a fairly new feature of Google search results. You may have noticed them after typing a question into Google: the little boxes that highlight a popular answer or a local business are changing the way people receive information from Google.
According to Gettle, you had better pay attention to how the data on your website is structured, because "rich snippets are the future of search."
So how do you position your site as an attractive source of rich snippets? Gettle noted the Data Highlighter tool within Search Console allows you to send markup language directly to Google. Essentially, you can highlight important information about your business, such as name, address, hours of operation, an image, and even some reviews, and communicate the information directly to Google.
"You are training Google how to read your website," he said. "Which is more and more important as things are moving to rich experiences."
The Data Highlighter tool, however, is specific to Google. To make structured data available to all search engines, Gettle recommended following the markup vocabulary set forth by Schema.org.
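The same business details the Data Highlighter captures (name, address, hours, an image) can be expressed in Schema.org markup embedded directly in a page. A minimal JSON-LD sketch, with entirely hypothetical business details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "image": "https://www.example.com/storefront.jpg",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Minneapolis",
    "addressRegion": "MN"
  },
  "openingHours": "Mo-Fr 07:00-18:00"
}
</script>
```

Unlike Data Highlighter annotations, which live only in Google's systems, markup like this travels with the page itself, so any search engine that understands Schema.org can read it.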
Canonical Content & Inbound Links
We all strive for inbound links (hyperlinks back to our website) to achieve authority in search rankings. Gettle suggested striving to create what he refers to as Canonical Content to attract such links.
Canonical content is highly authoritative content on the web that is filled with a tremendous amount of useful information.
In Gettle's words, "Canonical content is independently valuable."
It's hard to create such content, he admitted, but the reward is worth the effort, even if the content is a little off-brand, because it lends authority to the rest of your site.
At a minimum, Gettle said, canonical content should run over 1,000 words and be highly organized and formatted, including a table of contents and anchor text.
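The table-of-contents-plus-anchors structure he describes is straightforward in HTML: a list of links at the top of the page pointing to `id` anchors on the headings below (section names here are hypothetical):

```html
<!-- Table of contents linking to anchors further down the same page -->
<ul>
  <li><a href="#research">Keyword Research</a></li>
  <li><a href="#rankings">Tracking Your Rankings</a></li>
</ul>

<h2 id="research">Keyword Research</h2>
<p>Long-form, organized content for this section...</p>

<h2 id="rankings">Tracking Your Rankings</h2>
<p>More in-depth content...</p>
```

Beyond helping readers navigate a long page, these anchors give other sites something specific to link to, which supports the inbound-link goal above.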
Key Takeaways from the Keyword Data & Best Practices for SEO Presentation
1. Search Console allows you to understand how Google interprets your site. Use it.
2. Extract actionable data to inform decisions. Set a calendar to pull data over time.
3. Rich Snippets are the future of search. Learn how to rank for them.
4. Authoritative canonical pages on your site garner attention. Set a goal to produce one.
Thank you to Jon Gettle for speaking at our latest Twin Cities HubSpot User Group meeting and to Media Junction® for hosting the event. Read more about upcoming Twin Cities HUG events on LinkedIn.