A cutting-edge website without visitors is like storing a Picasso in a forgotten basement. Don’t be like this Indiana museum. Instead, ensure your site is swarming with qualified traffic. Your impressive development work deserves it.
The goal of SEO is to drive qualified traffic — potential customers — to your website. SEO helps accomplish this by optimizing websites for robots — and for human users. So, think of this guide as a way to improve and optimize your site, rather than a laundry list of fixes. Some of the items detailed in this guide are must-haves and a few are nice-to-haves.
SEO is an ever-evolving field, but Digital Third Coast has more than a decade of experience helping businesses drive traffic and attract customers with strategic SEO. We’ve worked closely with developers and know how to speak your language. We hope this guide helps familiarize you with the language of SEO. We’ve curated it with you, the developer, in mind.
Humans and robots (Google) see websites completely differently. But building your site with HTML best practices allows the robots to properly see what is intended for humans. Unlike scripting languages such as JavaScript, HTML is reliably read and understood by web crawlers. It’s through HTML that Google understands what the site is trying to convey to users. In this section, let’s look at important HTML elements that help drive SEO results.
A Title Tag declares the title of an individual web page in a short, concise manner. Title Tags are visible in search results and are critical for user experience.
It is recommended that each page has its own unique title tag. In rare instances where pages must share the same title tag, include a noindex, follow tag to prevent any SEO issues stemming from the duplication. Depending on the situation, a canonical tag may be more appropriate.
Example:
<head>
<title>Webpage Title | Company Name</title>
</head>
Best Practices:
– 55 to 60 characters
– Include the keyword near the beginning of the title, but only if it sounds natural
– Don’t keyword stuff (e.g., don’t include multiple variations of your keyword in the title tag solely to try to rank for those keywords)
– Include your brand at the end, separated by a pipe bar (|)
H1 tags alert search engines to the most important words on the page. Each landing page should have only one H1 element. The H1 should include a relevant keyword, as it’s one of the key areas of “SEO real estate.”
Keep in mind that every user — as well as the robot — sees the H1 on your page, regardless of how they arrive at your website. For this reason, we recommend writing the H1 for users first and only including a keyword if it sounds natural and relevant. Some sites have pages with duplicate H1 elements. In these cases, either the H1s should be rewritten or the pages should include a noindex, follow tag.
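Example (a minimal sketch; the heading text below is a placeholder):
<body>
<!-- One H1 per page, written for users first, with a keyword only where it reads naturally -->
<h1>Handmade Leather Boots for Women</h1>
</body>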
Meta descriptions, while not a direct ranking factor for SEO, help increase click-through rates and improve the overall user experience for people searching for your site. They provide context to the search results and can be the difference between earning a click or not.
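Meta descriptions live in the page’s <head>. Example (the copy below is placeholder text, kept to roughly 150 to 160 characters):
<head>
<meta name="description" content="Shop handmade leather boots with free shipping and easy returns. Find the right pair for every season.">
</head>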
Images are crucial to any website design. But there’s just one problem: Web crawlers can’t decipher those images. Enter proper HTML structure and Alt Tags.
Example text link
<a href="https://example.com/webpage.html">Keyword in Anchor Text</a>
Example nofollow link
<a href="https://example.com/webpage.html" rel="nofollow">Keyword in Anchor Text</a>
Example image link
<a href="https://example.com/webpage.html"><img src="/img/keyword.jpg" alt="keyword" width="100" height="100"></a>
While JavaScript is a popular way to build navigation and menu items, it is not always read by robots and web crawlers. Instead, be sure to build your navigation items with HTML first.
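A crawler-friendly menu can start as plain HTML links, with JavaScript layered on top for dropdowns and animation. A minimal sketch with placeholder URLs:
<nav>
<ul>
<li><a href="/services/">Services</a></li>
<li><a href="/blog/">Blog</a></li>
<li><a href="/contact/">Contact</a></li>
</ul>
</nav>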
There’s a lot to a URL and most users take them for granted. But a developer shouldn’t.
Well-structured URLs not only keep sites and subpages well organized, but they also help with SEO by clearly explaining to robots how the site is organized. A well-structured site should follow a breadcrumb method: every click a user makes should be represented in the URL structure. A few tips:
Make your URLs readable by people
Good: http://www.digitalthirdcoast.net/blog/marketing-math
Bad: http://www.digitalthirdcoast.net/blog/id4001_2518
Use a keyword or two but don’t look spammy
Good: http://www.digitalthirdcoast.net/blog/seo-universities
Bad: http://www.digitalthirdcoast.net/blog/seo-universities-colleges
Limit the amount of folders
Good: http://www.digitalthirdcoast.net/services/seo-local-restaurants
Bad: http://www.digitalthirdcoast.net/services/seo/local/restaurants
Use dashes as word separators
Good: http://www.digitalthirdcoast.net/services/seo-local-restaurants
Bad: http://www.digitalthirdcoast.net/services/seo%20local%20restaurants
There are seven HTTP status codes every good web developer should keep in their back pocket for SEO purposes, especially when migrating an old site to a new site. Tools like Screaming Frog make finding and identifying issues easy after a launch.
200: A green light, all systems go.
301: When migrating a site, it’s important to not just transfer content. SEO link equity should be migrated as well. The 301 redirect is a permanent redirect, which preserves most, if not all, link equity from the old site/page to the new site/page.
302: This is a temporary redirect from the old site/page to the new site/page. It’s used while the page/post from the old site is not yet populated on the new site. Once the page/post is published on the new site, a permanent 301 redirect should be used.
404: Uh oh, a 404 error means the page isn’t found. You have a dead or missing page. Be sure to create a custom 404 page and use it to change a bad user experience into a memorable one for your brand.
410: This code means finality. Something is permanently removed from a site.
500: A 500 means the server hit an internal error. There’s nothing wrong with the page itself; the server failed to fulfill the request.
503: A 503 means the server is temporarily unable to handle the request, usually because it’s overloaded or down for maintenance.
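How you implement these redirects depends on your server. As one example, on an Apache server a permanent redirect can be added to the site’s .htaccess file (the paths below are placeholders):
Redirect 301 /old-page https://www.example.com/new-page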
Developing the site is just the first stage. Maintaining and overseeing the site is the task of a dedicated webmaster. The following tools give the webmaster control over how the site is indexed and how visible it is in search. Be sure to submit your XML sitemap to each of them (a minimal sitemap example follows the list).
Google Search Console
Bing Webmaster Tools
Yandex Webmaster Tools
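The sitemap itself is an XML file, typically served at https://yoursite.com/sitemap.xml. A minimal sketch with placeholder URL and date:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>https://www.example.com/</loc>
<lastmod>2019-01-15</lastmod>
</url>
</urlset>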
Master this crucial SEO process and you might just be named a saint.
Canonical tags solve a common issue for search engines: often, the same web content can live at more than one URL. Canonical tags sort this out for robots, so search engines know the preferred URL to display in search results.
Without canonical tags, robots become overworked and confused. Similar pages and content start fighting each other for superior rank, harming your site’s overall search visibility. A common issue is with eCommerce pages. Take a look:
URLs in need of canonicalization
https://www.homepage.com/shoes/women/
https://www.homepage.com/shoes/women/page2
Solution
Add the following to the <head> of https://www.homepage.com/shoes/women/page2:
<link href="https://www.homepage.com/shoes/women/" rel="canonical" />
This file (found at https://yoursite.com/robots.txt) declares which parts of the site web crawlers may and may not access, which in turn controls what Google and other search engines can index.
Developers sometimes use robots.txt to block all pages of a site from being crawled during development. This is an encouraged practice, but it is imperative to edit the file and allow crawling once the site goes live. Here are a few examples of robots.txt in action for a www.example.com site:
Robots.txt file URL: www.example.com/robots.txt
Blocking all web crawlers from all content
User-agent: *
Disallow: /
Using this syntax in a robots.txt file would tell all web crawlers not to crawl any pages on www.example.com, including the homepage.
Allowing all web crawlers access to all content
User-agent: *
Disallow:
Using this syntax in a robots.txt file tells web crawlers to crawl all pages on www.example.com, including the homepage.
Some webpages on a site shouldn’t be indexed or included in search results. Examples include privacy policy, legal, and sign-in pages, as well as user pages. These exclusions should be added to the robots.txt file.
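For example, to keep all crawlers out of those sections (the paths below are placeholders; match them to your own site’s structure):
User-agent: *
Disallow: /privacy-policy/
Disallow: /legal/
Disallow: /sign-in/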
Noindex for user-generated pages
A user page is the result of that user’s own on-site action. For example, let’s say your site features a News & Insights page. On this page, users can sort by author, topic and date. When the user selects a certain author or time frame, the site generates a new page of results. We don’t want these user-generated pages in search results, so we attach a noindex tag; the pages won’t be indexed and therefore won’t end up in search engine results.
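The noindex tag is a meta element placed in the <head> of each generated page. Example:
<head>
<meta name="robots" content="noindex, follow">
</head>
The noindex value keeps the page out of search results, while follow still lets crawlers pass link equity through the page’s links.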
Does that can’t-miss blog post run onto multiple pages? Don’t fret. With some simple markup, you can prevent any duplicate content that pagination may cause. With this markup, search engines correctly and intelligently understand the relationship between pages and ensure that the first page is the one that’s found in the SERPs.
First page
https://homepage.com/post1
<link rel="next" href="https://homepage.com/post?pg=2">
Second page
https://homepage.com/post?pg=2
<link rel="prev" href="https://homepage.com/post1">
<link rel="next" href="https://homepage.com/post?pg=3">
Last page
https://homepage.com/post?pg=3
<link rel="prev" href="https://homepage.com/post?pg=2">
Google uses both desktop and mobile site speed as a ranking factor. You can test your site speed and find recommendations for improvement with Google’s PageSpeed Insights tool.
Common causes of page bloat that leads to slower load speeds include:
– Redirects
– Non-compressed image files
– Slow server response times
– Render-blocking JavaScript (see the snippet after this list)
– Blocking browser caching
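As a quick fix for render-blocking scripts, the defer attribute tells the browser to download a script in parallel and execute it only after the HTML has been parsed. Example with placeholder script paths:
<script defer src="/js/app.js"></script>
<script async src="/js/analytics.js"></script>
(async also downloads in parallel but executes as soon as the script arrives, so execution order isn’t guaranteed.)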
Long gone are the days when a mobile site was a nice-to-have. Today the mobile site is more important than the desktop site for indexing and rankings: with Google’s mobile-first indexing, the mobile version of your site is what Google uses to determine rankings. As a developer, you’ll need to focus on the mobile experience and responsiveness more than ever when optimizing for SEO. Be sure to review Google’s blog post detailing mobile-first indexing.
Social media dominates our personal lives and the news media. To that end, it’s smart to optimize your content and site for social sharing. Be sure to understand the different data needed for each platform so that your content has the biggest impact when shared. Ways to optimize each social platform include:
– Installing Twitter cards (example below)
– Optimizing your share on LinkedIn
– Implementing rich pins on Pinterest
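As an example from the list above, Twitter cards are declared with meta tags in the page’s <head>. A minimal summary-card sketch with placeholder content:
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:title" content="SEO Guide for Developers">
<meta name="twitter:description" content="A developer-focused walkthrough of on-page SEO.">
<meta name="twitter:image" content="https://www.example.com/img/share-card.jpg">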
In search results, Google and other search engines display snippets of text that describe your website or service to potential customers. Rich snippets go the extra step and can share user reviews, ratings and additional information like business hours.
Providing robust rich snippets and structured data can greatly improve click-through rates, driving even more qualified traffic to your site.
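Structured data is most commonly added as JSON-LD in the page’s <head>, using schema.org vocabulary. A minimal sketch for a local business (all values below are placeholders):
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Shoe Store",
  "openingHours": "Mo-Fr 09:00-18:00",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "127"
  }
}
</script>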
Email Lyndsey. Find out how we can help.
Or call us at (773) 904-2700
Lyndsey will get in touch with you as soon as possible.