If you’re pursuing a PhD in SEO, stop reading now! These SEO basics are designed to help you grow your website traffic and learn more about your audience without going into all the technical mumbo jumbo.
The truth is that SEO is based on common sense, not complexity. Over the years, people have made SEO more complicated than it really is. We wrote this guide so you can apply the important SEO principles on your own.
“Life is really simple, but we insist on making it complicated.”
Learning what your visitors are searching for, how to optimize your web pages for keywords and measuring success will be an ongoing process. Instead of worrying if you are doing SEO “right”, keep experimenting and discovering what works for you.
Our unscary guide will take you through the SEO fundamentals in bite-sized sections. When you get the basics down, you’ll start to see results. Let’s do this!
SEO work usually starts with keyword research and competitor analysis. But before diving into how to conduct the keyword research, it’s important to understand the intent of the search query.
What does the person searching expect to see in the results?
By checking the current top 10 results of your keywords, you can identify the searcher’s intent. You can then create the right type of content that you’ll need if you want to rank in the top 10 results. What type of content is showing up on the results page?
You probably already have a couple of ideas about how people are searching for your product, service or quality content. Even if you have only 1 keyword or phrase, that’s a great way to start. Let’s use those initial ideas to generate even more keywords and phrases.
Think of a phrase your customers would use to find you, enter it into Google and you will start to see related keywords and phrases that might be relevant to you.
Let’s go through this process using the example phrase “how to take care of your dog” (informational intent query).
You can see below that once the phrase is typed in Google, we see a list of what people are searching for and even potential content ideas for our website.
You can get more ideas by scrolling down to the bottom of the search results.
When you are on the search results pages, there are several things you can observe that will help you plan out your web pages.
Take a look at the search results page below with these questions in mind:
1. What keywords are being used in meta titles?
2. What kind of content is ranking?
Visit a few of the top-ranking pages and pay attention to the following:
While visiting the top websites based on your query, briefly analyze the types of sites to get a better sense of the competition.
Check what kind of websites and how big they are:
If you see that the top 10 results for your query are coming from giants like Amazon, eBay, Apple and Best Buy, it may be a better idea to explore less competitive keywords and focus your efforts on building traffic with several less-popular long-tail KWs.
Here are 2 free tools to help you grasp the size of your competition:
1. SimilarWeb Chrome extension estimates a website’s traffic. If a competitor website receives very little traffic, no value will be shown. Keep in mind these are very rough estimates meant for comparing your competition, NOT for making calculations.
2. MozBar Chrome extension provides DA (domain authority) and PA (page authority) metrics, which attempt to mimic Google’s algorithms as closely as possible and assign a score to your website or page. The higher the score, the more authoritative the website is considered. Websites like Wikipedia, Facebook, YouTube and Amazon will have scores close to 100, which makes competing with them difficult.
If you notice some websites with a score of 0-30, that’s a good indication that you have a higher chance to rank among them. Please use this to get a general idea of your competition, but NOT to make calculations. This is not a ranking factor.
Finally, if the Google keyword suggestions and the search results don’t provide you with enough ideas, you can use this free tool Keyword Sheeter, which generates more related ideas based on your keyword. See below.
Learn more about what people are searching for by finding forums, communities and online discussions related to your niche on sites like Reddit, Facebook groups, LinkedIn, Quora and many more. Check the most common questions and popular discussions. You’ll get more ideas for new content.
Once you have a set of keyword ideas, the next step is to find out their search volume, which will help you to choose keywords that have the potential of driving traffic to your pages. Google’s Keyword Planner can help you find traffic numbers.
Note: if you are not running ads on Google, it will show rough estimates.
Another free tool is the Keyword Surfer Chrome extension. It doesn’t have an option to bulk-check keywords, but it displays search volume directly in Google’s search results.
Make sure you are grouping similar keywords. Otherwise, you’ll end up with 1 list containing hundreds of keywords with different intentions and content needs. You want to keep your content (web pages) focused on 1 specific set of keywords.
Free: Google Trends helps you analyze the popularity of top search queries in Google across various regions and languages.
Free: Answer The Public generates search insights based on your inserted search query.
Paid: Ahrefs is one of the best all-around SEO tools, especially for keyword and competitor analysis.
✅ Understanding the intent of keywords you want to target
✅ Identifying a set of keywords (based on intent and search volumes)
✅ Learning how your competition uses KWs
✅ Determining what types of content you need
When you are creating your web pages, you’ll see an option for adding a meta title and description. This is the text that will be visible in the search results.
The meta title (1) is an important field as it impacts how Google ranks your website and how potential visitors decide whether or not to click on your search result. As a best practice, keep the meta title under 65 characters so it won’t get cut off in the search results. You can use a meta description checker to see how long your text is and how it'll look on Google.
Also, always include your main keyword as close to the beginning of the title as possible. The meta title should describe the main purpose of your page.
The meta description (2) doesn’t impact how Google ranks your website but it does influence your visitor’s decision to click or skip your search result.
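In a page’s HTML, both fields live in the head. Here’s a minimal sketch using our dog-care example (the title and description text are only illustrations):

```html
<head>
  <!-- Meta title (1): the clickable headline in search results; keep it under 65 characters -->
  <title>How to Take Care of a Dog: A Beginner's Guide</title>

  <!-- Meta description (2): the snippet shown under the title -->
  <meta name="description" content="Learn the basics of dog care, from feeding and grooming to vaccinations, in one simple guide.">
</head>
```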
It’s important to create a clear hierarchy in your text using different-sized headers. Always include the most important keywords at the beginning.
Here is an example of a header structure based on our previous example, “How to take care of a dog”:
This is just an example, but the point is to show you that every page should be organized in a logical way with the main points as headers followed by support points for each main topic.
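In HTML, that hierarchy maps directly to heading tags. A possible sketch for the dog-care article (the exact headings are only an illustration):

```html
<h1>How to Take Care of a Dog</h1>                 <!-- one h1 per page, main keyword first -->
  <h2>Feeding your dog</h2>                        <!-- main point -->
    <h3>How often should you feed a puppy?</h3>    <!-- support point -->
  <h2>Grooming your dog</h2>
  <h2>Vaccinating your dog</h2>
```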
Keeping your searcher’s true intent in mind, use your main KWs (in our example, keywords like dog care, dog care tips, take care of dogs) and supporting keywords (feeding dog, grooming dog, vaccinating dog and similar) throughout the text of your page.
That said, do not overload your pages with keywords. Keep the text natural and useful. Stuffing keywords doesn’t work and can even have a harmful effect.
Make sure your formatting remains consistent throughout the text. If your page is long with a lot of content, consider adding a table of contents with anchor links at the beginning of the page. For better scannability, use bullet points or number lists where appropriate, and include images.
When you add images, don’t forget to add alt text. The image file name and alt text should be descriptive. Use appropriate keywords based on your keyword research. It’s more valuable to keep it descriptive than to stuff KWs.
Image alt text not only helps search engines better understand the content on your page, but also helps vision-impaired people who may be browsing your page with a screen reader.
To help give you an idea of what works best, let’s write a poor, average and good alt text for the below image of a super cute puppy.
- Poor alt text: dog
- Average alt text: a dog with a bone
- Good alt text: a pug puppy carrying a bone treat
You can name your image files based on the same logic as alt texts.
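In HTML, the descriptive file name and the good alt text from the example above would look like this (the file name is hypothetical):

```html
<img src="pug-puppy-carrying-bone.jpg" alt="a pug puppy carrying a bone treat">
```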
When writing your URL text (slug), keep it short, descriptive and include the main keyword. Also, avoid using symbols and non-English letters, see below:
Avoid URLs like:
A good URL (using our dog example): https://yourdomain12345.com/blog/how-to-take-care-of-a-dog
Updating your content from time to time can have a big impact on how you rank.
Consider these 3 insights for content freshness:
1. Writing new content takes time, so check your old content first. You probably already have something that once performed well. By updating older content with relevant information, you have a higher chance of getting a well-performing piece again.
2. If you have content that ranks at the start of the 2nd search results page, a slight update to your meta title and a few additional paragraphs of relevant information might give you the boost to jump from the 2nd to the 1st page. That can mean a huge increase in traffic!
3. Some content by nature must remain fresh to rank. Let’s take our example of taking care of a dog. While feeding, grooming and caring for your dog remain relatively the same over time, things like “best dog collars” or “top Christmas gifts for your dog” will change because of trends, innovation, fashion and other factors. Also, notice that the intention of these queries is different as well.
Let’s analyze a search query that doesn’t need very fresh content to rank successfully: ‘taking care of dogs teeth’. The 1st result is a video from 2009:
There are a few articles that date back to 2015 and 2016, and a few more recent ones from 2018 and 2019. Then you’ll see videos from 2009 and the most recent one is from the beginning of 2019.
The vast gap between publishing dates is a good indication that content freshness is not as important for this topic, probably because there isn’t much innovation in taking care of your dog’s teeth.
However, if we search for something like “best dog collars”, the content publishing dates change by a large margin.
Out of the top 5 results, the oldest one is from October 2019 and the others are all from 2020.
The search term “best dog collars” has a commercial investigation intention (researching to buy a product). There could be more innovation for this query requiring more frequent updates.
For example, a new player enters the market with environmentally friendly collars that become a huge hit, or someone launches a technical collar with Bluetooth tracking that becomes popular.
Duplicate content is content that appears on more than 1 website. While quoting someone on your website is perfectly fine, if 80-90% of your content is copied from somewhere else on the web, you will likely have a problem.
Duplicate content can have a negative impact on your rankings. All of the pages you want to rank should have unique content (including meta titles and meta descriptions).
A few common reasons duplicate content appears on your website:
You won’t get a penalty for duplicate content, but duplicate pages within your own website will compete with each other for the same rankings. And if the content is already indexed and ranking on another website, your copy will have little chance of outranking the original.
An easy and quick way to find out if you have duplicate content (or someone copied content from you) is to use Google search. Use some unique text within your page, copy it and insert it “between quotation marks” in the search box.
What results are popping out in Google? If there’s only 1 page with an exact match, you are good. If there are more pages with the same content, investigate further.
✅ Writing meta titles and descriptions
✅ Organizing content with headers
✅ Writing the main content of the page
✅ Optimizing images with alt text
✅ Inserting descriptive URL structure
✅ Keeping content fresh
✅ Removing duplicate content
This is the more technical part of the SEO guide. Don’t panic, we’ll make it easy to digest!
Technical SEO is important because here’s the truth: it doesn’t matter how amazing your content is. If Google can’t see it, it won’t rank.
Let’s make sure your content is ranking.
When talking about website accessibility, you’ll hear a lot about crawling (can also be called a web crawler, spider or spiderbot).
It’s a process search engines use to find your content and pass it on for indexing. Simply put, a crawler follows all the links it finds and then analyzes the content on those pages (similar to how you would browse the internet).
There are some limitations that we will talk about below.
The robots.txt file is usually uploaded to your website’s root folder (if you have one, it can be found by entering https://yourdomain12345.com/robots.txt in your browser’s address bar) and is used to instruct crawlers not to crawl your website or certain parts of it.
The first thing a crawler checks when it comes to your website is the robots.txt file, which it uses to decide its crawling path. However, your pages can still be indexed if the crawler discovers them through a different website that links back to you (we will cover this in the next section).
If a person isn’t going to search for a specific page on your website using Google, it’s probably not worth it for search engines to crawl it.
For example: your website’s admin pages, a thank-you page shown after a successful sign-up, or the thousands of unique pages generated by your website’s internal search. These serve no purpose being indexed on Google.
Sometimes mistakes happen and important pages get blocked. Especially when migrating to a new domain, CMS or when you make big changes to your website’s structure. Make sure your robots.txt file is in order.
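A minimal robots.txt could look like the sketch below; the blocked paths are just examples and should match your own site’s structure:

```txt
# Applies to all crawlers
User-agent: *
# Block pages that have no value in search results
Disallow: /admin/
Disallow: /thank-you/

# Also point crawlers to your sitemap
Sitemap: https://yourdomain12345.com/sitemap.xml
```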
Noindex and nofollow tags are small snippets of code that are inserted in your website’s pages and can look like this if you inspect the code:
<meta name="robots" content="noindex, nofollow">
A Noindex tag is used when you want crawlers to crawl your pages but not index them.
A good example is when you have a couple of filter combinations for your products or services and they create hundreds or thousands of unique pages. They may not have any use being indexed in the search engines but they link to other important pages within your website, therefore it’s good practice to allow Google to crawl them but not index them.
Nofollow tags are used when you don’t want crawlers to “click” (visit) the page you are linking to. Examples include paid links, comments in forums or under your blog posts, and other user-generated content. The reasoning is that those links may be low value or even harmful, and Google may not like that.
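In the page’s code, nofollow is applied to an individual link with a rel attribute. For example, a user-submitted link in a comment (the URL is illustrative):

```html
<!-- Crawlers are asked not to follow this link or pass value to it -->
<a href="https://example.com/some-page" rel="nofollow">a link left in a comment</a>
```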
Noindex/nofollow tags checklist:
If you’re migrating to a new domain, launching a new website, making big structural changes or have a large website with a lot of new content, a dynamic sitemap is a very good practice. It helps crawlers to better understand the structure of your website and find new pages more quickly.
You can also have separate sitemaps dedicated to images or videos if that is a big part of your website.
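A sitemap is a plain XML file listing your URLs. Here’s a minimal sketch with two pages (URLs and dates are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain12345.com/</loc>
    <lastmod>2021-02-01</lastmod>
  </url>
  <url>
    <loc>https://yourdomain12345.com/blog/how-to-take-care-of-a-dog</loc>
    <lastmod>2021-01-15</lastmod>
  </url>
</urlset>
```

A dynamic sitemap simply means this file is regenerated automatically as you add or remove pages, so crawlers always see the current list.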
Your website’s architecture and internal links play an important role in helping search engines determine the importance of your pages and they can also have an impact on your user experience.
Every page within a small to medium-sized website should be accessible within a maximum of 4 clicks.
The “closer” your page is to the homepage, the more important it gets in the eyes of search engines. The same logic applies to the visibility of your links.
For example, links on your website’s header will have more importance compared to the links in the footer. Links under your blog posts comments will have even less importance.
The structure of a small to medium-sized website could look similar to this:
If you have a deep blog, add categories to help organize by topics.
Make sure all your important pages have links pointing back to them so they are discoverable by search engines. And don’t forget to link to your older but relevant articles when writing new content. The more relevant links an article receives, the better chance it will have to rank higher.
Redirects are most commonly used to guide crawlers and visitors within your website by automatically sending them from 1 page to another. Consider using redirects when changing your domain or removing pages from your website.
Over time, your website and its pages collect SEO value. They start ranking for keywords, receive backlinks from other sources on the internet and start generating traffic.
When you’re changing domains or removing pages, it’s really important to save the value you accumulated over time and have redirects in place.
Imagine if you were to change the physical location of your retail store and not tell anyone the new address. You would never do that! It’s the same with your website. Don’t forget to redirect your pages.
The two main redirects to focus on are HTTP status code 301 (permanent redirect), used to indicate that a page has moved or been removed for good, and HTTP status code 302 (temporary redirect), used to indicate that a page is inaccessible for a short period (a few days).
Use cases for 301 redirect:
If redirecting to the same page is not possible, redirect it to the most similar category. For example, if you decide to remove your article about dog care, consider redirecting it to your “Dogs” category under “Animal care” in the provided example above.
302 redirect should only be used for a short period of time when a page is unavailable (a few days) and you plan on bringing that page online again.
Note: If a page is very low quality and provides no value to visitors or search engines, it's perfectly fine to delete the page and return HTTP status code 404 (Page Not Found). No redirect is necessary in such a case.
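How you set up a redirect depends on your server or CMS. If your site happens to run on an Apache server, for example, both redirect types can be added to the .htaccess file like this (paths and URLs are hypothetical):

```apache
# 301: the old article has permanently moved to a new URL
Redirect 301 /blog/old-dog-article https://yourdomain12345.com/blog/how-to-take-care-of-a-dog

# 302: the page is temporarily unavailable for a few days
Redirect 302 /shop/dog-collars https://yourdomain12345.com/shop/coming-soon
```

Most website builders and CMSs offer the same functionality through a settings page, no code required.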
Make sure your website is secure. One of the easy and free ways to secure your site is to move from HTTP to HTTPS. Especially if your website has sensitive customer data, such as credit card details, addresses, emails or similar data.
Most hosting companies offer basic SSL certificates for free when you have a hosting package with them.
Website load speed is a very important ranking factor. Make sure your website loads as fast as possible.
The most common issues that may cause slower loading speeds:
Mobile device usage increases every year. There’s no question how important it is that your website is mobile-friendly (responsive). If you are using quality website builders (like MailerLite), your website is likely already optimized for mobile devices. It’s still recommended to check your site on all devices, just to be sure.
You can use this Google tool to find out whether your website is mobile friendly and get suggestions on how it could be improved.
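One common culprit behind a “not mobile-friendly” result is a missing viewport tag; responsive pages typically include this line in the head:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```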
Google sums up what is structured data very well: “Structured data is a standardized format for providing information about a page and classifying the page content; for example, on a recipe page, what are the ingredients, the cooking time and temperature, the calories, and so on.”
Schema.org is what defines how each element on the page (ingredients, cooking time, temperature and so on) should be marked in your code, in a way that’s understandable for different search engines.
Structured data is not a ranking factor, but it’s a perfect way to make your search results stand out. A few examples of what schema implementation can look like in search results:
If you are running an ecommerce store, star rating for your product pages:
If you are running a recipe website, star ratings, preparation time and pictures for your recipes:
If you are running an event website, listings of the events:
There are many more options; you can read about them and how to implement them in the official Google structured data documentation.
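Structured data is usually added as a JSON-LD script in the page’s head. Here’s a minimal sketch for a recipe page using Schema.org’s Recipe type (all values are illustrative):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Homemade Dog Treats",
  "image": "https://yourdomain12345.com/images/dog-treats.jpg",
  "prepTime": "PT15M",
  "cookTime": "PT25M",
  "recipeIngredient": ["2 cups oat flour", "1 cup pumpkin puree", "1 egg"]
}
</script>
```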
✅ Robots.txt file
✅ Noindex and nofollow tags
✅ Website architecture and interlinking
✅ SSL certificate
✅ Website load speed
✅ Mobile-friendly website
✅ Structured data
Links from other websites that link back to your web pages are an important ranking factor for search engines. However, it’s important to focus on quality over quantity.
Having hundreds of links from low-quality (spam) websites that generate no traffic will give you no benefit. In some cases, that approach can hurt your ranking.
It’s much better to invest your time getting a few links from websites in a similar niche that are considered authorities and can actually drive traffic back to your website.
Some of the most popular link-building tactics:
If possible, always try to earn links to a specific page rather than the homepage. Having a few high-quality backlinks to your “How to take care of a dog” article will give it a much better chance to rank in the search results.
Measuring SEO success is important to ensure you are moving in the right direction. Effective search engine optimization requires testing different things, seeing what works and what doesn't. If you don’t track your metrics, you won’t know if your tactics are working.
Here are a few metrics you should be tracking.
These can be tracked with 2 free tools.
1. Google Search Console
You can check the following data:
Filter the data by:
2. Google Analytics
Navigate to Acquisition → All Traffic → Channels → Organic Search → Landing Page.
Google Analytics reporting is advanced. You can use a lot of different types of data and filter different variables.
To keep it simple, you can start by monitoring the following:
If you want to filter out and check a group of pages (for example all pages that fall under /blog category) or a specific page (for example an article), you can click on Advanced, select Containing next to Landing Page and insert part of the URL you want to filter out.
Based on your keyword analysis, you will select the most important keywords you want to rank. Track the progress of these KWs and how the position changes over time.
Google Search Console’s average position report is free to check, but it’s not very accurate.
If you are starting small, you can check the Google results manually and write down the results.
If you want a more convenient and precise solution where you set up keywords once, you’ll need to use a paid solution like Ahrefs or SEMrush. They’ll update your KWs automatically based on your selected frequency and you only need to come back to check the report.
You can use Google Analytics to track conversions and sales and filter that by organic traffic. More details for setting up goals can be found here and details for setting up ecommerce tracking can be found here.
In short, you can choose to track many different conversions based on your website’s and specific page’s purpose.
Ideas for conversions to track:
Keyword ideas and analysis
On-page technical SEO tools
I'm Paulius, SEO Manager at MailerLite. I've helped many businesses with their SEO, from small e-shops to large news portals. Unlike SEOs constant changes, my music taste hasn't changed since 7th grade. Which makes me wonder, has my taste been that good since then or has it always been bad? (Spoiler: it's old school hip hop).