Naturally, we want our sites to have a lot of traffic to start generating revenue. With high traffic, online stores can increase sales, and information sites can bring their content to a much bigger audience, all the while making money from advertising. The traffic can come from many sources, such as Google Ads, but usually, we want it to originate from Google search or another search engine.
The question is: how do you make sure that the site attracts enough traffic? The process of promoting websites in search engines is complicated and requires the involvement of an SEO specialist or even an entire team. That is why, a few months ago, our company launched a new service: SEO website promotion. In this article, I will go through the main stages essential to making a real difference.
What is SEO?
SEO (Search Engine Optimization) is a set of measures to enhance a website and increase its ranking in the search engines for targeted queries, thus bringing more traffic. The fundamental goal of SEO is to make it easier and faster for users to find the necessary information. And this info should be as relevant as possible to what users look for.
The goal of an SEO expert is to improve a website according to the requirements of the search engines. This makes it easier for the target audience to find your site, service, product, or content and then take the required action. This required action can be anything — buying, ordering, subscribing, becoming a regular customer, or a reader, etc.
To better understand what this is about, let’s look at a search engine results page. It has two sections: paid and organic. The task of an SEO expert is to help your site rank as high as possible, preferably at the very top of the organic results section, so that more users visit it.
Why is SEO important?
Even now, when Google’s ad units take up more and more space, users rely primarily on organic results. Only 19% of users click on ads while searching, and 65% of users choose one of the top 10 organic results.
This demonstrates just how vital search optimization of your site is if you wish to attract organic traffic.
Advantages and disadvantages of Search Engine Optimization
Advantages
1. Attracting quality traffic
By attracted traffic, we mean users who entered a relevant key query and reached our site through the search results page. Generally, these are the people ready to take the required action (to make a purchase, for example). This means that by improving a site’s visibility, SEO can increase business revenue.
2. Long-term effects
The optimization effects last for a long time (compared to contextual advertising, where a website disappears from the paid section as soon as you stop paying). If, after all the necessary improvements, a site’s pages rank high for specific key search queries, they will likely retain those positions unless the search engines update their rules and requirements or content of even higher quality is posted elsewhere.
3. Relationship of trust with clients
Users often distrust advertisements and prefer the sites at the top of the organic search results. Moreover, after internal optimization, sites become more user-friendly, which leaves a good impression on customers. As a result, users are more willing to return, make another purchase, and recommend the site to friends and acquaintances.
Disadvantages
1. Protracted execution and delayed achievement of results
Unlike contextual advertising, you should not expect quick results from SEO. The first noticeable effect can usually be seen only after 3–4 months.
2. No guarantees
No specialist can give a 100% guarantee in SEO. After all, at some point, the requirements of search engines may change and adversely affect even a well-optimized website. For example, over time it became necessary to focus on mobile website versions since the amount of traffic from smartphones increased, and the search engines have updated their algorithms to reflect that. However, qualified experts know how to adjust their promotion strategies to the ever-changing conditions.
3. The need for complex technical revisions
Sometimes, to achieve the goal, it’s necessary to make a lot of tweaks to the site and implement them in a timely manner. This can include changes to the code to improve loading speed or to the visual elements of the site to improve conversion rates.
How does SEO work?
To answer this question, you need to understand the objectives of search engines. Their main goal is to provide the users with the most appropriate answers. Accordingly, when performing SEO, we must ensure that the search engines consider your site’s content highly relevant.
Whenever you Google something, the search algorithms select the pages that correspond to your query as closely as possible. Then the same algorithms arrange those documents in the appropriate order, a process known as ranking. The pages that the search algorithms consider more authoritative, higher in quality, and more complete are ranked higher.
Okay, let’s summarize. To provide the users with the right information, the search engines analyze:
- The relevance of a page for the search query and whether it can satisfy it;
- The authority of a page, its quality, and completeness.
Granted, this is not all the information that search engine algorithms check; it is a simplified example to help you understand the process. There is no complete list of what affects the ranking of websites, and Google keeps this information secret. But SEO specialists constantly analyze different sites and track how the search results change, so they have practical knowledge of some ranking factors.
Moreover, Google has been actively checking sites for E-A-T (Expertise, Authoritativeness, and Trustworthiness) in recent years, and since the instructions are publicly available, this provides us with additional guidance and helps improve site content.
Visibility is an indicator of how often a website is displayed for a given search query, based on its ranking. Low search visibility means the site does not appear for business-relevant queries, the very queries that could attract new customers.
What is an SEO strategy?
To make website promotion successful, it is necessary to first develop a suitable SEO strategy.
The process is as follows:
- Determining the goal of website promotion. The first thing we need to do is understand the purpose of the site, and for what sort of queries we wish to promote it.
- General audit. This allows us to understand the current state of a site — for which queries it’s ranked already, whether there are pages for the target queries, its technical condition, and the quality of its links.
- Competitor analysis and niches. Based on the queries that are of interest to us, we can identify the main competitors operating on a similar business model and occupying a place at the top. We can then begin analyzing all this data. We check how they rank for related queries, with what other queries they attract users, in what ways their landing pages are better than ours, how many backlinks they have, whether their site meets E-A-T requirements, and so on.
- Strategy preparation. Based on the analysis of our own site and the sites of our competitors, we develop a promotion strategy. It includes long-term goals (what we need to do to rank the site higher and get more clients) as well as a monthly work plan necessary to achieve them.
An SEO strategy is a comprehensive, long-term action plan for achieving your business goals through Google search.
Now that we know all this, let’s break down what SEO efforts usually include and what terms you may hear from the specialists working on your website.
Keyword research
The first thing that kicks off the promotion of any site is keyword research. This is a vital stage, where we need to pick the queries in Google that should bring up our site. We will discuss this process in more detail in a separate article, but let me briefly run through the main points.
Site’s primary keywords
The first thing to highlight is the primary queries that will be used to promote a page. It’s possible to identify them through the Google Search Console (this method is effective if users already enter the site through the search results page) or through third-party tools, such as Keywords Explorer by Ahrefs.
There are different types of queries:
- Informational — these are the keywords for which the bulk of pages in SERP (Search Engine Results Pages) will be informative in nature, that is, articles mostly;
- Commercial — for which the majority of the pages in SERP will be commercial, such as product cards, online store listings, service pages, etc.
In other words, if we are promoting an online store, we will choose the main commercial queries, and if it’s a blog, then we’ll go for the informational ones.
Long-tail keywords and keyword variations
Once the main key queries have been identified, it’s time to move on to collecting their variations — looking into how else users can search for whatever interests them.
Let’s take a look at this example:
- “buy iphone” is the main query for a site engaged in the sale of iPhones;
- “iphone 12 pro max best buy” is a long-tail keyword;
- “apple phones for sale” is a keyword variation.
Keywords map and missing pages
Next, we should distribute the queries across the pages. We also need to identify pages that are missing from the site but needed for the target queries.
The principal outcomes of this stage are the following:
- Outlining the structure of the site for the relevant queries necessary to achieve business goals;
- Understanding the keywords to be used in the promotion of the existing pages;
- Determining queries for which new pages will have to be created.
As I mentioned earlier, this is a very broad topic, and I will write more about how to do Keyword Research in a separate article.
On-Page SEO
On-page SEO is the process of optimizing a website’s pages. Let’s look into its basic elements.
Title
The Title is one of the key elements. It tells the search engines and users what they will find on the page. The Title directly affects a site’s ranking, so it is necessary to add the primary key queries there. However, it is essential not to make it just a random set of words.
The basic rules of creating a Title are the following:
- It should be unique for each website page;
- It should describe the content of the page;
- It should contain the primary key queries.
Meta description
The Description meta tag is the description of a page that users see on the SERP. Unlike the Title, it affects ranking only indirectly.
The description should be attractive so that users who see it want to visit the site. In this way, the Description affects the clickability of the link.
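As a sketch of how such checks can be automated (the sample HTML and the keyword are invented for illustration, and the rules checked are only the basic ones listed above), the title and meta description can be extracted and audited with Python’s standard html.parser module:

```python
from html.parser import HTMLParser

class TitleDescriptionParser(HTMLParser):
    """Extracts <title> text and the content of <meta name="description">."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit_page(html, primary_keyword):
    """Returns a list of warnings based on the basic Title/Description rules."""
    parser = TitleDescriptionParser()
    parser.feed(html)
    warnings = []
    if not parser.title:
        warnings.append("missing title")
    elif primary_keyword.lower() not in parser.title.lower():
        warnings.append("title lacks primary keyword")
    if not parser.description:
        warnings.append("missing meta description")
    return warnings

html = """<html><head>
<title>Buy iPhone 12 Pro Max - Free Shipping</title>
<meta name="description" content="Best prices on Apple iPhones with next-day delivery.">
</head><body></body></html>"""
print(audit_page(html, "buy iphone"))  # → []
print(audit_page(html, "android"))     # → ['title lacks primary keyword']
```

A real audit would, of course, also check uniqueness across all pages of the site, which requires crawling more than one page.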
Headings
These are the headings displayed on the page itself. The main one is h1. The search algorithms also take them into consideration, so they must contain the key query for the page. In addition, users usually see this heading on the first screen, so it is important to make it attractive and consistent with the content of the page.
Rules for designing the headings:
- Add an h1 tag if the page doesn’t have one.
- Make sure the h1 occurs only once on the page, because it is the title of the entire document.
- Do not spam. The h1, h2, and subsequent headings should complement the page and help make its structure clear and understandable.
- Use the correct order: h2 first, then h3.
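A minimal sketch of how these heading rules can be checked automatically, using only Python’s standard library (the sample HTML fragments are invented):

```python
from html.parser import HTMLParser

class HeadingCollector(HTMLParser):
    """Collects h1-h6 heading levels in document order."""
    def __init__(self):
        super().__init__()
        self.headings = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.headings.append(int(tag[1]))

def check_headings(html):
    """Applies the rules above: exactly one h1, no heading level skipped."""
    collector = HeadingCollector()
    collector.feed(html)
    problems = []
    if collector.headings.count(1) != 1:
        problems.append("page must have exactly one h1")
    for prev, cur in zip(collector.headings, collector.headings[1:]):
        if cur > prev + 1:  # e.g. an h2 followed directly by an h4
            problems.append(f"h{cur} follows h{prev}: a level was skipped")
    return problems

good = "<h1>SEO guide</h1><h2>On-page</h2><h3>Titles</h3><h2>Off-page</h2>"
bad = "<h1>A</h1><h1>B</h1><h4>C</h4>"
print(check_headings(good))  # → []
print(check_headings(bad))
```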
URL structure
The URL of a page should be readable and discernible so that users understand where they are. For example:
- site.com/tkakf doesn’t provide any information about the content of a page;
- whereas https://www.halo-lab.com/services/branding makes it clear that the page contains information regarding branding.
We also recommend using a path in the URL structure if you have a large site with different types of pages, for example, site.com/blog/article-name for blog posts. From such a URL, we can understand not only that this is a blog page but also what it is about.
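As an illustration of building readable, path-structured URLs (the slug rules below are a simplified assumption, not a universal standard), here is a small Python sketch:

```python
import re

def slugify(title):
    """Turns an article title into a readable URL slug (simplified sketch)."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # runs of non-alphanumerics become one hyphen
    return slug.strip("-")

def build_url(domain, section, title):
    """Builds a path-structured URL like the examples above."""
    return f"https://{domain}/{section}/{slugify(title)}"

print(build_url("site.com", "blog", "What is SEO & How Does It Work?"))
# → https://site.com/blog/what-is-seo-how-does-it-work
```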
Content
When adding textual information to a page, especially if it’s a blog, you need to understand one thing: there is already more than one article on this subject published online. This is why your content needs to be unique and useful, adding some value. When preparing content, you need to consider how your article will be better than those already in the top 10.
This becomes trickier if your content relates to some “dry” topic like a code of laws, where you can hardly write anything new. Naturally, you can’t rewrite laws to make your content unique. In this case, you need to consider what useful things you can add to the page or website that your competitors don’t have. For example, it could be a handy infographic or a table with the main points to make it easier for users to understand.
Alt attributes
Alt is a text description of an image; the user sees this attribute if there is a problem loading the picture. In addition, Alts are useful for promotion in search engines, as they help bots better understand what is shown in the image. Alts also play an important role in attracting clients through image search.
An Alt should describe the picture, but you don’t need to stuff it with keywords; otherwise, it will look spammy.
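To illustrate, here is a minimal Python sketch that flags images without an alt attribute (the image file names are made up):

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collects the src of every <img> whose alt attribute is missing or empty."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "unknown"))

checker = AltChecker()
checker.feed('<img src="iphone.jpg" alt="iPhone 12 Pro Max in blue">'
             '<img src="banner.png">')
print(checker.missing)  # → ['banner.png']
```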
Schema & Markup
For search bots to better understand the structure of pages, you need to use structured data. Its type depends on the topic and the information present on a page. Structured data also helps shape rich result snippets, which usually attract users’ attention. You might have seen them on the Google search results page; if you’re interested, have a look at the related article.
Before implementing markup, we need to do a few things:
- Investigate which snippet will suit our search keywords;
- Prepare microdata for target pages.
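As an example of what such microdata can look like (all field values here are hypothetical), this sketch assembles an Article snippet in JSON-LD format with Python’s json module; on a real page it would be embedded in the HTML head:

```python
import json

# Hypothetical example values; real markup should describe your actual page.
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "What is SEO and how does it work?",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2021-06-01",
}

snippet = ('<script type="application/ld+json">'
           + json.dumps(article_schema)
           + "</script>")
print(snippet)
```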
Off-Page SEO
Now that we know the basics of On-Page SEO, let’s talk about the external optimization of websites, also known as Off-Page SEO. While this work is aimed at promoting a site, it has to be done elsewhere. These efforts involve posting reviews, guest posts, etc., on third-party sites. Link building is a significant and vital component of Off-Page SEO, so let’s discuss it briefly.
When determining the value of your website content, search bots particularly scrutinize its links. The more users share them, the better and more valuable they are to the search engines.
What is a backlink?
A backlink is a hyperlink from an external source to our site. The number and quality of links also impact the promotion of a site — they represent its weight, and through them, it is possible to shape a site’s reputation. When a site is interesting, enjoyable, and useful, users start sharing it. This is why large and popular sites usually have a lot of backlinks.
Link Quality Factors
Google understands that you can add links to your site yourself or through contractors. To Google, such links look unnatural, so it tries to identify them and nullify their impact on a site’s ranking. That is why you need to monitor the quality and the organic nature of those links. However, this is difficult to achieve without SEO tools.
We collect keywords, traffic, domains, and backlink numbers from https://ahrefs.com. Ahrefs uses public data and its algorithms to estimate SEO-related indicators. Since they do not have access to internal website data (Google Analytics, Google Search Console, etc.), the actual numbers may not be 100% accurate. However, it is enough to compare websites and estimate growth opportunities.
The service defines the following values:
- Domain rating — shows the strength of a target website’s backlink profile compared to other sites in the Ahrefs database on a 100-point scale;
- Domains — the total number of unique domains linking to our target site.
This allows for a quick, surface-level analysis and helps determine which sites the links come from. If a source has good indicators, we can say it is of high quality, so links from such a website will likely help the promotion.
If you do not have access to such a tool, you will have to check everything manually, and there is no guarantee that you will be able to determine all the necessary indicators.
Moreover, you would have to manually check the following parameters:
- Spamminess of a source — firstly, you need to understand how often the source website posts links. If it constantly sells backlinks, Google’s algorithms have likely identified it as being of low quality;
- Subject matter — if your site is about dogs and you get a link from an article about milling machines, that link will be deemed unnatural. Therefore, it is necessary to monitor the theme of the sources linking to our site.
Having checked all that, we will better understand the quality of links leading to a site.
Link building
Link building is the process of creating a link profile for a website. It must be tailored individually, and we develop a separate strategy for each site.
The purpose of these efforts is to get quality links leading to our target site, which will help improve its reputation and connections, as seen by Google.
There are many ways to build a link profile, but guest posts and crowd-marketing are usually used.
Guest posts — these are links from sites that are popular and have a certain authority on the subject matter. It makes sense to post a full-fledged article or a press release on such sites, with a mention of and a link to the website being optimized.
Crowd Marketing — these are links from forums and comments. This method is often used to diversify a site’s link profile to make it seem more natural.
We will discuss how to develop a link-building plan in the upcoming articles.
Technical SEO
By technical SEO, we mean optimization efforts that affect a site’s performance and help search bots easily detect and index web pages. Now let’s see which items should be considered first during a basic technical analysis.
HTTPS
The first thing we pay attention to when examining a site is its certificate.
HTTPS (Hypertext Transfer Protocol Secure) is an extension of the HTTP protocol that supports encryption using SSL and TLS cryptographic protocols.
Protecting users has always been a priority for Google, and since 2014, having HTTPS has been an additional ranking signal for websites. If your site still uses an unsecured protocol, you should consider replacing it.
Loading speed
Site loading speed is also an important signal for Google. If a site takes a long time to load, the user will likely leave it. You can check how your site loads with the PageSpeed Insights tool. This free Google tool will show you the main issues affecting your website’s loading speed.
Mobile-friendliness
This is the age of smartphones, and mobile users outnumber PC users in many niches. Google understands this as well, which is why, back in 2019, mobile-first indexing was launched. This means that when ranking a site, its mobile version takes precedence. Websites that Google switches to mobile-first indexing receive notifications in Google Search Console. By now, almost all information sites and online stores are affected, but there are still topics where searches are primarily conducted on PCs.
You can check your mobile version for issues through the PageSpeed Insights tool — it will find the basic errors.
Robots.txt, Meta NoIndex, & Meta NoFollow
It is also essential to configure the rules of website indexing. This is done so that the search bots don’t crawl and add unnecessary pages to the index, such as service pages or admin-panel documentation.
You can do that through the robots.txt file. Generally, we use it to specify the rules for how the search bot should crawl the site. But you should be aware that for Google, robots.txt is a recommendation, not a rule. As a result, pages you have blocked in robots.txt may still end up in Google’s index.
If there are pages that need to be 100% closed to Google, it is better to do it through the meta name="robots" tag. Note that for its noindex directive to work, the page must not be blocked in robots.txt; otherwise, the bot will never see the tag.
It is necessary to highlight 2 attributes:
- noindex — used to prevent content from being indexed;
- nofollow — used so that the bot does not follow the links present on a page.
In this way, we can control how bots will crawl the site.
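The robots.txt logic described above can be illustrated with Python’s standard urllib.robotparser module (the rules and URLs below are a hypothetical example):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: the admin section is closed to all bots.
robots_txt = """
User-agent: *
Disallow: /admin/
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(robots_txt)

print(parser.can_fetch("Googlebot", "https://site.com/blog/seo-guide"))   # → True
print(parser.can_fetch("Googlebot", "https://site.com/admin/settings"))   # → False

# For pages that must be fully excluded, the meta-tag equivalent is:
meta_noindex = '<meta name="robots" content="noindex, nofollow">'
```

Remember that a Disallow rule only discourages crawling; the meta tag in the last line is what reliably keeps a page out of the index.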
Sitemap.xml
Another way to influence the site’s indexing is through the sitemap.xml file.
It specifies the pages that the bots need to crawl first and generally directs them on how to scan a website.
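As a sketch (the URLs are placeholders), a minimal sitemap.xml can be generated with Python’s standard xml library:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Builds a minimal sitemap.xml document from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "https://site.com/",
    "https://site.com/blog/seo-guide",
])
print(sitemap)
```

A production sitemap would usually also carry optional fields such as lastmod, but the loc entries above are the only required part of the protocol.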
Dead links or broken redirects
Bad links and broken redirects will also affect the scanning process because the search bot will not be able to follow them. Here, it’s imperative to ensure everything is running like clockwork.
You can see problem pages through Google Search Console or special software such as Screaming Frog or Netpeak Spider.
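Conceptually, dead-link detection boils down to comparing the links found during a crawl against the pages that actually exist. A toy sketch with invented data (a real tool such as Screaming Frog requests every URL and inspects the HTTP status codes):

```python
# Pages that actually exist on the site (hypothetical data).
site_pages = {"/", "/blog/seo-guide", "/services/branding"}

# Links found while crawling the site's pages; note the typo in the last one.
found_links = ["/", "/blog/seo-guide", "/blog/old-deleted-post", "/servces/branding"]

# Any link pointing at a non-existent page is a dead link.
dead_links = [link for link in found_links if link not in site_pages]
print(dead_links)  # → ['/blog/old-deleted-post', '/servces/branding']
```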
Duplicate content
We should also mention duplicate content. There are two types:
- External — when our site duplicates the content of another website. As I’ve mentioned in the On-Page SEO section, the information must be unique;
- Internal — duplicates within the same site. This usually causes problems with the ranking of the site.
Google Search Console, Screaming Frog, or Netpeak Spider could also help us pinpoint problematic pages.
User Experience
Behavioral factors are also important metrics that affect site ranking, showing us how users behave on a site.
A basic analysis of behavioral factors can be done through Google Analytics. This tool lets us understand what pages users go to, where they come from, and how long they spend on the site.
Usually, in addition to traffic data, we also analyze:
- Bounce rate;
- Exit rate;
- Pages per session;
- Average session duration;
- Conversion rate.
Knowing how these metrics work and how they relate to page types is also essential. For instance, the Bounce Rate in Google Analytics represents the share of users who opened only one page of the site. So if you have a blog, this figure will usually be close to 80%: the user has read your article and left. This is quite normal for a blog and usually shows that the content is complete and of high quality. However, if we see the same figure of 80% on the listing page of an online store, it may indicate a problem.
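To make the metric concrete, here is a toy Python sketch of the bounce-rate calculation described above (the session data is invented):

```python
# Each session is represented by its number of pageviews (hypothetical data).
blog_sessions = [1, 1, 1, 2, 1]       # readers usually view a single article
store_sessions = [1, 3, 4, 2, 5, 1]   # shoppers usually browse several pages

def bounce_rate(sessions):
    """Share of sessions that viewed only a single page, in percent."""
    bounced = sum(1 for pageviews in sessions if pageviews == 1)
    return round(100 * bounced / len(sessions))

print(bounce_rate(blog_sessions))   # → 80, normal for a blog
print(bounce_rate(store_sessions))  # → 33, healthier for a store listing
```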
We can draw more in-depth conclusions by analyzing heat maps from third-party services like www.hotjar.com.
Everything we considered above is common optimization practice that works for all sites. However, we should also mention Local SEO, which has its own special features.
Usually, people search using general queries without reference to any particular location. But there is also a local search when we only want to see local sites. An example of such a query could be something like “Food delivery Los Angeles”. This means someone is searching for a site from a specific city. Sometimes a query may also include a city neighborhood.
Since users care where the company is located, additional efforts are needed to optimize the local site. These include:
- Specifying the address of the company;
- Identifying local queries and optimizing website pages for them;
- Adding and promoting the company in Google Business, etc.
All of this would make a website more visible in the local search results.
How to monitor & track SEO results
Now let’s talk about how to control the process of promoting a site. We’ll start off with traffic.
We can monitor the traffic through Google Analytics and Google Search Console tools.
They are different in terms of data provided:
- Google Analytics offers more information about the user;
- Google Search Console provides more data on how Google understands your site.
We recommend using both of these tools for analyzing websites, as together they provide much more information than separately.
For example, Google Analytics will not show all the queries directing visitors to a website through search pages, but Google Search Console will. If you start to notice that traffic to the site is increasing, you’re probably doing the right thing.
Keyword rankings
As a result of keyword research, we can generate a list of queries for which to promote a website. We suggest tracking the positions of pages for these queries, too. You can do it through Google Search Console: it’s not entirely convenient, but it’s free. Paid services like seranking.com can also help with that. The main thing is to determine the region where the ranking is tracked, because the results in Washington and New York may differ.
If the ranking of the tracked queries is growing, then you’re on the right path.
Conversion Rate
The Conversion Rate is yet another indicator of effective SEO. For that, you can set up goal tracking in Google Analytics. If conversions improve along with the traffic, it means we’re doing it right.
Time On-Page and Bounce Rate
These are the metrics from Google Analytics, which I talked about in the User Experience section. If we can see that user behavior on the site is improving, it can also tell us that we’re going in the right direction.
In this article, I tried to break down the basic terms and stages of SEO. To be honest, they are quite extensive, and we will gradually introduce you to them in the following articles, explaining search engine optimization techniques in more detail.
If you still have questions, email us at [email protected], and I’ll be sure to answer them. Also, subscribe to our newsletter to stay informed when we post new articles on our blog. And if you have a website that you're unsure how to promote, contact us, and we will gladly help you out!