Pushleads | Asheville SEO Services


SEO is the process of making “inside tweaks” and “outside tweaks” to a website so that, over time, it becomes more friendly to Google and other search engines.

Content Delivery Network & Google Ranking

Is there a ranking effect from using a Content Delivery Network (CDN)? John Mueller answers this question.

In a Google Search Central office hours video, Search Advocate John Mueller answered a question about whether a CDN can help improve rankings. Mueller provided a thorough answer that covered page load time, crawling, and SEO.

What is a Content Delivery Network (CDN)?


A content delivery network (CDN) is a service that delivers web pages from a worldwide network of servers, speeding up delivery by serving each visitor from a server located close to them.

A CDN can significantly improve website performance by reducing the number of “hops” a web page must take across the Internet to reach a visitor. A server close to a visitor is less likely to have many “hops.”

Does A CDN Improve Search Rankings?

Is there any ranking advantage to using a CDN versus traditional server hosting? That is what the person seeking Google’s answer wanted to know.

“Does using a CDN to distribute a website’s content improve its ranking?

Our website gets a large proportion of traffic from a particular country. We have servers located in that country to get more visitors.

Is it advisable to put our entire website behind a CDN to improve page load times for users worldwide, or is that not necessary in our situation?”

SEO and CDN Impact

A CDN does not have an SEO impact, according to Mueller.

“So, obviously, you can do a lot of these things. I don’t think it would greatly affect Google regarding SEO.”

It is important to note that Mueller explicitly says he doesn’t believe a CDN would make much of a difference either way. He continues:

“The only effect where I could imagine that something might happen is what users end up seeing. And, kind of what you mentioned, if most of your users are already seeing a speedy website as your server is located there, then you are doing the right thing.

If users in different locations are seeing a prolonged result because perhaps the connection to your country is not that great, then that’s something where you might have some opportunities to improve that.

And you could see that as something kind of in terms of an opportunity in the sense that, of course, if your website is prolonged for other users, then it’s going to be rarer for them to start going to your website more because it’s annoying to get there.

Whereas if your website is pretty fast for other users, then at least they have an opportunity to see an excellent fast website, which could be your website. So from that point of view, if there’s something that you can do to improve things globally for your website, I think that’s a good idea.

I don’t think it’s critical.”

CDN and Impact on Crawling

Mueller then returned to the subject and discussed how a CDN affects crawling:

“Google doesn’t have to see a redirect very quickly or anything like that, but it is one way to grow your website beyond your current country. It might be worth noting that if Google’s web crawling is slow, we can index and crawl less from the website. In most of the websites I’ve looked at, I haven’t seen this as a problem with sites that weren’t extremely large. It might be worth investigating.

So from that point of view, you can double-check how fast Google crawls in Search Console in the crawl stats. And if that looks reasonable, even if that’s not super fast, then I wouldn’t worry about that.”

Bots, both legitimate and malicious, can overwhelm a slow or underpowered web server in a shared hosting environment. That situation might cause the server to give up and return a 500 server response code because it could not serve the requested pages.

Many shared hosting environments suffer from poor performance and reliability, and a typical response is to recommend upgrading to a dedicated or virtual server environment. By serving web pages from the CDN rather than the server where the actual pages are hosted, a CDN can reduce the effects of a sluggish shared server.

Read Next: How to Avoid These Five SEO Mistakes That Can Impact Your Google Ranking


Google Ranking and CDN

According to Mueller, using a CDN does not produce any direct SEO advantage. He noted that he has rarely seen crawling difficulties except on extremely large sites with “millions and millions of pages.”

Mueller says that there are many good reasons to use a CDN, but an SEO advantage is not one of them.

For SEO audits, click here to book your appointment.

What’s Your SEO Score?

Enter the URL of any landing page or blog article and see how optimized it is for one keyword or phrase.

How to Integrate SEO and PPC

Ever wonder how to integrate SEO and PPC strategies?

Silos of communication between teams often cause friction between SEO and PPC efforts because the two disciplines use different methods to increase website traffic. Although they differ, there are areas where they can work together, and improving the efficiency of a shared channel and workflow is one way to reduce that friction.

Friction occurs in three main areas:

  • Reporting
  • Landing pages
  • Budget

Follow these five steps to get your PPC and SEO campaigns working together.

1. First-party data must be ready for collaboration.


First-party data is imperative for digital marketing efforts. If you rely heavily on retargeting campaigns (either because your product is expensive or the customer journey involves several phases), you may find yourself increasingly reliant on native audiences. Your SEO and PPC teams must collaborate to ensure your brand’s data collection complies with privacy regulations.

Analytics audience segments can be a useful alternative when native audience quality is unpredictable, although most of these audiences underperform against activity-based audiences tracked by the brand. You must still get consent and use the new global site tag, and ensure your tag is updated for GA4.

When you create a cookie consent module, you must ensure that it follows cumulative layout shift (CLS) rules. It’s crucial to follow CLS rules in general because modules at the bottom of the page tend to perform better since they don’t distract the user from their purchasing journey and have less CLS risk. Ensure that you protect first-party data (either by hashing and syncing it through tools or immediately deleting it once it’s been uploaded into ad accounts).
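One common way to protect first-party data before uploading it to ad accounts, as mentioned above, is to hash it. Here is a minimal sketch: ad platforms such as Google Ads Customer Match generally expect SHA-256 hashes of normalized values, but check your platform’s current specification before relying on this.

```python
import hashlib

def hash_email(email: str) -> str:
    """Normalize an email (trim whitespace, lowercase), then SHA-256 hash it.

    This mirrors the normalization ad platforms typically require before
    uploading hashed first-party data.
    """
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# The same address in two different spellings hashes to the same value
# only because of the normalization step.
emails = ["  Jane.Doe@example.com ", "jane.doe@example.com"]
hashes = [hash_email(e) for e in emails]
assert hashes[0] == hashes[1]
print(hashes[0])
```

Hashing this way lets the ad platform match customers without you ever handing over raw email addresses.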

Ensure your SEO team’s content campaigns are engaging enough that users willingly opt in to these conversations.

There is one final issue to deal with on the analytics.

The current roll-out of GA4 and conversion modeling is facing a big issue: advertisers must decide whether to rely on analytics as their source of truth or to gain from enhanced conversions. Enhanced conversions require using Google’s native conversion tracking. The numbers reported by conversion tracking will always differ from those reported by your SEO team, even if it gives you a better understanding of how paid campaigns operate.

It may make sense to take a “hit” on PPC-reported numbers so long as the overall metrics point to positive ROAS to preserve trust and data continuity. All parties must be willing to accept reports that will be different if enhanced conversions are used.

2. Adapt and acknowledge domain structure choices.

Brand URLs can be set up in three ways:

  • Everything (including international domains) on one domain.
  • Subdomains for different projects.
  • Vanity and country domains.

Whichever path you choose, PPC-specific pages should be noindex/nofollow yet remain accessible to the ad bot so they can contribute to Quality Score. Non-eCommerce brands are rarely better served by keeping everything on the same domain.
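As a sketch, a PPC-only landing page can be kept out of the organic index with a robots meta tag. Note that Google’s ads crawler (AdsBot-Google) ignores generic `User-agent: *` rules in robots.txt, so it can still fetch the page for Quality Score checks unless you disallow `AdsBot-Google` by name; verify against current Google documentation for your setup.

```html
<!-- In the <head> of a PPC-only landing page:
     keep it out of the organic index and don't pass link equity -->
<meta name="robots" content="noindex, nofollow">
```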

There are certain strategic elements of an SEO’d site that may be detrimental to PPC:

  • Search engine optimization doesn’t want duplicate content, and paid search benefits from testing templates.
  • A complete navigation bar benefits SEO, while PPC does better with limited user choices.
  • An ad linked to an SEO-redirected page may be rejected (three strikes in 90 days will cause the ad account to be suspended).

Subdomains can mitigate these pitfalls if they don’t force SEO and PPC teams to make creative or technical compromises. Furthermore, the same analytics property and branding continuity can be maintained.

If you must use the same landing page for both PPC and organic traffic, make sure any redirects are communicated at least three to five days in advance. This gives the PPC team time to update the ad creative so you don’t waste money sending people to broken pages until Google rejects the ad. Both PPC and SEO teams must also communicate about inventory: search engines can penalize a site whose products are consistently out of stock. Ensure all campaigns are informed of inventory issues so items can be excluded from paid campaigns and the out-of-stock tag added to the organic page.


3. All pages should have transactional intent & CRO.

SEO is often inaccurately labeled the “research” channel, while PPC is often credited exclusively with driving transactions. There are good reasons to associate traffic with research, but each can gain from the other’s approach to establishing trust and driving transactions.

PPC pages typically contain less material, but that does not mean the item or service should be unclear. As in SEO, supporting material (text or video) should sit below the fold to keep the conversion pathway clear. Similarly, rich and authoritative content is required to rank well in SEO, but the traffic will be worthless if the conversion path is hidden (or missing entirely).

A PPC-specific page should follow PPC guidelines: give users enough context to understand what they’re getting into, provide clear conversion paths, and offer more information for those who want it without overwhelming anyone. A multi-step form can also help build brand loyalty.


4. Search query data can be used to build effective ad campaigns.


Sharing search query data is the best way to combine PPC and SEO. You’re already paying for search data via the search terms report; by sharing that data, along with what converts and what doesn’t, content teams will know where to spend their time. Sharing search terms from on-site search and Search Console is an easily overlooked opportunity.

Both channels should share search term data so that brands can prioritize keyword variants based on what existing customers want and how they think, gaining insights on content and auction pricing. Digital channels should collaborate at least quarterly, with automatic report sharing set up so they stay in communication.

5. Allow time to talk with each other.

The benefits of informal, casual conversation are incalculable. Whether it’s a quick chat at the beginning of the week or a monthly collaboration session, talking about the innovations and problems in each area will enable the other person to prepare for improvements or difficulties.

If you’re an agency and your counterpart works for another agency, seek joint meetings with the client or meet independently. Demonstrating your dedication to the brand’s success and your cooperative nature will help retain clients and prevent your excellent work from being accidentally contradicted.

Read next: Small Business Search Trends for 2022 Are on the Rise.

Collaboration and cooperation can eliminate the friction between PPC and SEO. Use them to address each other’s weaknesses and magnify each other’s gains.


SEO Rankings

As search engines become more competent at processing language, relationships are becoming more critical in SEO. Semantic search is here to stay, so it is time to pay attention to connections and associations when producing high-quality SEO content. This post will demonstrate how to use word-topic-page relationships to improve your SEO.

Using entities in your content makes it possible to create higher-quality content, and linking the pages, sections, and guides of your website together helps maximize those connections for fantastic outcomes. How can you begin?

Here are the terms to learn.

  • NLP (Natural Language Processing) – a branch of computer science that attempts to make machines as capable as humans at processing text and speech. It essentially refers to understanding what words mean. Google uses NLP to understand our queries better and provide the best results.
  • Entities – according to Google, an entity is a unique, well-defined thing or concept. By identifying and labeling entities, search engines can understand their meaning based on their relationships with other entities.
  • Semantic search – provides the user with the most relevant results by considering the underlying meaning of the query rather than simply matching words. The search engine uses language and entity associations to grasp the intent and context of the query and deliver relevant results.

What connects all these?


Excellent semantic search relies on NLP and entity understanding. If entities are labeled inaccurately, or Google fails to grasp how we actually use language, search results can turn out strange.

As humans, we learn these connections naturally from the context in which we encounter them; AI must learn them from context as well. Google’s Hummingbird, BERT, and now MUM updates have all been working toward more sophisticated semantic search over recent years.

With everything else in mind, it’s time to consider the wide range of benefits that can be gained by using connections effectively.

1. Content – An excellent semantic search experience requires more than just well-chosen keywords. You must include the relevant entities and describe how they are connected.

What are the reasons for including entities in SEO content?

Google uses entities and their connections to determine meaning and context in any piece of written content. These connections tie each entity to other entities and concepts, forming a web of related things and ideas.

Some of these connections are pretty strong, while others are weaker. Some entities are broadly utilized across the web in many different situations, while others are utilized sparingly in specific circumstances. Google’s knowledge graph recognizes these entities.

For a well-known establishment like the Empire State Building, you can observe a knowledge panel that includes basic information, the address, operating hours, busy times, customer reviews, and even current news. Semantic search uses all these connections to decide which information is most relevant to a user’s query, which can answer many questions before they are even asked.

What does this mean for SEO Rankings?

The strength of an entity’s relationship to a query is critical if you want Google to display your content as one of the top results. This requires a thorough grasp of a topic and its entity connections.

How can I write better SEO content using relationships?

Considering which entities and relationships matter in search is essential for serving users’ queries. Once you appreciate how important entities are, it’s easy to see why you must identify which entities are associated with a topic. To create better content, investigate those entities and connections thoroughly.

There are likely thousands of terms, phrases, and entities pertinent to your subject. To be comprehensive, content on evolution would have to mention Charles Darwin, natural selection, humans, history, and ancestors, for example. You can see how many of these entities are related to the topic using Google’s image search. This is a relatively simple example to show the concept, and creating excellent content is much more complicated.
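As a toy illustration of that coverage idea (the entity list and draft text below are hypothetical, and real tools use NLP models rather than simple substring matching):

```python
# Toy check: which expected entities does a draft actually mention?
expected_entities = ["Charles Darwin", "natural selection", "humans", "ancestors"]

draft = """Evolution by natural selection, first described by Charles Darwin,
explains how humans and other species changed over time."""

# Case-insensitive substring match stands in for real entity recognition.
covered = [e for e in expected_entities if e.lower() in draft.lower()]
missing = [e for e in expected_entities if e.lower() not in draft.lower()]

print("covered:", covered)   # entities the draft already mentions
print("missing:", missing)   # entities still to work in
```

Running a check like this against your entity list quickly shows which connections your content has not yet made.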


You should seek phrase matches in addition to partial matches when setting the context for your content using Phrase-Based Indexing. These related phrases aren’t discovered using traditional keyword research methods. Here’s how you can find the data you need to begin creating NLP-friendly content.

What are the steps to finding related entities and NLP keywords for my content?

Several tools can assist with this:

  • Semrush
  • SurferSEO
  • Frase
  • InLinks
  • Scalenut
  • Ryte

Using these tools is simple. You must set up an editor or project using your favorite keyword (or a group of related keywords). The tools do a lot of the work for you, providing you with a list of words, phrases, or entities to include in your text. Some even tell you how often you should use them.

You can use these pointers to create your content, weaving in related NLP terms wherever you can do so naturally. The NLP terms are often ranked by importance, so give more emphasis to those at the top of the list. You can find essential entities on a subject by checking the ‘Entities’ tab, and you can also use Google’s NLP tools to examine text that already ranks well for a subject.

The encouraging news is that semantic search seeks to comprehend human search intent and deliver relevant results. Once you have finished writing, you can use this same procedure to spot any entities you missed and to evaluate the importance of the ones you included.

Be human when approaching the topic: serve real people by researching and writing for them, and include relevant entities and their relationships naturally as you go.

2. Topic Clusters and Pillar Pages – Improving SEO takes more than writing high-quality content. Content strategists and SEOs must also consider and strengthen connections between related topics and content. Each term and entity is associated with a subject that connects to other topics, and these connections may be strong or weak.

Topic Cluster Strategy


Topic clusters are created by linking supporting pages covering a range of related topics to a central piece of content. By grouping related content around its interrelationships, topic clusters create an enhanced user experience, make it easier for users to locate relevant content, and provide rich, relevant resources.

Also read, DuckDuckGo Has Fallen Below 100 Million Daily Searches.


Where to find Topic Clusters?

Using tools such as AnswerThePublic and AlsoAsked to find related topics and questions on a subject is straightforward, but structuring those questions is a formidable challenge. Where does one article end and another begin? What makes up a pertinent theme cluster, and what stands alone?

Grouping related queries together makes it quicker to outline an article. A series of alternative content concepts can be developed by grouping terms such as prepositions and long-tail keywords by shared themes. Once you’ve compiled a list of article ideas, consider how they relate; the most well-defined subject clusters should become apparent through critical thinking. There are several distinct content cluster types, such as:

  • Step-by-Step Guide: long guides, such as those covering technical SEO, can be divided into several sections to create a content cluster. Each part connects to the next consecutively, creating a linear relationship between parts 1, 2, and so on.
  • Monthly Calendars: each monthly entry is part of the topic cluster, and users can move from one entry to the next smoothly and linearly.
  • A Themed Set: a group of items that work well together because they share a common essence, such as a set of recipes that all include a specific ingredient, or a set of case studies about a particular industry. Although the connections between these items may be more convoluted, the set still holds.

Each one of these subjects has a solid connection to the others. A central pillar page could strengthen the cohesion of this broad subject by gathering them together and emphasizing their relationships. Build a link diagram if you have cluster ideas but aren’t sure how they connect. If a subject doesn’t link nicely to another subject, don’t include it in your cluster. Orphan content should be removed or connected to a different cluster if possible.
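A link diagram like the one suggested above can be sketched as a simple adjacency map: pages that nothing links to, and that link to nothing in the cluster, are the orphans. The page names below are hypothetical examples.

```python
# Sketch: represent a topic cluster as a link graph and flag orphan content.
links = {
    "technical-seo-guide": ["crawling-basics", "site-speed"],
    "crawling-basics": ["technical-seo-guide"],
    "site-speed": ["technical-seo-guide", "crawling-basics"],
    "unrelated-post": [],  # links to nothing in the cluster
}

# Every page that at least one other page links to.
linked_to = {target for targets in links.values() for target in targets}

# Orphans: nothing links to them AND they link to nothing.
orphans = [page for page in links
           if page not in linked_to and not links[page]]

print("orphans:", orphans)
```

Pages flagged as orphans should either be connected to a relevant cluster or removed, exactly as described above.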

These connections will be the framework for your internal linking strategy. Google considers contextual links a framework for reference, which is why you should use appropriate anchor text when linking to other parts of your topic cluster.

What makes pillar pages so powerful?

A pillar page effectively demonstrates your expertise: by consistently publishing high-quality content on a topic, you show your knowledge of the topic and its connections. This pillar page on Search Engine Land is a great example. It combines many related articles, relevant queries, and entities to create a leading resource on the broad keyword “SEO.” It is far more effective than a one-page article because it directs readers to other relevant topics and answers their follow-up questions rather than sending them back to search again.


3. Website – Creating compelling pillar pages is critical for drawing people to your website. What better way to start than by brainstorming about what your website is all about?

What should I post on my website?

If you’re starting a website from scratch, begin at the top and work your way down. Think about what you want your website to represent at a fundamental level, then consider all the significant subjects you want it to cover. Depending on the kind of website and your development phase, you may already have a structure.

  • Before launching an eCommerce site, research everything people want to know about your main product categories.
  • When creating a service-based or corporation website, thoroughly answer questions about your core service areas and industries.
  • When creating a publishing site, consider the main topics you will cover and create a detailed hub-and-spoke plan to cover them in-depth.

It’s critical to stay on topic if you want to use relationships to improve SEO. Make sure every page connects to other pages on your site and stays on topic.

How do you become an expert on a topic?

To become an authoritative source, you must regularly publish high-quality material related to your core theme and link to relevant material. Build strong connections and do so proficiently. As you begin to rank well for many related terms, it will be easier to rank for related topics.

Do this often enough and well enough, and you may create semantic links of your own. What we write feeds machine learning, so if authoritative sites talk about you in the correct context, the semantic link between your brand and specific terms strengthens, even when the connection is implied rather than stated explicitly.

Quality. Relevance. Context.


An effective semantic strategy is critical for producing SEO content. At a granular level, these connections between words, topics, and pages are critical to creating engaging content. A strong strategy capitalizes on these connections in any industry to generate results.


SEO Mistakes to Avoid

In 2019, BrightEdge research found that 68% of all website traffic came from organic and paid search. According to HubSpot, Google processes over 5.6 billion searches daily, which means numerous potential customers could be finding your website and, as a result, your business.

There is no question that Google and other search engines will not rank your site at the top just because you ask nicely. To rank higher on Google, you must follow SEO best practices. Here are five typical SEO mistakes to avoid.

1. Keyword Stuffing


In the early days of search engines, the number of times a keyword or phrase appeared on a page was one of the main factors determining rankings; in theory, the more frequently a keyword appeared, the more relevant the page. Marketers quickly figured out how to cram as many keywords as possible into their copy, but Google has since gotten smart and no longer rewards keyword stuffing.

Google algorithms are tremendously sophisticated today. They can tell if you’re stuffing keywords on your website or providing quality content. SERP rankings will drop if you’re caught stuffing keywords. To rank higher with Google, provide excellent material.

Although keywords are still important, they are not as critical as they once were. Don’t stuff them throughout your website; naturally include them and answer your users’ queries. If you write in a natural style and answer your users’ questions, Google will likely reward you.


2. Inappropriate Use Of H Tags

Search engines like Google “scan” your pages to discover what they are about. If you make your pages easier for search engines to read, they can locate your best material faster, resulting in a higher ranking for your website. Use H tags to assist search engines in reading your webpage. H tags are your page’s various headings: the page title should be an H1, and subheadings should be H2s. If an H2 has a subheading of its own, it should be an H3, and so on.

Beyond simply making your content look pretty, these H tags are how Google and other search engines can figure out what your different sections are about.

Use proper H tags if you want to rank on Google SERPs. Do not just alter the size and color of your headline fonts to make them stand out: if you don’t mark the text as H1, H2, or H3, Google will treat it as body text. Using H tags enhances your SEO, gives your content an extra boost, and improves user experience and reader navigation.
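In HTML terms, the hierarchy described above looks roughly like this (the headings themselves are illustrative):

```html
<h1>Five SEO Mistakes to Avoid</h1>            <!-- page title: one H1 -->
  <h2>1. Keyword Stuffing</h2>                 <!-- main section heading -->
    <h3>Why Google stopped rewarding it</h3>   <!-- subsection of the H2 -->
  <h2>2. Inappropriate Use of H Tags</h2>      <!-- next main section -->
```

The indentation is only for readability; what matters to search engines is that the tag levels nest logically.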

3. Maintaining A desktop-only Website Design

Search engines aim to give users simple access to the most relevant information, so your website must provide excellent content and be user-friendly. According to Statista, nearly 90% of the world’s internet population used a mobile device to go online in 2021, and over half (55%) of all online traffic came from mobile devices.

Google recognizes that most people are browsing the internet using their mobile phones. To provide a better customer experience, they reward sites with a mobile-friendly design.


If you designed your site to look nice on a desktop, it may be time to rethink it. Creating a website that functions well on mobile is one of the best ways to rank higher on Google SERPs.

Read Next: Should You Quit Doing SEO?

4. Broken Links

Besides citing sources and leading customers to the next stage of the buying journey (which they do exceptionally well), links are a critical element of SEO. Google tracks links in your content for two reasons:

  • To define SEO equity.
  • To look for related content.

Search engines track your website’s SEO equity, which is your site’s credibility. If you keep posting high-quality content, your equity will probably rise, and you can improve it further by acquiring links from other high-credibility websites. The same logic applies to the sources you cite: a peer-reviewed paper from Harvard University will pass along more SEO value than a Wikipedia page. Google also follows your internal links to find new relevant content, so it’s a good idea to link to relevant pages on your own website.

Since links constantly change, pages may be deleted or altered, which may result in your page having a broken link (404 error). Broken links on your page can impact your SEO if Google cannot follow them. Don’t forget to set a 301 redirect if you discover a broken link to your website. Monitor your links over time to ensure they don’t break. This simple technique may help you reach the top of Google SERPs and boost your SEO.
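On an Apache server, for instance, a 301 redirect for a removed page is a one-line rule (the paths below are hypothetical examples):

```apache
# .htaccess: permanently redirect the old URL to its replacement
Redirect 301 /old-blog-post/ /new-blog-post/
```

A permanent (301) redirect tells both visitors and search engines where the content moved, preserving the link equity the old URL had earned.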

5. Saying 'Good Enough.'


An SEO strategy isn’t something you can put in place and then forget about; it requires constant attention. You can never achieve 100% optimality, because SEO is an ongoing process and its rules are constantly changing. To rank well in Google results, regularly update your website to follow current SEO trends. Don’t settle for ‘just good enough’: keep tracking and improving to boost your rankings.

Anyone hoping to rank on the internet should avoid keyword stuffing, use H tags properly, build a responsive website with excellent UI/UX, repair broken links, and maintain a consistent SEO strategy. Addressing these issues can help you rank higher in search engine results.


Difference Between On Page and Off Page SEO

The difference between on-page SEO and off-page SEO comes down to which factors each addresses: the first covers elements on a web page, while the second pertains to factors outside of it.

On Page SEO


On-page SEO is the practice of optimizing a webpage’s visible content and HTML source code to help search engines index the page.

Finding relevant subtopics is an example tactic. Poor rankings for high-traffic queries might indicate that your page lacks information users seek. You can use Ahrefs’ Content Gap tool to compare the content of high-ranking pages with yours; it displays terms you don’t target that are relevant to pages that rank well.

  1. Paste in the URLs of the pages you want to compare and click “Show keywords.”
  2. Find what your content is missing. In this example, the missing definitions of guest blogging and guest posts are what we want to address.
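The underlying idea of a content gap is a simple set difference. Here is a toy version (the keyword lists are hypothetical, and Ahrefs works from real ranking data rather than hand-made lists):

```python
# Toy content gap: keywords competitor pages rank for that yours does not target.
competitor_keywords = {"guest blogging", "guest post definition", "blogger outreach"}
your_keywords = {"guest blogging", "blogger outreach"}

# Set difference: everything the competitors cover that you don't.
content_gap = sorted(competitor_keywords - your_keywords)
print(content_gap)  # topics worth adding to your page
```

Each term in the gap is a candidate subtopic to add to your page.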

Off Page SEO

Off-page SEO comprises the many efforts that can be made outside of a website to improve its search engine rankings.

Link building is an example tactic. You can earn backlinks organically or ‘build’ them, and link building strategies are plentiful. One is seeking out linking patterns among your competitors so you can get links from the same websites; a backlink checking tool is necessary for that.

To see how linking patterns work, the best approach is to examine websites that link to your competitors but not to you. The Ahrefs Link Intersect tool can help you do just that.

  1. Paste in the URLs you want to compare.
  2. Look through the results to find relevant domains.
  3. Click the number of links pointing to competitor pages to see the actual backlinks.

What's the significance of on-page SEO and off-page SEO?

Google uses ranking factors on your pages and outside them to determine where a page should rank. On-page SEO and off-page SEO work hand in hand to produce the best results. Depending on your goals, you may focus more on either SEO type. However, it would be best if you did not focus on one exclusively all the time.

Factors that Impact On-page SEO

The SERPs (search engine results pages) are where most people look for information, so making your content visible to as many users as possible is essential. I will discuss known ranking factors that can help you increase your SERP visibility and attract visitors to your site. Here are the most important things to consider if you want to rank higher and gain more traffic to your content.

Search Intent. It is one of the most essential ranking factors. It describes why people search. In SEO, you attempt to provide the information people seek when they type in a search term.


On-page SEO quality depends on accurately matching search intent, because search engines aim to deliver users pertinent, valuable information every time.

You should look at the search result pages for a particular query and identify the three Cs of search intent to optimize your content:

  • Content Type – What is the primary type of content? Is it a product page, a blog post, a video, or something else?
  • Content Format – Common formats include how-to guides, list posts, reviews, and comparisons.
  • Content Angle – “Best,” “cheapest,” and “for beginners” are all examples of a content angle, the content’s unique selling point.

Must Read: Interlinking Your Blog Posts for SEO

Content Quality

Quality content is also crucial. Google describes quality content as content readers value, and it aims to reward those attributes in its algorithms. However, creating “interesting and engaging content” that matches search intent is not sufficient on its own. Content should also be:

  • Easy for audiences to read.
  • Organized.
  • Fresh.
  • Outstanding.
  • Aligned with E-A-T guidelines.
  • Focused on answering the searcher’s question.

Creating better content than your competitors is essential in practice as well.


URLs

According to Google’s John Mueller, URLs are an overrated SEO factor and should not be a primary concern; focus instead on creating a pleasant user experience. Despite Mueller’s opinion, Google’s SEO guidelines do advocate URL optimization.

Users can see the URL both in the address bar and on the SERPs, and that information enables them to a) choose which results they prefer and b) know where they are on the website. An unfriendly URL looks like this:


The example website doesn’t use HTTPS and has an overly nested URL structure. Furthermore, the page’s subject is not apparent from the URL. Here is an example of a user-friendly URL:


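The original example screenshots did not survive extraction, but hypothetical URLs (made-up domain and paths) illustrate the contrast:

```text
# Unfriendly: no HTTPS, deeply nested, subject unclear from the URL
http://example-store.com/cat/c127/p/9f3k2?sid=88271

# Friendly: HTTPS, shallow structure, descriptive slug
https://example-store.com/coffee-grinders/burr-grinder-review
```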
You might also like: Everything You Must Understand About the Google Penguin Update.

Page Titles


Google uses page titles to better understand what a page is about and match it to the intent behind search queries.

Inevitably, searchers also rely on page titles to grasp what they will get from a website. Here are some excellent techniques to please both parties:

  • Users should be drawn in by an eye-catching and accurate title and then proceed to an offer that accurately describes what makes it unique.
  • Your title should contain the target keyword, but make sure it sounds natural.
  • Your title may get truncated if it goes over roughly 60 characters, so keep it short and sweet.
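As a rough sketch of the 60-character guideline above, a simple length check can flag titles at risk of truncation. Note this is only a heuristic: Google actually truncates titles by rendered pixel width, not character count.

```python
def title_truncation_risk(title: str, limit: int = 60) -> bool:
    """Return True if a page title is likely to be truncated on the SERP.

    The 60-character limit is a rough heuristic; Google truncates by
    pixel width, so treat this as a first-pass check only.
    """
    return len(title) > limit

# A concise, keyword-bearing title passes the check
print(title_truncation_risk("Asheville SEO Services | PushLeads"))  # False
# An overly long title risks truncation
print(title_truncation_risk(
    "The Complete, Definitive, Ultimate Guide to Search Engine Optimization in 2022"
))  # True
```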

Meta Description

Searchers should be the primary focus when optimizing meta descriptions. Meta descriptions don’t affect rankings but appear on search result pages (just beneath the page title), so they may affect clickthrough rates (CTRs). The following SEO practices are essential:
  • Your reputation matters, so make descriptions compelling enough to entice users to click without resorting to clickbait.
  • Keep descriptions under roughly 920 pixels wide (a tool such as SERPSim can check this).
  • Ensure the description and the page title are consistent (and vice versa).
  • Every page should have a unique description.
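Putting the title and description guidance together, a page’s head section might look like the sketch below (the page, title, and description text are hypothetical):

```html
<head>
  <!-- Title under ~60 characters, target keyword included naturally -->
  <title>Asheville SEO Services | PushLeads</title>
  <!-- Unique, compelling description, consistent with the title -->
  <meta name="description"
        content="Local SEO services in Asheville, NC. Get a free audit and see how your site scores for your target keyword.">
</head>
```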

Outbound Links

Links to external sites probably have no direct bearing on rankings. An outbound link points to a page on a website other than yours. Don’t force outbound links into your content in hopes of improving rankings; citing your sources is a better approach. Citing sources helps maintain your content’s authenticity, openness, and accuracy, and makes it easier to adhere to the E-A-T search quality guidelines.

Schema Markup

Schema markup helps search engines understand your content and represent it better in the search results.

Meta tags are comparable to schema markup. While schema markup does not impact rankings, using it can make your content more visible on the SERPs. Even though schema markup seems like computer code, it is not difficult to learn. You can use the Schema Builder extension to assist you in adding it to your web pages.
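For example, a blog post could describe itself to search engines with a small JSON-LD block in the page head. The values below are hypothetical, using schema.org’s Article type, one common choice:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Difference Between On Page and Off Page SEO",
  "author": { "@type": "Organization", "name": "PushLeads" },
  "datePublished": "2022-06-01"
}
</script>
```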

Internal Links

Internal links are links to other pages on the same website. Google uses internal links in several ways:

  • To find new pages, internal links provide a crawl path to them.
  • You can boost other pages you own by passing link equity between your pages.
  • Google ranks pages based on what they contain. Therefore, the internal link’s anchor text is important in understanding the page’s content.


Users finding content and navigating your website are two reasons you shouldn’t neglect internal linking. Plan content hubs strategically so you can add internal links as you create new content. It’s never too late to add internal links to your existing content.
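In HTML terms, an internal link is simply an anchor pointing at another page on the same domain; descriptive anchor text helps Google understand the target page. The path below is hypothetical:

```html
<!-- Descriptive anchor text signals what the linked page is about -->
<p>Learn more in our guide to
  <a href="/blog/interlinking-blog-posts">interlinking your blog posts for SEO</a>.
</p>
```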

Page UX

According to UX designers, a website’s user experience is its overall impression. In SEO, however, page UX refers specifically to the usability of a site or page: keeping the interface clutter-free, distraction-free, and user-friendly. UX improvements should generally be made to the entire website, not just individual pages. If you want to preserve unique layouts for some pages, remember that a distinct appearance produces a distinct experience. Here are some things to look for:

  • Don’t use annoying pop-ups, such as sign-up forms or exit forms. Follow Google’s interstitial guidelines (if you use banners that alter the layout).
  • Before publishing important pages, ensure they are not sluggish – you should optimize for Core Web Vitals.
  • Try to keep your website’s design simple, consistent, and user-friendly. Avoid overburdening the user’s mental capacity.
  • More than 50% of website traffic comes from mobile devices, and Google indexes and ranks the mobile version of websites first (mobile-first indexing).

Do ranking factors include UX? Two elements can affect your rankings: Core Web Vitals and mobile-friendliness. Google uses these page experience signals to position comparable webpages, but they rarely cause significant ranking changes. You may still rank if your web pages aren’t mobile-friendly or are slow.

What Are The Impacts of Off-page SEO?

I will discuss those elements that directly impact rankings and those that do not but can still help you gain more organic traffic and visibility.


Backlinks

Backlinks are a critical ranking factor in addition to search intent. Google’s PageRank evaluates the quantity and quality of backlinks to determine the value of a page. A page with more backlinks from unique websites is likelier to outrank its competitors on the SERPs.

Having a lot of backlinks helps amplify the amount of organic traffic a page receives.

Not all backlinks are created equal. You can gauge a backlink by these six characteristics:

  • Authority – Pages that earn more links accumulate more authority, and the votes they pass on to other pages carry more weight.
  • Relevance – Here is Google’s definition: “It’s a positive sign that the information is of high quality if other prominent websites link to the page.”
  • Anchor Text – When Google evaluates a backlink, it considers the anchor text, which helps describe the target page’s content.
  • Follow vs. Nofollow – Google generally ignores “nofollow” links when ranking pages; “followed” links, the opposite, are given more weight. Links are “follow” by default unless specified otherwise.
  • Placement – Links that are more likely to be clicked (for example, links embedded in the content or positioned higher on the page) tend to pass more authority.
  • Destination – A link boosts the specific page it points to, but you can spread that link equity to other pages through internal linking.
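The follow vs. nofollow distinction above is set with the link’s rel attribute; a link is “follow” simply by omitting it. The URLs here are placeholders:

```html
<!-- Followed by default: can pass ranking credit -->
<a href="https://example.com/guide">a useful guide</a>

<!-- Nofollow hint: Google generally won't count this link for ranking -->
<a href="https://example.com/ad" rel="nofollow">sponsored link</a>
```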


Google Penguin Update

Learn about the complete history of Google’s Penguin algorithm and how it affected SEO practice.

Google released the Penguin algorithm ten years ago, and since then manipulative link-building approaches have become less common. Over the years, the algorithm has been updated several times, and it is now an integral part of Google’s core algorithm. Penalties are still present, albeit less frequently than before, and may be either partial or complete.

Google says it usually ignores many poor-quality links online. However, it is still vigilant for unnatural patterns such as link schemes, private blog networks, link exchanges, and unnatural outbound linking behavior.

Penguin Algorithm Introduction


Google officially launched the “webspam algorithm update” in 2012 to fight link spam and manipulative link-building techniques. Matt Cutts, then head of the Google webspam team, later dubbed it the Penguin update.

Google never explained where the name Penguin came from, but it likely has a similar origin to Panda, which was named after one of the engineers involved with it.

One of my favorite naming theories is that Penguin references The Penguin from DC’s Batman.

Before the Penguin algorithm, link quantity had a substantial influence on how Google crawled, indexed, and evaluated pages. As a result, some low-quality websites and content appeared in more prominent positions on organic search results pages than they deserved.

What Drove Google To Create Penguin?

Until then, the Panda algorithm had been Google’s main tool for eradicating poor-quality content. The Penguin algorithm was created to stop something related: the black hat link building that had become popular at the time. Cutts said at SMX Advanced 2012:

“We conceived of Penguin as a way to target low-quality content. Panda was the first such tool; afterward, we noticed that spam was still prevalent.”


Penguin was created to reduce the number of spammy links by refining the method by which websites and web admins earn links. Penguin observes how links are earned and processed to downrate manipulative and spammy ones to ensure that natural, authoritative, and relevant links are rewarded.

Penguin only evaluates a site’s incoming links; it does not consider the site’s outgoing links.

The Initial Launch and Impact

When Penguin was first released in April 2012, Google estimated that it affected over 3% of search results.

In May 2013, Penguin 2.0, the algorithm’s fourth release (including the initial one), impacted about 2.3% of all queries. At launch, Penguin was said to target two specific manipulations: link schemes and keyword stuffing.

Link schemes is the general category for manipulative link-building techniques, including exchanges, paid links, and the other unnatural tactics described in Google’s link scheme documentation. The initial Penguin release also targeted keyword stuffing, a practice that has since become associated with the Panda algorithm (considered a content and site-quality algorithm).

Major Google Penguin Updates And Refreshes


Penguin has seen many documented updates and improvements since it was first released in 2012, and potentially several undocumented algorithm updates as well.

  • May 25, 2012: Google released Penguin 1.1.

Google was refreshing its data for the first time since the Penguin algorithm launched. Websites that had previously been affected and had cleaned up their link profiles were seeing some recovery, while others that were missed the first time around saw an impact.

  • October 5, 2012: Google launched Penguin 1.2.

This data refresh affected both English-language and international queries.

  • May 22, 2013: Google Penguin 2.0 was released.

Penguin 2.0 was a more sophisticated version of the Penguin algorithm, and it changed how the algorithm affected search results. Around 2.3% of English queries were affected, as well as other languages proportionately. This was the first Penguin update that looked beyond the website homepage and top-level category pages for evidence of link spam being directed to the website.

  • October 4, 2013: Google Penguin 2.1 was released.

Google provided no official explanation for Penguin 2.1, a refresh of 2.0 released on October 4 of the same year. Still, data suggests the refresh deepened how far Penguin crawled into websites when checking for spammy links. About 1% of queries were affected.

  • October 17, 2014: Google launched Penguin 3.0.

Google’s Pierre Far revealed in a Google+ post that although this update was billed as a major release, it was in fact another data refresh. Sites harmed by previous updates that had since cleaned up could recover, while many that had used spammy link practices and evaded the earlier impacts were now affected. Far said the update took “a few weeks” to roll out fully and affected less than 1% of English search queries.

  • September 23, 2016: Google launched Penguin 4.0.

Penguin’s last update was released almost two years after the 3.0 refresh. The most significant change in this version was that Penguin became part of Google’s core algorithm. When an algorithm joins the core, that does not mean its functionality changes significantly.

What changed is that Penguin now runs with the core, monitoring websites and links in real time, so you can see reasonably immediate results from your link building or remediation efforts. This reflects a shift in Google’s handling of the algorithm, not in the algorithm itself.

Penguin 4.0 was also more lenient: it devalues spammy backlinks, whereas previous Penguin updates penalized the sites carrying them. That said, link-based penalties still exist, according to research and my personal experience. Data from SEO professionals (for example, Michael Cottam’s) showing algorithmic downgrades lifted through disavow files after Penguin 4.0 reinforces this notion.

Algorithmic Downgrades of the Google Penguin

Web admins and brands that had used manipulative link-building methods or filled their backlink profiles with copious low-quality links soon noticed drops in organic traffic and rankings after the Penguin algorithm was released. Not all Penguin downgrades were site-wide; some were partial and affected only certain keyword groups that had been heavily spammed and over-optimized, such as key products and, in some cases, even brand terms.

Research and experimentation have shown that Penguin’s impact cannot be escaped with a 301 or 302 redirect, and that problems may arise when switching domains and redirecting the old one to the new. On the Google Webmasters Forum, John Mueller concurred that using meta refreshes from one domain to another can cause issues:

“Meta-refresh redirects should generally be avoided, as they could mislead users (and search engine spiders, who might mistake them for an attempt at redirecting).”

Read next: The Significance of Google’s May 2022 Fast Core Update.

Recovery from Google Penguin

Even though Penguin is now a component of Google’s main algorithm, the disavow tool has still been an asset to SEO practitioners. Reports and theories have been published arguing that disavowing links does not help with link-based algorithmic downgrades or manual penalties, but Google employees have publicly dismissed this theory.

Google recommends using the disavow tool only as a last resort when fighting link spam. Still, disavowing a link is far less time-consuming (and often more effective) than pursuing the removal of every unwanted link.


What A Disavow File Must Have

To stop bad links from damaging your website, you can provide Google with a disavow file, which tells Google to ignore every link listed in it. As a result, algorithms like Penguin will no longer count those links against your rankings.

Be careful, though: any quality links you include in your disavow file will no longer benefit your rankings. You may include notes in the file for your own reference, but nothing else; the file may contain links alone or nothing at all.

Google processes the disavow file automatically; no human ever reads it, so don’t bother including notes about when URLs were added or how to contact a webmaster about removing a link. Notes are only helpful as internal reference.

After you submit your disavow file, Google will send you a confirmation, but it will not discount the listed links until the file has been processed. Google must still crawl each link you included, and the disavow file itself won’t prompt Google to crawl those pages. Therefore, submitting the file won’t lead to instant recovery.

Unfortunately, Google Search Console reports discounted and non-discounted links alike, so there is no way to tell which links have been discounted.

If you submit a new disavow file, Google replaces your old one, so be sure the new file still includes any links you previously disavowed.

You can always download your current disavow file from Google Search Console.

Disavowing Domains vs. Individual Links


In most cases, disavow at the domain level rather than disavowing individual links. The exception is a significant website that sends you both quality links and paid links; there, disavow the specific bad links individually.

For a domain-level disavowal, Google only needs to crawl one page on that site to discount its links to your site.

When using domain-based disavows, you do not need to worry about whether links are indexed as www or non-www; the domain-based disavow covers both.
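A disavow file is a plain UTF-8 text file with one entry per line: lines beginning with # are comments Google ignores, a domain: entry disavows every link from that domain, and a bare URL disavows a single page. The domains below are made up:

```text
# Spammy directory: disavow the whole domain
domain:spammy-links-directory.example

# One paid link on an otherwise good site: disavow just the URL
https://good-site.example/sponsored/old-campaign-page
```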

How To Find Your Backlinks

A link audit and removal or disavowal of low-quality or spammy links may be necessary if you think your site has been negatively impacted by Penguin.

The Google Search Console backlink report lets site owners see the links pointing to their site, but be cautious: it also includes nofollowed links. A nofollowed link generally has no impact on your site, but be mindful that the linking site could remove the nofollow attribute without warning at some point.

Third-party backlink tools cannot show you every link to your website because some sites block their bots from crawling. Some high-quality, well-known sites refuse to waste bandwidth on those bots, while some spammy websites use the same technique to conceal their low-quality links.

In addition to monitoring backlinks, it’s essential to stay on top of negative SEO attacks, since our industry is not always honest. In a negative SEO attack, a competitor purchases spammy links and points them at your site in the hope that Google catches your site for having them.

On the other hand, Google is pretty good at spotting this type of attack, so most website owners do not need to worry about it. Consequently, it is unwise to use the disavow tool proactively without a clear sign of an algorithmic downgrade or a manual action notice.

However, 38% of SEOs maintain that they never disavow backlinks, according to a survey conducted by SEJ in September 2017. Examining a backlink profile and determining whether each linking domain is a link you want or not is not a simple job.

Link Reduction Outreach

Before disavowing wrong links, Google recommends that you attempt to reach out to websites and web admins to request the removal of those links.

Google recommends that you never pay to remove links. Rather than paying, include those links in your disavow file and move on to the next removal request.

Link-based penalties can be recovered through outreach, but it is not always necessary.

The Penguin algorithm evaluates the link profile and the proportion of high-quality, natural links versus spammy ones. A downgrade may be applied only to specific over-optimized keywords rather than site-wide. Ongoing link maintenance and monitoring should keep you safe.

Some webmasters go so far as to include linking “terms” within their website’s terms and conditions and actively reach out to websites they believe should not be linking to them.

Determining Link Quality

Many people have difficulty evaluating link quality. Do not assume a link is high quality just because it comes from a .edu website. There are plenty of spammy .edu domains on which students sell links from their pages; those links should be disavowed.

There are also many hacked sites on .edu domains hosting low-quality links. The same caution applies to all TLDs and ccTLDs: do not make automatic assumptions based on domain type.



Google has stated that domain names alone do not impact search rankings; you must examine each case individually. There’s a long-running joke that no quality page has ever existed on a .info domain because so many spammers have used them. Yet excellent links do come from .info domains, which illustrates the importance of evaluating links individually.


Be Careful Of Links From Presumed High-Quality Sites

Be careful when evaluating links from well-known websites; a link from the Huffington Post or the BBC doesn’t automatically mean Google will consider it a great link. Don’t just scan the list of linking sites; determine whether each individual link is of high quality.

Some of those sites also sell links, though they may be disguised as advertisements or placed by rogue contributors within articles. Many SEOs have confirmed that such links from high-quality websites are low quality, having received manual actions that cite links from these sites in Google’s examples. They may also contribute to a Penguin issue.

In the future, more links will be flagged as low-quality because of increased advertorial content. Always investigate links, particularly if you plan on keeping any of them just because they are on a reputable site.

Promotional Links

Be careful with promotional links; they may still be considered paid links even if no money changes hands, and paid links are risky.

Google views any link exchanged for a free review product or a product discount as a paid link. Years ago, such links were acceptable, but now they must be nofollowed.

Such a link can still provide value, but through brand awareness and traffic rather than rankings. You may also have links from old promotional campaigns that now hurt the site.

Individual link assessment is vital for all these reasons. You want to remove the poor quality links because they are negatively impacting Penguin or might cause a future manual action, but you don’t want to remove the good links, which are helping your search rankings.

On the linking site’s side, links that are not nofollowed can also trigger a manual action for unnatural outgoing links.

No Sign Of Penguin Recovery?


It’s common for webmasters to notice no rise in traffic or rankings after they’ve gone to great lengths to clean up their link profiles. The following are some possible reasons for this:

  • Before being penalized by the algorithm, the site experienced an unjustified (and likely short-lived) boost in traffic and rankings from bad backlinks.

  • No attempt has been made to obtain higher-quality backlinks once links have been removed.
  • Some harmful backlinks still need to be disavowed/removed, and the proportion of removed links is not high enough.
  • The problem wasn’t based on links, to begin with.

Many site owners erroneously believe they will begin ranking at the top for their most popular terms as soon as a Penguin downgrade is lifted. That is probably out of reach.

Because evaluating link quality is difficult, some links that were helping your rankings will inevitably be disavowed along with the bad ones, so you cannot expect your rankings to return to their former levels.

There are several things to consider when optimizing for Google. The first is that Google’s ranking algorithm constantly changes, so things that benefited you before might not help as much now, and vice versa.

Myths and Misconceptions About Google Penguin

It’s lovely to be involved in the SEO industry, as it’s a lively and dynamic community, and new findings and theories are posted online every day. Of course, this has led to several misunderstandings and myths about Google’s algorithms. Penguin is no different.

A few things about the Penguin algorithm have been misunderstood over the years.

It is a myth that Penguin is a penalty. The biggest misconception about the Penguin algorithm is that it is a penalty (Google’s term for which is a manual action). Penguin is purely algorithmic and cannot be manually lifted by Google. Although both algorithmic downgrades and penalties can cause big declines in website rankings, there are significant differences between them.

A penalty (or manual action) occurs when a member of Google’s webspam team responds to a flag, reviews the site, and decides to penalize it. You will receive a notification about the penalty via Google Search Console.

If you are hit with a manual action, you must review your backlinks and disavow the spammy ones that go against Google’s guidelines, then submit a reconsideration request to the Google webspam team. If successful, the penalty is lifted; if not, re-examine your backlink profile.

Penguin downgrades occur without Google team members becoming involved. It is all done algorithmically. Previously, you would have to wait for an update or refresh, but Penguin now operates in real-time, making recoveries happen much faster (if enough remediation work has been accomplished).

It’s a myth that Google will inform you if Penguin hits your site. The Penguin algorithm sends no notification when it is applied; Search Console won’t alert you if your rankings drop because of Penguin.

With a manual penalty you are notified, but algorithms such as Penguin give no notice; the recovery work itself, however, is similar.

It is a myth that disavowing bad links is the only way to reverse a Penguin hit. Penguin evaluates the proportion of good-quality links versus spammy ones, so trying to manually remove every bad link is often an unnecessary drain on time and resources.

It may be wiser to focus on earning more high-quality links to your website than on removing the low-quality ones; that shifts Penguin’s proportions more effectively than manual removal.

It’s a myth that you can’t recover from Penguin. Sure, you can recover from Penguin. However, you must be familiar with Google’s unpredictable algorithms.

Instead of obsessing over removing the Penguin downgrade from your website, focus on gaining quality, editorially given links. Setting your current links aside and earning new ones will make it easier to release your website from Penguin’s grasp.

Click here to book an appointment for your SEO and business needs!


Choosing the Right Domain Name

Choosing the right domain name is crucial when launching a website, but do keyword domains have ranking power?

Domain names with keywords are considered valuable for several reasons, one being the belief that they might help with ranking. Therefore, it is crucial to pick the right one. There are three types of names to choose from:

  • Keyword domain
  • Word + keyword domain
  • Brand domain

If you’re unsure which methodology is most advantageous, become familiar with this information before choosing.

Keyword Domains


An individual or company that owns a keyword domain can create a strong connection with their audience by placing words relevant to the business in the domain name. An example of this type of domain is Widgets.com. Some companies own generic keyword domains and simply redirect them to their main websites; this technique can be effective.

For example, the URL coffee.com redirects to Peet’s Coffee, a specialty coffee roaster, making it simple for people to find Peet’s.


However, generic keyword domains have a disadvantage: “all the desirable ones” are locked down and excessively costly to pry away from a domainer. Generic keyword domains also have some historical significance on the internet.

Before search engines and browsers matured, users typed the name of a product or service directly into the address bar to reach a relevant website. This direct navigation generated substantial revenue for those who owned such domains and parked them. Parking a domain meant setting it up to display ads and nothing but ads.

Even in the era of search engines, parked domains could make money: a one-word query such as [burgers] might surface Burgers.com as a result.

As a result, in 2011, Google decreased the search visibility of parked domains. So is there still ranking power in keyword domains? Not anymore, and Google’s John Mueller has more to say about it below.

Word + Keyword Domains

The most common strategy is to include a word in a domain name that describes what visitors can expect on the site. Websites like Cheap[name of product/service].com, [name of product/service]Reviews.com, Fast[name of product/service], and so on are examples of this. Creating a domain name using keywords and words is not a terrible idea.

  • Positive Aspect Of A Word + Keyword Domain

The keyword immediately identifies the site’s purpose, and the word conveys the visitor’s intention.

Looking for a review? Check out [name of product/service]Reviews.com.

  • Negative Aspect Of Word + Keyword Domain

Having a website committed to a particular niche can be a disadvantage, as it prevents the website from growing and diversifying.

If you started as one, it would be challenging to turn a site like JoesCameraReviews.com into one that reviews or sells other products.

That said, many sites rank very well for the keywords in their domains.


Branded Domains

Branded domains are domain names that don’t necessarily include keywords. Etsy, Amazon, and Zappos are examples of branded domains.

There is nothing wrong with using a branded domain to build a website, as long as the domain name doesn’t define the site’s content. Many sites with branded domains can rank well in search results.



Google's Four Insights on Keyword Domains

Google Search Advocate John Mueller recently answered a question in a Webmaster Hangout, providing four indicators of the significance of domain names in rankings.

1. Keyword domains don’t have a time benefit.
According to Google’s John Mueller, keyword domains are no better than branded domains when it comes to ranking quickly. There is a long-standing belief that keyword domains can rank faster, partly because anchor text links containing the keyword can help with link building. However, according to Mueller, this is not the case. The idea has been around for years and remains heavily debated.

Unfortunately, John Mueller’s remark did not address this supposed advantage. Here is what John Mueller stated: “…it takes time like any other new website… Obviously, there are lots of websites out there that do rank for the keywords in their domain name. But they worked on this maybe for years and years…”

2. Keywords in domains do not rank better.
According to John Mueller, keyword domains do not rank better than branded domains.

“…just because keywords are in a domain name doesn’t mean that it’ll automatically rank for those keywords.”

Ranking really depends on many factors, such as the content, the user’s desire for the content, and links. All of these things may well be more important than domain keywords. John Mueller didn’t explicitly say keywords in the domain name weren’t a ranking signal, but he did say there was no significant advantage to having the keywords there. This is an important takeaway.

3. Keyword Domains lost impact years ago.
According to John Mueller, keyword domains lost their influence years ago:

“…just because keywords are in a domain name doesn’t mean that it’ll automatically rank for those keywords. And that’s something that’s been the case for a really, really long time.”

Google announced in late 2011 that it had updated its algorithm to remove parked domains from search results (here is the official announcement). This may be a reference to that algorithm update.

Google’s algorithm update announcement quote:

“This is a new algorithm for automatically detecting parked domains. Parked domains are placeholder sites with little unique content for our users and are often filled only with ads.
In most cases, we prefer not to show them.”

Although Google no longer granted a boost to parked keyword domains, the notion that keyword domains were superior to branded domains continued to thrive in the search business. There may be a weak signal here, but there is nothing to corroborate that theory.

No search engine research studies have ever established domain keywords as a lasting ranking signal. We live in a period in which heading tags (H1, H2) have lost ranking weight, and current algorithms no longer give a bonus for keywords in title tags. Knowing this, it is fair to question whether Google still gives any rankings boost for a keyword in a domain.

4. Keyword domains rank no better than branded domains.
Another declaration invalidates the notion that domain names with keywords have an advantage in search rankings. According to John Mueller, the keywords in a domain have no bearing on its current ranking:

“…it’s kind of normal that they would rank for those keywords and that they happen to have them in their domain name is kind of unrelated to their current ranking.”

According to Mueller, domain names containing keywords are not correlated with higher rankings.

Before Choosing A Domain Name: Do Some Research

You should always research domain names to see if they have been registered before or what they were used for. In rare instances, a domain used to spam may become stuck in a Google algorithm loop. The domain is banned for a month, then released for a few days before being banned once more, which prevents the site from ranking above the second page of search results.

Keyword Domains SEO Advantage


As Mueller points out, having a keyword in a domain name can provide several advantages. However, SEO advantages are not necessarily among them, as he indicates.

“…that they happen to have them in their domain name is kind of unrelated to their current ranking.”

Make Your Domain Stand Out

You can make a domain name stand out by emphasizing either a keyword or a brand name. In a 2011 webmaster help video, former Googler Matt Cutts suggested that a distinctive domain name can be advantageous in certain situations.

Matt advised:

“For example, if you have 15 sites about Android and they all have Android, Android, Android, Android, it’s going to be a little hard to remember to rise above the noise, to rise above the din.
Whereas, if you have something that’s a little more brandable, then people are going to remember that. They’re going to be able to come back to it. Even sites like TechCrunch, nothing in there says tech news.”

Takeaway on Domain Names

The choice of domain names can be advantageous or disadvantageous depending on the site’s needs. If the business wants to expand to encompass a broader range of topics, then a less specific domain name or even one not associated with the company’s brand is a good choice.

The business should consider its present objectives, the message the domain name conveys to visitors, its story, and how well the name fits the business’s future. Since using a keyword in a domain name doesn’t guarantee rankings, selecting one is a little more straightforward. Starting with a niche-specific domain name is fine, but it may make other websites hesitant to link to your site, or cost you fans, once you expand beyond that niche.

If you enjoyed this article, you might also like Domain vs. URL; What’s the Difference?

What’s Your SEO Score?

Enter the URL of any landing page or blog article and see how optimized it is for one keyword or phrase.


Everyone loves to talk about link building and learn about the latest and greatest techniques for generating backlinks to their blogs. While link building increases visitors and improves search engine results, it can distract from an equally vital part of your blog: internal (or onsite) SEO. 

Not only can effective internal SEO help increase your results by making it easier for Google’s crawlers to reach your pages, but it also allows real people to explore your site and, ideally, stay longer. 

Internal linking of your blog pages is an important aspect of onsite SEO, and this tutorial will go into the subtleties of interlinking your blog pages. 

Why Interlink?


The main purpose of interlinking your blog posts is to allow search engines to easily crawl and index all of your pages and to understand your site’s structure. A clear structure ensures that all of your pages are indexed, allowing them to be matched to search queries. 

Because Google wants to provide the finest experience and content for its users, the quality of your site’s layout is taken into account when determining rankings. 

Perks Other than SEO

There are a lot of blog entries out there that claim interlinking matters only for SEO; however, this is simply not true. 

Internal links make it easier for visitors to traverse your site and find additional information. Consider someone finding one of your posts via a Google search. They could read it, obtain the information they need, and then leave without a second thought. When you include anchor text links in your blog that direct visitors to additional relevant pieces, your readers suddenly spend more time on your site instead of just a few minutes. That dramatically raises the likelihood of them subscribing or returning. 

Bounce Rate

Keeping visitors on your site longer reduces your bounce rate, which matters because Google uses engagement signals like this to judge whether your blog has good content. Google notices when searchers enter a post, look at a single page without digging deeper, and then leave. If this happens frequently on your site, Google will conclude that visitors aren’t finding it useful, and your rankings will suffer as a result. 


Internal Linking Techniques that Work

When linking your blogs internally, keep two things in mind: structure and common sense. For structure, focus on a tiered linking system that works from the top down, starting with the home page. Breadcrumbs are an excellent example of this. 



Hansel and Gretel may be the earliest depiction of internet readers that we have. They were both so easily distracted that, on their forays into the woods, they couldn’t remember how to get home. Your blog’s readers are similar (in some ways), but they can’t leave their own breadcrumbs to find their way back, so you have to assist them. 

Breadcrumbs indicate the various tiers and landing pages that led you to your current location. If you get too far down the rabbit hole, a simple click on any crumb takes you back to a higher-level page. Breadcrumbs can be added to your site using a variety of plugins, resulting in natural internal links on all of your pages. Yoast’s breadcrumb plugin for WordPress is the best I’ve discovered. 
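Beyond the visible trail a plugin renders, breadcrumbs can also be described to search engines with structured data. Here is a minimal JSON-LD sketch using schema.org's BreadcrumbList type; the page names and URLs are hypothetical:

```html
<!-- Hypothetical example: a two-level breadcrumb trail marked up in JSON-LD -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Blog",
      "item": "https://www.example.com/blog/"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Internal Linking Guide",
      "item": "https://www.example.com/blog/internal-linking/"
    }
  ]
}
</script>
```

A breadcrumb plugin will typically generate markup like this for you; the point is that the tiered structure you build is something crawlers can read explicitly, not just infer.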

Natural Links

You can start linking between your blog posts once you have a tiered structure in place that correctly links all of your primary pages. This keeps individual posts from falling through the cracks and ensures that everything is indexed in the SERPs. 

There are tools that can help you link naturally between blog entries, the finest of which is SEO Smart Links, a WordPress plugin that matches keywords to tags and titles and creates links between them automatically. 

SEO Smart Links can be a useful tool for larger blogs where you might forget about some posts or have a lot of stuff to post. However, if you have a smaller blog, manually linking between entries is pretty simple. 

Make the anchor text relevant and keyword-specific so that crawlers and users can tell what kind of page the link leads to. This helps with SERP rankings, as well as click-through rates and indexing. 
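To illustrate the difference, compare a generic anchor with a descriptive one (the URLs and page topics here are hypothetical):

```html
<!-- Generic anchor: tells crawlers and readers nothing about the target page -->
<a href="https://www.example.com/blog/clean-camera-sensor/">click here</a>

<!-- Descriptive, keyword-relevant anchor: signals what the linked page covers -->
Learn <a href="https://www.example.com/blog/clean-camera-sensor/">how to clean
a camera sensor</a> before your next shoot.
```

The second version tells both Google and the reader exactly what to expect before the click.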

Don't Go Overboard

When it comes to interlinking, it’s crucial not to go overboard. Attempting to influence the SERPs by adding hundreds of exact match keyword anchor texts to your landing page appears to be spammy and will result in your blog being penalized. Rather than trying to force your text around the keywords, keep it natural and mix up your keywords to fit naturally within sentences.


Control the Situation

To keep everything in order for users and crawlers, it’s critical to stick to the tiered system of internal linking. Consider using a structure that is similar to the classic pyramid structure, with the Home Page at the top and everything flowing down from there. Linking to and from landing pages, blog posts, the about page, contact pages, and anything else you have can rapidly become a jumble, and you could be leading the crawlers on a wild goose chase as they try to figure out what’s going on with your site. 

Keep it simple for them, and they’ll reward you with higher ranks and faster indexing, as well as more exploration and interaction from your readers. 

It All Begins With Excellent Content

Of course, having the strongest internal linking structure is useless if you don’t have compelling content to keep users engaged. Any aspect of SEO should never take precedence over the quality of your content, but it may be leveraged to elevate good material to new heights.

If you enjoyed this article, you may also like: Improving User Experience


SEO Trends for 2022

In 2022, websites will need to provide more competitive, higher-quality content for search engines to index and rank them well. As a result of this shift in SEO, new software and services are being developed to help scale these content-quality demands.

User experience, semantic search, user intent, schema markup, video SEO, multichannel digital marketing strategies, and other techniques are all becoming more important in SEO for 2022.

Search Engine Optimization (SEO)


With each Google core update, semantic search optimization becomes a more important SEO strategy. The days of optimizing a web page solely for a set of keywords and ranking well in the SERPs are over. With semantic search, things are changing.

Google’s algorithm now examines each search query in order to determine the intent behind it and display the appropriate web pages in the SERPs.

Comparing Google search results for the phrases “air conditioner” and “best air conditioner” is a common example of semantic search in action.

Because the objective is not clearly specified, the former will display a variety of distinct pages (e.g., eCommerce pages, Wikipedia, review content). However, the latter will mostly display review pages that list the top air conditioners in the opinion of the site owner (e.g., Bob Vila, The Wire Cutter, Consumer Reports).

Optimization of Entities

Instead of relying on keywords, entity-based SEO focuses on the entities inside your content. An entity is a singular, unique, well-defined, and distinguishable thing or concept.

The issue with focusing solely on keywords is that they can be ambiguous. The term “cache” might refer to a web browser’s stored files, a hidden store of weapons, or a place where an animal keeps food, for example. As a result, a high keyword density for the term “cache” on a web page isn’t necessarily the most effective way to optimize content for search engines.

Instead, put related entities on the page so that Google’s algorithm can better comprehend the relationship between the various phrases and themes in the content so that it can index and rank it appropriately.

Relatedness is determined mostly by co-occurrence: the frequency with which entities appear together in documents across the web. The more frequently two entities are cited together, and the more authoritative the documents that cite them, the stronger the relationship.

As a result, a growing trend in SEO is to incorporate closely related entities in your content. Any Wikipedia page demonstrates this in practice: every internal link points to a distinct entity related to the topic at hand.

For example, to ensure Google knows a web page is focused on Search Engine Optimization and not some other entity, include related entities on the page, such as “website traffic,” “organic,” “keywords,” “algorithms,” and so on.

Image Optimization

Image optimization is becoming increasingly important.

When a page ranks in Google Image Search, both the written content and the images should be optimized for visual searches. Optimizing every image you post for SEO by filling in alt text, descriptive filenames, HTML titles, captions, and structured data is a significant opportunity in the future of SEO.
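As a sketch of what that looks like in markup, a well-optimized image carries a descriptive filename, alt text, title, and caption (the filename and text below are hypothetical):

```html
<!-- Hypothetical example: descriptive filename, alt text, title, and caption -->
<figure>
  <img
    src="/images/portable-air-conditioner-8000-btu.jpg"
    alt="White 8,000 BTU portable air conditioner cooling a living room"
    title="Portable air conditioner"
    width="1200"
    height="800"
  />
  <figcaption>An 8,000 BTU portable unit suited to rooms up to 350 sq. ft.</figcaption>
</figure>
```

Each field gives Google another text signal to match the image against visual searches.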

More updates to Google's Algorithm

Over the last year, Google released several major core updates, multiple indexing changes, and between 500 and 600 minor improvements. So, based on past SEO patterns, it’s safe to assume that Google will update its algorithm again in the near future.


Voice Search

According to Think With Google, voice searches account for 20% of all Google App queries. According to Statista, smart home solutions such as Google Assistant, Apple Siri, and Amazon Alexa are used in over 300 million households.

Voice search is a rapidly growing trend in SEO that you should be considering for your website. To optimize for voice search, start with the long-tail questions your target audience might ask in ordinary conversation. Then, using those questions as headings, write a short 45-55 word response.
Following that strategy makes it much easier for smart devices to pick up your web page content and respond to voice searches.
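Structurally, the pattern is simple: the long-tail question becomes a heading, followed by a concise answer. A hypothetical sketch:

```html
<!-- Hypothetical example: question-style heading with a roughly 50-word answer -->
<h2>How often should I water a snake plant?</h2>
<p>
  Water a snake plant roughly every two to three weeks, letting the soil dry
  out completely between waterings. In winter, cut back to about once a month.
  Overwatering is the most common cause of problems, so when in doubt, wait a
  few extra days before reaching for the watering can.
</p>
```

A self-contained answer of this length is easy for an assistant to read aloud verbatim.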

Changes in SERP Layout

Google’s SERP structure evolves in response to changing user needs. And a growing trend in this area is to give visitors more information without requiring them to click on a website link or scroll down the results page.

With Featured Snippets, larger and more prominent Knowledge Panels, enhanced featured news for trending topics, more video content, additional sitelinks, testing of indented search results, and more, SEO experts foresee a surge in zero-click searches.

In Closing

If you enjoyed this post, you may also like: Digital Marketing; What Can PushLeads Do for You?


Google Discover

Google Discover is a curated feed for mobile devices that delivers articles and videos. It was released in 2016 under the name Google Feed and has since been renamed Google Discover.

Google Discover has attracted around 800 million monthly active users since its 2018 relaunch.

It’s available through the Google mobile app for Android, iPhones, and iPads, as well as in your phone’s browser at google.com.

In this format, Google chooses articles for each user based on their demonstrated interests, recent search activity, and location.

How Does Google Discover Work?


Google Discover works in the same way as AI-powered social sites like TikTok, where an algorithm tailors a user’s feed to their preferences.
These tailored results are updated on a regular basis with new content recommendations based on:

  • Search history and search activity
  • Contacts and apps information
  • Location preferences and previous visits
  • Newly released material

How Do You Make Google Discover Work for You?

As with every Google-related rollout, SEOs everywhere are naturally seeking to understand it and, more crucially, optimize for it. Even though the first version of Google Discover was published in 2016, there are still a lot of questions about what it means for SEO.

However, optimizing for it requires expertise.

Above all, continue to create content with the user (rather than Google bots) in mind.

How Does Google Discover Present Content?

A piece of content must meet the following requirements to appear in the feed:

  • Be indexed by Google
  • Comply with Google’s policies for Search features

How Can Content Be Optimized for It?

Because Google Discover mostly follows traditional ranking criteria, you will naturally optimize for it when you develop SEO-optimized content. There are, nevertheless, a few optimization strategies that are particularly critical for appearing in a user’s feed.

E-A-T should be prioritized.

  • Expertise
  • Authoritativeness
  • Trustworthiness

The foundation of good SEO is E-A-T.
You won’t be able to rank without these three components. If you continue to build your SEO strategy on these three pillars, Google will reward your content by allowing it to appear.

  • Clickbait-like titles should be avoided.
    Anything that smacks of clickbait or spam will be automatically rejected by Google. A good title is straightforward and accurately reflects the topic of the post without overpromising or exaggerating.
  • Include high-resolution photos
    Discover doesn’t show meta descriptions; photographs tell the story instead. (A picture is worth a thousand words, or 120–150 characters.) Because Discover relies primarily on visuals to describe article topics, using captivating photos in your content is essential for success.

A page must include a picture that meets the following requirements in order to be displayed:

  • A minimum width of 1200px
  • Large previews enabled via the max-image-preview:large setting
  • Not the site logo
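The large-preview setting above is a one-line robots meta tag placed in each page's head; large images alone aren't enough without it:

```html
<!-- Opt in to large image previews in Google Discover and Search results -->
<meta name="robots" content="max-image-preview:large">
```

Many CMS SEO plugins add this tag by default, so it's worth checking your page source before adding it manually.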

Mobile-Friendly Design

We already know that mobile optimization is a significant ranking element in search. Because Google Discover is a mobile-only feature, it puts an even greater focus on this component. To see if your site is optimized for Google Discover, use Google’s Mobile-Friendliness Test.

Produce Consistent Content

SEOs aim to stay on top of industry news in order to predict trends and create relevant content when users are seeking it. Because Google Discover strives to put timely content at consumers’ fingertips, it has the potential to speed up the transmission of news and trends. After all, AI is significantly better than the average SEO at detecting patterns.

With this in mind, staying on top of current trends and taking advantage of content opportunities is critical if you want to appear in Google Discover. This emphasizes the significance of constantly tweaking current content to keep it relevant.


Finally, should your content be optimized?

One study found that news sites made up 46% of a sample size of Google Discover URLs, while Ecommerce made up 44%.

Meanwhile, the following industries account for barely 1-2 percent of the URLs found in Google Discover:

  • Health
  • Education
  • Finance
  • Travel
  • B2B
  • Automotive

Even if your site is in one of these fields, Google Search will still drive the majority of your traffic. Keep in mind, though, that even 1% can amount to 1 million clicks every year.

In Closing

Optimizing for Google Discover is part of what we do at PushLeads, so if you need any help with your content marketing campaign, we’re here to help! 

If you enjoyed this article, you may also like: Should You Stop Doing SEO?
