You might wonder – do I have enough internal links? Or too many? When planning an internal linking strategy, your top question should be “What links will my readers click on?” Understanding what your readers need will help you determine which related internal content to link to.
Google may consider a high ratio of links on a page as spammy – if the links use misleading, irrelevant anchor text or are unrelated to the page topic.
Rather than thinking of a set minimum or maximum links per page, your guideline should be: Will this internal link benefit my site visitors? If a link won’t help the user find what they want, don’t include it.
Remember, link value is shared between all the links on a given page, so adding an excessive number of links will dilute link equity. Aim to keep the number of internal links per page under 100.
Sometimes, your page may contain links that don’t need to count toward its ranking. Tell Google to disregard these links in places such as comment sections by adding nofollow attributes to them.
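If you generate your comment markup yourself, this step can be scripted. Below is a minimal Python sketch using BeautifulSoup; the “comments” class name is an assumption, so adjust it to your own markup:

```python
# Minimal sketch: add rel="nofollow" to every link inside a comment section.
# Assumes the comments live in a container with class="comments" -- adjust for your markup.
from bs4 import BeautifulSoup

html = """
<div class="comments">
  <p>Great post! Visit <a href="https://example.com/some-page">my site</a>.</p>
</div>
"""

soup = BeautifulSoup(html, "html.parser")
for link in soup.select(".comments a"):
    link["rel"] = "nofollow"  # tells search engines not to count this link

print(soup)
```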
Internal Links per Page or Post
When it comes to knowing how many internal links your website needs, nobody really knows. Google’s guidelines, which change often anyway, currently only say to “keep the links on a given page to a reasonable number”. The problem is that Google never says what a “reasonable number” is, so everybody is winging it. Even some of the biggest experts in the game have no idea whether that number means a few, a few dozen, or a hundred or more.
In simpler terms, it really depends on your page. When it comes to your content only (meaning exclude your header, your navigation bar, your footer, etc.), you should only have a few internal links on the page. Don’t overuse them but remember to add enough that your page is informative for the user. For example, if you write about 1500 words for a content piece, it should include an internal link every 400-500 words for the best results. However, it’s best to keep track of the best practices for internal linking since Google does change its algorithms often.
Think of your website as the ocean. Rivers and streams run off the mountains into the ocean. As they do, the ocean has more minerals and water. Now, imagine that we started draining the oceans little by little — assuming all of that water actually had somewhere else to go. The oceans would slowly get smaller again.
There is a similar flow through your website: external links fill the site with authority. But they do so by linking to specific pages, and those pages become more authoritative. Then, as you begin linking out from such a page through internal links, you transfer some of that authority to other pages.
It’s not hard to see how internal links help SEO, so we don’t want to eliminate them altogether. But we should be careful not to create so many internal links on our most important landing pages that we drain them of link equity.
To achieve this, streamline your navigation on landing pages, and make sure each internal link really adds value to the customer experience. When customers click a link and spend time on the next page, they demonstrate to search engines that the link adds actual value.
How many internal links is too many? It’s not easy to state a specific number, but if you’re asking yourself, “Am I adding value?”, it becomes clearer how many links make sense.
SEO isn’t specific to a particular type of content or website – it’s a set of practices aimed at ranking higher in the search engines. It requires a lot of geeky technical work, like an understanding of redirects, HTML and web server technology.
A successful SEO campaign still starts with some important on-page SEO factors that you must optimize before you do anything else.
So, what are the most important on-page techniques you need to focus on?
Fulfill Search Queries
– Create new content or optimize existing content around your audience’s needs. This helps you gain SERP visibility, attract more qualified traffic and build trust.
– Make sure to have an H1 tag on every page of your website. Most CMSs automatically wrap your title in an <h1> tag, but some do not. The H1 tag is a must on any page because it helps search engines understand what your page is all about.
Optimize the URL slug:
– Keep it as short as possible (4 words at most) – this makes it easy for users to understand and remember, and it can also improve your CTR.
– Try to include the keyword in the URL as well – it will help with on-page optimization.
If your page has already been published for a while, do not change the URL, especially if it’s already ranking in the SERPs or if other pages already link to it. Doing this would mean you are migrating your URL, and it’s best to avoid that in most cases.
Optimize the Page URL
Have an instance of your focus keyword in the URL, without using any special characters, symbols, commas, etc.
Use hyphens (instead of underscores) to separate words. This keeps the URL clean and makes it easier for users to guess what’s on the page.
In addition, opt for a user-friendly URL structure for your entire website. Something that both search engines and the user can remember and relate to, but without any compromise to your business goals.
For example, a permalink structure like ‘yourdomain/this-is-test-post’ is preferred by many websites, but if you run a news website you may want to follow a date-based structure like ‘yourdomain/2019/08/15/this-is-a-test-post’.
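As a simple illustration (not a prescribed tool), here is a short Python sketch that turns a working title into a lowercase, hyphen-separated slug capped at four words:

```python
import re

def slugify(title: str, max_words: int = 4) -> str:
    """Turn a title into a short, lowercase, hyphen-separated URL slug."""
    words = re.sub(r"[^a-z0-9\s-]", "", title.lower()).split()
    return "-".join(words[:max_words])

print(slugify("Peanut Butter Banana Smoothie Recipe"))  # -> peanut-butter-banana-smoothie
```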
Optimize the meta description
– Include the target keyword in this description.
– Remember, the meta description should be under 230 characters—anything above that will be truncated by Google in the SERPs.
– Same as with the page titles, keywords are not everything. Your meta description should be compelling and tell readers exactly what information will be provided on the page.
– While meta descriptions don’t have a direct impact on rankings, they will increase the click-through rate and that is a ranking factor.
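If you maintain many pages, a few lines of Python can sanity-check descriptions in bulk. The sketch below simply mirrors the length and keyword guidelines above (the 230-character limit is taken from this article, not a hard rule):

```python
def check_meta_description(description: str, keyword: str, max_length: int = 230) -> list:
    """Return warnings for a meta description, based on the guidelines above."""
    warnings = []
    if len(description) > max_length:
        warnings.append(f"Too long ({len(description)} chars): may be truncated in the SERPs")
    if keyword.lower() not in description.lower():
        warnings.append(f"Target keyword '{keyword}' is missing")
    return warnings

print(check_meta_description(
    "Learn how to write compelling meta descriptions that earn more clicks.",
    keyword="meta descriptions",
))
```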
Optimize the images
The effort you put into optimizing the images on your most important promoted pages will pay off in spades. At a minimum, you should include the alt attribute.
Here’s how optimized images can help you:
– they influence the ranking of the promoted page;
– they appear in image search results;
– they attract more traffic to the site.
Compress images to improve the loading time of the website, and supply them with alt text. Search engines use alt text to identify the content of the web page, so it’s a great way to make the website more accessible and improve its ranking.
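A quick way to audit a page you control for missing alt text is a small script; the sketch below uses requests and BeautifulSoup, and the URL is only a placeholder:

```python
# Sketch: list <img> tags that are missing alt text on a page.
# "https://example.com" is a placeholder -- point it at your own page.
import requests
from bs4 import BeautifulSoup

response = requests.get("https://example.com")
soup = BeautifulSoup(response.text, "html.parser")

for img in soup.find_all("img"):
    if not img.get("alt"):
        print("Missing alt text:", img.get("src"))
```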
Important On-Page Optimization Techniques
Create Trust & Engagement Through UI, UX, and Branding
– Improve your website performance. Website performance metrics like page load speed are part of UX. Make sure to research and improve them to boost conversions and keep your visitors satisfied.
– Responsive Web Design. Back in 2015 Google started penalizing mobile-unfriendly websites. The number of mobile visitors is growing each year and today RWD is a must.
– Build trust through UI, visuals, navigation, and branding – all of these pieces determine whether your website looks trustworthy.
Include social media sharing buttons. Help your visitors save and share your content across the Web.
Include your focus keyword
Remember, on-page optimization is not about gaming the system. It’s about sending the right signals, both to the user and to the search engine.
Essentially, on-page SEO is all about optimizing your content to answer a particular user query.
For that, use strings (technically called keywords) that relate to the user query.
Use a combination of exact match keywords and related keywords, but don’t overdo that.
Ideally, an exact-match keyword density of 1.5 to 2%, sprinkled with a few more LSI keywords, is enough to send the right signals to the search engines.
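Keyword density here is just the number of exact-match occurrences divided by the total word count, times 100. A rough Python sketch for checking a draft (it deliberately ignores stemming and close variants):

```python
def keyword_density(text: str, keyword: str) -> float:
    """Exact-match keyword occurrences as a percentage of total words."""
    words = text.lower().split()
    occurrences = text.lower().count(keyword.lower())
    return round(occurrences / len(words) * 100, 2) if words else 0.0

draft = "Internal linking helps SEO because internal linking spreads link equity across your pages."
print(keyword_density(draft, "internal linking"))  # 2 occurrences in 13 words -> well above 2%
```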
Optimize the page content
Now that you have optimized the meta data supporting your page or blog post, it’s time to move on to optimizing the actual content on it.
Here are the steps you need to follow to do this:
– Try to include the keyword in the h1 heading, but do not force this. Again, it is far better to publish natural (rather than keyword-stuffed) content.
– Make sure your page or blog post has an h1, but remember that there should be only one h1, and it should be above the fold. Typically, your h1 will be the actual title of the blog post or page.
– Same as with the meta tags optimization, focus on creating an attractive, compelling h1, rather than something that feels built exclusively for Google’s crawlers.
– You can use CoSchedule’s Headline Analyzer to analyze your headline.
Optimize the content in the body of the page.
– Try to include your target keyword in the first 100 words of the page or blog post.
– In general, avoid including the exact target keyword more than 3-4 times/page.
– Add other keywords from the same keyword bucket in the body of your content. This will help Google contextualize your page or blog article, so that it shows it to users searching for the information you provide.
– Try to add synonyms to your target keyword as well. This is an excellent move not only because it will help Google contextualize your content, but also because it will help you avoid using the exact target keyword too many times.
– Include LSI (Latent Semantic Indexing) keywords too. These keywords are semantically related to your target keyword. To find more LSI keywords, go to https://lsigraph.com, enter your target keyword and pick the most relevant suggestions to include in the body of your page content.
Here you’ll find out how to optimize images for the web so that your website stays fast and SEO-friendly.
Choose the Right File Format
Before you start modifying your images, make sure you’ve chosen the best file type. There are several types of files you can use:
PNG – produces higher-quality images, but also larger file sizes. PNG is a lossless format, although optimization tools can apply lossy techniques (such as color quantization) to shrink PNG files.
JPEG – uses lossy compression, with optional lossless optimization steps. You can adjust the quality level for a good balance of quality and file size.
GIF – limited to 256 colors and uses lossless compression. It’s the best choice for animated images.
There are several others, such as JPEG XR and WebP, but they’re not universally supported by all browsers. Ideally, you should use JPEG or JPG for images with lots of color and PNG for simple images.
Use descriptive filenames
Before we talk about naming your files, let’s talk a little SEO and planning. You ARE doing your keyword research BEFORE your post – correct? If not, now is the time to start.
Here’s a post I wrote about How To Plan Blog Posts that Google Loves that I thought would help you out. You need to make sure you know what keyphrase you are trying to rank for as well as similar (but different phrases) so that you can optimize your photos/images around them.
Are you uploading photos named DSC0001.jpg or maybe wiaw-5.jpg? If so, you are losing a fantastic opportunity to optimize for SEO. You want to give your photos and images descriptive file names. This will help search engines readily understand what your images (and ultimately your blog post) are about.
Let’s say you are writing a blog post about a peanut butter banana smoothie recipe. You could name your images:
peanut-butter-banana-smoothie.jpg
peanut-butter-banana-smoothie-recipe.jpg
the-best-peanut-butter-banana-smoothie.jpg
chocolate-peanut-butter-banana-smoothie.jpg
The idea is to name your files using your keyphrase and variations of it for the different images.
Since chances are you have been blogging for quite a while and haven’t done this for all your photos, I don’t want you to stress over this. I would prefer you learn what to do and start doing this for all your posts from here forward. Then as you update and optimize old posts, you can take care of the photos then.
Make Sure You Name Your Images Appropriately
Yes, even the file names matter.
If you name your file in a descriptive way, you’ll help Google identify the object in the image more easily.
Let’s say that you have an image of a dog.
Then name it “dog.jpg” or “my-new-dog.jpg” – anything is better than “untitled-1.jpg”.
It’s a good idea to use keywords in your file names.
But remember that a file name should be short, so don’t go overboard with it.
It should make sense.
Use Images of the Right Size
While images are a must on your website and in your blog posts, they are also the main reason behind slow loading speeds.
It’s for this reason that it’s important to make your images’ dimensions (width and height) fit your needs.
A good practice is not to make them a lot bigger than you need them to be.
You may be asking, but doesn’t the browser fit the image to the required size?
The answer is yes it does.
But the problem is that the browser still has to load the full-sized image, even if it shows it only in a width of 500 px.
On my blog, every image (except for the featured one) is 800 px wide, never more.
This can be different for you – it’s your call – but remember that the bigger the image (pixel-wise), the bigger the file size.
And with that in mind, the browser needs more time to load it.
You can resize the images in Photoshop or any of these free photo editors.
If that is not enough or you just want to resize them in bulk, here is a great tool to do so.
Just remember that you need to change the width.
The height will change automatically.
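If you prefer to script this instead of using an online tool, here is a minimal Pillow sketch (file names are placeholders) that fixes the width at 800 px and lets the height scale proportionally:

```python
# Sketch: resize an image to 800 px wide, keeping the aspect ratio.
# "original.jpg" and "resized.jpg" are placeholder file names.
from PIL import Image

TARGET_WIDTH = 800

img = Image.open("original.jpg")
width, height = img.size
new_height = round(height * TARGET_WIDTH / width)  # height follows the width automatically

img.resize((TARGET_WIDTH, new_height), Image.LANCZOS).save("resized.jpg")
```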
Use Images As Citations
If you don’t know what I am talking about, don’t worry.
Citations are mentions of your business that can help you rank in local SEO.
And a good thing is that you can embed this data into images and then use them as citations on platforms where you can publish them.
Use your NAP (name, address, phone number) and keep it consistent to improve your local SEO.
The tools above can be easily used for this purpose too.
Just don’t upload them to your website.
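One hedged way to do this in Python is to write your NAP into the image’s EXIF description before publishing it elsewhere. The sketch below assumes the third-party piexif library, and the business details and file name are placeholders:

```python
# Sketch: embed NAP (name, address, phone) into a JPEG's EXIF description field.
# Assumes the piexif library (pip install piexif); the NAP string and file name are placeholders.
import piexif

nap = "Example Bakery | 123 Main St, Springfield | (555) 010-0000"

exif_bytes = piexif.dump({"0th": {piexif.ImageIFD.ImageDescription: nap.encode("utf-8")}})
piexif.insert(exif_bytes, "storefront.jpg")  # writes the EXIF data into the file in place
```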
Create Descriptive Image Captions
Image captions are one of the image SEO best practices.
What’s more important is that they are visible to your visitor.
They can give a more detailed context of an image and provide a better user experience.
Many experts even say that including captions can decrease bounce rate.
The thing is, we don’t always read the full article, but we are drawn to captions to better understand it.
So it’s a good idea to have descriptive captions included to better illustrate what the image is about.
They also give extra insights to search engines to better understand the image.
Reduce the File Size
Now that you have reduced the image dimensions (which already trims some file size), it’s time to reduce the file size itself.
To do that you can again use Photoshop or Gimp and combine file size reduction with image resizing.
If you don’t have any of the tools or you are not comfortable using them, there are other tools to use.
One thing to note when reducing the file size is that you are also reducing the quality of the image.
But don’t be afraid to do it.
The results will most likely still be a great-looking image with a big size reduction.
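If Photoshop or GIMP isn’t handy, Pillow can also handle the lossy re-save; quality=80 below is just a common starting point to experiment with, not a rule:

```python
# Sketch: re-save a JPEG at a lower quality setting to shrink the file size.
# File names are placeholders; try different quality values and compare the results.
from PIL import Image

img = Image.open("resized.jpg")
img.save("compressed.jpg", format="JPEG", quality=80, optimize=True)
```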
As you can see, our new SEO Image Optimizer is going to become an indispensable tool for optimizing your product pages, blog posts, and other web pages of your eCommerce site. So, grab this add-on and make the most of it if you:
– Don’t want most of your potential customers to leave your store because of low page loading speed
– Wish to drive more traffic to your website
– Have a strong desire to protect your product images from being stolen by your competitors
There’s no doubt that Panda blew a lot of websites out of the water. From spammy articles and keyword-stuffed content to purchased affiliate links, many website owners found themselves on the wrong side of Panda’s algorithm updates.
As a result, many myths started swirling around and some of them are still in existence. Some of the myths about Panda that you shouldn’t waste your time on include:
There will be another Panda update soon – While there were various Panda updates in the past, the last one was rolled out in 2016. Google chose to integrate Panda as a core part of its search engine and has never published another standalone update. In essence, no new Panda update is likely to come anytime soon. You should therefore focus your energy on making your website Panda-friendly by following the existing guidelines.
Duplicate content filter is part of Panda – Even though your website can face a Panda penalty for duplicate content, Panda isn’t solely aimed at duplicate content. In fact, the duplicate content filter and Panda are two separate and independent tools.
Too much UGC will be penalised – User-generated content such as guest blog posts used to be, and continues to be, an important part of the web. So, do not be misled into thinking that publishing guest blog posts will attract Panda. All you have to do is ensure that they’re high quality and actually contain information users want to read.
There are two ways your site can be affected by Google penalties: a Manual Action or an Algorithm Penalty.
In most cases, detecting Manual Action penalties is much easier. This is because Google is nice and polite enough to send you a message to notify you about the penalty. Manual penalty notices are issued through Google Search Console.
On the other hand, algorithmic penalties are quite hard to detect because Google will not notify you about such penalties. You will have to find them by yourself, or better yet find a reliable SEO agency or consultant to do it for you.
Here are two ways to determine whether your website may have suffered from a penalty:
Traffic & ranking drops – Your first clue will be a dramatic drop in traffic. If you have been impacted, your visibility will not improve and traffic will not return until you fix the underlying issue on your website. And because Panda is an algorithmic penalty, even once you’ve resolved the underlying issue(s), you’ll have to wait for the algorithm to run again for the penalty to be lifted and your rankings to return, which could take several months. Head into Google Analytics and take a look at your Google traffic and rankings. If you notice a massive, sudden drop in traffic of upwards of 75%, this might be an indication that your site has been subjected to a Panda penalty. Keep in mind, however, that other things such as rising competitors, manual penalties and normal seasonal drops in consumer interest can also cause such drops.
Drop in phone calls & leads – If you’ve noticed a severe drop in email leads or phone calls within a short time frame, that may also be the result of a penalty. Most websites generate a large portion of their total traffic through search engines such as Google, so if you’ve noticed a significant decrease in leads and calls, you may have been impacted.
If either of the two points above applies to your situation, apply the following steps to confirm whether it was Panda:
Identify – Check your website stats and jot down all major dates where you’ve noticed a significant decline in calls, leads or organic traffic.
Cross-reference – Google algorithm tracking tools such as MozCast and Dejan SEO’s Algoroo may show big spikes during major SERP changes. Cross-reference this data with your own traffic stats to see whether they align. Additionally, Moz’s Algorithm Change History and Search Engine Land’s algorithm update page are great resources for checking whether a new update has rolled out.
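That cross-referencing step is easy to script once you have exported your daily organic sessions. The sketch below is only an illustration: the CSV file, its column names, and the update dates are placeholder assumptions you would replace with your own export and with dates taken from the resources above.

```python
# Sketch: flag sharp day-over-day traffic drops that fall near known algorithm update dates.
# "organic_traffic.csv" (columns: date, sessions) and the update dates are placeholders.
import pandas as pd

updates = pd.to_datetime(["2016-01-10", "2016-09-23"])  # replace with real update dates

df = pd.read_csv("organic_traffic.csv", parse_dates=["date"]).sort_values("date")
df["pct_change"] = df["sessions"].pct_change() * 100

for _, row in df[df["pct_change"] < -30].iterrows():  # days with a 30%+ drop
    near_update = any(abs((row["date"] - u).days) <= 7 for u in updates)
    print(row["date"].date(), f"{row['pct_change']:.1f}%",
          "(near a known update)" if near_update else "")
```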
Regardless of what triggered the penalty, and whether it’s for a personal site or a client’s, you need to be able to diagnose what caused it and fix it. In most cases, you can get almost all of your search traffic back in the short-term.
If you’re ready to get rid of any Google Penguin penalties holding back your organic search traffic, or you’d just like to prepare for future problems, let’s get started.
How Can I Discover if I Have Been Penalized by the Google Penguin Algorithm?
If Google determines that your site contains spam backlinks, one of these actions may apply to your site. Many SEOs simply call this a penalty because it lowers your Google rankings and leads to traffic loss.
For example, in 2013 Google penalized Rap Genius (now Genius) for participating in a link scheme. We tracked their rankings and found that a dozen keywords we followed had dropped 4 to 6 pages on Google.
This was a widely publicized Google penalty that drew a lot of attention to Google Penguin and Rap Genius.
Since then, much has changed in how Google penalizes a website under the Penguin algorithm. For example, Google has (in the past) penalized complete websites.
In 2016, Google announced that the penalties would be more precise. Gary Illyes of Google said that Google Penguin is much “nicer” because:
It now updates in real time.
It ignores (devalues) spam instead of using it as a ranking factor against the site.
It is more granular and will not punish entire sites as often.
If the Google Penguin algorithm penalizes your website, your rankings will drop, and as a result your traffic will drop too.
Links will continue to play a role in SEO despite the changes brought about by Google’s Penguin 4.0 update, so websites should continue to include them in their ranking efforts. But as with any search ranking effort, link building cannot be the only strategy, because numerous links without any accompanying online buzz are likely to raise doubts with Google’s crawlers.
If you have made efforts to build online buzz about your website and have earned links organically through different strategies, you’re likely to get rewarded by the search engine with a higher rank.
Links remain a strong part of SEO, but link building cannot be pursued in a sequestered manner, separate from other strategies. Under the new update, devalued websites can actually rebuild their rankings by simply building high-quality links.
Websites affected by the Penguin update can make changes to recover quickly:
Avoid over optimising anchor texts and websites in general, so they look more natural for Google bots.
Remove low-quality links to your website to ensure Penguin updates don’t penalise you.
Add high-quality links from reputed source sites.
Create a mobile-friendly interface without popups to avoid penalties from Google.
Build organic links through different online marketing strategies.
Build high-value content to ensure better inbound links to your website.
The Penguin 4.0 update is designed to eliminate poor link building efforts by focussing on organic strategies. Websites will want to consider making these changes quickly in order to stay relevant and high up in search rankings.
Matt Cutts said that the Hummingbird algorithm actually affects 90% of all searches, but only to a small degree. So while Panda may have impacted around 10% of searches and Penguin closer to 3%, Hummingbird impacted 90% – though, as Cutts noted, only to a small degree that users should not notice.
How does Hummingbird work?
Well, in short, it’s faster and more precise than previous versions of Google’s algorithm.
One of the deepest changes that makes Hummingbird so different from past versions of the Google algorithm is that it now focuses more on conversational search. Google has said that fewer people are searching with short keyword terms and more are focusing on longer-tail keyword searches that ask for multiple data points simultaneously.
This means Google is ultimately looking less at what each keyword in your query means and more at what your entire query means.
What Does Google Hummingbird Enhance?
In my opinion the Google Hummingbird change signals three pretty clear objectives for Google.
Google wants to communicate conversationally.
Google wants to deliver a seamless experience across all devices.
Google wants to anticipate your needs and answer your questions before you have them (they want to be your assistant).
Let’s break these down one by one.
Google wants to communicate conversationally – When search engines started, the only way we could interact with a search engine was by keyboard. At that time, search engines operated by looking for keywords, and that was pretty much what you would enter into the search box. For example, in the video above, Matt Cutts wants to know how fast a cheetah can run. Instead of searching for, “how fast does a cheetah run?” he searches for, “cheetah running speed.” So in the early days of search, we stripped out unnecessary words from our queries to make it easier for the search engine to match them. Why would Google want to change this? Because today more searchers are using their mobile devices, and the big difference is that you can actually speak to your mobile device. That brings us to the next point.
Google wants to deliver a seamless experience across all devices – As we have already discussed, users are searching more frequently from mobile devices. As a result, Google has made some strategic changes to make sure that users have a seamless experience wherever they access the web. For example, the Google Chrome web browser now works on mobile devices. This means that wherever you surf the web, on any device, if you are logged into Chrome, Google can present personalized search results to you. You can also access your bookmarks and browsing history seamlessly across all devices. Google is also working hard to enhance the Google Now app, which brings us to our final point.
Google wants to be your assistant – Google Now is much more than a search app. Google Now starts to get to know you and learns about your activity. When I wake up every morning, Google Now tells me how many minutes it will take me to drive to the gym. Google Now also has access to my calendar, contacts, location… almost anything I tell my phone. Because of that, Google Now reminds me of deadlines, birthdays, places I have visited, and things I have searched for. Ultimately, Google wants the Google Now app to act like the Star Trek computer. They want you to be able to say, “OK Google” and have Google assist you with any task you can think of. I know I’ve shared this video before, but it is the ultimate example of what Google wants Google Now to be.
The implications of a Hummingbird search world
It is important to remember that this step forward is being described by Google as a new platform.
Like the Caffeine release Google did in June of 2010, the real import of this is yet to come. Google will be able to implement many more capabilities in the future. The implications to search in the long term are potentially huge.
For you as a publisher, the implications are more straightforward. Here are a few things to think about:
1. Will keywords go away?
Not entirely. The language you use is a key part of a semantic analysis of your content.
Hopefully, you abandoned the idea of using the same phrases over and over again in your content a long time ago. It will remain wise to have a straightforward definition of what the page is about in the page title.
I’ll elaborate a bit more on this in point 3 below.
2. Will Google make the long tail of search go away?
Not really. Some of the aspects that trigger long tail type search results may actually be inferred by Google rather than contained in the query. Or they may be in the user’s query itself. Some long tail user queries may also get distilled down to a simpler head term.
There will definitely be shifts here, but the exact path this will take is hard to project. In the long term though, the long tail will be defined by long tail human desires and needs, not keyword strings.
The language you use still matters, because it helps you communicate to users and Google what needs and desires you answer.
3. You need to understand your prospect’s possible intents
That is what Google is trying to do. They are trying to understand the human need, and provide that person with what they need.
Over time, users will be retrained to avoid short, keyword-ese type queries and just say what they want. Note that this evolution is not likely to be rapid, as Google still has a long way to go!
As a publisher, you should focus more attention on building pages for each of the different basic needs and intentions of the potential customers for your products and services. Start mapping those needs and use cases and design your site’s architecture, content, and use of language to address those.
In other words, know your audience. Doing this really well takes work, but it starts with knowing your potential customers or clients and why they might buy what you have to sell, and identifying the information they need first.
4. Semantic relevance is the new king
We used to speak about content being king, and that in some sense is still true, but it is becoming more complex than that now.
You now need to think about content that truly addresses specific wants and needs. Does your content communicate relevance to a specific want or need?
In addition, you can’t overlook the need to communicate your overall authority in a specific topic area. Do you answer the need better than anyone else?
While much of being seen as an authority involves other signals such as links, and perhaps some weight related to social shares and interaction, it also involves creating in-depth content that does more than scratch the surface of a need.
The change brought by Google’s Pigeon algorithm has affected local search listings, and this is visible on the Google Web and Google Maps search results pages. Major changes have been made behind the curtain, and the ripples are beginning to show on the surface. Due to this change, local businesses might have noticed slight increases or decreases in leads generated, website referrals and online business. The new local algorithm ties deeper into Google’s web search capabilities, drawing on the numerous ranking signals used every day in web search along with search features such as spelling correction, Knowledge Graph and synonyms. In addition, the update improves on Google’s existing distance and location ranking parameters.
Who Benefitted From the Pigeon Update?
Not everyone complained after the latest local algorithm update. As previously mentioned, directories seemed to get a boost, and so did certain queries. Analysis using BrightEdge’s massive data set from June to August shows an uptick in the results for queries related to the following:
Hospitality (28 percent growth in Google Places results)
Food (19 percent growth in the Google Places results)
Education (13 percent growth in the Google Places results)
Additional wins occurred in smaller percentages for queries related to:
Spa +4.64 percent
Shop +4.32 percent
Law +3.55 percent
Medical +1.83 percent
Transportation +1.31 percent
Fitness +1.12 percent
Who Experienced a Loss From the Pigeon Update?
According to the analysis of BrightEdge’s data set, queries related to the following topics were the most negatively impacted by Pigeon:
Jobs (68 percent decline in Google Places results)
Real estate (63 percent decline in the Google Places results)
Movies (36 percent decline in the Google Places results)
Insurance (11 percent decline in the Google Places results)
Reports across the Web from multiple sources show real estate queries experiencing dire consequences from Pigeon, and as you can see, the BrightEdge data confirms the same.
Other queries related to the following showed somewhat negligible losses:
Finance -6.21 percent
Furniture -3.34 percent
Government -0.07 percent
The following is a table of the findings. Note that some of the queries were difficult to classify in the analysis, so the industry data by row does not add up to “all” data.
How does “Pigeon” affect you and your business?
1. Fewer queries are showing a local listing pack on SERPs (there are usually 7 listings)
MozCast’s data shows a 23.4% decrease in the queries that are showing a local listing pack.
You may experience a drop in your website’s traffic due to the disappearance of some of your local listings.
2. Local rankings are taking an old-school route
Local search rankings are now being influenced more by traditional Web search ranking signals (domain authority, backlinks and many other SEO ranking factors).
If you see a decline in your local rankings, it may mean your competition’s general website/page SEO characteristics are stronger than yours.
3. Yelp and other well-known local directories are now your new best friend
Data shows that local directories have gotten a major boost in search rankings.
Your official business website or store pages may be displaced by store listings from directories.
4. Local Carousel still has your back
Judging by the looks of it, local carousel results remain unchanged so you’ll still be able to get some extra exposure from it.
Make sure you pick out a nice, high-quality photo for your Google+ business profile because that is the picture that is going to show up in the carousel.
and what else?
According to the first post-update research, Yelp and other local directory sites have seen a considerable boost in rankings. For some queries, the entire SERP (search engine results page) is built from well-known local directories only.
If over the past few days your website has encountered a traffic drop, this may well be due to the disappearance of your local listings on the first page of Google. If that is the case, in the short run you may need to cover the traffic losses with a PPC campaign, and in the long run — focus on getting Web search listings for those keywords.
Google is going back to the more traditional ranking signals such as domain authority, backlinks and all kinds of other SEO rankings factors. Simply speaking, this may mean that local rankings will now be more determined by how well your meta data is structured, the age of your site and how you use your header tags.
The mobile-friendly algorithm is a page-by-page signal, so it can take time for Google to assess each page, and that may be why it will be a gradual rollout. And depending on how fast Google crawls and indexes all of the pages on your site, the impact can be slow to show up.
It is believed that this rollout will have less impact than the original mobile-friendly update, dubbed “Mobilegeddon,” which was expected to have a significant impact on mobile results – though not everyone agreed it had that much of an impact.
If you are not mobile-friendly, or if you want to ensure you are, check the Google mobile-friendly tool, and check Google’s mobile guidelines.
Here’s the guide
Does Your Website Pass the Mobile-Friendliness Test?
Google is making it simple for companies to test their website and ensure that you have a mobile-friendly site. They have created a tool called the “Mobile-Friendly Test” where you can type in your web address and see if you meet the new standard. When you pass the test, it looks something like this:
If you don’t pass the test Google will give you insights as to why it didn’t pass and will look something like this:
What Happens if You Don’t Pass?
If your site isn’t fully optimized for mobile devices, you will likely see a hit to your ranking on mobile searches. What that means is you need to have a mobile site up and running in the near term. Here’s where to start:
Decide How You Will Optimize for Mobile
There are several approaches for optimizing your site for mobile devices. Choose one of the following that works for you:
Responsive Design – Responsive Design is the number one choice by Google for mobile optimization design patterns. Choosing responsive design is desirable because it only uses one URL for your site rather than a mobile URL and a desktop URL. NOTE: If you’re already hosted on HubSpot’s COS then you’re optimized with responsive design. If you’re not already hosted on the COS but need to move to it now,
Dynamic Serving – Dynamic serving changes the HTML of your website while keeping the same URL. Instead of shrinking and optimizing one design, dynamic serving figures out what kind of device the user is viewing your website with and changes the code to show something different (see the sketch after this list). This is a more complicated process, but it offers an optimized result as well. NOTE: This approach is known to be a lot more error-prone, so beware before choosing this option.
Separate Mobile Website – When mobile optimized sites first started to come to light, this was the way to create them. Instead of using one URL, a mobile website is essentially a new website built for your company for mobile purposes. It’s onerous for Google though. It means that they have to crawl two websites and two versions of your content. If you already have this in place, make sure it works properly. If you’re considering this option, make sure the other two aren’t better fits first.
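To make the dynamic serving option more concrete, here is a minimal, hypothetical Flask sketch: it returns different HTML for mobile and desktop visitors at the same URL and sends the Vary: User-Agent header so crawlers know the response changes by device. Real sites would use a proper device-detection library rather than the crude check shown here.

```python
# Hypothetical dynamic-serving sketch: same URL, different HTML per device,
# plus a "Vary: User-Agent" header so crawlers know the response varies by device.
from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/")
def home():
    user_agent = request.headers.get("User-Agent", "").lower()
    is_mobile = "mobile" in user_agent  # crude check; use a device-detection library in practice
    body = "<h1>Mobile layout</h1>" if is_mobile else "<h1>Desktop layout</h1>"
    response = make_response(body)
    response.headers["Vary"] = "User-Agent"
    return response

if __name__ == "__main__":
    app.run()
```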
RankBrain uses a set of databases about people, places, and things (also called entities) to inform the algorithm and its machine learning processes.
These words (queries) are then decomposed into word vectors using a mathematical formula that gives each word an “address”. Similar words get similar “addresses”.
When Google processes an unfamiliar query, those mathematical relationships are used to better interpret the query and return multiple related results.
Over time, Google refines results based on user interaction and machine learning to improve the match between users’ search intent and search results returned by Google.
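To picture the word-vector idea, here is a toy Python sketch. The vectors are invented for illustration (real systems learn them from huge text corpora), but it shows how words with similar “addresses” score as related:

```python
# Toy illustration of word vectors -- the numbers are invented, not Google's actual model.
import numpy as np

vectors = {
    "smoothie":  np.array([0.9, 0.1, 0.0]),
    "milkshake": np.array([0.8, 0.2, 0.1]),
    "mortgage":  np.array([0.0, 0.1, 0.9]),
}

def similarity(a, b):
    """Cosine similarity: closer 'addresses' mean more closely related words."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(similarity(vectors["smoothie"], vectors["milkshake"]))  # high score -> related
print(similarity(vectors["smoothie"], vectors["mortgage"]))   # low score  -> unrelated
```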
It is important to note that stop words such as “and” or “the”, which search engines used to throw away, were traditionally left out of this kind of analysis. RankBrain, by contrast, is designed to understand whole queries to get the best results, especially queries with negation – for example, queries that use words like “not”.
But if you’re already writing good content, you’re probably wondering what else you can do. What advantage can this bring you? How can you optimize for this ranking signal?
The answer to this question is not an answer, but another question:
Why would you try it?
RankBrain optimization can be beneficial for some unique use cases. For most sites, however, the time and energy spent optimizing for an obscure Google query (that is, one nobody uses) would be much better spent on other tasks.
In fact, not only would you be optimizing for a query that few users ever type, the results for it are constantly changing.
RankBrain’s results are expected to keep changing and improving, so optimizing for them is an attempt to hit a constantly moving target.
The best advice? Follow the advice of Illyes.
You must write good content.
You must make sure it sounds natural.
If you want to rank well in Google searches, you need to optimize your website and content for RankBrain. Without these techniques, your website might quickly lose relevance and you’ll notice a drop in your click-throughs.
How big of a difference are we talking about?
If a website’s click-through rate decreases, the amount of user data for the website decreases. When that happens, RankBrain has fewer data points to judge its relevance.
That means that your website won’t show up in the top results. Over a period of time, your visibility will be greatly reduced.
How quickly can this happen? Quicker than you think. As Google’s confidence in RankBrain increases, you’ll soon see that your old SEO tactics are no longer bearing fruit.
So, how can you optimize your website and content for RankBrain? Here are a few surefire ways of getting RankBrain to work in your favor.
1. Build Your Website’s Reputation
Spend some time to understand what your audience likes to read. What type of content do they spend more time on?
For example, if your target audience is new moms, they’re most likely to be interested in topics related to infants or toddlers. Understand what your audience likes, and create more content around such topics.
However, creating content isn’t enough. To build your website’s reputation, you need to work on obtaining quality backlinks. You can guest post on high-authority websites that have a similar target audience. This can give you more traffic as well as greater visibility.
Try to create content that’s engaging. Feel free to embed videos, images, gifs, or any other media content. This can help you increase engagement which helps to build your reputation.
The more time a user spends on your website, the more relevant it appears to RankBrain. You should also share your content on social media to get greater engagement.
2. Optimize for Medium Tail Keywords
The way RankBrain works, long-tail keywords are likely to be history soon. Earlier Google used to fetch results based on exact keywords used. So, you would get different results for “best speakers for computer” and “best computer speakers.”
However, RankBrain understands that both these search queries mean the exact same thing. So it gives the same search results.
The best way to optimize for RankBrain is to optimize for medium-tail keywords. Optimizing for medium-tail keywords provides for automatic optimization for long-tail keywords in RankBrain.
This also requires you to create more quality content. This makes it easier to optimize it for medium-tail keywords.
3. Create Content that Addresses User Intent
RankBrain puts the focus back on content. If you have good content, you can easily optimize your website.
If your content is not good, your click-through rate (CTR) will drop. Readers will not come back to your website and it will lead to a lower ranking in RankBrain.
It might seem like RankBrain is making it tougher for websites to get better rankings. But there’s a silver lining to it.
You don’t need to focus too much on finding relevant keywords. If you’re writing an article about desserts, RankBrain will pick up “desserts” and its related terms as keywords. So, words like “sweets” and “sweet dish” will also be considered as keywords.
With RankBrain, your article might rank for queries related to desserts as well as sweet dishes. So keyword density shouldn’t be your focus anymore.
You should focus more on addressing user intent through your content. This is because quality content is all that matters for readers. Quality content focused on user intent will bring in higher CTR.
Since RankBrain improves the quality of search results, it also collects information about user satisfaction.
For example, someone clicks on the #1 result for a query but doesn’t find it useful. Then they click on the next result in the SERPs. And they continue to do so until they find what they are looking for. When they do find what they want, they stay on that page longer. And that is an indication of satisfaction.
As a result, those results towards the top may see a drop in their rankings, while the one that satisfied the user’s intent gets a boost.
So it’s important to make your content informative, engaging, and useful. Try to pick topics that your target audience might be interested in and write longer, useful content that answers all possible questions about that topic.
4. Increase Your Click-Through Rates
Google’s quality score algorithm rates your website based on your keywords and quality of content. The click-through rate is an important part of the quality score algorithm.
It gives an idea of the overall user experience of the website. The higher the CTR, the better it is for your website.
As RankBrain learns from human decisions, you need to ensure that users click on your links. A higher click-through rate is critical for the success of your RankBrain SEO strategy.
A great way of getting a higher CTR is to ensure that your content is well-written. You also need to optimize your landing pages to increase your CTR.
Wikipedia has a high CTR, authority, and a great reputation. And so, it ranks higher in Google search results. Wikipedia outranks a really large number of websites. This happens because the website’s content is relevant, engaging, and people come back to it.
As we’ve seen, Google won’t tell you exactly what’s in the Fred algorithm. That’s why some SEO professionals had to make some inferences.
However, Gary Illyes did offer an important piece of advice about how to avoid getting hit with a Fred penalty.
This past April, at the AMA session during the SMX West conference, Illyes said that the answer to Fred is in the webmaster quality guidelines.
So here’s what you can do to stay in the good graces of Google:
Provide high-quality content – In your content marketing efforts, make sure that your articles are top-notch. If you have to, hire a professional writer to give you the best quality. Also, strive for longform content as that tends to cover subjects more exhaustively.
Avoid “forced” backlinks – If you’re in the habit of buying backlinks from people who own their own private blog networks (PBNs), stop that practice immediately. Even “honest” guest-posting can get you into trouble if you’re forcing unnatural backlinks from low-quality sites. Instead, strive to produce quality content (see above) and let Google’s algorithm push it to the top of the SERPs.
Avoid excessive ads – The Fred update also hit sites with an abundance of ads. Although it’s unclear how many ads is “too many” for Fred, you can use common sense. If you think people will find your site annoying because it has a couple of pop-ups, a video ad in the corner, ads in the middle of the content, and banner ads all over the page, you can be fairly certain that you have “too many” ads.
If some of your content lost rank because of Fred, you might not be able to fully recover it. However, going forward, you can make sure that your future content ranks well. Do that by producing quality articles, avoiding backlink spam, and showing only a few ads on your site.
In addition, make sure your website is focused and does not try to cover general topics.