Case Study: Why Has Medium Lost Half of Its Visibility?
For the uninitiated, Medium is an excellent choice for content marketers and creators who want an easy way to publish their writing and reach readers.
Medium ranks #152 on the list of the most popular content management systems (CMSs), and large brands prefer it. In fact, it powers 2.4 percent of the most popular websites worldwide.
The majority of sites on Medium publish long-form content, and the platform is well suited to content marketing. But if you plan to rely solely on one platform for publication, we’d recommend using another CMS.
While Medium is a good platform, it has serious issues that can’t be ignored. The biggest problem: Medium’s content does not get indexed by Google as expected.
We took it upon ourselves to gather more data on this. We randomly sampled 700 articles from Medium; 83 of them were not indexed. And that’s a huge number.
In this blog, we will detail what is wrong with Medium, covering the following aspects:
- Medium and Its Problems
- What Medium Is Losing
- How Medium Lost over Half of Its Visibility
- Ending Note
- Conclusion
Let’s study Medium!
Medium and Its Problems – SEO Case Study
At a glance, we believe the following factors have caused Medium’s drop in organic visibility.
Sitemaps are simple XML files, and every website should have one. A sitemap should contain all the important URLs on the site, as decided by the webmaster. Google uses sitemaps to discover new links on a website and then indexes them.
Google’s bots crawl every day, trying to cover as many pages as possible. They follow links to crawl HTML files, and they consult sitemaps along the way.
That’s how the process works for most websites, but not for Medium. Medium’s sitemap is a mess at best: every comment under a published article gets a separate URL, and all of those URLs end up in Medium’s sitemaps.
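To illustrate the problem (all URLs below are hypothetical examples, not Medium’s real entries), an article sitemap crowded by comment URLs looks something like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- The article itself: the URL that deserves crawl priority -->
  <url>
    <loc>https://medium.com/@writer/an-actual-article-abc123</loc>
    <lastmod>2019-12-01</lastmod>
  </url>
  <!-- Comment ("response") URLs that dilute crawl attention -->
  <url>
    <loc>https://medium.com/@reader1/great-post-def456</loc>
  </url>
  <url>
    <loc>https://medium.com/@reader2/thanks-for-sharing-ghi789</loc>
  </url>
</urlset>
```

For every entry of the first kind, Medium’s sitemaps contain many of the second kind.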
Crawling all those extra URLs is a waste of Google’s time. A major part of the crawl budget Google allots to Medium is spent indexing pages that hold little value, so the articles that should be prioritized for indexing wait in line behind useless comments.
Technical SEO should help Medium prioritize which parts of its website should be indexed first.
Users see preview cards showing the articles’ images and have to click through if they are interested in reading the full write-up.
The biggest issue Medium is facing is to get all its content indexed by Google.
These are the two major reasons something is seriously wrong with Medium.
What Medium Is Losing
Medium is a popular user-generated publishing platform built on an engagement-based model.
On the surface, Medium looks like an ideal platform for aspiring writers to leverage. Behind the scenes, the story is a bit different.
According to the stats, Medium’s domain saw a whopping 40 percent dip in desktop organic visibility and a 50 percent dip on mobile.
The drop is so massive that our SEO specialists felt compelled to get to the bottom of it. We ran a quick crawl to analyze Medium’s technical setup, and what we found was surprising.
Here’s what we think might be behind Medium’s nosedive.
How Medium Lost over Half of Its Visibility
1. Problems with the homepage
While researching, we noticed a homepage different from the one on our previous visits. The top stories were gone, and so were the navigation links and the menu. Little content was displayed apart from the CTA buttons.
A domain that relies on organic traffic should not remove all of its outgoing internal links. It also looks as though Medium has not considered how closely this resembles cloaking.
We then used a few tools to switch user agents, setting custom user agents for Googlebot, Googlebot News, and Googlebot Mobile. Only then were we able to see Medium’s original homepage, with featured articles, menus, content, and everything else.
Though Medium’s practices are not as bad as black-hat cloaking cases, Google could view this as an attempt to hide important information, and it could eventually be penalized.
The huge traffic drop started around 10 December 2019, and the changed homepage was implemented only in November. The timing here is significant.
The special version for non-logged-in visitors makes it clear that Medium is aggressively trying to rope in new subscribers.
2. Broken pages
Another interesting problem we found was a large number of broken priority pages: plenty of 4xx URLs across the site. These URLs come either from incorrectly set links or from content creators themselves.
Sites usually face this issue when they host a lot of user-generated content that goes unreviewed.
Take, for instance, a tag page on Medium itself that leads to a 404 error.
During our research, we found several generic and user-specific tags that were heavily linked yet landed on a 404 error. Medium really needs to rework its tag-creation process and keep it under control.
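As a minimal sketch of this kind of audit, crawl output (here, hypothetical URL/status pairs rather than real crawl data) can be filtered for broken priority pages:

```python
# Hypothetical crawl results: URL -> HTTP status code.
crawl_results = {
    "https://medium.com/tag/writing": 200,
    "https://medium.com/tag/some-broken-tag": 404,
    "https://medium.com/@writer/deleted-post-abc123": 410,
}

def broken_urls(results):
    """Return the URLs that answered with a 4xx client error, sorted."""
    return sorted(url for url, status in results.items() if 400 <= status < 500)

# Report every broken page found in the crawl.
for url in broken_urls(crawl_results):
    print(url)
```

In a real audit the status codes would come from an SEO crawler’s export; the filtering step is the same.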
3. Issue with indexing
We mentioned earlier the problem Google has indexing Medium’s countless URLs. Medium’s comment section has its own URL for every single comment!
A site-search analysis showed that the responses path alone has over 2 million indexed URLs.
This strategy clearly does not go well with SEO.
Who searches Google for comments? No one.
We found a similar problem with user profiles: Medium creates a new URL for every new writer and subscriber. In short, Medium creates new indexable URLs for:
a) The user’s profile
b) Users they follow (1M indexed URLs)
c) Users following (1M indexed URLs)
d) The articles they recommended (1M indexed)
e) All their comments (2M indexed)
f) All their highlights (400k indexed)
In total, Medium has over 5.4M URLs that don’t require indexing.
In some places it makes sense to index user profiles, especially for authority figures. But indexing the other weak pages can cause serious issues: it eats into Google’s crawl budget, and ever since the Panda update, low-quality pages can be penalized.
We don’t think Medium needs indexable URLs for these posts, listings, and comments. If it must keep them, it should use noindex robots directives.
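If those pages have to keep their URLs, a noindex directive keeps them out of Google’s index while users can still visit them. On an HTML page it is a single line:

```html
<!-- In the <head> of a comment or listing page that should not be indexed;
     "follow" still lets crawlers pass through its links. -->
<meta name="robots" content="noindex, follow">
```

The same directive can also be sent as an HTTP response header (`X-Robots-Tag: noindex`), which is handy for pages generated at scale.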
4. Over 10 percent of posts are invisible on Google
Getting low-quality pages indexed has left Medium’s quality pages waiting in line. As mentioned earlier, 83 of the 700 posts we sampled were not indexed by Google.
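That sample works out to roughly 12 percent of posts invisible on Google:

```python
# Share of sampled Medium posts that Google had not indexed (figures from our sample).
sampled = 700
not_indexed = 83
share = 100 * not_indexed / sampled
print(f"{share:.1f}% of sampled posts were not indexed")  # → 11.9% of sampled posts were not indexed
```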
It boils down to one point: Medium has no crawling or indexing strategy working in its favor. It hands Google’s bots tons of useless URLs and leaves the important ones behind.
Google does not have time for this: pagination lets bots reach only about 40 post pages per category, and they have to ransack millions of tag and follower-list pages just to reach the articles.
5. Terrible sitemap strategy
We know Google sets aside a crawl budget for every site, and Medium’s strange sitemap strategy has caused yet another crawl-budget issue.
Ideally, XML sitemaps signal crawl priority to search engines. They also feed them extra information, such as the last-modified date.
Since 2012, Medium has created sitemaps for all its articles, tags, and new users, all stored inside a sitemap index file.
Although Medium has not exceeded the URL limit set by Google, it clearly fails to prioritize its URLs. The best move here would be to take a close look at how relevant the older sitemaps still are, especially the sitemaps for user profiles and tags.
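A simple way to start that review is to parse the sitemap index and read each child sitemap’s last-modified date. A sketch over a hypothetical, inlined index file (the URLs and dates are illustrative, not Medium’s real ones):

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap index, similar in shape to a real one.
SITEMAP_INDEX = """<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://medium.com/sitemap/posts-2019.xml</loc><lastmod>2019-12-01</lastmod></sitemap>
  <sitemap><loc>https://medium.com/sitemap/users-2012.xml</loc><lastmod>2012-03-14</lastmod></sitemap>
</sitemapindex>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_ages(index_xml):
    """Map each child sitemap URL to its last-modified date."""
    root = ET.fromstring(index_xml)
    return {
        sm.find("sm:loc", NS).text: sm.find("sm:lastmod", NS).text
        for sm in root.findall("sm:sitemap", NS)
    }

# Sitemaps with very old lastmod dates are candidates for pruning or demotion.
for loc, lastmod in sitemap_ages(SITEMAP_INDEX).items():
    print(lastmod, loc)
```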
6. Bad internal linking
Google visits only about 40 posts per category on Medium, while users can access as many articles as they want simply by scrolling down.
But Googlebot cannot do that, because Medium has implemented a type of infinite scroll that Google cannot work around.
Remember that Medium hosts millions of pages; 40 posts per category is not enough to surface the valuable content to Google.
That might be why Medium tries to index all its pages, valuable and low-quality alike. The strategy is short-sighted and dangerous at best.
What’s even worse? Medium’s recommendations are not really that useful.
Check out this story: Unpacking the Media’s Obsession With Prisoner-Firefighters.
Now, check out the links to the ‘similar’ articles.
Do they appear similar at all? No way.
Logically, a user reading about the media probably doesn’t want to read about education or lifestyle next.
7. Outbound canonicals are not completely trustworthy
The best part about Medium is that contributors get to leverage its domain authority to make their content visible to thousands of readers.
To encourage this, Medium gives contributors the option of setting a cross-domain canonical, which avoids duplicate-content issues when they also publish a piece on another site.
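Assuming a contributor republishes a post that originally appeared on their own blog (the URL below is a hypothetical example), the canonical is a single tag:

```html
<!-- In the <head> of the Medium copy, pointing at the original post -->
<link rel="canonical" href="https://example-blog.com/original-post/">
```

This tells search engines which version is the original, so the duplicate does not compete with it in rankings.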
Our research found that more than 5 percent of Medium’s canonicals point to external domains, which isn’t a problem in itself. But if you can’t vouch for the quality of the target documents, you could be pointing to harmful sites that degrade your domain authority.
8. Paywall content
Google is completely fine with article content hidden behind a paywall. However, it expects the correct markup to be in place: the schema should carry the ‘isAccessibleForFree’ property.
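Per Google’s paywalled-content guidelines, the markup looks roughly like this (the headline and CSS selector are assumed examples):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example paywalled article",
  "isAccessibleForFree": false,
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": false,
    "cssSelector": ".paywalled-section"
  }
}
</script>
```

The `hasPart` block tells Google which section of the page is behind the paywall, so the hidden content is not mistaken for cloaking.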
This should not be the reason why Medium is experiencing a dip. But it might be an indication from Google that it is discouraging such instances.
Searchers get a poor user experience: the result suggests the full article is available, yet they cannot read all of it. But it also makes sense that publishers, who need to generate revenue, don’t want to fill the whole screen with ads.
Hopefully, Google will resolve the ambiguity in the SERPs about which content is actually available and consider implementing paywall ‘labels’. This could be done easily: the schema markup could warrant its own snippet in the results.
Such a change to the appearance of News SERPs could improve the quality of a website’s incoming traffic and reduce short-click visits.
9. Lots of internal nofollow links
A nofollow link typically hints to search engines that a site does not vouch for the content it links to. Using nofollow on internal links is therefore bad SEO.
Medium has freely used rel=nofollow across its site.
We researched further and found that all the button links on a publication’s overview page and certain navigation elements were set to nofollow.
Ideally, these should be followed links, given the value of the pages they reference.
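The difference is a single attribute; internal navigation links should simply omit it (the path below is a hypothetical example):

```html
<!-- Bad for internal SEO: tells search engines not to trust your own page -->
<a href="/publication-overview" rel="nofollow">Overview</a>

<!-- Better: a plain internal link passes link equity -->
<a href="/publication-overview">Overview</a>
```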
10. Mobile-First Indexing
Whenever a site’s organic visibility dives, it should check whether the problem arose right after Google switched it to mobile-first indexing.
With mobile-first indexing, Google indexes the mobile version of a site and uses that version for ranking. And Medium looks noticeably different on mobile.
The mobile version, unlike the desktop version, has no links pointing to similar articles.
For reasons like these, Google may not yet have switched Medium to the mobile-first index, and that could be behind the recent visibility drop.
Once Google enables mobile-first indexing for Medium, such organic drops might be avoided in the future.
Ending Note

The reason behind Medium’s organic drop is not crystal clear. But it proves that even a site publishing high-quality content must take care of the technical side. By managing technical SEO well, a site can perform well in search engines.
To steer clear of such situations, adopt the following SEO practices:
a. Make sure Google is able to index all your pages.
b. Use SEO crawlers to check if you have important yet broken URLs.
c. Always have fresh and prioritized sitemaps on the correct path for crawlers.
d. Analyze the need to index a page by assessing incoming traffic and time on site.
e. Avoid nofollow links so that search engines know you don’t doubt your own content.
Conclusion

We do not mean to single out Medium.
But it is true that Medium needs some serious SEO work to regain its standing. The World Wide Web is so large that it is extremely difficult for crawlers to keep up with all the content being released.
A website as big and dynamic as Medium needs to revisit technical SEO best practices to maintain its position. That way, Medium can keep its content visible to search engines and earn higher organic traffic.