AngularJS Might Be Killing Your SEO Efforts

Everyone in the development industry knows that JavaScript and SEO have been arch-enemies for a very long time, and it is no surprise to any SEO that Google has always faced issues when it comes to indexing JavaScript content. AngularJS is a JavaScript-based web application framework that provides dynamic functionality to extend HTML and enhance the user experience by making it simple to build single-page applications (SPAs). However beneficial the framework may be for user interaction and conversion rates, it can be a real challenge for a search engine optimization strategy and might significantly impact organic search traffic.

A Brief on AngularJS

Much like other JavaScript-based frameworks, such as Ember and Backbone, AngularJS loads data asynchronously into the browser and executes primarily on the client side as the user interacts with elements of the web page. With most JavaScript programming, the major challenge comes down to rendering indexable content and internal navigation links into the HTML of the page, so search bots can crawl and index it. And here is the most confusing part: AngularJS was created by Google, so people simply assumed it would be SEO-friendly. In reality, it can completely hide your page content from search engines and tank your organic search traffic.
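To see why, consider a minimal sketch of an AngularJS 1 controller (the module, controller, and API endpoint names here are hypothetical). The content it displays never appears in the raw HTML that a non-rendering crawler downloads:

```js
// Template (as served to the browser, and as a naive crawler sees it):
//
//   <div ng-controller="ProductCtrl">{{ product.name }}</div>
//
// "View source" shows that literal {{ product.name }} placeholder. The real
// content only exists after this controller runs in the browser:
angular.module('shop', []).controller('ProductCtrl', function ($scope, $http) {
  // Data arrives asynchronously, after the initial HTML has been served
  $http.get('/api/products/42').then(function (response) {
    $scope.product = response.data; // e.g. { name: 'Blue Widget' }
  });
});
```

A bot that does not execute JavaScript indexes the placeholder, not the product name.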

Released in 2012, the first version gained considerable momentum over the last year, especially with enterprise-level clients. According to research, 942 of the top 10,000 sites (ranked by traffic from Quantcast) are now using Angular, up dramatically from 731 sites only a year earlier. Walmart Grocery, The Guardian, CVS Pharmacy, Weather.com, VEVO, and Land's End are some of the best examples of large websites that are now making use of Angular.

So Where Is The Problem?

As I said before, the major challenge with any JavaScript programming always comes back to indexable content, and search engines have historically been challenged by sites that serve content via JavaScript. Google has become much more proficient at crawling and indexing JavaScript, whereas the other search engines lag even further behind.

How Bad Can It Be?

Bartosz Góralewicz, the CEO at Elephate, conducted an interesting study that tested Google's capabilities with indexing various JavaScript-based platforms. His team created web pages using several popular frameworks:

  • AngularJS 1
  • Angular 2
  • React
  • Ember
  • jQuery
  • Vue
  • Plain JavaScript

Without a shadow of a doubt, Google has made significant progress, yet it still has challenges when it comes to indexing certain aspects of JavaScript, especially Angular. In 2015, a well-known healthcare company released an Angular-based e-commerce section; their end goal was simply to roll out similar kinds of pages that varied only in color and size. The cached version of each product page showed what was actually rendered in the HTML when Google indexed it, and the result was a 40% drop in organic search traffic from the previous year.

Is There Any Solution?

Search engines need to see the content and elements of the page in the source code so that it can be indexed correctly. As a recognized AngularJS development company, we always end up recommending a pre-rendering safety net like Prerender.io. For those who have no idea, this is middleware that will crawl your web pages, execute all JavaScript files, and host a cached version of your Angular pages on its content delivery network (CDN). It may sound like cloaking, but it is not: Google itself says that as long as your intent is to improve the user experience, and the content available to visitors is the same as what is presented to Googlebot, you will not be penalized.
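A minimal sketch of how this wiring looks in practice, assuming a Node/Express server in front of the Angular app (the token value is a placeholder; prerender-node is Prerender.io's Express middleware):

```js
const express = require('express');
const prerender = require('prerender-node');

const app = express();

// When a known crawler (Googlebot, Bingbot, etc.) requests a page, the
// middleware serves the pre-rendered, fully executed HTML from Prerender.io
// instead of the empty Angular shell.
app.use(prerender.set('prerenderToken', 'YOUR_PRERENDER_TOKEN'));

// Regular visitors still get the normal client-side Angular app.
app.use(express.static('dist'));

app.listen(3000);
```

Human visitors are unaffected; only recognized bot user agents are routed through the pre-rendered cache.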

Don’t forget to check the rendering of your web pages

To make an Angular site indexable, make sure you know how your web pages render for search engines. Here are a few ways to check (a small script that automates the raw-HTML check follows this list):

  • Browseo – A tool that not only renders the elements of the page but also lists the total word count, internal/external links, and other important <head> content such as the HTML title, meta description and keywords, Twitter and Facebook tags, and a SERP preview.
  • Fetch as Google – From Google Search Console, you can fetch any page on your website to see what Google sees.
  • Search engine cache – Check the most recent cached version of a web page by running a "site:[domain]" query on Google or Bing. In the search results, click the drop-down caret at the end of the URL and select "Cached" on Google or "Cached page" on Bing. This shows what the bot found on its last crawl of your page.
  • Angular 4 – The latest version of Angular has been released, and it looks promising for technical search engine optimization. It features Angular Universal, which lets you generate all of the HTML of a page at a given point in time on the server, so pages can be deployed as static HTML or served from a CDN.
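For the raw-HTML check mentioned above, a small self-contained Node script is enough (the URL and phrase below are hypothetical examples): fetch the page the way a non-rendering bot would, and test whether a key piece of content is present before any JavaScript runs.

```js
const https = require('https');

// Fetch the raw HTML of a URL and report whether it contains the given
// phrase, i.e. whether the content exists before client-side rendering.
function checkRawHtml(url, phrase) {
  https.get(url, (res) => {
    let html = '';
    res.on('data', (chunk) => { html += chunk; });
    res.on('end', () => {
      console.log(html.includes(phrase)
        ? `OK: "${phrase}" is in the raw HTML of ${url}`
        : `WARNING: "${phrase}" is missing from the raw HTML of ${url}`);
    });
  }).on('error', (err) => console.error(err.message));
}

// Example: does the product name appear without client-side rendering?
checkRawHtml('https://www.example.com/products/blue-widget', 'Blue Widget');
```

If the warning fires for content you expect to rank, that content is invisible to any crawler that does not execute JavaScript.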

In a nutshell

As mentioned above, Google continues to advance in its ability to crawl and index JavaScript. All I can say is: never leave the indexing of your site entirely in the hands of the search engines, and stay in control of how your web pages render.

If you have any questions, please share them in the comment section below.

About Rakesh Patel

Rakesh Patel is Marketing Manager at eTatvaSoft, a web, ecommerce & mobile app development company. He writes about technology trends, leadership, and many other aspects of IT services, and helps people learn about new technologies through his online contributions.