DuckDuckGo Questioned by Authorities over Google Antitrust Investigation

Google’s competitors are being questioned by U.S. federal and state authorities as part of an investigation into Google’s dominance of the search market.

DuckDuckGo has reportedly been in talks with the U.S. Justice Department regarding an investigation into Google’s alleged anti-competitive practices.

In a report from Bloomberg, DuckDuckGo CEO Gabriel Weinberg disclosed that U.S. authorities are looking for ways to restrict Google’s dominance in the search market.

Weinberg spoke with authorities just a few weeks ago; they came prepared with detailed questions about Google.

Should Google Be Required to Offer Alternatives?

A specific focus of the authorities’ questions is whether Google should be required to offer alternatives.

Authorities are most concerned with having Google present alternatives to its search engine and Chrome web browser on Android devices.

It is alleged that Google engages in anti-competitive behavior by making its own search engine and web browser the defaults on Android.

The Justice Department declined to comment on the investigation when questioned by Bloomberg, as did multiple state attorneys general.

A Google spokesperson acknowledged the investigation, but offered no further information:

“We continue to engage with the ongoing investigations led by the Department of Justice and Attorney General Paxton, and we don’t have any updates or comments on speculation.”

More Details About the Investigation

This investigation is driven by the need to collect enough evidence to file an antitrust lawsuit against Google.

Weeks ago we reported that the U.S. Justice Department, along with many state attorneys general, is preparing to sue Google for antitrust violations.

See: Google May Face U.S. Antitrust Lawsuits

Texas Attorney General Ken Paxton said at the time that he was preparing to speak with companies claiming to have been harmed by Google.

It’s interesting to learn that some of those companies include Google’s competitors in the search market.

DuckDuckGo vs. Google

DuckDuckGo has been critical of Google for years. Now it’s in a position to provide details that could change the way Google operates its search and advertising businesses.

The main focus of DuckDuckGo’s criticism of Google has revolved around privacy and data sharing.

DuckDuckGo has also criticized Google for anti-competitive practices, however.

The company may have a lot more to say about Google’s alleged anti-competitive behavior. For example, it’s worth noting DuckDuckGo wasn’t even offered as a default search engine option on Android until this past January.

The information DuckDuckGo provides to authorities in this investigation may determine whether or not the lawsuit moves forward.

What Happens Next?

Could the evidence produced by DuckDuckGo’s campaign against Google result in a guilty verdict for the search giant?

At this point, it’s not known whether U.S. federal and state authorities will move forward with a lawsuit, or whether the outcome will be something else entirely.

Attorney General Paxton hopes to determine whether a filing is warranted by this fall, with the lawsuit to follow soon after that.

If Google were to be found guilty of antitrust violations, Paxton has stated that no punishment is off the table.

Lawmakers may even decide to break up Google into separate search and advertising businesses.

Bloomberg calls this “one of the most notable antitrust cases in the U.S. since the government sued Microsoft Corp. in 1998.”

In 2018, Google was fined a record $5 billion for antitrust violations in Europe.

As part of that decision, Android devices in Europe must now prompt users to choose their own default search engine.

Similar changes could be coming to the United States, depending on how these events unfold.

Source: Bloomberg

In a Google Webmaster Hangout, Google’s John Mueller explained which low-traffic pages to noindex and which ones not to worry about.

Are Low Traffic Pages Harmful?

It’s commonly understood that pruning low-performing pages is a good idea. Low-quality pages tend to attract little traffic and should be either noindexed or removed entirely.

The question posed to John Mueller is quoted below.

The question is specifically about a news site, but Mueller broadened his answer to be useful for more than just news sites.

Related: Google’s Search Quality Raters Guidelines

This is the question:

We’re publishing news and articles.

For example, we have 100 new articles every day and ten of them give us 95% of the organic search traffic. Another 90 go nowhere.

We’re afraid that Google can decide our website is interesting only for 10%.

There’s an idea to hide some boring local news under noindex tag to make the overall quality of all publishing content better.

What do you think?

How Google Assesses Website Quality

Mueller first explained how Google’s algorithm reviews individual pages and the site as a whole to determine its quality level.

His answer was general, meaning it’s relevant whether the site is a news site or any other kind of site.

This is what Mueller said:

In general, we do look at the content on a per-page basis.

And we also try to understand the site on an overall basis, to understand how well is this site working, is this something that users appreciate. If everything is essentially working the way that it should be working.

So it’s not completely out of the question to think about all of your content and think about what you really want to have indexed.

Mueller then focused on news sites.

He said that traffic isn’t necessarily the metric to use for determining whether a news web page is low quality.


But especially with a news website, it seems pretty normal that you’d have a lot of articles that are interesting for a short period of time, which are perhaps more of a snapshot from a day to day basis for a local area.

And it’s kind of normal that they don’t become big, popular stories on your website.

So from that point of view, I wouldn’t necessarily call those articles low-quality articles, for example.

So, just because a news article isn’t popular doesn’t mean it’s low quality.

John Mueller then offered guidance on how to recognize when content is truly low quality.

He highlighted issues such as content that is difficult to read, broken English, and content that is poorly organized. Then he explained what to do if you have a mixture of good and poor quality content.

This is what he said:

On the other hand, if you’re publishing articles from … hundreds of different authors and they’re from the varying quality and some of them are really bad, they’re kind of hard to read, they’re structured in a bad way, their English is broken.

And some of them are really high-quality pieces of art, almost that you’re providing. Then creating that kind of a mix on a website makes it really hard for Google and for users to understand that actually, you do have a lot of gems on your website…

So that’s the situation where I would go in and say, we need to provide some kind of quality filtering, or some kind of quality bar ahead of time so that users and Google can recognize, this is really what I want to be known for.

And these are all things, maybe user-submitted content, that is something we’re publishing because we’re working with these people, but it’s not what we want to be known for.

Then that’s the situation where you might say, maybe I’ll put no-index on these, or maybe I’ll initially put no index on these until I see that actually, they’re doing really well.

So for that, I would see it making sense that you provide some kind of quality filtering.

But if it’s a news website, where… by definition, you have a variety of different articles, they’re all well-written, they’re reasonable, just the topics aren’t that interesting for the long run, that’s kind of normal.

That’s not something where I’d say you need to block that from being indexed. Because it’s not low-quality content. It’s just less popular content.

Related: How & Why You Must Improve or Remove Your Old Content

John Mueller’s main point was about how to assess an article’s quality. He said to look at the content itself to decide whether the reason for low traffic is that the topic is unpopular or that the article is poorly written.

Just because a web page is not popular does not mean it’s low quality. Content like that won’t reflect poorly on a site.

Low traffic can be a flag alerting you to a possible issue. But it’s not the issue itself.

Take a look at the content and determine whether the low traffic is because:

  • The content is outdated (not good, needs improvement)
  • The content is thin (not good)
  • The topic simply isn’t very popular (that’s okay)
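If you do decide to apply the kind of quality filtering Mueller describes, a noindex directive can be placed directly in the page markup. A minimal sketch (adapt it to your own templates):

```html
<!-- Keeps the page accessible to readers but asks search engines not to index it. -->
<!-- Place inside the <head> of the page: -->
<meta name="robots" content="noindex">
```

The same directive can also be sent as an `X-Robots-Tag: noindex` HTTP response header, which is useful for non-HTML files. Either way, the page must remain crawlable (not blocked by robots.txt) for search engines to see the directive.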




About Author: Dave Jones is a web content writer and guest blogger who offers content writing services to online business owners, including SEO and digital marketing, website design and development, logo design, and corporate branding. If you need a reliable guest blogging or content writing service, contact Dave at



