Google’s Use Of Bloom Filters Explains Higher Filtered Data In Search Console

In the latest installment of Google’s monthly office-hours Q&A session, a question was asked regarding the higher volume of filtered data compared to overall data in Google Search Console.

The question prompted a detailed response from Gary Illyes, a Google Search Relations team member, who shed light on Google’s use of Bloom filters.

Disproportionate Data In Search Console

The question was, “Why is filtered data higher than overall data on Search Console, it doesn’t make any sense.”

On the surface, this might appear to be something of a contradiction.

The expectation is that overall data should be more comprehensive and, therefore, more extensive than any filtered subset.

Yet, this isn’t what users are experiencing. What’s going on here?

Search Console & Bloom Filters

Illyes begins his response:

“The short answer is that we make heavy use of something called Bloom filters because we need to handle a lot of data, and Bloom filters can save us lots of time and storage.

When you handle a large number of items in a set, and I mean billions of items, if not trillions, looking up things fast becomes super hard. This is where Bloom filters come in handy.”

Bloom filters speed up lookups in large datasets by first checking a compact, hashed summary of the data rather than the full set itself.

This allows faster but less accurate analysis, Illyes explains:

“Since you’re looking up hashes first, it’s pretty fast, but hashing sometimes comes with data loss, either purposeful or not, and this missing data is what you’re experiencing: less data to go through means more accurate predictions about whether something exists in the main set or not.

Basically, Bloom filters speed up lookups by predicting if something exists in a data set, but at the expense of accuracy, and the smaller the data set is, the more accurate the predictions are.”
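To make the idea concrete, here is a minimal sketch of how a Bloom filter works in principle. This is purely illustrative, not Google’s implementation: the bit-array size, the number of hash functions, and the use of salted SHA-256 are arbitrary choices for the example.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter sketch: a bit array plus k hash functions.

    Lookups are fast and compact, but hashing loses information, so the
    filter can report false positives -- it never reports false negatives.
    """

    def __init__(self, size_bits=1024, num_hashes=3):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = [False] * size_bits

    def _positions(self, item):
        # Derive k bit positions from salted SHA-256 digests of the item.
        for salt in range(self.num_hashes):
            digest = hashlib.sha256(f"{salt}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = True

    def might_contain(self, item):
        # False means "definitely not in the set";
        # True means "probably in the set" (could be a false positive).
        return all(self.bits[pos] for pos in self._positions(item))

bf = BloomFilter()
for url in ["example.com/a", "example.com/b"]:
    bf.add(url)

print(bf.might_contain("example.com/a"))    # True: it was added
print(bf.might_contain("example.com/zzz"))  # almost certainly False
```

The key property Illyes describes is visible here: membership checks touch only a small bit array rather than billions of stored items, which is why a “no” answer is cheap and certain while a “yes” answer is only probable.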

Speed Over Accuracy: A Deliberate Trade-off

Illyes’ explanation reveals a deliberate trade-off: speed and efficiency over perfect accuracy.

This approach might be surprising, but it’s a necessary strategy when dealing with the vast scale of data that Google handles daily.
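The trade-off can be quantified with the standard false-positive approximation for Bloom filters, p ≈ (1 − e^(−kn/m))^k, where n is the number of items, m the number of bits, and k the number of hash functions. The figures below are hypothetical and chosen only to illustrate the shape of the trade-off; they are not Google’s numbers.

```python
import math

def false_positive_rate(num_items, bits, hashes):
    # Classic Bloom filter approximation: p = (1 - e^(-k*n/m))^k
    return (1 - math.exp(-hashes * num_items / bits)) ** hashes

# The same million items with two different storage budgets:
small = false_positive_rate(num_items=1_000_000, bits=5_000_000, hashes=3)
large = false_positive_rate(num_items=1_000_000, bits=20_000_000, hashes=3)

print(f"5M bits:  ~{small:.1%} false positives")
print(f"20M bits: ~{large:.1%} false positives")
```

Spending more bits per item drives the error rate down, which is exactly the lever Illyes alludes to: accuracy is purchased with storage, and at Google’s scale some accuracy is deliberately traded away.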

In Summary

Filtered data can be higher than overall data in Search Console because Google uses Bloom filters to quickly analyze vast amounts of data.

Bloom filters allow Google to work with trillions of data points, but they sacrifice some accuracy.

This trade-off is intentional: Google prioritizes speed over perfect accuracy, accepting minor inaccuracies in exchange for the ability to analyze data rapidly.

So, it’s not a mistake to see filtered data that is higher than overall data. It’s how Bloom filters work.


Featured Image: Tetiana Yurchenko/Shutterstock
