Here’s how Google Search plans to tackle clickbait


Because Google knows we all hate clickbait, the company will soon take steps to tackle the problem in its search results. Rolling out globally next week for English-language searches, the update will lower the rankings of offending websites while rewarding those that create original, high-quality content.


Clickbait is often seen in advertisements that make bold or even outrageous claims in the hopes that you’ll be intrigued enough to click the ad so you can learn more. Search results can also be misleading and inspire a click based on an interesting title and snippet.

Of course, finding that information, or anything relevant to your search, on the clickbait page might be impossible. Scrolling through the page, you’ll pass by several more ads, giving the unscrupulous marketer and web developer exactly what they wanted. This trick, a form of black-hat SEO, is a massive waste of your time and incredibly frustrating to get drawn in by.

Improving search results to show more useful content sounds great, but achieving it isn’t easy, and Google has been refining its search engine continuously since it first launched in 1997. After more than two decades, here’s how Google Search intends to reach the next level and show even more accurate and valuable results.

Websites that merely aggregate content from other sources, such as movie reviews pulled from multiple outlets, without adding anything new will be ranked lower. Instead, a search for information about the movie “Wakanda Forever” is more likely to return results that link to new information and original commentary about the upcoming blockbuster. The improvement will be most noticeable in searches related to online education, arts and entertainment, shopping, and technology, according to Google.

This means aggregators like Rotten Tomatoes and Metacritic could hold less weight in search results compared to original reviews. This update seems to primarily target reviews, as well as content that “seems like it was designed to attract clicks rather than inform readers.”

The update seems targeted at bots as well. Although Google didn’t explicitly state how it plans to tackle content written by bots (or copied from another website), the company says the impetus for this update is content that “might not have the insights you want, or it may not even seem like it was created for, or even by, a person.”

Google didn’t detail in its blog post how it ferrets out deceptive websites, but it has plenty of data and machine-learning resources to compare searches and clicks on particular websites with how long people actually stay there after clicking through. How big of an impact this will have remains to be seen, but any improvement is welcome.
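Google hasn’t published the signals it actually uses, but the general idea of weighing clicks against time spent on a page can be illustrated with a small, purely hypothetical sketch. The URLs, the 10-second threshold, and the scoring formula below are invented for illustration and are not Google’s method.

```python
# Hypothetical dwell-time heuristic, NOT Google's actual algorithm.
# Pages that users abandon quickly after clicking through ("pogo-sticking")
# have their score demoted; pages that hold attention are left alone.

from dataclasses import dataclass


@dataclass
class ClickRecord:
    url: str
    dwell_seconds: float  # time between click-through and return to results


def demotion_factor(clicks: list[ClickRecord], short_visit: float = 10.0) -> dict[str, float]:
    """Return a per-URL multiplier in (0, 1]: the larger the share of
    very short visits, the more the page's base ranking score is reduced."""
    totals: dict[str, list[int]] = {}
    for c in clicks:
        bounced, seen = totals.setdefault(c.url, [0, 0])
        totals[c.url] = [bounced + (c.dwell_seconds < short_visit), seen + 1]
    return {url: 1.0 - 0.5 * (bounced / seen) for url, (bounced, seen) in totals.items()}


if __name__ == "__main__":
    log = [
        ClickRecord("clickbait.example/review", 4.0),
        ClickRecord("clickbait.example/review", 6.5),
        ClickRecord("original.example/review", 180.0),
    ]
    print(demotion_factor(log))
    # {'clickbait.example/review': 0.5, 'original.example/review': 1.0}
```

In this toy version, a page whose visitors all bounce within seconds keeps only half its original score, while a page that holds readers’ attention is untouched; a real system would combine many such signals rather than rely on dwell time alone.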
