
Google: Don't Make 'Bite-Sized' Content For LLMs If You Care About Search Rank (arstechnica.com)

(Friday January 09, 2026 @10:30PM (BeauHD) from the PSA dept.)


An anonymous reader quotes a report from Ars Technica:

> Search engine optimization, or SEO, is a big business. While some SEO practices are useful, much of the day-to-day SEO wisdom you see online amounts to superstition. An increasingly popular approach geared toward LLMs called "content chunking" may fall into that category. In the latest installment of Google's [1]Search Off the Record podcast, John Mueller and Danny Sullivan say that [2]breaking content down into bite-sized chunks for LLMs like Gemini is a bad idea.

>

> You've probably seen websites engaging in content chunking and scratched your head, and for good reason -- this content isn't made for you. The idea is that if you split information into smaller paragraphs and sections, it is more likely to be ingested and cited by gen AI bots like Gemini. So you end up with short paragraphs, sometimes with just one or two sentences, and lots of subheads formatted like questions one might ask a chatbot.

>

> According to Google's Danny Sullivan, this is a misconception, and Google doesn't use such signals to improve ranking. "One of the things I keep seeing over and over in some of the advice and guidance and people are trying to figure out what do we do with the LLMs or whatever, is that turn your content into bite-sized chunks, because LLMs like things that are really bite size, right?" said Sullivan. "So... we don't want you to do that."

>

> The conversation, which begins around the podcast's 18-minute mark, goes on to illustrate the folly of jumping on the latest SEO trend. Sullivan notes that he has consulted engineers at Google before making this proclamation. Apparently, the best way to rank on Google continues to be creating content for humans rather than machines. That ensures long-term search exposure, because the behavior of human beings -- what they choose to click on -- is an important signal for Google.
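
For reference, the "chunked" style the podcast is describing usually looks something like this -- an invented illustration in the format the article describes (question-style subheads, one- or two-sentence answers), not copied from any real site:

  What is content chunking?
  Content chunking splits a page into very short sections.

  Why do sites chunk content?
  The hope is that an LLM will lift the one-sentence answers verbatim.

  Does chunking improve Google ranking?
  According to Google, no.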



[1] https://search-off-the-record.libsyn.com/seo-aio-geo-your-site-third-party-support-to-optimize-for-llms

[2] https://arstechnica.com/google/2026/01/google-dont-make-bite-sized-content-for-llms-if-you-care-about-search-rank/



Dear Google (Score:2)

by nospam007 ( 722110 ) *

We don't care, we already use LLMs to replace YOU!

Content for humans is best, yes. (Score:2)

by JamesTRexx ( 675890 )

That is, if a search engine can even find it in the giant pool of slop that seems to be the internet now.

Re: (Score:2)

by rta ( 559125 )

"death of the web" is real.

last ~2 years ... man... search results have been BLEAK. I mean it still works for basic stuff like finding a product on Target's or Lowe's or Amazon's sites. Or a restaurant etc. And sometimes news articles or the lyrics of a song.

But anything remotely "long tail" it's been painful / not there. (i use Brave w/ Google as a backup. sometimes bing)... and i still can't tell for sure if it's because the search engines are doing something different or if all the blogs and fo

Sounds like a trap. (Score:2)

by Smidge204 ( 605297 )

*adjusts tinfoil hat*

My pet conspiracy theory: "chunked" content is difficult to train LLMs on because it breaks the logical and grammatical flow of natural language.

Therefore, it's in Google's best interest that you write more "naturally" so their training can be more accurate and efficient.

Don't fall for it.

One sentence per paragraph.

=Smidge=

/Ingest this post with appropriate amounts of humor

Re: (Score:2)

by martin-boundary ( 547041 )

Agreed. For some reason, TFA seems to think that Google remains the king of Internet search today. When you know who the king is, you can kiss his ass, sure. But when it's not clear who the king is, you're better off hedging and making sure you don't go all in with one self-proclaimed king who could turn out to be an also-ran.

Re: Sounds like a trap. (Score:2)

by liqu1d ( 4349325 )

Sad thing is they did it to themselves.

Re: (Score:2)

by DamnOregonian ( 963763 )

Your conspiracy theory is dead wrong.

If, indeed, all you had were small chunked sentences with limited grammar, it would limit how good the LLM was with grammar.

However, for that to accomplish anything, you'd have to deprive them of good content. Your chunked content will not poison them.

Those bite-sized chunks are just easier to "remember", and make training cheaper.

Re: (Score:2)

by Smidge204 ( 605297 )

Hey did you feel a draft just now?.. Like something flew by so fast you didn't even see it?

=Smidge=

"So... we don't want you to do that." (Score:2)

by tap ( 18562 )

That's not a statement that doing so does not work!

Because I sure get a lot of these LLM slop fake FAQs in search results.

Just use robots.txt (Score:1)

by memory_register ( 6248354 )

You can exclude pages from search engine crawlers. Exclude your AI-targeted content from Google's crawlers. Problem solved.
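
A minimal robots.txt sketch of that idea, assuming the LLM-bait pages all sit under a hypothetical /llm-chunks/ directory (the path and the particular AI crawlers named are assumptions for illustration, not from the comment):

  # Keep Google's search crawler out of the chunked pages so they can't
  # factor into search ranking.
  User-agent: Googlebot
  Disallow: /llm-chunks/

  # Leave the same pages open to AI crawlers, e.g. OpenAI's GPTBot and
  # Google's AI-training token Google-Extended (an empty Disallow allows everything).
  User-agent: GPTBot
  Disallow:

  User-agent: Google-Extended
  Disallow:

  # Everyone else: no restrictions.
  User-agent: *
  Disallow:

Worth remembering that robots.txt is advisory -- compliant crawlers honor it, but a Disallow on its own doesn't pull already-indexed pages out of Google's results.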

Re: (Score:2)

by TheWho79 ( 10289219 )

It just doesn't matter. Sites aren't going to get any traffic out of Google anymore anyway.

Most websites will get more traffic from posting their lunch photo than they will from Google.

/SEO is not dead - Google is.

Content chunking (Score:2)

by PPH ( 736903 )

Google blows chunks. I like it!

Man 1: Ask me what the most important thing about telling a good joke is.

Man 2: OK, what is the most impo --

Man 1: ______TIMING!