Sunday, July 7, 2024

Google Checks 4 Billion Host Names Every Day


Did you know that Google Search checks about 4 billion host names every day for robots.txt purposes? Gary Illyes said in the December Search Off the Record podcast, "we have about 4 billion host names that we check every single day for robots.txt."

He said this at the 20:31 mark in the video. He said that if they check 4 billion host names every day, then "the number of sites is probably over or very likely over 4 billion."
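For context, a robots.txt check is a per-host fetch of the /robots.txt file followed by rule matching against a crawler's user agent. Here is a minimal sketch using Python's standard-library urllib.robotparser; the example.com host and the "Googlebot" user-agent string are illustrative stand-ins, not a depiction of how Google's own crawler is implemented.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical host; robots.txt always lives at the root of the host.
robots_url = "https://example.com/robots.txt"

parser = RobotFileParser()
parser.set_url(robots_url)
parser.read()  # fetches and parses robots.txt for this host

# Ask whether a given user agent may crawl a given URL on that host.
allowed = parser.can_fetch("Googlebot", "https://example.com/some/page")
print(f"Googlebot may fetch: {allowed}")
```

A crawler operating at Google's scale would repeat a check like this once per host per day, which is what makes the 4 billion figure a rough lower bound on the number of sites.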

I spotted this video via Glenn Gabe:

Here is the transcript:

GARY ILLYES: Yeah, and I mean, this is one of the things that we brought up early on. If we implement something, or if we come up with or suggest something that could work, it should not put extra strain on publishers, because if you think about it, if you go through our robots.txt cache, you can see that we have about 4 billion host names that we check every single day for robots.txt. Now, let's say that all of those have subdirectories, for example. So the number of sites is probably over or very likely over 4 billion.

JOHN MUELLER: How many of those are in Search Console? I wonder.

GARY ILLYES: John, stop it.

JOHN MUELLER: I'm sorry.

GARY ILLYES: Anyway, so if you have 4 billion hostnames plus a bunch more in subdirectories, then how do you implement something that won't make them go bankrupt when they have to implement some opt-out mechanism?

JOHN MUELLER: It's complicated.

GARY ILLYES: It's complicated. And I know that people are frustrated that we don't have something already. But it's not something to–

MARTIN SPLITT: Be taken lightly, yeah.

GARY ILLYES: Yeah.

Here is the video embed at the start time:

Forum discussion at X.


