Design: Web Crawler (Hard)

What does robots.txt's 'Crawl-delay: 10' directive mean?

Tests your understanding of robots.txt politeness directives.

Answer Options

A. Maximum 10 concurrent connections to this domain
B. Wait 10 seconds between consecutive requests to this domain
C. Only crawl for 10 minutes per day
D. Crawl only 10 pages from this domain
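Independent of the quiz, you can check how a real crawler interprets this directive with Python's standard `urllib.robotparser`. A minimal sketch, assuming a made-up robots.txt body and a hypothetical bot name `mybot`:

```python
import time
from urllib import robotparser

# Hypothetical robots.txt content for illustration.
robots_txt = """\
User-agent: *
Crawl-delay: 10
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())
rp.modified()  # mark the data as freshly loaded; crawl_delay() returns None otherwise

delay = rp.crawl_delay("mybot")  # 10: the delay, in seconds, between requests

# A polite crawler sleeps `delay` seconds between consecutive requests
# to the same domain (fetch is any callable that downloads one URL).
def fetch_politely(urls, fetch):
    for url in urls:
        if rp.can_fetch("mybot", url):
            fetch(url)
            if delay:
                time.sleep(delay)
```

Note that `crawl_delay()` returns the per-agent value when one is set, so the delay is a pacing rule between requests, not a connection or page limit.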


This question is from the Design: Web Crawler topic (System Design Cases).
