Crawling is the process of accessing and fetching all the interconnected web pages of a website. This task is performed by a piece of software called a crawler, also known as a spider or bot, which visits a website and sends the information it collects back to the search engine that operates it.
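To make the idea concrete, here is a minimal sketch of a breadth-first crawler using only Python's standard library. The start URL, page limit, and same-site restriction are illustrative choices, not part of any particular search engine's implementation:

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkParser(HTMLParser):
    """Collects href values from anchor tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, max_pages=10):
    """Breadth-first crawl: fetch a page, extract its links,
    and queue unseen links that belong to the same site."""
    site = urlparse(start_url).netloc
    seen = {start_url}
    queue = deque([start_url])
    pages = {}  # url -> raw HTML, handed off to the indexer later

    while queue and len(pages) < max_pages:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except OSError:
            continue  # skip pages that fail to load
        pages[url] = html

        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            # stay on the same website and avoid revisiting pages
            if urlparse(absolute).netloc == site and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return pages
```

A real crawler would also respect robots.txt, rate-limit its requests, and handle non-HTML content, but the fetch-extract-queue loop above is the core of the process.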

Indexing is the process of organizing the crawled and fetched pages into an index: a huge centralized or distributed database from which they can later be retrieved. "Index" is also the name for this database used by a search engine.
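As a toy illustration of what an index stores, the sketch below builds an inverted index over the pages dict produced by the crawler sketch above: each word maps to the set of page URLs that contain it, so a lookup is a single dictionary read. The tokenization here is deliberately crude and only a stand-in for real text processing:

```python
import re
from collections import defaultdict


def build_index(pages):
    """Maps each word to the set of page URLs containing it."""
    index = defaultdict(set)
    for url, html in pages.items():
        # crude tokenization: strip tags, then keep runs of letters
        text = re.sub(r"<[^>]+>", " ", html).lower()
        for word in re.findall(r"[a-z]+", text):
            index[word].add(url)
    return index


def search(index, word):
    """Returns the URLs of all indexed pages containing the word."""
    return index.get(word.lower(), set())
```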
