When a website owner adds a new page to a site, the search engine's robots are usually the first visitors to see it. Website owners can also publish sitemaps to help search engines discover their pages.
Automated programs, known as crawlers, bots, or spiders, constantly scour the internet, visiting websites and collecting data. This process is called crawling, and it is the first phase of a search engine's operation.
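The crawling idea can be sketched in a few lines: fetch a page, extract its links, and queue any link not yet visited. The sketch below uses a made-up in-memory "web" (a dict of page paths to HTML) in place of real network fetches, so all page names and contents are illustrative assumptions.

```python
from html.parser import HTMLParser

# Toy in-memory "web": page path -> HTML content (invented for illustration).
PAGES = {
    "/": '<a href="/about">About</a> <a href="/blog">Blog</a>',
    "/about": '<a href="/">Home</a>',
    "/blog": '<a href="/">Home</a> <a href="/about">About</a>',
}

class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def crawl(start):
    """Breadth-first crawl: visit a page, then queue every unseen link."""
    seen, queue = set(), [start]
    while queue:
        url = queue.pop(0)
        if url in seen or url not in PAGES:
            continue
        seen.add(url)
        parser = LinkExtractor()
        parser.feed(PAGES[url])
        queue.extend(parser.links)
    return seen

print(sorted(crawl("/")))  # every page reachable from the start page
```

A real crawler would replace the `PAGES` lookup with an HTTP request and respect rules such as `robots.txt`, but the visit-extract-queue loop is the same.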
Information discovered by the crawlers must be organized, sorted, and stored so that the search engine's algorithms can process it before it is made available to end users. This process is called indexing.
The final step in a search engine's workflow is ranking. For any user query, the search engine displays the most relevant results on the SERP (Search Engine Results Page), placing the websites with the best information at the top.
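Ranking can be illustrated with the simplest possible relevance score: count how often each query term appears in a document, then sort documents by that score. Real engines combine hundreds of signals (links, freshness, quality), so this is only a sketch over invented data.

```python
# Toy documents (invented for illustration).
docs = {
    "page1": "search engines rank search results",
    "page2": "crawlers visit the web",
    "page3": "search results come from an index",
}

def rank(query):
    """Order pages by how many query-term occurrences they contain."""
    terms = query.lower().split()
    scores = {}
    for doc_id, text in docs.items():
        words = text.lower().split()
        scores[doc_id] = sum(words.count(t) for t in terms)
    # Best-scoring pages first; ties broken by document id.
    return sorted(scores, key=lambda d: (-scores[d], d))

print(rank("search results"))  # best match first
```

Here "page1" ranks first because it matches "search" twice and "results" once, giving it the highest term count for the query.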