Have you ever heard the story behind Chris Visser becoming "Spider"?
"Spider is a crawler. If you create a spider, it is basically an algorithm that goes to a website, finds all the links on that page, then visits those links one by one, collects the links it finds there, and jumps from link to link until it has indexed the whole web. Indexing means storing the page, storing the content of the page. This is what Google's spiders do. Spiders are collectors. So am I."
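The crawl-and-index loop described in the quote can be sketched in a few lines of Python. This is a minimal illustration, not Google's implementation: the in-memory `site` dict and the `fetch` callable stand in for real HTTP requests, and link extraction uses the standard-library `HTMLParser`.

```python
from collections import deque
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch):
    """Breadth-first crawl: visit a page, store its content (that is
    the 'indexing' step), extract its links, and follow each one
    until no new pages remain. `fetch` maps a URL to its HTML."""
    index = {}                    # url -> stored page content
    queue = deque([start_url])
    seen = {start_url}
    while queue:
        url = queue.popleft()
        html = fetch(url)
        if html is None:
            continue
        index[url] = html         # "indexing means storing the page"
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            if link not in seen:  # never revisit a page
                seen.add(link)
                queue.append(link)
    return index

# A tiny in-memory "web" standing in for real HTTP fetches.
site = {
    "/a": '<a href="/b">b</a> <a href="/c">c</a>',
    "/b": '<a href="/a">back</a>',
    "/c": "no links here",
}
pages = crawl("/a", site.get)
print(sorted(pages))  # → ['/a', '/b', '/c']
```

The `seen` set is what keeps the crawler from looping forever on cyclic links (note that `/b` links back to `/a` above); a real crawler would add politeness delays, `robots.txt` checks, and URL normalization on top of this skeleton.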
Click on the link below and watch Chris Visser's talk at Node Conference NL 2019: https://lnkd.in/ddebsCE