Untangling The Web: Guide to the Surface, Deep & Dark Web

Getting Started with Web Intelligence

It is important to start with an understanding of how search engines work in relation to the Internet as a whole. Search engines build their database of websites, known as the index, through a process called crawling. Google and other search engines find new pages to index using "bots" or "spiders" that crawl the Web. The bots start with a list of URLs gathered from previous crawls; when they detect new links on those pages, the links are added to the list of sites to index going forward. Search engine algorithms then produce a ranked list of Web pages from the index based on the search terms users provide.
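The crawling process described above can be sketched as a breadth-first traversal: start from a seed list of URLs, and whenever a page reveals new links, add them to the queue of pages still to visit. The sketch below is illustrative only; the link graph and URLs are made up, standing in for the real work of fetching pages and parsing their HTML for links.

```python
from collections import deque

# Hypothetical in-memory link graph standing in for fetched pages;
# a real crawler would download each URL and extract its outbound links.
LINK_GRAPH = {
    "https://example.com/": ["https://example.com/a", "https://example.com/b"],
    "https://example.com/a": ["https://example.com/b", "https://example.com/c"],
    "https://example.com/b": [],
    "https://example.com/c": ["https://example.com/"],
}

def crawl(seed_urls):
    """Breadth-first crawl: begin with seed URLs from previous crawls,
    queue newly discovered links, and return the set of indexed pages."""
    frontier = deque(seed_urls)  # pages still to visit
    index = set()                # pages already indexed
    while frontier:
        url = frontier.popleft()
        if url in index:
            continue             # skip pages already crawled
        index.add(url)
        for link in LINK_GRAPH.get(url, []):
            if link not in index:
                frontier.append(link)  # newly discovered link joins the queue
    return index
```

Starting from a single seed, the crawl discovers every reachable page; pages no crawled page links to would stay outside the index, which is one reason Deep Web content never appears in search results.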

Get free access to the full guide to learn:

  • How to distinguish between the Surface, Deep, and Dark Web
  • How the Dark Web can be used to create public and private risk
  • How Web Intelligence can help security teams assess risk