The Best Crawl Settings (2026 Guide)

One of the most powerful, yet often misunderstood, features of Domain Hunter Gatherer (DHG) is the Crawl Level setting. Understanding how this works is the difference between finding thousands of irrelevant domains and uncovering high-authority, niche-perfect gems for your PBN.

In this guide, we break down exactly how crawl levels work, the theory behind website structure, and which settings you should use for your specific SEO goals.


Understanding the “Tree” Structure of a Website

To understand crawl levels, you first have to understand how the software “sees” a website. Think of a website like a tree or a pyramid:

  • Level 1: This is the entry point. If you enter a homepage URL, that single page is Level 1.
  • Level 2: These are all the pages linked directly from the homepage (e.g., About Us, Category pages, or Blog posts).
  • Level 3: These are pages linked from the Level 2 pages.

Every level deeper you go, the number of pages grows exponentially. Domain Hunter Gatherer lets you control exactly how deep into this “tree” it should climb.
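Conceptually, a level-limited crawl is a breadth-first traversal of that tree. The sketch below is an illustrative model only, not DHG’s actual implementation: `get_links` is a hypothetical callback that returns the links found on a page, and the level numbering matches the list above (the entered pages are level 1).

```python
from collections import deque

def crawl(start_urls, max_level, get_links):
    """Breadth-first, level-limited crawl sketch.

    Level 1 is the set of entered pages, level 2 the pages they link
    to, and so on, stopping once max_level is reached.
    get_links(url) -> list of URLs found on that page (caller-supplied).
    """
    seen = set(start_urls)                 # never visit a page twice
    queue = deque((url, 1) for url in start_urls)
    pages_by_level = {}
    while queue:
        url, level = queue.popleft()
        pages_by_level.setdefault(level, []).append(url)
        if level >= max_level:
            continue                       # don't follow links past the crawl level
        for link in get_links(url):
            if link not in seen:
                seen.add(link)
                queue.append((link, level + 1))
    return pages_by_level
```

With `max_level=1` only the entered pages are touched; raising it to 2 pulls in everything they link to, which is why volume explodes while relevancy drifts.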

Find Expired Domains

Which Crawl Level Should You Use?

The “right” setting depends entirely on what you are trying to achieve. Here is a breakdown of the most common use cases:

Crawl Level 0: The Availability Checker

Use Case: You already have a list of domains and just want to check whether they are available. When set to Level 0, DHG performs no scraping or crawling: it simply takes the list of domains you provide and checks whether each one is currently available for registration. It is the fastest way to vet a pre-existing list.

Crawl Level 1: The Niche Relevancy “Sweet Spot”

Use Case: Scraping specific authority pages or search results. This is the recommended setting for pulling domains from the entered pages and checking them for availability. DHG will scrape the specific pages you entered and check any outbound links found on those pages.

  • Why it works: If you are crawling a high-ranking page in your niche, the sites it links to are almost guaranteed to be relevant.

Crawl Level 2: High Volume Scraping

Use Case: Expanding your search when Level 1 isn’t finding enough. Level 2 scrapes the pages you entered, then follows the links on those pages and scrapes each of those linked pages as well.

  • The Trade-off: While this returns a much higher volume of domains, you will notice that the niche relevancy may start to drop as the crawler moves further away from your original source(s).

Crawl Level 10,000: The “Deep Crawl”

Use Case: Crawling an entire authority site (like Wikipedia or a major news outlet). By setting the level to 10,000, you are effectively telling the software that there is no limit. It will continue to crawl every page it can find on the target domain until every single link has been checked. This is the best way to extract every possible expired domain from a massive authority site.

An example of deep crawling of Wikipedia

Efficiency: The “No Rehash” Rule

One of the best features of Domain Hunter Gatherer’s crawling engine is its efficiency. The software tracks every page it visits and every domain it checks.

This means that if a link appears on Level 1 and again on Level 5, DHG knows it has already checked that page or domain and won’t waste your resources or time “re-hashing” work it has already done.
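The “no rehash” rule is essentially result caching: remember every domain already checked so a repeat lookup costs nothing. A minimal sketch of the idea, assuming a hypothetical `check_fn` that performs the real availability lookup (DHG’s internals are not public):

```python
class DedupChecker:
    """Sketch of the 'no rehash' rule: cache every domain checked so
    the same domain is never looked up twice, no matter how many
    crawl levels it appears on."""

    def __init__(self, check_fn):
        self.check_fn = check_fn   # real availability lookup (assumed)
        self.cache = {}            # domain -> cached availability result
        self.lookups = 0           # how many real lookups were performed

    def is_available(self, domain):
        if domain not in self.cache:
            self.lookups += 1
            self.cache[domain] = self.check_fn(domain)
        return self.cache[domain]
```

If a domain turns up at Level 1 and again at Level 5, only the first encounter triggers a real check; every later one is answered from the cache.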

Conclusion

If you want the best results with the least amount of “junk” to filter through, start with Crawl Level 1. It keeps your results highly relevant to your niche while providing excellent authority. Save the deeper crawls (Level 2 or 10,000) for when you want to exhaust a specific authority site for every possible registration opportunity.

Ready to start hunting? Domain Hunter Gatherer Pro includes integrated stats from DomDetailer, allowing you to see the DA, PA, and Trust Flow of these domains the moment they are found.
