We're crawling a site with allowSubdomains set to true and sitemap set to "include" in the request, but we're still seeing that many pages aren't being pulled in. Previously, when we didn't include the sitemap option, this would happen because a page was orphaned: nothing links to it, so the recursive crawl has no way to reach it. However, with the sitemap included, shouldn't that problem be solved?
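
For context, here's roughly the shape of the request we're sending. This is a minimal sketch: the endpoint URL, auth header, and target site are placeholders, and only the `allowSubdomains` and `sitemap` options are the actual settings in question.

```python
# Sketch of the crawl request described above. CRAWL_ENDPOINT, API_KEY, and
# the target URL are hypothetical placeholders, not the real values.
import requests

CRAWL_ENDPOINT = "https://api.example.com/v1/crawl"  # hypothetical endpoint
API_KEY = "YOUR_API_KEY"                             # placeholder

payload = {
    "url": "https://example.com",   # site being crawled (placeholder)
    "allowSubdomains": True,        # follow links onto subdomains
    "sitemap": "include",           # also seed the crawl from the sitemap
}

resp = requests.post(
    CRAWL_ENDPOINT,
    json=payload,
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # crawl job id / status, depending on the API
```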