You already support `-e`, which is a really nice way to dynamically produce the search list. Might it be worth grabbing robots.txt and parsing it, then either just making sure those paths are in the wordlist, or maybe even treating them in some special way in terms of priority?
To add to this one: sitemap.xml as a possible input as well. Nobody likes to parse XML (let alone XML that references external XML, as many sitemap.xml files do), but it could be of tremendous value compared with robots.txt.
It's a much bigger and more annoying effort, though, compared with robots.txt since, as you pointed out, robots.txt can be parsed using a basic regex, while sitemap.xml is both XML and references additional XML files. Depending on how you like to plan and track work, it might be better as a separate issue.
Or, depending on priorities, maybe it should be indefinitely deferred. I realize you have your work cut out for you :)
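For robots.txt, a minimal sketch of the idea in Rust (not feroxbuster's actual code): pull path entries out of `Allow:`/`Disallow:` lines so they can be merged into the wordlist. The function name and the dedupe/merge strategy are assumptions for illustration.

```rust
// Hypothetical helper: extract candidate paths from a robots.txt body.
// Both Allow and Disallow entries name paths worth probing.
fn robots_paths(body: &str) -> Vec<String> {
    let mut paths = Vec::new();
    for line in body.lines() {
        // Drop inline comments and surrounding whitespace first.
        let line = line.split('#').next().unwrap_or("").trim();
        for prefix in ["Allow:", "Disallow:"] {
            if let Some(rest) = line.strip_prefix(prefix) {
                // Normalize to a wordlist-style entry (no leading slash).
                let path = rest.trim().trim_start_matches('/').to_string();
                if !path.is_empty() && !paths.contains(&path) {
                    paths.push(path);
                }
            }
        }
    }
    paths
}

fn main() {
    let body = "User-agent: *\nDisallow: /admin/\nAllow: /public\n# note\nDisallow: /admin/\n";
    // Duplicates collapse; entries keep robots.txt order.
    assert_eq!(robots_paths(body), vec!["admin/", "public"]);
}
```

Wildcard entries like `Disallow: /*.php` would need extra handling before going into a wordlist, which is another argument for treating these paths specially rather than blindly appending them.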
I very much regret not having picked up Rust else I would be pitching in :/