A bash script to spider a site, follow links, and fetch urls (with built-in filtering) into a generated text file.
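The built-in filtering step such a spider script performs can be sketched as follows. This is a minimal illustration, not the script itself: the function name `extract_urls`, the output file `urls.txt`, and the simulated crawler log are all assumptions.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the filtering step: extract unique http(s) URLs
# from crawler output and write them to a generated text file.
# extract_urls and urls.txt are illustrative names, not from the original script.
extract_urls() {
  grep -oE 'https?://[^ "<>]+' | sort -u
}

# Simulated crawler log in place of a live spider run:
extract_urls > urls.txt <<'EOF'
Found https://example.com/index.html
Found https://example.com/about.html
Found https://example.com/index.html again
EOF
```

After the run, `urls.txt` holds each discovered URL exactly once, sorted, ready for further processing.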
Updated Dec 26, 2021 - Shell
🔗 GitHub Action to extract and check URLs in code and documentation.
Clean a series of links, resolving redirects and finding Wayback results if page is gone. Originally written to aid with importing from ArchiveBox.
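The redirect-resolution step such a link cleaner might perform can be sketched as below. The helper assumes the redirect chain has already been captured (e.g. the header log printed by `curl -sIL`); the function name and fallback behaviour are illustrative, not the tool's actual implementation.

```shell
#!/usr/bin/env bash
# Hypothetical helper: given the original URL and a captured header log
# (as produced by `curl -sIL`), return the final Location header, or the
# original URL when the page never redirected.
final_location() {
  local orig="$1" headers="$2" last
  last=$(printf '%s\n' "$headers" | grep -i '^location:' | tail -n 1 | awk '{print $2}')
  printf '%s\n' "${last:-$orig}"
}
```

A URL whose final target 404s would then be a candidate for a Wayback Machine lookup, as the description suggests.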
Using the power of AlienVault OTX technology, Alopra can extract URLs quickly. 🔥
GitHub Action to verify that links in your repo are up and available.
This bash script aims to download the private rewarding scope; this can be modified by changing the URL https://bugcrowd.com/programs.json?vdp[]=false&sort[]=promoted-desc&hidden[]=false&page[]=0. The script stores all the URLs under the code name of each project, so it creates multiple text files under the bugcrowd_recon folder.
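The per-project file layout described above can be sketched as follows. This version parses a sample JSON inline rather than fetching the live Bugcrowd endpoint, and the field names (`programs`, `code`, `program_url`) are assumptions about that endpoint's schema, not confirmed; `jq` is assumed to be installed.

```shell
#!/usr/bin/env bash
# Hypothetical sketch: write one URL list per program code into bugcrowd_recon/.
# The JSON below stands in for the response from the programs.json endpoint;
# its field names are assumptions, not a confirmed schema.
mkdir -p bugcrowd_recon
sample='{"programs":[{"code":"acme","program_url":"/engagements/acme"},
                     {"code":"globex","program_url":"/engagements/globex"}]}'
printf '%s\n' "$sample" \
  | jq -r '.programs[] | "\(.code) \(.program_url)"' \
  | while read -r code path; do
      echo "https://bugcrowd.com${path}" >> "bugcrowd_recon/${code}.txt"
    done
```

Each program's URLs end up in their own file (e.g. `bugcrowd_recon/acme.txt`), matching the one-file-per-code layout the description mentions.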