WaybackRust
WaybackRust is a tool written in Rust to query the WaybackMachine.
Here are the features:
- Get all the URLs for a specific domain, with their current HTTP status code (urls command).
- Get all the links listed in the robots.txt file of each snapshot in the WaybackMachine (robots command).
- Get the source code of all the archives of a specific page (unify command).
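Under the hood, this kind of lookup goes through the Wayback Machine's CDX API. As a minimal sketch (the exact parameters waybackrust sends are an assumption here, not taken from its source), a `urls`-style query URL can be built like this, with `matchType=domain` covering the subdomain case:

```rust
// Sketch of a CDX query URL behind a "urls"-style lookup.
// The parameter set is illustrative; waybackrust's actual request may differ.
fn cdx_url(domain: &str, subs: bool) -> String {
    // matchType=domain also returns subdomains; matchType=prefix does not.
    let match_type = if subs { "domain" } else { "prefix" };
    format!(
        "https://web.archive.org/cdx/search/cdx?url={}&matchType={}&fl=original&collapse=urlkey",
        domain, match_type
    )
}

fn main() {
    println!("{}", cdx_url("example.com", true));
}
```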
Installation
Download the statically linked binary from the GitHub releases
- Download the static binary:
$ wget https://github.com/Neolex-Security/WaybackRust/releases/download/v0.2.11/waybackrust
$ chmod +x waybackrust
# mv waybackrust /usr/local/bin
- Run waybackrust:
$ waybackrust
From cargo (crates.io)
cargo install waybackrust
From GitHub
- Clone this repository:
git clone https://github.com/Neolex-Security/WaybackRust
cargo build --release
- The executable is at:
./target/release/waybackrust
Usage
Neolex <hascoet.kevin@neolex-security.fr>
Wayback machine tool for bug bounty
USAGE:
waybackrust [SUBCOMMAND]
FLAGS:
-h, --help Prints help information
-V, --version Prints version information
SUBCOMMANDS:
help Prints this message or the help of the given subcommand(s)
robots Get all disallowed entries from robots.txt
unify Get the content of all archives for a given url
urls Get all urls for a domain
Urls command
waybackrust-urls
Get all urls for a domain
USAGE:
waybackrust urls [FLAGS] [OPTIONS] <domain.com or file.txt or stdin>
FLAGS:
-h, --help Prints help information
-n, --nocheck Don't check the HTTP status
-p, --nocolor Don't colorize HTTP status
--silent Disable informations prints
-s, --subs Get subdomains too
-V, --version Prints version information
OPTIONS:
-b, --blacklist <extensions to blacklist> The extensions you want to blacklist (ie: -b png,jpg,txt)
-d, --delay <delay in milliseconds> Make a delay between each request
-o, --output <FILE>
Name of the file to write the list of urls (default: print on stdout)
-t, --threads <Number of concurrent requests> Number of concurrent requests (default: 24)
-w, --whitelist <extensions to whitelist> The extensions you want to whitelist (ie: -w png,jpg,txt)
ARGS:
<domain.com or file.txt or stdin> domain name or file with domains
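To illustrate what the -b/--blacklist and -w/--whitelist options do, here is a minimal sketch of extension filtering over a URL list. The helper name and logic are illustrative assumptions, not waybackrust's internals:

```rust
// Illustrative sketch of -b/-w style extension filtering.
// blacklist=true drops URLs whose extension matches; blacklist=false keeps only matches.
fn filter_urls<'a>(urls: &[&'a str], exts: &[&str], blacklist: bool) -> Vec<&'a str> {
    urls.iter()
        .filter(|u| {
            let hit = exts.iter().any(|e| u.ends_with(&format!(".{}", e)));
            if blacklist { !hit } else { hit }
        })
        .copied()
        .collect()
}

fn main() {
    let urls = ["https://example.com/logo.png", "https://example.com/login.php"];
    // Equivalent of "-b png": image URLs are dropped, the rest kept.
    println!("{:?}", filter_urls(&urls, &["png"], true));
}
```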
Robots command
waybackrust-robots
Get all disallowed entries from robots.txt
USAGE:
waybackrust robots [FLAGS] [OPTIONS] <domain or file>
FLAGS:
-h, --help Prints help information
--silent Disable informations prints
-V, --version Prints version information
OPTIONS:
-o, --output <FILE> Name of the file to write the list of uniq paths (default: print on stdout)
-t, --threads <numbers of threads> The number of threads you want. (default: 10)
ARGS:
<domain or file> domain name or file with domains
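The robots command collects disallowed entries from archived robots.txt files. A minimal sketch of that extraction step, under the assumption that entries look like standard `Disallow:` lines (case handling and edge cases omitted):

```rust
// Sketch of pulling Disallow paths out of a robots.txt body,
// roughly what a "robots"-style command gathers per snapshot.
fn disallowed(robots: &str) -> Vec<String> {
    robots
        .lines()
        .filter_map(|l| l.trim().strip_prefix("Disallow:"))
        .map(|p| p.trim().to_string())
        .filter(|p| !p.is_empty())
        .collect()
}

fn main() {
    let body = "User-agent: *\nDisallow: /admin\nDisallow: /private\nAllow: /";
    println!("{:?}", disallowed(body));
}
```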
Unify command
waybackrust-unify
Get the content of all archives for a given url
USAGE:
waybackrust unify [FLAGS] [OPTIONS] <url or file>
FLAGS:
-h, --help Prints help information
--silent Disable informations prints
-V, --version Prints version information
OPTIONS:
-o, --output <FILE> Name of the file to write contents of archives (default: print on stdout)
-t, --threads <numbers of threads> The number of threads you want. (default: 10)
ARGS:
<url or file> url or file with urls
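Each Wayback Machine capture of a page is served at a well-known path, `https://web.archive.org/web/<timestamp>/<url>` (timestamp in YYYYMMDDhhmmss form). A sketch of building such a snapshot URL, which is the kind of address a `unify`-style command fetches once per archive before concatenating the bodies:

```rust
// Build the URL of one Wayback Machine snapshot from its capture
// timestamp (YYYYMMDDhhmmss) and the original page URL.
fn snapshot_url(timestamp: &str, url: &str) -> String {
    format!("https://web.archive.org/web/{}/{}", timestamp, url)
}

fn main() {
    println!("{}", snapshot_url("20200101000000", "https://example.com/"));
}
```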
New feature ideas
If you have any ideas for improvements or new features for this tool, please create an issue or contact me.