<p align="center"> <img src="https://raw.finnwea.com/vector-shields-v2/?firstText=Wiki&secondText=Raider&scale=true" width="475" /> </p> <p align="center"> <a href="https://github.com/NorthwaveNL/wikiraider/blob/master/LICENSE.md"><img src="https://raw.finnwea.com/vector-shields-v2/?firstText=License&secondText=MIT" /></a> <a href="https://github.com/NorthwaveNL/wikiraider/releases"><img src="https://raw.finnwea.com/vector-shields-v1/?typeKey=SemverVersion&typeValue1=northwavesecurity&typeValue2=wikiraider&typeValue4=Release&cache=1"></a> <a href="https://travis-ci.org/github/NorthwaveNL/wikiraider"><img src="https://raw.finnwea.com/vector-shields-v1/?typeKey=TravisBuildStatus&typeValue1=northwavesecurity/wikiraider&typeValue2=master&cache=1"></a> </p> <p align="center"> <b>Want to crack passwords faster by using a wordlist that fits your 'target audience'? Use WikiRaider.</b> <br/> <a href="#goal">Goal</a> • <a href="#wordlists">Wordlists</a> • <a href="#parsing">Parsing</a> • <a href="#cracking">Cracking</a> • <a href="#limitations">Limitations</a> • <a href="#issues">Issues</a> • <a href="#license">License</a> <br/> <sub>Built with ❤ by the <a href="https://twitter.com/NorthwaveLabs">Northwave</a> Red Team</sub> <br/> </p> <hr>

Goal

In the Northwave Red Team we crack password hashes during penetration tests and red team engagements, mostly using hashcat and john-the-ripper. Cracking hashes with a wordlist is generally faster than brute-forcing the entire alphabet of possibilities, as long as the wordlist is related to the hashes you are cracking. But how do you find wordlists that are related to the password hashes you are trying to crack?

WikiRaider to the rescue! WikiRaider enables you to generate wordlists based on country-specific Wikipedia databases. This provides you not only with a list of words in a specific language, but also with e.g. country-specific artists, TV shows, places, etc.

Wordlists

Parsing a Wikipedia database takes a while. If you've parsed a database, feel free to contribute by adding it to this list.

DE (German), 184MB (download)

wget -O dewiki-2020-07-27 "https://northwave-my.sharepoint.com/:t:/g/personal/tijme_gommers_northwave_nl/EQWaUMooHJlKqoVKArMJp2sBxzCtdVPNEWQkDTy5Qt2z5Q?e=5AwBgA&download=1"

EN (English), 257MB (download)

wget -O enwiki-2020-05-07 "https://northwave-my.sharepoint.com/:t:/g/personal/tijme_gommers_northwave_nl/EfezyXZilsFFpEpRgvWSJ40BtG5VgklAyZtuRjUylWqOWA?e=DuGj6k&download=1"

ES (Spanish), 78MB (download)

wget -O eswiki-2020-05-05 "https://northwave-my.sharepoint.com/:t:/g/personal/tijme_gommers_northwave_nl/ET2MwgudOYVLtActeZcuC14BQRpYRy_cSeVcF0OS8NexhQ?e=Jl4RjB&download=1"

FR (French), 96MB (download)

wget -O frwiki-2021-12-09 "https://northwave-my.sharepoint.com/:t:/g/personal/tijme_gommers_northwave_nl/EZYCVX6ldrFIrBYcYhlqYA0BYmC34oN4HAemDpk1jPc_wA?e=sM5NZQ&download=1"

NL (Dutch), 58MB (download)

wget -O nlwiki-2020-05-05 "https://northwave-my.sharepoint.com/:t:/g/personal/tijme_gommers_northwave_nl/EVag_OlaZLZCrV2aYVpejmUBA0Q52aeei4wYW1mL8X3UUw?e=owwMab&download=1"

PT (Portuguese), 49MB (download)

wget -O ptwiki-2021-02-24 "https://northwave-my.sharepoint.com/:t:/g/personal/tijme_gommers_northwave_nl/EaoW0iwbv5lEsvlJe4pYSCIBkjuAKSC_byDWRRjAL0z6TQ?e=73dWNR&download=1"

Parsing

Listing Wikipedia databases

List all available databases:

./wikiraider.py list

Search for a specific database (based on language code):

./wikiraider.py list -s EN

If your preferred database is not listed, Wikipedia might be in the middle of exporting backups. Check the backup index to see if any backup exports are currently running.

Parsing a Wikipedia database

Parse the Dutch Wikipedia database:

./wikiraider.py parse -u https://dumps.wikimedia.org/nlwiki/20200401

Parsing a database takes a while and requires quite some processing power and memory. WikiRaider is multi-threaded and, for performance, loads all words into memory as a hashset.
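The hashset approach can be illustrated with a minimal Python sketch. Note that this is illustrative only: the function and variable names below are hypothetical and are not WikiRaider's actual internals.

```python
# Illustrative sketch of wordlist deduplication via an in-memory set.
# A set gives O(1) membership checks, so every word is stored exactly
# once no matter how often it appears across articles.
words = set()

def collect_words(text):
    """Split a chunk of article text and add each word to the set."""
    for word in text.split():
        words.add(word)

collect_words("de kat zat op de mat")
collect_words("de hond zat naast de kat")

print(sorted(words))  # ['de', 'hond', 'kat', 'mat', 'naast', 'op', 'zat']
```

The trade-off is memory: the whole vocabulary must fit in RAM, which is why parsing the larger Wikipedia databases needs a well-equipped machine.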

Cracking

NTLM

Let's say you want to crack NTLM hashes from an NTDS file. Using your WikiRaider wordlist, you can run the following command on dump.ntds. A ruleset (OneRuleToRuleThemAll) is used to mangle the words into password candidates.

hashcat -m HASH_MODE NTDS_DUMP WORDLIST -r RULESET -vvv

hashcat -m 1000 dump.ntds -vvv nlwiki-2020-05-05.txt -r OneRuleToRuleThemAll.rule

Limitations

Currently only words that match the regex [A-zÀ-ú]+ are gathered, because I'm not familiar with languages outside of this alphabet space. A future release of WikiRaider should provide options to parse words outside of this alphabet space. If you have a proper solution, feel free to contribute.
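A quick Python check shows what this character class actually covers (a sketch to clarify the limitation, not WikiRaider code). Note that the ASCII range A-z also spans the punctuation characters between Z and a (code points 91–96, i.e. [ \ ] ^ _ `), while scripts such as Cyrillic fall entirely outside both ranges.

```python
import re

# The class [A-zÀ-ú] covers two code-point ranges:
#   A (0x41) .. z (0x7A)  -- includes [ \ ] ^ _ ` between Z and a
#   À (0xC0) .. ú (0xFA)  -- most Latin-1 accented letters
pattern = re.compile(r"[A-zÀ-ú]+")

print(pattern.findall("héllo_wörld"))  # ['héllo_wörld'] -- underscore matches too
print(pattern.findall("привет"))       # [] -- Cyrillic is outside both ranges
```

This is why wordlists can currently only be generated for languages written in (accented) Latin script.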

Issues

Issues and feature requests can be reported via the GitHub issue tracker. Please make sure your issue or feature request has not already been reported by someone else before submitting a new one.

License

WikiRaider is open-sourced software licensed under the MIT license.