Project Arctic Shift
Making Reddit data accessible to researchers, moderators and everyone else.
Interact with the data through large dumps, an API or a web interface.
Downloads
All download links are organized here. Once a new dump is available, it will also be added to the releases page.
Alternatively, for downloading the data of individual users or smaller subreddits, you can use this tool.
For information on how the data was collected and modified, see here.
API
Depending on your use case, you can try my (limited) API. For manual queries, you can use this tool.
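For illustration, a minimal query sketch in Python follows. The base URL, endpoint path, query parameters and response shape are assumptions for this example, not taken from this README; check the API documentation for the actual interface.

import json
import urllib.parse
import urllib.request

# Assumed endpoint and parameters -- verify against the API documentation.
BASE_URL = "https://arctic-shift.photon-reddit.com/api/posts/search"
params = {"subreddit": "AskReddit", "limit": 10}

url = f"{BASE_URL}?{urllib.parse.urlencode(params)}"
with urllib.request.urlopen(url) as response:
    result = json.loads(response.read())

# The response is assumed to wrap the matching posts in a "data" list.
for post in result.get("data", []):
    print(post.get("title"))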
Usage
First, download one or more dumps from the links above.
Generally, I'd recommend working with the compressed files instead of unpacking them, unless of course you have seemingly infinite disk space.
The helper scripts in this repository let you get started quickly. For working with zst_blocks files through a CLI, go to the zst_blocks repository.
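To illustrate why unpacking is unnecessary, here is a minimal sketch, independent of the helper scripts, that streams a newline-delimited .zst dump with the zstandard library (installed in the steps below). The file name and field names are placeholders for illustration.

import io
import json
import zstandard

def iter_rows_zst(path):
    # Stream-decompress a newline-delimited JSON .zst dump without writing
    # the decompressed data to disk.
    with open(path, "rb") as fh:
        # Some dumps are compressed with a long window, hence the large max_window_size.
        dctx = zstandard.ZstdDecompressor(max_window_size=2**31)
        with dctx.stream_reader(fh) as reader:
            for line in io.TextIOWrapper(reader, encoding="utf-8"):
                yield json.loads(line)

# "RS_2024-01.zst" is a placeholder -- substitute a dump you actually downloaded.
for row in iter_rows_zst("RS_2024-01.zst"):
    print(row.get("subreddit"), row.get("title"))
    break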
To use the helper scripts (you need at least Python 3.10):
- Clone this repository and its submodules
git clone --recursive https://github.com/ArthurHeitmann/arctic_shift.git
cd arctic_shift
- Install the zstandard library
pip install zstandard
- Open scripts/processFiles.py in your editor. That script can process .zst_blocks, .zst and newline-delimited .jsonl files.
- Enter the path to a file or folder in fileOrFolderPath (since it is a raw string, you don't have to escape backslashes). If you enter a folder, all files in that folder will be processed.
- Add your code to the processFile function (see the sketch after this list).
- Run the file and be (very) patient.
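The exact hook names and signatures in scripts/processFiles.py may differ, but as a rough, hypothetical sketch of the kind of per-row code you could plug in there (assuming each row arrives as a dict parsed from one JSON line), counting rows per subreddit might look like this:

from collections import Counter

subreddit_counts = Counter()

def process_row(row: dict) -> None:
    # Hypothetical per-row handler: count how many rows each subreddit contributes.
    # The actual function name and signature in scripts/processFiles.py may differ.
    subreddit_counts[row.get("subreddit", "<unknown>")] += 1

def print_summary() -> None:
    # After processing finishes, print the ten most common subreddits.
    for subreddit, count in subreddit_counts.most_common(10):
        print(f"{subreddit}: {count}")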
Contact & Removal requests
Removal requests and general support requests can be submitted here. To check if your data is in the dataset, search for your username here.
Removal forms of other archives: Pushshift | PullPush | potentially archive.org.
If you have questions, you can DM me on Reddit or Discord (raiderbv if the link doesn't work), or email me. Alternatively, open an issue or pull request.