<div align="center"> <img src="sources/images/nlp.png" width="25%"><img src="https://github.com/TheDudeThatCode/TheDudeThatCode/blob/master/Assets/Rocket.gif" width="29px">From Zero to Research Scientist full resources guide. <img src="https://github.com/TheDudeThatCode/TheDudeThatCode/blob/master/Assets/Hi.gif" width="29px">
</div>
## Guide description
This guide is designed for anybody with basic programming knowledge or a computer science background who is interested in becoming a Research Scientist with a :dart: focus on Deep Learning and NLP.
You can go Bottom-Up or Top-Down; both work well, and it is actually crucial to know which approach suits you best. If you are okay with studying lots of mathematical concepts without immediate application, go Bottom-Up. If you want to get hands-on first, go Top-Down.
## Contents:
- Mathematical Foundations
- Machine Learning
- Deep Learning
- Reinforcement Learning
- Natural Language Processing
## Mathematical Foundations:
The Mathematical Foundations part applies to all Artificial Intelligence branches, such as Machine Learning, Reinforcement Learning, Computer Vision, and so on. AI is heavily based on mathematical theory, so a solid foundation is essential.
### Linear Algebra
<details> <summary>:infinity:</summary>

This branch of Math is crucial for understanding the mechanics of Neural Networks, which are the norm in today's state-of-the-art NLP methodologies. A tiny worked illustration follows the table below.

Resource | Difficulty | Relevance |
---|---|---|
MIT Gilbert Strang 2005 Linear Algebra 🎥 | ★★☆☆☆ | |
Linear Algebra 4th Edition by Friedberg 📖 | ★★★★☆ | |
Mathematics for Machine Learning Book: Chapter 2 📖 | ★★★☆☆ | |
James Hamblin Awesome Lecture Series 🎥 | ★★★☆☆ | |
3Blue1Brown Essence of Linear Algebra 🎥 | ★☆☆☆☆ | |
Mathematics For Machine Learning Specialization: Linear Algebra 🎥 | ★☆☆☆☆ | |
Matrix Methods for Linear Algebra by Gilbert Strang UPDATED! 🎥 | ★★★☆☆ | |
</details>
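To make the connection to Neural Networks concrete, here is a minimal, self-contained sketch (my own toy example, not taken from any of the resources above) showing that a single fully connected layer is just a matrix-vector product plus a nonlinearity; all shapes and values are made up for illustration.

```python
import numpy as np

# A toy fully connected layer: y = relu(W x + b).
# The sizes (4 inputs, 3 outputs) are arbitrary illustration values.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))   # weight matrix
b = np.zeros(3)               # bias vector
x = rng.normal(size=4)        # one input example

z = W @ x + b                 # the linear-algebra core: a matrix-vector product
y = np.maximum(z, 0.0)        # ReLU nonlinearity

print(y)                      # three activations, one per output unit
```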
### Probability
<details> <summary>:atom:</summary>

Most Natural Language Processing and Machine Learning algorithms are based on probability theory, so this branch is extremely important for grasping how the classical methods work. A tiny Bayes' rule illustration follows the table below.

Resource | Difficulty | Relevance |
---|---|---|
Joe Blitzstein Harvard Probability and Statistics Course 🎥 | ★★★★★ | |
MIT Probability Course 2011 Lecture videos 🎥 | ★★★☆☆ | |
MIT Probability Course 2018 short videos UPDATED! 🎥 | ★★☆☆☆ | |
Mathematics for Machine Learning Book: Chapter 6 📖 | ★★★☆☆ | |
Probabilistic Graphical Models CMU Advanced 🎥 | ★★★★★ | |
Probabilistic Graphical Models Stanford Daphne Advanced 🎥 | ★★★★★ | |
A First Course In Probability Book by Ross 📖 | ★★★★☆ | |
Joe Blitzstein Harvard Professor Probability Awesome Book 📖 | ★★★☆☆ | |
</details>
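To see why probability underlies so many classical NLP methods, here is a tiny Bayes' rule calculation in the style of a naive spam filter; it is purely illustrative and every number below is invented.

```python
# Bayes' rule: P(spam | word) = P(word | spam) * P(spam) / P(word)
# All probabilities are made-up toy values.
p_spam = 0.3                    # prior: how often a message is spam
p_word_given_spam = 0.6         # P("free" appears | spam)
p_word_given_ham = 0.05         # P("free" appears | not spam)

p_word = p_word_given_spam * p_spam + p_word_given_ham * (1 - p_spam)
p_spam_given_word = p_word_given_spam * p_spam / p_word

print(f"P(spam | 'free') = {p_spam_given_word:.3f}")   # roughly 0.84
```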
### Calculus
<details> <summary>:triangular_ruler:</summary>

Resource | Difficulty | Relevance |
---|---|---|
Essence of Calculus by 3Blue1Brown 🎥 | ★★☆☆☆ | |
Single Variable Calculus MIT 2007 🎥 | ★★★★☆ | |
Strang's Overview of Calculus 🎥 | ★★★★☆ | |
MultiVariable Calculus MIT 2007 🎥 | ★★★★★ | |
Princeton University Multivariable Calculus 2013 🎥 | ★★★★☆ | |
Calculus Book by Stewart 📖 | ★★★★☆ | |
Mathematics for Machine Learning Book: Chapter 5 📖 | ★★★☆☆ | |
</details>
### Optimization Theory
<details> <summary>:chart_with_upwards_trend:</summary>

Resource | Difficulty | Relevance |
---|---|---|
CMU optimization course 2018 🎥 | ★★★★★ | |
CMU Advanced optimization course 🎥 | ★★★★★ | |
Stanford Famous optimization course 🎥 | ★★★★★ | |
Boyd Convex Optimization Book 📖 | ★★★★★ | |
</details>
## Machine Learning
Often considered a fancy name for statistical models whose main goal is to learn from data for a variety of uses. It is highly recommended to master these statistical techniques before starting research, as much of current research builds on these algorithms.
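As a minimal sketch of what "learning from data" means (my own toy example with synthetic data, not a recipe from any particular resource), here is a straight line fitted by ordinary least squares:

```python
import numpy as np

# Synthetic data drawn from y = 2x + 1 plus noise (values invented for illustration).
rng = np.random.default_rng(42)
x = rng.uniform(0, 10, size=100)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=100)

# "Learning" = estimating the slope and intercept from the data alone.
X = np.column_stack([x, np.ones_like(x)])        # design matrix [x, 1]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)     # ordinary least squares

print(f"learned slope = {coef[0]:.2f}, intercept = {coef[1]:.2f}")  # close to 2 and 1
```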
## Deep Learning
One of the major breakthroughs at the intersection of Artificial Intelligence and Computer Science. It has led to countless advances in technology and is now considered the standard way to do Artificial Intelligence.
## Reinforcement Learning
It is a sub-field of AI that focuses on learning through observations and rewards.
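To illustrate reward-driven learning in the simplest possible setting, here is a toy two-armed bandit with an epsilon-greedy rule; the payoff probabilities and constants are invented for this sketch.

```python
import random

# Two-armed bandit: the agent never sees the true payoff probabilities,
# only the rewards it receives after each pull.
true_win_prob = [0.3, 0.7]     # hidden from the agent (toy values)
estimates = [0.0, 0.0]         # running reward estimate per arm
counts = [0, 0]
epsilon = 0.1                  # exploration rate

random.seed(0)
for _ in range(5000):
    # Mostly exploit the best-looking arm, occasionally explore at random.
    if random.random() < epsilon:
        arm = random.randrange(2)
    else:
        arm = max((0, 1), key=lambda a: estimates[a])
    reward = 1.0 if random.random() < true_win_prob[arm] else 0.0
    counts[arm] += 1
    estimates[arm] += (reward - estimates[arm]) / counts[arm]   # incremental mean

print(estimates)   # estimates roughly track [0.3, 0.7]; the agent prefers arm 1
```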
## Natural Language Processing
It is a sub-field of AI that focuses on the interpretation of human language.
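As a tiny illustration of turning raw language into something an algorithm can work with, here is a bag-of-words count over two made-up sentences (purely my own toy example):

```python
from collections import Counter

# Toy corpus; both sentences are invented for illustration.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the log",
]

# Bag-of-words: represent each sentence by how many times each vocabulary word occurs.
vocab = sorted({word for sentence in corpus for word in sentence.split()})
vectors = [[Counter(sentence.split())[word] for word in vocab] for sentence in corpus]

print(vocab)     # shared vocabulary
print(vectors)   # one count vector per sentence
```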
## Must Read NLP Papers:
In this section, I list the most influential papers so that anyone who wants to dig deeper into the research world of NLP can catch up.
Paper | Comment |
---|---|