This guide is designated to anybody with basic programming knowledge or a computer science background interested in becoming a Research Scientist with π― on Deep Learning and NLP.
You can go Bottom-Up or Top-Down both works well and it is actually crucial to know which approach suites you the best. If you are okay with studying lots of mathematical concepts without application then use Bottom-Up. If you want to go hands-on first then use the Top-Down first.
- Mathematical Foundation
- Machine Learning
- Deep Learning
- Reinforcement Learning
- Natural Language Processing
The Mathematical Foundation part is for all Artificial Intelligence branches such as Machine Learning, Reinforcement Learning, Computer Vision and so on. AI is heavily math-theory based so a solid foundation is essential.
βΎοΈ
This branch of Math is crucial for understanding the mechanism of Neural Networks which are the norm for NLP methodologies in nowadays State-of-The-Art.
Resource | Difficulty | Relevance |
---|---|---|
MIT Gilbert Strang 2005 Linear Algebra π₯ | β
β
βββ |
|
Linear Algebra 4th Edition by Friedberg π | β
β
β
β
β |
|
Mathematics for Machine Learning Book: Chapter 2 π | β
β
β
ββ |
|
James Hamblin Awesome Lecture Series π₯ | β
β
β
ββ |
|
3Blue1Brown Essence of Linear Algebra π₯ | β
ββββ |
|
Mathematics For Machine Learning Specialization: Linear Algebra π₯ | β
ββββ |
|
Matrix Methods for Linear Algebra for Gilber Strang UPDATED! π₯ | β
β
β
ββ |
Most of Natural Language Processing and Machine Learning Algorithms are based on Probability theory. So this branch is extremely important for grasping how old methods work.
Resource | Difficulty | Relevance |
---|---|---|
Joe Blitzstein Harvard Probability and Statistics Course π₯ | β
β
β
β
β
|
|
MIT Probability Course 2011 Lecture videos π₯ | β
β
β
ββ |
|
MIT Probability Course 2018 short videos UPDATED! π₯ | β
β
βββ |
|
Mathematics for Machine Learning Book: Chapter 6 π | β
β
β
ββ |
|
Probabilistic Graphical Models CMU Advanced π₯ | β
β
β
β
β
|
|
Probabilistic Graphical Models Stanford Daphne Advanced π₯ | β
β
β
β
β
|
|
A First Course In Probability Book by Ross π | β
β
β
β
β |
|
Joe Blitzstein Harvard Professor Probability Awesome Book π | β
β
β
ββ |
π
Resource | Difficulty | Relevance |
---|---|---|
Essence of Calculus by 3Blue1Brownπ₯ | β
β
βββ |
|
Single Variable Calculus MIT 2007π₯ | β
β
β
β
β |
|
Strang's Overview of Calculusπ₯ | β
β
β
β
β |
|
MultiVariable Calculus MIT 2007π₯ | β
β
β
β
β
|
|
Princeton University Multivariable Calculus 2013π₯ | β
β
β
β
β |
|
Calculus Book by Stewart π | β
β
β
β
β |
|
Mathematics for Machine Learning Book: Chapter 5 π | β
β
β
ββ |
π
-Resource | Difficulty | Relevance |
---|---|---|
CMU optimization course 2018π₯ | β
β
β
β
β
|
|
CMU Advanced optimization courseπ₯ | β
β
β
β
β
|
|
Stanford Famous optimization course π₯ | β
β
β
β
β
|
|
Boyd Convex Optimization Book π | β
β
β
β
β
|
Considered a fancy name for Statistical models where its main goal is to learn from data for several usages. It is considered highly recommended to master these statistical techniques before Research as most of research is inspired by most of the Algorithms.
One of the major breakthroughs in the field of intersection between Artificial Intelligence and Computer Science. It lead to countless advances in technology and considered the standard way to do Artificial Intelligence.
It is a sub-field of AI which focuses on learning by observation/rewards.
It is a sub-field of AI which focuses on the interpretation of Human Language.
In this section, I am going to list the most influential papers that help people who want to dig deeper into the research world of NLP to catch up.
Paper | Comment |
---|