
Miscellaneous Resources

MIT Introduction to Deep Learning : http://introtodeeplearning.com/index.html

Another Cool Deep Learning Resource : http://neuralnetworksanddeeplearning.com/

Must-read book on Deep Learning (free HTML edition) : http://www.deeplearningbook.org/

Deep Learning Specialization by Andrew Ng. It consists of 5 courses; search for each and enroll individually if you want to audit all 5 for free. : https://www.coursera.org/specializations/deep-learning

  • Natural Language Processing (take the courses at your own pace)

Introduction to Natural Language Processing UMichigan : http://academictorrents.com/details/78515f90de063ffc144be5e7e726c03849b4e0ed

Natural Language Processing by Stanford : http://academictorrents.com/details/d2c8f8f1651740520b7dfab23438d89bc8c0c0ab

Data Preprocessing :

Numpy : (https://www.youtube.com/watch?v=rvY0MskPps0) (https://www.youtube.com/watch?v=P_3MyPMXN0Y)

Pandas: (https://www.youtube.com/watch?v=Iqjy9UqKKuo&list=PLQVvvaa0QuDc-3szzjeP6N6b0aDrrKyL-) (https://www.youtube.com/watch?v=yzIMircGU5I&list=PL5-da3qGB5ICCsgW1MxlZ0Hq8LL5U3u9y)

Data Visualization : (https://www.youtube.com/watch?v=q7Bo_J8x_dw&list=PLQVvvaa0QuDfefDfXb9Yf0la1fPDKluPF)

➤ Linear Algebra (Matrices, Vectors, Eigenvalues/Eigenvectors, Linear Transformations) Essence of Linear Algebra - https://lnkd.in/gMzkkup

➤ Basic Calculus (Derivatives & Integrals) Essence of Calculus - https://lnkd.in/gDg4Nsz

➤ Optimization (Gradient Algorithms & Objective Functions) Introduction to Optimization - https://lnkd.in/g_e9sJu

➤ Inferential Statistics (Distributions, CLT, Hypothesis Testing, Errors, ANOVA, Chi-Square, T-Test) Practical Guide to Inferential Stats - https://lnkd.in/gbh3aRj

➤ Probability Theory (Random Variables, Types of Distributions, Sampling, CI) Basics of Probability - https://lnkd.in/gf6q8FN

➤ Graph Theory (Trees, Nodes, Edges) Gentle Intro to Graph Theory - https://lnkd.in/gYUgBhA

➤ Data Structures (Algorithms, Big-O, Sorting, Time Complexity) Data Scientists Guide to Data Structures & Algorithms - https://lnkd.in/gHZEw3d

Things you must know

Linear Algebra

Principal Component Analysis (PCA), Singular Value Decomposition (SVD), Eigen decomposition of a matrix, LU Decomposition, QR Decomposition/Factorization, Symmetric Matrices, Orthogonalization & Orthonormalization, Matrix Operations, Projections, Eigenvalues & Eigenvectors, Vector Spaces and Norms are needed for understanding the optimization methods used for machine learning.
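As a small sketch of how these linear-algebra tools show up in practice, here is PCA computed via SVD on the centered data matrix. The data is random and purely illustrative; the singular values come back sorted, so the leading rows of Vt are the principal directions:

```python
import numpy as np

# Hypothetical data: 6 samples, 3 features (for illustration only)
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 3))

# Center the data: PCA requires zero-mean columns
Xc = X - X.mean(axis=0)

# SVD: Xc = U @ diag(S) @ Vt; rows of Vt are the principal directions,
# and S holds the singular values in descending order
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Project onto the top-2 principal components
Z = Xc @ Vt[:2].T
print(Z.shape)  # (6, 2)
```

The same decomposition gives the eigendecomposition of the covariance matrix for free: the eigenvalues of `Xc.T @ Xc / (n - 1)` are `S**2 / (n - 1)`.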

Probability Theory and Statistics

Some of the fundamental Statistical and Probability Theory needed for ML are Combinatorics, Probability Rules & Axioms, Bayes’ Theorem, Random Variables, Variance and Expectation, Conditional and Joint Distributions, Standard Distributions (Bernoulli, Binomial, Multinomial, Uniform and Gaussian), Moment Generating Functions, Maximum Likelihood Estimation (MLE), Prior and Posterior, Maximum a Posteriori Estimation (MAP) and Sampling Methods.
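To make Maximum Likelihood Estimation concrete, here is a minimal sketch for a Bernoulli parameter: the closed-form MLE is just the sample mean, which a brute-force scan over candidate values confirms. The coin-flip data is made up for illustration:

```python
# MLE for a Bernoulli parameter p: the likelihood L(p) = p^k * (1-p)^(n-k)
# is maximized at p_hat = k / n, the sample mean of the 0/1 outcomes.
samples = [1, 0, 1, 1, 0, 1, 1, 0]  # hypothetical coin flips

n = len(samples)
k = sum(samples)
p_hat = k / n
print(p_hat)  # 0.625

# Sanity check: scan a grid of candidate p values and confirm the
# closed-form estimate maximizes the likelihood.
candidates = [i / 1000 for i in range(1, 1000)]
best = max(candidates, key=lambda p: p**k * (1 - p)**(n - k))
print(best)  # 0.625
```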

Multivariate Calculus

Some of the necessary topics include Differential and Integral Calculus, Partial Derivatives, Vector-Valued Functions, Directional Gradient, Hessian, Jacobian, Laplacian and Lagrangian Multipliers.
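Partial derivatives and gradients can be checked numerically with central differences, a standard trick for verifying hand-derived gradients. The function below is a made-up example with an easy analytic gradient:

```python
# Numerical partial derivatives via central differences.
# f is a hypothetical function of two variables: f(x, y) = x^2*y + 3y,
# so the analytic gradient is (2xy, x^2 + 3).
def f(x, y):
    return x**2 * y + 3 * y

def grad_f(x, y, h=1e-6):
    dfdx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    dfdy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return dfdx, dfdy

# At (2, 1) the analytic gradient is (4, 7)
print(grad_f(2.0, 1.0))  # approximately (4.0, 7.0)
```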

Algorithms and Complex Optimizations

This is important for understanding the computational efficiency and scalability of our Machine Learning Algorithms and for exploiting sparsity in our datasets. Knowledge of data structures (Binary Trees, Hashing, Heap, Stack etc.), Dynamic Programming, Randomized & Sublinear Algorithms, Graphs, Gradient/Stochastic Descent and Primal-Dual methods is needed.
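Gradient descent, the workhorse mentioned above, fits in a few lines. This sketch minimizes a one-dimensional quadratic; the learning rate and iteration count are illustrative choices, not tuned values:

```python
# Plain gradient descent on f(w) = (w - 3)^2, whose minimum is at w = 3.
def gradient(w):
    return 2 * (w - 3)  # derivative of (w - 3)^2

w = 0.0    # starting point (arbitrary)
lr = 0.1   # learning rate (illustrative)
for _ in range(100):
    w -= lr * gradient(w)

print(round(w, 4))  # 3.0
```

Stochastic gradient descent follows the same update rule but estimates the gradient from a random mini-batch of the data at each step.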

Computer vision

Introduction to Computer Vision; Image Formation and Filtering (Light and Color, Image Filtering, Thinking in Frequency); Feature Detection and Matching (Edge Detection, Interest Points and Corners, Local Image Features, Feature Matching); Model Fitting
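Edge detection via image filtering reduces to sliding a small kernel over the image. Here is a hand-rolled sketch applying a Sobel kernel to a tiny synthetic image with a vertical edge; a real pipeline would use OpenCV or SciPy instead of explicit loops:

```python
import numpy as np

# Tiny synthetic image: dark left half, bright right half (a vertical edge)
img = np.zeros((5, 6))
img[:, 3:] = 1.0

# Sobel kernel that responds to horizontal intensity change (vertical edges)
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)

# "Valid" 2-D filtering: slide the 3x3 kernel over every interior position
h, w = img.shape
out = np.zeros((h - 2, w - 2))
for i in range(h - 2):
    for j in range(w - 2):
        out[i, j] = np.sum(img[i:i+3, j:j+3] * sobel_x)

print(out)  # strong response only at the columns straddling the edge
```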

Text mining

Uses, issues and challenges, Tokenization, Text pre-processing, Document Vectors
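Tokenization and document vectors can be sketched in a few lines of pure Python. The two documents are made up; real pre-processing would also strip punctuation, remove stop words, and perhaps weight counts by TF-IDF:

```python
# Minimal tokenization and bag-of-words document vectors.
docs = ["the cat sat on the mat", "the dog sat"]  # hypothetical corpus

# Tokenize: lowercase + whitespace split
tokens = [d.lower().split() for d in docs]

# Build a sorted vocabulary over all documents
vocab = sorted({t for doc in tokens for t in doc})

# Each document becomes a count vector over the vocabulary
vectors = [[doc.count(term) for term in vocab] for doc in tokens]

print(vocab)    # ['cat', 'dog', 'mat', 'on', 'sat', 'the']
print(vectors)  # [[1, 0, 1, 1, 1, 2], [0, 1, 0, 0, 1, 1]]
```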

Regression, Classification and clustering

K-NN, Naïve Bayes, decision trees, k-means, DBSCAN, training and testing (cross-validation, performance evaluation methods)
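K-NN is the easiest of these to implement from scratch, which makes it a good first exercise. A minimal sketch with made-up 2-D training points:

```python
import math
from collections import Counter

# Hypothetical training set: (point, label) pairs in two well-separated clusters
train = [((1.0, 1.0), "a"), ((1.2, 0.8), "a"),
         ((4.0, 4.0), "b"), ((4.2, 3.9), "b")]

def knn_predict(x, k=3):
    # Sort training points by Euclidean distance to x, take the k nearest,
    # and return the majority label among them.
    nearest = sorted(train, key=lambda p: math.dist(x, p[0]))[:k]
    labels = [label for _, label in nearest]
    return Counter(labels).most_common(1)[0][0]

print(knn_predict((1.1, 0.9)))  # a
print(knn_predict((4.1, 4.0)))  # b
```

Cross-validation would then repeat this over held-out folds of the data to estimate accuracy rather than trusting a single train/test split.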

Some other popular domains

Measuring similarity using various similarity measures (information retrieval), Market Basket Analysis, Web Mining, scraping, crawling, regular expressions, the Semantic Web and Topic Modelling
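The most common similarity measure in information retrieval is cosine similarity, which compares document vectors by angle rather than magnitude. A self-contained sketch:

```python
import math

# Cosine similarity: dot(u, v) / (|u| * |v|).
# 1.0 means same direction, 0.0 means orthogonal (no shared terms).
def cosine_similarity(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

print(cosine_similarity([1, 0, 1], [1, 0, 1]))  # 1.0
print(cosine_similarity([1, 0], [0, 1]))        # 0.0
```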

Ready to apply your machine-learning knowledge?

Grab a data set and start solving a problem.

👉 Looking for a first project? Check out these 3:

• Iris classification - https://lnkd.in/g8_Gx_b

• Titanic survival - https://lnkd.in/gsbu3yG

• MNIST digit recognition - https://lnkd.in/gCejAEU

👉 Ready to go more advanced?

• Check out one of the current Kaggle challenges and get started - https://lnkd.in/gyZDbag

👉 Got stuck?

• Grab a buddy and start working through the project together

• Draw out a visual map of what you've done and where you're stuck (trust me, this helps)

• Focus hard for one hour per day, then come back to the same problem the next day

Don't get bogged down thinking that you need to achieve mastery before getting started. Start today and take one small step toward improving each day - you'll have more fun and make more progress that way. I promise :)

So, how to learn?

"Read 500 pages every day. That’s how knowledge works. It builds up, like compound interest. All of you can do it, but I guarantee not many of you will do it." — Warren Buffett

  • You can go wide or you can go deep.
  • Don't chase the next shiny thing.
  • Follow the relevant people.
  • If you’re just starting out as a professional developer, focus first on the stuff that won’t change.

Pro tips to be one of the best?

  • Programming Deliberately vs Programming by Coincidence
  • Read all of the docs, sometimes the source code
  • Never commit code you can't explain
  • Search your mind deliberately → Google → GitHub Issues → post to Stack Overflow → ask a co-worker
  • Debugging Deliberately
  • Don't fix it! Reproduce It!

Be Happy, Humble, Honest, Hungry!!