Learn R and Python in Parallel

Perhaps a major reason is an existential crisis. Feedback from readers is another important reason. I submitted a git repo containing three chapters of this book in PDF format to Hacker News, and surprisingly the repo got 500 stars in a week. I also received a few emails expressing thanks and interest in more chapters.

There has been considerable debate over choosing R vs. Python for Data Science. Based on my limited knowledge and experience, both R and Python are great languages and are worth learning, so why not learn them together?

Besides the side-by-side comparison of these two popular languages used in Data Science, this book also focuses on the translation from mathematical models to code. In the book, readers will find from-scratch implementations of some important algorithms, such as maximum likelihood estimation, inversion sampling, copula simulation, simulated annealing, bootstrapping, linear regression (lasso/ridge regression), logistic regression, gradient boosting trees, etc.
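To give a taste of the from-scratch style, here is a minimal Python sketch of one of the listed topics, inversion sampling: drawing exponential random variables by inverting the CDF. The function name and parameters are my own illustration, not the book's actual code.

```python
import math
import random


def sample_exponential(rate, n, seed=42):
    """Draw n samples from Exponential(rate) via inversion sampling.

    The exponential CDF is F(x) = 1 - exp(-rate * x), whose inverse is
    F^{-1}(u) = -log(1 - u) / rate. So if U ~ Uniform(0, 1), then
    -log(1 - U) / rate follows an exponential distribution.
    """
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / rate for _ in range(n)]


samples = sample_exponential(rate=2.0, n=100_000)
mean = sum(samples) / len(samples)
# The sample mean should be close to the true mean 1 / rate = 0.5.
print(round(mean, 2))
```

The same inverse-CDF idea carries over directly to R with `runif` in place of `rng.random()`, which is exactly the kind of parallel translation the book walks through.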

The code can be found at this git repo. If you have any ideas to share or find any errors in the book, please contact me directly via email at setseed2016@gmail.com.

Interested in a physical copy? It's available from Amazon or Routledge.

- Introduction to R/Python Programming
  - calculator, variable & type, functions, control flows, some built-in data structures, object-oriented programming
- More on R/Python Programming
  - write & run R/Python scripts, debugging, benchmarking, vectorization, embarrassing parallelism, evaluation strategy, speeding up with C/C++, functional programming
- data.table and pandas
  - SQL, introduction to data.table and pandas, indexing & selecting data, add/remove/update, group by, join
- Random Variables & Distributions
  - sampling, distribution fitting, joint distribution/copula simulation, confidence interval, hypothesis testing
- Linear Regression
  - basics of linear regression, linear hypothesis testing, ridge regression
- Optimization in Practice
  - convexity, gradient descent, root-finding, general-purpose minimization tools, linear programming, simulated annealing
- Machine Learning – A Gentle Introduction
  - learning paradigm, universal approximation, overfitting, gradient boosting machine, reinforcement learning, computational differentiation