Introduction to Data Science 1MS041
Individual SageMath Jupyter .ipynb Notebooks
- 00. Introduction
- 01. BASH Crash Course
- 02. Numbers, Strings, Booleans and Sets
- 03. Map, Function, Collection, and Probability
- 04. Conditional Probability, Random Variables, Loops and Conditionals
- 05. Random Variables, Expectations, Data, Statistics, Arrays and Tuples, Iterators and Generators
- 06. Data and Statistics: New Zealand Earthquakes, 2018 Swedish National Election and Pubs in Open Street Maps of DL & SE
- 07. Modular Arithmetic, Linear Congruential Generators, and Pseudo-Random Numbers
- 08. Pseudo-Random Numbers, Simulating from Some Discrete and Continuous Random Variables
- 09. Estimation, Likelihood, Maximum Likelihood Estimators and Regressions
- 10. Convergence of Limits of Random Variables, Confidence Set Estimation and Testing
- 10c. Concentration Inequalities
- 11. Non-parametric Estimation and Testing
- 12. Linear Regression
- 13. Markov Chains and Random Structures
- 14. Supervised Learning & what is machine learning?
- 15. Supervised learning continued…
- 16. High-Dimensional Space
- 17. Singular value decomposition
Individual Auto-graded Assignment Preparation
- Assignment 1 assesses comprehension of the lecture companion SageMath-Kernel (9.1+) Jupyter notebooks 00.ipynb, …, 05.ipynb
- Assignment 2 assesses comprehension of the lecture companion SageMath-Kernel (9.1+) Jupyter notebooks 06.ipynb, …, 12.ipynb
Starting package
- Download the Starting package with all the notebooks so far (latest update Wed Oct 6 22:15-ish hours UTC 2021)
- Unzip this into a folder that you will use as the base folder
- Whenever you download the next, latest, or updated lectures as ipynb files, put them in the same place as 00.ipynb and 01.ipynb; this way all paths will be the same for all of us (see the sketch below).
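If you prefer to script this step, here is a minimal Python sketch of the idea, not part of the course material: the base folder name, the download location, and the zip file name are all placeholders you should adapt, but it shows how unzipping into one base folder keeps every notebook next to 00.ipynb and 01.ipynb.

```python
# Minimal sketch: keep all course notebooks in one base folder so that
# relative paths are identical for everyone. Folder and file names below
# are assumptions for illustration, not the official ones.
import zipfile
from pathlib import Path

base = Path.home() / "1MS041"                      # hypothetical base folder
base.mkdir(exist_ok=True)

# Extract the starting package into the base folder (zip name is assumed).
starting_package = Path.home() / "Downloads" / "starting_package.zip"
with zipfile.ZipFile(starting_package) as z:
    z.extractall(base)

# Any later notebook download should also be saved into `base`,
# right next to 00.ipynb and 01.ipynb.
for nb in sorted(base.glob("*.ipynb")):
    print(nb.name)
```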