Data Warehousing and Data Science

5 January 2018

Andrew Ng’s Deep Learning Course

Filed under: Data Science,Machine Learning — Vincent Rainardi @ 6:13 am

I’m doing Andrew Ng’s Deep Learning Course at the moment (link). When I wrote an article about “What Machine Learning Can Be Used For” last week (link), I realised that most machine learning implementations today use deep learning: CNNs are widely used for image recognition (computer vision, to be precise), and RNNs are widely used for speech and audio. Hence my reason for studying deep learning.

But not only that. I’m in for a treat, because in this course Andrew Ng also interviews deep learning legends like Geoffrey Hinton and Pieter Abbeel. At a data science community meetup in London last month I heard Geoffrey’s name for the first time. He was mentioned because of his capsule networks concept. He is also known for backpropagation and the Boltzmann machine (link). He is a very important figure in deep learning because of his decades of contribution to DL, from the 1980s to today. He reminds me of Bill Inmon and Ralph Kimball, who have likewise contributed to data warehousing for decades, since the 1990s. Pieter Abbeel is a legend in robotics and deep reinforcement learning (link).

Hearing legends and experts being interviewed by Andrew Ng is really inspiring. Not only can we understand their inventions in a simplified way, and how they arrived at those concepts, but we also get to know the situation in the industry (the UK didn’t appreciate ML as much as Silicon Valley, for example; luckily London is now the hub of ML start-ups and funding, link, link) and the future direction of the industry. On top of that, as beginners we get very valuable advice from them, which could save us a lot of time heading in the right direction. In my opinion, when learning it is very important to have a clear direction. We do not want to learn “the wrong things” and waste years of our time.

In the last year or so Python has been the most widely used programming language in machine learning. Its scikit-learn library is the de facto standard in machine learning (it is built on NumPy, SciPy and matplotlib, which are also the de facto standards in their respective areas). Theano, CNTK and TensorFlow are the de facto standards for CNNs (used for computer vision, the most popular application of ML/DL) and indeed other areas of DL, and they all have Python APIs. Keras, another very popular ML library, is also written in Python (link) and runs on top of Theano, CNTK or TensorFlow. A large part of ML/DL is data analysis and preparation, and Pandas, the most popular tool for that, is also in Python. So today (Jan 2018) there is no competition. Unlike in 2016, when R versus Python was still heavily debated as the best language for ML, today Python is the de facto language for ML.
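To give a feel for why Keras on top of TensorFlow is so popular, here is a minimal sketch of a small feed-forward network in a few lines of Python. This is my own illustration, not course material; the layer sizes and the input dimension (20 features) are made up for the example.

    # A small binary classifier in Keras (my own sketch, made-up sizes)
    from keras.models import Sequential
    from keras.layers import Dense

    model = Sequential()
    model.add(Dense(16, activation='relu', input_dim=20))  # hidden layer
    model.add(Dense(1, activation='sigmoid'))              # binary output
    model.compile(optimizer='adam',
                  loss='binary_crossentropy',
                  metrics=['accuracy'])
    model.summary()                                        # prints the architecture

The whole network definition, compilation and summary fits on one screen, which is exactly the kind of brevity that has pulled the ML community towards Python.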

When I took Andrew Ng’s previous ML course, it used MATLAB and Octave. But this course uses Python and Jupyter notebooks from the start, and TensorFlow later on. It is a treat! The NumPy library is very easy to use and very powerful for matrix operations. The Jupyter notebook is also very easy to use. The programming exercises use scikit-learn, NumPy and matplotlib (pyplot). I look forward to using TensorFlow later on. It is the right toolset for the job, and it adds a lot of value to the CV and experience of anyone taking this course. Well done Andrew Ng and the deeplearning.ai team for choosing this toolset and moving away from Octave.
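As a taste of what I mean by NumPy being powerful for matrix operations, here is a minimal sketch (not the actual assignment code, and with made-up sizes) of the kind of vectorised computation the course teaches: one forward pass of logistic regression over all training examples at once, with no explicit loop.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    m, n_x = 5, 3                    # 5 examples, 3 features (made-up sizes)
    X = np.random.randn(n_x, m)      # each column is one training example
    w = np.zeros((n_x, 1))           # weights
    b = 0.0                          # bias

    Z = np.dot(w.T, X) + b           # shape (1, m): all examples in one matrix product
    A = sigmoid(Z)                   # activations for all m examples
    print(A.shape)                   # (1, 5)

One line of np.dot replaces what would be a nested loop in plain code, which is why the assignments keep stressing vectorisation.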

I should also mention that using Jupyter notebooks in this course is a lot easier than using Octave in the previous ML course. I don’t have to download the zip file, unzip it, copy it to the Octave working folder and run it on my laptop (oh, and I have to install Octave first, of course), then switch between tabs in Octave to see the output and the source code, and switch back and forth to the PDF to read the programming instructions too. It felt like I was back in the nineties with the Delphi IDE! With a Jupyter notebook it feels like I’m really in 2018. I can run the code there and then by pressing Shift-Enter, right in between the instructions and the code, and the output appears right below it. There is no switching back and forth at all. Everything is in one place. I don’t even have to install anything on my laptop. Everything is online! Amazingly simple.

And the scoring (marking) is integrated too. In the old ML course I had to get the submission code from the PDF, paste it into the Octave IDE and then type submit. Now there is no submission code. I just have to click the submit button on the Jupyter notebook. Easy! And when I go back to the Coursera pages, my scores are already there. They are well integrated.

In weeks 2 and 3 there is a little bit of calculus, i.e. derivatives with computation graphs, and for backprop. It is only a little bit though, not as much as I thought. I haven’t done calculus since university, but I didn’t encounter any issues following the calculus in weeks 2 and 3. It was only simple derivatives such as x^3 and ln(x), the sum rule, the chain rule and the derivative of the sigmoid function. That’s it. To be honest I enjoyed encountering calculus again. My background is in Engineering Physics (BEng), i.e. instrumentation and control, thermodynamics, vibration, optics, electronics and material science, so there was a lot of maths involved, particularly calculus and numerical analysis, including linear algebra.
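The sigmoid derivative used in backprop is sigma'(x) = sigma(x) * (1 - sigma(x)), which follows from the chain rule. Here is a tiny check of that formula (my own sketch, not course material), comparing the analytic derivative against a central finite difference at an arbitrary point.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    x = 0.7                                    # arbitrary test point
    analytic = sigmoid(x) * (1 - sigmoid(x))   # sigma'(x) from the chain rule
    h = 1e-6
    numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)  # central difference
    print(analytic, numeric)                   # the two agree to many decimal places

This is essentially the same gradient-checking idea the course uses to verify backprop implementations.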

The price is unbelievably cheap: it is only £36/month. The specialisation is 5 courses, and each course is 3-4 weeks long (more likely 3 weeks than 4), so about 4 months in total. Andrew Ng’s first ML course was £61 in total (about 11 weeks). I guess that is the power of Coursera: the price is next to nothing. Compare this with the usual IT training at £1,500/week. Let me repeat it, because this is probably the most important thing about the course: it is £36 per month.

The other thing I enjoy in this course is listening to Andrew Ng explain all the concepts on the whiteboard (or screen, to be precise). I am not sure why, but it is hugely entertaining, particularly the mathematics, e.g. matrix calculations, but also other concepts such as backward propagation. I also enjoy the programming assignments. Day to day in the office I mostly work with SQL, so programming in another language is refreshing for me. I did a bit of VB, Java, C# and C++, but that was a long time ago. I also did Python when working with Spotfire, but again that was a few years ago. So I actually look forward to the programming assignments. Again, I don’t know why, but I do enjoy them. It is hugely satisfying to see that the output of my program is as expected.

Overall it has been a treat and I have enjoyed doing this course. Thank you Andrew Ng, Kian and Younes for preparing and providing this course. It must have been hard work for the three of you for months, even years.

1 Comment

  1. Nice read! Looking forward doing the course too

Comment by Hennie de nooijer — 5 January 2018 @ 8:47 am

