A Solid Foundation for a Rewarding Career

Dear friends,

Years ago, I had to choose between a neural network and a decision tree learning algorithm. I needed to pick a computationally efficient one, because we planned to apply the algorithm to a very large number of users on a limited compute budget. I went with a neural network. I hadn’t used boosted decision trees in a while, and I thought they required more computation than they actually do, so I made a bad call. Fortunately, my team quickly revised my decision, and the project was successful.
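To give a flavor of that kind of comparison, here is a minimal sketch, assuming scikit-learn is available; the synthetic data, model sizes, and timing setup are illustrative assumptions, not the original project. It simply times batch prediction for a boosted-tree model and a small neural network so you can see the inference cost for yourself.

```python
# Illustrative sketch only: compare batch-inference time of gradient-boosted
# trees vs. a small neural network on synthetic data (not the original project).
import time
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for "a very large set of users" (sizes chosen arbitrarily).
X, y = make_classification(n_samples=10_000, n_features=50, random_state=0)

models = {
    "boosted trees": GradientBoostingClassifier(n_estimators=100, max_depth=3),
    "neural network": MLPClassifier(hidden_layer_sizes=(128, 128), max_iter=50),
}

for name, model in models.items():
    model.fit(X, y)  # quick illustrative fit; not tuned for accuracy
    start = time.perf_counter()
    model.predict(X)  # batch inference over all examples
    elapsed = time.perf_counter() - start
    print(f"{name}: {elapsed * 1000:.1f} ms to score {len(X):,} examples")
```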

This experience was a lesson in the importance of learning, and continually refreshing, foundational knowledge. If I had refreshed my familiarity with boosted trees, I would have made a better decision.

Machine learning, like many technical fields, evolves as researchers build on one another's work. Some contributions have staying power and become the basis of further developments. Consequently, everything from a housing-price predictor to a text-to-image generator is built on core ideas that include algorithms (linear and logistic regression, decision trees, and so on) and concepts (regularization, optimizing a loss function, bias/variance, and the like).
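To make a couple of those concepts concrete, here is a minimal sketch, assuming scikit-learn; the dataset and regularization strength are illustrative assumptions. It fits logistic regression by minimizing a regularized loss, and the gap between training and test accuracy hints at the bias/variance trade-off.

```python
# Illustrative sketch: logistic regression fit by minimizing an L2-regularized
# loss. Dataset and hyperparameters are arbitrary, for demonstration only.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# C is the inverse regularization strength: a smaller C means a stronger L2
# penalty, trading some fit on the training set for less variance on new data.
model = LogisticRegression(C=0.1, penalty="l2", max_iter=1_000)
model.fit(X_train, y_train)

print("train accuracy:", model.score(X_train, y_train))
print("test accuracy: ", model.score(X_test, y_test))
```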

A solid, up-to-date foundation is one key to being a productive machine learning engineer. Many teams draw on these ideas in their day-to-day work, and blog posts and research papers often assume that you’re familiar with them. This shared base of knowledge is essential to the rapid progress we've seen in machine learning in recent years.

That's why I’m updating my original machine learning class as the new Machine Learning Specialization, which will be available in a few weeks.

My team spent many hours debating the most important concepts to teach. We developed extensive syllabi for various topics and prototyped course units for them. Sometimes this process helped us realize that a different topic was more important, so we cut material we had developed to focus on something else. The result, I hope, is an accessible set of courses that will help anyone master the most important algorithms and concepts in machine learning today, including deep learning but also many others, and build effective learning systems.

In that spirit, this week’s issue of The Batch explores some of our field’s most important algorithms, explaining how they work and describing some of their surprising origins. If you’re just starting out, I hope it will demystify some of the approaches at the heart of machine learning. For those who are more advanced, you’ll find lesser-known perspectives on familiar territory. Either way, I hope this special issue will help you build your intuition and give you fun facts about machine learning’s foundations that you can share with friends.

Keep learning!

Andrew
