
DS405

Optimization Methods

Online
Jan 11, 2021 - Jan 29, 2021
This course centers on mathematical modeling and hands-on implementation of and experimentation with optimization algorithms, mostly for machine learning models, supported by in-class discussions.

Faculty

Alex Dainiak

Associate Professor at Moscow Institute of Physics and Technology

Course length

3 weeks

Duration

3 hours
per day

Total hours

45 hours

Credits

6 ECTS

Language

English

Course type

Online

Fee for single course

€1500

Fee for degree students

€750

Skills you’ll learn

Algorithms, Machine Learning, Mathematical Modeling, Data Science, Optimization Methods

Overview

Optimization (meaning here mathematical optimization, not code optimization) powers practically all of modern machine learning and goes well beyond it. After you define a model in machine learning, you tune it to the data at hand. Mathematically, this usually boils down to finding the minimum of the model's loss function.

Even if you do not implement optimization algorithms in your daily analyst's routine, it is a good idea to know what goes on under the hood when you fit your model. That way, you can make informed decisions about the parameters of the optimization algorithm.
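As a small illustration of what "finding the minimum of the loss function" looks like in code, here is a minimal gradient-descent sketch for one-variable least-squares regression. This is a hypothetical example for orientation, not course material; the learning rate `lr` is exactly the kind of algorithm parameter the course helps you choose wisely.

```python
import numpy as np

# Toy data: y is roughly 3x plus a little noise
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 3.0 * x + rng.normal(scale=0.1, size=100)

w = 0.0    # single weight, no intercept, for simplicity
lr = 0.1   # learning rate: a parameter you must set by an informed choice

for _ in range(200):
    # Gradient of the mean squared error loss with respect to w
    grad = 2.0 * np.mean((w * x - y) * x)
    w -= lr * grad

print(w)  # close to 3.0, the slope used to generate the data
```

Too large a learning rate makes the iteration diverge; too small a one makes it crawl. Understanding why is one of the course's central themes.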

Learning highlights

  • The course’s main goal is to give learners a working knowledge of the optimization approaches and algorithms that account for most of the computation time in machine learning.
  • With this knowledge, learners can make informed decisions when choosing a model class, an optimization strategy, or even a manual implementation.

Course outline

15 classes

Dive into the details of the course and get a sense of what each class will cover.
Class 1 (Monday): Recap of math fundamentals: convexity, convergence, multivariate functions and derivatives, matrix computations.

Class 2 (Tuesday): Flavors of optimization: convex/continuous vs. discrete optimization, with example problems of each kind.

Class 3 (Wednesday): Traditional convex optimization. ML application: linear regression. Gradient descent.

Class 4 (Thursday): Second-order methods.

Class 5 (Friday): Using regularization to enable optimization and improve problem formulation; implementing regularization in regression and matrix decompositions.

Class 6 (Monday): Variations on gradient descent: gradient descent with momentum, stochastic gradient descent.

Class 7 (Tuesday): Implementing SGD and its variants in machine learning applications.

Class 8 (Wednesday): Mid-course test and reviews.

Class 9 (Thursday): Non-convex stochastic optimization.

Class 10 (Friday): Heuristic approaches to general optimization. Local search. The gradient-free Nelder-Mead method.

Class 11 (Monday): Large-scale and decentralized optimization: challenges and approaches.

Class 12 (Tuesday): Linear programming. Modeling with linear programs.

Class 13 (Wednesday): Network flow optimization problems and applications. Modeling practice.

Class 14 (Thursday): Review and practice.

Class 15 (Friday): Final test and reviews.

Prerequisites

Learners are expected to have reasonable maturity in the basics of higher math: asymptotic notation, linear algebra, convergence, convex sets. (Do not worry if you have forgotten some of these; we will have a brief recap in class!) We will not, however, cover the basic algorithms of machine learning in detail, so a fundamental understanding of these is mandatory.

Methodology

The course centers on mathematical modeling and hands-on implementation of and experimentation with optimization algorithms, mostly for machine learning models, supported by in-class discussions.

Grading

The final grade will be composed of the following criteria:
25% - Daily in-class short quizzes
50% - Implementation of algorithms (partially in class, mostly homework)
25% - Midterm + Final Examinations

Faculty

Alex Dainiak

Associate Professor at Moscow Institute of Physics and Technology

Alex was born in Moscow in 1985. His first encounter with programming happened in 1998 in a Pascal programming circle, and it was love at first sight (or, better said, at first line of code).

Alex has been teaching math and programming since graduating from Moscow State University.

See full profile

Apply for this course

Snap up your chance to enroll before all spaces fill up.

Optimization Methods

by Alex Dainiak

Total hours

45 hours

Dates

Jan 11 - Jan 29, 2021

Fee for single course

€1500

Fee for degree students

€750

How to secure your spot

Complete the form below to kickstart your application

Schedule your Harbour.Space interview

If successful, get ready to join us on campus

FAQ

Will I receive a certificate after completion?

Yes. Upon completion of the course, you will receive a certificate signed by the director of the program your course belongs to.

Do I need a visa?

This depends on your case. Please check with the Spanish or Thai consulate in your country of residence about visa requirements. We will do our part to provide you with the necessary documents, such as the Certificate of Enrollment.

Can I get a discount?

Yes. The easiest way to enroll in a course at a discounted price is to register for multiple courses. Registering for multiple courses will reduce the cost per individual course. Please ask the Admissions Office for more information about the other kinds of discounts we offer and what you can do to receive one.