Course Description: This course introduces fundamental machine learning (ML) techniques with a focus on solving problems in photonics and inverse photonic design. Students will learn how to apply supervised, unsupervised, and deep learning methods to model, optimize, and design photonic devices efficiently. The course blends theoretical foundations with hands-on coding exercises using real-world photonic datasets.
Prerequisites: Consent of the instructor. You should be familiar with either MATLAB or Python programming.
Learning Outcomes: At the end of the semester, students should be able to
- Apply exploratory data analysis to a given photonic dataset, evaluate whether it is skewed, and, if so, apply appropriate transforms before training and testing ML algorithms (a short sketch follows this list),
- Understand the main differences among fundamental ML methods, choose the appropriate ones to solve basic forward problems in photonics, implement them, and evaluate their accuracy,
- Create physics-inspired neural networks to solve both forward and inverse problems in photonics,
- Understand how adjoint methods and generative adversarial networks are used to design photonic devices,
- Develop a fundamental understanding of how photonics can enable faster computations, and
- Identify the main challenges of ML for solving photonic problems and using photonics for ML research.
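As an illustration of the first outcome, here is a minimal sketch in Python (assuming pandas, NumPy, and SciPy are available; the file name "photonic_dataset.csv" and the column "transmission" are hypothetical): it measures the skewness of one feature and applies a log transform when the distribution is strongly skewed.

```python
# Minimal sketch: check skewness of a photonic feature and transform it if needed.
import numpy as np
import pandas as pd
from scipy.stats import skew

df = pd.read_csv("photonic_dataset.csv")     # hypothetical dataset file
values = df["transmission"].to_numpy()       # hypothetical column name

s = skew(values)
print(f"skewness before transform = {s:.2f}")

# Rule of thumb: |skewness| > 1 indicates strong skew; a log (or Box-Cox)
# transform often makes the feature easier for many ML models to fit.
if abs(s) > 1.0:
    values = np.log1p(values - values.min())  # shift to non-negative, then log(1 + x)
    print(f"skewness after log1p transform = {skew(values):.2f}")
```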
Required Text: No textbook is required. Lecture notes will be provided weekly. The papers that we will review can be downloaded from this page.
If you need a reference book on the ML or photonics side, please consider these recommendations:
- Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow by Aurélien Géron
- Fundamentals of Photonics 2nd Edition by Bahaa E. A. Saleh and Malvin Carl Teich
Instructor: Dr. Ergun Simsek
E-mail: simsek@umbc.edu
Office: ITE 325K
www: https://userpages.cs.umbc.edu/simsek/
Lecture Day and Time: Mo-We 2:30 pm – 3:45 pm
Room: Fine Arts 018
Office Hours: Wednesdays 9:30 am – 11:30 am
Tentative Schedule
Week 1: Introduction to ML for Photonics, Math Review (Algebra)
Week 2: Optimization, Numerical Differentiation, Taylor Series
Week 3: Regression Analysis, Accuracy Assessment, Bias-variance Trade-off
Week 4: Regularization Methods: Ridge, Lasso, and Elastic Net
Week 5: Logistic Regression and Support Vector Machine
Week 6: Naive Bayes, LDA, and QDA
Week 7: Tree-based Methods, Random Forests, and Boosting
Week 8: Midterm Project Presentations
Week 9: Clustering and Dimensionality Reduction
Week 10: Introduction to Neural Networks: Basics and Backpropagation
Week 11: Convolutional NNs and Recurrent NNs
Week 12: GANs and Attention Mechanisms
Week 13: Inverse Photonic Design with NNs
Week 14: Adjoint Method for Inverse Photonic Design
Week 15: Optical Neural Networks: Solving Physical Equations with Photonics
Week 16: Final Project Presentations
Grading
- Participation 10%
- Homework (5 assignments) 30%
- Project I (Solving a forward problem with ML) 30%
- Project II (Solving an inverse design problem) 30%
Grading Scale (%): 90 – 100 A, 85 – 89 A-, 80 – 84 B+, 75 – 79 B, 70 – 74 B-, 65 – 69 C+, 60 – 64 C, 55 – 59 C-, 50 – 54 D, 0 – 49 F.
Notes: Homework assignments and projects can be completed using either MATLAB or Python. The final version of the code you submit should be in good working condition, i.e., it should run from beginning to end without input from the instructor or TA. The use of git and GitHub is highly recommended; a brief introduction to git will be provided.
Group projects are more than welcome as long as the scope of the project is too challenging for a single student to complete alone.
Late Work Policy: For late homework/project submissions, 10 points will be deducted for each day late, and submissions more than 6 days overdue will not be accepted.
Attendance Policy: Regular class attendance is required and necessary for understanding many of the topics covered. Students must be on time for class. If you miss a class, it is your responsibility to catch up on the material covered. If you are going to miss a class, please inform the instructor in advance and remember to watch the lecture recording afterward.