(3 hours of instruction!)

Speaker: CL Kim

Overview: From the book introduction: “Neural networks and deep learning currently provides the best solutions to many problems in image recognition, speech recognition, and natural language processing.”

This Part 1 and the planned Part 2 (late spring/early summer 2021, to be confirmed) series of courses will teach many of the core concepts behind neural networks and deep learning.

Reference book: “Neural Networks and Deep Learning” by Michael Nielsen, http://

More from the book introduction: “We’ll learn the core principles behind neural networks and deep learning by attacking a concrete problem: the problem of teaching a computer to recognize handwritten digits. …it can be solved pretty well using a simple neural network, with just a few tens of lines of code, and no special libraries.”

“But you don’t need to be a professional programmer.”

The code provided is in Python; even if you don’t program in Python, it should be easy to understand with just a little effort.

Benefits of attending the series:

* Learn the core principles behind neural networks and deep learning.

* See a simple Python program that solves a concrete problem: teaching a computer to recognize a handwritten digit.

* Improve the results by incorporating more and more of the core ideas about neural networks and deep learning.

* Principle-oriented, with worked-out proofs of fundamental equations of backpropagation for those interested.

* Yet hands-on and practical, with simple code examples.

Course Background and Content: This is a live instructor-led introductory course on Neural Networks and Deep Learning. It is planned to be a two-part series of courses. The first course is complete by itself, and it will be a pre-requisite for the planned second course. The class material is mostly from the highly regarded and free online book “Neural Networks and Deep Learning” by Michael Nielsen, plus additional material such as some proofs of fundamental equations not provided in the book, and (in planned Part 2) touching on more recent neural network types such as ResNet.

Agenda:

Introduction to Practical Neural Networks and Deep Learning (Part 1)

Feedforward Neural Networks.

* Simple (Python) Network to classify a handwritten digit
* Learning with Gradient Descent
* How the backpropagation algorithm works
* Improving the way neural networks learn:
** Cross-entropy cost function
** Softmax activation function and log-likelihood cost function
** Rectified Linear Unit
** Overfitting and Regularization:
*** L2 regularization
*** Dropout
*** Artificially expanding the data set
*** Hyper-parameters
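As a taste of the kind of code the course works through, here is a minimal, hypothetical sketch of learning with gradient descent (not the book's actual program): a single sigmoid neuron trained on a toy OR problem, in plain Python with no special libraries:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy training data: the OR function on two binary inputs.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

random.seed(0)
w = [random.uniform(-1, 1) for _ in range(2)]  # weights
b = random.uniform(-1, 1)                      # bias
eta = 2.0                                      # learning rate

for epoch in range(2000):
    for x, y in data:
        z = w[0] * x[0] + w[1] * x[1] + b
        a = sigmoid(z)
        # Gradient of the quadratic cost C = (a - y)^2 / 2,
        # using sigmoid'(z) = a * (1 - a).
        delta = (a - y) * a * (1 - a)
        w = [wk - eta * delta * xk for wk, xk in zip(w, x)]
        b -= eta * delta

predictions = [round(sigmoid(w[0] * x[0] + w[1] * x[1] + b)) for x, _ in data]
print(predictions)  # converges to the OR targets [0, 1, 1, 1]
```

The same update rule, applied layer by layer via backpropagation, is what trains the full handwritten-digit network in the course.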

Introduction to Practical Neural Networks and Deep Learning (planned Part 2, to be confirmed)

Convolutional Neural Networks.

* Local receptive field, Feature map
* Pooling layer
* Simple (Python) Convolutional Neural Network to classify a handwritten digit
* Improving the network, Regularization
* Touch on more recent progress in image recognition, such as the Residual Network (ResNet)
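To give a flavor of one Part 2 idea, a pooling layer condenses a feature map by summarizing small patches. A 2x2 max-pooling pass can be sketched in a few lines of plain Python (a hypothetical toy example, not code from the course):

```python
def max_pool_2x2(feature_map):
    """2x2 max-pooling with stride 2 over a 2-D list (a feature map)."""
    pooled = []
    for i in range(0, len(feature_map) - 1, 2):
        row = []
        for j in range(0, len(feature_map[0]) - 1, 2):
            # Keep only the strongest activation in each 2x2 patch.
            row.append(max(feature_map[i][j], feature_map[i][j + 1],
                           feature_map[i + 1][j], feature_map[i + 1][j + 1]))
        pooled.append(row)
    return pooled

fm = [[1, 3, 2, 0],
      [4, 2, 1, 1],
      [0, 1, 5, 6],
      [2, 2, 7, 8]]
print(max_pool_2x2(fm))  # [[4, 2], [2, 8]]
```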

Pre-requisites: There is some heavier mathematics in proving the four fundamental equations behind backpropagation, so a basic familiarity with multivariable calculus and linear algebra is expected, but nothing advanced is required. (The backpropagation equations can also just be accepted without bothering with the proofs, since the provided Python code for the simple network just makes use of the equations.)
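For reference, the four fundamental equations as stated in Nielsen's book, in the book's notation (δ is the layer error, C the cost, σ the activation function, and ⊙ the elementwise product):

```latex
\begin{align}
\delta^L &= \nabla_a C \odot \sigma'(z^L) && \text{(BP1)} \\
\delta^l &= \left((w^{l+1})^T \delta^{l+1}\right) \odot \sigma'(z^l) && \text{(BP2)} \\
\frac{\partial C}{\partial b^l_j} &= \delta^l_j && \text{(BP3)} \\
\frac{\partial C}{\partial w^l_{jk}} &= a^{l-1}_k \, \delta^l_j && \text{(BP4)}
\end{align}
```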

Speaker Background: CL Kim works in Software Engineering at CarGurus, Inc. He has graduate degrees in Business Administration and in Computer and Information Science from the University of Pennsylvania. He previously taught for a few years the well-rated IEEE Boston Section class on introduction to the Android platform and API.

Decision (Run/Cancel) Date for this Course is Monday, March 15

IEEE Members $110

Non-members $130

END:VEVENT
END:VCALENDAR