
Deep Learning Foundation

This repository contains the course materials for the Deep Learning Foundation course, held at Sebratec Academy in Gothenburg, Sweden.

Course Abstract

You will be able to understand what neural networks are, how they learn, and how to use their power to solve real-world problems. You will also be introduced to research papers on the subject and will have the opportunity to work on exciting projects and present your results to industry experts, who will give you valuable feedback.

Running the materials

Enrolled students have access to an environment that is ready to run these materials. You can find it at https://lab.sebratec.com.

You are also free to run these materials on your own computer if you like. You will need to install Python 3, pip3, and the following packages (an example install command follows the list):

  • numpy
  • pandas
  • sklearn
  • matplotlib
  • keras
  • tensorflow
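
One way to install them, assuming pip3 is available on your PATH (note that the sklearn entry corresponds to the scikit-learn package on PyPI):

    pip3 install numpy pandas scikit-learn matplotlib keras tensorflow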

Outline

Week 1: Welcome, introduction, and basic neural networks
Learning outcomes: In the first session, you will meet your peers and teacher, understand what deep learning is, learn about the history of deep learning, and see how it is changing the world. You will also be introduced to perceptrons, forward and back propagation, and multi-layer perceptrons. In the second session, you will be introduced to the algebra behind perceptrons: the forward pass, loss functions, backpropagation, weights, and gradient descent.
Assignment: Lab: The algebra behind a perceptron and the training process (a minimal sketch follows this outline).

Week 2: Neural networks' learning process
Learning outcomes: In the third session, you will learn about two basic neural network models, regression and classification, how to find good hyperparameters, and what transfer learning is. You will also receive orientation about the final project. In the fourth session, you will have a hands-on class to put into practice what you learned in session three: you will build these models, experiment with hyperparameters, and try different error functions.
Assignment: Lab: Build a regression and a classification model, and fine-tune your hyperparameters (see the Keras sketch after this outline). Begin working on the final project.

Week 3: Feeding your neural networks with data
Learning outcomes: In the fifth session, you will learn where data comes from, how to gather it, how to prepare it for a neural network by preprocessing and balancing it, and how to use your data to train, validate, and test your neural network. In the sixth session, you will have a practical class focused on separating, balancing, and preprocessing your datasets.
Assignment: Lab: Dealing with data.

Week 4: Challenges faced by neural networks
Learning outcomes: In the seventh session, you will learn about the challenges faced by neural networks: underfitting and overfitting, problems caused by data, and techniques to deal with them. In the eighth session, you will have a laboratory session focused on fixing overfitting using the knowledge you acquired in session seven.
Assignment: Lab: Create new data from your existing data using augmentation, and apply normalization and regularization techniques to your neural networks.

Week 5: Final project and graduation
Learning outcomes: This final week is dedicated to project reviews, office hours, and orientation. You must submit your project before the deadline. The graduation ceremony will also take place this week.
Assignment: Project submission.
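
As a rough illustration of the Week 1 lab topics (forward pass, loss function, backpropagation, and gradient descent), here is a minimal NumPy sketch of a single perceptron being trained. The toy dataset, sigmoid activation, learning rate, and number of epochs are arbitrary choices for the example, not part of the course materials:

    import numpy as np

    # Toy dataset: the AND function on two binary inputs
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([0, 0, 0, 1], dtype=float)

    rng = np.random.default_rng(0)
    w = rng.normal(size=2)   # weights
    b = 0.0                  # bias
    lr = 0.5                 # learning rate

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    for epoch in range(2000):
        # Forward pass: weighted sum of the inputs followed by the activation
        a = sigmoid(X @ w + b)

        # Mean squared error loss
        loss = np.mean((a - y) ** 2)

        # Backpropagation: chain rule through the loss and the sigmoid
        grad_z = (2 * (a - y) / len(y)) * a * (1 - a)
        grad_w = X.T @ grad_z
        grad_b = grad_z.sum()

        # Gradient descent update
        w -= lr * grad_w
        b -= lr * grad_b

    print("final loss:", loss)
    print("predictions:", np.round(sigmoid(X @ w + b), 2))

For the Week 2 lab, a classification model in Keras might look roughly like the sketch below. The synthetic dataset and every hyperparameter value (layer sizes, dropout rate, epochs, batch size) are placeholders to experiment with, not values prescribed by the course:

    from tensorflow import keras
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split

    # Synthetic data standing in for whatever dataset the lab provides
    X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

    # A small feed-forward classifier; dropout is one regularization knob to tune
    model = keras.Sequential([
        keras.layers.Dense(32, activation="relu"),
        keras.layers.Dropout(0.2),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

    # validation_split holds out part of the training data to monitor overfitting
    model.fit(X_train, y_train, validation_split=0.2, epochs=10, batch_size=32)
    print(model.evaluate(X_test, y_test))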

Contributing

Contributions are more than welcome! Please make a pull request whenever you feel you can improve this material. Just keep in mind that the requirements for a pull request to be considered are:

  • Pull requests must pass the quality audit check;
  • Pull requests must not lower the project's maintainability score on Code Climate.

When adding a Jupyter notebook to the material, you must also:

Contributions will only be reviewed if they meet these requirements.
