Name | Barak Fishbain |
Date | 29.12.19 |
Faculty | CEE |
Title | Mathematical Models for Air and Water Quality through Wireless Distributed Sensor Networks |
Web page | https://fishbain.net.technion.ac.il/ |
Email | fishbain@technion.ac.il |
Study materials | Barak Fishbain 29.12.19 |
Flow, deformation, and reaction in porous media: The preferential flow case
Name | Yaniv Edery |
Date | 5.1.20 |
Faculty | CEE |
Title | Flow, deformation, and reaction in porous media: The preferential flow case |
Web page | https://sites.google.com/view/pmvlab |
Email | ysnivedery@technion.ac.il |
Study materials | Yaniv Edery 5.1.20 |
Modeling Complex Systems with Object-Process Methodology
Name | Dov Dori |
Date | 15.12.19 |
Faculty | Industrial Engineering and Management |
Title | Modeling Complex Systems with Object-Process Methodology |
Web page | http://esml.iem.technion.ac.il/ |
Email | dori@technion.ac.il |
Study materials | Dov Dori 15.12.19 |
Analysis of single-cell RNAseq in order to learn about brain function
Name | Amit Zeisel |
Date | 8.12.19 |
Faculty | Biotechnology Eng. |
Title | Analysis of single-cell RNAseq in order to learn about brain function |
Web page | zeisellab.org |
Email | amit.zeisel@technion.ac.il |
Study materials | Amit Zeisel 8.12.19 |
Graph Theory for Multi-Agent and Networked Systems
Name | Daniel Zelazo |
Date | 1.12.19 |
Faculty | Aerospace Eng. |
Title | Graph Theory for Multi-Agent and Networked Systems |
Web page | https://zelazo.net.technion.ac.il/ |
Email | dzelazo@technion.ac.il |
Study materials | Daniel Zelazo 1.12.19 |
The onset of chaos in nonlinear nano-resonators
Name | Oded Gottlieb |
Date | 24.11.19 |
Faculty | Mechanical Eng. |
Title | The onset of chaos in nonlinear nano-resonators |
Web page | https://ncds.technion.ac.il/ |
Email | oded@technion.ac.il |
Study materials | Oded Gottlieb 24.11.19 |
Why do Neural Networks converge to “simple” solutions?
Abstract:
Since 2012, deep neural networks have had impressive practical success in many domains, yet their theoretical properties are not well understood. I will discuss why neural network optimization, which is based on local greedy steps, tends to converge to:
1) A global minimum, while many local minima exist.
2) A specific “good” global minimum in which the network function is surprisingly “simple” (while many “bad” global minima exist).
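A toy illustration of this implicit bias (my own sketch, not taken from the talk): in an underdetermined least-squares problem there are infinitely many global minima, yet plain gradient descent initialized at zero converges to the minimum-norm one — arguably the "simplest" solution.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 20))   # 5 equations, 20 unknowns: infinitely many global minima
b = rng.standard_normal(5)

# Gradient descent on f(x) = 0.5 * ||Ax - b||^2, initialized at zero.
# Each step stays in the row space of A, which is why GD picks out
# the minimum-norm interpolating solution among all global minima.
x = np.zeros(20)
lr = 0.01
for _ in range(20000):
    x -= lr * (A.T @ (A @ x - b))

# The minimum-norm solution, computed directly via the pseudoinverse.
x_min_norm = np.linalg.pinv(A) @ b

print(np.allclose(A @ x, b, atol=1e-6))       # GD reached a global minimum
print(np.allclose(x, x_min_norm, atol=1e-6))  # and it is the minimum-norm one
```

The same phenomenon, suitably generalized, underlies results on gradient descent converging to "good" minima in overparameterized networks; the linear case above is only the simplest instance.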
Name | Daniel Soudry |
Date | 10.11.19 |
Faculty | EE |
Title | Why do Neural Networks converge to “simple” solutions? |
Web page | https://sites.google.com/site/danielsoudry/ |
Email | daniel.soudry@gmail.com |
Study materials | About the deep learning era |
Projection-free Optimization and Learning
Name | Dan Garber |
Date | 3.11.19 |
Faculty | IE&M |
Title | Projection-free Optimization and Learning |
Web page | https://dangar.net.technion.ac.il/ |
Email | dangar@technion.ac.il |
Study materials | Dan Garber 3.11.19
Intro
Name | Nir Gavish |
Date | 27.10.19 |
Faculty | Mathematics |
Title | Intro |
Web page | https://ngavish.net.technion.ac.il |
Email | ngavish@technion.ac.il |
Study materials | Intro |