<!DOCTYPE HTML>
<!--
Phantom by HTML5 UP
html5up.net | @ajlkn
Free for personal and commercial use under the CCA 3.0 license (html5up.net/license)
-->
<html>
<head>
<title>A Primer on PAC-Bayesian Learning</title>
<meta charset="utf-8" />
<meta name="viewport" content="width=device-width, initial-scale=1" />
<!--[if lte IE 8]><script src="assets/js/ie/html5shiv.js"></script><![endif]-->
<link rel="stylesheet" href="assets/css/main.css" />
<!--[if lte IE 9]><link rel="stylesheet" href="assets/css/ie9.css" /><![endif]-->
<!--[if lte IE 8]><link rel="stylesheet" href="assets/css/ie8.css" /><![endif]-->
</head>
<body>
<!-- Wrapper -->
<div id="wrapper">
<!-- Header -->
<header id="header">
<div class="inner">
<!-- Logo -->
<a href="index.html" class="logo">
<span class="symbol"><img src="images/logo.svg" alt="" /></span><span class="title">ICML 2019 tutorial</span>
</a>
<!-- Nav -->
<nav>
<ul>
<li><a href="#menu">Menu</a></li>
</ul>
</nav>
</div>
</header>
<!-- Menu -->
<nav id="menu">
<h2>Menu</h2>
<ul>
<li><a href="index.html">Home</a></li>
<!-- <li><a href="generic.html">Ipsum veroeros</a></li>
<li><a href="generic.html">Tempus etiam</a></li>
<li><a href="generic.html">Consequat dolor</a></li>
<li><a href="elements.html">Elements</a></li> -->
</ul>
</nav>
<!-- Main -->
<div id="main">
<div class="inner">
<header>
<h1>
A Primer on PAC-Bayesian Learning
</h1>
<h2>
Long Beach, CA, USA - June 10, 2019
</h2>
<p></p>
</header>
<h2>
Abstract
</h2>
<p>
PAC-Bayesian inequalities were introduced by McAllester (<a href="">1998</a>, <a href="">1999</a>), following earlier remarks by <a href="">Shawe-Taylor and Williamson (1997)</a>. The goal was to produce PAC-type risk bounds for Bayesian-flavoured estimators. The acronym PAC stands for Probably Approximately Correct and may be traced back to <a href="">Valiant (1984)</a>. This framework makes it possible to analyse not only classical Bayesian estimators, but any randomised procedure drawing its output from a data-dependent distribution.</p>
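<p>
For concreteness, a representative inequality in this family reads as follows (a McAllester-type bound, stated here as a sketch with generic notation rather than as quoted from any of the papers above): for any prior P fixed before seeing the data and any confidence level &delta; in (0,1), with probability at least 1 &minus; &delta; over an i.i.d. sample of size n, simultaneously for all posteriors Q,
</p>
<pre>
\mathbb{E}_{h \sim Q}\, R(h)
  \;\le\;
\mathbb{E}_{h \sim Q}\, r_n(h)
  + \sqrt{\frac{\mathrm{KL}(Q \,\Vert\, P) + \ln\frac{2\sqrt{n}}{\delta}}{2n}}
</pre>
<p>
where R denotes the out-of-sample risk, r<sub>n</sub> the empirical risk on the sample, and KL the Kullback&ndash;Leibler divergence between the posterior Q and the prior P. The KL term quantifies how far the data-dependent posterior has moved from the prior, playing the role of a complexity term.
</p>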
<p>
Over the past few years, the PAC-Bayesian approach has been applied to numerous settings, including classification, high-dimensional sparse regression, image denoising and reconstruction of large random matrices, recommendation systems and collaborative filtering, binary ranking, online ranking, transfer learning, multiview learning, signal processing, physics, to name but a few.
<!-- \citep{guedj2013pac,icml2016,alquier2017oracle,alquier2016simpler,dziugaite2017computing,rivasplata2018pac,dziugaite2018data,dziugaite2017entropy,nguyen2018pac}. -->
The <a href="https://arxiv.org/search/?query=PAC-Bayes&searchtype=all&source=header">"PAC-Bayes"</a> query on arXiv illustrates how PAC-Bayes is quickly re-emerging as a principled theory for efficiently addressing modern machine learning topics, such as learning with heavy-tailed and dependent data, or the generalisation abilities of deep neural networks.
</p>
<h2>
Topics
</h2>
<p>
The tutorial aims to provide the ICML audience with a comprehensive overview of PAC-Bayes, starting from statistical learning theory (analysis of complexity terms, generalisation and oracle bounds), covering algorithmic developments (actual implementation of PAC-Bayesian algorithms), and reaching the most recent PAC-Bayesian analyses of the generalisation abilities of deep neural networks. The PAC-Bayesian framework is the backbone of several influential contributions to statistical learning theory and deep learning, and we believe it is time to revisit this theory in a full tutorial. We intend to reach the broadest possible audience, assuming only an elementary background in probability theory and statistical learning; all key concepts will be covered from scratch.
</p>
<h2>
Speakers<br />
</h2>
<p>
Benjamin Guedj is a tenured research scientist at Inria (France) and a senior research scientist at University College London (UK). His main research areas are statistical learning theory, PAC-Bayes, machine learning and computational statistics. Benjamin Guedj obtained a PhD from Université Pierre et Marie Curie (France) in 2013.
</p>
<p>
John Shawe-Taylor is a professor at University College London (UK), where he is Director of the Centre for Computational Statistics and Machine Learning (CSML). His main research area is statistical learning theory, with contributions to neural networks, machine learning, and graph theory. John Shawe-Taylor obtained a PhD in Mathematics at Royal Holloway, University of London in 1986. He has published over 150 research papers, and his pioneering work initiated PAC-Bayesian theory, to which he has made many subsequent contributions. He has coordinated a number of Europe-wide projects investigating the theory and practice of machine learning.
</p>
<section class="tiles">
<article class="style1">
<span class="image">
<img src="images/bg.jpg" alt="" />
</span>
<a href="https://bguedj.github.io">
<h2>Benjamin Guedj</h2>
<div class="content">
<p>Principal Research Fellow at University College London and Inria</p>
</div>
</a>
</article>
<article class="style2">
<span class="image">
<img src="images/jst.jpg" alt="" />
</span>
<a href="http://www0.cs.ucl.ac.uk/staff/J.Shawe-Taylor/">
<h2>John Shawe-Taylor</h2>
<div class="content">
<p>Professor at University College London, Head of the Department of Computer Science</p>
</div>
</a>
</article>
</section>
<h2>
Material<br />
</h2>
<p>
Keywords: Statistical learning theory, PAC-Bayes, machine learning, computational statistics
</p>
<p>
<a href="material/main.pdf">Slides are available here</a>.
</p>
<p>
Videos are available here: <a href="https://www.facebook.com/icml.imls/videos/2160537770667911/">Part 1</a>, <a href="https://www.facebook.com/icml.imls/videos/318683639013879/">Part 2</a> (<a href="https://videoken.com/embed/_NuMUQYprn0">also available here</a>).
</p>
<!-- <p>
Benjamin Guedj has recently uploaded on arXiv a survey on PAC-Bayesian learning which is the backbone to this tutorial proposal \citep{guedj2019primer}\footnote{\href{https://arxiv.org/abs/1901.05353}{https://arxiv.org/abs/1901.05353}}. Benjamin Guedj has given more than 40 talks on PAC-Bayesian learning since 2011, in seminars, workshop and international conferences (for more details, see \href{https://www.dropbox.com/s/en9antsq9wxi3ka/CV-BGuedj.pdf?dl=0}{his resume}). In particular, he has recently given a one hour tutorial on PAC-Bayesian learning at the \href{https://sites.google.com/view/bigworkshop/home}{2nd Italian-French Statistics Seminar} (IFSS2018), in September 2018. The slides are fairly representative of some of the content which would be addressed by the tutorial.
\medskip
\href{https://sites.google.com/view/bigworkshop/home}{Link to the IFSS workshop website}
\href{https://www.dropbox.com/s/bcrbf9mqsh7a45j/bguedj.pdf?dl=0}{Link to the slides of the tutorial by B. Guedj}
\medskip
Benjamin Guedj has organized (with Francis Bach and Pascal Germain) a NIPS 2017 workshop on PAC-Bayesian learning, called \href{https://bguedj.github.io/nips2017/50shadesbayesian.html}{(Almost) 50 Shades of Bayesian Learning: PAC-Bayesian trends and insights}. In particular, several talks addressed the promising connection between PAC-Bayes and deep neural networks, and the one hour tutorial by François Laviolette also gives a glance at the material which would be covered by the present tutorial proposal. John Shawe-Taylor was one of the plenary speakers of the workshop and his talk also covers relevant material to this tutorial proposal.
\medskip
\href{https://bguedj.github.io/nips2017/50shadesbayesian.html}{Link to the NIPS 2017 workshop website}
\href{https://bguedj.github.io/nips2017/pdf/laviolette_nips2017.pdf}{Link to the slides of the tutorial by F. Laviolette}
\href{https://www.youtube.com/watch?v=GnRX9Pvw6Xw&feature=youtu.be}{Link to the video of the tutorial by F. Laviolette}
\href{https://bguedj.github.io/nips2017/pdf/shawe-taylor_nips2017.pdf}{Link to the slides of the talk by J. Shawe-Taylor}
\href{https://www.youtube.com/watch?v=1WlFPR7vNbo&feature=youtu.be}{Link to the video of the talk by J. Shawe-Taylor}
\medskip
Last but not least, John Shawe-Taylor gave an invited tutorial at NeurIPS 2018 (together with Omar Rivasplata), on statistical learning theory. The content of this tutorial is also extremely relevant to this proposal.
\medskip
\href{https://media.neurips.cc/Conferences/NIPS2018/Slides/stastical_learning_theory.pdf}{Link to the slides of the NeurIPS 2018 tutorial by J. Shawe-Taylor}
\href{https://videoken.com/embed/Bv5gzFZS5OI}{Link to the video of the NeurIPS 2018 tutorial by J. Shawe-Taylor}
</p> -->
</div>
</div>
<!-- Footer -->
<footer id="footer">
<div class="inner">
<!-- <section>
<h2>Get in touch</h2>
<form method="post" action="#">
<div class="field half first">
<input type="text" name="name" id="name" placeholder="Name" />
</div>
<div class="field half">
<input type="email" name="email" id="email" placeholder="Email" />
</div>
<div class="field">
<textarea name="message" id="message" placeholder="Message"></textarea>
</div>
<ul class="actions">
<li><input type="submit" value="Send" class="special" /></li>
</ul>
</form>
</section> -->
<section>
<h2>Contact us</h2>
<ul class="icons">
<!-- <li><a href="#" class="icon style2 fa-twitter"><span class="label">Twitter</span></a></li> -->
<!-- <li><a href="#" class="icon style2 fa-facebook"><span class="label">Facebook</span></a></li> -->
<!-- <li><a href="#" class="icon style2 fa-instagram"><span class="label">Instagram</span></a></li> -->
<!-- <li><a href="#" class="icon style2 fa-dribbble"><span class="label">Dribbble</span></a></li> -->
<!-- <li><a href="#" class="icon style2 fa-github"><span class="label">GitHub</span></a></li> -->
<!-- <li><a href="#" class="icon style2 fa-500px"><span class="label">500px</span></a></li> -->
<!-- <li><a href="#" class="icon style2 fa-phone"><span class="label">Phone</span></a></li> -->
<li><a href="mailto:[email protected][email protected]" class="icon style2 fa-envelope-o"><span class="label">Email</span></a></li>
</ul>
</section>
<ul class="copyright">
<li>© Benjamin Guedj. All rights reserved</li><li>Design: <a href="http://html5up.net">HTML5 UP</a></li>
</ul>
</div>
</footer>
</div>
<!-- Scripts -->
<script src="assets/js/jquery.min.js"></script>
<script src="assets/js/skel.min.js"></script>
<script src="assets/js/util.js"></script>
<!--[if lte IE 8]><script src="assets/js/ie/respond.min.js"></script><![endif]-->
<script src="assets/js/main.js"></script>
<!-- Google Analytics -->
<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');
ga('create', 'UA-81430854-1', 'auto');
ga('send', 'pageview');
</script>
</body>
</html>