[{"authors":null,"categories":null,"content":"I am a research scientist at the Scalable Machine Learning team at Yahoo! research, working on ML applications in AdTech. My research focuses on machine learning and probabilistic methods, and has been published in multiple scientific articles.\nBefore joining Yahoo! Research, I was a postdoctoral researcher at the Sierra team at INRIA Paris, advised by Francis Bach. I pursued my PhD with Philipp Hennig in Tübingen working on probabilistic numerics, a field that treats computation as machine learning. Together with Philipp Hennig and Mike Osborne, I wrote a textbook on probabilistic numerics. During my PhD, I did internships at Amazon Research and the Bosch Center for Artificial Intelligence. I have a background in probability theory and stochastic analysis. I am passionate about communicating ideas and hope to contribute to the beneficial use of AI accross society.\nOutside of work, you can find me meditating, reading books, and doing sports.\n Download my CV.\n","date":1654819200,"expirydate":-62135596800,"kind":"term","lang":"en","lastmod":1654819200,"objectID":"2525497d367e79493fd32b198b28f040","permalink":"https://hanskersting.github.io/author/hans-kersting/","publishdate":"0001-01-01T00:00:00Z","relpermalink":"/author/hans-kersting/","section":"authors","summary":"I am a research scientist at the Scalable Machine Learning team at Yahoo! research, working on ML applications in AdTech. My research focuses on machine learning and probabilistic methods, and has been published in multiple scientific articles.","tags":null,"title":"Hans Kersting","type":"authors"},{"authors":null,"categories":null,"content":" Table of Contents What you will learn Program overview Courses in this program Meet your instructor FAQs What you will learn Fundamental Python programming skills Statistical concepts and how to apply them in practice Gain experience with the Scikit, including data visualization with Plotly and data wrangling with Pandas Program overview The demand for skilled data science practitioners is rapidly growing. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Duis posuere tellus ac convallis placerat. Proin tincidunt magna sed ex sollicitudin condimentum. Sed ac faucibus dolor, scelerisque sollicitudin nisi.\nCourses in this program Python basics Build a foundation in Python. Visualization Learn how to visualize data with Plotly. Statistics Introduction to statistics for data science. Meet your instructor Hans Kersting FAQs Are there prerequisites? There are no prerequisites for the first course.\n How often do the courses run? Continuously, at your own pace.\n Begin the course ","date":1611446400,"expirydate":-62135596800,"kind":"section","lang":"en","lastmod":1611446400,"objectID":"59c3ce8e202293146a8a934d37a4070b","permalink":"https://hanskersting.github.io/courses/example/","publishdate":"2021-01-24T00:00:00Z","relpermalink":"/courses/example/","section":"courses","summary":"An example of using Wowchemy's Book layout for publishing online courses.","tags":null,"title":"📊 Learn Data Science","type":"book"},{"authors":null,"categories":null,"content":"Build a foundation in Python.\n 1-2 hours per week, for 8 weeks\nLearn Quiz What is the difference between lists and tuples? 
Lists\n Lists are mutable - they can be changed Slower than tuples Syntax: a_list = [1, 2.0, \u0026#39;Hello world\u0026#39;] Tuples\n Tuples are immutable - they can’t be changed Tuples are faster than lists Syntax: a_tuple = (1, 2.0, \u0026#39;Hello world\u0026#39;) Is Python case-sensitive? Yes\n","date":1609459200,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1609459200,"objectID":"17a31b92253d299002593b7491eedeea","permalink":"https://hanskersting.github.io/courses/example/python/","publishdate":"2021-01-01T00:00:00Z","relpermalink":"/courses/example/python/","section":"courses","summary":"Build a foundation in Python.\n","tags":null,"title":"Python basics","type":"book"},{"authors":null,"categories":null,"content":"Learn how to visualize data with Plotly.\n 1-2 hours per week, for 8 weeks\nLearn Quiz When is a heatmap useful? Lorem ipsum dolor sit amet, consectetur adipiscing elit.\n Write Plotly code to render a bar chart import plotly.express as px data_canada = px.data.gapminder().query(\u0026#34;country == \u0026#39;Canada\u0026#39;\u0026#34;) fig = px.bar(data_canada, x=\u0026#39;year\u0026#39;, y=\u0026#39;pop\u0026#39;) fig.show() ","date":1609459200,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1609459200,"objectID":"1b341b3479c8c6b1f807553b77e21b7c","permalink":"https://hanskersting.github.io/courses/example/visualization/","publishdate":"2021-01-01T00:00:00Z","relpermalink":"/courses/example/visualization/","section":"courses","summary":"Learn how to visualize data with Plotly.\n","tags":null,"title":"Visualization","type":"book"},{"authors":null,"categories":null,"content":"Introduction to statistics for data science.\n 1-2 hours per week, for 8 weeks\nLearn The general form of the normal probability density function is:\n$$ f(x) = \\frac{1}{\\sigma \\sqrt{2\\pi} } e^{-\\frac{1}{2}\\left(\\frac{x-\\mu}{\\sigma}\\right)^2} $$\n The parameter $\\mu$ is the mean or expectation of the distribution. $\\sigma$ is its standard deviation. The variance of the distribution is $\\sigma^{2}$. Quiz What is the parameter $\\mu$? The parameter $\\mu$ is the mean or expectation of the distribution.\n","date":1609459200,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1609459200,"objectID":"6f4078728d71b1b791d39f218bf2bdb1","permalink":"https://hanskersting.github.io/courses/example/stats/","publishdate":"2021-01-01T00:00:00Z","relpermalink":"/courses/example/stats/","section":"courses","summary":"Introduction to statistics for data science.\n","tags":null,"title":"Statistics","type":"book"},{"authors":["Antonio Orvieto","Anant Raj","Hans Kersting","Francis Bach"],"categories":[],"content":"","date":1654819200,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1654819200,"objectID":"f07b216aacb3df8c626abd4d16048de7","permalink":"https://hanskersting.github.io/publication/explicit-regularization-in-overparametrized-models-via-noise-injection/","publishdate":"2022-06-10T12:38:00Z","relpermalink":"/publication/explicit-regularization-in-overparametrized-models-via-noise-injection/","section":"publication","summary":"Injecting noise within gradient descent has several desirable features. In this paper, we explore noise injection before computing a gradient step, which is known to have smoothing and regularizing properties. We show that small perturbations induce explicit regularization for simple finite-dimensional models based on the l1-norm, group l1-norms, or nuclear norms. 
When applied to overparametrized neural networks with large widths, we show that the same perturbations do not work due to variance explosion resulting from overparametrization. However, we also show that independent layer-wise perturbations make it possible to avoid the exploding variance term, and explicit regularizers can then be obtained. We empirically show that the small perturbations lead to better generalization performance than vanilla (stochastic) gradient descent training, with minor adjustments to the training procedure.","tags":["optimization","generalization","noise injection"],"title":"Explicit Regularization in Overparametrized Models via Noise Injection","type":"publication"},{"authors":["Hans Kersting"],"categories":["General"],"content":"This new paper is a follow-up to our Anti-PGD paper (ICML 2022). This time we show that small perturbations induce explicit regularization, which we spell out for a few models. Many new research avenues open up from our Theorem 2.\n","date":1654819200,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1654819200,"objectID":"0d9c46541b746ebdbc5de4c2611ec08a","permalink":"https://hanskersting.github.io/post/explicitnoiseinjection_paper_out/","publishdate":"2022-06-10T00:00:00Z","relpermalink":"/post/explicitnoiseinjection_paper_out/","section":"post","summary":"This new paper is a follow-up to our Anti-PGD paper (ICML 2022). This time we show that small perturbations induce explicit regularization, which we spell out for a few models. Many new research avenues open up from our Theorem 2.","tags":["non-convex optimization","stochastic gradient descent","generalization"],"title":"New preprint on regularization properties of noise injection out!","type":"post"},{"authors":["Philipp Hennig","Michael A. Osborne","Hans Kersting"],"categories":[],"content":"","date":1654041600,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1654041600,"objectID":"f5a03599deedc56b7ec6be7dcbfa5401","permalink":"https://hanskersting.github.io/publication/pn_book/","publishdate":"2022-03-30T12:56:00Z","relpermalink":"/publication/pn_book/","section":"publication","summary":"Probabilistic numerical computation formalises the connection between machine learning and applied mathematics. Numerical algorithms approximate intractable quantities from computable ones. They estimate integrals from evaluations of the integrand, or the path of a dynamical system described by differential equations from evaluations of the vector field. In other words, they infer a latent quantity from data. This book shows that it is thus formally possible to think of computational routines as learning machines, and to use the notion of Bayesian inference to build more flexible, efficient, or customised algorithms for computation. The text caters for Masters' and PhD students, as well as postgraduate researchers in artificial intelligence, computer science, statistics, and applied mathematics. Extensive background material is provided along with a wealth of figures, worked examples, and exercises (with solutions) to develop intuition.","tags":["probabilistic numerics","machine learning","Bayesian inference","numerical analysis"],"title":"Probabilistic Numerics - Computation as Machine Learning","type":"publication"},{"authors":["Hans Kersting"],"categories":["General"],"content":"I’m happy that our paper on Anti-PGD, a version of stochastic gradient descent using anticorrelated perturbations, got accepted to ICML 2022 in Baltimore, USA. You can read the PDF here. 
Looking forward to attending my first ML conference after Covid, presenting this paper, and having some scientific discussions.\n","date":1653609600,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1653609600,"objectID":"db18efc76e4ea4c70f5f99ac5ead42ad","permalink":"https://hanskersting.github.io/post/antipgd_paper_accepted/","publishdate":"2022-05-27T00:00:00Z","relpermalink":"/post/antipgd_paper_accepted/","section":"post","summary":"I’m happy that our paper on Anti-PGD, a version of stochastic gradient descent using anticorrelated perturbations, got accepted to ICML 2022 in Baltimore, USA. You can read the PDF here. Looking forward to attending my first ML conference after Covid, presenting this paper, and having some scientific discussions.","tags":["non-convex optimization","stochastic gradient descent","generalization"],"title":"Our paper on anticorrelated noise injection got accepted to ICML 2022","type":"post"},{"authors":["Hans Kersting"],"categories":["General"],"content":"Our book on Probabilistic Numerics will be out in June 2022. It can already be preordered here. A PDF will appear on the probabilistic-numerics website.\n","date":1649203200,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1649203200,"objectID":"537fa5d202a2ce668a9492f4ffbab6d8","permalink":"https://hanskersting.github.io/post/book_on_amazon/","publishdate":"2022-04-06T00:00:00Z","relpermalink":"/post/book_on_amazon/","section":"post","summary":"Available in June 2022.","tags":["Probabilistic Numerics"],"title":"Our book on Probabilistic Numerics can now be preordered.","type":"post"},{"authors":["Antonio Orvieto","Hans Kersting","Frank Proske","Francis Bach","Aurelien Lucchi"],"categories":[],"content":"","date":1644105600,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1644105600,"objectID":"b5a5f68aa3743d6868e3d00c643e13e2","permalink":"https://hanskersting.github.io/publication/anticorrelated-noise/","publishdate":"2022-03-27T16:46:00Z","relpermalink":"/publication/anticorrelated-noise/","section":"publication","summary":"Injecting artificial noise into gradient descent (GD) is commonly employed to improve the performance of machine learning models. Usually, uncorrelated noise is used in such perturbed gradient descent (PGD) methods. It is, however, not known if this is optimal or whether other types of noise could provide better generalization performance. In this paper, we zoom in on the problem of correlating the perturbations of consecutive PGD steps. We consider a variety of objective functions for which we find that GD with anticorrelated perturbations (\"Anti-PGD\") generalizes significantly better than GD and standard (uncorrelated) PGD. To support these experimental findings, we also derive a theoretical analysis that demonstrates that Anti-PGD moves to wider minima, while GD and PGD remain stuck in suboptimal regions or even diverge. 
This new connection between anticorrelated noise and generalization opens the field to novel ways to exploit noise for training machine learning models.","tags":["optimization","generalization","noise injection"],"title":"Anticorrelated Noise Injection for Improved Generalization","type":"publication"},{"authors":["Hans Kersting"],"categories":null,"content":"","date":1635257700,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1635257700,"objectID":"291fb8b15b9e9f6aec5dc757ace1659b","permalink":"https://hanskersting.github.io/talk/ode-filters-forward-and-backward/","publishdate":"2022-03-23T14:28:00Z","relpermalink":"/talk/ode-filters-forward-and-backward/","section":"event","summary":"ODE filters and smoothers are well-established probabilistic numerical methods that solve initial value problems in linear time. In this talk, we add to Monday’s talks on these methods in the following way. First, we discuss how a previous state space model can be thought of as a linear-Gaussian approximation to the new state space model. Second, we discuss classical convergence rates for the integrated Wiener process prior, as well as equivalences with classical methods and their convergence rates. Third, we show how ODE filters give speed-ups in ODE inverse problems, a first instance of a computational chain communicating uncertainty.","tags":["probabilistic numerics","numerical analysis","ODE filtering"],"title":"ODE Filters — Forward and Backward","type":"event"},{"authors":["Hans Kersting"],"categories":null,"content":"","date":1625574600,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1625574600,"objectID":"e7b7b2ab5feb26fa74b6f271cd960a16","permalink":"https://hanskersting.github.io/talk/uncertainty-aware-numerical-solutions-of-odes-by-bayesian-filtering/","publishdate":"2022-03-22T12:11:00Z","relpermalink":"/talk/uncertainty-aware-numerical-solutions-of-odes-by-bayesian-filtering/","section":"event","summary":"Numerical approximations can be regarded as statistical inference, if one interprets the solution of the numerical problem as a parameter in a statistical model whose likelihood links it to the information ('data') available from evaluating functions. This view is advocated by the field of Probabilistic Numerics and has already yielded two successes, Bayesian Optimization and Bayesian Quadrature. In an analogous manner, we construct a Bayesian probabilistic-numerical method for ODEs. To this end, we construct a probabilistic state space model for ODEs which enables us to borrow the machinery of Bayesian filtering. This unlocks the application of all Bayesian filters from signal processing to ODEs, which we name ODE filters. We theoretically analyse the convergence rates of the most elementary one, the Kalman ODE filter, and discuss its uncertainty quantification. 
Lastly, we demonstrate how employing these ODE filters as forward simulators engenders new ODE inverse problem solvers that outperform their classical 'likelihood-free' counterparts.","tags":["probabilistic numerics","numerical analysis","ODE filtering"],"title":"Uncertainty-Aware Numerical Solutions of ODEs by Bayesian Filtering","type":"event"},{"authors":["Hans Kersting"],"categories":[],"content":"","date":1619014450,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1619014450,"objectID":"3bbaff791be1d67f99d1269bde41adc9","permalink":"https://hanskersting.github.io/publication/phd-thesis/","publishdate":"2022-03-27T16:30:00Z","relpermalink":"/publication/phd-thesis/","section":"publication","summary":"We formulate probabilistic numerical approximations to solutions of ordinary differential equations (ODEs) as problems in Gaussian process (GP) regression with nonlinear measurement functions. This is achieved by defining the measurement sequence to consist of the observations of the difference between the derivative of the GP and the vector field evaluated at the GP, which are all identically zero at the solution of the ODE. When the GP has a state-space representation, the problem can be reduced to a nonlinear Bayesian filtering problem and all widely used approximations to the Bayesian filtering and smoothing problems become applicable. Furthermore, all previous GP-based ODE solvers that are formulated in terms of generating synthetic measurements of the gradient field come out as specific approximations. Based on the nonlinear Bayesian filtering problem posed in this paper, we develop novel Gaussian solvers for which we establish favourable stability properties. Additionally, non-Gaussian approximations to the filtering problem are derived by the particle filter approach. The resulting solvers are compared with other probabilistic solvers in illustrative experiments.","tags":["ODE filters","probabilistic numerics","machine learning"],"title":"Uncertainty-Aware Numerical Solutions of ODEs by Bayesian Filtering","type":"publication"},{"authors":["Hans Kersting","Tim J. Sullivan","Philipp Hennig"],"categories":[],"content":"","date":1604188800,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1604188800,"objectID":"14fd297d65136e9580a491ef05adb2a9","permalink":"https://hanskersting.github.io/publication/convergence-rates/","publishdate":"2022-03-27T16:00:00Z","relpermalink":"/publication/convergence-rates/","section":"publication","summary":"A recently introduced class of probabilistic (uncertainty-aware) solvers for ordinary differential equations (ODEs) applies Gaussian (Kalman) filtering to initial value problems. These methods model the true solution $x$ and its first $q$ derivatives _a priori_ as a Gauss--Markov process $\boldsymbol{X}$, which is then iteratively conditioned on information about $\dot{x}$. This article establishes worst-case local convergence rates of order $q+1$ for a wide range of versions of this Gaussian ODE filter, as well as global convergence rates of order $q$ in the case of $q=1$ and an integrated Brownian motion prior, and analyses how inaccurate information on $\dot{x}$ coming from approximate evaluations of $f$ affects these rates. Moreover, we show that, in the globally convergent case, the posterior credible intervals are well calibrated in the sense that they globally contract at the same rate as the truncation error. 
We illustrate these theoretical results by numerical experiments, which might indicate their generalizability to $q \in \{2,3,\dots\}$.","tags":["ODE filters","convergence rates","probabilistic numerics"],"title":"Convergence Rates of Gaussian ODE Filters","type":"publication"},{"authors":["Hans Kersting","Maren Mahsereci"],"categories":[],"content":"","date":1595030400,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1595030400,"objectID":"e21072a85a863d41d0604d47ca30d8af","permalink":"https://hanskersting.github.io/publication/fourier-ssm/","publishdate":"2022-03-26T19:17:00Z","relpermalink":"/publication/fourier-ssm/","section":"publication","summary":"Gaussian ODE filtering is a probabilistic numerical method to solve ordinary differential equations (ODEs). It computes a Bayesian posterior over the solution from evaluations of the vector field defining the ODE. Its most popular version, which employs an integrated Brownian motion prior, uses Taylor expansions of the mean to extrapolate forward and has the same convergence rates as classical numerical methods. As the solutions of many important ODEs are periodic functions (oscillators), we raise the question of whether Fourier expansions can also be brought to bear within the framework of Gaussian ODE filtering. To this end, we construct a Fourier state space model for ODEs and a 'hybrid' model that combines a Taylor (Brownian motion) and Fourier state space model. We show by experiments how the hybrid model might become useful in cheaply predicting until the end of the time domain.","tags":["ODE filters","Fourier series","model selection"],"title":"A Fourier State Space Model for Bayesian ODE Filters","type":"publication"},{"authors":["Hans Kersting"],"categories":null,"content":"","date":1594720800,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1594720800,"objectID":"353ae738b50c9779e0ddf6ca6c71ce61","permalink":"https://hanskersting.github.io/talk/differentiable-likelihoods-for-fast-inversion-of-likelihood-free-dynamical-systems/","publishdate":"2022-03-22T12:11:00Z","relpermalink":"/talk/differentiable-likelihoods-for-fast-inversion-of-likelihood-free-dynamical-systems/","section":"event","summary":"Likelihood-free (a.k.a. simulation-based) inference problems are inverse problems with expensive, or intractable, forward models. ODE inverse problems are commonly treated as likelihood-free, as their forward map has to be numerically approximated by an ODE solver. This, however, is not a fundamental constraint but just a lack of functionality in classic ODE solvers, which do not return a likelihood but a point estimate. To address this shortcoming, we employ Gaussian ODE filtering (a probabilistic numerical method for ODEs) to construct a local Gaussian approximation to the likelihood. This approximation yields tractable estimators for the gradient and Hessian of the (log-)likelihood. Insertion of these estimators into existing gradient-based optimization and sampling methods engenders new solvers for ODE inverse problems. 
We demonstrate that these methods outperform standard likelihood-free approaches on three benchmark systems.","tags":["probabilistic numerics","inverse problems","ODE filtering"],"title":"Differentiable Likelihoods for Fast Inversion of ‘Likelihood-Free’ Dynamical Systems","type":"event"},{"authors":["Hans Kersting","Nicholas Krämer","Martin Schiegg","Christian Daniel","Michael Tiemann","Philipp Hennig"],"categories":[],"content":"","date":1594684800,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1594684800,"objectID":"32427e27a7fd4f834d7f8300df329226","permalink":"https://hanskersting.github.io/publication/differentiable-likelihoods/","publishdate":"2022-03-26T21:06:00Z","relpermalink":"/publication/differentiable-likelihoods/","section":"publication","summary":"Likelihood-free (a.k.a. simulation-based) inference problems are inverse problems with expensive, or intractable, forward models. ODE inverse problems are commonly treated as likelihood-free, as their forward map has to be numerically approximated by an ODE solver. This, however, is not a fundamental constraint but just a lack of functionality in classic ODE solvers, which do not return a likelihood but a point estimate. To address this shortcoming, we employ Gaussian ODE filtering (a probabilistic numerical method for ODEs) to construct a local Gaussian approximation to the likelihood. This approximation yields tractable estimators for the gradient and Hessian of the (log-)likelihood. Insertion of these estimators into existing gradient-based optimization and sampling methods engenders new solvers for ODE inverse problems. We demonstrate that these methods outperform standard likelihood-free approaches on three benchmark systems.","tags":["inverse problems","ODE filters","probabilistic numerics"],"title":"Differentiable Likelihoods for Fast Inversion of ‘Likelihood-Free’ Dynamical Systems","type":"publication"},{"authors":["Filip Tronarp","Hans Kersting","Simo Särkkä","Philipp Hennig"],"categories":[],"content":"","date":1568764800,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1568764800,"objectID":"ecaf034ebe3f235a894b0c187ece8429","permalink":"https://hanskersting.github.io/publication/probabilistic-solutions-to-odes/","publishdate":"2022-03-26T12:56:00Z","relpermalink":"/publication/probabilistic-solutions-to-odes/","section":"publication","summary":"We formulate probabilistic numerical approximations to solutions of ordinary differential equations (ODEs) as problems in Gaussian process (GP) regression with nonlinear measurement functions. This is achieved by defining the measurement sequence to consist of the observations of the difference between the derivative of the GP and the vector field evaluated at the GP, which are all identically zero at the solution of the ODE. When the GP has a state-space representation, the problem can be reduced to a nonlinear Bayesian filtering problem and all widely used approximations to the Bayesian filtering and smoothing problems become applicable. Furthermore, all previous GP-based ODE solvers that are formulated in terms of generating synthetic measurements of the gradient field come out as specific approximations. Based on the nonlinear Bayesian filtering problem posed in this paper, we develop novel Gaussian solvers for which we establish favourable stability properties. Additionally, non-Gaussian approximations to the filtering problem are derived by the particle filter approach. 
The resulting solvers are compared with other probabilistic solvers in illustrative experiments.","tags":["ODE filters","stability","particle filters"],"title":"Probabilistic Solutions to Ordinary Differential Equations as Non-Linear Bayesian Filtering: A New Perspective","type":"publication"},{"authors":[],"categories":[],"content":"Create slides in Markdown with Wowchemy Wowchemy | Documentation\n Features Efficiently write slides in Markdown 3-in-1: Create, Present, and Publish your slides Supports speaker notes Mobile friendly slides Controls Next: Right Arrow or Space Previous: Left Arrow Start: Home Finish: End Overview: Esc Speaker notes: S Fullscreen: F Zoom: Alt + Click PDF Export: E Code Highlighting Inline code: variable\nCode block:\nporridge = \u0026#34;blueberry\u0026#34; if porridge == \u0026#34;blueberry\u0026#34;: print(\u0026#34;Eating...\u0026#34;) Math In-line math: $x + y = z$\nBlock math:\n$$ f\left( x \right) = \frac{2\left( x + 4 \right)\left( x - 4 \right)}{\left( x + 4 \right)\left( x + 1 \right)} $$\n Fragments Make content appear incrementally\n{{% fragment %}} One {{% /fragment %}} {{% fragment %}} **Two** {{% /fragment %}} {{% fragment %}} Three {{% /fragment %}} Press Space to play!\nOne **Two** Three A fragment can accept two optional parameters:\n class: use a custom style (requires definition in custom CSS) weight: sets the order in which a fragment appears Speaker Notes Add speaker notes to your presentation\n{{% speaker_note %}} - Only the speaker can read these notes - Press `S` key to view {{% /speaker_note %}} Press the S key to view the speaker notes!\n Only the speaker can read these notes Press S key to view Themes black: Black background, white text, blue links (default) white: White background, black text, blue links league: Gray background, white text, blue links beige: Beige background, dark text, brown links sky: Blue background, thin dark text, blue links night: Black background, thick white text, orange links serif: Cappuccino background, gray text, brown links simple: White background, black text, blue links solarized: Cream-colored background, dark green text, blue links Custom Slide Customize the slide style and background\n{{\u0026lt; slide background-image=\u0026#34;/media/boards.jpg\u0026#34; \u0026gt;}} {{\u0026lt; slide background-color=\u0026#34;#0000FF\u0026#34; \u0026gt;}} {{\u0026lt; slide class=\u0026#34;my-style\u0026#34; \u0026gt;}} Custom CSS Example Let’s make headers navy colored.\nCreate assets/css/reveal_custom.css with:\n.reveal section h1, .reveal section h2, .reveal section h3 { color: navy; } Questions? 
Ask\nDocumentation\n","date":1549324800,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1549324800,"objectID":"0e6de1a61aa83269ff13324f3167c1a9","permalink":"https://hanskersting.github.io/slides/example/","publishdate":"2019-02-05T00:00:00Z","relpermalink":"/slides/example/","section":"slides","summary":"An introduction to using Wowchemy's Slides feature.","tags":[],"title":"Slides","type":"slides"},{"authors":["Emilia Magnani","Hans Kersting","Michael Schober","Philipp Hennig"],"categories":[],"content":"","date":1506297600,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1506297600,"objectID":"3a52af20e3153dc8f32438f2f839efc0","permalink":"https://hanskersting.github.io/publication/bayesian-filtering-for-odes-with-bounded-derivatives/","publishdate":"2022-03-26T12:38:00Z","relpermalink":"/publication/bayesian-filtering-for-odes-with-bounded-derivatives/","section":"publication","summary":"Recently there has been increasing interest in probabilistic solvers for ordinary differential equations (ODEs) that return full probability measures, instead of point estimates, over the solution and can incorporate uncertainty over the ODE at hand, e.g. if the vector field or the initial value is only approximately known or evaluable. The ODE filter proposed in recent work models the solution of the ODE by a Gauss-Markov process which serves as a prior in the sense of Bayesian statistics. While previous work employed a Wiener process prior on the (possibly multiple times) differentiated solution of the ODE and established equivalence of the corresponding solver with classical numerical methods, this paper raises the question of whether other priors also yield practically useful solvers. To this end, we discuss a range of possible priors which enable fast filtering and propose a new prior---the Integrated Ornstein-Uhlenbeck Process (IOUP)---that complements the existing Integrated Wiener process (IWP) filter by encoding the property that a derivative in time of the solution is bounded in the sense that it tends to drift back to zero. We provide experiments comparing IWP and IOUP filters which support the belief that IWP better approximates solutions of divergent ODEs, whereas IOUP is a better prior for trajectories with bounded derivatives.","tags":["Ornstein--Uhlenbeck process","ODE filters","model selection"],"title":"Bayesian Filtering for ODEs with Bounded Derivatives","type":"publication"},{"authors":["Hans Kersting","Philipp Hennig"],"categories":[],"content":"","date":1464739200,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1464739200,"objectID":"0487bb4ede0051e00e98724e00514b36","permalink":"https://hanskersting.github.io/publication/active-uncertainty-calibration/","publishdate":"2022-03-22T16:55:48Z","relpermalink":"/publication/active-uncertainty-calibration/","section":"publication","summary":"There is resurging interest, in statistics and machine learning, in solvers for ordinary differential equations (ODEs) that return probability measures instead of point estimates. Recently, Conrad et al. introduced a sampling-based class of methods that are 'well-calibrated' in a specific sense. But the computational cost of these methods is significantly above that of classic methods. On the other hand, Schober et al. pointed out a precise connection between classic Runge--Kutta ODE solvers and Gaussian filters, which gives only a rough probabilistic calibration, but at negligible cost overhead. 
By formulating the solution of ODEs as approximate inference in linear Gaussian SDEs, we investigate a range of probabilistic ODE solvers that bridge the trade-off between computational cost and probabilistic calibration, and identify the inaccurate gradient measurement as a crucial source of uncertainty. We propose the novel filtering-based method Bayesian Quadrature filtering (BQF), which uses Bayesian quadrature to actively learn the imprecision in the gradient measurement by collecting multiple gradient evaluations.","tags":["calibration","ODE filters","Bayesian quadrature"],"title":"Active Uncertainty Calibration in Bayesian ODE Solvers","type":"publication"},{"authors":null,"categories":null,"content":"","date":1461715200,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1461715200,"objectID":"d1311ddf745551c9e117aa4bb7e28516","permalink":"https://hanskersting.github.io/project/external-project/","publishdate":"2016-04-27T00:00:00Z","relpermalink":"/project/external-project/","section":"project","summary":"An example of linking directly to an external project website using `external_link`.","tags":["Demo"],"title":"External Project","type":"project"},{"authors":null,"categories":null,"content":"Lorem ipsum dolor sit amet, consectetur adipiscing elit. Duis posuere tellus ac convallis placerat. Proin tincidunt magna sed ex sollicitudin condimentum. Sed ac faucibus dolor, scelerisque sollicitudin nisi. Cras purus urna, suscipit quis sapien eu, pulvinar tempor diam. Quisque risus orci, mollis id ante sit amet, gravida egestas nisl. Sed ac tempus magna. Proin in dui enim. Donec condimentum, sem id dapibus fringilla, tellus enim condimentum arcu, nec volutpat est felis vel metus. Vestibulum sit amet erat at nulla eleifend gravida.\nNullam vel molestie justo. Curabitur vitae efficitur leo. In hac habitasse platea dictumst. Sed pulvinar mauris dui, eget varius purus congue ac. Nulla euismod, lorem vel elementum dapibus, nunc justo porta mi, sed tempus est est vel tellus. Nam et enim eleifend, laoreet sem sit amet, elementum sem. Morbi ut leo congue, maximus velit ut, finibus arcu. In et libero cursus, rutrum risus non, molestie leo. Nullam congue quam et volutpat malesuada. Sed risus tortor, pulvinar et dictum nec, sodales non mi. Phasellus lacinia commodo laoreet. Nam mollis, erat in feugiat consectetur, purus eros egestas tellus, in auctor urna odio at nibh. Mauris imperdiet nisi ac magna convallis, at rhoncus ligula cursus.\nCras aliquam rhoncus ipsum, in hendrerit nunc mattis vitae. Duis vitae efficitur metus, ac tempus leo. Cras nec fringilla lacus. Quisque sit amet risus at ipsum pharetra commodo. Sed aliquam mauris at consequat eleifend. Praesent porta, augue sed viverra bibendum, neque ante euismod ante, in vehicula justo lorem ac eros. Suspendisse augue libero, venenatis eget tincidunt ut, malesuada at lorem. Donec vitae bibendum arcu. Aenean maximus nulla non pretium iaculis. Quisque imperdiet, nulla in pulvinar aliquet, velit quam ultrices quam, sit amet fringilla leo sem vel nunc. Mauris in lacinia lacus.\nSuspendisse a tincidunt lacus. Curabitur at urna sagittis, dictum ante sit amet, euismod magna. Sed rutrum massa id tortor commodo, vitae elementum turpis tempus. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Aenean purus turpis, venenatis a ullamcorper nec, tincidunt et massa. Integer posuere quam rutrum arcu vehicula imperdiet. Mauris ullamcorper quam vitae purus congue, quis euismod magna eleifend. Vestibulum semper vel augue eget tincidunt. 
Fusce eget justo sodales, dapibus odio eu, ultrices lorem. Duis condimentum lorem id eros commodo, in facilisis mauris scelerisque. Morbi sed auctor leo. Nullam volutpat a lacus quis pharetra. Nulla congue rutrum magna a ornare.\nAliquam in turpis accumsan, malesuada nibh ut, hendrerit justo. Cum sociis natoque penatibus et magnis dis parturient montes, nascetur ridiculus mus. Quisque sed erat nec justo posuere suscipit. Donec ut efficitur arcu, in malesuada neque. Nunc dignissim nisl massa, id vulputate nunc pretium nec. Quisque eget urna in risus suscipit ultricies. Pellentesque odio odio, tincidunt in eleifend sed, posuere a diam. Nam gravida nisl convallis semper elementum. Morbi vitae felis faucibus, vulputate orci placerat, aliquet nisi. Aliquam erat volutpat. Maecenas sagittis pulvinar purus, sed porta quam laoreet at.\n","date":1461715200,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":1461715200,"objectID":"8f66d660a9a2edc2d08e68cc30f701f7","permalink":"https://hanskersting.github.io/project/internal-project/","publishdate":"2016-04-27T00:00:00Z","relpermalink":"/project/internal-project/","section":"project","summary":"An example of using the in-built project page.","tags":["Deep Learning"],"title":"Internal Project","type":"project"},{"authors":null,"categories":null,"content":"","date":-62135596800,"expirydate":-62135596800,"kind":"page","lang":"en","lastmod":-62135596800,"objectID":"f26b5133c34eec1aa0a09390a36c2ade","permalink":"https://hanskersting.github.io/admin/config.yml","publishdate":"0001-01-01T00:00:00Z","relpermalink":"/admin/config.yml","section":"","summary":"","tags":null,"title":"","type":"wowchemycms"}]