
Question about SNGP precision matrix #330

Closed
jeffwillette opened this issue Apr 20, 2021 · 2 comments

@jeffwillette

@jereliu I have been struggling to explain to myself why the SNGP method creates a precision matrix instead of a covariance matrix. It seems that every formulation of RFF that I can find approximates the RBF kernel matrix and not its inverse.

What is it about SNGP that makes this RFF formulation approximate the RBF kernel inverse?

@jereliu
Collaborator

jereliu commented May 11, 2021

Hey Jeff.

The reason we compute a precision matrix in SNGP is that we are computing the posterior variance of the model, rather than the prior variance, and the posterior variance involves the inverse of the kernel matrix. For example, the posterior variance under a Gaussian likelihood is:

$$
\mathrm{Var}(f_* \mid X, \mathbf{y}) = k(x_*, x_*) - k(x_*, X)\,\bigl[K(X, X) + \sigma^2 I\bigr]^{-1} k(X, x_*)
$$

where, at training time, the precision matrix update of SNGP approximates the term inside the inversion. The exact formula used is not identical to the one above because of the use of random features, but the general idea carries through.
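To make the idea concrete, here is a minimal NumPy sketch of this pattern, under the weight-space view with random Fourier features: the precision matrix is accumulated with rank-1 updates at training time, and its inverse yields the posterior variance at test time. All names (`make_rff`, `precision`, etc.) and hyperparameters are illustrative, not taken from the SNGP codebase.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_rff(X, W, b):
    # phi(x) = sqrt(2/D) * cos(W x + b) approximates the unit-lengthscale
    # RBF kernel k(x, x') = exp(-||x - x'||^2 / 2) as D grows.
    D = W.shape[0]
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

d, D, n = 2, 256, 50
W = rng.standard_normal((D, d))          # random frequencies ~ N(0, I)
b = rng.uniform(0.0, 2 * np.pi, size=D)  # random phases

X_train = rng.standard_normal((n, d))
Phi = make_rff(X_train, W, b)            # (n, D) feature matrix

# Training time: accumulate the precision matrix with rank-1 updates,
# Sigma^{-1} = I + (1 / sigma^2) * sum_i phi_i phi_i^T,
# i.e. the "portion within the inversion" in feature space.
sigma2 = 0.1
precision = np.eye(D)
for phi in Phi:
    precision += np.outer(phi, phi) / sigma2

# Test time: invert once; the posterior variance at x* is
# phi(x*)^T Sigma phi(x*), the feature-space analogue of the
# kernel formula k** - k*^T (K + sigma^2 I)^{-1} k*.
covariance = np.linalg.inv(precision)
X_test = rng.standard_normal((5, d))
Phi_test = make_rff(X_test, W, b)
post_var = np.einsum('nd,de,ne->n', Phi_test, covariance, Phi_test)
print(post_var)
```

Because the precision matrix only ever grows by positive semi-definite rank-1 terms, the posterior variance can never exceed the prior variance, and the running-sum update is cheap to maintain over minibatches.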

For more detail, please see Chapters 2 & 3 of the GPML book. Thanks :)

@jereliu

jereliu commented May 11, 2021

Closing this issue now. Feel free to reopen if you'd like to discuss more. Also happy to chat offline through email.

@jereliu jereliu closed this as completed May 11, 2021