@jereliu I have been struggling to explain to myself why the SNGP method creates a precision matrix instead of a covariance matrix. It seems that every formulation of RFF that I can find approximates the RBF kernel matrix and not its inverse.
What is it about SNGP that makes this RFF formulation approximate the RBF kernel inverse?
The reason we compute a precision matrix in SNGP is that we are computing the posterior variance of the model, rather than the prior variance, and computing the posterior variance involves the inverse of the kernel. For example, the posterior variance under a Gaussian likelihood is:

$$\mathrm{Var}(f_* \mid X, \mathbf{y}) = k(x_*, x_*) - \mathbf{k}_*^\top \left( K + \sigma^2 I \right)^{-1} \mathbf{k}_*,$$

where $K = k(X, X)$ is the kernel matrix over the training inputs, $\mathbf{k}_* = k(X, x_*)$, and $\sigma^2$ is the observation-noise variance. At training time, the precision-matrix update of SNGP approximates the portion within the inversion, $K + \sigma^2 I$. Of course, the exact formula used is not exactly the same as the one above due to the use of random features, but the general idea carries through.
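To make this concrete, here is a minimal NumPy sketch (not the actual SNGP/TensorFlow implementation) of how random features move the inversion from kernel space to feature space: with features $\phi(x) \in \mathbb{R}^D$, training only needs to accumulate the $D \times D$ precision matrix $\Phi^\top \Phi + \sigma^2 I$, which is inverted once at test time. The feature dimension `D`, noise variance `sigma2`, and the data below are all illustrative assumptions, not values from SNGP itself.

```python
import numpy as np

rng = np.random.default_rng(0)

d, D, n = 4, 128, 200          # input dim, number of random features, training size
sigma2 = 1.0                   # assumed Gaussian-likelihood noise variance

# Random Fourier features approximating an RBF kernel:
# k(x, x') ~= phi(x)^T phi(x'), with phi(x) = sqrt(2/D) * cos(W x + b),
# W ~ N(0, I) (unit lengthscale) and b ~ Uniform(0, 2*pi).
W = rng.normal(size=(D, d))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def phi(X):
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

X_train = rng.normal(size=(n, d))
Phi = phi(X_train)             # (n, D) feature matrix

# Training-time quantity: the feature-space precision matrix
# Phi^T Phi + sigma^2 I, i.e. the "portion within the inversion".
# SNGP accumulates this over minibatches; the one-shot sum here
# carries the same idea.
precision = Phi.T @ Phi + sigma2 * np.eye(D)

# Test time: invert once to get the posterior covariance over the
# feature weights (assuming a standard-normal prior on the weights).
covariance = sigma2 * np.linalg.inv(precision)

x_star = rng.normal(size=(1, d))
phi_star = phi(x_star)
var_star = float(phi_star @ covariance @ phi_star.T)
print(f"approximate posterior variance at x*: {var_star:.4f}")
```

Note the payoff of tracking the precision rather than the covariance: the precision matrix is a simple additive sum of per-example outer products, so it can be updated cheaply during training, whereas the covariance would require re-inverting after every update.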
For more detail, please see Chapters 2 & 3 of the GPML book. Thanks :)