Per-dimension constraints on RBFKernel lengthscales #1813
-
I am fitting a GP to a dataset with 379 points in F=7 dimensions, using the combination of RBFKernel, FixedNoiseGP, ExactMarginalLogLikelihood, and fit_gpytorch_mll. After fitting, the kernel lengthscales look like this:
What is worrisome is that the 5th lengthscale is very low:
which leads to heavy overfitting on the corresponding feature. As recommended here, I put a LessThan constraint on my RBFKernel lengthscale like this:
This regularizes the GP fine, but it still feels like I may be constraining the other dimensions too much. Q1: Is there a way to create a list of F separate constraints, so I can bound just one dimension's lengthscale? Also, I noticed that the learnt lengthscales are very unreliable: on each refit the values come out significantly different. Q2: How can I achieve stable fitting of the GP, in the sense that the lengthscales end up about the same on each refit?
-
After the authors of GPyTorch confirmed in this discussion that lengthscale bounds can be 1D tensors like the following snippet, I ran into an exception coming from BoTorch.
Exception:
It seems that BoTorch does not support tensor constraints for lengthscales. Any advice, apart from fixing it and making a PR? :)
Thanks for reporting. This is fixed in #1843.