additional clarifying comments
varunagrawal committed Feb 18, 2024
1 parent 4b2a22e commit 6d7dc57
Showing 2 changed files with 5 additions and 3 deletions.
2 changes: 1 addition & 1 deletion gtsam/hybrid/HybridBayesNet.cpp
@@ -291,7 +291,7 @@ AlgebraicDecisionTree<Key> HybridBayesNet::modelSelection() const {
                          std::numeric_limits<double>::max());
    }

-   // Compute the error for X* and the assignment
+   // Compute the error at the MLE point X* for the current assignment
    double error =
        this->error(HybridValues(mu, DiscreteValues(assignment)));
6 changes: 4 additions & 2 deletions gtsam/hybrid/HybridGaussianFactorGraph.cpp
@@ -295,8 +295,9 @@ static std::shared_ptr<Factor> createDiscreteFactor(
    if (!factor) return 1.0;  // TODO(dellaert): not loving this.

    // Logspace version of:
-   // exp(-factor->error(kEmpty)) / conditional->normalizationConstant();
-   return -factor->error(kEmpty) - conditional->logNormalizationConstant();
+   // exp(-factor->error(kEmpty)) * conditional->normalizationConstant();
+   // We take negative of the logNormalizationConstant (1/k) to get k
+   return -factor->error(kEmpty) + (-conditional->logNormalizationConstant());

Comment by cntaylor (Contributor) · Feb 19, 2024:

I believe there should not be a negative in front of conditional->logNormalizationConstant(), at least if the call goes to GaussianConditional::logNormalizationConstant(). The GaussianConditional call returns the log of 'k'. This leaves two options:

  1. The negative should not be here, and the comment is incorrect as well (logNormalizationConstant returns log(k), so there is no need to negate).
  2. The error is in GaussianConditional (lines 184-194), as it should return log(1/k) as opposed to log(k)?

Comment by varunagrawal (Author, Collaborator) · Feb 19, 2024:


GaussianConditional::logNormalizationConstant returns log(1/k). Its docstring says:

/**
 * normalization constant = 1.0 / sqrt((2*pi)^n*det(Sigma))
 * log = - 0.5 * n*log(2*pi) - 0.5 * log det(Sigma)
 */
double logNormalizationConstant() const override;

So to get log(k), I need to put in the negative. The comment does need to be improved, though, since I am taking the negative of log(1/k) and not of (1/k). Thanks for catching that.

Comment by cntaylor (Contributor) · Feb 19, 2024:

I guess we need to define 'k'. I was defining it as the constant by which exp(-error) is multiplied to get the true probability. For a scalar, I would say p(x) = k * exp(-(x-z)^2 / (2*sigma^2)), where k is what you describe in the docstring for logNormalizationConstant. If you don't have the negative in HybridGaussian..., then you are getting the log probability defined by a multi-variable normal distribution. When you then divide by that 'k' (like the code in HybridGaussian... does), you are no longer getting the probability. You are getting ... I'm not sure what.

Comment by varunagrawal (Author, Collaborator), via email · Feb 19, 2024 (content not shown).

};

AlgebraicDecisionTree<Key> logProbabilities(
@@ -324,6 +325,7 @@ static std::shared_ptr<Factor> createGaussianMixtureFactor(
    if (factor) {
      auto hf = std::dynamic_pointer_cast<HessianFactor>(factor);
      if (!hf) throw std::runtime_error("Expected HessianFactor!");
+     // Add 2.0 term since the constant term will be premultiplied by 0.5
      hf->constantTerm() += 2.0 * conditional->logNormalizationConstant();
    }
    return factor;
