models Package

explauto.models.gmminf.schur_complement(mat, row, col)[source]

Compute the Schur complement of the block mat[row:, col:] of the matrix mat.
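
The Schur complement shows up here because conditioning a multivariate Gaussian on some of its dimensions yields a conditional covariance that is exactly a Schur complement of the joint covariance. Below is a minimal NumPy sketch of the formula; the helper name schur_complement_sketch and the block partition around (row, col) are illustrative assumptions, not explauto's exact implementation:

    import numpy as np

    def schur_complement_sketch(mat, row, col):
        # Partition mat into blocks around (row, col):
        #   mat = [[A, B],
        #          [C, D]]  with D = mat[row:, col:]
        A = mat[:row, :col]
        B = mat[:row, col:]
        C = mat[row:, :col]
        D = mat[row:, col:]
        # Schur complement of the block D in mat: A - B D^{-1} C
        return A - B @ np.linalg.inv(D) @ C

    mat = np.array([[4.0, 1.0, 0.5],
                    [1.0, 3.0, 0.2],
                    [0.5, 0.2, 2.0]])
    print(schur_complement_sketch(mat, 2, 2))  # 2x2 complement of the 1x1 block D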

explauto.models.gmminf.conditional(mean, covar, dims_in, dims_out, covariance_type='full')[source]

Return a function f such that f(x) = p(dims_out | dims_in = x). (f actually returns the mean and covariance of the conditional distribution.)
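
A hedged usage sketch, assuming (per the docstring above) that the returned function yields the (mean, covariance) pair of the conditional Gaussian; the list-valued dimension arguments and the shape of x are also assumptions:

    import numpy as np
    from explauto.models.gmminf import conditional

    mean = np.array([0.0, 1.0])
    covar = np.array([[1.0, 0.8],
                      [0.8, 2.0]])

    # p(V2 | V1 = x): condition dimension 1 on dimension 0.
    f = conditional(mean, covar, dims_in=[0], dims_out=[1])
    cond_mean, cond_covar = f(np.array([0.5]))  # assumed return: (mean, covariance)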

class explauto.models.gmminf.GMM(**kwargs)[source]

Bases: sklearn.mixture.gmm.GMM

probability(value)[source]
sub_gmm(inds_k)[source]
conditional(in_dims, out_dims)[source]
inference(in_dims, out_dims, value=None)[source]

Perform Bayesian inference on the GMM. Let’s call V = V1...Vd the d-dimensional space on which the current GMM is defined, such that it represents P(V). Let’s call X and Y two disjoint subspaces of V, with corresponding dimension indices in_dims and out_dims. This method returns the GMM for P(Y | X=value).

Parameters:
  • in_dims (list) – the dimension indices of X (a subset of range(d)). This can be the empty list if one wants to compute the marginal P(Y).
  • out_dims (list) – the dimension indices of Y (a subset of range(d), disjoint from in_dims).
  • value (numpy.array) – the value of X for which one wants to compute the conditional (ignored if in_dims=[]).
Returns:

the GMM corresponding to P(Y | X=value) (or to P(Y) if in_dims=[])

Note

For example, if X = V1...Vm and Y = Vm+1...Vd, then P(Y | X=v1...vm) is returned by self.inference(in_dims=range(m), out_dims=range(m, d), value=array([v1, ..., vm])).
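
An end-to-end usage sketch, assuming that the constructor keyword arguments are forwarded to sklearn’s GMM (n_components, covariance_type, ...) and that fit() behaves as in sklearn; the data are toy numbers:

    import numpy as np
    from explauto.models.gmminf import GMM

    # Toy 2-D data where V2 is (noisily) twice V1.
    rng = np.random.RandomState(0)
    v1 = rng.randn(500)
    data = np.column_stack([v1, 2.0 * v1 + 0.1 * rng.randn(500)])

    gmm = GMM(n_components=2, covariance_type='full')  # kwargs assumed forwarded to sklearn
    gmm.fit(data)

    # P(V2 | V1 = 0.5): a new GMM over the output dimension only.
    posterior = gmm.inference(in_dims=[0], out_dims=[1], value=np.array([0.5]))

    # P(V2): the marginal, obtained with an empty in_dims.
    marginal = gmm.inference(in_dims=[], out_dims=[1])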

ellipses2D(colors)[source]
ellipses3D()[source]
plot(ax, label=False)[source]
plot_projection(ax, dims, label=False)[source]