Radial Flow

The radial flow was introduced in [REZENDE2015]; we use the parametrization from [TRIPPE2018]. The flow was originally designed for variational inference and sampling, so it does not directly fit our use case of density estimation. Since we specifically need the inverse \(f^{-1}(x)\) of the flow to be easily computable, we invert its direction, defining it as a mapping from the transformed distribution \(p_1(x)\) to the base distribution \(p_0(x)\). Hence the flow is called InvertedRadialFlow in our implementation and the forward method is not implemented.

\[ \mathbf{\gamma} \in \mathbb{R}^d, \quad \alpha, \beta \in \mathbb{R} \]

\[ f^{-1}(\mathbf{x}) = \mathbf{x} + \dfrac{\alpha\beta(\mathbf{x} - \mathbf{\gamma})}{\alpha + |\mathbf{x}-\mathbf{\gamma}|} \]

To ensure that \(f(x)\) exists, we have to constrain the parameters of the flow (a short numerical sketch follows the list):

  • \(\alpha \geq 0\) needs to hold. Therefore we apply a softplus transformation to \(\alpha\)

  • \(\beta \geq -1\) needs to hold. We apply the transformation \(x \mapsto \exp(x) - 1\) to \(\beta\) before assignment
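
A small numerical sketch of these constraints and of \(f^{-1}\) (an illustration in plain NumPy, not the library's TensorFlow implementation; all names are ad hoc):

    import numpy as np

    def constrain(alpha_raw, beta_raw):
        """Map unconstrained parameters to the admissible ranges."""
        alpha = np.log1p(np.exp(alpha_raw))   # softplus    -> alpha >= 0
        beta = np.exp(beta_raw) - 1.0         # exp(x) - 1  -> beta >= -1
        return alpha, beta

    def inverse_radial(x, alpha, beta, gamma):
        """f^{-1}(x) = x + alpha * beta * (x - gamma) / (alpha + |x - gamma|)"""
        r = np.linalg.norm(x - gamma)
        return x + alpha * beta * (x - gamma) / (alpha + r)

    alpha, beta = constrain(-0.5, 0.3)
    gamma = np.array([0.0, 0.5])
    print(inverse_radial(np.array([1.0, -1.0]), alpha, beta, gamma))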

Jacobian determinant:

\[ \det(\mathbf{J}^{-1}) = \lbrack 1 + \alpha\beta \, h(\alpha, r)\rbrack^{d-1}\, \lbrack 1 + \alpha\beta \, h(\alpha, r) + \alpha\beta \, h'(\alpha, r)\, r\rbrack \]

\[ h(\alpha, r) = \dfrac{1}{\alpha + r}, \qquad r = |\mathbf{x} - \mathbf{\gamma}| \]
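
As a quick sanity check, this closed-form determinant can be compared against a finite-difference Jacobian of \(f^{-1}\). A sketch in plain NumPy (not part of the library; names are ad hoc):

    import numpy as np

    def inverse_radial(x, alpha, beta, gamma):
        r = np.linalg.norm(x - gamma)
        return x + alpha * beta * (x - gamma) / (alpha + r)

    def det_closed_form(x, alpha, beta, gamma):
        d, r = x.shape[0], np.linalg.norm(x - gamma)
        h = 1.0 / (alpha + r)                # h(alpha, r)
        h_prime = -1.0 / (alpha + r) ** 2    # h'(alpha, r)
        return (1 + alpha * beta * h) ** (d - 1) * \
               (1 + alpha * beta * h + alpha * beta * h_prime * r)

    def det_numeric(x, alpha, beta, gamma, eps=1e-6):
        """Finite-difference Jacobian of f^{-1}, then its determinant."""
        d = x.shape[0]
        J = np.zeros((d, d))
        for i in range(d):
            e = np.zeros(d)
            e[i] = eps
            J[:, i] = (inverse_radial(x + e, alpha, beta, gamma) -
                       inverse_radial(x - e, alpha, beta, gamma)) / (2 * eps)
        return np.linalg.det(J)

    x, gamma = np.array([0.3, -1.2, 0.7]), np.array([0.0, 0.5, -0.5])
    alpha, beta = 1.5, 0.8                   # alpha >= 0, beta >= -1
    print(det_closed_form(x, alpha, beta, gamma))  # agrees with the line below
    print(det_numeric(x, alpha, beta, gamma))
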
class cde.density_estimator.normalizing_flows.InvertedRadialFlow(params, n_dims, validate_args=False, name='InvertedRadialFlow')[source]

Implements a bijector x = y + (alpha * beta * (y - y_0)) / (alpha + abs(y - y_0)).

Parameters
  • params – Tensor of shape (?, n_dims+2). This will be split into the parameters alpha (?, 1), beta (?, 1) and gamma (?, n_dims). Furthermore, alpha and beta will be constrained to ensure the invertibility of the flow (see the usage sketch after this parameter list)

  • n_dims – The dimension of the distribution that will be transformed

  • name – The name to give this particular flow
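
A minimal usage sketch (assuming TensorFlow 1.x graph mode and that the package is importable; the random parameter values and the example input are arbitrary):

    import numpy as np
    import tensorflow as tf
    from cde.density_estimator.normalizing_flows import InvertedRadialFlow

    n_dims = 2
    # unconstrained parameters per batch row: alpha, beta and gamma, as described above
    params = tf.constant(
        np.random.randn(1, InvertedRadialFlow.get_param_size(n_dims)), dtype=tf.float32)
    flow = InvertedRadialFlow(params, n_dims=n_dims)

    y = tf.constant([[0.5, -1.0]], dtype=tf.float32)
    x = flow.inverse(y)                          # map towards the base distribution
    log_det = flow.inverse_log_det_jacobian(y)   # log |det J^{-1}|

    with tf.Session() as sess:
        print(sess.run([x, log_det]))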

dtype

dtype of `Tensor`s transformable by this bijector.

event_ndims

Returns the number of event dimensions this bijector operates on.

forward(x)[source]

Sampling (the forward direction) is not required for density estimation and would be slow to compute, therefore it is not implemented.

Raises

NotImplementedError

forward_event_shape(input_shape)

Shape of a single sample from a single batch as a TensorShape.

Same meaning as forward_event_shape_tensor. May be only partially defined.

Parameters

input_shape – TensorShape indicating event-portion shape passed into forward function.

Returns

TensorShape indicating event-portion shape after applying forward. Possibly unknown.

Return type

forward_event_shape_tensor

forward_event_shape_tensor(input_shape, name='forward_event_shape_tensor')

Shape of a single sample from a single batch as an int32 1D Tensor.

Parameters
  • input_shape – Tensor, int32 vector indicating event-portion shape passed into forward function.

  • name – name to give to the op

Returns

Tensor, int32 vector indicating event-portion shape after applying forward.

Return type

forward_event_shape_tensor

forward_log_det_jacobian(x, name='forward_log_det_jacobian')

Returns the forward_log_det_jacobian.

Parameters
  • x – Tensor. The input to the “forward” Jacobian evaluation.

  • name – The name to give this op.

Returns

Tensor, if this bijector is injective.

If not injective this is not implemented.

Raises
  • TypeError – if self.dtype is specified and x.dtype is not self.dtype.

  • NotImplementedError – if neither _forward_log_det_jacobian nor {_inverse, _inverse_log_det_jacobian} are implemented, or this is a non-injective bijector.

static get_param_size(n_dims)[source]
Parameters

n_dims – The dimension of the distribution to be transformed by the flow

Returns

(int) The dimension of the parameter space for the flow
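
For illustration, with the parameter layout described above (alpha and beta of size one each, gamma of size n_dims), a flow over a 2-dimensional distribution should need n_dims + 2 = 4 parameters:

    from cde.density_estimator.normalizing_flows import InvertedRadialFlow

    print(InvertedRadialFlow.get_param_size(2))  # expected: 4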

graph_parents

Returns this Bijector’s graph_parents as a Python list.

inverse(y, name='inverse')

Returns the inverse Bijector evaluation, i.e., X = g^{-1}(Y).

Parameters
  • y – Tensor. The input to the “inverse” evaluation.

  • name – The name to give this op.

Returns

Tensor, if this bijector is injective.

If not injective, returns the k-tuple containing the unique k points (x1, …, xk) such that g(xi) = y.

Raises
  • TypeError – if self.dtype is specified and y.dtype is not self.dtype.

  • NotImplementedError – if _inverse is not implemented.

inverse_event_shape(output_shape)

Shape of a single sample from a single batch as a TensorShape.

Same meaning as inverse_event_shape_tensor. May be only partially defined.

Parameters

output_shape – TensorShape indicating event-portion shape passed into inverse function.

Returns

TensorShape indicating event-portion shape after applying inverse. Possibly unknown.

Return type

inverse_event_shape_tensor

inverse_event_shape_tensor(output_shape, name='inverse_event_shape_tensor')

Shape of a single sample from a single batch as an int32 1D Tensor.

Parameters
  • output_shape – Tensor, int32 vector indicating event-portion shape passed into inverse function.

  • name – name to give to the op

Returns

Tensor, int32 vector indicating event-portion shape after applying inverse.

Return type

inverse_event_shape_tensor

inverse_log_det_jacobian(y, name='inverse_log_det_jacobian')

Returns the (log o det o Jacobian o inverse)(y).

Mathematically, returns: log(det(dX/dY))(Y). (Recall that: X=g^{-1}(Y).)

Note that forward_log_det_jacobian is the negative of this function, evaluated at g^{-1}(y).
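
Written out as a formula (with \(g\) denoting the forward mapping of the bijector):

\[ \log\lvert\det \mathbf{J}_{g}(x)\rvert = -\log\lvert\det \mathbf{J}_{g^{-1}}(y)\rvert, \qquad x = g^{-1}(y) \]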

Parameters
  • y – Tensor. The input to the “inverse” Jacobian evaluation.

  • name – The name to give this op.

Returns

Tensor, if this bijector is injective.

If not injective, returns the tuple of local log det Jacobians, log(det(Dg_i^{-1}(y))), where g_i is the restriction of g to the ith partition Di.

Raises
  • TypeError – if self.dtype is specified and y.dtype is not self.dtype.

  • NotImplementedError – if _inverse_log_det_jacobian is not implemented.

is_constant_jacobian

Returns true iff the Jacobian is not a function of x.

Note: Jacobian is either constant for both forward and inverse or neither.

Returns

Python bool.

Return type

is_constant_jacobian

name

Returns the string name of this Bijector.

validate_args

Returns True if Tensor arguments will be validated.

REZENDE2015

Rezende, Mohamed (2015). Variational Inference with Normalizing Flows (http://arxiv.org/abs/1505.05770)

TRIPPE2018

Trippe, Turner (2018). Conditional Density Estimation with Bayesian Normalising Flows (http://arxiv.org/abs/1802.04908)