
cntk:batch-normalization

cntk:batch-normalization(
   $operand as cntk:variable,
   $scale as cntk:variable,
   $bias as cntk:variable,
   $running-mean as cntk:variable,
   $running-inv-std as cntk:variable,
   $running-count as cntk:variable,
   $spatial as xs:boolean,
   [$map as map:map]
) as cntk:function

Summary

Normalizes the layer's outputs over each minibatch, independently for each output (feature), and then applies a learned affine transformation to preserve the representational capacity of the layer.
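As a sketch of the computation (epsilon is an internal stabilizing constant, not one of the parameters listed below): for each feature, or for each feature map when spatial is true, the minibatch statistics are used to normalize the input, and the result is scaled and shifted:

$$\hat{x} = \frac{x - \mathrm{E}[x]}{\sqrt{\mathrm{Var}[x] + \epsilon}}, \qquad y = \gamma\,\hat{x} + \beta$$

where $\gamma$ corresponds to $scale and $\beta$ to $bias. At evaluation time the running mean and running variance are used in place of the minibatch statistics.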

Parameters
$operand The input of the batch normalization operation.
$scale Parameter tensor that holds the learned componentwise scaling factors.
$bias Parameter tensor that holds the learned bias. scale and bias must have the same dimensions, which must equal the input dimensions when spatial = false, or the number of output convolution feature maps when spatial = true.
$running-mean Running mean, which is used during the evaluation phase and might be used during training as well. You must pass a constant tensor with initial value 0 and the same dimensions as scale and bias.
$running-inv-std Running variance, represented in the same way as running-mean.
$running-count Denotes the total number of samples that have been used so far to compute the running-mean and running-inv-std parameters. You must pass a scalar (a rank-0 constant).
$spatial Flag that indicates whether to compute the mean/variance for each feature in a minibatch independently or, in the case of convolutional layers, per feature map.
$map Additional learning options.
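The parameters above can be wired together as in the following sketch. The exact constructor arities for cntk:shape, cntk:input-variable, cntk:parameter, and cntk:constant are assumptions here; consult their reference pages for the precise signatures.

```xquery
xquery version "1.0-ml";

(: Hedged sketch, not a verified program: creates a 64-feature input,
   learnable scale/bias parameters, zero-initialized running statistics,
   and a rank-0 running count, then builds the batch normalization node
   with spatial = false. Constructor signatures are assumptions. :)
let $shape := cntk:shape((64))
let $x := cntk:input-variable($shape, "float")
let $scale := cntk:parameter($shape, "float")
let $bias := cntk:parameter($shape, "float")
let $running-mean := cntk:constant($shape, "float", 0.0)
let $running-inv-std := cntk:constant($shape, "float", 0.0)
let $running-count := cntk:constant(cntk:shape(()), "float", 0.0)
return cntk:batch-normalization(
  $x, $scale, $bias,
  $running-mean, $running-inv-std, $running-count,
  fn:false())
```

Note that the running statistics are passed as constants with initial value 0, as required by the $running-mean description above; the framework updates them during training.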
