
cntk:cross-entropy-with-softmax

cntk:cross-entropy-with-softmax(
   $output_vector as cntk:variable,
   $target_vector as cntk:variable,
   $axis as cntk:axis,
   [$name as xs:string]
) as cntk:function

Summary

This operation computes the cross entropy between the target_vector and the softmax of the output_vector. The elements of target_vector must be non-negative and should sum to 1. The output_vector can contain any values; the function internally applies softmax to it before computing the cross entropy.
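Concretely, the computation is the standard softmax cross-entropy loss (a sketch in our own notation, not taken from the function signature above):

  softmax(output)_i = exp(output_i) / sum_j exp(output_j)
  loss = -sum_i target_i * log(softmax(output)_i)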

Parameters
$output_vector The unscaled computed output values from the network.
$target_vector Usually a one-hot vector whose hot bit corresponds to the label index, but it can be any probability distribution over the labels.
$axis If given, cross entropy will be computed along this axis.
$name The name of the function instance in the network.

Example

  let $input-variable1 := cntk:input-variable(cntk:shape((3)), "float",
    fn:false(), fn:false(), "feature")
  let $input-variable2 := cntk:input-variable(cntk:shape((3)), "float",
    fn:false(), fn:false(), "feature")
  return cntk:cross-entropy-with-softmax($input-variable1, $input-variable2,
    cntk:axis(0), " a_sr2_")
  => cntk:function(Composite CrossEntropyWithSoftmax (Input(Name(feature),
  Shape([3]), Dynamic Axes([Sequence Axis(Default Dynamic Axis),
  Batch Axis(Default Batch Axis)])), Input(Name(feature), Shape([3]),
  Dynamic Axes([Sequence Axis(Default Dynamic Axis), Batch Axis(Default Batch Axis)]))))
