cntk:binary-cross-entropy

cntk:binary-cross-entropy(
   $prediction as cntk:variable,
   $targets as cntk:variable,
   [$name as xs:string]
) as cntk:function

Summary

Computes the binary cross entropy (also known as logistic loss) between the prediction and the target.
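In standard form (a sketch of the textbook definition, which this function applies elementwise), for a predicted probability $p$ and a target label $t \in \{0, 1\}$:

```latex
\mathrm{BCE}(p, t) = -\bigl(t \log p + (1 - t)\log(1 - p)\bigr)
```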

Parameters
$prediction The posterior probability, computed by the network, that the variable is 1 (typically the output of a sigmoid).
$targets The ground-truth label, 0 or 1.
$name The name of the function instance in the network.
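To make the semantics concrete, here is a minimal Python sketch of the elementwise loss this function computes; it is an independent illustration of the formula, not the cntk API, and the function name is ours.

```python
import math

def binary_cross_entropy(p, t):
    """Logistic loss for a predicted probability p (0 < p < 1)
    and a ground-truth label t (0 or 1)."""
    return -(t * math.log(p) + (1 - t) * math.log(1 - p))

# A confident, correct prediction yields a small loss;
# a confident, wrong prediction yields a large one.
low = binary_cross_entropy(0.99, 1)   # close to 0
high = binary_cross_entropy(0.01, 1)  # large penalty
```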

Example

  let $input-variable1 := cntk:input-variable(cntk:shape((3)), "float",
    fn:false(), fn:false(), "feature")
  let $input-variable2 := cntk:input-variable(cntk:shape((3)), "float",
    fn:false(), fn:false(), "feature")
  return cntk:binary-cross-entropy($input-variable1, $input-variable2, "loss")
  => cntk:function(Composite BinaryCrossEntropy (Input(Name(feature),
  Shape([3]), Dynamic Axes([Sequence Axis(Default Dynamic Axis), Batch Axis(Default Batch Axis)])), Input(Name(feature), Shape([3]), Dynamic Axes([Sequence Axis(Default Dynamic Axis), Batch Axis(Default Batch Axis)]))))
