
cntk.optimizedRnnstack

cntk.optimizedRnnstack(
   $operand as cntk.variable,
   $weights as Number,
   $hidden-size as (Number|String),
   $num-layers as (Number|String),
   [$bidirectional as Boolean],
   [$recurrent-op as String],
   [$name as String]
) as cntk.function

Summary

An RNN implementation that uses the primitives in cuDNN. If cuDNN is not available, the function fails. To run a model on a machine without cuDNN, use convert_optimized_rnnstack to convert it to a GEMM-based implementation.

Parameters
$operand Input of the optimized RNN stack.
$weights Parameter tensor that holds the learned weights.
$hidden-size Number of hidden units in each layer (and in each direction).
$num-layers Number of layers in the stack.
$bidirectional Whether each layer should compute the recurrence both forward and backward and concatenate the results (if true, the output size is twice $hidden-size). The default is false, meaning the recurrence is computed only in the forward direction.
$recurrent-op One of "lstm", "gru", "relu", or "tanh".
$name The name of the function instance in the network.
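Putting the parameters together, a minimal server-side JavaScript sketch might look like the following. This is illustrative only: the exact `cntk.inputVariable` and `cntk.parameter` call shapes used here are assumptions, and the call requires a MarkLogic server with cuDNN available.

```
// Hypothetical sketch: a 2-layer bidirectional LSTM stack.
const input = cntk.inputVariable(cntk.shape([128]), "float");  // assumed signature
const weights = cntk.parameter(cntk.shape([1]), "float");      // assumed signature

const rnn = cntk.optimizedRnnstack(
  input,      // $operand: input of the RNN stack
  weights,    // $weights: parameter tensor holding the learned weights
  512,        // $hidden-size: hidden units per layer, per direction
  2,          // $num-layers
  true,       // $bidirectional: output size is 2 * 512
  "lstm",     // $recurrent-op
  "rnnStack"  // $name
);
```

Because $bidirectional is true, the resulting function's output dimension is 1024 (twice the hidden size).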
