tf.contrib.legacy_seq2seq.sequence_loss_by_example
sequence_loss_by_example(
    logits,
    targets,
    weights,
    average_across_timesteps=True,
    softmax_loss_function=None,
    name=None
)
Defined in tensorflow/contrib/legacy_seq2seq/python/ops/seq2seq.py.
Weighted cross-entropy loss for a sequence of logits (per example).
Args:
- logits: List of 2D Tensors of shape [batch_size x num_decoder_symbols].
- targets: List of 1D batch-sized int32 Tensors of the same length as logits.
- weights: List of 1D batch-sized float Tensors of the same length as logits.
- average_across_timesteps: If set, divide the returned cost by the total label weight.
- softmax_loss_function: Function (labels, logits) -> loss-batch to be used instead of the standard softmax (the default if this is None). Note that to avoid confusion, the function is required to accept named arguments (see the usage sketch below).
- name: Optional name for this operation, default: "sequence_loss_by_example".
Returns:
1D batch-sized float Tensor: The log-perplexity for each sequence.
Raises:
- ValueError: If len(logits) is different from len(targets) or len(weights).
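The sketch below (TensorFlow 1.x) shows a call with a custom softmax_loss_function; the shapes, the target values, and the loss_fn helper are illustrative assumptions, not part of the API. With average_across_timesteps=True, each returned entry is the weighted cross-entropy summed over time steps and divided by that example's total label weight.

import tensorflow as tf

batch_size, num_steps, num_symbols = 4, 3, 10

# One logits Tensor per time step, each [batch_size, num_decoder_symbols].
logits = [tf.random_normal([batch_size, num_symbols]) for _ in range(num_steps)]
# Matching 1D int32 targets and float weights, one Tensor per time step.
targets = [tf.constant([1, 2, 3, 4], dtype=tf.int32) for _ in range(num_steps)]
weights = [tf.ones([batch_size]) for _ in range(num_steps)]

# Hypothetical custom loss: it must accept the named arguments `labels` and
# `logits`. This one just reproduces the default sparse softmax behavior.
def loss_fn(labels, logits):
    return tf.nn.sparse_softmax_cross_entropy_with_logits(
        labels=labels, logits=logits)

losses = tf.contrib.legacy_seq2seq.sequence_loss_by_example(
    logits, targets, weights,
    average_across_timesteps=True,
    softmax_loss_function=loss_fn)

with tf.Session() as sess:
    print(sess.run(losses))  # 1D float Tensor of shape [batch_size]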