Apache Singa
A General Distributed Deep Learning Library
singa::SoftmaxCrossEntropy Class Reference

Softmax + cross entropy for multi-category classification.

#include <loss.h>

Inheritance: singa::SoftmaxCrossEntropy inherits from singa::Loss.

Public Member Functions

Tensor Forward (int flag, const Tensor &prediction, const Tensor &target) override
 Compute the loss values for each sample/instance given the prediction and the target.
 
Tensor Backward () override
 Compute the gradients of the loss values w.r.t. the prediction.
 
- Public Member Functions inherited from singa::Loss
void Setup (const string &conf)
 
virtual void ToDevice (std::shared_ptr< Device > device)
 
virtual void Setup (const LossConf &conf)
 Set meta fields from user configurations.
 
float Evaluate (int flag, const Tensor &prediction, const Tensor &target)
 Average loss values for all samples in the mini-batch. It calls Forward() internally.
 

Detailed Description

Softmax + cross entropy for multi-category classification.
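
For orientation, a minimal usage sketch follows; it is not part of the generated documentation. The header paths, the singa::kTrain / singa::kEval phase flags, and the tensor-construction calls are assumptions of this sketch; only the Forward/Backward/Evaluate interface documented on this page is taken as given.

#include <vector>
#include "singa/core/tensor.h"   // assumed header path for singa::Tensor
#include "singa/model/loss.h"    // assumed header path for the loss classes (see #include <loss.h> above)

int main() {
  using singa::Shape;
  using singa::Tensor;

  const size_t batch = 2, num_classes = 3;

  // Raw, unnormalized scores from the model, one row per instance.
  std::vector<float> scores = {1.0f, 2.0f, 0.5f,
                               0.1f, 0.1f, 3.0f};
  Tensor prediction(Shape{batch, num_classes});
  prediction.CopyDataFromHostPtr(scores.data(), scores.size());

  // One integer label index per instance (the first target form accepted by Forward()).
  std::vector<int> labels = {1, 2};
  Tensor target(Shape{batch}, singa::kInt);
  target.CopyDataFromHostPtr(labels.data(), labels.size());

  singa::SoftmaxCrossEntropy loss;
  // singa::kTrain is an assumed constant; passing a training flag lets Forward()
  // cache the softmax probabilities that Backward() needs.
  Tensor per_sample = loss.Forward(singa::kTrain, prediction, target);  // -log(p[idx_truth]) per row
  Tensor grad = loss.Backward();   // gradient w.r.t. prediction: p[i] - t[i] / sum_j t[j]
  float avg = loss.Evaluate(singa::kEval, prediction, target);          // mean loss over the batch
  (void)per_sample; (void)grad; (void)avg;
  return 0;
}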

Member Function Documentation

◆ Backward()

Tensor singa::SoftmaxCrossEntropy::Backward ()   [override, virtual]

Compute the gradients of the loss values w.r.t. the prediction, which are:

p[i] - t[i] / sum_j t[j]

Implements singa::Loss.
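
As a brief aside (not part of the generated docs), the formula above follows from differentiating the normalized cross entropy through the softmax. Writing z for the raw prediction, p = Softmax(z), and t for the target weights:

\[
L = -\frac{\sum_i t_i \log p_i}{\sum_j t_j},
\qquad
p_i = \frac{e^{z_i}}{\sum_k e^{z_k}},
\]
\[
\frac{\partial L}{\partial z_i}
  = -\frac{1}{\sum_j t_j}\sum_k t_k\,(\delta_{ki} - p_i)
  = p_i - \frac{t_i}{\sum_j t_j},
\]

which reduces to p[i] - 1[i = idx_truth] when the target is a single label index.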

◆ Forward()

Tensor singa::SoftmaxCrossEntropy::Forward (int flag, const Tensor &prediction, const Tensor &target)   [override, virtual]

Compute the loss values for each sample/instance given the prediction and the target.

If the target consists of one integer per instance, i.e. the label index (denoted as idx_truth), the loss is -log(p[idx_truth]), where p[] is the probability for each category, computed from Softmax(prediction). If the target consists of one array per instance (e.g., for multiple labels), the loss is - sum_i (t[i] * log(p[i])) / sum_j t[j], where t[i] is the weight of the i-th label (e.g., 1: the instance has this label, 0: the instance does not have this label).

Users can call Average(const Tensor&) to get the average loss value over all samples in the batch.

Implements singa::Loss.
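
A concrete illustration (not from the original docs) for a single instance with raw prediction scores z = (1, 2, 0.5):

\[
p = \mathrm{Softmax}(z) \approx (0.231,\ 0.629,\ 0.140).
\]

With an integer target idx_truth = 1, the loss is -log(p[1]) ≈ 0.464. With a label-weight target t = (1, 0, 1), the loss is -(log p[0] + log p[2]) / (1 + 0 + 1) ≈ (1.464 + 1.964) / 2 ≈ 1.714.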


The documentation for this class was generated from the following file:
loss.h