Apache Singa
A General Distributed Deep Learning Library
singa::Loss Class Reference [abstract]

The base loss class, which declares the APIs for computing the objective score (loss) for a pair of prediction (from the model) and the target (i.e. the ground truth).

#include <loss.h>

Inheritance diagram for singa::Loss (known subclasses: singa::SoftmaxCrossEntropy and singa::MSE).

Public Member Functions

void Setup (const string &conf)
 
virtual void ToDevice (std::shared_ptr< Device > device)
 
virtual void Setup (const LossConf &conf)
 Set meta fields from user configurations.
 
virtual Tensor Forward (int flag, const Tensor &prediction, const Tensor &target)=0
 Compute the loss values for each sample/instance given the prediction and the target.
 
float Evaluate (int flag, const Tensor &prediction, const Tensor &target)
 Average loss values for all samples in the mini-batch. It calls Forward() internally.
 
virtual Tensor Backward ()=0
 Compute the gradients of the loss values w.r.t. the prediction.
 

Detailed Description

The base loss class, which declares the APIs for computing the objective score (loss) for a pair of prediction (from the model) and the target (i.e. the ground truth). It also computes the gradients of the objective w.r.t. the prediction. It has similar APIs to Layer.
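
For orientation, here is a minimal usage sketch (not taken from the Singa docs): it exercises the Forward/Backward/Evaluate contract through the concrete singa::MSE subclass documented further down. The header path, default construction of singa::MSE, and the caller-supplied flag value are assumptions.

    #include <loss.h>  // assumed to declare singa::Loss and its concrete subclasses

    // Sketch: loss computation for one mini-batch with a concrete Loss (singa::MSE).
    // `prediction` is the model output, `target` the ground truth; `flag` is the
    // caller-supplied phase flag (e.g. training vs. evaluation).
    void LossStep(int flag, const singa::Tensor &prediction, const singa::Tensor &target) {
      singa::MSE mse;  // every Loss subclass exposes the same Forward/Backward/Evaluate API

      // Training path: per-sample losses, then the gradient w.r.t. the prediction
      // to feed into the model's backward pass.
      singa::Tensor per_sample_loss = mse.Forward(flag, prediction, target);
      singa::Tensor grad = mse.Backward();

      // Monitoring path: averaged loss over the mini-batch (calls Forward() internally).
      float average_loss = mse.Evaluate(flag, prediction, target);

      (void)per_sample_loss; (void)grad; (void)average_loss;
    }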

Member Function Documentation

◆ Evaluate()

float singa::Loss::Evaluate (int flag, const Tensor &prediction, const Tensor &target)    [inline]

Average loss values for all samples in the mini-batch. It calls Forward() internally.

The calling pattern should be Evaluate() or Forward(), followed by Backward().
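
The relationship can be pictured as the mean of Forward()'s per-sample losses. The sketch below is a hypothetical illustration of that relationship, not Singa's own implementation; the singa::Sum reduction and Tensor::Size() call are assumptions.

    // Hypothetical equivalent of Evaluate(): reduce the per-sample losses from
    // Forward() to a single mean over the mini-batch.
    float AverageLoss(singa::Loss &loss, int flag,
                      const singa::Tensor &prediction, const singa::Tensor &target) {
      singa::Tensor per_sample = loss.Forward(flag, prediction, target);
      return singa::Sum<float>(per_sample) / static_cast<float>(per_sample.Size());
    }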

◆ Forward()

virtual Tensor singa::Loss::Forward (int flag, const Tensor &prediction, const Tensor &target) = 0    [pure virtual]

Compute the loss values for each sample/instance given the prediction and the target.

Implemented in singa::SoftmaxCrossEntropy and singa::MSE.
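
A new loss can be added by overriding the two pure-virtual members. The hypothetical L1 loss below shows the shape of such a subclass; it is not Singa code, and the element-wise helpers (operator-, Abs, Sign) are assumed to exist under these names. For a prediction with one value per sample, the element-wise absolute error is the per-sample loss.

    // Hypothetical L1 loss implementing the pure-virtual members of singa::Loss.
    // Forward() caches (prediction - target) so Backward() can return the gradient
    // w.r.t. the prediction, matching the Forward() -> Backward() calling pattern.
    class L1Loss : public singa::Loss {
     public:
      singa::Tensor Forward(int flag, const singa::Tensor &prediction,
                            const singa::Tensor &target) override {
        (void)flag;                   // phase flag (e.g. train/eval); unused in this sketch
        diff_ = prediction - target;  // cached for Backward()
        return singa::Abs(diff_);     // element-wise absolute error (assumed helper)
      }

      singa::Tensor Backward() override {
        return singa::Sign(diff_);    // gradient of |x| w.r.t. the prediction (assumed helper)
      }

     private:
      singa::Tensor diff_;            // prediction - target from the most recent Forward()
    };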


The documentation for this class was generated from the following file: loss.h