Apache SINGA
A distributed deep learning platform.
singa::Blob< Dtype > Class Template Reference

Public Member Functions

 Blob (const std::vector< int > &shape)
 
void Reshape (const std::vector< int > &shape)
 Change the dimensions of the blob, allocating new memory if necessary.
 
void ReshapeLike (const Blob &other)
 
void CopyFrom (const Blob< Dtype > &source)
 Copy from a source Blob.
 
void CopyFrom (const Blob< Dtype > &source, bool reshape)
 
void FromProto (const singa::BlobProto &proto)
 
void ToProto (singa::BlobProto *proto) const
 
void ShareData (const Blob &other)
 Set the data_ shared_ptr to point to the SyncedMemory holding the data_ of Blob other; useful in Layers which simply perform a copy in their Forward pass.
 
void Swap (Blob &other)
 
const std::vector< int > & shape () const
 
int count () const
 
const int version () const
 
void set_version (int v)
 
const Dtype * cpu_data () const
 
void set_cpu_data (Dtype *data)
 
const Dtype * gpu_data () const
 
Dtype * mutable_cpu_data ()
 
Dtype * mutable_gpu_data ()
 
Dtype asum_data () const
 Compute the sum of absolute values (L1 norm) of the data.
 
Dtype sum_data () const
 Compute the sum of the data values.
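
The following is a minimal usage sketch of the interface above. The include path, function name, and concrete shapes are assumptions for illustration; adjust them to your SINGA source tree.

    #include <vector>
    #include "singa/utils/blob.h"  // include path is an assumption; adjust to your tree

    void BlobBasics() {
      // Allocate a 2x3 blob of floats; count() == 2 * 3 == 6.
      singa::Blob<float> blob(std::vector<int>{2, 3});

      // Write through the mutable CPU pointer.
      float* data = blob.mutable_cpu_data();
      for (int i = 0; i < blob.count(); ++i)
        data[i] = -1.0f;

      // asum_data() is the L1 norm: sum of |x| over all elements == 6.
      float l1 = blob.asum_data();
      // sum_data() is the plain sum == -6.
      float total = blob.sum_data();
      (void)l1; (void)total;
    }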
 

Protected Attributes

std::shared_ptr< SyncedMemory > data_ = nullptr
 
std::vector< int > shape_
 
int count_ = 0
 
int capacity_ = 0
 
int version_ = -1
 

Member Function Documentation

template<typename Dtype>
void singa::Blob< Dtype >::CopyFrom ( const Blob< Dtype > &  source, bool  reshape)

Copy from a source Blob.

Parameters
    source	the Blob to copy from
    reshape	if false, require this Blob to be pre-shaped to the shape of source (and die otherwise); if true, Reshape this Blob to source's shape if necessary
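
A hedged sketch of both overloads; the function name, blob names, and shapes are illustrative only.

    #include <vector>
    #include "singa/utils/blob.h"  // include path is an assumption

    void CopyFromSketch() {
      singa::Blob<float> src(std::vector<int>{4, 5});
      singa::Blob<float> dst(std::vector<int>{4, 5});

      // Shapes already match, so the strict single-argument overload succeeds.
      dst.CopyFrom(src);

      // dst2 is shaped differently; reshape = true makes CopyFrom reshape
      // it to src's shape first instead of dying.
      singa::Blob<float> dst2(std::vector<int>{2, 10});
      dst2.CopyFrom(src, true);
    }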
template<typename Dtype>
void singa::Blob< Dtype >::Reshape ( const std::vector< int > &  shape)

Change the dimensions of the blob, allocating new memory if necessary.

This function can be called both to create an initial allocation of memory, and to adjust the dimensions of a top blob during Layer::Reshape or Layer::Forward. When changing the size of blob, memory will only be reallocated if sufficient memory does not already exist, and excess memory will never be freed.

Note that reshaping an input blob and immediately calling Net::Backward is an error; either Net::Forward or Net::Reshape need to be called to propagate the new input shape to higher layers.
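
A short sketch of the reallocation behavior described above; the function name and shapes are illustrative assumptions.

    #include <vector>
    #include "singa/utils/blob.h"  // include path is an assumption

    void ReshapeSketch() {
      singa::Blob<float> blob(std::vector<int>{8, 8});  // room for 64 values

      // Shrinking only updates the shape and count; the 64-value allocation
      // is kept, since excess memory is never freed.
      blob.Reshape(std::vector<int>{4, 4});    // count() == 16

      // Growing back within the existing capacity reallocates nothing.
      blob.Reshape(std::vector<int>{8, 8});    // count() == 64

      // Growing beyond the existing capacity triggers a fresh allocation.
      blob.Reshape(std::vector<int>{16, 16});  // count() == 256
    }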

template<typename Dtype>
void singa::Blob< Dtype >::ShareData ( const Blob< Dtype > &  other)

Set the data_ shared_ptr to point to the SyncedMemory holding the data_ of Blob other; useful in Layers which simply perform a copy in their Forward pass.

This deallocates the SyncedMemory holding this Blob's data_, as shared_ptr calls its destructor when reset with the "=" operator.
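
A hedged sketch of the copy-layer use case described above. IdentityForward and its blob parameters are hypothetical names for illustration, not part of the documented API.

    #include "singa/utils/blob.h"  // include path is an assumption

    // Forward pass of a layer that is a pure copy: alias the output blob
    // to the input's SyncedMemory instead of copying element by element.
    void IdentityForward(const singa::Blob<float>& in, singa::Blob<float>* out) {
      out->ReshapeLike(in);
      // The assignment inside ShareData drops out's old SyncedMemory
      // reference, deallocating it if this was the last shared_ptr to it.
      out->ShareData(in);
    }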

