Apache SINGA
A distributed deep learning platform.
Public Member Functions

  Blob(const vector<int>& shape)
  void Reshape(const vector<int>& shape)
      Change the dimensions of the blob, allocating new memory if necessary.
  void ReshapeLike(const Blob& other)
  const vector<int>& shape() const
  int count() const
  void set_version(int v)
  const int version() const
  void CopyFrom(const Blob<Dtype>& source, bool reshape = false)
      Copy from a source Blob.
  const shared_ptr<SyncedMemory>& data() const
  const Dtype* cpu_data() const
  void set_cpu_data(Dtype* data)
  const Dtype* gpu_data() const
  Dtype* mutable_cpu_data()
  Dtype* mutable_gpu_data()
  void ToProto(singa::BlobProto* proto) const
  Dtype asum_data() const
      Compute the sum of absolute values (L1 norm) of the data.
  Dtype sum_data() const
  void ShareData(const Blob& other)
      Set the data_ shared_ptr to point to the SyncedMemory holding the data_ of Blob other; useful in Layers which simply perform a copy in their Forward pass.
  void Swap(Blob& other)
Public Attributes

  shared_ptr<SyncedMemory> data_

Protected Attributes

  vector<int> shape_
  int count_
  int capacity_
  int version_
void Blob<Dtype>::Reshape(const vector<int>& shape)
Change the dimensions of the blob, allocating new memory if necessary.
This function can be called both to create an initial allocation of memory and to adjust the dimensions of a top blob during Layer::Reshape or Layer::Forward. When changing the size of the blob, memory is reallocated only if sufficient memory does not already exist, and excess memory is never freed.
Note that reshaping an input blob and immediately calling Net::Backward is an error; either Net::Forward or Net::Reshape needs to be called to propagate the new input shape to higher layers.
Set the data_ shared_ptr to point to the SyncedMemory holding the data_ of Blob other; useful in Layers which simply perform a copy in their Forward pass.
This deallocates the SyncedMemory previously held by this Blob's data_, since assigning a shared_ptr with the "=" operator releases its old reference, destroying the object once no other shared_ptr owns it.