|
class | Msg |
| Msg is used to transfer Param info (gradients or values), feature Blobs, etc. between workers, stubs and servers. More...
|
|
class | SocketInterface |
|
class | Poller |
|
class | Dealer |
|
class | Router |
|
class | Driver |
|
class | BridgeLayer |
|
class | BridgeDstLayer |
| For receiving data from a layer on another thread, which may reside on another node due to layer/data partition. More...
|
|
class | BridgeSrcLayer |
| For sending data to a layer on another thread, which may reside on another node due to layer/data partition. More...
|
|
class | ConcateLayer |
| Connect multiple (src) layers with a single (dst) layer. More...
|
|
class | SliceLayer |
| Connect a single (src) layer with multiple (dst) layers. More...
|
|
class | SplitLayer |
| Connect a single (src) layer with multiple dst layers. More...
|
|
class | DataLayer |
| Base layer for reading records from local Shard, HDFS, lmdb, etc. More...
|
|
class | ShardDataLayer |
| Layer for loading Record from DataShard. More...
|
|
class | ParserLayer |
| Base layer for parsing the input records into Blobs. More...
|
|
class | LabelLayer |
| Derived from ParserLayer to parse the label from a SingleLabelImageRecord. More...
|
|
class | MnistLayer |
| Derived from ParserLayer to parse the MNIST feature from a SingleLabelImageRecord. More...
|
|
class | RGBImageLayer |
| Derived from ParserLayer to parse the RGB image feature from a SingleLabelImageRecord. More...
|
|
class | PrefetchLayer |
| Layer for prefetching data records and parsing them. More...
|
|
class | Layer |
| Base layer class. More...
|
|
class | ConnectionLayer |
| Base layer for connecting layers when neural net is partitioned. More...
|
|
class | InputLayer |
| Base layer for getting input data. More...
|
|
class | NeuronLayer |
|
class | LossLayer |
| Base layer for calculating loss and other metrics, e.g., precision. More...
|
|
class | EuclideanLossLayer |
| Squared Euclidean loss as 0.5 ||predict - ground_truth||^2. More...
|
|
class | SoftmaxLossLayer |
| Cross-entropy loss applied to the probabilities after Softmax. More...
|
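For reference, the standard formulas behind these two loss briefs, together with their gradients (a minimal sketch of the usual definitions; any averaging or scaling applied by the layers is not shown):

    % Squared Euclidean loss for prediction p and ground truth t
    L_{euc} = \tfrac{1}{2}\,\lVert p - t \rVert^2, \qquad
    \frac{\partial L_{euc}}{\partial p_i} = p_i - t_i
    % Cross-entropy on softmax probabilities q = softmax(o), one-hot label t
    L_{ce} = -\sum_i t_i \log q_i, \qquad
    \frac{\partial L_{ce}}{\partial o_i} = q_i - t_i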
|
class | NeuralNet |
| The neural network is constructed from user configurations in NetProto. More...
|
|
class | ConvolutionLayer |
| Convolution layer. More...
|
|
class | CConvolutionLayer |
| Use im2col from Caffe. More...
|
|
class | DropoutLayer |
|
class | LRNLayer |
| Local Response Normalization layer. More...
|
|
class | PoolingLayer |
|
class | CPoolingLayer |
| Pooling layer that uses book-keeping for BP, following Caffe's pooling implementation. More...
|
|
class | ReLULayer |
|
class | InnerProductLayer |
|
class | STanhLayer |
| This layer applies a scaled tanh function to neuron activations. More...
|
|
class | SigmoidLayer |
| This layer applies the sigmoid function to neuron activations. More...
|
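The two activation briefs above leave the exact functions implicit. As a point of reference (the scaling constants shown are the commonly used LeCun choice, stated here as an assumption rather than taken from the layer's code), they have the form:

    % Scaled tanh; A = 1.7159, B = 2/3 is the common LeCun choice (assumed)
    f(x) = A \tanh(B x)
    % Logistic sigmoid
    \sigma(x) = \frac{1}{1 + e^{-x}}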
|
class | RBMLayer |
| Base layer for RBM models. More...
|
|
class | RBMVisLayer |
| RBM visible layer. More...
|
|
class | RBMHidLayer |
| RBM hidden layer. More...
|
|
class | Server |
|
class | Trainer |
| Every running process has a Trainer object, which launches one or more worker (and server) threads. More...
|
|
class | Worker |
| The Worker class which runs the training algorithm. More...
|
|
class | BPWorker |
|
class | CDWorker |
|
class | SyncedMemory |
| Manages memory allocation and synchronization between the host (CPU) and device (GPU). More...
|
|
class | Blob |
|
class | Cluster |
| Cluster is a singleton object, which provides cluster configurations, e.g., the topology of the cluster. More...
|
|
struct | RTCallback |
|
struct | JobInfo |
|
class | ZKService |
|
class | ClusterRuntime |
| ClusterRuntime is a runtime service that manages dynamic configuration and status of the whole cluster. More...
|
|
class | JobManager |
|
class | Metric |
| Performance metrics. More...
|
|
class | DataShard |
| Data shard stores training/validation/test tuples. More...
|
|
class | Node |
|
class | Graph |
| The NeuralNet is constructed by first creating a graph in which each node represents one layer. More...
|
|
class | ParamGenerator |
| Base parameter generator, which initializes parameter values. More...
|
|
class | GaussianGen |
|
class | GaussianSqrtFanInGen |
|
class | UniformGen |
|
class | UniformSqrtFanInGen |
|
class | UniformSqrtFanInOutGen |
|
class | Param |
| Base parameter class. More...
|
|
class | ParamEntry |
| ParamEntry is used for aggregating gradients of Params shared by workers from the same group. More...
|
|
class | LRGenerator |
| Base learning rate generator. More...
|
|
class | FixedStepLRGen |
|
class | StepLRGen |
|
class | LinearLRGen |
|
class | ExpLRGen |
|
class | InvLRGen |
|
class | InvTLRGen |
|
class | Updater |
| Updater for Param. More...
|
|
class | SGDUpdater |
|
class | AdaGradUpdater |
|
class | NesterovUpdater |
|
|
int | Addr (int grp, int id_or_proc, int type) |
| Wrapper to generate message address. More...
|
|
int | AddrGrp (int addr) |
| Parse group id from addr. More...
|
|
int | AddrID (int addr) |
| Parse worker/server id from addr. More...
|
|
int | AddrProc (int addr) |
| Parse worker/server procs from addr. More...
|
|
int | AddrType (int addr) |
| Parse msg type from addr. More...
|
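Addr packs a group id, a worker/server (or procs) id and a message type into a single int, and the AddrGrp/AddrID/AddrProc/AddrType helpers recover the individual fields. The sketch below shows one way such packing can work; the field widths and order are illustrative assumptions, not the layout actually used by Addr.

    #include <cassert>

    // Hypothetical layout: | grp (12 bits) | id_or_proc (12 bits) | type (8 bits) |
    inline int AddrSketch(int grp, int id_or_proc, int type) {
      assert(grp < (1 << 12) && id_or_proc < (1 << 12) && type < (1 << 8));
      return (grp << 20) | (id_or_proc << 8) | type;
    }
    inline int AddrGrpSketch(int addr)  { return (addr >> 20) & 0xFFF; }
    inline int AddrIDSketch(int addr)   { return (addr >> 8) & 0xFFF; }
    inline int AddrTypeSketch(int addr) { return addr & 0xFF; }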
|
void | DeleteMsg (Msg **msg) |
|
int | BlobTrgt (int grp, int layer) |
|
int | BlobGrp (int blob_trgt) |
|
int | BlobLayer (int blob_trgt) |
|
void | MallocHost (void **ptr, size_t size) |
|
void | FreeHost (void *ptr) |
|
std::string | GetZKJobWorkspace (int job_id) |
|
std::string | IntVecToString (const std::vector< int > &vec) |
|
std::string | VStringPrintf (std::string fmt, va_list l) |
|
std::string | StringPrintf (std::string fmt,...) |
|
int | ArgPos (int argc, char **arglist, const char *arg) |
| Locate the position of the arg in arglist. More...
|
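ArgPos scans the command-line argument list for a given flag and returns its position. A minimal sketch (the not-found return value of -1 is an assumed convention):

    #include <cstring>

    int ArgPosSketch(int argc, char** arglist, const char* arg) {
      for (int i = 0; i < argc; ++i)
        if (std::strcmp(arglist[i], arg) == 0)
          return i;   // position of the flag in arglist
      return -1;      // assumed convention for "not found"
    }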
|
void | CreateFolder (const std::string name) |
|
const std::vector< std::vector< int > > | Slice (int num, const std::vector< int > &sizes) |
| Slice a set of large Params into small pieces such that they can be roughly equally partitioned into a fixed number of boxes. More...
|
|
const std::vector< int > | PartitionSlices (int num, const std::vector< int > &slices) |
| Partition slices into boxes. More...
|
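Slice and PartitionSlices together spread a set of Params over a fixed number of boxes (e.g., server groups): Slice cuts each Param into pieces no larger than roughly total/num, and PartitionSlices assigns consecutive pieces to boxes. The greedy policy below is an illustrative assumption, not necessarily the exact algorithm used here.

    #include <numeric>
    #include <vector>

    // Cut each Param (given by its element count) into pieces of at most
    // ceil(total / num) elements so they can be spread evenly over num boxes.
    std::vector<std::vector<int>> SliceSketch(int num, const std::vector<int>& sizes) {
      int total = std::accumulate(sizes.begin(), sizes.end(), 0);
      int cap = (total + num - 1) / num;
      std::vector<std::vector<int>> slices;
      for (int size : sizes) {
        std::vector<int> pieces;
        while (size > cap) { pieces.push_back(cap); size -= cap; }
        if (size > 0) pieces.push_back(size);
        slices.push_back(pieces);
      }
      return slices;
    }

    // Assign consecutive slices to boxes, advancing to the next box once the
    // current one holds about total / num elements; returns one box id per slice.
    std::vector<int> PartitionSlicesSketch(int num, const std::vector<int>& slices) {
      int total = std::accumulate(slices.begin(), slices.end(), 0);
      int cap = (total + num - 1) / num;
      std::vector<int> box_of_slice;
      int box = 0, load = 0;
      for (int s : slices) {
        if (load >= cap && box + 1 < num) { ++box; load = 0; }
        box_of_slice.push_back(box);
        load += s;
      }
      return box_of_slice;
    }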
|
int | gcd (int a, int b) |
|
int | LeastCommonMultiple (int a, int b) |
|
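LeastCommonMultiple is tied to gcd by the identity lcm(a, b) = a / gcd(a, b) * b. A minimal sketch:

    // Euclid's algorithm and the lcm identity; dividing before multiplying
    // keeps the intermediate value small.
    int GcdSketch(int a, int b) { return b == 0 ? a : GcdSketch(b, a % b); }
    int LcmSketch(int a, int b) { return a / GcdSketch(a, b) * b; }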
std::string | GetHostIP () |
|
void | SetupLog (const std::string &workspace, const std::string &model) |
|
void | Im2col (const float *data_im, const int channels, const int height, const int width, const int kernel_h, const int kernel_w, const int pad_h, const int pad_w, const int stride_h, const int stride_w, float *data_col) |
|
void | Col2im (const float *data_col, const int channels, const int height, const int width, const int patch_h, const int patch_w, const int pad_h, const int pad_w, const int stride_h, const int stride_w, float *data_im) |
|
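Im2col and Col2im unroll image patches into columns (and fold them back) so that convolution in ConvolutionLayer/CConvolutionLayer reduces to a matrix multiplication; as the notice at the end of this page states, the code is adapted from Caffe. The sketch below shows the classic Caffe-style CPU loop for the forward direction under the signature listed above (illustrative; details may differ):

    // Expand (channels x height x width) into (channels*kernel_h*kernel_w) rows,
    // one column per output location.
    void Im2colSketch(const float* data_im, const int channels,
        const int height, const int width, const int kernel_h, const int kernel_w,
        const int pad_h, const int pad_w, const int stride_h, const int stride_w,
        float* data_col) {
      const int height_col = (height + 2 * pad_h - kernel_h) / stride_h + 1;
      const int width_col = (width + 2 * pad_w - kernel_w) / stride_w + 1;
      const int channels_col = channels * kernel_h * kernel_w;
      for (int c = 0; c < channels_col; ++c) {
        const int w_offset = c % kernel_w;
        const int h_offset = (c / kernel_w) % kernel_h;
        const int c_im = c / kernel_w / kernel_h;
        for (int h = 0; h < height_col; ++h) {
          for (int w = 0; w < width_col; ++w) {
            const int h_pad = h * stride_h - pad_h + h_offset;
            const int w_pad = w * stride_w - pad_w + w_offset;
            data_col[(c * height_col + h) * width_col + w] =
                (h_pad >= 0 && h_pad < height && w_pad >= 0 && w_pad < width)
                    ? data_im[(c_im * height + h_pad) * width + w_pad]
                    : 0.f;
          }
        }
      }
    }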
void | ForwardMaxPooling (const float *bottom, const int num, const int channels, const int height, const int width, const int kernel_h, const int kernel_w, const int pad_h, const int pad_w, const int stride_h, const int stride_w, float *top, float *mask) |
|
void | BackwardMaxPooling (const float *top, const float *mask, const int num, const int channels, const int height, const int width, const int kernel_h, const int kernel_w, const int pad_h, const int pad_w, const int stride_h, const int stride_w, float *bottom) |
|
void | ForwardAvgPooling (const float *bottom, const int num, const int channels, const int height, const int width, const int kernel_h, const int kernel_w, const int pad_h, const int pad_w, const int stride_h, const int stride_w, float *top) |
|
void | BackwardAvgPooling (const float *top, const int num, const int channels, const int height, const int width, const int kernel_h, const int kernel_w, const int pad_h, const int pad_w, const int stride_h, const int stride_w, float *bottom) |
|
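ForwardMaxPooling records, in mask, the index of the winning input element for every output element; BackwardMaxPooling (and hence CPoolingLayer's book-keeping) uses that mask to route gradients straight back to the winners. A hedged sketch of a Caffe-style CPU loop for the forward pass (the output-size formula is simplified):

    #include <algorithm>
    #include <cfloat>

    void ForwardMaxPoolingSketch(const float* bottom, const int num,
        const int channels, const int height, const int width,
        const int kernel_h, const int kernel_w, const int pad_h, const int pad_w,
        const int stride_h, const int stride_w, float* top, float* mask) {
      const int top_h = (height + 2 * pad_h - kernel_h) / stride_h + 1;
      const int top_w = (width + 2 * pad_w - kernel_w) / stride_w + 1;
      for (int n = 0; n < num; ++n) {
        for (int c = 0; c < channels; ++c) {
          const float* src = bottom + (n * channels + c) * height * width;
          float* dst = top + (n * channels + c) * top_h * top_w;
          float* idx = mask + (n * channels + c) * top_h * top_w;
          for (int ph = 0; ph < top_h; ++ph) {
            for (int pw = 0; pw < top_w; ++pw) {
              const int hstart = std::max(ph * stride_h - pad_h, 0);
              const int wstart = std::max(pw * stride_w - pad_w, 0);
              const int hend = std::min(ph * stride_h - pad_h + kernel_h, height);
              const int wend = std::min(pw * stride_w - pad_w + kernel_w, width);
              float best = -FLT_MAX;
              int best_i = -1;
              for (int h = hstart; h < hend; ++h)
                for (int w = wstart; w < wend; ++w)
                  if (src[h * width + w] > best) {
                    best = src[h * width + w];
                    best_i = h * width + w;   // book-kept for the backward pass
                  }
              dst[ph * top_w + pw] = best;
              idx[ph * top_w + pw] = static_cast<float>(best_i);
            }
          }
        }
      }
    }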
void | ReadProtoFromTextFile (const char *filename, Message *proto) |
|
void | WriteProtoToTextFile (const Message &proto, const char *filename) |
|
void | ReadProtoFromBinaryFile (const char *filename, Message *proto) |
|
void | WriteProtoToBinaryFile (const Message &proto, const char *filename) |
|
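These four helpers wrap Protocol Buffers serialization of job/model configurations. A minimal sketch of how the text-format reader is typically built on protobuf's TextFormat API (error handling omitted; not necessarily this project's exact code):

    #include <fcntl.h>
    #include <unistd.h>
    #include <google/protobuf/io/zero_copy_stream_impl.h>
    #include <google/protobuf/message.h>
    #include <google/protobuf/text_format.h>

    void ReadProtoFromTextFileSketch(const char* filename,
                                     google::protobuf::Message* proto) {
      int fd = open(filename, O_RDONLY);               // caller should check fd >= 0
      google::protobuf::io::FileInputStream input(fd);
      google::protobuf::TextFormat::Parse(&input, proto);
      close(fd);
    }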
int | ParamTrgt (int param_id, int slice_id) |
|
int | ParamID (int param_trgt) |
|
int | SliceID (int param_trgt) |
|
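ParamTrgt fuses a Param id and a slice id into one int, which ParamID and SliceID later decompose; BlobTrgt/BlobGrp/BlobLayer above follow the same pattern for group and layer ids. A sketch under an assumed 16/16-bit split:

    // Assumed layout: high 16 bits = param_id, low 16 bits = slice_id.
    inline int ParamTrgtSketch(int param_id, int slice_id) {
      return (param_id << 16) | (slice_id & 0xFFFF);
    }
    inline int ParamIDSketch(int param_trgt) { return param_trgt >> 16; }
    inline int SliceIDSketch(int param_trgt) { return param_trgt & 0xFFFF; }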
The code is adapted from that of Caffe, whose license is attached below.
COPYRIGHT

All contributions by the University of California:
Copyright (c) 2014, The Regents of the University of California (Regents)
All rights reserved.

All other contributions:
Copyright (c) 2014, the respective contributors
All rights reserved.

Caffe uses a shared copyright model: each contributor holds copyright over their contributions to Caffe. The project versioning records all such contribution and copyright details. If a contributor wants to further mark their specific copyright on a particular contribution, they should indicate their copyright solely in the commit message of the change when it is committed.

LICENSE

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

- Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
- Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

CONTRIBUTION AGREEMENT

By contributing to the BVLC/caffe repository through pull-request, comment, or otherwise, the contributor releases their content to the license and copyright terms herein.