opennlp.model
Interface Prior

All Known Implementing Classes:
UniformPrior

public interface Prior

This interface allows one to implement a prior distribution for use in maximum entropy model training.
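As an illustration, a minimal implementing class might look like the sketch below. The class name ExamplePrior is hypothetical, and the uniform behavior is only one possible choice; OpenNLP's own UniformPrior may differ in detail.

    import opennlp.model.Prior;

    public class ExamplePrior implements Prior {

      private int numOutcomes;

      public void setLabels(String[] outcomeLabels, String[] contextLabels) {
        // Called before any logPrior call; this sketch only needs the
        // number of outcomes to size a uniform distribution.
        numOutcomes = outcomeLabels.length;
      }

      public void logPrior(double[] dist, int[] context) {
        // Assign equal log-probability to every outcome, ignoring the context.
        double logUniform = Math.log(1.0 / numOutcomes);
        for (int i = 0; i < dist.length; i++) {
          dist[i] = logUniform;
        }
      }

      public void logPrior(double[] dist, int[] context, float[] values) {
        // This sketch ignores the feature values and delegates to the
        // two-argument form.
        logPrior(dist, context);
      }
    }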


Method Summary
 void logPrior(double[] dist, int[] context)
          Populates the specified array with the log of the distribution for the specified context.
 void logPrior(double[] dist, int[] context, float[] values)
          Populates the specified array with the log of the distribution for the specified context.
 void setLabels(String[] outcomeLabels, String[] contextLabels)
          Method to specify the labels for the outcomes and contexts.
 

Method Detail

logPrior

void logPrior(double[] dist,
              int[] context)
Populates the specified array with the log of the distribution for the specified context. The array is overwritten on every call to this method; callers that need to retain a previous result must copy it before calling again.

Parameters:
dist - An array to be populated with the log of the prior distribution.
context - The indices of the contextual predicates for an event.
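A caller-side sketch of this overload follows. The variable names and values are illustrative, and it assumes prior holds a Prior implementation on which setLabels has already been called.

    int numOutcomes = 2;                      // illustrative outcome count
    double[] dist = new double[numOutcomes];  // one slot per outcome
    int[] context = {0, 3, 7};                // active predicate indices (illustrative)
    prior.logPrior(dist, context);            // dist is overwritten in place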

logPrior

void logPrior(double[] dist,
              int[] context,
              float[] values)
Populates the specified array with the log of the distribution for the specified context. The array is overwritten on every call to this method; callers that need to retain a previous result must copy it before calling again.

Parameters:
dist - An array to be populated with the log of the prior distribution.
context - The indices of the contextual predicates for an event.
values - The values associated with the context.
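This variant additionally passes a real-valued weight for each active predicate. A hedged sketch, again assuming an initialized Prior instance named prior:

    int numOutcomes = 2;                      // illustrative outcome count
    double[] dist = new double[numOutcomes];  // one slot per outcome
    int[] context = {0, 3, 7};                // active predicate indices (illustrative)
    float[] values = {1.0f, 0.5f, 2.0f};      // weights aligned one-to-one with context
    prior.logPrior(dist, context, values);    // dist is overwritten in place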

setLabels

void setLabels(String[] outcomeLabels,
               String[] contextLabels)
Method to specify the labels for the outcomes and contexts. These are used to map integer outcomes and contexts to their string values. This method is called prior to any call to logPrior.

Parameters:
outcomeLabels - An array of each outcome label.
contextLabels - An array of each context label.
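Since setLabels precedes any logPrior call, a training setup would wire it up roughly as below. The label strings are illustrative; UniformPrior is the known implementing class listed above.

    Prior prior = new UniformPrior();
    prior.setLabels(new String[] {"positive", "negative"},
                    new String[] {"word=good", "word=bad"});
    // Only after setLabels may logPrior be called.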


Copyright © 2011 The Apache Software Foundation. All Rights Reserved.