Called on each token character to normalize it before it is added to the
token. The default implementation does nothing. Subclasses may use this
to, e.g., lowercase tokens.
Namespace: Lucene.Net.Analysis
Assembly: Lucene.Net (in Lucene.Net.dll) Version: 2.9.4.1
Syntax

C#
protected internal virtual char Normalize(char c)

Visual Basic
Protected Friend Overridable Function Normalize(c As Char) As Char

Visual C++
protected public:
virtual wchar_t Normalize(wchar_t c)
Parameters
- c
- Type: System.Char
The token character to normalize.
Return Value
The normalized character.
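Examples
A minimal sketch of how a subclass might override Normalize, assuming CharTokenizer exposes a constructor taking a System.IO.TextReader and an abstract IsTokenChar(Char) member with the same protected internal accessibility shown above for Normalize. The class name LowercaseLetterTokenizer is illustrative only; its behaviour mirrors Lucene's own LowerCaseTokenizer.

using System.IO;
using Lucene.Net.Analysis;

// Illustrative subclass (hypothetical name): splits tokens on non-letter
// characters and lowercases every token character via Normalize.
public class LowercaseLetterTokenizer : CharTokenizer
{
    public LowercaseLetterTokenizer(TextReader input)
        : base(input)
    {
    }

    // Letters belong to a token; any other character ends the current token.
    protected internal override bool IsTokenChar(char c)
    {
        return char.IsLetter(c);
    }

    // Called on each token character before it is added to the token.
    protected internal override char Normalize(char c)
    {
        return char.ToLower(c);
    }
}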