LUCENE-3894: try toning things down for this tokenizer (it builds lots of tokens from the input, which it treats as a path)
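
A minimal sketch of why this tokenizer is token-heavy (not the code referenced by LUCENE-3894; the class name PathTokenDemo is made up, and it assumes a recent Lucene analysis-common where a Tokenizer gets its input via setReader): PathHierarchyTokenizer emits one token per path prefix, so an input with N delimiters yields roughly N+1 tokens.

import java.io.StringReader;

import org.apache.lucene.analysis.Tokenizer;
import org.apache.lucene.analysis.path.PathHierarchyTokenizer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

public class PathTokenDemo {
  public static void main(String[] args) throws Exception {
    // Default delimiter is '/'; every prefix of the path becomes its own token.
    Tokenizer tokenizer = new PathHierarchyTokenizer();
    CharTermAttribute term = tokenizer.addAttribute(CharTermAttribute.class);
    tokenizer.setReader(new StringReader("/usr/local/lib/lucene"));
    tokenizer.reset();
    while (tokenizer.incrementToken()) {
      // Prints: /usr, /usr/local, /usr/local/lib, /usr/local/lib/lucene
      System.out.println(term.toString());
    }
    tokenizer.end();
    tokenizer.close();
  }
}

With long randomized inputs, this prefix-per-delimiter behavior multiplies the number of tokens per document, which is presumably why toning the input down helps here.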