Group :: Development/Perl
RPM: perl-String-Tokenizer
Current version: 0.06-alt1
Build date: 3 March 2016, 11:51
Size: 20.15 KB
Home page: http://www.cpan.org
License: Artistic
Summary: A simple string tokenizer
Description:
ACL:
A simple string tokenizer which takes a string and splits it on
whitespace. It also optionally takes a string of characters to use as
delimiters, and returns them with the token set as well. This allows
for splitting the string in many different ways.
This is a very basic tokenizer, so more complex needs should be
addressed either with a custom-written tokenizer or by post-processing
the output generated by this module. It will not fill everyone's needs,
but it spans the gap between a simple "split / /, $string" and the
other options, which involve much larger and more complex modules.
Also note that this is not a lexical analyzer. Many people confuse
tokenization with lexical analysis. A tokenizer merely splits its
input into specific chunks; a lexical analyzer classifies those
chunks. Sometimes these two steps are combined, but not here.
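The two modes described above (whitespace-only splitting, and splitting on an extra set of delimiter characters that are kept in the token stream) can be sketched roughly as follows; the method names follow the module's CPAN synopsis, but treat this as an illustrative sketch rather than a definitive reference:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use String::Tokenizer;

# Default behavior: split on whitespace only.
my $tokenizer = String::Tokenizer->new("hello world");
my @tokens = $tokenizer->getTokens();
# @tokens is ("hello", "world")

# With a second argument listing delimiter characters, the string is
# also split on each of those characters, and the delimiters themselves
# appear in the returned token set.
my $expr = String::Tokenizer->new("((5 + 5) * 10)", '+*()');
my @parts = $expr->getTokens();
# @parts contains the digits alongside the "(", ")", "+", "*" tokens
```

Compare this with a bare `split / /, $string`, which would leave `((5` and `10)` fused together; keeping the delimiters as tokens is what makes the module useful as a pre-processing step for a separate lexical-analysis pass.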
Current maintainer: Vitaly Lipatov
List of contributors
List of rpms provided by this srpm:
- perl-String-Tokenizer