Group :: Development/Perl
Package: perl-String-Tokenizer
Current version: 0.06-alt1
Build time: March 3, 2016, 11:51
Archive size: 20.15 Kb
Home page: http://www.cpan.org
License: Artistic
Summary: A simple string tokenizer
Description:
A simple string tokenizer that takes a string and splits it on whitespace. It
can also optionally take a string of characters to use as delimiters, and it
returns those delimiters in the token set as well. This allows the string to
be split in many different ways.
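For example, a minimal usage sketch, assuming the new()/getTokens() interface
documented in the module's CPAN synopsis (the input string is only an
illustration):

    use strict;
    use warnings;
    use String::Tokenizer;

    # Tokenize an expression; '(' and ')' are extra delimiters
    # that are returned in the token set alongside the other tokens.
    my $tokenizer = String::Tokenizer->new("((5 + 5) - 10)", '()');

    # Prints: (, (, 5, +, 5, ), -, 10, )
    print join(", ", $tokenizer->getTokens()), "\n";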
This is a very basic tokenizer, so more complex needs should be addressed
either with a custom-written tokenizer or by post-processing the output
generated by this module. Basically, this will not fill everyone's needs,
but it spans the gap between a simple "split / /, $string" and the other
options that involve much larger and more complex modules.
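To make that gap concrete, here is a small comparison sketch (the input
string and delimiter set are chosen purely for illustration): a bare
"split / /, $string" leaves non-whitespace delimiters glued to their
neighbors, while the tokenizer returns them as tokens of their own.

    use strict;
    use warnings;
    use String::Tokenizer;

    my $string = "(5+5)-10";

    # Splitting on whitespace alone yields a single chunk,
    # because nothing in the string is space-separated.
    # Prints: (5+5)-10
    print join(", ", split / /, $string), "\n";

    # The tokenizer also splits on the listed delimiter characters.
    # Prints: (, 5, +, 5, ), -, 10
    my $tokenizer = String::Tokenizer->new($string, '+-()');
    print join(", ", $tokenizer->getTokens()), "\n";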
Also note that this is not a lexical analyzer. Many people confuse
tokenization with lexical analysis. A tokenizer merely splits its input into
specific chunks; a lexical analyzer classifies those chunks. Sometimes these
two steps are combined, but not here.
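For instance, the classification step that a lexical analyzer would add can
be sketched as plain post-processing of the tokenizer's output (the tag
names below are hypothetical, invented for illustration):

    use strict;
    use warnings;
    use String::Tokenizer;

    my $tokenizer = String::Tokenizer->new("(5 + 5) - 10", '()');

    # A toy "lexer" pass: classify each chunk the tokenizer produced.
    for my $token ($tokenizer->getTokens()) {
        my $tag = $token =~ /^\d+$/      ? 'NUMBER'
                : $token =~ /^[-+*\/]$/  ? 'OPERATOR'
                : $token =~ /^[()]$/     ? 'PAREN'
                :                          'WORD';
        print "$tag: $token\n";
    }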
Current maintainer: Vitaly Lipatov
List of rpm packages provided by this srpm package:
- perl-String-Tokenizer