tokenize

Tokenization help for Python programs.

generate_tokens(readline) is a generator that breaks a stream of text into Python tokens. It accepts a readline-like method which is called repeatedly to get the next line of input (or “” for EOF). It generates 5-tuples with these members:

the token type (see token.py)
the token (a string)
the starting (row, column) indices of the token (a 2-tuple of ints)
the ending (row, column) indices of the token (a 2-tuple of ints)
the original line (string)

It is designed to match the working of the Python tokenizer exactly, except that it produces COMMENT tokens for comments and gives type OP for all operators.
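A short illustration of iterating over the generated 5-tuples; the import path is an assumption (adjust it if this module lives under a different package), and io.StringIO is used only to supply source text through a readline-style callable:

    import io
    from tokenize import generate_tokens  # adjust the import if this module lives elsewhere

    source = "x = 1 + 2  # a comment\n"
    for tok_type, tok_string, start, end, line in generate_tokens(io.StringIO(source).readline):
        # Each item unpacks into the 5 fields described above.
        print(tok_type, repr(tok_string), start, end)

Because comments are preserved, the loop above also reports a COMMENT token for "# a comment".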

Older entry points
tokenize_loop(readline, tokeneater)
tokenize(readline, tokeneater=printtoken)

are the same, except that instead of generating tokens, tokeneater is a callback function to which the 5 fields described above are passed as 5 arguments each time a new token is found (see the sketch below).
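A minimal sketch of a tokeneater callback, assuming the older callback-style entry points named above; note that this interface is not part of the modern standard-library tokenize module, so the call at the end is shown only as a hypothetical:

    import io
    import token

    def tokeneater(tok_type, tok_string, start, end, line):
        # Called once per token with the same 5 fields that generate_tokens() yields.
        print(f"{token.tok_name.get(tok_type, tok_type):<12} {tok_string!r} {start}-{end}")

    # Hypothetical usage, assuming this module's older entry point is importable:
    # tokenize(io.StringIO("x = 1\n").readline, tokeneater)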

Functions

ISEOF(x)
ISNONTERMINAL(x)
ISTERMINAL(x)
any(*choices)
generate_tokens(readline) The generate_tokens() generator requires one argument, readline, which must be a callable object that provides the same interface as the readline() method of built-in file objects.
group(*choices)
main()
maybe(*choices)
printtoken(type, token, srow_scol, ...)
tokenize(readline[, tokeneater]) The tokenize() function accepts two parameters: one representing the input stream (readline), and one providing an output mechanism (tokeneater).
tokenize_loop(readline, tokeneater)
untokenize(iterable) Transform tokens back into Python source code.
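A round-trip sketch for untokenize(), assuming the generate_tokens()/untokenize() pair behaves as in the standard library (adjust the import if this module lives elsewhere):

    import io
    from tokenize import generate_tokens, untokenize

    source = "x = (1 + 2) * 3\n"
    tokens = list(generate_tokens(io.StringIO(source).readline))
    rebuilt = untokenize(tokens)
    print(repr(rebuilt))  # for simple input like this, the rebuilt text matches the original

Passing the full 5-tuples (rather than just the type/string pairs) lets untokenize() use the position information to reproduce the original spacing.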

Classes

Untokenizer()
chain(*iterables) --> chain object