jinja2.lexer.TokenStream

class jinja2.lexer.TokenStream(generator, name, filename)

A token stream is an iterable that yields Tokens. The parser, however, does not iterate over it; instead it calls next() to advance one token at a time. The currently active token is stored as current.
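TokenStream belongs to Jinja2's internal lexer layer, so the snippet below is only a minimal sketch that may differ between versions; it assumes the usual route of obtaining a stream through the environment's lexer and uses an arbitrary template source for illustration.

from jinja2 import Environment

env = Environment()
stream = env.lexer.tokenize("Hello {{ name }}!")

# The active token is available as `current`; advancing the stream returns
# the token that was current and moves on to the following one.
print(stream.current)          # e.g. Token(lineno=1, type='data', value='Hello ')
first = next(stream)           # `first` is the old current token
print(stream.current.type)     # e.g. 'variable_begin'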

Methods

__init__(generator, name, filename)
close() Close the stream.
expect(expr) Expect a given token type and return it.
look() Look at the next token.
next() Go one token ahead and return the old current token.
next_if(expr) Perform the token test and return the token if it matched.
push(token) Push a token back to the stream.
skip([n]) Skip n tokens ahead.
skip_if(expr) Like next_if() but only returns True or False.
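As a sketch of the typical consumption pattern, the helper below parses a comma-separated list of names from a source such as "{{ a, b, c }}". expect() raises TemplateSyntaxError unless the current token matches the given test, while skip_if() consumes a token only when the test succeeds. The function name and template source are illustrative, not part of the Jinja2 API.

def parse_name_list(stream):
    # Fail loudly unless the expression really starts with '{{'.
    stream.expect("variable_begin")
    names = [stream.expect("name").value]      # at least one name is required
    while stream.skip_if("comma"):             # optional ', name' repetitions
        names.append(stream.expect("name").value)
    stream.expect("variable_end")              # consume '}}'
    return names

parse_name_list(env.lexer.tokenize("{{ a, b, c }}"))   # ['a', 'b', 'c']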

Attributes

eos Are we at the end of the stream?
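A short sketch of eos in use, draining whatever remains of a stream; it assumes the environment from the examples above and that the built-in next() works on the stream (older Jinja2 releases exposed a next() method on the stream instead).

stream = env.lexer.tokenize("{{ a + b }}")
while not stream.eos:
    token = next(stream)                # loop ends once the eof token is current
    print(token.lineno, token.type, token.value)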