jinja2.lexer.Lexer.tokenize

Lexer.tokenize(source, name=None, filename=None, state=None)[source]

Tokenizes the given source by calling tokeniter() and wraps the resulting token generator in a TokenStream.
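
A minimal usage sketch: the lexer is normally reached through an Environment's lexer attribute rather than constructed directly. Iterating the returned TokenStream yields Token objects with lineno, type, and value attributes until the end-of-file token is reached.

```python
from jinja2 import Environment

# Obtain the environment's configured lexer and tokenize a small template.
env = Environment()
stream = env.lexer.tokenize("Hello {{ name }}!")

# Each token exposes its line number, type, and value.
for token in stream:
    print(token.lineno, token.type, token.value)
```

For the source above, the stream contains data tokens for the literal text and variable_begin / name / variable_end tokens for the {{ name }} expression.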