.. sectionauthor:: Fred L. Drake, Jr. <fdrake@acm.org>


The :mod:`tokenize` module provides a lexical scanner for Python source code,
implemented in Python.  The scanner in this module returns comments as tokens as
well, making it useful for implementing "pretty-printers," including colorizers
for on-screen displays.

The primary entry point is a :term:`generator`:

.. function:: generate_tokens(readline)

   The :func:`generate_tokens` generator requires one argument, *readline*,
   which must be a callable object which provides the same interface as the
   :meth:`readline` method of built-in file objects (see section
   :ref:`bltin-file-objects`).  Each call to the function should return one line
   of input as a string.

   The generator produces 5-tuples with these members: the token type; the token
   string; a 2-tuple ``(srow, scol)`` of ints specifying the row and column
   where the token begins in the source; a 2-tuple ``(erow, ecol)`` of ints
   specifying the row and column where the token ends in the source; and the
   line on which the token was found.  The line passed (the last tuple item) is
   the *logical* line; continuation lines are included.

   .. versionadded:: 2.2

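As a short sketch of the 5-tuple layout, the loop below tokenizes a small source string and prints each token's type name, string, and start/end coordinates (this assumes a Python version whose :func:`generate_tokens` accepts a *readline* returning strings, as ``io.StringIO`` provides):

```python
import io
import token
import tokenize

# generate_tokens() only needs a readline callable; StringIO supplies one.
source = "x = 1 + 2\n"
readline = io.StringIO(source).readline

tokens = []
for tok_type, tok_string, (srow, scol), (erow, ecol), line in \
        tokenize.generate_tokens(readline):
    # token.tok_name maps the numeric token type to a readable name.
    tokens.append((token.tok_name[tok_type], tok_string, (srow, scol), (erow, ecol)))
    print(token.tok_name[tok_type], repr(tok_string), (srow, scol), "->", (erow, ecol))
```

The first tuple, for example, is the ``NAME`` token ``x`` starting at row 1, column 0 and ending at row 1, column 1.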
An older entry point is retained for backward compatibility:


.. function:: tokenize(readline[, tokeneater])
