
Python: Splitting source files into tokens


import tokenize

# Open the source file in binary mode: tokenize.tokenize() expects a
# readline callable that returns bytes.
with open("TokenizePy.py", "rb") as f:
    for tok in tokenize.tokenize(f.readline):
        (srow, scol), (erow, ecol) = tok.start, tok.end
        print("%d,%d-%d,%d:\t%s\t%s" % (
            srow, scol, erow, ecol,
            tokenize.tok_name[tok.type], repr(tok.string)))
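As a minimal variant, the same tokenizer can be fed from an in-memory string through `io.BytesIO`, which avoids needing a file on disk (the sample source line here is just an illustration):

```python
import io
import tokenize

# Tokenize a source string instead of a file; the bytes must be
# wrapped so that .readline returns bytes, as tokenize expects.
source = b"x = 1 + 2\n"
for tok in tokenize.tokenize(io.BytesIO(source).readline):
    print(tok.start, tok.end, tokenize.tok_name[tok.type], repr(tok.string))
```

The first token yielded is always an `ENCODING` token describing the detected source encoding, followed by the actual lexical tokens (`NAME`, `OP`, `NUMBER`, ...).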
