The shlex module provides a simple lexer (also known as a tokenizer) for languages based on the Unix shell syntax. Its use is demonstrated in Example 5-19.
Example 5-19. Using the shlex Module
File: shlex-example-1.py

import shlex

lexer = shlex.shlex(open("samples/sample.netrc", "r"))
lexer.wordchars = lexer.wordchars + "._"

while 1:
    token = lexer.get_token()
    if not token:
        break
    print repr(token)

'machine'
'secret.fbi'
'login'
'mulder'
'password'
'trustno1'
'machine'
'non.secret.fbi'
'login'
'scully'
'password'
'noway'
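The lexer reads from any object that provides a readline method, so the same approach works on in-memory strings. The following sketch is not one of the book's examples; the input string is made up (it simply repeats tokens from the sample output above), and StringIO is used to wrap it in a file-like object:

import shlex
import StringIO

# wrap a plain string in a file-like object (assumed input, not a book sample)
data = StringIO.StringIO("machine secret.fbi login mulder\n")

lexer = shlex.shlex(data)
# allow dots and underscores inside tokens, as in Example 5-19
lexer.wordchars = lexer.wordchars + "._"

while 1:
    token = lexer.get_token()
    if not token:
        break
    print repr(token)

'machine'
'secret.fbi'
'login'
'mulder'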