Tokenize a string: Difference between revisions
tokens = text.split(',')

print '.'.join(tokens)</lang>
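For comparison, here is a minimal self-contained sketch of the same split-and-join in Python 3 syntax (the snippets above use the Python 2 print statement; the sample string is assumed for illustration):

<lang python># Python 3 version of the split/join example above.
# The sample text is an assumption chosen for illustration.
text = "Hello,How,Are,You,Today"
tokens = text.split(',')   # split on commas into a list of words
print('.'.join(tokens))    # rejoin with periods: Hello.How.Are.You.Today</lang>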
If you want to print each word on its own line:
<lang python>for token in tokens:
    print token</lang>
or
<lang python>print "\n".join(tokens)</lang>
or the one liner