Tokenize a string: Difference between revisions
m (→{{header|REXX}}: added needed 1st statement for REXX programs. -- ~~~~)
print(unpack(record:match"hello,how,are,you,today"))</lang>
A different solution using Lua's string library (this one skips empty fields):
<lang lua>str = "Hello,How,Are,You,Today"
tokens = {}
for w in string.gmatch( str, "([^,]+)" ) do
    tokens[#tokens+1] = w
end
for i = 1, #tokens do
    print( tokens[i] )
end</lang>
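The same gmatch loop can be wrapped in a reusable helper. A minimal sketch, not from the task page itself: the name <code>tokenize</code> and the default delimiter are assumptions, and it only handles plain single-character delimiters (characters like <code>%</code> or <code>]</code> would need escaping inside the character class).

<lang lua>-- Hypothetical helper wrapping the loop above; skips empty fields,
-- like the comma example. Assumes sep is a plain single character.
function tokenize(str, sep)
    sep = sep or ","
    local tokens = {}
    -- "[^<sep>]+" matches maximal runs of non-delimiter characters,
    -- so consecutive delimiters yield no empty token
    for w in string.gmatch(str, "([^" .. sep .. "]+)") do
        tokens[#tokens + 1] = w
    end
    return tokens
end

local t = tokenize("Hello,How,Are,You,Today")
for i = 1, #t do
    print(t[i])
end</lang>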
For example, to split a string on the delimiter "|" while also allowing for empty values (note: this can probably be cleaned up):
<lang lua>str = "Hello|How|Are|You||Today"
tokens = {}
for w in string.gmatch( str, "([^|]*)|?" ) do
    tokens[#tokens+1] = w
end
table.remove(tokens) -- drop the trailing empty token: the "|?" is needed to capture the final field, but it also lets the pattern match empty once more at the very end of the string
for i = 1, #tokens do