stream_tokenizer


stream_tokenizer(delimiters, s = current_reader)
      

A stream of characters read from an input stream can be split into tokens using a tokenizer. stream_tokenizer returns a tokenizer on the input stream s.

If delimiters is either a character or a list of characters, the returned tokenizer splits the stream at that character or at each one of those characters. If delimiters is false, the tokenizer treats the input stream as Slogan source code and tokenizes it based on the language syntax (a sketch of this mode follows the examples below).

Examples:


let t = stream_tokenizer(\;, string_reader("abc;def;g:h;i"))
          
function print_tokens(t)
  when (not(is_eof_object(peek_token(t))))
  { showln(get_token(t));
    print_tokens(t) }

print_tokens(t)
//> abc
  def
  g:h
  i

t = stream_tokenizer([\;, \:], string_reader("abc;def;g:h;i"))
print_tokens(t)
//> abc
  def
  g
  h
  i
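
When delimiters is false, the tokenizer follows Slogan's own lexical rules rather than splitting at fixed characters. The sketch below reuses the print_tokens helper defined above; the exact printed representation of each token is determined by the Slogan lexer and is not shown here.

// A minimal sketch of source-code tokenization (delimiters = false).
// The tokenizer is expected to yield one token per lexical element of
// the input, e.g. let, x, =, 10, +, 20, though the printed form of
// each token depends on the Slogan lexer.
t = stream_tokenizer(false, string_reader("let x = 10 + 20"))
print_tokens(t)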
      

Also see:

statement
slogan

