College assignment: implement a scanner for the TINY language compiler.
Scanning is the process of converting a sequence of characters (such as a computer program or web page) into a sequence of tokens (strings with an identified "meaning"). A program that performs lexical analysis may be called a lexer, tokenizer, or scanner.

The scanner uses a finite state machine as its core algorithm to identify three types of tokens: numbers, identifiers, and special symbols.
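As a rough illustration of the idea, the sketch below walks a character stream through three states (start, in-number, in-identifier) and emits the three token types. The state names, symbol set, and token labels here are illustrative assumptions, not the ones used in the repository's `scanner.py` (which builds its machine on the `transitions` library); multi-character symbols such as TINY's `:=` are also emitted as two separate tokens for simplicity.

```python
SYMBOLS = "+-*/=<();:"  # assumed special-symbol set, simplified

def scan(text):
    """Return (lexeme, token_type) pairs for numbers, identifiers, symbols."""
    tokens = []
    state = "START"
    buf = ""
    for ch in text + " ":  # trailing space flushes the final token
        if state == "START":
            if ch.isdigit():
                state, buf = "IN_NUM", ch
            elif ch.isalpha():
                state, buf = "IN_ID", ch
            elif ch in SYMBOLS:
                tokens.append((ch, "SPECIAL_SYMBOL"))
        elif state == "IN_NUM":
            if ch.isdigit():
                buf += ch  # stay in IN_NUM, extend the number
            else:
                tokens.append((buf, "NUMBER"))
                state, buf = "START", ""
                if ch in SYMBOLS:
                    tokens.append((ch, "SPECIAL_SYMBOL"))
        elif state == "IN_ID":
            if ch.isalnum():
                buf += ch  # stay in IN_ID, extend the identifier
            else:
                tokens.append((buf, "IDENTIFIER"))
                state, buf = "START", ""
                if ch in SYMBOLS:
                    tokens.append((ch, "SPECIAL_SYMBOL"))
    return tokens

print(scan("x := 42 + y1"))
```

Each character either keeps the machine in its current state (extending the token buffer) or triggers a transition back to the start state, at which point the buffered lexeme is emitted.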
- Clone the repository with git, or download it as a zip.
- Install dependencies:

```
pip install transitions
```

- Run the scanner:

```
python scanner.py input.txt output.txt
```

where `input.txt` is the source file to scan and `output.txt` is the file the resulting tokens are written to.