This book on algorithms for compiler design covers the various aspects of designing a language translator in depth. It is intended to be basic reading material in compiler design.
The author wishes to thank all of his colleagues in the Department of Electronics and Computer Science Engineering at Visvesvaraya Regional College of Engineering, Nagpur, whose constant encouragement and timely help have resulted in the completion of this book.
A program is a specification of what data a computer is required to process, using what operations, and in what sequence.
A finite automaton consists of a finite number of states and a finite number of transitions, and these transitions are defined on certain specific symbols called input symbols.
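The idea can be sketched concretely. Below is a minimal, hypothetical DFA (not an example from the book): its states, input symbols, and transition table are illustrative, and it accepts binary strings containing an even number of 1s.

```python
# Transition function as a table: (state, input symbol) -> next state.
TRANSITIONS = {
    ("even", "0"): "even",
    ("even", "1"): "odd",
    ("odd", "0"): "odd",
    ("odd", "1"): "even",
}

START, ACCEPTING = "even", {"even"}

def accepts(string):
    """Run the DFA over the input string and report acceptance."""
    state = START
    for symbol in string:
        state = TRANSITIONS[(state, symbol)]  # one defined transition per symbol
    return state in ACCEPTING

print(accepts("1001"))  # two 1s -> True
print(accepts("1011"))  # three 1s -> False
```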
In the syntax-analysis phase, a compiler verifies whether or not the tokens generated by the lexical analyzer are grouped according to the syntactic rules of the language. If the tokens in a string are grouped according to the language's rules of syntax, then the string of tokens generated by the lexical analyzer is accepted as a valid construct of the language; otherwise, an error handler is called.
A syntax analyzer or parser is a program that performs syntax analysis. A parser obtains a string of tokens from the lexical analyzer and verifies whether or not the string is a valid construct of the source language; that is, whether or not it can be generated by the grammar for the source language.
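As a hedged sketch of this token-checking role, here is a tiny recursive-descent parser for a hypothetical grammar (expr -> term { "+" term }, term -> NUMBER); the grammar and the SyntaxError-based error handling are illustrative, not the book's.

```python
def parse(tokens):
    """Return True if the token string is valid for the toy grammar;
    raise SyntaxError otherwise (standing in for the error handler)."""
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def term():
        # term -> NUMBER
        nonlocal pos
        tok = peek()
        if tok is None or not tok.isdigit():
            raise SyntaxError(f"expected a number, got {tok!r}")
        pos += 1

    def expr():
        # expr -> term { "+" term }
        nonlocal pos
        term()
        while peek() == "+":
            pos += 1  # consume "+"
            term()

    expr()
    if peek() is not None:
        raise SyntaxError(f"trailing input: {peek()!r}")
    return True

print(parse(["1", "+", "2", "+", "3"]))  # True
```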
Bottom-up parsing can be defined as an attempt to reduce the input string w to the start symbol of a grammar by tracing out the right-most derivations of w in reverse. This is equivalent to constructing a parse tree for the input string w by starting with the leaves and proceeding toward the root; that is, attempting to construct the parse tree from the bottom up.
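The reduce-to-start-symbol idea can be sketched with a shift-reduce loop over a hypothetical grammar, E -> E "+" "n" | "n" (the grammar and symbol names are illustrative): symbols are shifted onto a stack, and any handle on top of the stack is reduced, tracing a rightmost derivation in reverse.

```python
def shift_reduce(symbols):
    """Return True if the input reduces to the start symbol E."""
    stack, rest = [], list(symbols)
    while True:
        if stack[-3:] == ["E", "+", "n"]:
            del stack[-3:]
            stack.append("E")          # reduce:  E + n  =>  E
        elif stack[-1:] == ["n"]:
            stack[-1] = "E"            # reduce:  n  =>  E
        elif rest:
            stack.append(rest.pop(0))  # shift the next input symbol
        else:
            break                      # no handle, no input: stop
    return stack == ["E"]              # success iff only the start symbol remains

print(shift_reduce(["n", "+", "n", "+", "n"]))  # True
print(shift_reduce(["n", "n"]))                 # False
```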
The specification of a construct's translation in a programming language involves specifying what the construct is, as well as specifying the translating rules for the construct. Whenever a compiler encounters that construct in a program, it will translate the construct according to the rules of translation. Here, the term "translation" is used in a much broader sense.
A symbol table is a data structure used by a compiler to keep track of scope/binding information about names. These names are used in the source program to identify the various program elements, like variables, constants, procedures, and the labels of statements.
One of the important tasks that a compiler must perform is to allocate the resources of the target machine to represent the data objects that are being manipulated by the source program.
One of the important tasks that a compiler must perform is the detection of and recovery from errors. Recovery from errors is important, because the compiler will be scanning and compiling the entire program, perhaps in the presence of errors; so as many errors as possible need to be detected.
The translation of a source program to an object program is basically a one-to-many mapping; that is, there are many object programs for the same source program that implement the same computations. Some of these object programs may be better than others when it comes to storage requirements and execution speed. Code optimization refers to techniques a compiler can employ in order to produce an improved object code for a given source program.
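One such improvement technique, sketched here as a hedged example, is constant folding over a toy three-address-style instruction list; the instruction format ("dst", op, left, right) is hypothetical, not the book's notation.

```python
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def fold_constants(instructions):
    """Evaluate operations on compile-time constants ahead of time."""
    known = {}    # variables whose values are known constants
    folded = []
    for dst, op, left, right in instructions:
        left = known.get(left, left)       # substitute known constants
        right = known.get(right, right)
        if isinstance(left, int) and isinstance(right, int):
            known[dst] = OPS[op](left, right)      # compute at compile time
            folded.append((dst, "=", known[dst], None))
        else:
            folded.append((dst, op, left, right))  # leave as a runtime op
    return folded

code = [("t1", "+", 2, 3), ("t2", "*", "t1", 4), ("t3", "+", "x", "t1")]
print(fold_constants(code))
# [('t1', '=', 5, None), ('t2', '=', 20, None), ('t3', '+', 'x', 5)]
```

Both versions compute the same values, but the folded one does less work at run time, illustrating why one object program can be better than another.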
Code generation is the last phase in the compilation process. Being a machine-dependent phase, it is not possible to generate good code without considering the details of the particular machine for which the compiler is expected to generate code. Even so, a carefully selected code-generation algorithm can produce code that is twice as fast as code generated by an ill-considered code-generation algorithm.
Lex (a Lexical Analyzer Generator) is one of the compiler-writing tools; it is used to generate a lexical analyzer, or scanner, from a description of the tokens of the programming language to be implemented. This description is given as regular expressions. Therefore, Lex is a scanner generator used to generate a scanner automatically from a description of tokens as regular expressions.
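A hedged sketch of what such a generator automates, using Python's re module rather than Lex itself: token classes are described as regular expressions, combined into one master pattern, and used to drive a scanner. The token set below is illustrative.

```python
import re

# Token descriptions as (name, regular expression) pairs -- the same
# kind of specification Lex takes as input.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("ID",     r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("SKIP",   r"\s+"),
]

# Combine the descriptions into one master pattern with named groups.
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def scan(text):
    """Yield (token name, lexeme) pairs, discarding whitespace."""
    for match in MASTER.finditer(text):
        if match.lastgroup != "SKIP":
            yield match.lastgroup, match.group()

print(list(scan("count = count + 1")))
# [('ID', 'count'), ('OP', '='), ('ID', 'count'), ('OP', '+'), ('NUMBER', '1')]
```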
The exercises that follow are designed to provide further examples of the concepts covered in this book. Their purpose is to put these concepts to work in practical contexts that will enable you, as a programmer, to use algorithms better and more efficiently when designing your compiler.
Dr O.G. Kakde