CSCI 4627
Spring 2002

Last modified: 5/1/02

Announcements

The final exam will be 8:00 to 10:00 on Tuesday, May 7 in Austin 304.

Answers to exam 2 are available.

Please turn in the final version of your compiler using the handin program as assignment 4. Turn in ALL of your source files, including header files. Do not turn in extraneous files. I should be able to compile the files that you turn in as a single program and get a working compiler.


Syllabus

See the syllabus for a description of the course.


Assignments

Please see the checklist for writing and turning in programs.

Manuals

General homework

  1. Assignment 1
  2. Assignment 2

Compiler project

  1. C- lexer assignment
  2. C- parser assignment
  3. C- translator assignment
  4. C- compiler assignment

Practice exams

  1. Practice questions for exam 1
  2. Solutions to practice questions for exam 1
  3. Practice questions for exam 2
  4. Partial solutions to practice questions for exam 2


Office hours and my home page

Office hours are MW 2:00-3:15pm and TTh 11:00am-12:15pm, or by appointment. My home page is www.cs.ecu.edu/~karl/.


Systems

Using Solaris

A short tutorial on Solaris is available.

Secure shell

You can get to the lab computers using ssh and sftp. Machines that allow ssh access are alfalfa.csci.ecu.edu, buckwheat.csci.ecu.edu, darla.csci.ecu.edu, porky.csci.ecu.edu, spanky.csci.ecu.edu and stymie.csci.ecu.edu.

An ssh client for Windows is available from ftp://ftp.ssh.com/pub/ssh. Get SSHWinClient-3.1.0-build203.exe; it is a self-extracting executable, so just run it and it will install itself.


Lecture summaries

  1. [1/8/02] We went over compilers, including T diagrams and the anatomy of a compiler. This material is in Chapter 1 of the text.

  2. [1/10/02] We started looking at lexical analyzers (lexers). A lexer breaks the input into lexemes and translates the lexemes into tokens. We looked at designing a lexer by creating a finite state machine and then writing a program based directly on that machine. This material is in Chapter 2 of the text.
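
     As an illustration of the idea (a small sketch of my own, not the C- lexer for the course), here is a hand-coded lexer whose code follows a finite state machine directly: one piece of code per state (skipping whitespace, inside an identifier, inside a number). The token names and the function next_token are made up for this example.

         #include <stdio.h>
         #include <ctype.h>

         enum token { TOK_ID, TOK_NUM, TOK_OTHER, TOK_EOF };

         /* Read one lexeme from stdin into buf and return its token class. */
         enum token next_token(char *buf)
         {
             int c, i = 0;

             /* state 0: skip whitespace */
             do {
                 c = getchar();
             } while (c == ' ' || c == '\t' || c == '\n');

             if (c == EOF)
                 return TOK_EOF;

             if (isalpha(c)) {                 /* state 1: in an identifier */
                 while (isalnum(c)) {
                     buf[i++] = c;
                     c = getchar();
                 }
                 ungetc(c, stdin);             /* push back the lookahead */
                 buf[i] = '\0';
                 return TOK_ID;
             }

             if (isdigit(c)) {                 /* state 2: in a number */
                 while (isdigit(c)) {
                     buf[i++] = c;
                     c = getchar();
                 }
                 ungetc(c, stdin);
                 buf[i] = '\0';
                 return TOK_NUM;
             }

             buf[0] = c;                       /* any other single character */
             buf[1] = '\0';
             return TOK_OTHER;
         }

         int main(void)
         {
             char buf[256];
             while (next_token(buf) != TOK_EOF)
                 printf("lexeme: %s\n", buf);
             return 0;
         }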

  3. [1/15/02] We looked at how to produce a lexer automatically using flex. We started looking at the parsing problem. Material on context-free grammars and parse trees is in Chapter 3 of the text.
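
     For a taste of what a flex specification looks like (a sketch of my own, in the same spirit as the hand-coded lexer above; it is not the C- lexer assignment), the file below recognizes numbers and identifiers. Running flex on it produces a C file containing the function yylex().

         %{
         #include <stdio.h>
         %}

         %option noyywrap

         %%
         [ \t\n]+                 ;   /* skip whitespace */
         [0-9]+                   { printf("NUM(%s)\n", yytext); }
         [a-zA-Z_][a-zA-Z0-9_]*   { printf("ID(%s)\n", yytext); }
         .                        { printf("OTHER(%s)\n", yytext); }
         %%

         int main(void)
         {
             yylex();
             return 0;
         }

     Build it with something like flex lex.l followed by cc lex.yy.c.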

  4. [1/17/02] We went over context-free grammars, derivations, parse trees and ambiguity. Next we will be looking at top-down parsers. See Chapter 4 of the text.

  5. [1/22/02] The day followed a Friday class schedule, so we did not meet.

  6. [1/24/02] The lexer assignment was given. It will be available from this page shortly. We looked at top-down parsing, including the recursive descent parser on page 148 of the text, and started looking at table-driven LL(1) parsing.
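
     To give the flavor of recursive descent (my own small example, not the parser on page 148), here is a complete parser for the grammar E -> T E', E' -> + T E' | (empty), T -> ( E ) | i, with one function per nonterminal. Each function consumes exactly the part of the input derived from its nonterminal.

         #include <stdio.h>
         #include <stdlib.h>

         static const char *input;     /* remaining input, e.g. "(i+i)+i" */

         static void error(void)
         {
             printf("syntax error\n");
             exit(1);
         }

         static void match(char c)     /* consume one expected character */
         {
             if (*input == c)
                 input++;
             else
                 error();
         }

         static void E(void);

         static void T(void)           /* T -> ( E ) | i */
         {
             if (*input == '(') { match('('); E(); match(')'); }
             else if (*input == 'i') match('i');
             else error();
         }

         static void Eprime(void)      /* E' -> + T E' | (empty) */
         {
             if (*input == '+') { match('+'); T(); Eprime(); }
             /* otherwise derive the empty string */
         }

         static void E(void)           /* E -> T E' */
         {
             T();
             Eprime();
         }

         int main(void)
         {
             input = "(i+i)+i";
             E();
             printf(*input == '\0' ? "accepted\n" : "syntax error\n");
             return 0;
         }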

  7. [1/29/02] We went into more detail on LL(1) parsing. We looked at simple examples, including examples with conflicts. Some conflicts can be removed by left-factoring a grammar, and we did examples of left-factoring.
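
     A standard left-factoring example (not necessarily the one done in class): the two productions

         stmt -> if expr then stmt else stmt
         stmt -> if expr then stmt

     share the prefix "if expr then stmt", so a predictive parser cannot choose between them with one token of lookahead. Factoring out the common prefix gives

         stmt -> if expr then stmt rest
         rest -> else stmt
         rest -> (empty)

     which removes the conflict between the two original stmt productions.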

  8. [1/31/02] We looked at left-recursion elimination and computation of the First set for a grammar. We defined the Follow set, but did not see how it is computed. Read about this in Chapter 4 of the text. Left-recursion elimination is discussed in Section 4.2.3, and computation of the First and Follow sets is in Section 4.3.
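
     For example (a standard textbook case, not necessarily the one from class), the left-recursive productions

         E -> E + T | T

     are replaced by

         E  -> T E'
         E' -> + T E' | (empty)

     and for the full expression grammar with T -> F T', T' -> * F T' | (empty) and F -> ( E ) | id, the First sets come out as First(F) = First(T) = First(E) = { (, id }, First(E') = { +, empty } and First(T') = { *, empty }.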

  9. [2/5/02] We saw how to get a regular expression for /*...*/ style comments. We saw how to compute the Follow set of a nonterminal in a grammar, and how to build the LL(1) parsing table from the First and Follow sets. Read about top-down parsing in the text.
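
     One reasonable flex-style pattern for such comments (the class may have written it in a different but equivalent form) is

         "/*"([^*]|\*+[^*/])*\*+"/"

     that is: "/*", then any mixture of non-star characters and runs of stars followed by something other than a star or a slash, then a final run of stars ending with "/". For the expression grammar E -> T E', E' -> + T E' | (empty), T -> F T', T' -> * F T' | (empty), F -> ( E ) | id, the Follow sets come out as Follow(E) = Follow(E') = { ), $ }, Follow(T) = Follow(T') = { +, ), $ } and Follow(F) = { *, +, ), $ }. Table entry M[A, a] then holds the production A -> alpha whenever a is in First(alpha), or whenever alpha can derive the empty string and a is in Follow(A).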

  10. [2/7/02] Bottom-up parsing represents a major breakthrough in the design of parsers. We looked at shift-reduce parsers and using the SLR(1) method of building a parsing table to control a shift-reduce parser.
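
     As a small illustration (my own example, not necessarily the one from class), here is how a shift-reduce parser handles the input id + id for the grammar E -> E + T | T, T -> id; the SLR(1) table for this grammar dictates exactly these moves.

         Stack        Input        Action
         $            id + id $    shift
         $ id         + id $       reduce by T -> id
         $ T          + id $       reduce by E -> T
         $ E          + id $       shift
         $ E +        id $         shift
         $ E + id     $            reduce by T -> id
         $ E + T      $            reduce by E -> E + T
         $ E          $            accept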

  11. [2/12/02] We did more examples of SLR(1) parsers, including a left-recursive grammar and an ambiguous grammar that led to conflicts in the parsing table. Read about SLR(1) parsers in the text.

  12. [2/14/02] We did another example of an SLR(1) parser and briefly discussed other kinds of LR parsers, including LR(0), LALR(1) and canonical LR(1) parsers. Then we began discussing semantic actions associated with grammar productions, and saw how semantic actions could be used to define a simple calculator.

  13. [2/19/02] We did an example of syntax-directed translation for the calculator program, first using Bison and then using a recursive descent parser. Chapter 6 of the text covers this in a fairly rigorous way. I am avoiding the more esoteric things and concentrating on simple, practical aspects of syntax-directed translation.
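
     In outline, the Bison version attaches a semantic action to each production. The sketch below is my own reconstruction of the idea, not necessarily the grammar used in class; it reads one expression built from +, * and parentheses, terminated by a newline, and prints its value.

         %{
         #include <stdio.h>
         #include <ctype.h>
         int yylex(void);
         void yyerror(const char *s) { fprintf(stderr, "%s\n", s); }
         %}

         %token NUM

         %%
         line   : expr '\n'          { printf("= %d\n", $1); }
                ;
         expr   : expr '+' term      { $$ = $1 + $3; }
                | term               { $$ = $1; }
                ;
         term   : term '*' factor    { $$ = $1 * $3; }
                | factor             { $$ = $1; }
                ;
         factor : '(' expr ')'       { $$ = $2; }
                | NUM                { $$ = $1; }
                ;
         %%

         /* A throwaway lexer: digits form a NUM token, anything else is itself. */
         int yylex(void)
         {
             int c = getchar();
             while (c == ' ' || c == '\t')
                 c = getchar();
             if (isdigit(c)) {
                 yylval = 0;
                 while (isdigit(c)) {
                     yylval = yylval * 10 + (c - '0');
                     c = getchar();
                 }
                 ungetc(c, stdin);
                 return NUM;
             }
             return c == EOF ? 0 : c;
         }

         int main(void)
         {
             return yyparse();
         }

     If the file is called calc.y, something like bison calc.y followed by cc calc.tab.c builds it; typing 2+3*4 and a newline prints = 14.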

  14. [2/21/02] We saw another example of syntax-directed translation, converting an expression from infix to postfix notation. We did the translation twice: once using an imperative approach, where each nonterminal is responsible for writing out the translation of the subtree rooted at it, and once using attributes, where the attribute of each nonterminal describes key aspects of the subtree rooted at it. (In our example, the attribute was exactly the translation of the subtree, as a string.) We defined synthesized and inherited attributes. Read Chapter 6 on this material.
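
     The standard example, in both styles (the operators may differ from the ones used in class): in the imperative version the actions print the translation as the parse proceeds,

         expr   -> expr + term   { print("+") }
                 | term
         term   -> term * factor { print("*") }
                 | factor
         factor -> ( expr )
                 | id            { print(id.lexeme) }

     so parsing 9+5*2 prints 9 5 2 * +. In the attribute version each nonterminal instead carries a synthesized string attribute t built by rules such as expr.t = expr1.t || term.t || "+", and the translation of the whole input is the t attribute at the root of the parse tree.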

  15. [2/26/02] Midterm exam 1.

  16. [2/28/02] We went over the translator assignment and methods of syntax-directed translation.

  17. [3/5/02] We looked at how to generate code for expressions in the C- translator. In the style that we are using, each expression is responsible for generating code that computes itself. Each expression also needs some inherited attributes (passed down the parse tree) and some synthesized attributes (passed up the parse tree). I suggested using a type and information about the expression result's location as synthesized attributes; and jump-to and val-reqd as inherited attributes. Jump-to is a label that the expression should jump to if its value is false, or is NOJUMP if the expression should not jump. Val-reqd is true if the value of this expression will be used, and is false if the expression's value will be ignored.
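
     To make the attribute flow concrete, here is a rough sketch of the idea in C. The names (struct expr, gen, jumpto, valreqd) and the printed three-address code are all made up for this illustration; this is not a required interface for the C- translator.

         #include <stdio.h>
         #include <stdlib.h>
         #include <string.h>

         /* A made-up expression tree: leaves are variable names, internal
            nodes are the operators '+' or '<'. */
         struct expr {
             char op;                   /* '\0' for a leaf */
             char name[16];             /* variable name, for leaves */
             struct expr *left, *right;
         };

         #define NOJUMP NULL            /* inherited attribute: no jump wanted */

         static int tempno = 0;

         /* gen(e, jumpto, valreqd) prints intermediate code for e.
            jumpto  (inherited):  label to jump to if e is false, or NOJUMP.
            valreqd (inherited):  nonzero if the value of e will be used.
            The synthesized result is the location that holds e's value. */
         static char *gen(struct expr *e, const char *jumpto, int valreqd)
         {
             char *loc = malloc(16);

             if (e->op == '\0') {                      /* leaf: its own name */
                 strcpy(loc, e->name);
             } else if (e->op == '+') {
                 char *l = gen(e->left, NOJUMP, 1);
                 char *r = gen(e->right, NOJUMP, 1);
                 sprintf(loc, "t%d", tempno++);
                 if (valreqd)
                     printf("  %s = %s + %s\n", loc, l, r);
             } else {                                  /* e->op == '<' */
                 char *l = gen(e->left, NOJUMP, 1);
                 char *r = gen(e->right, NOJUMP, 1);
                 sprintf(loc, "t%d", tempno++);
                 if (jumpto != NOJUMP)                 /* context wants a jump */
                     printf("  if %s >= %s goto %s\n", l, r, jumpto);
                 else
                     printf("  %s = %s < %s\n", loc, l, r);
             }
             return loc;
         }

         int main(void)      /* code for the test of "if (a + b < c) ..." */
         {
             struct expr a = { '\0', "a" }, b = { '\0', "b" }, c = { '\0', "c" };
             struct expr sum = { '+', "", &a, &b };
             struct expr cmp = { '<', "", &sum, &c };
             gen(&cmp, "L1", 0);
             return 0;
         }

     Run on this tree, the sketch prints t0 = a + b followed by if t0 >= c goto L1; the comparison generated the jump itself because its inherited jumpto attribute told it the surrounding statement wants one.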

  18. [3/7/02] We discussed more on syntax-directed translation of expressions, and some issues on translation of statements.

  19. [3/12/02] Spring break.

  20. [3/14/02] Spring break.

  21. [3/19/02] We discussed generation of conditional statements and conditional expressions by telling the expression generator its context, so that it can generate the jump instruction itself.

  22. [3/26/02] I answered some questions about code generation. We discussed type checking and type inference. Then we began looking at code optimization.

  23. [3/28/02] We looked at examples of code optimization, and discussed kinds of optimizations that compilers typically perform.

  24. [4/2/02] We went over the exercises from Chapter 6. We looked more at program optimization, including tail recursion elimination.
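
     As a standard illustration of tail recursion elimination (my own example), the recursive call in the first function below is the last thing the function does, so the compiler can replace the call with updates of the parameters and a jump back to the top of the function, which is the loop in the second function.

         #include <stdio.h>

         int sum(int n, int acc)            /* tail recursive */
         {
             if (n == 0)
                 return acc;
             return sum(n - 1, acc + n);
         }

         int sum_opt(int n, int acc)        /* after the optimization */
         {
             for (;;) {
                 if (n == 0)
                     return acc;
                 acc = acc + n;             /* arguments of the eliminated call */
                 n = n - 1;
             }
         }

         int main(void)
         {
             printf("%d %d\n", sum(10, 0), sum_opt(10, 0));   /* prints 55 55 */
             return 0;
         }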

  25. [4/4/02] We looked briefly at peephole optimization. We looked at breaking a program into basic blocks and processing the basic blocks. Live variable analysis determines which variables have values that might be used later, and helps in determining what needs to be stored. We saw how to do a live variable analysis by solving a system of monotonic equations.
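
     The equations take the standard form: for each basic block B,

         in(B)  = use(B) U (out(B) - def(B))
         out(B) = union of in(S) over all successors S of B

     where use(B) is the set of variables read in B before any assignment to them in B, and def(B) is the set of variables assigned in B. Starting every in and out at the empty set and reapplying the equations until nothing changes gives the least solution; since each application can only add variables, the iteration is guaranteed to terminate.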

  26. [4/9/02] We looked at algorithms for register allocation within a basic block.