# CSCI 4602, Fall 2002

## Syllabus

This is a course on the theory of computing. See the syllabus for details.

## Office hours

MW 11:00-12:30, TTh 11:00-12:00, or by appointment.

If you have obtained a password then you can check your grades as I have them recorded. This will be the only way that you can learn your grade at the end of the term before it has been posted on the student desktop.

You can obtain a password from me. To obtain one by email, send me a message giving your name, the course, and the password that you want: a string of 4 or 5 letters/digits. Be careful about upper and lower case letters; your password will be exactly the string that you send to me.

## Lecture summaries

1. [8/22/02] We covered basic terminology on strings, sets and functions from chapter 0 of the text. We began looking at proofs. You should read chapter 0.

2. [8/27/02] We did examples of proofs, and are now ready to start on finite state computation.

3. [8/29/02] We introduced finite state machines by examples, and defined their syntax and semantics. The material is in chapter 1 of the text.

4. [9/5/02] We looked more at finite state machines. Then we introduced regular expressions, and nondeterministic finite state machines. This is also in chapter 1 of the text. Our goal now will be to show that regular expressions and finite state machines have the same descriptive power.

5. [9/10/02] We did some conversions between different ways of describing languages. We did the subset construction, used to convert a nondeterministic finite state machine into an equivalent deterministic finite state machine. We introduced nondeterministic machines with epsilon-transitions, and showed how to get rid of the epsilon-transitions without changing any of the answers that the machine gives. Then we showed how to convert an arbitrary regular expression into a nondeterministic finite state machine with epsilon-transitions.
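
The subset construction can be sketched in Python (an illustrative encoding, not the text's notation: the NFA transition function is a dict mapping (state, symbol) to a set of states).

```python
def subset_construction(nfa_delta, nfa_start, nfa_accept, alphabet):
    """Convert an NFA into an equivalent DFA whose states are sets
    (frozensets) of NFA states, as in the subset construction."""
    start = frozenset([nfa_start])
    dfa_delta, dfa_accept = {}, set()
    todo, seen = [start], {start}
    while todo:
        S = todo.pop()
        if S & nfa_accept:               # a set state accepts iff it
            dfa_accept.add(S)            # contains an NFA accept state
        for c in alphabet:
            T = frozenset(r for q in S
                          for r in nfa_delta.get((q, c), set()))
            dfa_delta[(S, c)] = T
            if T not in seen:
                seen.add(T)
                todo.append(T)
    return start, dfa_delta, dfa_accept

def run_dfa(start, delta, accept, s):
    """Run the resulting deterministic machine on a string."""
    state = start
    for c in s:
        state = delta[(state, c)]
    return state in accept

# Example NFA accepting strings over {a, b} that end in "ab".
nfa = {(0, "a"): {0, 1}, (0, "b"): {0}, (1, "b"): {2}}
d0, delta, acc = subset_construction(nfa, 0, {2}, "ab")
```

On this example the construction reaches only three set states, {0}, {0,1}, and {0,2}, even though 2^3 = 8 are possible in the worst case.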

6. [9/12/02] We defined generalized finite state machines, and showed how to convert a generalized finite state machine into an equivalent regular expression. This completes the proof that regular expressions can describe the same languages as deterministic finite state machines. We then did a sketch of a proof that the language {a^n b^n : n ≥ 0} is not regular.

7. [9/17/02] We discussed the homework, and went through some examples of proving that a language is not regular. The text does such proofs using a result called the Pumping Lemma. The method shown in class is just as rigorous, but easier to understand and to get right.
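
The method can be made concrete in code (an illustrative sketch, not the course's own; the DFA encoding is an assumed representation). Given any purported DFA for {a^n b^n : n ≥ 0} with n states, the pigeonhole principle gives i < j such that a^j and a^i reach the same state, so the machine must give the same answer on a^j b^i and a^i b^i, even though exactly one of them is in the language.

```python
def run(dfa, s):
    """Return the state the DFA reaches on input s."""
    state = dfa["start"]
    for c in s:
        state = dfa["delta"][(state, c)]
    return state

def refute(dfa):
    """Return two strings the DFA must treat identically although
    exactly one has the form a^n b^n."""
    n = len(dfa["states"])
    seen = {}
    for i in range(n + 1):          # n+1 prefixes a^0..a^n, only n
        st = run(dfa, "a" * i)      # states: two prefixes collide
        if st in seen:
            j = seen[st]
            return "a" * j + "b" * i, "a" * i + "b" * i
        seen[st] = i

# A hypothetical 3-state DFA over {a, b}; any attempt fails the same way.
dfa = {"states": {0, 1, 2}, "start": 0,
       "delta": {(0, "a"): 1, (1, "a"): 2, (2, "a"): 2,
                 (0, "b"): 2, (1, "b"): 0, (2, "b"): 1}}
bad, good = refute(dfa)    # same verdict, but only 'good' is in the language
```

Here the machine reaches the same state on both "aabbb" and "aaabbb", so it cannot accept one and reject the other.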

8. [9/19/02] We began looking at context-free grammars and context-free languages. This material is in chapter 3 of the text. We will not cover very much material on context-free languages, to allow more time for other material.
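
As a small illustration (not from the text), the grammar S → aSb | ε generates exactly {a^n b^n : n ≥ 0}, a context-free language that was shown earlier not to be regular. The sketch below enumerates the strings it derives within a bounded number of rule applications.

```python
GRAMMAR = {"S": [["a", "S", "b"], []]}   # S -> aSb | epsilon

def derive(symbols, depth):
    """All terminal strings derivable from 'symbols' using at most
    'depth' rule applications (leftmost expansion)."""
    if not any(s in GRAMMAR for s in symbols):
        return {"".join(symbols)}        # all terminals: one string
    if depth == 0:
        return set()                     # expansions exhausted
    i = next(k for k, s in enumerate(symbols) if s in GRAMMAR)
    out = set()
    for rhs in GRAMMAR[symbols[i]]:
        out |= derive(symbols[:i] + rhs + symbols[i + 1:], depth - 1)
    return out
```

For example, derive(["S"], 4) yields the empty string, "ab", "aabb", and "aaabbb".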

9. [9/24/02] We went over the homework. Then we continued looking at context-free languages. We introduced nondeterministic push-down machines (NPDAs). The book demonstrates that NPDAs have the same power as context-free grammars. Given any context-free grammar G, you can find an NPDA M that accepts exactly the strings that G generates (no more and no less). And, given any NPDA M, you can find a context-free grammar G that generates every string that M accepts and no string that M rejects. The proof is fairly long and involved, and we will see plenty of long proofs elsewhere, so we will not go through this one.

10. [9/26/02] We showed that every regular language is also context-free.
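
The standard construction behind this fact turns a DFA into a right-linear grammar: one nonterminal A_q per state, a production A_q → c A_{δ(q,c)} for each transition, and A_q → ε for each accepting state. A Python sketch (the encoding of grammars as (head, body) pairs is an assumption for illustration):

```python
def dfa_to_cfg(states, alphabet, delta, start, accept):
    """Right-linear grammar equivalent to the given DFA; productions
    are (head, body) pairs, with [] standing for the empty string."""
    prods = []
    for q in states:
        for c in alphabet:
            prods.append((f"A{q}", [c, f"A{delta[(q, c)]}"]))
        if q in accept:
            prods.append((f"A{q}", []))      # A_q -> epsilon
    return f"A{start}", prods

# DFA over {a, b} accepting strings with an even number of a's.
start_sym, prods = dfa_to_cfg(
    states=[0, 1], alphabet="ab",
    delta={(0, "a"): 1, (0, "b"): 0, (1, "a"): 0, (1, "b"): 1},
    start=0, accept={0})
```

Every derivation in the resulting grammar mimics a run of the DFA, ending with the ε-production exactly when the run ends in an accepting state.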

11. [10/1/02] We finished talking about context-free languages, and then turned to the issue of general computation. We talked about the Turing machine model of computing.
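
To make the model concrete, here is a small single-tape Turing machine simulator (an illustrative sketch; the tuple encoding of transitions is an assumption, not the text's formalization). The example machine decides {a^n b^n} in the usual way: repeatedly mark the leftmost a as X, scan right to mark the matching b as Y, and finally check that only Y's remain.

```python
def simulate(trans, start, accept, inp, blank="_", max_steps=10_000):
    """Run a deterministic single-tape TM; missing transitions reject."""
    tape = dict(enumerate(inp))
    state, head = start, 0
    for _ in range(max_steps):
        if state == accept:
            return True
        sym = tape.get(head, blank)
        if (state, sym) not in trans:
            return False                     # halt and reject
        state, write, move = trans[(state, sym)]
        tape[head] = write
        head += 1 if move == "R" else -1
    raise RuntimeError("step bound exceeded")

# TM for {a^n b^n}: q0 marks an 'a', q1 scans right to mark a 'b',
# q2 returns left, q3 checks that only Y's remain.
trans = {
    ("q0", "a"): ("q1", "X", "R"), ("q0", "Y"): ("q3", "Y", "R"),
    ("q0", "_"): ("acc", "_", "R"),
    ("q1", "a"): ("q1", "a", "R"), ("q1", "Y"): ("q1", "Y", "R"),
    ("q1", "b"): ("q2", "Y", "L"),
    ("q2", "a"): ("q2", "a", "L"), ("q2", "Y"): ("q2", "Y", "L"),
    ("q2", "X"): ("q0", "X", "R"),
    ("q3", "Y"): ("q3", "Y", "R"), ("q3", "_"): ("acc", "_", "R"),
}
```

Note that this machine decides a language we proved is not regular, which already separates Turing machines from finite state machines.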

12. [10/3/02] We discussed the Church/Turing thesis, and as evidence for it showed how one kind of machine can simulate another. We looked at machines with multiple tracks, multiple heads, and multiple tapes, as well as machines with a random access memory, and saw that none is more powerful than a single-tape Turing machine. You can also strip a Turing machine down without losing power: a machine that has just two counters, which it can increment, decrement, and test for zero, is as powerful as a Turing machine.

13. [10/8/02] We introduced the halting problem and proved that it is undecidable using diagonalization.

14. [10/10/02] We looked at more examples of diagonalization, including the diagonal set K.
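
The diagonalization pattern itself is easy to demonstrate (an illustrative sketch, not from the course): against any list of functions f_0, f_1, ..., the function g(n) = f_n(n) + 1 disagrees with every f_n at the diagonal, so g cannot appear on the list. This is the same move used against an alleged enumeration of all computable functions.

```python
def diagonalize(fs):
    """Return g with g(n) != fs[n](n) for every valid index n."""
    return lambda n: fs[n](n) + 1     # flip the diagonal entry

fs = [lambda n: 0,          # f_0
      lambda n: n,          # f_1
      lambda n: n * n,      # f_2
      lambda n: 2 ** n]     # f_3
g = diagonalize(fs)
```

Here g(0) = 1, g(1) = 2, g(2) = 5, g(3) = 9, each differing from the corresponding f_n at n.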

15. [10/17/02] We proved Rice's theorem by generalizing the examples of diagonalization that we have done. Rice's theorem states that any nontrivial set of programs that respects equivalence (membership depends only on what a program computes, not on how it is written) is uncomputable.

16. [10/22/02] We introduced reductions and reducibility, and looked at properties of reductions.

17. [10/24/02] We continued to look at reductions. Post's Correspondence Problem was introduced.
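
A brute-force search makes the problem concrete (an illustrative sketch; the domino instance below is a standard textbook-style example, not necessarily the one from class). Trying all index sequences up to a bound semi-decides PCP: it finds a match whenever one exists within the bound, but no bound ever certifies that none exists.

```python
from itertools import product

def pcp_search(dominoes, max_len=6):
    """Look for indices i1..ik whose tops concatenate to the same
    string as their bottoms; None means none found within max_len."""
    for k in range(1, max_len + 1):
        for seq in product(range(len(dominoes)), repeat=k):
            top = "".join(dominoes[i][0] for i in seq)
            bottom = "".join(dominoes[i][1] for i in seq)
            if top == bottom:
                return list(seq)
    return None          # says nothing about longer matches

# Dominoes as (top, bottom) pairs.
dominoes = [("b", "ca"), ("a", "ab"), ("ca", "a"), ("abc", "c")]
match = pcp_search(dominoes)
```

On this instance the search finds the match a, b, ca, a, abc on top against ab, ca, a, ab, c on the bottom, both spelling "abcaaabc".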

18. [10/29/02] We went over a high-level picture of how to prove that Post's Correspondence Problem is uncomputable, and did part of the proof, a reduction from Modified Post's Correspondence Problem to Post's Correspondence Problem.

19. [10/31/02] We finished proving that Post's Correspondence Problem is uncomputable.

20. [11/5/02] We discussed the upcoming quiz, and proved that the ambiguity problem for context-free grammars is undecidable.

21. [11/7/02] Quiz 3. We introduced the concept of a partially computable language.

22. [11/12/02] We looked at enumerability, and proved that the enumerable languages are the same as the partially computable languages. We began to look at the idea of reduction.
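
One direction of the equivalence, building an enumerator from a partial decider, uses dovetailing, which can be sketched as follows. Here step_accept(s, k) is a stand-in (an assumption for illustration) for running the partial decider on s for k steps; in round k we try every short string for k steps, so a diverging computation never blocks the enumeration.

```python
from itertools import product

def strings_upto(alphabet, n):
    """All strings over the alphabet of length <= n, shortest first."""
    for k in range(n + 1):
        for t in product(alphabet, repeat=k):
            yield "".join(t)

def enumerate_language(step_accept, alphabet, rounds):
    """Dovetail over (string, step bound) pairs; emit each accepted
    string exactly once."""
    emitted = []
    for k in range(1, rounds + 1):
        for s in strings_upto(alphabet, k):
            if s not in emitted and step_accept(s, k):
                emitted.append(s)
    return emitted

# Stand-in partial decider: accepts strings containing "ab",
# pretending the run on s takes len(s) steps.
step_accept = lambda s, k: "ab" in s and len(s) <= k
```

Every string of the language eventually appears: a string accepted in t steps is emitted no later than round max(t, its length).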

23. [11/14/02] We explored m-reductions and m-completeness. We showed that the halting problem is m-complete.

24. [11/19/02] We introduced the classes P and NP. We looked at polynomial time reductions.
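
Membership in NP is about efficiently checkable certificates: for SAT, a satisfying truth assignment can be verified in linear time. A minimal sketch (the DIMACS-style literal encoding, a positive integer v for variable v and -v for its negation, is an assumed convention):

```python
def verify_sat(clauses, assignment):
    """Check the certificate 'assignment' (dict: variable -> bool)
    against a CNF formula; time is linear in the formula size."""
    return all(any(assignment[abs(lit)] == (lit > 0) for lit in clause)
               for clause in clauses)

# (x1 or not x2) and (x2 or x3)
cnf = [(1, -2), (2, 3)]
```

Finding such an assignment may take exponential search as far as anyone knows; checking one is easy, which is exactly the P versus NP gap.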

25. [11/21/02] We began to explore polynomial time reductions and NP-completeness. We defined the GSAT problem and noted (but did not prove) that it is NP-complete.

26. [11/26/02] We continued to look at NP-complete problems. We defined the SAT problem and showed that it is NP-complete by reducing the GSAT problem to it. Then we defined the 3-SAT problem, and showed that it is NP-complete by reducing the SAT problem to it. We finished by defining the Vertex Cover problem, and noted that it is in NP.
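
The clause-splitting step of the SAT-to-3-SAT reduction is short enough to sketch (DIMACS-style integer literals, an assumed encoding): a long clause (l1 or ... or lk) becomes a chain of 3-literal clauses linked by fresh variables, and short clauses are padded by repeating a literal; the result is satisfiable exactly when the original is.

```python
from itertools import count

def to_3sat(clauses, num_vars):
    """Reduce a CNF instance to an equisatisfiable 3-CNF instance."""
    fresh = count(num_vars + 1)            # fresh chaining variables
    out = []
    for c in clauses:
        if len(c) <= 3:
            out.append(tuple((list(c) * 3)[:3]))   # pad by repetition
        else:
            y = next(fresh)
            out.append((c[0], c[1], y))
            for lit in c[2:-2]:            # middle literals get chained
                z = next(fresh)
                out.append((-y, lit, z))
                y = z
            out.append((-y, c[-2], c[-1]))
    return out
```

For instance, the clause (l1 or l2 or l3 or l4 or l5) becomes (l1 or l2 or y1), (not y1 or l3 or y2), (not y2 or l4 or l5), and the output size is linear in the input size, so the reduction runs in polynomial time.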

27. [11/28/02] Break.

28. [12/3/02] We proved that the Vertex Cover and Independent Set problems are NP-complete.
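
The reduction between the two problems rests on a one-line fact: S is an independent set if and only if V minus S is a vertex cover, since an edge avoids S exactly when it touches the complement. A quick sketch checking this on a small graph (the graph itself is illustrative):

```python
from itertools import combinations

def is_vertex_cover(edges, s):
    """Every edge has at least one endpoint in s."""
    return all(u in s or v in s for u, v in edges)

def is_independent_set(edges, s):
    """No edge has both endpoints in s."""
    return not any(u in s and v in s for u, v in edges)

# A path on four vertices.
V = {1, 2, 3, 4}
E = [(1, 2), (2, 3), (3, 4)]
```

The equivalence means a size-k independent set exists exactly when a size-(|V| - k) vertex cover does, which is the whole polynomial-time reduction.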

29. [12/5/02] Quiz 2. We will look more at NP-completeness.