\documentclass[10pt]{article}
\usepackage{fullpage}
\usepackage{amsfonts}
\usepackage{amssymb}
\usepackage{epsfig}
\begin{document}
\input{preamble.tex}
\lecture{1}{Feb 4, 2009}{Madhu Sudan}{Mergen Nachin}
\section{Administrative Information}
\begin{itemize}
\item Lecturer: Madhu Sudan (madhu@mit.edu)
\item TA: Brenden Juba (bjuba@mit.edu)
\item Website: \verb+http://courses.csail.mit.edu/6.841/+
\end{itemize}
The grading will be based on the following.
\begin{itemize}
\item Scribing - You must scribe at least one lecture, whether you are taking the class for credit or sitting in as a listener.
\item Problem sets - There will be roughly 3 problem sets throughout the semester.
\item Participation - We encourage people to speak up, discuss and ask questions during the lecture.
\item Project - Read papers about some topic and present it to the class (with additional progress, if possible).
\end{itemize}
\section{High-level overview of Computational Complexity}
Computational Complexity is concerned with the study of
\begin{itemize}
\item \emph{Interesting} computational problems.
\item \emph{Interesting} resources, such as time and space.
\item Feasibility and infeasibility - that is, proving upper and lower bounds.
Unfortunately, we have very few results on lower bounds for time or space.
On the other hand, we have made considerable progress on \emph{comparative} lower bounds.
For example, we compare two problems and conclude the following: if problem A requires
a certain amount of some resource, then at least a certain amount of that
resource is needed to solve problem B.
\end{itemize}
How do we define ``interesting''? This is a very subjective choice. For example,
a problem might be interesting if it has many applications in the real world,
or if many other problems can be reduced to instances of it.
Once we find an ``interesting'' problem, we want to determine how much time
and space suffice to solve it, and how much is necessary.
\subsection{Examples of ``interesting'' problems}
The following three problems are presented as ``interesting''.
\begin{itemize}
\item \#SAT (``number-SAT''): Given a 3CNF formula $\phi$ on $n$
variables $x_1,\ldots,x_n$ with $m$ clauses $c_1,\ldots,c_m$ (so $\phi
= c_1 \wedge \ldots \wedge c_m$ and each $c_i$ looks something like
$x_{i_1} \vee \bar{x_{i_2}} \vee x_{i_3}$), count the number of
satisfying assignments.
\item CNF minimization: Given a 3CNF formula $\phi$ with $m$ clauses
and an integer $m' < m$, does there exist a 3CNF $\psi$ with at most
$m'$ clauses such that $\psi \equiv \phi$? (We say that $\psi \equiv
\phi$ if, for each assignment $a$ of $x_1, \ldots, x_n$, we have
$\psi(a) = \phi(a)$.)
\item Permanent: Given an $n \times n$ matrix $A=\left[a_{ij}\right]$
of integers, compute the \emph{permanent} of $A$. We define
$\mathrm{perm}(A) = \sum_{\pi:[n]\rightarrow[n]} \left(\prod_{i=1}^{n} a_{i
\pi(i)}\right)$, where $[n]=\{1,\ldots,n\}$ and $\pi$ ranges over
permutations (so $i \neq j$ implies $\pi(i) \neq \pi(j)$). This is
essentially the formula for the determinant of a matrix, without the
sign attached to each $\pi$.
\end{itemize}
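To make the \#SAT definition concrete, here is a minimal brute-force counter that simply tries all $2^n$ assignments. The clause encoding (signed integers: $+i$ for $x_i$, $-i$ for $\bar{x_i}$) is an assumed convention, not something fixed by these notes.

```python
from itertools import product

def count_sat(n, clauses):
    """Count satisfying assignments of a CNF formula on n variables.

    clauses: list of clauses; each clause is a tuple of nonzero ints,
    where +i stands for x_i and -i for its negation (an assumed
    encoding, in the style of the DIMACS format).
    """
    count = 0
    for a in product([False, True], repeat=n):
        # a[i-1] is the truth value assigned to x_i; a clause is
        # satisfied if at least one of its literals evaluates to True.
        if all(any(a[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            count += 1
    return count
```

For example, $\phi = (x_1 \vee x_2 \vee x_3) \wedge (\bar{x_1} \vee \bar{x_2} \vee x_3)$ excludes exactly the two assignments $(F,F,F)$ and $(T,T,F)$, so \texttt{count\_sat(3, [(1,2,3), (-1,-2,3)])} returns 6.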
Let's see how these problems fit our definition of ``interesting''.
CNF minimization is interesting because it has a real-world application:
engineers want to optimize circuits so that they use as few gates as possible.
\#SAT is interesting because we encounter it very often. Many types
of problems are reducible to \#SAT, so it is a representative of a
larger class of problems. As a side note, we don't know which of CNF minimization
and \#SAT is easier, but we will prove some facts about their relationship later in the course. \\
As it turns out, the other two example problems are reducible to each
other. The fact that \#SAT $\equiv$ Permanent was proved by Valiant,
who argued that counting is an interesting ``phenomenon.'' To that
end, he created a complexity class, ``Algebraic-NP'', for which
Permanent is complete. Furthermore, Lipton showed that
worst-case instances of Permanent reduce to random instances,
implying that the worst-case complexity of the problem is at most its
average-case complexity. These problems were also related to CNF
minimization by Toda, who showed that CNF minimization $\le$
(\emph{reduces to}) \#SAT. \\
The Permanent problem is a wonderful concept in the mathematical world but, unfortunately,
we don't know how to compute it in $o(2^n)$ time (as opposed to the $O(n^3)$ algorithm for computing a determinant).
How much space do we need to compute the permanent of an $n \times n$ matrix? We can solve it in $O(n \log n)$ space. We also know that we cannot solve it in $o(\log n)$ space. But as you can see, there is a huge gap between these bounds. \\
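The defining formula for the permanent translates directly into a brute-force procedure that sums over all $n!$ permutations. The sketch below is for illustration only; even the best known exact algorithms (based on Ryser's formula) still need roughly $2^n$ time.

```python
from itertools import permutations

def permanent(A):
    """Permanent of a square matrix A (given as a list of rows),
    straight from the definition: the sum over all permutations pi
    of the product of A[i][pi(i)]. Enumerates n! terms -- for
    illustration only, not an efficient algorithm.
    """
    n = len(A)
    total = 0
    for pi in permutations(range(n)):
        prod = 1
        for i in range(n):
            prod *= A[i][pi[i]]
        total += prod
    return total
```

For the $2 \times 2$ matrix with rows $(1,2)$ and $(3,4)$, the permanent is $1\cdot 4 + 2\cdot 3 = 10$, whereas the determinant attaches a minus sign to the second term and gives $-2$.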
\fig{fig01}{0pt}{Diagram of relations between problems}{l01-fig01.pdf}
\fig{fig02}{0pt}{Relationship among L,NL,coNL,P,NP,coNP and PSPACE}{l01-fig02.pdf}
Once we relate these problems, we can see that some of them may share some common properties and
form an interesting class. For example, we can draw relations between \#SAT, CNF-Min, Permanent
and many others, obtaining the diagram in Figure \ref{fig01}. This diagram suggests that the problems labeled red may form one class, and the problems labeled green may form another. \\
From 6.840, we know the following relations among complexity classes (see Figure \ref{fig02}). This diagram shows conjectured relations among the classes. We only know that L $\neq$ PSPACE, from the space hierarchy theorem. In particular, L might equal P, or P might equal PSPACE, but both equalities cannot hold at the same time. We don't know much about the relationship between RP and NP.
\section{Technical Definitions}
Now we'll give a few precise definitions that will be used throughout
the course. To begin with, we'll be dealing with what we call
\emph{computational problems}. We can define computational problems in three following ways.
\begin{itemize}
\item Set or language: $L \subseteq \{0,1\}^*$. Given $x \in \{0,1\}^*$, answer whether $x \in L$ (membership in a set).
\item Function. We are given a function $f$ which maps an input $x$ to
an output $f(x)$, leading to the problem of finding $f(x)$ given $x$.
\item Relation. We are given a relation $R \subseteq
\{0,1\}^* \times \{0,1\}^*$ and an input $x$, and must produce an output $y$
such that $(x, y) \in R$. This is essentially the definition of a search problem.
\end{itemize}
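The three views above can all be illustrated on a single underlying object, CNF satisfiability. This is a minimal sketch with an assumed clause encoding ($+i$ for $x_i$, $-i$ for its negation); none of the function names below come from the notes.

```python
from itertools import product

# One underlying object -- CNF satisfiability -- seen as all three
# problem types. Assumed clause encoding: +i means x_i, -i its negation.

def satisfies(a, clauses):
    """True iff assignment a (tuple of booleans) satisfies every clause."""
    return all(any(a[abs(lit) - 1] == (lit > 0) for lit in c)
               for c in clauses)

def sat_language(n, clauses):
    """Language view: decide membership, i.e., is this formula in SAT?"""
    return any(satisfies(a, clauses)
               for a in product([False, True], repeat=n))

def count_function(n, clauses):
    """Function view: compute f(x), here the number of satisfying
    assignments (exactly the #SAT problem)."""
    return sum(satisfies(a, clauses)
               for a in product([False, True], repeat=n))

def search_relation(n, clauses):
    """Relation view: produce some y with (x, y) in R -- here, a
    satisfying assignment -- or None if no such y exists."""
    for a in product([False, True], repeat=n):
        if satisfies(a, clauses):
            return a
    return None
```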
Why do we focus on languages (rather than relations)? The reason is
that for languages there is a simple way of talking about reductions
between problems. It is much easier to find comparisons and reductions
between decision problems than between search problems.
\subsection{Reductions}
Reduction allows us to formally compare the relative hardness of
different computational problems. There are two main types of reductions
in complexity theory:
\begin{itemize}
\item {\bf Turing reductions (also called Cook reductions)} The most general type of reduction
between problems (which works
for relations) is of the following type.
\begin{itemize}
\item $R_1 \leq R_2$ (i.e., $R_1$ reduces to $R_2$):
Given a subroutine to solve $R_2$, give an algorithm to solve $R_1$.
\end{itemize}
\item {\bf Karp reductions or Many-One reductions} Reduction between languages can be defined
in the following way:
\begin{itemize}
\item A Karp reduction from $L_1$ to $L_2$ ($L_1 \leq L_2$) is
an algorithm $T : \{0,1\}^* \rightarrow \{0,1\}^*$ such that
$$x\in L_1 \Leftrightarrow T(x) \in L_2$$
\end{itemize}
\end{itemize}
We will use the latter kind of reduction for the rest of this course because
Karp reductions are (empirically) easier to come up with. Note that there are cases where a Turing reduction
exists but a Karp reduction does not (or is not known). For instance, consider the language co-SAT $=$ \{$\phi$ : $\phi$ is NOT satisfiable\}. SAT is Turing reducible to co-SAT (just call an algorithm that solves co-SAT as a subroutine and negate the
answer). But we believe that SAT is not Karp reducible to co-SAT: such a reduction would essentially amount to proving that a CNF formula has no satisfying assignment with a proof whose size is polynomial in the size of the input. \\
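The Turing reduction from SAT to co-SAT is just one oracle call plus a negation. In the sketch below the co-SAT oracle is hypothetical and is stubbed out by brute force, with an assumed signed-integer clause encoding ($+i$ for $x_i$, $-i$ for its negation).

```python
from itertools import product

def co_sat_oracle(n, clauses):
    """Stand-in for a co-SAT oracle, stubbed here by brute force:
    returns True iff the CNF formula has NO satisfying assignment.
    Assumed clause encoding: +i means x_i, -i its negation."""
    for a in product([False, True], repeat=n):
        if all(any(a[abs(lit) - 1] == (lit > 0) for lit in c)
               for c in clauses):
            return False  # found a satisfying assignment
    return True

def sat_via_co_sat(n, clauses):
    """Turing reduction SAT <= co-SAT: one oracle call, negated answer."""
    return not co_sat_oracle(n, clauses)
```

Note that this single call-and-negate step is exactly what a Karp reduction cannot do: a Karp reduction may only transform the instance, not post-process the oracle's answer.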
\section{Course overview}
The following is roughly what we will do in 6.841.
\begin{itemize}
\item Lower Bounds - We will prove a few of the known lower bounds on time and space.
\item Resources - We will look at another concept, called Alternation.
This is a new resource, but it gives us further insight into
classical resources such as time and space.
For example, consider the possibility (A) that ``SAT $\in$
Logspace",
and the possibility (B) that ``SAT $\in$ LinearTime".
We don't know whether they are true individually. We believe
(at least, some of us) that
neither is true (since both are special cases of the belief that
NP $\ne$ P). But using ``alternation", we can prove that at
most one of (A) and (B) is true.\footnote{Thanks to Michael Forbes
for pointing out an error here in an earlier version of these notes.}
\item New Concepts - such as proofs, interaction, pseudorandomness and knowledge.
\item Average case complexity vs worst case.
\item Quantum computation - questions about what we can theoretically compute in our physical world.
\end{itemize}
Any suggestions about additional topics are welcome. \\
From 6.840, you should know the following.
\begin{itemize}
\item Time Hierarchy - E.g., TIME($n^2$) $\subsetneq$ TIME($n^{10}$). The proof is by diagonalization. Remember that there is a $\log(n)$ slow-down in the simulation.
\item Space Hierarchy - E.g., SPACE($n^2$) $\subsetneq$ SPACE($n^2\log(\log n)$). Again, the proof is by diagonalization.
\item TIME($t$) $\subseteq$ SPACE($t$) $\subseteq$ TIME($2^{O(t)}$). The first containment is trivial. The second is by simulation.
\item NTIME($t$) $\subseteq$ TIME($2^{O(t)}$). Again by simulation.
\item NSPACE($s$) $\subseteq$ SPACE($s^2$). Due to Savitch's theorem.
\item NSPACE($s$) $\subseteq$ coNSPACE($O(s)$). By Szelepcs\'enyi and Immerman (independently).
\item NP and NP-Completeness.
\end{itemize}
\end{document}