\documentclass{article} \usepackage{me} \setlength{\parindent}{0pt}
\def\OPT{\mbox{OPT}}

Online algorithms

Motivation: decisions must be made as requests arrive, without knowledge of the future; we compare against an optimal offline algorithm that sees the whole request sequence.

Definition: competitive ratio. A deterministic online algorithm $A$ is $k$-competitive if for every request sequence $\sigma$, $C_A(\sigma) \le k\cdot\OPT(\sigma)+O(1)$.

Finance

The duration may be known or unknown, but assume we know which offer is the last.

Need the fluctuation ratio $\phi = M/m$ between the largest price $M$ and the smallest price $m$.

Selling peanuts:

Selling (one) car: best deterministic algorithm: accept the first price exceeding $\sqrt{Mm}$ (take the last offer if none does). This is $\sqrt{\phi}$-competitive.

With randomization, we can achieve $O(\log\phi)$-competitive.
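A minimal Python sketch of both strategies, under the assumption that prices arrive one at a time, the trader knows $M$ and $m$, and the last offer must be taken if no earlier one is accepted; the function names and the power-of-two threshold grid are illustrative choices, not fixed by the notes.

\begin{verbatim}
import math, random

def deterministic_seller(prices, M, m):
    """Accept the first price >= sqrt(M*m); if none qualifies, take the
    last offer.  sqrt(phi)-competitive, phi = M/m."""
    threshold = math.sqrt(M * m)
    for i, p in enumerate(prices):
        if p >= threshold or i == len(prices) - 1:
            return p

def randomized_seller(prices, M, m):
    """Pick a reservation price m * 2^i with i uniform in
    {0, ..., ceil(log2 phi) - 1}; expected ratio O(log phi)."""
    phi = M / m
    i = random.randrange(max(1, math.ceil(math.log2(phi))))
    threshold = m * (2 ** i)
    for j, p in enumerate(prices):
        if p >= threshold or j == len(prices) - 1:
            return p
\end{verbatim}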

Graham's Rule

Define $P||C_{\max}$: schedule jobs online on $m$ identical machines to minimize the makespan (the maximum machine load).

NP-hard to solve exactly, even offline!

Always assign the next job to the currently least loaded machine:
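A short Python sketch of this greedy rule (Graham's list scheduling), assuming jobs arrive online as bare processing times; it achieves makespan at most $(2-1/m)$ times optimal. Names are illustrative.

\begin{verbatim}
import heapq

def graham_list_schedule(jobs, m):
    """Graham's rule: give each arriving job to the currently least
    loaded of m machines; return the resulting makespan."""
    loads = [0.0] * m              # min-heap of machine loads
    heapq.heapify(loads)
    for p in jobs:                 # jobs = processing times, online order
        least = heapq.heappop(loads)
        heapq.heappush(loads, least + p)
    return max(loads)
\end{verbatim}

For example, jobs $(3,3,2,2,2)$ on $m=2$ machines give makespan $7$, while the optimum is $6$.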

More recent algorithms do somewhat better:

Paging problem

Easy argument: $(k+1)$-competitive.

With work, we can improve to $k$ (worth it for tightness):

Generalize: LRU with a $k$-page cache is $k/(k-h+1)$-competitive against an $h$-page adversary ($h \le k$).
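A small Python sketch of LRU with a $k$-page cache, counting page faults; the representation and names are our own, not from the notes.

\begin{verbatim}
from collections import OrderedDict

def lru_faults(requests, k):
    """Serve a page-request sequence with a k-page cache under LRU
    (evict the least recently used page on a fault); count faults."""
    cache = OrderedDict()                  # pages in recency order
    faults = 0
    for page in requests:
        if page in cache:
            cache.move_to_end(page)        # refresh recency
        else:
            faults += 1
            if len(cache) == k:
                cache.popitem(last=False)  # evict least recently used
            cache[page] = True
    return faults
\end{verbatim}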

Observations:

Lower bound: no deterministic online algorithm beats $k$-competitive.

Observations:

Optimal offline algorithm: Longest Forward Distance (evict the page whose next request is furthest in the future).
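A Python sketch of Longest Forward Distance (Belady's rule); it needs the entire future request sequence, so it is offline only. The quadratic scan is kept for clarity and the names are ours.

\begin{verbatim}
def lfd_faults(requests, k):
    """Belady's rule (Longest Forward Distance): on a fault, evict the
    cached page whose next request is furthest in the future (or never
    requested again); count faults."""
    cache, faults = set(), 0
    for i, page in enumerate(requests):
        if page in cache:
            continue
        faults += 1
        if len(cache) == k:
            def next_use(q):               # index of q's next request
                for j in range(i + 1, len(requests)):
                    if requests[j] == q:
                        return j
                return float('inf')        # never requested again
            cache.remove(max(cache, key=next_use))
        cache.add(page)
    return faults
\end{verbatim}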

Move to front

We study heuristics for reorganizing a list after each access.
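A minimal Python sketch of move-to-front, charging an access cost equal to the position of the requested item; the exact cost accounting and the names here are our assumptions.

\begin{verbatim}
def mtf_cost(requests, items):
    """Move-to-front: scan the list from the front (cost = position of
    the accessed item), then move that item to the front; return the
    total access cost."""
    lst = list(items)
    total = 0
    for x in requests:
        pos = lst.index(x)         # 0-based position of the item
        total += pos + 1           # pay 1 per element inspected
        lst.pop(pos)
        lst.insert(0, x)           # move accessed item to the front
    return total
\end{verbatim}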

Potential function: number of inversions (pairs of items whose relative order differs between our list and $\OPT$'s list).

Strengthen

Lower bound:

Note: our competitiveness bound assumes $\OPT$ pays 1 per transposition.

2011 End lecture 20 \vspace{0.5cm}

Randomized Online Algorithms

Online computation can be viewed as a two-player zero-sum game between the algorithm designer and an adversary. It is well known that optimal strategies in such games can require randomization.

A randomized online algorithm is a probability distribution over deterministic online algorithms.

A randomized algorithm $A$ is $k$-competitive if for every $\sigma$, $E[C_A(\sigma)] \le k\cdot \OPT(\sigma)+O(1)$.

Adversaries: oblivious (fixes $\sigma$ in advance, blind to the algorithm's coin flips), adaptive online, and adaptive offline.

Randomized Paging

Idea: evict random page?

Marking algorithm:
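A Python sketch of the marking algorithm, assuming a cache of $k$ pages and uniform random eviction among unmarked pages; names and fault counting are our own.

\begin{verbatim}
import random

def marking_faults(requests, k):
    """Randomized marking: mark a page whenever it is requested; on a
    fault, evict a uniformly random unmarked page, first clearing all
    marks (new phase) if every cached page is marked; count faults."""
    cache, marked, faults = set(), set(), 0
    for page in requests:
        if page in cache:
            marked.add(page)
            continue
        faults += 1
        if len(cache) == k:
            if not (cache - marked):       # all marked: new phase
                marked.clear()
            victim = random.choice(list(cache - marked))
            cache.remove(victim)
        cache.add(page)
        marked.add(page)
    return faults
\end{verbatim}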

Fiat et al. proved: Marking is $O(\log k)$-competitive with a cache of $k$ pages.

Phases:

Analysis:

Online cost:

Offline cost:

Summary: if online pays $x\log k$, offline pays at least $x/2$, so Marking is $O(\log k)$-competitive.

Lower Bounds via Yao's Minimax Principle

Online computation is a two-player game:

Formalize:
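One convenient working form (the direction of Yao's principle used for lower bounds), stated here as our reading of the standard argument rather than as the notes' exact formalization: if there is a distribution $D$ over request sequences with $E_{\sigma\sim D}[\OPT(\sigma)]\rightarrow\infty$ such that
\[
E_{\sigma\sim D}[C_A(\sigma)] \;\ge\; c\cdot E_{\sigma\sim D}[\OPT(\sigma)] - O(1)
\qquad \mbox{for every deterministic online algorithm } A,
\]
then no randomized online algorithm is better than $c$-competitive against an oblivious adversary.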

For paging: draw each request uniformly at random from a fixed set of $k+1$ pages. Any deterministic algorithm with $k$ cache slots faults with probability at least $1/(k+1)$ per request, while $\OPT$ faults only about once per coupon-collector phase of expected length $\approx (k+1)H_{k+1}$, giving an $H_k = \Omega(\log k)$ lower bound.

2011 End Lecture 21. 2012 End Lecture 23.
2013 Start Lecture 25.

This analysis applies to an oblivious adversary. What about other models?

$k$-server

Manasse et al. 1988.

Definition: $k$ mobile servers in a metric space; each request is a point, and some server must move to it; minimize the total distance moved by the servers.

Greedy doesn't work:

Fancy algorithmics:

On a Line

A truly ``On Line'' algorithm.

Greedy is bad if requests alternate between two nearby points $a$ and $b$ while the second server sits at a distant point $c$: greedy serves every request with the nearer server, shuttling it between $a$ and $b$ forever, so its cost grows without bound, while $\OPT$ pays once to bring the server at $c$ over and then pays nothing.

Intuition

Problem: when a request falls between two servers, which of the two nearest should move?

Double coverage algorithm (DC): if the request falls between two servers, move both toward it at equal speed until one reaches it; if the request lies outside all servers, move only the nearest server.
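A Python sketch of a single DC step on the line, assuming server positions are real numbers; the function name and list representation are our own.

\begin{verbatim}
def double_coverage_step(servers, r):
    """One step of Double Coverage on the real line: if the request r
    lies between two servers, move both neighbors toward it equally
    until one arrives; otherwise move only the nearest server."""
    s = sorted(servers)
    if r <= s[0]:                  # request left of every server
        s[0] = r
    elif r >= s[-1]:               # request right of every server
        s[-1] = r
    else:
        i = max(j for j in range(len(s)) if s[j] <= r)
        a, b = s[i], s[i + 1]      # nearest servers on each side of r
        d = min(r - a, b - r)
        s[i], s[i + 1] = a + d, b - d   # the closer one reaches r
    return s
\end{verbatim}

For example, servers at $[0,10]$ with a request at $4$ become $[4,6]$: both neighbors move $4$ units and the left one serves the request.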

DC is $k$-competitive.

Analysis: use the potential $\Phi = k\cdot M_{\min} + \sum_{i<j}|s_i - s_j|$, where $M_{\min}$ is the cost of a minimum matching between DC's servers and $\OPT$'s servers.

Can we do better? No: paging is a special case of $k$-server (the uniform metric), so the deterministic lower bound of $k$ carries over.