Introduction to Algorithms
Algorithms
Algorithm: a process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer.
As computer scientists, we implement algorithms by having computers:

Perform simple calculations

Store the results

Make simple decisions

Do things over and over again as fast as possible
Data Structures
Data structure: a collection of data values, the relationships among them, and the functions or operations that can be applied to the data.
As Java programmers we implement more complicated data structures using a mix of:

Primitive types and objects to store and organize data values

Existing data structures like arrays

References to reflect relationships among objects
Algorithms and Data Structures
Algorithms and data structures are highly complementary:

We will implement algorithms that utilize specific features of data structures

We will implement data structures to support specific algorithms

We will use our existing imperative and object-oriented programming ideas

And we’ll introduce a few more important ideas along the way
Example: Greatest Common Divisor
In mathematics, the greatest common divisor (GCD) of two or more integers, which are not all zero, is the largest positive integer that divides each of the integers.
But If We’re In A Hurry…
What’s a simpler approach?
Brute Force Solution
Brute force solution: a very general problem-solving technique that consists of systematically enumerating all possible candidates for the solution and checking whether each candidate satisfies the problem’s statement.

Computers today are very, very fast

So try the simple thing first

If it’s too slow, try something a bit more sophisticated
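As a concrete sketch, the brute-force approach to GCD can just count down from the smaller input and return the first value that divides both. (The class and method names here are illustrative, not from the course code, and positive inputs are assumed.)

```java
// Brute-force GCD: systematically try every candidate divisor,
// largest first, and return the first one that divides both inputs.
// Assumes both inputs are positive.
public class BruteForceGCD {
  static int gcd(int m, int n) {
    for (int candidate = Math.min(m, n); candidate > 1; candidate--) {
      if (m % candidate == 0 && n % candidate == 0) {
        return candidate;
      }
    }
    return 1; // 1 divides everything
  }

  public static void main(String[] args) {
    System.out.println(gcd(4, 6));      // 2
    System.out.println(gcd(185, 2045)); // 5
  }
}
```

Simple, obviously correct, and fast enough for small inputs, which is often all we need at first.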
You Don’t Need the Fastest Algorithm to Change the World
That’s a good thing!
But Speed Eventually Matters
Even if you don’t care at the beginning, eventually you will start to care about how fast your code runs, for any number of the following reasons:

You’re embarrassed that your algorithm makes your incredibly fast computer seem slow

You have to start paying for machines

Your customer tells you that your program is too slow

You’re in a job interview
So How Long Will It Take?
How long will our brute force GCD algorithm take?

To compute the GCD of 4 and 6

To compute the GCD of 185 and 2045

To compute the GCD of M and N
Algorithm Analysis
Algorithm analysis: the determination of the computational complexity of algorithms, that is the amount of time, storage and/or other resources necessary to execute them.
At The Limit
We usually want to analyze an algorithm in the general case, rather than for a specific set of inputs.

How does the algorithm perform on arbitrarily difficult or large inputs?

What are the best, average, and worst-case running times?

How is the algorithm’s performance related to its inputs?
Big-O Notation
Big-O notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity.
Put another way: we want to estimate what happens as the problem gets really, really hard.
Big-O Notation
O(1)
int[] myArray = new int[1024];
int getArrayValue = myArray[10]; // This is constant time
O(1) is sometimes called constant time.
Life is good and livin' is easy. But we’re usually not this lucky.
O(n)
int[] myArray = new int[1024];
int sum = 0;
// A single loop through an array is usually O(n)
for (int arrayValue : myArray) {
  sum += arrayValue;
}
O(n) is still not bad.
Frequently we have to see each value in an array or other data structure at least once, so sometimes O(n) is the best we can do.
Big-O Notation
O(n)
int[] myArray = new int[1024];
int lookingFor = 10; // some value to search for
for (int arrayValue : myArray) {
  if (arrayValue == lookingFor) {
    break;
  }
}
What about the example above?

Best case: it’s the first element

Worst case: it’s the last element

Average case: O(n / 2), which we usually simplify to just O(n)
O(n^2)
boolean isSorted(int[] array) {
  for (int i = 0; i < array.length; i++) {
    for (int j = i; j < array.length; j++) {
      if (array[j] < array[i]) {
        return false;
      }
    }
  }
  return true;
}
Now things are getting bad.

If we need to both loop through an array and compare every element with every other element, we end up with an O(n^2) algorithm.

You can identify it by the nested loops.
O(n^2)
boolean isSorted(int[] array) {
  for (int i = 0; i < array.length; i++) {
    for (int j = i; j < array.length; j++) {
      if (array[j] < array[i]) {
        return false;
      }
    }
  }
  return true;
}

Best case: an out-of-order element appears right at the beginning

Worst case: the array is sorted

Average case: O(n^2)
Big-O Notation
O(log n) and O(n log n)
The logarithmic growth rates are usually caused by features of problems that we haven’t seen yet—but will soon.

If every step of the algorithm cuts the size of the problem in half, then you end up with an O(log n) runtime.

Recursive algorithms frequently have this property.
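Binary search is a classic example of this halving behavior. Here is a sketch (the class and method names are illustrative):

```java
// Binary search on a sorted array: each comparison discards half of
// the remaining range, so the runtime is O(log n).
public class BinarySearch {
  static int indexOf(int[] sorted, int target) {
    int low = 0;
    int high = sorted.length - 1;
    while (low <= high) {
      int mid = low + (high - low) / 2;
      if (sorted[mid] == target) {
        return mid;
      } else if (sorted[mid] < target) {
        low = mid + 1;  // target must be in the upper half
      } else {
        high = mid - 1; // target must be in the lower half
      }
    }
    return -1; // not found
  }
}
```

Note that this only works because the array is sorted: that is what lets each comparison safely discard half of the remaining candidates.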
Dumb Algorithm, Clever Algorithm
A dumb algorithm can move a problem up in the runtime categorization: for example, from O(n) to O(n^2). (Our sorted-array check is dumb: the problem is O(n).)
A smart algorithm can move a problem down in the runtime categorization: for example, from O(n^2) to O(n log n). (Euclid’s method for GCD is smart: the problem is O(log N).)
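For reference, one common way to write Euclid’s method uses repeated remainders. This is a sketch, assuming nonnegative inputs; the names are illustrative:

```java
// Euclid's method: gcd(m, n) == gcd(n, m % n), and the values shrink
// rapidly toward zero, so this takes only O(log N) steps.
public class EuclidGCD {
  static int gcd(int m, int n) {
    while (n != 0) {
      int remainder = m % n;
      m = n;
      n = remainder;
    }
    return m;
  }
}
```

Compare this with counting down through every candidate divisor: for inputs like 185 and 2045 the loop here runs only a handful of times.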
Announcements

The MP3 early deadline is today at 5PM. Get your 50 points!

My office hours will be around 3:30PM in Siebel 0403.