Algorithm Runtime
Algorithms
Algorithm: a process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer.
As computer scientists, we implement algorithms by having computers:

Perform simple calculations

Store the results

Make simple decisions

Do things over and over again as fast as possible
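A tiny example (my own illustration, not from the slides) that uses all four ingredients to sum the even numbers from 1 to 10:

```java
public class Example {
  public static void main(String[] args) {
    int sum = 0;                      // store the results
    for (int i = 1; i <= 10; i++) {   // do things over and over again
      if (i % 2 == 0) {               // make simple decisions
        sum += i;                     // perform simple calculations
      }
    }
    System.out.println(sum);          // prints 30
  }
}
```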
Data Structures
Data structure: a collection of data values, the relationships among them, and the functions or operations that can be applied to the data.
As Java programmers we implement more complicated data structures using a mix of:

Primitive types and objects to store and organize data values

Existing data structures like arrays

References to reflect relationships among objects
Algorithms and Data Structures
Algorithms and data structures are highly complementary:

We will implement algorithms that utilize specific features of data structures

We will implement data structures to support specific algorithms

We will use our existing imperative and object-oriented ideas along the way

And we’ll introduce a few more important ideas along the way
So How Long Will It Take?
How long will our brute force GCD algorithm take?

To compute the GCD of 4 and 6

To compute the GCD of 185 and 2045

To compute the GCD of M and N
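One way the brute force GCD might be sketched (an assumed version that counts down from the smaller input; the course's actual code may differ):

```java
public class Gcd {
  // Brute force: try every candidate from min(m, n) down to 2.
  // Worst case, the loop runs about min(m, n) times: O(min(M, N)).
  static int gcd(int m, int n) {
    for (int candidate = Math.min(m, n); candidate > 1; candidate--) {
      if (m % candidate == 0 && n % candidate == 0) {
        return candidate;
      }
    }
    return 1; // 1 divides everything
  }

  public static void main(String[] args) {
    System.out.println(gcd(4, 6));      // 2
    System.out.println(gcd(185, 2045)); // 5
  }
}
```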
Algorithm Analysis
Algorithm analysis: the determination of the computational complexity of algorithms, that is the amount of time, storage and/or other resources necessary to execute them.
At The Limit
We usually want to analyze an algorithm in the general case, rather than for a specific set of inputs.

How does the algorithm perform on arbitrarily difficult or large inputs?

What are the best, average, and worst-case running times?

How is the algorithm’s performance related to its inputs?
Big-O Notation
Big-O notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity.
Put another way: we want to estimate what happens as the problem gets really, really hard.
Big-O Notation
O(1)
int[] myArray = new int[1024];
int getArrayValue = myArray[10]; // This is constant time
O(1) is sometimes called constant time.
Life is good and livin' is easy. But we’re usually not this lucky.
O(n)
int[] myArray = new int[1024];
int sum = 0;
// A single loop through an array is usually O(n)
for (int arrayValue : myArray) {
  sum += arrayValue;
}
O(n) is still not bad.
Frequently we have to see each value in an array or other data structure at least once, so sometimes O(n) is the best we can do.
Big-O Notation
O(n)
int[] myArray = new int[1024];
int lookingFor = 5; // some value we're searching for
for (int arrayValue : myArray) {
  if (arrayValue == lookingFor) {
    break;
  }
}
What about the example above?

Best case: it’s the first element

Worst case: it’s the last element

Average case: O(n / 2), which we usually simplify to just O(n)
O(n^2)
boolean isSorted(int[] array) {
  for (int i = 0; i < array.length; i++) {
    for (int j = i; j < array.length; j++) {
      if (array[j] < array[i]) {
        return false;
      }
    }
  }
  return true;
}
Now things are getting bad.

If we need to both loop through an array and compare every element with every other element we end up with an O(n^2) algorithm.

You can identify it by the nested loops.
O(n^2)
boolean isSorted(int[] array) {
  for (int i = 0; i < array.length; i++) {
    for (int j = i; j < array.length; j++) {
      if (array[j] < array[i]) {
        return false;
      }
    }
  }
  return true;
}

Best case: the unsorted element is at the beginning

Worst case: the array is sorted

Average case: O(n^2)
Big-O Notation
O(log n) and O(n log n)
The logarithmic growth rates are usually caused by features of problems that we haven’t seen yet—but will soon.

If every step of the algorithm cuts the size of the problem in half, then you end up with an O(log n) runtime.

Recursive algorithms frequently have this property.
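The classic example of halving the problem at every step is binary search over a sorted array (a standard sketch, not code from the slides):

```java
public class BinarySearch {
  // Each iteration discards half of the remaining range, so the loop
  // runs at most about log2(n) times: O(log n).
  static int search(int[] sorted, int target) {
    int low = 0;
    int high = sorted.length - 1;
    while (low <= high) {
      int mid = (low + high) / 2;
      if (sorted[mid] == target) {
        return mid;
      } else if (sorted[mid] < target) {
        low = mid + 1;  // target is in the upper half
      } else {
        high = mid - 1; // target is in the lower half
      }
    }
    return -1; // not found
  }

  public static void main(String[] args) {
    int[] values = { 1, 3, 5, 7, 9, 11 };
    System.out.println(search(values, 7)); // 3
    System.out.println(search(values, 4)); // -1
  }
}
```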
Dumb Algorithm, Clever Algorithm
A dumb algorithm can move a problem up in the runtime categorization: for example, from O(n) to O(n^2). (Our sort test is dumb. The problem is O(n).)
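For example, a smarter sort test only needs to compare each element with its neighbor, one pass, O(n) (a sketch of the idea, not necessarily the course's solution):

```java
public class SortedCheck {
  // If any element is bigger than the one after it, the array isn't
  // sorted. One pass comparing neighbors: O(n) instead of O(n^2).
  static boolean isSorted(int[] array) {
    for (int i = 0; i < array.length - 1; i++) {
      if (array[i + 1] < array[i]) {
        return false;
      }
    }
    return true;
  }

  public static void main(String[] args) {
    System.out.println(isSorted(new int[] { 1, 2, 2, 9 })); // true
    System.out.println(isSorted(new int[] { 3, 1, 2 }));    // false
  }
}
```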
A smart algorithm can move a problem down in the runtime categorization: for example, from O(n^2) to O(n log n). (Euclid’s Method GCD is smart. The problem is O(log N).)
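Euclid's method, for comparison (a standard sketch): each step replaces the pair (m, n) with (n, m % n), and the values shrink so quickly that only O(log N) steps are needed.

```java
public class Euclid {
  // Euclid's method: gcd(m, n) == gcd(n, m % n).
  // The remainders shrink at least geometrically: O(log N) steps.
  static int gcd(int m, int n) {
    while (n != 0) {
      int remainder = m % n;
      m = n;
      n = remainder;
    }
    return m;
  }

  public static void main(String[] args) {
    System.out.println(gcd(4, 6));      // 2
    System.out.println(gcd(185, 2045)); // 5
  }
}
```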
Data Structure Tradeoffs
Depending on how we structure data, different implementations of the same interface can have different performance characteristics.

We’ll start by looking at this with lists

Lists that store items using arrays have fast (O(1)) lookups but slow (O(n)) modifications

Lists that store items using linked lists have slow lookups (O(n)) but some insertions are fast (O(1))

Both also present different memory usage tradeoffs
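A rough sketch of where the tradeoff comes from, using a hand-rolled node rather than Java's actual LinkedList:

```java
public class ListTradeoffs {
  static class Node {
    int value;
    Node next;
    Node(int value, Node next) { this.value = value; this.next = next; }
  }

  public static void main(String[] args) {
    // Array lookup: one index computation, O(1).
    int[] array = { 10, 20, 30 };
    System.out.println(array[2]); // 30

    // Linked list lookup: walk node by node from the head, O(n)...
    Node head = new Node(10, new Node(20, new Node(30, null)));
    Node current = head;
    for (int i = 0; i < 2; i++) {
      current = current.next;
    }
    System.out.println(current.value); // 30

    // ...but inserting at the front is O(1): change one reference.
    head = new Node(5, head);
    System.out.println(head.value); // 5
  }
}
```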
Big-O Notation
Does P == NP?
The P versus NP problem is a major unsolved problem in computer science. It asks whether every problem whose solution can be quickly verified can also be quickly solved.
Whether P == NP is one of the deepest unsolved mysteries in mathematics and computer science.
Simply put: are some problems just harder than others, or have we just not found good ways of solving them yet?
Sudoku Turns Out to be Interesting
(An Annoying Aside on Java Primitive Object Wrappers)
In Java, certain data structures (Maps, ArrayLists, etc.) only operate on objects. (Because Object provides hashCode.)
But then how do we insert primitive types (ints, longs, etc.) into them?
Integer imAnObject = new Integer(5);
imAnObject = (Integer) 5; // You can cast primitives to object wrappers
int imNotAnObject = (int) imAnObject; // And back
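Since Java 5 the compiler usually does these conversions for you (autoboxing and unboxing), which is what lets us put "ints" into collections at all, e.g.:

```java
import java.util.ArrayList;
import java.util.List;

public class Boxing {
  public static void main(String[] args) {
    List<Integer> values = new ArrayList<>();
    values.add(5);             // autoboxing: int -> Integer
    int first = values.get(0); // auto-unboxing: Integer -> int
    System.out.println(first); // 5
  }
}
```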
Primitive Object Wrappers
Primitive Type  Object Wrapper
boolean         Boolean
byte            Byte
short           Short
char            Character
int             Integer
long            Long
float           Float
double          Double
(Exciting Stuff…)
Announcements

I now have office hours MWF from 10AM–12PM in Siebel 2227. Please stop by!

Remember to provide feedback on the course using the anonymous feedback form.

I’ve started to respond to existing feedback on the forum.