Algorithm Runtime
Algorithms
Algorithm: a process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer.
As computer scientists, we implement algorithms by having computers:

Perform simple calculations

Store the results

Make simple decisions

Do things over and over again as fast as possible
Data Structures
Data structure: a collection of data values, the relationships among them, and the functions or operations that can be applied to the data.
As Java programmers we implement more complicated data structures using a mix of:

Primitive types and objects to store and organize data values

Existing data structures like arrays

References to reflect relationships among objects
Algorithms and Data Structures
Algorithms and data structures are highly complementary:

We will implement algorithms that utilize specific features of data structures

We will implement data structures to support specific algorithms

We will use our existing imperative and object-oriented ideas along the way

And we’ll introduce a few more important ideas as we go
Example: Greatest Common Divisor
In mathematics, the greatest common divisor (GCD) of two or more integers, which are not all zero, is the largest positive integer that divides each of the integers.
But If We’re In A Hurry…
What’s a simpler approach?
Brute Force Solution
Brute force solution: a very general problem-solving technique that consists of systematically enumerating all possible candidates for the solution and checking whether each candidate satisfies the problem’s statement.

Computers today are very, very fast

So try the simple thing first

If it’s too slow, try something a bit more sophisticated
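As a sketch of what the simple thing might look like for GCD (the class and method names here are made up, not the lecture's actual Example code): count down from the smaller input and return the first value that divides both.

```java
public class BruteForceGcd {
  // Try every candidate from min(m, n) down to 2 and return the
  // first one that divides both values evenly. This takes time
  // proportional to min(m, n) in the worst case.
  static int gcd(int m, int n) {
    for (int candidate = Math.min(m, n); candidate > 1; candidate--) {
      if (m % candidate == 0 && n % candidate == 0) {
        return candidate;
      }
    }
    return 1; // 1 divides everything
  }

  public static void main(String[] args) {
    System.out.println(gcd(4, 6));      // 2
    System.out.println(gcd(185, 2045)); // 5
  }
}
```

Simple, obviously correct, and fast enough for small inputs: exactly the kind of thing to try first.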
You Don’t Need the Fastest Algorithm to Change the World
That’s a good thing!
But Speed Eventually Matters
Even if you don’t at the beginning, you will eventually start to care about how fast your code runs, for any number of the following reasons:

You start to have larger problems to solve.

You’re embarrassed that your algorithm makes your incredibly fast computer seem slow

You have to start paying for machines

Your customer tells you that your program is too slow

You’re in a job interview
So How Long Will It Take?
How long will our brute force GCD algorithm take?

To compute the GCD of 4 and 6

To compute the GCD of 185 and 2045

To compute the GCD of M and N
Algorithm Analysis
Algorithm analysis: the determination of the computational complexity of algorithms, that is the amount of time, storage and/or other resources necessary to execute them.
At The Limit
We usually want to analyze an algorithm in the general case, rather than for a specific set of inputs.

How does the algorithm perform on arbitrarily difficult or large inputs?

What are the best, average, and worst-case running times?

How is the algorithm’s performance related to its inputs?
Big-O Notation
Big-O notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity.
Put another way: we want to estimate what happens as the problem gets really, really hard.
(At this point non-critical components of the algorithm’s performance also cease to matter.)
Big-O Notation
O(1)
int[] myArray = new int[1024];
int getArrayValue = myArray[10]; // This is constant time
O(1) is sometimes called constant time.
Life is good and livin' is easy. But we’re usually not this lucky.
O(n)
int[] myArray = new int[1024];
int sum = 0;
// A single loop through an array is usually O(n)
for (int arrayValue : myArray) {
  sum += arrayValue;
}
O(n) is still not bad.
Frequently we have to see each value in an array or other data structure at least once, so sometimes O(n) is the best we can do.
Big-O Notation
O(n)
int[] myArray = new int[1024];
int lookingFor = 5;
for (int arrayValue : myArray) {
  if (arrayValue == lookingFor) {
    break;
  }
}
What about the example above?

Best case: it’s the first element

Worst case: it’s the last element

Average case: O(n / 2), which we usually simplify to just O(n)
O(n^2)
boolean isSorted(int[] array) {
  for (int i = 0; i < array.length; i++) {
    for (int j = i; j < array.length; j++) {
      if (array[j] < array[i]) {
        return false;
      }
    }
  }
  return true;
}
Now things are getting bad.

If we need to both loop through an array and compare every element with every other element we end up with an O(n^2) algorithm.

You can identify it by the nested loops.
O(n^2)
boolean isSorted(int[] array) {
  for (int i = 0; i < array.length; i++) {
    for (int j = i; j < array.length; j++) {
      if (array[j] < array[i]) {
        return false;
      }
    }
  }
  return true;
}

Best case: an out-of-order element is right at the beginning

Worst case: the array is sorted

Average case: O(n^2)
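One way to see why the worst case is O(n^2): when the array is sorted, the inner loop runs for every remaining element on each pass, so the total number of comparisons is

```latex
\sum_{i=0}^{n-1} (n - i) = n + (n - 1) + \dots + 1 = \frac{n(n+1)}{2}
```

which grows like n^2 / 2, and since constant factors drop out of Big-O, that is O(n^2).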
Big-O Notation
O(log n) and O(n log n)
The logarithmic growth rates are usually caused by features of problems that we haven’t seen yet—but will soon.

If every step of the algorithm cuts the size of the problem in half, then you end up with an O(log n) runtime.

Recursive algorithms frequently have this property.
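A classic preview of the halving idea is binary search over a sorted array: each comparison discards half of the remaining range, so at most about log2(n) comparisons are needed. A sketch (not code from this lecture):

```java
public class BinarySearchExample {
  // Each comparison discards half of the remaining range,
  // so the loop runs at most about log2(n) times: O(log n).
  static int binarySearch(int[] sorted, int target) {
    int low = 0;
    int high = sorted.length - 1;
    while (low <= high) {
      int mid = low + (high - low) / 2;
      if (sorted[mid] == target) {
        return mid;
      } else if (sorted[mid] < target) {
        low = mid + 1; // target must be in the upper half
      } else {
        high = mid - 1; // target must be in the lower half
      }
    }
    return -1; // not found
  }

  public static void main(String[] args) {
    int[] values = {1, 3, 5, 7, 9, 11};
    System.out.println(binarySearch(values, 7)); // 3
    System.out.println(binarySearch(values, 4)); // -1
  }
}
```

Note that this only works because the array is already sorted, which is what lets each comparison eliminate half the candidates.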
Dumb Algorithm, Clever Algorithm
A dumb algorithm can move a problem up in the runtime categorization: for example, from O(n) to O(n^2). (Our isSorted test is dumb. The problem is O(n).)
A smart algorithm can move a problem down in the runtime categorization: for example, from O(n^2) to O(n log n). (Euclid’s algorithm for the GCD is smart. The problem is O(log N).)
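For reference, a sketch of Euclid's algorithm (not necessarily the lecture's Example code): repeatedly replace the pair (m, n) with (n, m mod n) until the remainder is zero. The remainders shrink quickly, which is where the O(log N) behavior comes from.

```java
public class EuclidGcd {
  // Repeatedly replace (m, n) with (n, m % n); when the
  // remainder reaches zero, m holds the GCD.
  static int gcd(int m, int n) {
    while (n != 0) {
      int remainder = m % n;
      m = n;
      n = remainder;
    }
    return m;
  }

  public static void main(String[] args) {
    System.out.println(gcd(4, 6));      // 2
    System.out.println(gcd(185, 2045)); // 5
  }
}
```

Compare this with counting down candidates one at a time: the brute-force approach does work proportional to the inputs, while this one converges in a handful of steps even for large values.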
Big-O Notation
Does P == NP?
The P versus NP problem is a major unsolved problem in computer science. It asks whether every problem whose solution can be quickly verified can also be quickly solved.
Whether P == NP is one of the deepest unsolved mysteries in mathematics and computer science.
Simply put: are some problems just harder than others, or have we just not found good ways of solving them yet?
Sudoku Turns Out to be Interesting
(An Annoying Aside on Java Primitive Object Wrappers)
In Java, certain data structures (Maps, ArrayLists, etc.) only operate on objects. (Because Object provides hashCode.)
But then how do we insert primitive types (ints, longs, etc.) into them?
Integer imAnObject = Integer.valueOf(5); // new Integer(5) also works, but is deprecated
imAnObject = (Integer) 5; // You can cast primitives to their object wrapper
int imNotAnObject = (int) imAnObject; // And back
Primitive Object Wrappers
Primitive Type  Object Wrapper
byte            Byte
short           Short
int             Integer
long            Long
float           Float
double          Double
char            Character
boolean         Boolean
(Exciting Stuff…)
Lists
What you will be building on the next few homework problems is a general data structure called a list.
Lists are an ordered data structure that allows us to:

Get and set values at any index (like an array)

Add or remove values at any index (this is new)

Lists are one of the two data structures you meet in heaven—maps are the other and we’ll get to them in a few weeks
Data Structure Tradeoffs
Depending on how we structure data, different implementations of the same interface can have different performance characteristics.

We’ll start by looking at this with lists

Lists that store items using arrays have fast (O(1)) lookups but slow (O(n)) modifications

Lists that store items using linked lists have slow lookups (O(n)) but some insertions are fast (O(1))

Both also present different memory usage tradeoffs
Our List Interface
interface SimpleList {
  /** Get the object at this index. */
  Object get(int index);

  /** Set the object at this index to the passed element. */
  void set(int index, Object element);

  /** Add the object at the specified location in the list. */
  void add(int index, Object element);

  /** Remove and return the object at the specified location in the list. */
  Object remove(int index);

  /** Return the number of elements in the list. */
  int size();
}
(The official Java one contains a bunch of convenience methods that we don’t want.)
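As a rough sketch of where the array-backed tradeoffs come from (this is illustrative, not a homework solution, and the class name is made up): get and set index directly into the array, which is O(1), while add and remove must copy every element into a new array, which is O(n).

```java
public class SimpleArrayList {
  private Object[] values = new Object[0];

  /** O(1): just return the length we already track. */
  public int size() {
    return values.length;
  }

  /** O(1): arrays support direct indexing. */
  public Object get(int index) {
    return values[index];
  }

  /** O(1): direct indexed store. */
  public void set(int index, Object element) {
    values[index] = element;
  }

  /** O(n): every existing element must be copied to make room. */
  public void add(int index, Object element) {
    Object[] newValues = new Object[values.length + 1];
    for (int i = 0; i < index; i++) {
      newValues[i] = values[i];
    }
    newValues[index] = element;
    for (int i = index; i < values.length; i++) {
      newValues[i + 1] = values[i];
    }
    values = newValues;
  }

  /** O(n): every remaining element must be copied to close the gap. */
  public Object remove(int index) {
    Object removed = values[index];
    Object[] newValues = new Object[values.length - 1];
    for (int i = 0; i < index; i++) {
      newValues[i] = values[i];
    }
    for (int i = index + 1; i < values.length; i++) {
      newValues[i - 1] = values[i];
    }
    values = newValues;
    return removed;
  }

  public static void main(String[] args) {
    SimpleArrayList list = new SimpleArrayList();
    list.add(0, 1);
    list.add(1, 3);
    list.add(1, 2); // list is now 1, 2, 3
    System.out.println(list.get(1));    // 2
    System.out.println(list.remove(0)); // 1
    System.out.println(list.size());    // 2
  }
}
```

A real implementation would keep extra capacity around rather than reallocating on every change, but the copy-everything version makes the O(n) cost of modification easy to see.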
Announcements

I have virtual office hours today from 4–5PM. Please stop by and say hi!

We’re going to be testing a new video help system over the next few days. Please bear with us as we work out the kinks and problems—I’m sure there will be at least a few.