Learning is enjoyable sometimes…

A few weeks ago, I signed up for an Algorithms and Data Structures course on Coursera. Part of my reason is that I like having things explained to me, but mostly it’s because the course has deadlines, and I find it easier to stay motivated when the threat of a “0” is looming.  The coursework can be completed in any language, but the starter files and tests are already set up for C, C++, Java, and Python.  We’re welcome to use other languages; we’d just have to write our own tests.  Fortunately for me, my first language was Python, so I’m dusting off those skills to complete the coursework.

I’m in week 4 of 5 in course 1 of a six-course series.  While I don’t know if I’ll complete all six, I’ve already gained some pretty nice insight into how to tackle problems.  Greedy strategies, divide-and-conquer, and other approaches are explored, and the course requires us to think about time limits when we submit our assignments.  Correctness isn’t only “can you write an algorithm that completes the task?” but also “can you keep the algorithm from going over the time limit?”  No brute-force/naive answers allowed!

Getting back into Python has been entertaining. Initially I had to write out my algorithm in Swift using Xcode’s playground and then translate it into Python.  After a couple of weeks I’ve been able to transition to just using Python.  I still write it in Swift afterwards to compare the syntax differences between the two, but it’s nice to be able to pick something back up after not using it for almost two years.

So what problems have I solved? We started with Fibonacci, which I’ve done before, but this was the first time I really spent time exploring how it is computed recursively vs. iteratively. Even if they’re usually not the optimal choice for memory, recursive functions are really beautiful in the simplicity of their code; a recursive function reminds me of a more elegant while-loop. Initially I began to wonder why even bother learning recursion if it eats up memory so quickly on large inputs, and then we got to the Greatest Common Divisor (GCD) problem.
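Just to see the difference side by side, here’s a quick Python sketch of both approaches (my own refresher, not the course’s starter code):

# Recursive: elegant, but it calls itself twice per step, so the work
# (and the call stack) balloons quickly for large n
def fib_recursive(n):
    if n <= 1:
        return n
    return fib_recursive(n - 1) + fib_recursive(n - 2)

# Iterative: less pretty, but one pass and constant memory
def fib_iterative(n):
    previous, current = 0, 1
    for _ in range(n):
        previous, current = current, previous + current
    return previous

print(fib_recursive(10), fib_iterative(10))  # both print 55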

Note: Recursion in programming is when a function continues to call itself until some terminating condition is met.  I write the function and call it inside its own body. For example:

func someFunction(_ parameterA: Int, _ parameterB: Int) -> Int {
    if parameterA == 0 { return parameterB }         // terminating condition
    // do some stuff
    return someFunction(parameterA - 1, parameterB)  // the function calls itself
}


GCD means comparing two (or more) numbers and finding the largest divisor that they share. For example:

gcd(6,8) -> 2

2 is the largest number that six and eight can both be divided by. 3 works for 6 but not for 8; 4 works for 8 but not for six; you get the idea.

But what if the numbers are really large? Say 2,877,988 and 956,432?

Can you imagine writing a loop that starts at 1 and checks every candidate all the way up to those numbers to see whether both leave a remainder of 0? How long would that take?!
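Something like this naive sketch (my own illustration, not the course’s code) is what I mean:

# Brute force: test every candidate divisor up to the smaller number
def gcd_naive(a, b):
    best = 1
    for d in range(1, min(a, b) + 1):
        if a % d == 0 and b % d == 0:
            best = d
    return best

print(gcd_naive(6, 8))  # 2
# gcd_naive(2877988, 956432) works too, but only after nearly a million checks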

Well, the course let us in on a secret about large numbers: the remainder you get when dividing the first number by the second has the same GCD with the second number as the original pair does.

[Screenshot of the lemma: gcd(a, b) = gcd(a mod b, b)]
I didn’t really understand this until I saw it in action (see below). Also, how did people figure this stuff out?!

So even though the numbers get smaller, the GCD stays the same.  Basically, to find the GCD of the original numbers, keep dividing the first number by the second and replacing the pair until the remainder is 0.  Once that happens, the first number left is your GCD.

Here is a better visual of the “trick”:

[Screenshot: the division steps worked out for two large numbers]
This made much more sense to me than the Lemma above.

I have two numbers, A: 3,918,848 and B: 1,653,264. I divide 3,918,848 by 1,653,264 and the remainder is 612,320, which we’ll call A-prime.  I replace A with B, put A-prime in B’s old place, and perform the same operation again.  I keep doing so until A-prime is 0.
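To make that concrete, here’s a little Python loop (mine, not from the course) that prints each replacement step for those two numbers:

a, b = 3918848, 1653264
while b != 0:
    print(a, b, a % b)    # the current pair and the new A-prime
    a, b = b, a % b       # B moves into A's spot, A-prime into B's
print("GCD:", a)          # 61232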

In this case, recursion actually doesn’t take terribly long, since each step shrinks the numbers pretty quickly, unlike building numbers up through a Fibonacci sequence (1, 1, 2, 3, 5, 8, 13, 21…).  So an efficient way to find the GCD of two numbers actually is to use recursion, though probably not the only efficient way…

Here is the Python function:

[Screenshot: the recursive Python GCD function]
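Roughly, it boils down to this (a sketch of the idea rather than the exact code in the screenshot):

def gcd(a, b):
    if b == 0:
        return a           # terminating condition: the remainder hit zero
    return gcd(b, a % b)   # swap in B and the remainder, then repeat

print(gcd(3918848, 1653264))  # 61232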

It amazes me how short it is.

Using that algorithm allowed me to complete another assignment that finds the Least Common Multiple (LCM).  I won’t go into it much, but they provided another tip on how to calculate the LCM using the GCD, and again it gives a short, elegant solution.
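The tip is presumably the standard identity lcm(a, b) = a * b / gcd(a, b), which in Python is basically a one-liner:

from math import gcd            # Python's built-in GCD, so this snippet stands alone

def lcm(a, b):
    return a * b // gcd(a, b)   # the product divided by the GCD

print(lcm(6, 8))  # 24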

I think the decision to take this course was the correct one for me.  I’m being exposed to some handy tools, I’m obligated to finish assignments since it’s costing me a monthly fee, and the assignments provide a nice warmup to my day while I drink my first cup of coffee.  (Note: if I haven’t finished the problem after 10-15 minutes, I wait until my lunch break to finish.)

This week we began looking at binary and linear search algorithms.  In the past, when I looked at these, I just tried to memorize them so I could regurgitate them if they came up in an interview. Now I have a list of questions I ask myself, and I can look at the mathematical explanations and make sense of them, which aids my understanding all the more.
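For reference, the textbook binary search looks roughly like this in Python (my own refresher sketch, not the course’s version):

def binary_search(sorted_list, target):
    low, high = 0, len(sorted_list) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_list[mid] == target:
            return mid              # found it
        elif sorted_list[mid] < target:
            low = mid + 1           # look in the right half
        else:
            high = mid - 1          # look in the left half
    return -1                       # not in the list

print(binary_search([1, 3, 5, 8, 13, 21], 8))  # 3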

None of what I’m doing may ever be needed, but I’ve finally gotten to the point where I no longer view learning algorithms as a means to an end in a job search…which is where I was a year ago.  Instead, it has become “I really want to learn this stuff. I find it interesting.” Perhaps at some point I’ll even do it without having to spend money to motivate myself!
