Saturday, October 3, 2009

Learning Things of Lasting Value

In the book "Seeking Wisdom", Peter Bevelin describes the ways in which we make bad decisions, why we make bad decisions, and what we can do to avoid making bad decisions. The author synthesizes lessons from a number of sources, focusing especially on Warren Buffett and Charlie Munger, the leaders of Berkshire Hathaway (a successful holding corporation of which you may have heard).

In listing the methods for improving the quality of one's thought process, Mr. Bevelin elaborates on the method of "Simplification" under the sub-heading "Focus on what you can know and that makes a difference":

"What's knowable and important? And what can be translated into useful action?"

At the ripe old age of 38, working in a technology industry that changes daily, I find this question gaining in importance. I have lost the energy, patience, and interest required to stay abreast of technological flavors-of-the-month. Ruby on Rails? Honestly, I only barely give a shit. Java 7? Flex? Silverlight? These things are only valuable until the powers behind them decide I should learn something newer and better, and that's only six months away.

So, to remain effective over the long haul, I need to focus on a few things that I believe can have lasting value. And here's my list:
  • The fundamentals of computer science, as laid out in the book "Structure and Interpretation of Computer Programs", by Abelson and Sussman
    • Why? Because everything you'll ever really need to know about computing is contained in this one book. Seriously.
  • Fundamental algorithms and data abstractions
    • Why? Because our job as computer programmers is to think, and algorithms and data structures (along with metaphor) are the raw materials. You can't be an effective civil engineer if you don't understand the properties of concrete, and you can't be an effective programmer if you don't understand the ways in which you can structure and process data - including the processes themselves (which are really just another form of data).
  • The Unix operating system - how it works, and why
    • Why? Because flavors of Unix - Linux in particular - have won. I have mixed feelings about this, in that it hardly seems the best we can do; however, Unix appears to be the once and future king. This may seem a ridiculous statement, given the continuing market share of Windows, but it's true. I have no special hatred for Microsoft, but they're locked on a trajectory that ends badly, much as GM was in, say, 1995.
  • Proficiency in at least one flavor of the Unix shell - the Bash shell, probably
    • Why? Despite the ancient clunkiness of the Unix command line as a means of interacting with a computer, there is an important way in which it was - and is - decades ahead of its time: Unix gave us the notion of assembling small, powerful tools into larger, more powerful tools. Other areas of computing have only barely begun to explore this design space (e.g., web mashups), but it will become more important as computing becomes more heterogeneous and complex. As a practical matter, you can get a lot done with some basic shell scripting skills, and if you know Bash, you can probably apply that knowledge to other, less powerful shells (e.g., DOS). (There's a small sketch of this tools-into-tools idea just after the list.)
  • Proficiency with a universal text editor (e.g., vi or emacs)
    • Why? Because humans interact with computers via text, and will still be doing so long after I'm dead. I never learned to type properly as a teenager, because I was assured that keyboards were going to be obsolete in 20 years. Well, I'm no longer waiting for that one to come true (a note to naysayers: explain how a voice interface will work in an office environment). Consequently, typing skills and proficiency with a text editor aren't going to disappear anytime soon, and for longevity's sake we should probably choose the lowest common denominator in text editing that still affords the power we need, and that's probably vi or emacs. I wish that weren't true. But it is. You can learn something else (I myself prefer IntelliJ), as long as you accept that you'll be throwing away that knowledge and learning a new tool at some point in the next few years.
  • Proficiency with C
    • Why? Because C remains the lingua franca for communicating with the underlying hardware, and if you can't speak this language yourself, you'll always be intimidated by and limited by those who can.
  • Proficiency with Lisp (or some dialect, such as Scheme or Arc)
    • Why? Because it's a medium for expressing thought, similar in linguistic power to mathematics. There are many other computer languages, of course, but no other language combines the ability to express functional abstraction, data abstraction, and metalinguistic abstraction. For the time being, Ruby or Python may be an "acceptable Lisp". But beyond that, I have no burning desire to learn Ruby or Python, because they have nothing to teach me that I can't learn from Lisp.
  • Proficiency with SQL
    • Why? Because relational databases probably aren't disappearing anytime soon, and so you'll need to write a query occasionally if you expect to get anything done. More importantly, however, SQL offers a different way of thinking about computing - in terms of sets - and thus has something to teach.
  • Agile values, principles and practices (at least until something better comes along)
    • Why? The word "agile" is dead. But many of the underlying values will remain, and rightly so. Chief among them is the idea of "failing fast" - iterating to better understand the product one is building, and to get better at building it.
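
As a tiny illustration of the "small tools assembled into larger ones" idea - and of why C still matters as the language closest to the machine - here's the sort of program Unix is built from: a filter that reads standard input, does one simple thing, and writes standard output. This is just my own sketch (the name "upper" and the build command are illustrative, not taken from any particular system):

    /* upper.c - a minimal Unix filter: read stdin, uppercase it, write stdout. */
    /* Build with something like: cc -o upper upper.c                           */
    #include <stdio.h>
    #include <ctype.h>

    int main(void)
    {
        int c;
        /* Pass each character through, uppercased, until end of input. */
        while ((c = getchar()) != EOF) {
            putchar(toupper(c));
        }
        return 0;
    }

Because it speaks only stdin and stdout, it composes with tools that already exist - e.g., "sort names.txt | ./upper | uniq" - which is the whole point: small programs, each doing one thing well, assembled into something larger.
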
In compiling this list, I realized that my age doesn't really matter - that in any endeavor, we should focus on the underlying values and principles before we get distracted by the latest fashions, because only then can we understand and intelligently evaluate those fashions.

So, that's my list, until someone gives me a better one. What's on your list?

3 comments:

Bob MacNeal said...

Nice post Kurt. That list should keep you off the soup line for the foreseeable future.

My list you ask?

First, I’m going to ingest just enough technology nuts and bolts to keep a job.

Then, I want to focus on writing and software.

In the writing realm, I’m interested in that nebulous space between people and applications. I’d like to borrow concepts from evolutionary biology, biomimicry, behavioral psychology, and human-experience interfacing to understand how teams can best develop UX-optimized software and, of course, how users use the software.

In the software realm, I want to focus on, and perhaps develop, a model for human-experience prototyping. My guess is that future software products will be made by teams hyper-focused on UX. I guess I'd like to conjure up a software framework capable of processing real-time feedback from users and even (pie in the sky) creating apps smart enough to adapt based on that feedback stream.

Kurt Christensen said...

Thanks for the love! I always assume I'm only writing for myself.

I like your list as well. I agree that a focus on improving the user experience with computers will always leave one with plenty to do, even assuming the constraint of a text-based experience, which I don't see disappearing in my lifetime.

Your last point is intriguing... I wonder how far you could go with it. As a trivial example, I would think you could gather per-user metrics over time to create personalized user interfaces. It's a tough problem, though - Microsoft tried that with the menus in Office, and most people hated it.

Kevin said...

Very nice post. I agree wholeheartedly. Which is why I'm learning emacs and Clojure. :)