Monday, December 15, 2008

Recapturing Joy

I've been on a journey of late, trying to find what originally drew me to computer programming, why I stopped enjoying it, and what I might do to recapture my lost joy. I finally figured some things out - things which may also be of interest to you, if you happen to be a computer programmer.

I didn't realize it at the time, but what originally drew me to programming was the potential to create beauty - not with music, or paint, or gears, or circuits, but with pure thought. Abstractions born of abstractions, creating virtual edifices that somehow, magically, have impact upon the everyday world.

That's how it should have been. But it wasn't. Why?

The main problem seems to be that without the constraints imposed by the physical world, human beings are unfettered in creating pure, absolute shit. You see it all around you, every day. Take cell phones (the iPhone (mostly) notwithstanding). Sure, the hardware and the form factor of most cell phones leave a little to be desired, but the worst of it by far is the software.

In the physical world, you can only make something so horrible before it is no longer a something, but simply a pile of parts, or trash. Buildings, cars, CD players... sure, they might be ugly, and they might not be as fluidly functional as they could be, but - at a minimum - they either fulfill their function or they do not. Not so with software. I once owned a cell phone which was controlled by such horrible software that, were it a building, it would have collapsed in a heap of rubble. Just to give you a taste, about once a day I had to remove the battery to reboot the phone. Reboot the phone? The thing was a fucking state machine with at most five states! It's as if the builders were trying to make the thing bad.

But of course they weren't trying to make a bad phone. They were simply trying to make a phone, with the tools they had at their disposal. And the tools they had at their disposal were probably pure shit. But why? Why is so much software pure shit? I don't mean in the "90%-of-everything-is-crap" sort of way; software seems to be much, much worse than that. My dad doesn't get angry with his car; he doesn't even get angry with his cell phone. But he gets angry with his computer, all the time. And that's just using Microsoft Word.

I believe it's because we computer programmers too easily confuse complexity with elegance. And so the complexity grows, because... well, because it's just so much fun. And because there is no force of gravity pulling it all down to earth - we can simply grow and grow and grow the complexity, forever. Arbitrary complexity. The physical universe puts an upper bound on the amount of arbitrary complexity that will be tolerated in a building or a car. But in software there exists no negative feedback loop for complexity.

People who don't program computers are always shocked to discover that very, very little programming time is spent thinking about the real, actual problems to be solved. Rather, most of our time is spent learning about the arbitrary complexity created by other human beings. If I'm building you a web site to keep track of how many miles you jogged last week, I assure you that I'm spending precious little time thinking about jogging or tracking miles or usability or whether or not you'd like to restrict access to your logs or any other issues that are of real, actual interest to you, the user. What I'm spending most of my time doing is figuring out why my Javascript function works in Firefox but not in IE. Or tracking down the memory leak in my app server. Or any of one billion other stupid little problems that have nothing to do with what you want, and - more importantly - from which I'll learn absolutely nothing (except some small fragment of arcane product knowledge, guaranteed to be forgotten or obsolete by next week).

The last point is important - with the tools at our disposal, experience doesn't make us computer programmers smarter; experience actually makes us dumber, because we're not learning anything in any meaningful sense of the word "learn", unless you consider the memorization of product manuals to be "learning".

In our defense, as a profession, we do seem to be gravitating towards development ecosystems that are more and more dynamic and expressive and powerful. But they're still doomed, because they can't meaningfully evolve. Witness what's happening in the Ruby community. Ruby and its community are essentially Java++ - the language is better, the development ecosystem is better, it's run by a benevolent dictator instead of a corporation... and yet it's still doomed to go the way of Java. I'll bet my kidneys on this one - in 2015, we'll all be talking about Ruby in exactly the same way we talk about Java in 2008. I know this to be true because it's the same players on the same trajectory. Ruby now is Java in 1999 - we're all excited about building tools and frameworks and servers, and it all looks and sounds and feels great. It always does when you first start building those first few layers of arbitrary complexity.

But isn't that the way of things? This is all unavoidable, isn't it? Perhaps. But I believe Lisp is the exception. And interestingly, it has nothing to do with the community (in fact, the Common Lisp community is an active inhibitor of the growth and success of Common Lisp). It's due to a geeky little aspect of Lisp, which is that there is no Lisp. It's a language-less language - Lisp is essentially without syntax, expressing itself as its own abstract syntax tree. And when you provide the magic spells "eval" and "apply", and thus enable a macro facility that lets you manipulate the language using the language itself, there simply isn't anything you can't do.

But didn't I say that this unfettered freedom was the problem? Well, yes and no. Empirically, the unfettered freedom provided by other languages leads to abstractions being expressed in terms of frameworks and libraries - edifices of arbitrary complexity. But Lisp enables one to express abstractions in terms of language, in a way that no other language really does, or even can - as Paul Graham has pointed out, any language which provided the ability to express itself in terms of its own abstract syntax tree would essentially be a dialect of Lisp. In a world where we can express ideas with language, we're no longer talking about "computer programming"; we're talking about writing, pure and simple - a difficult activity to be sure, but one with which humanity has had more success.

And so Lisp has helped me to recapture some of the joy of computer programming, by enabling me to once again be excited about working with ideas and abstractions, which in turn is enabled by the ability to create language - and with a language that almost isn't even there...

Wednesday, November 5, 2008

Hope

"I am asking you to believe, not just in my ability to bring about a real change in Washington, I'm asking you to believe in yours."

- President-Elect Barack Hussein Obama


Friday, October 10, 2008

Agile Languages

The other day I was having lunch with a friend - a good programmer - and I asked him what he was working on. He said that he was learning about Ruby, and Rails, and Rake. He talked about how he thought it would be nice to use Rake at this one client to clean up their messy build. I asked him how Rake differed from Ant, and he replied "Well, I haven't actually used Rake yet, but it looks really cool." And suddenly I felt uncomfortable.

Once upon a time, Java was new and fresh and exciting. To those of us coming from a C++ background - and who had never heard of Smalltalk - Java seemed to us like C++ done better: familiar syntax, but with garbage collection and a beautiful set of standard libraries, all packaged on top of a platform-independent runtime. Java and its associated collection of frameworks have matured to the point where you can crank out a web-based database front-end and have a pretty good idea of how long it will take, and where you'll run into problems. But because we're geeks, we prefer to tinker with new things, and a corollary of this is that we don't like to tinker with old things that we know very well, even if they're working OK. Enter Ruby.

There's a lot of momentum right now behind Ruby, especially Rails, and some of this momentum is coming from within the Agile community. To a certain extent, this makes sense - agile teams strive to deliver high-quality software in an iterative fashion with just-in-time requirements. The tools we use can help us or hinder us in this effort. And of all our tools, the most important is our programming language. It is through language that we are able to express processes and abstractions in such a way that we can automate and manipulate and understand them.

Some say that the choice of language is irrelevant, that all languages are effectively the same. But anyone who says this is either unaware of the differences between languages, or - more likely - is simply being polite, avoiding an argument. Would anyone seriously argue that assembly language is as expressive as Java? But what about the difference between Java and C#? Here the answer - if there is one - becomes more complicated, and so most of us just punt. However, if we're all excited about a new language (such as Ruby), and if we're going to be responsible with our employers' money, then we need to address two issues:
  1. Are we switching to Ruby because it's really that much more productive, or are we switching to Ruby because we're geeks and we're bored and we've done it in Java and now we're just dying to try something new...?
  2. If there are valid reasons to toss Java and try something truly new, should that something new be Ruby? Are we all excited about Ruby because it's really the right way to go, or is it because we're being told to be excited about it?
When a community springs up around a new technology and reaches a critical mass, a training and consulting community springs up around it as well, because everyone smells the money. And therein lies the danger that we might make decisions for the wrong reasons: because there are now a lot of people telling us how great Ruby is who have a financial stake in whether or not we like Ruby.

I understand that Rails is a wonderful framework with which to write a web app. I also understand that there are similarly wonderful frameworks written in other languages - Django, web.py and others for Python, Seaside for Smalltalk, and so on. So why are we excited about Ruby and not about, say, Python?

I've never been paid to write an application in Ruby, or Python, or any of these other fancy dynamic languages, so I'm ill-qualified to pass judgment on any. But as an outsider, what concerns me is this: in the Java community, we didn't break a bunch of new ground. To be brutally honest, we - or, at least, I - ain't the best and the brightest. If we were, we'd be doing something more interesting than using Blub to write the nine-billionth web-based database front-end for a bank.

But here's a question: if Ruby and Rails are so great, then why is it that Google and YouTube and many of the more interesting startups are gravitating instead towards Python? If I were an IT manager, this would make me curious.

There are a lot of interesting languages with active communities at the moment. Ruby, Python, Haskell, Erlang and Common Lisp seem to be the standouts. The hardware people tell us that foreseeable future performance gains are coming through multi-core parallelism; that instruction-level parallelism has gone about as far as it can go.

This is a very, very important fact for software developers, because it means that we will no longer be getting the speed-ups for free from the compiler and the hardware - it means that we will have to program parallelism. History shows that the software community in general isn't very good at this. In fact, we suck. Erlang, however, provides a beautiful model for executing several concurrent processes and keeping them coordinated. So if we're looking towards the future, shouldn't we be more interested in Erlang, or something like it?

Most of us are corporate IT programmers, which means that every day we spend a whole lot of someone else's money. Don't we owe it to our customers to pick our technology based on something more than a fad?

Thursday, September 11, 2008

Proactive + Reactive Processes

At present I'm playing the part of a shmagile coach for a medium-sized software company that's rolling out scrum processes across their entire development organization for their next big product release. We've got 38 scrum teams, so it's a non-trivial effort. One of the fundamental challenges is this: once you go away from having an up-front functional spec, how do you enable the various groups to stay abreast of what other groups are doing, without having so much communication that the signal-to-noise ratio goes to zero?

I'm finding that a good way to attack these sorts of problems is to define a proactive process, and a reactive process. The proactive process enables teams to identify obvious dependencies and shared issues up front. It's an 80% approach - you know you're not going to find everything, but you can at least get working without being willfully ignorant.

That's good, but in an agile environment you also need a reactive process - a chance for people to identify issues that were missed during planning, and to identify these issues as soon as possible - preferably as they're happening.

To illustrate this more concretely, before we began the current release cycle, the product management organization had a release roadmapping session in which they roughly mapped out which "epics" (big huge stories) were slotted into one of five "release candidates". The architecture group and some of the development team leads then spent a little time identifying technical dependencies between those epics. Again, the goal wasn't to find every dependency or hidden technical issue, but to proactively pick the low-hanging fruit.

Meanwhile, we have two different weekly "scrum of scrum" meetings attended by a representative from each of the teams. One scrum is for product managers, while the other scrum is for the dev tech leads. These forums exist primarily to enable others to learn about - and react to - changes from other teams as quickly as possible.

How well does the proactive + reactive approach work in practice? I don't know yet - we just started our very first sprint. Stay tuned...

Monday, August 25, 2008

Meeting Sleepytime Indicator #2

Meeting Sleepytime Indicator #2: When you've been in a meeting for over an hour, and someone asks a question which would actually provoke a meaningful discussion, and the presenter shuts it down by saying "I just want to get through the rest of these slides...", it's time to take a nap.

Monday, August 18, 2008

Meeting Sleepytime Indicator #1

Over the course of the next few... years...(?), I'll be posting various Meeting Sleepytime Indicators (MSIs) as they are identified. When you're attending a meeting and an MSI occurs, it lets you know that it is now perfectly acceptable to stop paying attention and fall asleep, because all valuable conversation has ended. In fact, this may be the best way to engage in the conversation, because if you begin snoring it might prompt the other attendees to change the subject. So let's kick off the list!

Meeting Sleepytime Indicator #1: When two or more attendees begin debating the differences between "processes" and "procedures".

Friday, August 15, 2008

Professional vs. Profitable

I recently was tasked with producing some documentation for a client. Most of my writing is conversational in tone, with the occasional tongue-in-cheek comment. The client liked the content, but asked that I change the language to make it more "professional". Which got me thinking about the word "professional". What exactly does it mean to be professional?

Entry number one from dictionary.com gives: "following an occupation as a means of livelihood or for gain". Entry number three, however, better captures the shade of meaning for "professional" as it's normally used in the workplace: "appropriate to a profession". The idea is that within a given profession there are certain things that one does to conform to the standards of that profession. But why? What are those standards, and what are they for?

Returning to definition number one, most of us practice a profession within the context of a business, and any business exists for one reason: to make money. Period. No matter what reasons a business gives by way of vague mission statements, a business exists to make a profit. Unfortunately, making a profit is hard work. More unfortunately, within the context of a large corporation - and particularly within the context of information-based corporations which employ a large number of knowledge workers - it becomes increasingly difficult to tie any one person's activities to the profitability of the company.

Enter "professionalism". Since we can't tell whether or not any one person's actions are profitable, we invent a set of customs collectively called "professionalism". Like any social ritual, professionalism involves wearing the correct costumes, saying the correct things, and following the correct protocols. These things are measurable - even if only qualitatively. Measuring professionalism thus becomes a proxy for measuring profitability.

The problem is that as an organization grows, so grows the disconnect between professionalism and profitability. So much so, in fact, that what most people call "professional" behavior tends to actually be mocked by the rank and file workers of an organization, who - on the whole - provide the lion's share of direct value to the company. It makes a certain amount of sense; the rituals of professionalism require effort, and so being "professional" necessarily leaves less time for being profitable.

What's especially terrible about this is that it's a positive feedback loop - the more authority one has within a hierarchy, the more disconnected one becomes from the activities associated with profitability, and so the more one must rely on judgments based on professionalism. Thus, senior management becomes even more disconnected as they become increasingly surrounded by like-minded experts in professionalism but not profitability.

It's not obvious what - if anything - can be done about this, except perhaps to grow an organizational culture that explicitly disdains the traditional notion of "professionalism" (or at least certain aspects of it). The next time you hear someone use the term "professional" or "professionalism", see if you can figure out why they chose to use that particular word.

Thursday, August 14, 2008

More Manager Metrics

Here's another manager metric: does your manager often justify decisions or processes or technology choices as "industry standard"? Your company is in an industry, but the industry is not your company. Who cares what everyone else is doing? If all the other kids are doing something, that makes it worth investigating, but it certainly isn't a rationale for doing it - particularly within the software industry. The software industry sucks so badly that "industry standard" can often be used fairly reliably as an indicator of what not to do.

Wednesday, August 13, 2008

Attention to Detail

On and off for the last few months, I've been working on a plugin for IntelliJ (my favorite IDE and text editor) to support development in the Arc programming language, which is Paul Graham's (relatively) new dialect of Lisp. Writing code in Arc is sort of fun, because although it's young and immature, it's very small and clean. However, it's also fun to develop plugins for IntelliJ, because their plugin API - including their API for developing custom language plugins - is just fabulous. Consequently, I've spent very little time writing Arc code, and quite a lot of time working on my Arc plugin. To a hammer maker, everything looks like a hammer, I guess.

Anyway, everything was fine until the switch from IntelliJ version 7.0 to version 8.0, in which they've introduced a fair number of changes to the language plugin API, in order to better support languages other than Java. This is a great thing for IntelliJ, as it is now poised to become the premier IDE for Ruby, and Python, and a host of other languages in addition to Java. It's not such a great thing for me, however, in that I've had to make a variety of changes to the Arc plugin in order to get it creaking into action.

After a few late nights, I finally got most of the Arc plugin working in version 8, with one maddening exception: syntax highlighting. Now, syntax highlighting is one of the easiest things to make happen when writing an IntelliJ language plugin. You practically get it for free, because once you've written the lexer (which is your very first step, and which they also make very easy via JFlex), all you need to do is create a class which extends SyntaxHighlighterBase, specify the default colors and fonts you want for various tokens, and snap - you got yourself some syntax highlighting.
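For the curious, here's roughly what that class looks like - a minimal sketch, assuming a JFlex-generated ArcLexer and an ArcTokenTypes class; those names are mine for illustration (not necessarily what's in the real plugin), and I'm just borrowing the platform's default attribute keys rather than defining my own:

import com.intellij.lexer.Lexer;
import com.intellij.openapi.editor.SyntaxHighlighterColors;
import com.intellij.openapi.editor.colors.TextAttributesKey;
import com.intellij.openapi.fileTypes.SyntaxHighlighterBase;
import com.intellij.psi.tree.IElementType;
import org.jetbrains.annotations.NotNull;

public class ArcSyntaxHighlighter extends SyntaxHighlighterBase {
    @NotNull
    public Lexer getHighlightingLexer() {
        // The JFlex-generated lexer - the very first thing you write for the plugin
        // (ArcLexer is an illustrative name)
        return new ArcLexer();
    }

    @NotNull
    public TextAttributesKey[] getTokenHighlights(IElementType tokenType) {
        // Map each token type to the colors and fonts it should be drawn with;
        // ArcTokenTypes is an illustrative stand-in for the plugin's token types
        if (tokenType == ArcTokenTypes.LINE_COMMENT) {
            return pack(SyntaxHighlighterColors.LINE_COMMENT);
        }
        if (tokenType == ArcTokenTypes.STRING_LITERAL) {
            return pack(SyntaxHighlighterColors.STRING);
        }
        return new TextAttributesKey[0]; // everything else stays plain text
    }
}

In version 7, you then handed this out from a getter on your Language class - which, as we're about to see, is exactly the part that moved in version 8.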

In version 8, (presumably) to better follow some sort of internal dependency injection blah blah blah, the JetBrains folks have changed the way you get a lot of these components created - instead of defining getters in your "MyLanguage extends Language" class, you add "extension" entries to your "plugin.xml" plugin configuration file. This is fine, except that the JetBrains folks haven't gotten around to updating the plugin documentation yet. At all.

But you're still not doomed, because they have been updating their own language plugins for version 8, and for their Javascript plugin, they've included all the source code! So all I should need to do is simply copy the right entry from the Javascript plugin.xml file, and I'm good to go, right?

<syntaxHighlighter language="Arc" implementation="com.bitbakery.plugin.arc.ArcSyntaxHighlighterProvider"/>


Looks right. Fire up the plugin, and I get... no syntax highlighting. So, I beg for help on the "Open API and Plugin Development" user forum over at intellij.net, and Maxim Shafirov (IntelliJ team lead!) is nice enough to respond that the correct XML attribute is "key" and not "language". So I try that, and still no luck. So I try to create the extension programmatically within the constructor of my ArcLanguage class. Still no luck. Finally, after a few hours of frustration, spread across two evenings, I decide to really look at the plugin.xml for the Javascript plugin:

<syntaxHighlighter key="JavaScript" factoryClass="com.intellij.lang.javascript.highlighting.JSSyntaxHighlighterProvider"/>


Oops. Turns out Maxim was very nicely telling me to quit blaming the plugin API changes, and pay attention to detail, because I wasn't.


When I was a freshman in college taking my first physics class, I would often find myself coming up with homework answers that didn't match those in the back of the textbook. I would bring these issues up to my professor during her office hours, and she would very patiently work through the problem with me and show me what I was doing wrong. And not once did I ever actually discover a wrong answer in the back of that textbook, although I thought I did, many, many times.

Occam's Razor is a principle which asserts that - all other things being equal - the simplest explanation is usually the correct explanation. And when you're programming computers, the simplest explanation is that it's not the other guy, it's you. So pay attention, because there are a lot more little things - and a lot more stupidity - than you might think.

It's a valuable lesson. I just wish I didn't have to keep re-learning it.

Meeting Metrics

Random thoughts while attending a stupider-than-usual meeting:
  • Here's a barometer for managers - do they use harsh language for their own mistakes, and gentle language for others' mistakes, or vice-versa?
  • ...and here's a meeting metric - every time someone says or does something that makes you want to crush your own soul, put a hash mark down on your meeting notes. You can then measure the pain of a meeting quantitatively in units of soul crushings per hour (SC/hr). Over time, you can then look for correlations between SC/hr and meeting attendees, topics or facilitators.

Tuesday, August 12, 2008

Hello, world!

Jason Bock keeps bothering me to start making my stupid ideas more public, presumably as part of an elaborate scheme to get me to say enough stupid things in public that I eventually find it impossible to find consulting work. Yes, if there's one thing Jason is about, it's finding a way to destroy me and my family. I'm not sure why.

But since I find Jason strangely persuasive, here it is - my blog, relentlessly driving me to a world of humiliation and starvation, seemingly avoidable, and yet completely unavoidable. It's like watching a Greek tragedy.

To kick this off, I'll answer the same set of questions that Jason himself answered...

How old were you when you started programming?

I took a computer programming summer school class in 1981, at the age of 10. I really, really hated it. I dropped out, and never looked back. Then in college I decided I wanted to be a physicist. I sucked at that, but I did enjoy wiring up cards to control experiments. So then I became enamored with electrical engineering. I sucked at that as well, but I did enjoy writing the device drivers for the cards we were making. So then I tried computer science, which I really enjoyed. I sucked at that too, but not as much. And people were (and still are) willing to pay money for sucky programmers, so I rolled with it.

What was your first programming language?

Technically, Apple Basic, but really I'd have to say Fortran 77, in 1990.

What was the first real program you wrote?

As Jason said, "depends on what you mean by 'real'". I guess I'd say some educational video games I did at JVC Digital Arts Studio in 1997. Everything before that was almost too simple.

If you knew then what you know now would you have started programming?

Yes, but instead of doing the C --> C++ --> Java/.NET route, I would've become a Unix-y/Lisp-y snob, and only taken certain kinds of gigs. I'd be smarter and happier at work had I done this.

If there is one thing you learned along the way that you would tell new developers, what would it be?

I'd echo Jason's sentiment: "learn core concepts". I've learned and thrown away about three technology stacks in my career so far, but if I knew a lot of algorithms and data structures and the Bash shell and Sed and Lisp and maybe C really, really well, I'd have something to hang my hat on. Everything else takes too much work and doesn't last long enough to add lasting value to your brain.

What’s the most fun you’ve ever had … programming?

The project I was on with Jason Bock was actually a ton of fun for a few months. I also had an insanely good experience at JVC Digital Arts Studio, and at a place called Gearworks. It's cool to get gigs where everyone is way smarter than you.