 Friday, January 03, 2014
Tech Predictions, 2014

Here we go again: the annual review of last year's predictions, and a set of new ones for the new year.

2013 Retrospective

Without further ado, first we examine last year's Gregorian prognostications:

  • THEN:"Big data" and "data analytics" will dominate the enterprise landscape.

    NOW: Yeah, it was a bit of a slam dunk breakaway kind of call, but it clearly counts. Vendors and consulting companies were climbing all over themselves to talk about "big data", and startups basing their existence on gathering, analyzing, displaying and (theoretically) offering insight from "big data" were all the rage in the startup community, such as local startup Predixion (CTO'ed by a buddy of mine). If you live anywhere in the Pacific Northwest, chances are there's a similar kind of startup within spitting distance of you right now. 1-0.

  • THEN:NoSQL buzz will start to diversify.

    NOW: It didn't happen quite as much as I'd expected, but the various vendors are, in fact, starting to use terms other than "NoSQL" to define themselves. In particular, we're seeing database vendors (MongoDB, Neo4J, Cassandra being my principal examples) talking about being a "document database" or a "graph database" instead of being a "NoSQL" database, though they're fairly quick to claim the NoSQL tag when it comes to differentiating against the traditional relational database. Since I said "start" to diversify, I'm going to take the win. 2-0.

  • THEN:Desktops increasingly become niche products.

    NOW: Well, this one is hard to call. Yes, desktop sales have plummeted, but it's hard to see what those remaining sales are being used for. I will point out that the Mac Pro, with its radically-different internal construction, definitely puts a new spin on the desktop, but I'm not sure that this counts. Since I took the benefit of the doubt on the last one, I'll forgo it on this one. 2-1.

  • THEN:Home servers will start to grow in interest.

    NOW: I wish I had sales numbers to go with some of this stuff, as hard evidence, but the fact that many people are using their console devices (XBox, XBoxOne, PS3, PS4, etc) as media servers means I missed the boat on this one. I think we may still see home servers continue to rise, but the clear trend has been to make the console gaming device into a server, and not purchase servers on their own to serve as media servers. 2-2.

  • THEN:Private cloud is going to start getting hot.

    NOW: Meh. I see certain cloud vendors talking about private cloud, but for the most part the emphasis is still on public cloud. 2-3. Not looking good for the home team.

  • THEN:Oracle will release Java8, and while several Java pundits will decry "it's not the Java I love!", most will actually come to like it.

    NOW: Well, let's start with the fact that Java8 actually didn't ship this year. And even that slip--what I would have guessed would be a hugely-debated and hotly-contested choice--really went by without much fanfare or complaint, except from some of the usual hard-liner sources. Which means one of two things: either (a) it's finally come to pass that most of the people developing on top of the JVM really don't care about the Java language's growth anymore, or (b) the community felt as Oracle's top engineering brass did, that getting this release "right" was far better than getting it out on the promised deadline. And while I agree with the (b) group on that, it still means that the prediction was way off. 2-4.

  • THEN:Microsoft will start courting the .NET developers again.

    NOW: Quite frankly, this one got left in the dust almost the moment that Ballmer's retirement was announced. Whatever emphasis the company as a whole might have put into courting .NET developers back into the fold was immediately shelved, at least until a successor comes in to take Ballmer's place and decide what kind of strategy the company as a whole will pursue. Granted, the individual divisions within Microsoft, most notably DevDiv, continue to try and woo the developer community, but that was always going to be the case. However, the lack of any central "push" from the company effectively meant that the perceived "push" against .NET in favor of WinRT was almost immediately left behind. The subsequent declaration from most corners that Surface had failed (and Surface was by far the most public and prominent of the WinRT-based devices) meant that most .NET developers who cared about this breathed a sigh of relief, and no longer felt those cyclical Microsoft Darwinian crosshairs (the same ones that claimed first C programmers, then C++ programmers, then COM programmers) on their backs. Still, no points. 2-5.

  • THEN:Samsung will start pushing themselves further and further into the consumer market.

    NOW: And boy, howdy, did they. Samsung not only released several new versions of their various devices into the market, but they've also really pushed their consumer electronics in other form factors, too, such as their TVs and such. If there is a rival to Apple in the consumer electronics department, it is clearly Samsung, and the various court cases and patent violation filings are obvious verification of that. 3-5.

  • THEN:Apple's next release cycle will, again, be "more of the same".

    NOW: Can you say "iPhone 5c", and "iPad Air", boys and girls? Even iOS7 is basically the same OS, with a skinnier font and--oh, wow, innovation!--nested folders. 4-5.

  • THEN:Visual Studio 2014 features will start being discussed at the end of the year.

    NOW: Microsoft tossed me a major curve ball with their announcement of quarterly releases, and the subsequent release of Visual Studio 2013, and as a result, we haven't really seen the traditional product hype cycle out of the Microsoft DevDiv that we're used to. Again, how much of that is due to internal confusion over how to project their next-gen products out into the world without a clear Ballmer successor, and how much of that was planned from the beginning isn't clear, but either way, we ain't heard a peep outta nobody about C# 6 at all in 2013, so... 4-6.

  • THEN:Scala interest wanes.

    NOW: If anything, the opposite took place--Typesafe, Scala's owner/pimp/corporate backer, made some pretty splashy headlines within the JVM community, and lots of people talked a lot about it in places where Scala wasn't being discussed before. We can argue about whether that indicates just a strong marketing effort (where before Typesafe's formation there really was none) or actual growth in acceptance, but either way, I can't claim that it "waned", so the score becomes 4-7.

  • THEN:Interest in native languages will rise.

    NOW: Again, this one is hard to judge. There's been some uptick in interest in those native languages (Rust, Go, etc), and certainly there's been some interesting buzz around some kind of Phoenix-like rise of C++, but none of it has really made waves within the mainstream that I can see. (Then again, I don't really spend a lot of time in those areas where native languages would have made a larger mark, so this could be observer's contextual bias at work here.) That said, more native-based languages are emerging, and certainly Apple's interest in and support of LLVM (which, contrary to its name, is not really a "virtual machine", per se) can be seen as such, but not enough to make me feel comfortable saying I got this one right. 4-8.

  • THEN:Hardware is the new platform.

    NOW: Surface was a bust. Chromebooks hardly registered on anybody's radar. Dell threw out an arguable Surface-killer tablet, but for most consumer-minded folks it never even crossed their minds, it seems. Hardware may be the new platform, and certainly we're seeing a lot of non-x86-based machines continuing their race into consumers' hands, but most consumers don't think twice about the hardware as much as they do the visible projection of that hardware choice, in the operating system. (Think about it this way: when you go buy a device, do you care about the CPU, or the OS--iOS, Android, Windows8--running it?) 4-9.

  • THEN:APIs for lots of things are going to come out.

    NOW: Oh, my, yes. More on this later, but for now... 5-9.

Well, with a final tally of 5 "rights" to 9 "wrongs", clearly my 2013 record was about as win-filled as the Baltimore Ravens' 2013 record. *sigh* Oh, well, can't win 'em all every year, right?

2014 Predictions

Now, though, let's do the fun part: What does Ted think 2014 has in store for us geeky types?

  • iOS, Android and Windows8 start to move into your car. Audi has already announced this. Ford announced this last year with their SDK release. Frankly, with all the emphasis on "wearable tech" and "alternative tech", this seems a natural progression, considering how much time Americans, at least, spend in their cars. What, exactly, people will want software developers to do with this capability remains entirely unclear to me (and, it seems, to everybody else, given the lack of apps for the Ford SDK so far), but auto manufacturers will put it into their 2015 models just because their competitors are starting to, and the auto industry is one place where you cannot be seen as not keeping up with the neighbors.
  • Wearable tech hypes up (with little to no actual adoption or innovation). The Samsung Smart Watch is out, one of nearly a dozen models introduced in 2013. Then there was Google Glass. And given that the tech industry is a frequent "hype it before we even barely know it's going to work" kind of place, this one seems like another fast breakaway layup kind of claim. Note that I fully expect that what we see offered will, in time, be as hip and as cool as the original Newton, meaning that these first iterations will be stumblin', fumblin', bumblin' attempts to try and see what anybody can do with these things to make them even remotely useful, and that unless you like living on the very edge of techno-geekery, there'll be absolutely zero reason for anyone to get one for at least calendar year 2014.
  • Apple's gadgets will be more of the same. Same one as last year: iPhone, iPad, iPod, MacBook, they're all going to be incremental refinements on what we see already. There will be no AppleTV, there will be no iWatch, there will be no radical new laptop-ish thing. Apple is clearly the market leader, and they are clearly in the grips of the Innovator's Dilemma, and they have no clear challenger (yet) that threatens to dethrone them, leaving them with no reason to shake up the status quo.
  • Android market consolidates further around Samsung and Motorola. The Android consumer market has slowly been collapsing around those two manufacturers, and I don't see any reason for that trend to change. Yes, other carriers will continue to offer Android on their devices, and yes, other device manufacturers will continue to put Android on their devices, and yes, Android will continue to appear on things other than tablets and phones, but as far as the consumer electronics world goes, the Android market will be classified as Samsung, Motorola, and everybody else.
  • We'll see one iOS release, two minor Android releases, and maybe two Windows8 minor releases. The players are basically set, the game plans are already in play, and nobody appears to have any kind of major game-changing feature set in the wings. 2014 will be a year of minor releases, tweaks to the existing systems and UIs, minor software improvements, and so on. I can't see the mobile market getting any kind of major shock or surprise this year.
  • Windows 8/8.1/9/whatever gains a little respect, but no market share. Windows8 as a tablet OS has been quietly gathering some converts, particularly among those who didn't find themselves stuck on the WindowsStore-only SurfaceRTs, and as such, I think the "Windows line" will begin to gather more "critics' choice" kinds of respect, but that's not going to translate into much in the way of consumer sales. Unfortunately for the Microsoftians, Windows as of yet doesn't demonstrate any kind of compelling reason to choose it over the other two market leaders (iOS and Android), and until that happens, Windows8, as a device OS, remains a distant third and always will.
  • UI/UX emphasis is going to start moving to "alternate" input streams. Microsoft's Kinect has demonstrated that gesture is a viable input technology. Google Glass demonstrated that eyeballs can drive a UI. Voice commands are making their way into console gaming/media devices (XBox, etc). This year, enterprise and business developers, looking for ways to make a splash and justify greater research budgets, are going to start experimenting with how those "alternative" kinds of input can be utilized in non-gaming scenarios. Particularly when combined with the rise of automobiles offering programmable SDKs/platforms (see above), this represents a huge, rich area for exploration.
  • Java-the-language starts to see a resurgence of "mojo". Java8 will ship this year--not even God Himself could stop that at this point. And once it does, Java-the-language will see a revitalization as developers who've been flirting with Groovy, Scala, Clojure, and other lambda-supporting languages but can't use them on the job start to bring those ideas into Java APIs. Google's already been doing this with Guava, but now many of those ideas--already percolating in some libraries--will explode into common usage. (For a concrete taste of what I mean, there's a small code sketch just after this list.)
  • Meanwhile, this will be a quiet year for C#. The big news coming out of Microsoft, "Roslyn", the "compiler-as-a-service" rewrite of the C# and Visual Basic compilers, won't be of much use to most developers on a practical level, and as a result, this will likely be a pretty quiet year for C# and VB.
  • Functional languages will remain "hipster" tools that most people can't use. Haskell remains far out of reach for most developers, and that's the most approachable of the various functional languages discussed. (Don't even get me started on Julia, Pure, Clean, or any of the others.) As much as I wish to the contrary, this is also likely to remain true for several of the "hybrid" languages, like Scala, F#, and Clojure, though I do think they will see some modest growth as some of the upper-echelon development community begins to grok them. Those who understand them will continue to do some amazing things with them, but this is not the year I would suggest starting a business with anything "functional" as part of its business model, because not only will it be difficult to find developers who can use those tools, but trying to sell developer-facing things with those tools at the core will find a pretty dry and dusty market.
  • Dynamic languages will see continued growth and success. Ruby doesn't look to be slowing down, Node/JavaScript only looks to get more hyped, and Objective-C remains the dominant language for doing iOS development, which itself doesn't look to be slowing down. More importantly, I think we're going to start to see a rise in hybrid "static/dynamic" languages, wherein developers can choose (based on the way they write their code) compiler enforcement as they wish. Between the introduction of "invokedynamic" in Java7 (and its deeper use in Java8), and "dynamic" in C# getting some serious exercise in the Oak framework, I'm becoming more and more convinced that having a language that supports both static and dynamic typing capabilities represents the best compromise between those two poles of software development languages. That said, neither Java nor C# "gets it all the way right" on this topic, and I suspect that somewhere out there, there's a language hacker who's got a few ideas that he or she will trot out and make us all go "Doh!"
  • HTML 5 "fragmentation" will start to echo in the industry. Unfortunately, HTML 5 is not the same thing to all browsers, and those who are looking to HTML 5 as a way to save them from platform differences are going to start to feel some pain. That, in turn, is going to lead to some backlash as they are forced to deal with something they thought they were going to be saved from.
  • "Mobile browsers" become just "browsers". With the explosive growth of devices (tablets and phones) and the explosive growth of the capabilities of those devices (processor(s), memory, and so on), the need for a "crippled" or "low-end-optimized" browser has effectively gone the way of the Dodo bird. As a result...
  • "Mobile web" starts a slow, steady slide into irrelevancy. ... sites optimized for "mobile" browsing experiences--which represents a non-trivial development effort in most cases--will start to drop away, mostly due to neglect. Instead...
  • "Responsive web" becomes the new black. ... we'll see web sites using CSS frameworks (among other tools) to build user interfaces that adjust themselves to the physical viewsizes and input capabilities of the target browser. Bootstrap is an obvious frontrunner here for building said kinds of user interfaces, but don't be surprised if a number of other CSS and JavaScript frameworks to achieve the same ends start to spring up.
  • Microsoft fails to name a Ballmer successor. Yeah, this one's a stretch. It's absolutely inconceivable that they wouldn't. And yet, in all honesty, I can't see the Microsoft board finding somebody that meets Bill's approval from outside of the company, and I can't imagine anyone inside of the company who isn't somehow "tainted" by the various internecine wars that have been fought since Bill's departure. It is, quite frankly, a mess, and I don't know that it'll be cleaned up before this time next year. It would be a horrible result were that to be the case, by the way, but... *shrug* I dunno. Pretty clearly, whoever it is is going to have a monumental task in front of them.
  • "Programmable Web" becomes an even bigger thing, leading companies to develop APIs that make no sense to anybody. Right now, as we spin up 2014, it's become the fashionable thing to build your website not as an HTML-based presentation layer website, but as a series of APIs. For some companies, that makes sense; for others, though, that is going to be a seductive Siren song that leads them to a huge development effort for little payoff. Note, I think almost all companies have interesting data and/or resources that can be exposed as APIs that would lead to some powerful mashups--I'm not arguing otherwise. But what I think we're about to stumble into is the cargo-culting blind obedience to the letter of the idea that so many companies undertake when a new concept hits the industry hard, as "Web APIs" are doing now.
  • Five new single-page JavaScript MVC application frameworks will ship and gather interest. For those of you who know me from the Java world, remember the 2000s and the huge glut of open-source Web frameworks that led us all into analysis paralysis for a half-decade or more? I see absolutely no reason why the exact same thing isn't already under way in the JavaScript Web framework world, with the added delicious twist that in the JavaScript world, we can do it on BOTH the client AND the server. Let the forking begin.
  • Apple's MacPro machine inspires dozens of knock-off clones. When the MacBook came out, silver-metal cases with chiclet keyboards suddenly appeared all over the PC clone market. When the MacBook Air came out, suddenly thin was in. What on Earth makes us think that the trashcan-sized MacPro desktop/server isn't going to have exactly the same effect?
  • Desktop machine sales creep slightly higher. Work this through with me before you shoot it down out of hand: Tablet sales are continuing to skyrocket, and nothing seems to change that. But people still need to produce stuff (reports, articles, etc), and that really requires a keyboard. But if tablets are easier to consume data on the road, you're more likely to carry your tablet instead of your laptop (and most people--myself wildly excluded--don't like carrying more than one or at most two devices with them). Assuming that your mobile workload is light enough that you can "get by" using your tablet, and you don't want to carry a laptop *and* a tablet, you're more likely to leave your laptop at home or at work. Once your laptop is a glorified workstation, why pay that added premium for a laptop at all? In other words, I think people are going to start doing this particular math, and while tablets will continue to eat away at the "I need a mobile computing solution" sales, desktops are going to start to eat away at the "I need a computing solution for my desk" sales. Don't believe me? Look around the office at all the workstations powered by laptops already, and then start looking at whether those laptops are actually being used as laptops, and whether that mobility need could, in fact, be replaced by a far lighter tablet. It's a stretch, and it may not hit in 2014, but I really think that the world is going to slowly stratify into an 80/20 split of tablets and desktops.
  • Dozens of new "cloud" platforms will be introduced, and most of them will remain entirely irrelevant behind the "Big Three". Lots of the big players are going to start tossing out their version of a cloud platform if they haven't already (HP, Oracle, IBM, I'm looking at you), and smaller players are going to start offering "cloud" platforms of their own (a la Rackspace), but fundamentally, the cloud will remain a "Big Three" place: Amazon's AWS, Microsoft's Azure, and Google's Cloud Platform.
  • We will never see any kind of official announcement, much less actual working prototypes, around Amazon's "Drone Delivery" program ever again. Sure, Jeff made a splash when he announced it. Sure, it resonates with the geek crowd. Sure, it seems like a spiffy idea on paper. Do you have any idea of how much infrastructure and overhead (and potential for failure that has nothing to do with geeks deploying "anti-drone defenses") would be involved? No way. What's more, Amazon is not really in the shipping business (as the all-but-failed Amazon "deliver groceries to your front door" program highlights), but in the "We'll sell it to you and ship it through somebody else" business. It's a cool idea, but it'll never, ever, EVER, see the light of day.
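
Since the Java8 prediction above leans so heavily on lambdas, here's a minimal sketch of the kind of code I mean--the Guava-ish, collection-pipeline style I expect to start showing up in everyday Java APIs once Java8 ships. This is plain java.util.stream usage as it stands in the Java8 developer previews; the class and variable names are just mine, made up for illustration.

    import java.util.Arrays;
    import java.util.List;
    import java.util.stream.Collectors;

    public class LambdaSketch {
        public static void main(String[] args) {
            // Made-up sample data, purely for illustration.
            List<String> languages =
                Arrays.asList("Java", "Scala", "Clojure", "Groovy");

            // The pipeline style Guava users have been approximating with
            // Predicate and Function objects for years, now expressible
            // directly in the language:
            List<String> alternatives = languages.stream()
                .filter(name -> !name.equals("Java")) // lambda (compiled via invokedynamic)
                .map(String::toUpperCase)             // method reference, ditto
                .collect(Collectors.toList());

            System.out.println(alternatives); // prints [SCALA, CLOJURE, GROOVY]
        }
    }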

As always, thanks for reading, and keep this channel open--I've got some news percolating about my next new adventure that I'm planning to "splash" in mid-January. It won't be too surprising, but it's exciting (at least to me), and hopefully represents an adventure that I can still be... uh... adventuring... for years to come.



Friday, January 03, 2014 12:35:25 AM (Pacific Standard Time, UTC-08:00)
 Monday, December 09, 2013
On Endings

A while back, I mentioned that I had co-founded a startup (LiveTheLook); I'm saddened to report that just after Halloween, my co-founder and I split up, and I'm no longer affiliated with the company except as an adviser and equity shareholder. There were a lot of reasons for the split, most notably that we had some different ideas on how to execute and how to spend the limited seed money we'd managed to acquire, but overall, we just weren't communicating well.

While I'm sad to no longer be involved with LtL, I wish Francesca and the company nothing but success for the future, and in the meantime I'm exploring options and figuring out what my next great adventure will be. It's not the greatest time of the year (the "dead zone" between Thanksgiving and Christmas) to be doing it, but fortunately I've gotten a few leads that may turn out to be hits. We'll have to see. And, while we're sorting that out, I've got plans for things to work on in the meantime, including a partnership effort with my eldest son on a game he invented.

So, what I'm saying here is that if anyone's desperate for consulting, now's a great time to reach out, because I can be bought. :-)



Monday, December 09, 2013 8:59:24 PM (Pacific Standard Time, UTC-08:00)
 Thursday, August 29, 2013
Seattle (and other) GiveCamps

Too often, geeks are called upon to leverage their technical expertise (which, to most non-technical people's perspective, is an all-encompassing uni-field, meaning if you are a DBA, you can fix a printer, and if you are an IT admin, you know how to create a cool HTML game) on behalf of their friends and family, often without much in the way of gratitude. But sometimes, you just gotta get your inner charitable self on, and what's a geek to do then? Doctors have "Doctors Without Borders", and lawyers can always do work "pro bono" for groups like the Innocence Project and so on, but geeks....? Sure, you could go and join the Peace Corps, but that's hardly going to really leverage your skills, and Lord knows, there's a ton of places (charities) that could use a little IT love while you're off in a damp and dismal jungle somewhere.

(Not you, Seattle. You're just damp today. Dismal won't be for another few months, when it's raining for weeks on end.)

(As if in response, the rain comes down even harder.)

About five or so years ago, a Microsoft employee realized that geeks didn't really have an outlet for their desires to volunteer and help out in their communities through the skills they have patiently mastered. So Chris created GiveCamp, an organization dedicated to hosting "GiveCamps" all over the US, bringing volunteer developers, designers, and other IT professionals together with charities that need some IT love, whether that's in the form of a new mobile app, some touch-up on the website, a port from a Microsoft Access app to something even remotely more modern, or whatever.

Seattle GiveCamp is coming up, October 11-13, at the Microsoft Commons. No technical bias is implied by that--GiveCamp isn't an evangelism event, it's a "let's help people" event. Bring your Java, PHP, Python, and yes, maybe even your Perl, and create some good karma for groups that are doing good things. And for those of you not local to Seattle, there's lots of other GiveCamps being planned all over the country--consider volunteering at one nearby.



Thursday, August 29, 2013 12:19:45 PM (Pacific Daylight Time, UTC-07:00)
 Monday, August 26, 2013
On speakers, expenses, and stipends

In the past, I've been asked about my thoughts on conferences and the potential "death" of conferences, and the question came up again more recently in a social setting. It's been a while since I commented on it, and if anything, my thoughts have only gotten sharper and clearer.

On speaking professionally

When you go to the dentist's office, who do you want holding the drill--the "enthused, excited amateur", or the "practiced professional"?

The use of the term "professional" here, by the way, is not its technical sense, meaning "one who gets paid to perform a particular task", but more a follow-on to that, meaning, "one who takes their commitment very seriously, and holds themselves to the same morals and ethics as one who would be acting in a professional capacity, particularly with an eye towards actually being paid to perform said task at some point". There is an implicit separation between someone who plays football because they love it, for example, going out on Sunday afternoons and body-slamming other like-minded individuals just because of the adrenaline rush and the male bonding, and those who go out on Sunday afternoons and command a rather decently-sized salary ($300k at a minimum, I think?) to do so. Being a professional means that not only is there a paycheck associated with the activity, but a number of responsibilities--this means not engaging in stupid activity that prevents you from being able to perform your paid activity. In the aforementioned professional athlete's case, this means not going out and doing backflips on a dance floor (*ahem*, Gronkowski) or playing some other sport at a dangerous level of activity. (In the professional speaker's case, it means arranging travel plans to arrive at the conference at least a day before your session--never the day of--and so on.)

For a lot of people, speaking at an event is an opportunity for them to share their passion and excitement about a given topic--and I never want to take that opportunity away from them. By all means, go out and speak--and maybe in so doing, you will find that you enjoy it, and will be willing to put the kind of time and energy required into doing it well.

Because, really, at the end of the day, the speakers you see in the industry that are very, very good at what they do, they weren't just "born" that way. They got that way the same way professional athletes got that way, by doing a lot of preparation and work behind the scenes. They got that way because they got a lot of "first team reps", speaking at a variety of events. And they continue to get better because they continue to speak, which means continuously putting effort and energy into new talks, into revising old talks, and so on.

But all of that time can't be for free, or else people won't do it.

Go back to the amateur athlete scenario: the more time said athlete has to work at a different job to pay the bills, the less time they have to prep and master their athletic skills. This is no different for speakers--if someone is already spending 8 hours a day working, and another 6 to 8 hours a day sleeping, then that's 8 to 10 hours in the day for everything else, including time spent with the family, eating, personal hygiene, and so on, including whatever relaxation time they can carve out. (And yes, we all need some degree of relaxation time.) When, exactly, is this individual, excited, passionate, enthused (or not), supposed to get those "first team reps" in? By sacrificing something else: time with the family, sleep, a hobby, whatever.

Don't you think that they deserve some kind of compensation for that time?

I know, I know, the usual response is, "But they're giving back to the community!" Yes, I know, you never really figured anything out on your own, you just ran off to StackOverflow or Google and found all the code you needed in order to learn the new technology--it was never any more effort on your own part than that. You OWE the community this engagement. And, by the way, you should also owe them all the code you ever write, for the same reason, because it's not like your employer ever gave you anything for that code, and it's not like you did all that research and study for the code you work on for them.

See, the tangled threads of "why" we do something are often way too hard to unravel. So let's instead focus on the "what" you did. You submitted an abstract, you created an outline, you concocted some slides, you built some demos, you practiced your talk, you delivered it to the audience, and you submitted yourself to "life's slings and arrows" in the form of evaluations. And for all that, the conference organizers owe you nothing? In fact, you're required to pay for the privilege of doing all that?

On "professional" conferences

One dangerous trend I see in conferences, and it's not the same one I saw in 2009, is that the main focus of a conference is shifting; no longer is it a gathering of like-minded professionals who want to improve their technical skills by learning from others. Instead, it's turning into a gathering of people who want to party, play board games, gorge themselves on bacon, drink themselves to a stupor, play in a waterpark or go catch a Vegas show with naked women in it. Somehow, "professional developer conference" has taken on all the overtones of a Bacchanalian orgy, all in the name of "community".

Don't get me wrong--I think it can be useful to blow off some steam during a show, particularly because for most people, absorbing all this new information is mentally exhausting, and you need time to process it, both socially (in the form of hallway conversations) and physically (meaning, go give your body something to do while your mind is churning away). But when the focus of the conference shifts from "speakers" to "bacon bar", that's a dangerous, dangerous sign.

And you know what the first sign is that the conference doesn't think its principal offering is the technical content? When they won't even cover the speakers' costs to be at that event.

Seriously, think about it for a moment: if the principal focus of this event is the exchange of intellectual and industrial information, through the medium of a lecture given by an individual, then where should your money go? The bacon bar? Or towards making sure that you have the best damn lecturers your budget can afford?

When a conference doesn't offer to pick up airfare and hotel, then in my mind that conference is automatically telling the world, "We're willing to bring in the best speakers that are willing to do this all for free!" And how many of you would be willing to eat at a restaurant that said, "We're willing to bring in the best chefs that are willing to cook for free!"? Or go to a hospital that brings in "the best doctors that are willing to operate for free!"?

And how many of you are willing to part with your own money to go to it?

For community events like CodeCamps, it's an understood proposition that this is more about the networking and community-building than it is about the quality of the information you're going to get, and frankly, given that the CodeCamp is a free event, there's also an implicit "everybody here is a volunteer" that goes with it that explains--and, to my mind, encourages--people who've never spoken before to get up and speak.

But when you're a CodeMash, a devLink, or some of these other shows that are charging you, the attendee, a non-trivial amount of money to attend, and they're not covering speakers' expenses at a minimum, then they're telling you that your money is going towards bacon bars and waterparks, not the quality of the information you're receiving.

Yes, there are some great speakers who will continue to do those events, and Gods' honest truth, if I had somebody to cover my mortgage and/or paid me to be there, I'd love to do that, too. But many of those people who are paid by a company to be speaking at events are called "evangelists" and "salespeople", and developers have already voted with their feet often enough to make it easy to say that we don't want a conference filled with "evangelists" and "salespeople". You want an unbiased technical view of something? You want people talking about a technology who don't have an implicit desire to sell it to you, so that they can tell you both what it's good for and where it sucks? Then you want speakers who aren't being paid by a company to be there; instead, you want speakers who can give you the "harsh truth" about a technology without fear of reprisal from their management. (And yes, there are a lot of evangelists who are very straight-shooting speakers, and I love 'em, every one. But there's a lot more of them out there who aren't.)

In many cases, for the conference to deliver both the bacon bar and the speakers' T&E, it would require your attendance fee to go up some. By rough back-of-the-napkin calculations, probably about $50 for each of you, depending on the venue, the length of the conference, the number of speakers (and the number of talks they each do), and the total number of attendees. Is it worth it?
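
(To make that back-of-the-napkin math concrete, with entirely made-up numbers: a regional show with 20 speakers at, say, $1,500 apiece in airfare and hotel comes to $30,000 in T&E; spread that across 600 attendees and it's $50 a head. Change any of those assumptions and the number moves, but that's the order of magnitude we're talking about.)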

When you go to the dentist's office, do you want the "excited, enthused amateur", or the "practiced professional"?



Monday, August 26, 2013 8:09:01 PM (Pacific Daylight Time, UTC-07:00)
On startups

Curious to know what Ted's been up to? Head on over to here and sign up.

Yes, I'm a CTO of a bootstrap startup. (Emphasis on the "bootstrap" part of that--always looking for angel investors!) And no, we're not really in "stealth mode", I'll be happy to tell you what we're doing if you drop me an email directly; we're just trying to "manage the message", in startup lingo.

We're only going to be under wraps for a few more weeks before the real site is live. And then.... *crossing fingers*

Don't be too surprised if the tone of some of my blog posts shifts away from low-level tech stuff and starts to include some higher-level stuff, by the way. I'm not walking away from the tech, by any stretch, but becoming a CTO definitely has opened my eyes, so to speak, that the entrepreneur CTO has some very different problems to think about than the enterprise architect does.



Monday, August 26, 2013 2:37:25 PM (Pacific Daylight Time, UTC-07:00)
 Friday, August 23, 2013
Farewell, Mr. Ballmer

By this point, everybody who's even within shouting distance of a device connected to the Internet has heard the news: Steve Ballmer, CEO of Microsoft, is on his way out, retiring somewhere in the next twelve months and stepping aside to allow someone else to run the firm. And, rumor has it, this was not his choice, but a decision enforced upon the firm by the Microsoft Board.

You know, as much as I've disagreed with some of the decisions that've come out of the company in the last five years or so, I can't help but feel a twinge of sadness for how this ended. Ballmer, by all accounts, is a nice guy. I say that not as someone who's ever had to deal with him in person, but based on hearsay reports and two incidents where I've been in his general proximity, one of which was absolutely hilarious. Truth: when the cookie guard in that story told him that, he didn't, as some might imagine, immediately pull rank and start yelling "Do you know who I am?!?" In fact, he looked entirely like he was going to put the cookies back, until another staff member rushed up and whispered in the first's ear; when the first one apologized profusely, he just grinned--not meanly, but seemingly in the humor of the situation--and took a bite. He didn't have to play it so nicely, but how you treat the "little people" that touch on your life is a great indicator of the kind of person you are, deep down.

And count me a Ballmer-apologist, perhaps, but I have to wonder how much of his decision-making was made on faulty analysis and data from his underlings. Some of that is his own problem--a CEO should always be looking for ways to independently verify the information his people are reporting to him, and people who tell him only what he wants to hear should be immediately fired--but I genuinely think he was a guy just trying to do the best he could.

And maybe, in truth, that was never really enough.

Regardless, should the man suddenly appear at my doorstep, I would invite him in for dinner, offer him a beer, and talk about our kids' football teams. (They play in the same pre-high school football league.) He may not have been the great leader that Microsoft needed in the post-Gates years, but I wouldn't be surprised if Microsoft has to go a few iterations before they find that leader, if they ever can. A lot has to happen exactly right for them to find that person, and unless Bill has suddenly decided he's ready to take up the mantle again, it's something of a long shot.

Good luck, Microsoft.



Friday, August 23, 2013 11:25:54 PM (Pacific Daylight Time, UTC-07:00)
 Monday, August 19, 2013
Programming Interviews

Apparently I have become something of a resource on programming interviews: I've had three people tell me they read the last two blog posts, one because his company is hiring and he wants his people to be doing interviews right, and two more expressing shock that I still get interviewed--which I don't really think is all that fair, more on that in a moment--and relief that it's not just them getting grilled on areas that they don't believe to be relevant to the job--and more on that in a moment, too.

A couple of things have emerged in the last few weeks since the saga described earlier, so I thought I'd wrap the thing up with a final post. Besides, I like things that come in threes.

First, go see this video. Jonathan pinged me about it shortly after the second blog post came out, and damn if he and Mitch don't nail a bunch of things directly on the head. Specifically, I want to call out two lists they put into their slides (which I can't find online, or I'd include a link, sorry).

One, what are the things you're trying to answer in an interview? They call it out as three questions an interviewer or interview team is seeking to answer:

  1. Can they do the job?
  2. Will they be motivated?
  3. Would they get along with the team?
Personally, #2 to me is a red herring--frankly, I expect that if you, the candidate, take a job with my company, then either you have determined that you will be motivated to work here, or else you can force yourself to be. I don't really expect you to be the company cheerleader (unless, of course, I'm hiring you for that role), but I do expect professionalism: that you will be at work when you are scheduled or expected to be, that you will do quality work while you are there, and that you will look to make the best decisions possible given the information you have at the time. Motivation is not something I should be interviewing for; it's something you should be bringing.

But the other two? Spot-on.

And this brings me to my interim point: I'm not opposed to a programming test. I think I gave a number of readers the impression that I'm too good or too famous or whatever to be tested on my skills; that's the furthest thing from the truth. I think you most certainly should be verifying that I have the technical chops to do the job you want me to do; what I do want to suggest, however, is that for a number of candidates (myself included), there are ways to determine my technical chops without forcing me to stand at a whiteboard and code with a pen. For some candidates, you can examine their GitHub profile and see how many repos they have that're public (and have a look through some of the code they wrote). In fact, what I think would be a great interview question would be to look at a repo they haven't touched in a year, find some element of the code inside there, and ask them to explain what they were thinking when they wrote it. If it's well-documented, or if it's simple code, they'll be able to do that fairly quickly (once they context-swap to the codebase--got to give them time to remember, after all). If it's a complex or tricky bit, and they can't explain it...

... well, you just learned something about the code they write, now didn't you?

In my case, I have no public GitHub profile to choose from, but I'm an edge case, in that you can also watch my videos, and/or read my books and articles. Granted, there's a chance that I have amazing editors who save me from incredible stupidity and make me look good... but what are the chances that somebody is doing that for over a decade, across several technology platforms, and all without any credit? Probably pretty close to nil, IMHO. I'm not unique in this case--there's others whose work more or less speaks for itself, and I think you're disrespecting the candidate if you don't do your homework on the interview first.

Which, by the way, brings up another point: As an interviewer, you have a responsibility to do your homework on the candidate before they walk in the door, particularly if you're expecting them to have done their homework on your firm. Don't waste my time (and yours, particularly since yours is probably a LOT more expensive than mine, considering that a lot of companies are doing "interview loops" these days with a team of people, and all of their time adds up). If you're not going to take my candidacy seriously, why should I take your job or job offer or interview seriously?

The second list Jon and Mitch call out is their "interviewing antipatterns" list:

  • The Riddler
  • The Disorienter
  • The Stone Tablet
  • The Knuth Fanatic
  • The Cram Session
  • Groundhog Day
  • The Gladiator
  • Hear No Evil
I want you to watch the video, so I'm not going to summarize each here; go watch it. If you're in a position of doing hiring, ask yourself how many of those you yourself are perpetrating.

Second, go read this article. I don't like that he has "Dig into algorithms, data structures, code organization, simplicity" as one of his takeaways, because I think most interviewers are going to see "algorithms" and "data structures" and stop there, but the rest seems pretty spot-on.

Third, ask yourself the critical question: What, exactly, are we doing wrong? You think you're an agile organization? Then ask yourself how much feedback you get on your interviewing process, and how you would know if you screwed it up. Yes, you will know if you hire a bad candidate, but how will you know if you're letting good candidates go? Maybe you're the hot company that everybody wants to work at, and you can afford to throw some wheat out with the chaff a few times, but you won't stay in that position for long if you keep doing it--and more importantly, nobody stays the hot company forever, period. If you don't start trying to improve your hiring process now, by the time you need to, it'll be too late.

Fourth, practice! When unit-testing came out, many programmers said, "I don't need to test my code, my code is great!", and then everybody had a good laugh at their expense. Yet I see a lot of companies say essentially the same thing about their hiring and interview practices. How do you test an interview process? Easy--interview yourselves. Work with known-good conditions (people you know, people who work with you already, and so on), and run them through the process, but with the critical stipulation that you must treat them exactly as you would a candidate. If you look at your tech lead and say, "Yeah, this is where I'd ask you a technical question, but I already know...", then unless you're prepared to do that for your candidates, you're cheating yourself on the feedback. It's exactly like saying, "Yeah, this is where I'd write a test checking to see how we handle a null in that second parameter, but I already know...". If you're not prepared to do the latter, don't do the former. (And if you are prepared to do the latter, then I probably don't want to work with you anyway.)

Fifth, remember: Interviewing is not easy! It's not easy on the candidates, and it shouldn't be on you. It would be great if you could just test somebody on one dimension of themselves and call it good, but as much as people want to pretend that a programmer is just a code-spewing cog in a machine, they're not. If you want well-rounded candidates, then you must interview all aspects of that well-roundedness to determine if they are or not.

Whatever you interview for, that's what you will get.



Monday, August 19, 2013 9:30:55 PM (Pacific Daylight Time, UTC-07:00)
On "Exclusive content"

Although it seems to have dipped somewhat in recent years, periodically I get requests from conferences or webinars or other presentation-oriented organizations/events that demand that the material I present be "exclusive", usually meaning that I've never delivered said content at any other organized event (conference or what-have-you). And, almost without exception, I refuse to speak at those events, or else refuse to abide by the "exclusive" tag (and let them decide whether they still want me to speak for them).

People (by which I mean "organizers"--most speakers seem to get it intuitively if they've spoken at more than five or so conferences in their life) have expressed some surprise and shock at my attitude. So, I decided to answer some of the more frequently-asked questions that I get in response to this, partly so that I don't have to keep repeating myself (yeah, right, as if said organizers are going to read my blog) and partly because putting something into a blog is a curious form of sanity-check, in that if I'm way off, commenters will let me know posthaste.

Thus...:

  • "Nobody will come to our conference/listen to our webinar if the content is the same as elsewhere." This is, by far, the first and most-used reaction I get, and let me be honest: if people came to your conference or fired up your webinar solely because of the information contained, they would never come to your conference or listen to your webinar. The Internet is huge. Mind-staggeringly huge. Anything you could possibly ever want about any topic you could ever possibly imagine, it's captured it somewhere. (There's a corollary to that, too; I call it "Whittington's Law", which states, "Anything you can possibly imagine, the Internet not only has it, but a porn site version of it, as well".) You will never have exclusive content, because unless I invented the damn thing, and I've never shown it to anybody or ever used it before, somebody will likely have used it, written a blog post or a video tutorial or what-have-you, and posted it to the Internet. Therefore, by definition, it can't be exclusive.
  • But even on top of that first point, no presentation given by the same guy using the same slides is ever exactly the same. Anybody who's ever seen me give a talk twice knows that a lot of how I give my presentations is extremely ad-hoc; I like to write code on the fly, incorporate audience feedback and participation, and sometimes I even get caught up in a tangent that we explore along the way. None of my presentations are ever scripted, such that if you filmed two of them and played them side-by-side, you'll see marked and stark differences between them. And frankly, if you're a conference organizer, you should be quite happy about this, because one of the first rules of presenting is to "Know thy audience", but if you can't know your audience ahead of time, what course is left to you but to poll the audience when you first get started, and adjust your presentation based on that?
  • "Sure, the experience won't be as great as if they were in the room at the time, but if they can get the content elsewhere, why should they come to our conference?" Well.... Honestly, that question really needs to be rephrased: "Given all the vast amounts of information out there on the Internet, why should someone come to your conference, period?" If you and your fellow organizers can't answer that question, then my content isn't going to help you in the slightest. TechEd and other big conferences that stream all of their content to the Web seem to be coming to the realization that there is something about the in-person experience that still creates value for attendees, so maybe you should be thinking about that, instead. Yes, you will likely lose a few ticket sales from people watching the content online, but if those numbers are staggeringly large, it means that your conference offered nothing but content in the first place, and you were going to see those numbers drop off significantly anyway once the majority of your audience figured out that the content is available elsewhere. And for free, no less.
  • "But why is this so important to you?" Because, my friends, everything gets better with practice, and that includes presentations. When I taught for DevelopMentor lo those many years ago, one of the fundamental rules was that "You don't really know a deck until you've delivered it five times". (I call it "Sumida's Law", after the guy who trained me there.) What's more, the more often you've presented on a subject, the more easily you see the "right" order to the topics, and better ways of explaining and analogizing those topics occur to you over time. ("Halloway's Corollary to Sumida's Law": "Once you've delivered a deck five times, you immediately want to rewrite it all".) To be quite honest with you all, the first time I give a talk is much like the beta release of any software product: it takes user interaction and feedback before you start to see the non-obvious bugs.

I still respect the conference or webinar host that insists on exclusive content, and I wish you well finding your next speaker.



Monday, August 19, 2013 7:17:56 PM (Pacific Daylight Time, UTC-07:00)
 Thursday, July 25, 2013
More on the Programming Tests Saga

A couple of people had asked how the story with the company that triggered the "I Hate Programming Tests" post ended, so I figured I'd follow up with the rest of that story, and some thoughts.

After handing in the disjoint-set solution I'd come up with, the VP pondered things for a bit, then decided to bring me in for an in-person interview loop with a half-dozen of the others that work there. I said I'd be happy to, and came in, did a brief meet-and-greet with the group of folks I'd be interviewing with (plus, I think, a few others), and then we got to the first interview mano-a-mano, and after a brief "Are you familiar with MVC?", we get into...

... another algorithm challenge. A walk-up-to-the-whiteboard-and-code-this challenge.

OK, whatever. I already said I'm not great with algorithmic challenges like this, but maybe this guy didn't get the memo or he's just trying to see how I reason things through. So, sure, let's attack this, even though I haven't done this kind of problem in like twenty years. (One of the challenges was "How do you sort a file of integer numbers when you can't store the entire collection of numbers in memory?", which wasn't an unfair challenge, just not something that I generally have to mess with. Honestly, in the working world, I'll start by going through the file number by number--or do chunks of the file in parallel using actors, if the file is large enough--and shove them into a database that's indexed on that number. But, of course, as with all of these kinds of challenges, the interviewer continues to throw constraints at the problem until we either get to the solution he wants or Ted runs out of imagination; in this case, I think it was the latter.) End result: not a positive win.
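
For what it's worth, the textbook answer he was presumably fishing for is the classic external merge sort: sort the file a chunk at a time, spill each sorted chunk out to its own temporary "run" file, then do a k-way merge of the runs using a heap, so that you only ever hold one number per run in memory. A rough Java sketch of the idea (the file names and the "what fits in memory" budget are made up, and most error handling is elided):

    import java.io.*;
    import java.util.*;

    public class ExternalSort {
        // One open run file, plus the single value read from it but not yet emitted.
        static class Run implements Closeable, Comparable<Run> {
            final Scanner in;
            int current;
            boolean live;
            Run(File f) throws IOException { in = new Scanner(f); advance(); }
            void advance() { live = in.hasNextInt(); if (live) current = in.nextInt(); }
            public int compareTo(Run other) { return Integer.compare(current, other.current); }
            public void close() { in.close(); }
        }

        public static void main(String[] args) throws IOException {
            final int CHUNK = 1_000_000; // made-up in-memory budget

            // Pass 1: sort the big file a chunk at a time, spilling each
            // sorted chunk ("run") out to its own temp file.
            List<File> runFiles = new ArrayList<>();
            try (Scanner in = new Scanner(new File("numbers.txt"))) {
                int[] buf = new int[CHUNK];
                while (in.hasNextInt()) {
                    int n = 0;
                    while (n < CHUNK && in.hasNextInt()) buf[n++] = in.nextInt();
                    Arrays.sort(buf, 0, n);
                    File run = File.createTempFile("sortrun", ".txt");
                    try (PrintWriter out = new PrintWriter(run)) {
                        for (int i = 0; i < n; i++) out.println(buf[i]);
                    }
                    runFiles.add(run);
                }
            }

            // Pass 2: k-way merge the runs with a heap, holding only one
            // number per run in memory at any given moment.
            PriorityQueue<Run> heap = new PriorityQueue<>();
            for (File f : runFiles) { Run r = new Run(f); if (r.live) heap.add(r); else r.close(); }
            try (PrintWriter out = new PrintWriter("sorted.txt")) {
                while (!heap.isEmpty()) {
                    Run r = heap.poll();
                    out.println(r.current);
                    r.advance();
                    if (r.live) heap.add(r); else r.close();
                }
            }
        }
    }

(And yes, in real life I'd still just shove the numbers into an indexed database and let it do pass two for me.)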

Next interviewer walks in, he wasn't there for the meet-and-greet, which means he has even less context about me than the guy before him, and he immediately asks... another algorithmic challenge. "If you have a tree of nodes, and you want to get a list of the nodes in rank order" (meaning, a breadth-first search, where each node now gets a "sibling" pointer pointing to the sibling on its right in the tree, or null if it's the rightmost node at that depth level) "how would you do it?" Again, a fail, and now I'm getting annoyed. I admitted, from the outset, that this is not the kind of stuff I'm good at. We've already made that point. I accept the "F" on that part of my report card. What's even more annoying, the interviewer keeps sighing and drumming his fingers in an obvious state of "Why is this bozo wasting my time like this, I could be doing something vastly more important" and so on, which, gotta say, was kind of distracting. End result: total fail.
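
Again, for the record rather than for the interviewer: the trick to that one is an ordinary breadth-first walk that snapshots the queue size at the top of each level, so you know exactly where each level ends and can null out the rightmost sibling. A quick Java sketch--the Node shape here is my own invention, obviously, not whatever he had in mind:

    import java.util.ArrayDeque;
    import java.util.List;
    import java.util.Queue;

    class Node {
        List<Node> children; // assumed never null; leaves get an empty list
        Node sibling;        // to be filled in: next node to the right at this depth, or null
    }

    public class SiblingLinker {
        static void linkSiblings(Node root) {
            if (root == null) return;
            Queue<Node> queue = new ArrayDeque<>();
            queue.add(root);
            while (!queue.isEmpty()) {
                int levelSize = queue.size(); // everything queued right now is exactly one level
                Node prev = null;
                for (int i = 0; i < levelSize; i++) {
                    Node current = queue.remove();
                    if (prev != null) prev.sibling = current;
                    prev = current;
                    queue.addAll(current.children);
                }
                prev.sibling = null; // rightmost node on this level
            }
        }
    }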

By this point, I'm really annoyed. The VP comes to meet me, asks how it's going, and I tell him, flatly, "Sucks." He nods, says, yeah, we're going to kill the interview loop early, but I want to talk to you over lunch (with another employee along for company) and then have you meet with one more person before we end the exercise.

Lunch goes quite well, actually, and the last interview of the day is with their Product Manager, who then presents me with a challenge: "Suppose I want to build an online system for ordering pizzas. Customers can order pizzas, in other words. Build for me either the UI or the data model for this system." OK, this is different. I choose the data model, and build a ridiculously simple one-to-many relationship of customers to orders, and a similar one-to-many for orders to pizzas. She then proceeds to complicate the model step by step, sometimes in response to my questions, sometimes out of the blue, until we have a fairly complex roughly-sketched data model on the whiteboard. Result: win.
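For a sense of the starting point, before she began complicating it, that first-pass model was roughly the sketch below. The names and fields are illustrative only; the whiteboard version grew far messier:

    // Roughly the first-pass data model: one customer to many orders, one
    // order to many pizzas. Names and fields are illustrative assumptions.
    using System;
    using System.Collections.Generic;

    class Customer
    {
        public int Id;
        public string Name;
        public List<Order> Orders = new List<Order>(); // one-to-many
    }

    class Order
    {
        public int Id;
        public DateTime PlacedAt;
        public List<Pizza> Pizzas = new List<Pizza>(); // one-to-many
    }

    class Pizza
    {
        public int Id;
        public string Size; // e.g. "large"
        public List<string> Toppings = new List<string>();
    }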

The VP at this point is on the horns of a dilemma: two of the engineers in the interview loop are convinced I'm an idiot. They're clearly voting no on this. But he's read my articles, he's seen some of my presentations, he knows I'm not the idiot the others assume me to be, and he's now trying to figure out what his next steps are. He takes a week to think about it, then emails me yesterday to say that it's not going to work.

Here are my thoughts, and folks, if you interview people or are part of an interview process, I'm trying to generalize this beyond this one experience to take it into a larger context:

  • Know what you want to prove with your interview. I get the feeling that this interview loop was essentially a repeat of every interview loop they've ever done before, with no consideration to the candidate himself. An interview is a chance for the company to get to know the candidate better, in order to make a well-informed decision. In this particular case, trying to suss out my skills around algorithms was a wasted effort--I'd already conceded that point. Therefore, find new questions! Find new areas in which to challenge the candidate to see what their skills are. (If you can't think of something else to ask, then you're not really thinking about the interview all that hard, and you're just going through the motions.)
  • Look for the proof you seek in other areas. With the growth of things like GitHub and open source projects in general, it's becoming easier and easier to prove to yourself as a company that a candidate does or does not have the coding skills you're looking for. Did this guy submit some pull requests to a project? Did this guy post some blogs about interesting technical tidbits? (Or, Lord help us, write articles for major publications?) Did this guy author an open-source project, or work on a project that other people know about? Look at it this way: if Anders Hejlsberg, Bjarne Stroustrup or James Gosling walks through the door, are you going to put them through the same interview questions you put the random recruiter-found candidate through? Or are you willing to consider their established body of work and call it covered? As an interviewer, it behooves you to look for that established body of work, so that you can spend the interview loop looking at other things.
  • Be clear in what you want. One of the things the VP said to me was that he was looking for somebody who had a similar skillset to what he had; that is, someone with an architectural view of things and an interest in managing the people involved. By then submitting my candidacy to a series of tests that didn't really test for those things, he essentially torpedoed whatever chances it might have had.
  • Be willing to assert your authority. If you're the VP of the company, and the people who work for you disagree with your decisions, sometimes the Right Thing To Do is to simply overrule them. Yes, I know, it's not all politically correct to do that, and if you do it too often you'll ruin whatever sense of empowerment that you want your employees to have within the company, but there are times when you just need to assert that authority and say, "You know what? I appreciate y'all's input, but this is one of those cases where I think I have a different enough perspective that I am going to just overrule and do it anyway." Sometimes you'll be right, yay, and sometimes you'll be wrong, boo, but there is a reason you're the VP or the Director or the Team Lead, and not the others. Leadership means making hard decisions sometimes.
  • Be willing to change up the process. So your candidate comes in, and they're a junior programmer who's just graduated college, with zero experience. Do you then start asking them questions about their experience? That would be a waste of their time and yours. So you'll have to come up with new questions and a new approach. Not all interviews have to be carbon copies of each other, because certainly your candidates aren't carbon copies of each other. (At least, you'd better hope not, or else you're going to end up with a pretty single-dimensional staff.) If they've proven their strength in some category, or admitted a lack in another, then drop your standard set of questions, and go to something different. There is no honor in asking the exact same questions of every candidate.
  • Be willing to hire somebody that offers complementary skills. If your company already has a couple of engineers who know algorithms really well, then hire somebody for a different skillset. Likewise, if your company already has a couple of people who are really good with customers, you don't need another one. Look for people that have skills that fall outside the realm of what you currently have, and trust that when that individual is presented with a problem that attacks their weakness, they'll turn to somebody else in the firm to help them with it. When presented with an algorithmic challenge, you're damn well sure that I'm going to turn to somebody next to me and say, "Hey, dude? Help me walk through this for a bit, would you?" And, in turn, if that engineer has to give a presentation to a customer, and they turn to me and say, "Hey, dude? Help me work on this presentation, would you?", I'm absolutely ready to chip in. That's how teams are built. That's why we have teams in the first place.
In the end, this is probably the best of all possible scenarios, not working for them, particularly since I have some other things brewing that will likely consume all of my attention in the coming months, but there's that part of me that hates the fact that I failed at this. That same part of me is now going back through a few of the "interview challenges" books that I picked up, ironically, for when my eldest son goes out and does his programming interviews, just to work through a few of the problems, because I HATE feeling inadequate to a challenge.

And that, in turn, raises my next challenge: I want to create a website, just a static thing, that has a series of questions that, I think, are far better coding challenges than the ones I was given. I don't know when or if I'm going to get to this, but I gotta believe that any of the problems out of the book "Programming Challenges" (by Skiena and Revilla, Springer-Verlag, 2003), or the website from which those challenges were drawn, would be a much better test of a candidate's ability, particularly if you look at the ancillary parts of the challenge: do they write tests, how do they write their tests, do they pair well with somebody, and so on. THOSE are the things you really care about, not how well they remember their college lessons, which are easily accessible over Google or StackOverflow.

Bottom line: Your time is precious, people. Interview well, or just don't bother.


.NET | Android | C# | C++ | Conferences | Development Processes | F# | Industry | iPhone | Java/J2EE | Languages | Mac OS | Personal | Ruby | Scala

Thursday, July 25, 2013 2:19:48 PM (Pacific Daylight Time, UTC-07:00)
Comments [2]  | 
 Tuesday, July 09, 2013
Programming Tests

It's official: I hate them.

Don't get me wrong, I understand their use and the reasons why potential employers give them out. There are enough programmers in the world who aren't really skilled enough for the job (whatever that job may be) that it becomes necessary to offer some kind of litmus test that a potential job-seeker must pass. I get that.

And it's not like all the programming tests in the world are created equal: some are pretty useful ways to demonstrate basic programming facility, a la the FizzBuzz problem. And some of the projects I've seen handed out, a la the "Robot on Mars" problem that ThoughtWorks gave candidates (a robot lands on Mars, which happens to be a Cartesian grid; assuming that we hand the robot instructions such as LFFFRFFFRRFFF, where "L" is "turn 90 degrees left", "R" is "turn 90 degrees right", and "F" is "go forward one space", please write control code for the robot such that it ends up at the appropriate-and-correct destination, and include unit tests), are good indicators of how a candidate could/would handle a small project entirely on his/her own.
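Since that one actually is a reasonable exercise, here's a hedged sketch of the core of a solution; the starting pose and the absence of any grid-edge handling are assumptions on my part:

    // A sketch of the "Robot on Mars" control code: walk a Cartesian grid
    // from an instruction string like "LFFFRFFFRRFFF". The starting pose and
    // the lack of grid-edge handling are assumptions on my part.
    using System;

    class Robot
    {
        // Headings in clockwise order: North, East, South, West.
        static readonly (int dx, int dy)[] Headings = { (0, 1), (1, 0), (0, -1), (-1, 0) };

        public int X { get; private set; }
        public int Y { get; private set; }
        int heading; // index into Headings; starts at 0 (North)

        public void Execute(string instructions)
        {
            foreach (var c in instructions)
            {
                switch (c)
                {
                    case 'L': heading = (heading + 3) % 4; break; // turn 90 degrees left
                    case 'R': heading = (heading + 1) % 4; break; // turn 90 degrees right
                    case 'F':
                        X += Headings[heading].dx;
                        Y += Headings[heading].dy;
                        break;
                    default: throw new ArgumentException($"Unknown instruction '{c}'");
                }
            }
        }
    }

The unit tests the problem asks for practically write themselves from there: construct a Robot, Execute an instruction string, assert on X and Y.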

But the ones where the challenge is to implement some algorithmic doodad or other? *shudder*.

For example, one I just took recently asks candidates to calculate the "disjoint sets" of a collection of sets; in other words, given sets of { 1, 2, 3 }, { 1, 2, 4 } and { 1, 2, 5 }, the result should be sets of {1,2}, {3}, {4}, and {5}. Do this and calculate the big-O notation for your solution in terms of both time and space/memory.
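Spoiler warning, if you want to try it yourself first: one way to attack this is to notice that two elements belong in the same output set exactly when they appear in exactly the same input sets, so you can group elements by their "membership signature". A sketch in C#, assuming integer elements; this is one possible approach, not the official answer:

    // One way to attack the disjoint-sets challenge: group elements by their
    // "membership signature", i.e., which of the input sets contain them.
    // Elements with identical signatures land in the same output set.
    // A sketch assuming integer elements, not a definitive solution.
    using System.Collections.Generic;
    using System.Linq;

    static class DisjointSets
    {
        public static List<List<int>> Compute(IList<HashSet<int>> sets)
        {
            // For each element, record the indices of the input sets holding it.
            var signatures = new Dictionary<int, List<int>>();
            for (int i = 0; i < sets.Count; i++)
            {
                foreach (var element in sets[i])
                {
                    if (!signatures.TryGetValue(element, out var signature))
                        signatures[element] = signature = new List<int>();
                    signature.Add(i);
                }
            }

            // Elements whose signatures match form one output set.
            return signatures
                .GroupBy(pair => string.Join(",", pair.Value), pair => pair.Key)
                .Select(group => group.ToList())
                .ToList();
        }
    }

Run against the example above, 1 and 2 both carry the signature "0,1,2" and group together, while 3, 4, and 5 each carry a unique signature and end up alone. Both time and space are roughly linear in the total number of element occurrences across all the input sets, which answers the big-O part of the question as well.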

I hate to say this, but in twenty years of programming, I've never had to do this. Granted, I see the usefulness of it, and granted, it's something that, given large enough sets and large enough numbers of sets, will make a significant enough difference that it bears examination, but honestly, in times past when I've been confronted with this problem, I'm usually the first to ask somebody next to me how best to think about it, and to start sounding out some ideas with them before writing any bit of code. Unit tests to test input and its expected responses are next. Then I start looking for the easy cases to verify before I start attacking the algorithm in its entirety, usually with liberal help from Google and StackOverflow.
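To make that process concrete, the "unit tests for input and expected responses" step for this particular problem might look something like the following, written against the hypothetical DisjointSets.Compute() from the sketch above, with a plain throwing assert rather than any particular test framework:

    // The "tests to pin down input and expected responses" step, sketched
    // against the hypothetical DisjointSets.Compute() above; no test
    // framework assumed, just a throwing assert.
    using System;
    using System.Collections.Generic;
    using System.Linq;

    static class DisjointSetTests
    {
        static void Main()
        {
            // The easy case to verify first: the example from the challenge.
            var input = new List<HashSet<int>>
            {
                new HashSet<int> { 1, 2, 3 },
                new HashSet<int> { 1, 2, 4 },
                new HashSet<int> { 1, 2, 5 },
            };

            var result = DisjointSets.Compute(input);

            AssertContains(result, new[] { 1, 2 });
            AssertContains(result, new[] { 3 });
            AssertContains(result, new[] { 4 });
            AssertContains(result, new[] { 5 });
            Console.WriteLine("All expectations met.");
        }

        static void AssertContains(List<List<int>> sets, int[] expected)
        {
            if (!sets.Any(s => new HashSet<int>(s).SetEquals(expected)))
                throw new Exception("Missing expected set {" + string.Join(",", expected) + "}");
        }
    }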

But in a programming test, you're doing this alone (which already takes away a significant part of my approach, because being an "external processor", I think by talking out loud), and if it's timed (such as this one was), you're tempted to take a shortcut and forgo some of the setup (which I did) in order to maximize the time spent hacking, and when you end up down a wrong path (such as I did), you have nothing to fall back on.

Granted, I screwed up, in that I should've stuck to my process and simply said, "Here's how far I got in the hour". But when you've been writing code for twenty years, across three major platforms, for dozens of Fortune 500 companies and architected platforms that others will use to build software and services for thousands of users and customers, you feel like you should be able to hack something like this out fairly quickly.

And when you can't, you feel like a failure.

I hate programming tests.

Update: By the way, as always, I would love some suggestions on how to accomplish the disjoint-set problem. I kept thinking I was close, but was missing one key element. I particularly would LOVE a nudge in doing it in an object-functional language, like F# or Scala (I've only attempted it in C# so far). Just a nudge, though--I want to work through it myself, so I learn.

Postscript: An analogy hit me shortly after posting this: it's almost as if, in order to test a master carpenter's skill at carpentry, you ask him to build a hammer. After all, if he's all that good, he should be able to do something as simple as affix a metal head to a wooden shaft and have the result be a superior device to anything he could buy off the shelf, right?

Further update: After writing this, I took a break, had some dinner, played a game of Magic: The Gathering with my wife and kids (I won, but I can't be certain they didn't let me win, since they knew I was grumpy about not getting this test done in time), and then came back to it. I built up a series of little steps, backed by unit tests to make sure I was stepping through my attempts at reasoning out the algorithm correctly, backed up once or twice with a new approach, and finally solved it in about three hours, emailing it to the company at 6am (0600 for those of you reading this across the Atlantic or from a keyboard marked "Property of US Armed Forces"), just for grins. I wasn't expecting to get a response, since I was grossly beyond the time allotted, but apparently it was good enough to merit a follow-up interview, so yay for me. :-) Upshot is, though, I have an implementation that works, though now I find myself wondering if there's a way to do it in a functional/no-side-effect/persistent-data-structure kind of way....

I still hate them, though, at least the algorithm-based ones, and in a fleeting moment of transparent honesty, I will admit it's probably because I'm not very good at them, but if you repeat that to anyone I'll deny it as outrageous slander and demand satisfaction, Nerf guns at ten paces.


.NET | Android | Azure | C# | C++ | Conferences | Development Processes | F# | Industry | iPhone | Java/J2EE | Languages | Objective-C | Parrot | Personal | Python | Ruby | Scala | Social | Visual Basic

Tuesday, July 09, 2013 12:02:11 AM (Pacific Daylight Time, UTC-07:00)
Comments [3]  | 
 Thursday, March 21, 2013
On Sexism, Harassment, and Termination

Oh, boy. Diving into this whole Adria Richards/people-getting-fired thing is probably a mistake, but it’s reached levels at which I’m just too annoyed by everyone and everything in this to not say something. You have one of three choices: read the summary below and conclude I’m a misogynist without reading the rest; read the summary below and conclude I’m spot-on without reading the rest; or read the rest and draw your own conclusions after hearing the arguments.

TL;DR Adria Richards was right to be fired; the developer/s from PlayHaven shouldn’t have been fired; the developer/s from PlayHaven could very well be a pair of immature assholes; the rape and death threats against Adria Richards undermine the positions of those who support the developer/s formerly from PlayHaven; the content of the jokes doesn’t constitute sexism, nor should conferences overreact this way; half the Internet will label me a misogynist for these views; and none of this ends well.

The Facts, as I understand them

Three people are sitting in a keynote at a software conference. A presenter makes a comment on stage that leads two people sitting in the audience to start making jokes with all the emotional maturity of Beavis and Butthead. (Said developers are claiming that any and all sexual innuendo was inferred by the third, but frankly, let’s assume worst case here and assume they were, in fact, making cheap tawdry sex jokes out of “dongle” and “forking”.) A third person, after listening to it for a while, turns around, smiles, snaps a photo of the two of them, and Tweets them out as assholes. Conference staff approach third person, ask her to identify the two perpetrators, escort the developers out of the conference based on nothing but her word and (so far as I can tell) zero supporting evidence. Firestorm erupts over the Internet, and now all three (?) are jobless.

(UPDATE: Roberto Guerra mentioned, in private email, that PyCon has published their version of the events, which does not mention the developers being asked to leave; Roberto also tells me that the above link, which states that, apparently got it wrong, and that the original source they used was mistaken. Apologies to PyCon if this is the case.)

My Interpretations

Note that with typical software developer hubris, I feel eminently qualified to comment on all of this. (Which is my way of saying, take all of this with a grain of salt—I have some experience with this, being on the “accused” end of sexual harassment, and what I’m saying stems from my enforced “sit through the class” time from a decade or more ago, but I’m no lawyer, and like everybody else, I’m at the mercy of the reports since I wasn’t there.)

Developers who make “dongle” jokes and “forking” jokes are not only being stupid, those jokes have already been made. So they’re stupid twice over. C’mon, guys. New material. Seriously.

Making jokes in public that others might find offensive is taking a risk. Do it on stage, you run the risk of earning the wrath of the crowd. (Of course, nobody on this blog would, say, drop “the f-bomb” something like 23 times on stage in a keynote, right?) Do it in a crowd, you run the risk of pissing somebody off around you and looking/acting like a douche. Might be in your best interests to keep your voice down or just chuckle to yourself and have that conversation later.

Photos taken in public are considered public, if rude. If I walk out into the street and start filming you, I have a perfect right to do so, according to US law: what happens in public is considered public domain. Paparazzi depend on this for their “right” to follow and photograph movie stars, athletes, and other “public” figures. Adria was entirely within her rights to photograph those two and Tweet it. But if I snap a pic of a cute girl and Tweet it with “Wow, want to guess whether her code is hot too?”, it’s a douche move because I’m using her likeness without her permission. If I do that for profit, now I’m actually open to lawsuit. So photos taken in public are still something of a grey area, legally. Basic rule of thumb: if you want to be safe, ask before you put a photo of somebody else, taken in public or not, someplace other than on your own private device.

Third parties who overhear conversations could arguably be violating privacy. There’s a fine line here, but eavesdropping is rude. Now, I don’t know how loud they were making the jokes—shouting it out across the room is a very different scenario than whispering it to your seatmate and co-worker—but frankly, it’s usually pretty easy to tell when a joke is meant for general distribution in a room like that, and when it’s not. If it’s not meant for you, how about you just not hear it and concentrate on something else? Chalk up the commentary as “idiots being idiots”, and if there’s no implied threat to anybody going on, leave it be.

If you’re offended, you have an obligation to tell the parties in question and give them a choice to make good. Imagine this scenario: a guy sits down next to a girl on a bus. His leg brushes up against hers. She immediately stands up and shouts out “THIS MAN IS MAKING UNWANTED SEXUAL ADVANCES AT ME!” at the top of her lungs. Who’s the societally maladjusted person here? If, instead, she says, “Oh, please don’t make physical contact with me”, and he says, “But that’s my right as a human male”, and refuses to move his leg from pressing up against hers, then who’s the societally maladjusted one? Slice this one as finely as you like, but if you’re offended at something I do, it’s your responsibility to tell me so that I can make it right, by apologizing and/or ceasing the behavior in question, or telling you that I have Tourette’s, or by telling you you’re an uptight party-pooper, or however else this story can play out. If the party in question continues the behavior, then you’ve got grounds—moral and legal—to go to the authorities.

Just because you call it harassment doesn’t make it such. Legally, from what I remember, harassment is defined as “repeated acts of unwanted sexual attention”; in this case, I don’t see a history of repetition, nor do I see there being actual “attention” to Adria in this case—this was a conversation being held between two individuals that didn’t include her.

Just because it involves sex doesn’t make it sexist. Two guys were making jokes about male genitalia. It may have been inappropriate, but honestly, unless somebody widened the definition of sexism (“making disparaging comments about someone based on their gender or sexual preferences”) when I wasn’t looking, this ain’t it. And for Adria to claim sexism in public is bad when she Tweeted just a few days prior about stuffing a sock down your shorts during a TSA patdown seems a little…. *shrug* You pick the word.

The conference needs to follow basic due process. You know—innocent until proven guilty, measured and proportional response, warnings, and so on. I don’t care what it says on the conference’s website by way of disclaimer—you have to figure out if what was said to happen actually happened before you respond to it. Nowhere in the facts above do I hear the conference taking any steps to protect the accused—a woman said a couple of guys said sexual things, so we must act quickly! This has “bad” written all over it for the next five conferences.

(UPDATE: Again, PyCon apparently didn’t escort the developer/s out of the conference, but instead according to their site, “Both parties were met with, in private. The comments that were made were in poor taste, and individuals involved agreed, apologized and no further actions were taken by the staff of PyCon 2013. No individuals were removed from the conference, no sanctions were levied.” It sounds like, contrary to what I first heard, PyCon handled it in a classy manner, so I apologize for perpetuating the image that they didn’t. Having said that, though, I find it curious that this storm blew up this way—did no one think to push those apologies to Twitter so everyone else knew that things had blown over, or did they in fact do that and we’re all too busy gawking and screaming “fight! fight! fight” on the playground to notice?)

The material shouldn’t matter. I know we’re all being all sexually politically correct these days about women in IT, but this is a Pandora’s Box of a precedent that will eventually get way out of hand, if it isn’t already (and I think it is). Imagine how this story goes for the conference if a man Tweets out a picture of a woman and says, “This woman was talking to another woman and insulted my religion, and the conversation made me uncomfortable.” Is the conference now on the hook to escort those two women out of the building? How about programming language choice? How about race? How about sports teams? Where do we draw this line?

Adria was right to be fired. It’s harsh, but as any celebrity endorsement negotiator will tell you, when you represent a brand, you represent the brand even when the cameras aren’t rolling. (Just ask Tiger Woods about this.) Her actions brought a ton of unwanted negative attention (and a DDOS attack, apparently) to the company; that’s in direct contrast to the reasons they were paying her, and seeing as how her actions were something she did (as opposed to had done to her), her termination is entirely justified. You might see it as a bit harsh, but the company is well within boundaries here.

The PlayHaven developers weren’t right to be fired. Again, nowhere do we see them getting the opportunity to confront their accuser, or make restitution (apology). Now, you can argue that they, too, were representing their firm, but unless their job is to act as an evangelist and brand recognition activities are part of their job description, you can’t terminate them for gross negligence in this. Of course, most employment is “at-will”, meaning a company can fire you for any reason it likes, but this is sort of akin to getting fired for getting drunk and making lewd comments to the wait staff at Denny’s while wearing a company T-shirt.

Sexism in IT is bad. Duh. I don’t think I’ve met anyone who said otherwise. But this wasn’t sexism. Inappropriate, perhaps, but not sexism. By the way, racism in IT is bad, and so is age-ism, role-ism (discounting somebody’s opinions just because they’re in Marketing or Sales), and technacism (discounting a technology based on no factual knowledge).

It’s politically correct to jump to attention when “women in IT” comes up. This subject is gathering a lot of momentum, and most of it, I think, is of the bad variety. Hate speech should not be tolerated—the rape and death threats against Adria cannot, should not, and are not acceptable in any way, shape, or form. Nor should similar kinds of direct comments against gays, lesbians, transsexuals, blacks, Asians, Jews, or any of the other “other” groups out there. But this is a far cry from the discrimination and hate speech that some people go through: I have a friend who is a lesbian and a school teacher, and she is receiving death threats for teaching at that school. She has dogs at the house, shotgun loaded, and she is waiting for the Mormons and news reporters to vacate her lawn so she can try to resume some kind of normal life. Putting up with a few lewd jokes in a crowd at a conference, I would guess, sounds pretty heavenly to her right now.

I think we have time for a patronizing plea, by the way: Ladies, I know you’ve had something of a rough time in the IT industry, but it’s pretty obvious that it’s getting better, and frankly, you run a big risk of ostracizing yourselves and making it harder if every time a woman doesn’t get selected for something (a conference speaking slot, a tech lead role, or a particular job) the whole “women in IT” banner gets unfurled and raised. Don’t get me wrong—I don’t think there are many of you doing that. There are some, though, who do claim special privilege just for being female, and there’s enough of a correlation between these two things that I think before too long it’s going to lose its impact and the real good that could be done will be lost. Don’t demand that you get special privilege—earn it. Believe me, there are plenty of opportunities for you to do so, so if you get blocked on something, look for a way around it. Demand equality, not artificially-imposed advantage.

(As trends go, quite honestly, given the declining rates of men graduating college and actually making a life for themselves, before too long the shoe will be on the other foot anyway, just give it time.)

There is no happy ending here. Nobody can fix this; three lives have been forever affected, negatively, by all of this. The ones I feel truly sorry for? SendGrid and PlayHaven—they had nothing to do with it, and now their names are going to be associated with this whole crappy mess.

Call me a misogynist for not whole-heartedly backing the woman in this case, if you will, but frankly, it was a disaster from the moment she chose to snap the photo and Tweet to the world instead of saying, “Excuse me, can you not make those jokes here? I don’t think they’re particularly appropriate.” I could theorize why she chose the one route over the other, but that’s an essay for another day.

Let the flaming begin.

UPDATE: This post puts more context around Adria, and I think is the best-written commentary I've seen on this so far, particularly since it's a woman's point of view on the whole thing (assuming, of course, that "Amanda" is in this case applied to a human of the female persuasion).


Conferences | Industry | Personal | Python | Reading | Social

Thursday, March 21, 2013 4:09:20 PM (Pacific Daylight Time, UTC-07:00)
Comments [5]  | 
 Tuesday, March 19, 2013
Programming language "laws"

As is pretty typical for that site, Lambda the Ultimate has a great discussion on some insights that the creators of Mozart and Oz have come to, regarding the design of programming languages; I repeat the post here for convenience:

Now that we are close to releasing Mozart 2 (a complete redesign of the Mozart system), I have been thinking about how best to summarize the lessons we learned about programming paradigms in CTM. Here are five "laws" that summarize these lessons:
  1. A well-designed program uses the right concepts, and the paradigm follows from the concepts that are used. [Paradigms are epiphenomena]
  2. A paradigm with more concepts than another is not better or worse, just different. [Paradigm paradox]
  3. Each problem has a best paradigm in which to program it; a paradigm with less concepts makes the program more complicated and a paradigm with more concepts makes reasoning more complicated. [Best paradigm principle]
  4. If a program is complicated for reasons unrelated to the problem being solved, then a new concept should be added to the paradigm. [Creative extension principle]
  5. A program's interface should depend only on its externally visible functionality, not on the paradigm used to implement it. [Model independence principle]
Here a "paradigm" is defined as a formal system that defines how computations are done and that leads to a set of techniques for programming and reasoning about programs. Some commonly used paradigms are called functional programming, object-oriented programming, and logic programming. The term "best paradigm" can have different meanings depending on the ultimate goal of the programming project; it usually refers to a paradigm that maximizes some combination of good properties such as clarity, provability, maintainability, efficiency, and extensibility. I am curious to see what the LtU community thinks of these laws and their formulation.
This so neatly calls out to me, based on my own very brief and very informal investigation into multi-paradigm programming (itself based on James Coplien's multi-paradigm design work in C++ from a decade-plus ago). I think they really have something interesting here.


.NET | Android | C# | C++ | Conferences | Development Processes | F# | Industry | Java/J2EE | Languages | LLVM | Objective-C | Parrot | Personal | Python | Ruby | Scala | Visual Basic | WCF | Windows

Tuesday, March 19, 2013 6:32:43 PM (Pacific Daylight Time, UTC-07:00)
Comments [0]  | 
 Tuesday, March 05, 2013
That Thing They Call "Unemployment"

TL;DR: I'm "unemployed", I'm looking to land a position as a director of development or similar kind of development management role; I'm ridiculously busy in the meantime.

My employer, after having suffered the loss of close to a quarter of its consultant workforce on a single project when that project chose to "re-examine its current approach", has decided that (not surprisingly) given the blow to its current cash flow, it's a little expensive keeping an architectural consultant of my caliber on staff, particularly since they don't appear to have the projects lined up for all these people to go to. Today was my last day, the paperwork and final check are processing through the system, there were no tears nor angry accusations from either side, and tomorrow I get to wake up "unemployed".

It's a funny word, that word "unemployed", because it indicates both a state of emotion and existence that I don't really share. On the emotional front, I'm not upset. A number of people expressed condolences ("I'm so sorry, Ted"), but frankly, I'm not angry, upset, hurt, or any of those other emotions that so often come with that. Part of my reaction stems from the fact that I've been expecting this for a while--the company and I had lots of plans at the beginning of my tenure there, but those plans more or less never got past the planning stage, and the focus was clearly always on billability, which at the level I'm at usually implies travel, something I'm not willing to commit to at the 80%/100% level that consulting clients often demand. We just grew apart, the company and I, and I think we've both known it for a few months now; this is just putting the signatures on the divorce and splitting up the CD collection. On the "existence" front, unemployment often means "waking up with nothing to do" and "no more money coming in", which, honestly, doesn't really apply, either. While I'm not going to be drawing a salary on a twice-monthly basis like I was for the last twenty months, it's not like I have no income coming in or nothing to do: I've got my columns with MSDN, CoDe, and Oracle TechNet; I've got two conferences this month (33rd Degree in Warsaw, and VSLive! in Vegas); I've got a contract in place for doing some content work and research for JetBrains on MPS, their language workbench; and I've just commissioned a course with PluralSight, "JVM Fundamentals", which will essentially be an amalgamation of the conference talks I did at NFJS over the past five or six years (ClassLoaders, threading and concurrency, collections, and so on), with a few more PluralSight courses and JetBrains articles/vidcasts/etc sketched out after that. If I'm "unemployed", then it's the busiest damn unemployment I've ever heard of.

And in all honesty, this enforced change on my career is not unwelcome--I've been thinking now for the past few months that it's time for me to challenge myself again, and the chosen challenge I've laid out for myself is to run a team, not an architecture. I want to find a position where I can take a team, throw us at a project, and produce something awesome... or at least acceptable... to the customer. After so many years of making fun of managers at conferences and such, I find myself wanting to become one. I'm not naive, I know this isn't all rainbows and unicorns, and that there will be times I just want to go back to the editor and write code because at least code is deterministic (most of the time), but it's an entirely new set of challenges, and frankly, I've been bored the last few years, I just have to admit that out loud. And I may not like it and in a year or two say to myself, "What was I THINKING?!?", but at least I'll have given it a shot, gotten the experience, and learned a few new things. And it's not like I'm going to give up technology completely, because I'm still going to be writing, blogging, recording, speaking, and researching. I don't think I could give that up if I tried.

So if you know of a company in the Greater Seattle area that's looking for someone who's got a ton of technical skills and an intuitive sense of people to run a development team, drop me a note. Oh, and don't be too surprised if the website gets a face lift in the next month or two--the design is a little old, and I want to play around with Bootstrap and some static-HTML-plus-Javascript kinds of design/development. Should be fun, in all my copious spare time...


Conferences | Development Processes | Industry | Personal | Reading | Social

Tuesday, March 05, 2013 12:52:24 AM (Pacific Standard Time, UTC-08:00)
Comments [0]  | 
 Thursday, February 28, 2013
When Apple decides what email you get to see

According to this report, Apple is now not only spam-filtering out emails containing particular phraseology (in this case, "barely legal teens"), but deleting them entirely, whether they're being sent to your account, or from your account. And what's even more interesting, apparently iCloud users agreed to give Apple that kind of power.

The precedent here is dangerous, and one that needs to be carefully examined--if corporations are going to exercise the ability to investigate/examine (even from an automated tool) the email that you're sending or receiving, then technically privacy is being violated. This has always been an issue with email--corporations have always maintained that email sent on their servers to their employees is their property, and the legal world has held that up to be the case (which is the same rationale that then gives DOJ and other prosecutors the right to examine corporate email in order to see if there's been any wrongdoing taking place, so this is a good thing). But when you're not an employee of the corporation, does the fact that the email travels through their servers mean that they have the right to view your email, even through an algorithm? Does an ISP have the right to read its subscribers' email, too? The fact that iCloud users agree to allow Apple this power is an interesting twist, but frankly the courts have seen fit to throw out waivers that were deemed unenforceable or illegal, so that's something of a red herring, I think.

The much deeper issue here is one of privacy: how much privacy is really left to us these days? And, speaking for myself, why don't more people care?

This also has me wondering if, maybe, email and Internet services haven't reached a level of ubiquity that suggests that they should be considered part of the national or state infrastructure--as in, should local/city/state/federal government maintain an email infrastructure (servers) with the same degree of privacy guarantees that they held up for the US Postal Service? Or, maybe even, should the US Postal Service be that entity?


Industry | iPhone | Personal

Thursday, February 28, 2013 8:20:44 PM (Pacific Standard Time, UTC-08:00)
Comments [0]  | 
 Thursday, February 14, 2013
Um... Security risk much?

While cruising through the Internet a few minutes ago, I wandered across Meteor, which looks like a really cool tool/system/platform/whatever for building modern web applications. JavaScript on the front, JavaScript on the back, Mongo backing, it's definitely something worth looking into, IMHO.

Thus emboldened, I decide to look at how to start playing with it, and lo and behold I discover that the instructions for installation are:

curl https://install.meteor.com | sh
Um.... Wat?

Now, I'm sure the Meteor folks are all nice people, and they're making sure (via the use of the https URL) that whatever is piped into my shell is, in fact, coming from their servers, but I don't know these people from Adam or Eve, and that's taking an awfully big risk on my part, just letting them pipe whatever-the-hell-they-want into a shell Terminal. Hell, you don't even need root access to fill my hard drive with whatever random bits of goo you want.

I looked at the shell script, and it's all OK, mind you--the Meteor people definitely look trustworthy, I want to reassure everyone of that. But I'm really, really hoping that this is NOT their preferred mechanism for delivery... nor is it anyone's preferred mechanism for delivery... because that's got a gaping security hole in it about twelve miles wide. It's just begging for some random evil hacker to post a website saying, "Hey, all, I've got this really cool framework y'all should try..." and bury the malware inside the code somewhere.

Which leads to today's Random Thought Experiment of the Day: How long would it take the open source community to discover malware buried inside of an open-source package, particularly one that's in widespread use, a la Apache or Tomcat or JBoss? (Assume all the core committers were in on it--how many people, aside from the core committers, actually look at the source of the packages we download and install, sometimes under root permissions?)

Not saying we should abandon open source; just saying we should be responsible citizens about who we let in our front door.

UPDATE: Having done the install, I realize that it's a two-step download... the shell script just figures out which OS you're on, which tool (curl or wget) to use, and asks you for root access to download and install the actual distribution. Which, honestly, I didn't look at. So, here's hoping the Meteor folks are as good as I'm assuming them to be....

Still highlights that this is a huge security risk.


.NET | Android | Azure | C# | C++ | Development Processes | F# | Flash | Industry | iPhone | Java/J2EE | Languages | LLVM | Mac OS | Objective-C | Parrot | Personal | Python | Reading | Ruby | Scala | Security | Social | Solaris | Visual Basic | VMWare | WCF | Windows | XML Services | XNA

Thursday, February 14, 2013 8:25:38 PM (Pacific Standard Time, UTC-08:00)
Comments [4]  | 
 Saturday, February 02, 2013
Last Thoughts on "Craftsmanship"

TL;DR Live craftsmanship, don't preach it. The creation of a label serves no purpose other than to disambiguate and distinguish. If we want to hold people accountable to some sort of "professionalism", then we have to define what that means. I found Uncle Bob's treatment of my blog heavy-handed and arrogant. I don't particularly want to debate this anymore; this is my last take on the subject.


I will freely admit, I didn't want to do this. I really didn't. I had hoped that after my second posting on the subject, the discussion would kind of fade away, because I think we'd (or I'd, at least) wrung out about the last few drops of discussion and insight and position on it. The same memes were coming back around, the same reactions, and I really didn't want to perpetuate the whole thing ad infinitum, because I don't really think that's the best way to reach any kind of result or positive steps forward. I'd said my piece, I was happy about it.

Alas, such was not to be. Uncle Bob posted his thoughts, and quite frankly, I think he did a pretty bad job of hearing what I had to say, couching it in terms of populism (I stopped counting the number of times he used that word at six or so) even as he framed something of his own elitist argument within it.

Bob first points us all at the Manifesto for Software Craftsmanship. Because everyone who calls themselves a craftsman has to obey this manifesto. It's in the rules somewhere. Sort of like the Agile Manifesto--if you're not a signatory, you're doing it wrong.

(Oh, I know, to suggest that there is even the smallest thing wrong with the Agile Manifesto borders on heresy. Which, if that's the reaction you have, should be setting off a few warning bells in your head--something about replacing dogma with dogma.)

And you know what? I actually agree with most of the principles of the Craftsmanship Manifesto. It's couched in really positive, uplifting language: who doesn't want "well-crafted" software, or "steadily-increasing value", or "productive partnerships"? It's a wonderfully-worded document that unfortunately is way short on details, but hey, it should be intuitively obvious to anyone who is a craftsman, right?

See, this is part of my problem. Manifestos tend to be long on rhetoric, but very, very short on details. The Agile Manifesto is another example. It stresses "collaboration" and "working software" and "interactions" and "responding to change", but then people started trying to figure out how to apply this, and we got the knife-fights, with people arguing XP vs. Scrum vs. Kanban vs. your-homebrewed-craptaculous-brand-of-"little-a"-agile, that turned into brushfire wars. It's wonderful to say what the end result should be, but putting that into practice is a whole different ball of wax. So I'm a little skeptical any time somebody points to a Manifesto and says, "I believe in that, and that should suffice for you".

Frankly, if we want this to have any weight whatsoever, I think we should model something off the Hippocratic Oath, instead--it at least has prescriptive advice within it, telling doctors what they can and cannot (or, perhaps worded more accurately, should or should not) do. (I took something of a stab at this six years ago. It could probably use some work and some communal input; it was a first iteration.)

Besides (beware the accusation coming of my attempt at a false-association argument here, this is just for snarkiness purposes!), other manifestos haven't always worked out so well.

So by "proving [that I misinterpreted the event] by going to the Manifesto", you're kind of creating a circular argument: "What happened can't have been because of Software Craftsmanship, because look, there, in the Manifesto, it says we don't do that, so clearly, we can't have done that. It says it, right there! Seriously!"

The Supposed "Segregation"

Bob then says I'm clearly mistaken about "craftsmen" creating a segregation, because there's nothing about segregation in the manifesto:

any intimation of those who "get it" vs. those who don't; or any mention of the "right" tools or the "right" way. Indeed, what I see instead is a desire to steadily add value by writing well-crafted software while working in a community of professionals who behave as partners with their customers. That doesn't sound like "narcissistic, high-handed, high-minded" elitism to me.
Hold on to that thought for a bit.

Bob then makes an interesting leap of logical assumption here. He takes my definition of a "software laborer":

"somebody who comes in at 9, does what they're told, leaves at 5, and never gives a rat's ass about programming except for what they need to know to get their job done [...] who [crank] out one crappy app after another in (what else?) Visual Basic, [that] were [...] sloppy, bloated, ugly [...] cut-and-paste cobbled-together duct-tape wonders."
and interprets it as
Now let's look past the hyperbole, and the populist jargon, and see if we can identify just who Ted is talking about. Firstly, they work 9-5. Secondly, they get their job done. Thirdly, they crank out lots of (apparently useful) apps. And finally, they make a mess in the code. The implication is that they are not late, have no defects, and their projects never fail.
That's weird. I go back and read my definition over and over again, and nowhere do I see me suggesting that they are never late, no-defect, and never-fail projects. Is it possible that Bob is trying to set up his next argument by reductio ad absurdum, basically by saying, "These laborers that Ted sets up, they're all perfect! They walk on water! They must be the illegitimate offspring of Christ himself! Have you met them? No? Oh, then they must not exist, and therefore his entire definition of the 'laborer' is whack, as these young-un kids like to say."

(See what I did there? I make Bob sound old and cantankerous. Not that he would do the same to himself, trying to use his years of experience as a subtle bludgeon to anyone who's younger and therefore less experienced--less professional, by implication--in his argument, right?

Programming is barely 60 years old. I, personally, have been programming for 43+ of those years.
Oh.)

Having sort of wrested my definition of the laborer away from me, Bob goes on:

I've never met these people. In my experience a mess in the code equates to lots of overtime, deep schedule overruns, intolerable defect rates, and frequent project failure -- not to mention eventual redesign.
Funny thing. I've seen "crafted" projects that fell victim to the same problems. Matter of fact, I had a ton of people (so it's not just my experience, folks, clearly there's a few more examples out there) email and comment to me that they saw "craftsmen" come in and take what could've been a one-week project and turn it into a six-month-or-more project by introducing a bunch of stuff into the project that didn't really need to be there, but was added in order to "add value" to the code and make it "well-crafted". (I could toss off some of the software terms that were cited as the reasons behind the "adding of value"--decoupled design, dependency injection, reusability, encapsulation, and others--but since those aren't in the Manifesto either, it's easy to say in the abstract that the people who did those projects weren't really adding value, even though these same terms seem to show up on every single project during architecture and design, agile or otherwise.)

Bob goes on to sort of run with this theme:

Ted has created a false dichotomy that appeals to a populist ideology. There are the elite, condescending, self-proclaimed craftsmen, and then there are the humble, honorable, laborers. Ted then declares his allegiance to the latter... .
Well, last time I checked, all I have to do to be listed amongst the craftsmen is sign a web page, so "self-proclaimed" seems pretty accurate as a title. And "elite"? I dunno, can anyone become a craftsman? If so, then the term as a label has no meaning; if not, then yes, there's some kind of segregation, and it sure sounds like you're preaching from on high, particularly when you tell me that I've created a "false dichotomy" that appeals to a "populist ideology":
Generally, populists tend to claim that they side with "the people" against "the elites". While for much of the twentieth century, populism was considered to be a political phenomenon mostly affecting Latin America, since the 1980s populist movements and parties have enjoyed degrees of success in First World democracies such as the USA, Canada, Italy, the Netherlands and Scandinavian countries.
So apparently I'm trying to appeal to "the people", even though Bob will later tell us that we're all the same people. (Funny how there are a lot of programmers who feel like they're being looked down on by the elites--and this isn't my interpretation; read my blog's comments and the responses that have mushroomed on Twitter.) Essentially, Bob will argue later that there is no white-collar/blue-collar divide, even though according to him I'm clearly forming an ideology to appeal to people in the blue-collar camp.

So either I'm talking into a vacuum, or there's more of a divide than Bob thinks. You make the call on that one.

Shall we continue?

He strengthens his identity with, and affinity for, these laborers by telling a story about a tea master and a samurai (or was it some milk and a cow) which further extends and confuses the false dichotomy.
Nice non-sequitur there, Bob! By tossing in that "some milk and a cow", you neatly rob my Zen story of any power whatsoever! You just say it "extends and confuses the false dichotomy", without any real sort of analysis or discussion (that comes later, if you read through to the end), and because you're a craftsman, and I'm just appealing to populist ideology, my story no longer has any meaning! Because reductio ad make-fun-of-em is also a well-recognized and well-respected logical analysis in debating circles.

Oh, the Horror! ... of Ted's Psyche

Not content to analyze the argument, because clearly (he says this so many times, it must be true) my argument is so weak as to not stand on its own (even though I'm not sure, looking back at this point, that Bob has really attacked the argument itself at all, other than to say, "Look at the Manifesto!"), he decides to engage in a little personal attack:

I'm not a psychoanalyst; and I don't really want to dive deep into Ted's psyche to unravel the contradictions and false dichotomies in his blog. However, I will make one observation. In his blog Ted describes his own youthful arrogance as a C++ programmer... It seems to me that Ted is equating his own youthful bad behavior with "craftsmanship". He ascribes his own past arrogance and self-superiority with an entire movement. I find that very odd and very unfortunate. I'm not at all sure what prompted him to make such a large and disconnected leap in reasoning. While it is true that the Software Craftsmanship movement is trying to raise awareness about software quality; it is certainly not doing so by promoting the adolescent behavior that Ted now disavows.
Hmm. One could argue that I'm just throwing out that I'm not perfect nor do I need to profess to be, but maybe that's not a "craftsman's" approach. Or that I was trying to show others my mistakes so they could learn from them. You know, as a way of trying to build a "community of professionals", so that others don't have to go through the mistakes I made. But that would be psychoanalyzing, and we don't want to do that. Others didn't seem to have the problem understanding the "very large and disconnected leap in reasoning", and I would hate to tell someone with over twice my years of experience programming how to understand a logical argument, so how about let's frame the discussion this way: I tend to assume that someone behaving in a way that I used to behave (or still behave) is doing so for the same reasons that I do. (It's a philosophy of life that I've found useful at times.) So I assume that craftsmen take the path they take because they want to take pride in what they do--it's important to them that their code sparkle with elegance and beauty, because that's how code adds value.

Know what? I think one thing that got lost somewhere in all this debate is that value is only value if it's of value to the customer. And in a lot of the "craftsmanship" debates, I don't hear the customer's voice being brought up all that much.

You remember all those crappy VB apps that Bob maligned earlier? Was the customer happy? Did anybody stop to ask them? Or was the assumption that, since the code was crappy, the customer implicitly must be unhappy as well? Don't get me wrong, there's a lot of crappy code out there that doesn't make the customer happy. As a matter of fact, I'll argue that any code that doesn't make the customer happy is crap, regardless of what language it's written in, what patterns it uses, how decoupled or injected it is, or what shiny new databases it stores data into. Value isn't value unless it's value to the person who's paying for the code.

Bob Discusses the Dichotomy

Eh, I'm getting tired of writing all this, and I'm sure you're getting tired of reading it, so let's finish up and call it a day. Bob goes on to start dissecting my false dichotomy, starting with:

Elitism is not encouraged in the Software Craftsmanship community. Indeed we reject the elitist attitude altogether. Our goal is not to make others feel bad about their code. Our goal is to teach programmers how to write better code, and behave better as professionals. We feel that the software industry urgently needs to raise the bar of professionalism.
Funny thing is, Bob, one could argue that you're taking a pretty elitist stance yourself with your dissection of my blog post. Nowhere do I get the benefit of the doubt, nor is there an effort to try and bring yourself around to understand where I'm coming from; instead, I'm just plain wrong, and that's all there is to it. Perhaps you will take the stance that "Ted started it, so therefore I have to come back hard", but that doesn't strike me as humility, that strikes me as preaching from a pulpit in tone. (I'd use a Zen story here to try and illustrate my point, but I'm afraid you'd characterize it as another "milk and a cow" story.)

But "raising the bar of professionalism", again, misses a crucial point, one that I've tried to raise earlier: Who defines what that "professionalism" looks like? Does the three-line Perl hack qualify as "professionalism" if it gets the job done for the customer so they can move on? Or does it need to be rewritten in Ruby, using convention over configuration, and a whole host of dynamic language/metaprogramming/internal DSL tricks? What defines professionalism in our world? In medicine, it's defined pretty simply: is the patient healthier or not after the care? In the legal profession, it's "did we represent the client to the best of our ability while remaining in compliance with the rules of ethics laid down by the bar and the laws of the entity in which we practice?" What defines "professionalism" in software? When you can tell me what that looks like, in concrete, without using words that allow for high degree of interpretation, then we can start to make progress towards whether or not my "laborers" are, in actuality, professionals.

We continue.

There are few "laborers" who fit the mold that Ted describes. While there are many 9-5 programmers, and many others who write cut-paste code, and still others who write big, ugly, bloated code, these aren't always the same people. I know lots of 12-12 programmers who work hellish hours, and write bloated, ugly, cut-paste code. I also know many 9-5 programmers who write clean and elegant code. I know 9-5ers who don't give a rat's ass, and I know 9-5ers who care deeply. I know 12-12ers who's only care is to climb the corporate ladder, and others who work long hours for the sheer joy of making something beautiful.
Of course there aren't, Bob; you took my description and sort of twisted it. (See above.) And yes, I'll agree with you, there are lots of 9-5 developers, and lots of 12-12 developers, lots of developers who write great code, and lots of developers who write crap code, and what's even funnier about this discussion is that sometimes they're all the same person! (They do that just to defy this kind of stereotyping, I'm sure.) But maybe it's just the companies I've worked for compared to the companies you've worked for: I can rattle off a vastly larger list of names who fit in the "9-5" category than those who fit into the "12-12" category. All of them wanted to do a good job, I believe, and I believe that because every human being innately wants to do things they are proud of and can point to with a sense of accomplishment. Some will put more energy into it than others. Some will have more talent for it than others. Just like dancing. Or farming. Or painting. Or just about any endeavor.

The Real Problem

Bob goes on to talk about the youth of our industry, but I think the problem is a different one. Yes, we're a young industry, but frankly, so is Marketing and Sales (they've only really existed in their modern forms for about sixty or seventy years, maybe a hundred if you stretch the definitions a little), and ditto for medicine (remember, it was only about 150 years ago that surgeons were also barbers). Yes, we have a LOT to learn yet, and we're making a lot of mistakes, I think, because our youth is causing us to reach out to other, highly imperfect metaphor/role-model industries for terminology and inspiration. (Cue the discussion of "software architecture" vs "building architecture" here.) Personally, I think we've learned a lot, we're continuing to learn more, and we're reaching a point where looking at other industries for metaphors is reaching a practical end in terms of utility to us.

The bigger problem? Economics. The supply and demand curve.

Neal Ford pointed out on an NFJS panel a few years back that the demand for software vastly exceeds the supply of programmers to build it. I don't know where he got that--whether he read it somewhere or it formed out of his own head--but he's absolutely spot-on right, and it seriously throws the whole industry out of whack.

If the software labor market were like painting, or car repair, or accounting, then the finite demand for people in those positions would mean that those who couldn't meet customer satisfaction would eventually starve and die. Or, more likely, take up some other career. It's a natural way to take the bottom 20% of the bell curve of potential practitioners--the low tail of the distribution--and keep them from ruining some customers' lives. If you're a terrible painter, no customers will use you (at least, not twice), and while I suppose you could pick up and move to a new market every year or so until you're run out of town on a rail for crappy work, quite honestly, most people will just give up and go do something else. There are thousands--millions--of actors and actresses in Southern California who never make it to stage or screen, and they wait tables until they find a new thing to pursue that adds value to their customers' lives in such a way that they can make a living.

But software... right now, if you walk out into the middle of the street in San Francisco wearing a T-shirt that says, "I write Rails code", you will have job offers flying after you like the paper airplanes in Disney's just-released-to-the-Internet video short. IT departments are throwing huge amounts of cash into mechanisms, human or otherwise, working or otherwise, to help them find developers. Software engineering has been at the top of the list of "best jobs" for several years, commanding high salaries in a relatively stress-free environment, all in a period of time that many have equated to the worst economic cycle since the Great Depression. Don't believe me? Take a shot yourself: go to a Startup Weekend and sign up as a developer. There are hundreds of people with new app ideas (granted, most of them total fantasy) who are just looking for a "technical co-founder" to help them see their dream through to reality. IT departments will take anybody right now, and I do mean anybody. I'm reasonably convinced that half the reason software development gets outsourced overseas is that it's a choice between putting up with doing the development overseas, even with all of the related problems and obstacles that come up, or not doing the development at all for lack of being able to staff the team to do it. (Which would you choose, if you were the CTO--some chance of success, or no chance at all?)

Wrapping up

Bob wraps up with this:

The result is that most programmers simply don't know where the quality bar is. They don't know what disciplines they should adopt. They don't know the difference between good and bad code. And, most importantly, they have not learned that writing good clean code in a disciplined manner is the fastest and best way to get the job done well.

We, in the Software Craftsmanship movement are trying to teach those lessons. Our goal is to raise the awareness that software quality matters. That doing a good job means having pride in workmanship, being careful, deliberate, and disciplined. That the best way to miss a deadline, and lay the seeds of defeat, is to make a mess.

We, in the Software Craftsmanship movement are promoting software professionalism.
Frankly, Bob, you sort of undercut your own "we're not elitists" argument by making it very clear here: "most programmers simply don't know where the quality bar is. They don't know .... They don't know.... They have not learned. ... We, in the Software Craftsmanship movement are trying to teach those lessons." You could not sound more elitist if you quoted the colonial powers "bringing enlightenment" to the "uncivilized" world back in the 1600s and 1700s. They are an ignorant, undisciplined lot, and you have taken on this self-appointed messiah role to bring them into the light.

Seriously? You can't see how that comes across as elitist? And arrogant?

Look, I really don't mean to perpetuate this whole argument, and I'm reasonably sure that Uncle Bob is already firing up his blog editor to point out all the ways in which my "populist ideology" is falsely dichotomous or whatever. I'm tired of this argument, to be honest, so let me try to sum up my thoughts on this whole mess in what I hope will be a few, easy-to-digest bullet points:

  1. Live craftsmanship, don't preach it. If you hold the craftsman meme as a way of trying to improve yourself, then you and I have no argument. If you put "software craftsman" on your business cards, or website, or write Manifestos that you try to use as a bludgeon in an argument, then it seems to me that you're trying to distinguish yourself from the rest, and that to me smacks of elitism. You may not think of yourself as an elitist, but to a lot of the rest of the world, that's exactly how you're coming across. Sorry if that's not how you intended it.
  2. Value is only value if the customer sees it as value. And the customer gets to define what is valuable to them, not you. You can (and should) certainly try to work with them to understand what they see as value, and you can (and should) certainly try to help them see how there may be value in ways they don't see today. But at the end of the day, they are the customer, they are writing the checks, and even after advising them against it, if they want to prioritize quick-and-dirty over longer-and-elegant, then (IMHO) that's what you do. Because they may have reasons for choosing that approach that they simply don't care to share with you, and it's their choice.
  3. The creation of a label serves no purpose other than to disambiguate and distinguish. If there really is no blue-collar programming workforce, Bob, then I challenge you to drop the term "craftsman" from your bio, profile, and self-description anywhere it appears, and replace it with "programmer". Or else refer to all software developers as "craftsmen" (in which case the term becomes meaningless, and thus useless). Because, let's face it, how many doctors do you know who put "Hippocratic-sworn" somewhere on their business cards?
  4. If we want to hold people accountable to some sort of "professionalism", then we have to define what that means. The definition of the term "professional" is not really what we want, in practice, for it's usually defined as "somebody who got paid to do the job". The Craftsmanship Manifesto seems to want some kind of code of ethics or programmer equivalent to the Hippocratic Oath, so that the third precept isn't "a community of people who are paid to do what they do", but something deeper and more meaningful and concrete. (I don't have that definition handy, by the way, so don't look to me for it. But I will also roundly reject anyone who tries to use the Potter Stewart-esque "I can't define it but I know it when I see it" approach, because now we're back to individual interpretation.)
  5. I found Uncle Bob's treatment of my blog heavy-handed and arrogant. In case that wasn't obvious. And I reacted in a similar manner, something for which I will apologize now. By reacting in that way, I'm sure I perpetuate the blog war, and truthfully, I have a lot of respect for Bob's technical skills; I was an avid fan of his C++ articles for years, and there are a lot of good technical ideas and concepts in them that any programmer would be well-advised to learn. His technical skill is without question; his compassion and empathy, however, might be. (As are mine, for stooping to that same level.)
Peace out.


.NET | C# | C++ | Conferences | Development Processes | F# | Industry | Java/J2EE | Languages | Parrot | Personal | Reading | Review | Social | Visual Basic | Windows

Saturday, February 02, 2013 4:33:12 AM (Pacific Standard Time, UTC-08:00)
 Friday, January 25, 2013
More on "Craftsmanship"

TL;DR: To all those who dissented, you're right, but you're wrong. Craftsmanship is a noble meme, when it's something that somebody holds as a personal goal, but it often comes across as a way to beat up on and denigrate others who don't choose to invest significant time and energy into programming. The Zen Masters didn't walk around the countryside, proclaiming "I am a Zen Master!"

Wow. Apparently I touched a nerve.

It's been 48 hours since I posted On the Dark Side of 'Craftsmanship', and it's gotten a ton of interest, as well as a few syndicated re-posts (DZone and a few others). Comments to the blog included a response from Dave Thomas, other blog posts have been brought to my attention, and Twitter was on FIRE with people pinging me with their thoughts, which turned out to be all across the spectrum, approving and dissenting. Not at all what I really expected to happen, to be honest--I kinda thought it would get lost in the noise of others commenting around the whole thing.

But for whatever reason, it's gotten a lot of attention, so I feel a certain responsibility to respond and explain to some of the dissenters who've responded. Not to defend, per se, but to at least demonstrate some recognition and attempt to clarify my position where I think it's been misheard. (To those who approved of the message, thank you for your support, and I'm happy to have vocalized something you felt unable, unwilling, or too busy to vocalize yourself. I hope my explanations here continue to represent your opinions, but if not, please feel free to let me know.)

A lot of the opinions centered around a few core ideas, it seems, so let me try and respond to those first.

You're confusing "craftsmanship" with a few people behaving badly. That may well be, but those who behaved badly included at least one who holds himself up as a leader of the craftsman movement and has held his actions up as indications of how "craftsmen" should behave. When you do this, you invite this kind of criticism and association. So if the movement is being given a black eye because of the actions of a single individual, well, now you know how a bunch of moderate Republicans feel about Paul Ryan.

Corey is a nice guy, he apologized, don't crucify him. Of course he is. Corey is a nice guy--and, speaking well to his character, he apologized almost immediately when it all broke. I learned a long time ago that "true sorry" means you (a) apologize for your actions, (b) seek to remedy the damage your actions have caused ("make it right", in other words), and (c) avoid making the same mistake in the future. From a distance, it seems like he feels contrition, and has publicly apologized for his actions. I would hope he's reached out to Heather directly to try and make things right with her, but that's between the two of them. Whether he avoids this kind of activity in the future remains to be seen. I think he will, but that's because I think he's learned a harsh lesson about being in the spotlight--it tends to be a harsh place to be. The rest of this really isn't about Corey and Heather anymore, so as far as I'm concerned, that thread is complete.

You misunderstand the nature of "craftsmanship". Actually, no, I don't. At its heart, the original intent of "craftsmanship" was a constant striving to be better at what you do, and taking pride in the things that you do. It's related to the Japanese principle of kaizen, which says, in essence, that we are constantly striving to get better. The samurai sought to become better swordsmen, constantly challenging each other to prove their mettle against one another, improving their skills and conditioning, but also their honor, by how they treated each other, their lord, their servants, and those they sought to protect. Kaizen is a wonderful principle, and one I have tried to live by my entire life, even before I'd discovered it. Please don't assume that I misunderstand the teachings of your movement just because I don't go to the meetings.

Why do you pick on "craftsmanship", anyway? If I want to take pride in what I do, what difference does it make? This is me paraphrasing much of the dissent, and my response boils down to two basic thoughts:

  1. If you think your movement is "just about yourself", why invent a label to differentiate yourself from the rest?
  2. If you invent a label, it becomes almost automatic to draw a line between "us" and "them", and that in and of itself almost inevitably leads to "us vs. them" behavior and mentality.
Look, I view this whole thing as kind of like religion: whatever you want to do behind closed doors, that's your business. But when you start waving it in other peoples' faces, then I have a problem with it. You want to spend time on the weekends improving your skills, go for it. You want to spend time at night learning a bunch of programming languages so you can improve your code and your ability to design systems, go for it. You want to study psychology and philosophy so you can understand other people better when it comes time to interact with them, go for it. And hey, you want to put some code up somewhere so people can point to it and help you get it better, go for it. But when you start waving all that time and dedication in my face, you're either doing it because you want recognition, or you want to suggest that I'm somehow not as good as you. Live the virtuous life, don't brag about it.

There were some specific blogs and comments that I think deserve discussion, too:

Dave Thomas was kind enough to comment on my blog:

I remember the farmer comment :) I think I said 30%, but I stand by what I said. And it isn't really an elitist stance. Instead, I feel that programming is hard work. At the end of a day of coding, I'm tired. And so I believe that if you are asking someone to do programming, then it is in both your and their interest that they are doing something they enjoy. Because if they don't enjoy it, then they are truly just a laborer, working hard at something that has no meaning to them. And as you spend 8 hours a day, 5 days a week doing it, that seems like an awful waste of an intelligent person's life.
Sure, programming is hard. So is house painting. They're different kinds of exhaustion, but it's exhaustion all the same. But, frankly, if somebody has chosen to take up a job that they do just because it's a job, that's their choice, and not ours to criticize, in my opinion. (And I remember it as 50%, because I very clearly remember saying the "way to insult half the room" crack after it, but maybe I misheard you. I do know others also heard it at 50%, because an attendee or two came up to talk about it after the panel. At least, that's how I remember it at the time. But the number itself is kinda meaningless, now that I think about it.)
The farming quote was a deliberate attempt at being shocking to make a point. But I still think it is valid. I'd guess that 30% of the developers I meet are not happy in their work. And I think those folks would be happier and more fulfilled doing something else that gave them more satisfaction.
Again, you and I are both in agreement, that people should be doing what they love, but that's a personal judgment that each person is permitted to make for themselves. There are aspects of our lives that we don't love, but we do because they make other people happy (Juliet and Charlotte driving the boys around to their various activities comes to mind, for example), and it is not our position to judge how others choose for themselves, IMHO.
No one should have to be a laborer.
And here, you and I will disagree quite fundamentally: as I believe it was Martin Luther King, Jr., who said, "If you are going to be a janitor, be the best janitor you know how to be." It seems by that statement that you are saying that people who labor with their bodies rather than their minds (and trust me, you may not be a laborer anymore, big publishing magnate that you are, but I know I sure still am) are somehow less well-off than those who have other people working for them. Some people don't want the responsibility of being the boss, or the owner. See the story of the Mexican fisherman at the end of this blog.

Nate commented:

You have a logical fallacy by lumping together the people that derided Heather's code and people that are involved in software craftsmanship. It's actually a huge leap of logic to make that connection, and it really detracts from the article.
As I point out later, the people who derided Heather's code were some of the same folks who hold up software craftsmanship. That wasn't me making that up.

Now you realise that you are planting your flag firmly in the 'craftsmanship' camp while propelling your position upwards by drawing a line in the sand to define another group of people as 'labourers'. Or, in other words, you attempt to elevate yourself by patronising others with a label that you think is a compliment. Maybe you do not realise this?
No, I realize it, and it's a fair critique, which is why I don't label myself as a "craftsman". I have more to say on this below.
However, have you considered that the craft is not how awesome and perfect you and your code are, but what is applicable for the task at hand? I think most people who you would put into either camp share the same mix of attributes, whether good or bad. The important thing is whether the solution created does what it is designed to do, is delivered on time for when it is needed, and, if the environment that the solution has been created for warrants it, that the code is easily understandable by yourself and others (that matter) so it can be developed further over time and maintained.
And the very people who call themselves "craftsmen" criticized a piece of code that, as near as I can tell, met all of those criteria. Hence my reaction that started this whole thing.
I don't wish to judge you, and maybe you are a great, smart guy who does good in the world, but like you I have not researched anything about you, I have simply read your assessment above and come to a conclusion, that's being human I guess.
Oh, people judge each other all the time, and it's high time we stopped beating them up for it. It's human to judge. And while it would be politically correct to say, "You shouldn't judge me before you know me", fact is, of course you're going to do exactly that, because you don't have time to get to know me. And the fact that you don't know me except but through the blog is totally acceptable--you shouldn't have to research me in order to have an opinion. So we're all square on that point. (As to whether I'm a great smart guy who does good in the world, well, that's for others to judge in my opinion, not mine.)
The above just sounds like more of the same 'elitism' that has been ripe in this world from playground to the workplace since the beginning.
It does, doesn't it? And hopefully I clarify the position more clearly later.

In It's OK to love your job, Chad McCallum says that

The basic premise (or at least the one the author starts out with) is that because there’s a self-declared group of “software craftspeople”, there is going to be an egotistical divide between those who “get it” and those who don’t.
Like it or not, Chad, that egotistical divide is there. You can "call bullshit" all day long, but look at the reactions that have popped up over this--people feel that divide, and frankly, it's one that's been there for a long, long time. This isn't just me making this up.

Chad also says,

It’s true the feedback that Heather got was unnecessarily negative. And that it came from people who are probably considered “software craftspeople”. That said, correlation doesn’t equal causation. I’m guessing the negative feedback was more because those original offenders had a bad day and needed to vent. And maybe the comments after that one just jumped on the bandwagon because someone with lots of followers and/or respect said it.

These are both things that can and have happened to anyone, regardless of the industry they work in. It’s extremely unfair to associate “someone who’s passionate about software development” to “person who’s waiting to jump on you for your mistakes”.

Unfortunately, Chad, the excuse that "others do it, too" is not an acceptable excuse. If everybody jumped off a cliff, would you do it, too? I understand the rationale--it's extremely hard being the one to go against the herd (I've got the psychological studies I can cite at you that prove it), but that doesn't make it OK or excuse it. Saying "it happens in other industries" is just an extension of that. In other industries, women are still explicitly discriminated against--does that make it OK for us to do that, too?

Chad closes his blog with "Stop calling us egotistical jerks just because we love what we do." To which I respond, "I am happy to do so, as soon as those 'craftsmen' who are acting like one, stop acting like one." If you're not acting like one, then there should be no argument here. If you're trying to tell me that your label is somehow immune to criticism, then I think we just have to agree to disagree.

Paul Pagel (on a site devoted to software craftsmanship, no less) responded as well with his Humble Pursuit of Mastery. He opens with:

I have been reading on blogs and tweets the sentiment that "software craftsmanship is elitism". This perception is formed around comments of code, process, or techniques. I understand a craftsman's earned sense of pride in their work can sometimes be inappropriately communicated.
I don't think I commented on code, process or technique, so I can't be sure if this is directly refuting what I'm saying, but I note that Paul has already touched on the meme he wants to communicate in his last phrase: the craftsman's "earned sense of pride". I have no problem with the work being something that you take pride in; I'll note, however, that "pride goeth before a fall", and that Ozymandias was justifiably proud of his accomplishments, too.

Paul then goes through a summation of his career, making sure to set in small caps certain terms with which I have no argument: "sacrifice", "listen", "practicing", "critique" and "teaching". And, in all honesty, these are things that I embrace, as well. But I start getting a little dubious about the sanctity of your terminology, Paul, when it's being used pretty blatantly as an advertising slogan and theme all over the site--if you want the term to remain a Zen-like pursuit, then you need to keep the commercialism out of it, in my opinion, or you invite the kind of criticism that's coming here (explicit or implicit).

Paul's conclusion wraps up with:

Do sacrificing, listening, practice, critiquing, and teaching sound like elitist qualities to you? Software craftsmanship starts out as a humble endeavor moving towards mastery. I won't let 140 or 1000 characters redefine the hours and years spent working hard to become a craftsman. It gave me humility and the confidence to be a professional software developer. Sometimes I let confidence get the better of me, but I know when that happens I am not honoring the spirit of craftsmanship in which I was trained.
Humility enough to trademark your phrase "Software is our craft"? Humility enough to call yourself a "driving force" behind software craftsmanship? Don't get me wrong, Paul, there is a certain amount of commercialism that any consultant must adopt in order to survive--but either please don't mix your life-guiding principles with your commercialism, or else don't be surprised when others take aim at your "humility" when you do. It's the same when ministers stand in a multi-million dollar building on a Sunday morning and talk about the parable of the widow giving away her last two coppers--that smacks of hypocrisy.

Finally, Matt van Horn wrote a rebuttal, Craftsmanship, in which he says:

there is an allusion to software craftsmen as being an exclusive group who agree on the “right” tools and techniques. This could not be further from the truth. Anyone who is serious about their craft knows that for every job there are some tools that are better and some that are worse.
... but then he goes right into making that exact mistake:
Now, I may not have a good definition of elegant code, but I definitely know it when I see it – regardless of who wrote it. If you can’t see that
(1..10).each{|i| puts i}

is more elegant than
x = 0
while true do
  x = x + 1
  if x > 10
    break
  end
  puts x
end
then you must be near the beginning of your journey towards mastery. Practicing your craft develops your ability to recognize these differences, just as a skilled tailor can more easily spot the difference between a bespoke suit and something from Men’s Wearhouse.
Matt, you kind of make my point for me. What makes it elegant? You take it as self-evident. I don't. As a matter of fact, I've been asking this question for some years now: "What makes code 'elegant', as opposed to 'ugly'?" Ironically, Elliott Rusty Harold just blogged about how this style of coding is dangerous in Java, and got crucified for it, but he has a point: functional style (your first example) doesn't JIT as well as the more imperative style right now on the JVM (or on the CLR, from what I can tell). Are you assuming that this will be running on a native Ruby implementation, on JRuby, IronRuby, ...? You have judged the code in the second example based on an intrinsic value system that you may have never questioned. To judge, you have to be able to explain your judgments in terms of that value system. And the fact that you judge without any context kind of speaks directly to the point I was trying to make: "craftsmen", it seems, have this tendency to judge in absence of context, because they are clearly "further down their journey towards mastery", to use your own metaphor.

Or, to put it much more succinctly, "Beauty is in the eye of the beholder".

Matt then tells me I missed the point of the samurai and tea master story:

Finally, he closes with a famous zen story, but he entirely misses the point of it. The story concerns a tea master, and a samurai, who get into a duel. The tea master prevails by bringing the same concentration to the duel that he brings to his tea ceremony. The point that Ted seems to miss here is that the tea master is a craftsman of the highest order. A master of cha-do (the way of tea) is able to transform the simple act of making and pouring a cup of tea into something transcendent by bringing to this simple act a clear mind, a good attitude, and years of patient, humble practice. Arguably he prevails because he has perfected his craft to a higher degree than the samurai has perfected his own. That is why he has earned the right to wear the garb of a samurai, and why he is able to face down his opponent.
Which, again, I find funny, because most Zen masters will tell you that the story--any Zen story, in fact--has no "definitive" meaning, but has meaning based on how you interpret it. (There are a few Zen parables that reinforce this point, but it gets a little meta to justify my understanding of a Zen story by quoting another Zen story.) How Matt chooses to interpret that parable is, of course, up to him. I choose to interpret the story thusly: the insulted samurai felt that his "earned sense of pride" in his sword mastery was insulted when the tea master--clearly no swordsman, as it says in the story--wore robes of a rank and honor that he had not earned. And clearly, the tea master was no swordsman. But what the tea master learned from his peer was not how to use his concentration and discipline to improve his own swordsmanship, but how to demonstrate that he had, in fact, earned a mark of mastery through an entirely different discipline than the insulted samurai's. The tea master still has no mastery of the sword, but in his own domain, he is an expert. This was all the insulted samurai needed to see: that the badge of honor had been earned, and not just imposed by a capricious (and disrespectful) lord. Put a paintbrush and canvas into the hands of a house painter, and you get pretty much a mess--but put a spray painter in the hands of Leonardo, and you still get a mess. In fact, to really do the parable justice, we should see how much "craft" Matt can bring when asked to paint a house, because house painting has about as much relevance to his craft as swordsmanship had to the tea master's. (All analogies fail eventually, by the way, and we're probably reaching the boundaries of this one.)

Billy Hollis is a master with VB, far more than I ever will be; I know C++ far better than he ever will. I respect his abilities, and he, mine. There is no argument here. But more importantly, there are friends I've worked with in the past who are masters with neither VB nor C++, nor any other programming language, but chose instead to sink their time and energy into skiing, pottery, or being a fan of a television show. They chose to put their energies--energies the "craftsmen" seem to say should be put towards their programming--towards things that bring them joy, which happen to not be programming.

Which brings me to another refrain that came up over and over again: You criticize the craftsman, but then you draw a distinction between "craftsman" and "laborer". You're confusing (or confused). First of all, I think it important to disambiguate along two axes: the degree to which someone chooses to invest their time into learning to write better software, as opposed to looking at writing code as "just" a job, and, along a second axis, the degree to which they have mastered programming. By your own definitions, "craftsmen": can one be early in one's mastery of programming and still be a "craftsman"? Can one be a master bowler who's just picked up programming and be considered a "craftsman"? Is the nature of "craftsmanship" a measure of your skill, or is it your dedication to programming, or is it your dedication to something in your life, period? (Remember, the tea master parable says that a master C++ developer will see the master bowler and respect his mastery of bowling, even though he can't code worth a crap. Would you call him a "craftsman"?)

Frankly, I will say, for the record, that I think there are people programming who don't want to put a ton of time and energy into learning how to be better programmers. (I suspect that most of them won't ever read this blog, either.) They see the job as "just a job", and are willing to be taught how to do things, but aren't willing to go off and learn how to do them on their own. They want to do the best job they can, because they, like any human being, want to bring value to the world, but don't have that passion for programming. They want to come in at 9, do their job, and go home at 5. These are those whom I call "laborers". They are the "fisherman" in the following story:

The businessman was at the pier of a small coastal Mexican village when a small boat with just one fisherman docked. Inside the small boat were several large yellowfin tuna. The businessman complimented the Mexican on the quality of his fish and asked how long it took to catch them. The Mexican replied only a little while.

The businessman then asked why he didn't stay out longer and catch more fish. The Mexican said he had enough to support his family's immediate needs. The businessman then asked, "But what do you do with the rest of your time?" The Mexican fisherman said, "I sleep late, fish a little, play with my children, take a siesta with my wife, Maria, stroll into the village each evening where I sip wine and play guitar with my amigos; I have a full and busy life, señor."

The businessman scoffed, "I am a Harvard MBA and I could help you. You should spend more time fishing and with the proceeds buy a bigger boat. With the proceeds from the bigger boat you could buy several boats; eventually you would have a fleet of fishing boats. Instead of selling your catch to a middleman, you would sell directly to the processor and eventually open your own cannery. You would control the product, processing and distribution. You would need to leave this small coastal fishing village and move to Mexico City, then LA and eventually New York City where you would run your expanding enterprise."

The Mexican fisherman asked, "But señor, how long will this all take?" To which the businessman replied, "15-20 years." "But what then, señor?" The businessman laughed and said, "That's the best part! When the time is right you would announce an IPO and sell your company stock to the public and become very rich. You would make millions." "Millions, señor? Then what?" The businessman said, "Then you would retire. Move to a small coastal fishing village where you would sleep late, fish a little, play with your kids, take a siesta with your wife, stroll to the village in the evenings where you could sip wine and play your guitar with your amigos."

What makes all of this (this particular subject, craftsmanship) particularly hard for me is that I like the message that craftsmanship brings, in terms of how you conduct yourself. I love the book Apprenticeship Patterns, for example, and think that anyone, novice or master, should read this book. I have taken on speaking apprentices in the past, and will continue to do so well into the future. The message that underlies the meme of craftsmanship--the constant striving to improve--is a good one, and I don't want to throw the baby out with the bathwater. If you have adopted "craftsmanship" as a core value of yours, then please, by all means, continue to practice it! Myself, I choose to do so, as well. I have mentored programmers, I have taken speaking apprentices, and I strive to learn more about my craft by branching my studies out well beyond software--I am reading books on management, psychology, building architecture, and business, because I think there is more to software than just the choice of programming language or style.

But be aware that if you start telling people how you're living your life, there is an implicit criticism or expectation that they should be doing that, as well. And when you start criticizing other peoples' code as being "unelegant" or "unbeautiful" or "unclean", you'd better be able to explain your value system and why you judged it so. Humility is a hard, hard path to tread, and one that I have only recently started to see the outlines of; I am guilty of just about every sin imaginable when it comes to this subject. I have created "elegant" systems that failed their original intent. I have criticized "ugly" code that, in fact, served the purpose well. I have bragged of my own accomplishments to those who accomplished a lot more than I did, or ever will. And I consider it amazing to me that my friends who've been with me since long before I started to eat my justly-deserved humble pie are still with me. (And those friends are some amazing people in their own right; if a man is judged by the company he keeps, then by looking around at my friends, I am judged to be a king.) I will continue to strive to be better than I am now, though, even within this discussion right now: those of you who criticized my post, you have good points, all of you, and I certainly don't want to stop you from continuing on your journeys of self-discovery, either.

And if we ever cross paths in person, I will buy you a beer so that we can sit down, and we can continue this discussion in person.


.NET | C# | C++ | Conferences | Development Processes | F# | Industry | Java/J2EE | Languages | Objective-C | Parrot | Personal | Reading | Review | Ruby | Scala | Social | Windows

Friday, January 25, 2013 10:24:27 PM (Pacific Standard Time, UTC-08:00)
 Wednesday, January 23, 2013
On the Dark Side of "Craftsmanship"

I don't know Heather Arthur from Eve. Never met her, never read an article by her, seen a video she's in or shot, or seen her code. Matter of fact, I don't even know that she is a "she"--I'm just guessing from the name.

But apparently she got quite an ugly reaction from a few folks when she open-sourced some code:

So I went to see what people were saying about this project. I searched Twitter and several tweets came up. One of them, I guess the original one, was basically like “hey, this is cool”, but then the rest went like this:
"I cannot even make this stuff up." --@steveklabnik
"Ever wanted to make sed or grep worse?" --@zeeg
"@steveklabnik or just point to the actual code file. eyes bleeding!" --@coreyhaines
At this point, all I know is that by creating this project I’ve done something very wrong. It seemed like I’d done something fundamentally wrong, so stupid that it flabbergasts someone. So wrong that it doesn’t even need to be explained. And my code is so bad it makes people’s eyes bleed. So of course I start sobbing.
Now, to be fair, Corey later apologized. But I'm still going to criticize the response. Not because Heather's a "she" and we should be more supportive of women in IT. Not because somebody took something they found interesting and put it up on github for anyone to take a look at and use if they found it useful. Not even because it might be good code that they called bad code, or vice versa. (To be honest, I haven't even looked at the code--that's how immaterial it is to my point.)

I'm criticizing because this is what "software craftsmanship" gets us: an imposed segregation of those who "get it" from those who "don't" based on somebody's arbitrary criteria of what we should or shouldn't be doing. And if somebody doesn't use the "right" tools or code it in the "right" way, then bam! You clearly aren't a "craftsman" (or "craftswoman"?) and you clearly don't care about your craft and you clearly aren't worth the time or energy necessary to support and nourish and grow and....

Frankly, I've not been a fan of this movement since its inception. Dave Thomas (Ruby Dave) was on a software panel with me at a No Fluff Just Stuff show about five years ago when we got on to this subject, and Dave said, point blank, "About half of the programmers in the world should just go take up farming." He paused, and in the moment that followed, I said, "Wow, Dave, way to insult half the room." He immediately pointed out that the people in the room were part of the first half, since they were at a conference, but it just sort of underscored to me how high-handed and high-minded that kind of talk and position can be.

Not all of us writing code have to be artists. Frankly, in the world of painting, there are those who will spend hours and days and months, tiny brushes in hand, jars of pigment just one lumen different from one another, laboring over the finest details, creating just one piece... and then there are those who paint houses with paint-sprayers, out of cans of mass-produced "Cream Beige" found at your local Lowe's. And you know what? We need both of them.

I will now coin a term that I consider to be the opposite of "software craftsman": the "software laborer". In my younger days, believing myself to be one of those "craftsmen", a developer who knew C++ in and out, who understood memory management and pointers, who could create elegant and useful solutions in templates and classes and inheritance, I turned up my nose at those "laborers" who cranked out one crappy app after another in (what else?) Visual Basic. My app was tight, lean, and well-tuned; their apps were sloppy, bloated, and ugly. My app was a paragon of reused code; their apps were cut-and-paste cobbled-together duct-tape wonders. My app was a shining beacon on a hill for all the world to admire; their apps were mindless drones, slogging through the mud.... Yeah, OK, so you get the idea.

But the funny thing was, those "laborers" were going home at 5 every day. Me, I was staying sometimes until 9pm, wallowing in the wonderment of my code. And, I have to wonder, how much of that was actually not the wonderment of my code, but the wonderment of "me" over the wonderment of "code".

Speaking of which, by the way, there appear to be the makings of another such false segregation, in the area of "functional programming". In defense of Elliott Rusty Harold's blog the other day (which I criticized, and still stand behind, for the reasons I cited there), there are a lot of programmers who are falling into the trap of thinking that "all the cool kids are using functional programming, so if I want to be a cool kid, I have to use functional programming too, even though I'm not sure what I'm doing....". Not all the cool kids are using FP. Some aren't even using OOP. Some are just happily humming along using good ol' fashioned C. And producing some really quality stuff doing so.

See, I have to wonder just how much of the software "craftsmanship" being touted isn't really a narcissistic "Look at me, world! Look at how much better I am because I care about what I do! Look upon my works, ye mighty, and despair!" kind of mentality. Too much of software "craftsmanship" seems to be about the "me" part of "my code". And when I think about why that is, I come to an interesting assertion: That if we take the name away from the code, and just look at the code, we can't really tell what's "elegant" code, what's "hack" code, and what was "elegant hack because there were all these other surrounding constraints outside the code". Without the context, we can't tell.

A few years after my high point as a C++ "craftsman", I was asked to do a short, one-week programming gig/assignment, and the more I looked at it, the more it screamed "VB" at me. And I discovered that what would've taken me probably a month to do in C++ was easily accomplished in a few days in VB. I remember looking at the code, and feeling this sickening, sinking sense of despair at how stupid I must've looked, crowing. VB isn't a bad language--and neither is C++. Or Java. Or C#. Or Groovy, or Scala, or Python, or, heck, just about any language you choose to name. (Except Perl. I refuse to cave on that point. Mostly for comedic effect.)

But more importantly, somebody who comes in at 9, does what they're told, leaves at 5, and never gives a rat's ass about programming except for what they need to know to get their job done, I have respect for them. Yes, some people will want to hold themselves up as "painters", and others will just show up at your house at 8 in the morning with drop cloths. Both have their place in the world. Neither should be denigrated for their choices about how they live their lives or manage their careers. (Yes, there's a question of professional ethics--I want the house painters to make sure they do a good job, too, but quality can come just as easily from the nozzle of a spray painter as it does from the tip of a paintbrush.)

I end this with one of my favorite parables from Japanese lore:

Several centuries ago, a tea master worked in the service of Lord Yamanouchi. No-one else performed the way of the tea to such perfection. The timing and the grace of his every move, from the unfurling of the mat, to the setting out of the cups, and the sifting of the green leaves, was beauty itself. His master was so pleased with his servant, that he bestowed upon him the rank and robes of a Samurai warrior.

When Lord Yamanouchi travelled, he always took his tea master with him, so that others could appreciate the perfection of his art. On one occasion, he went on business to the great city of Edo, which we now know as Tokyo.

When evening fell, the tea master and his friends set out to explore the pleasure district, known as the floating world. As they turned the corner of a wooden pavement, they found themselves face to face with two Samurai warriors.

The tea master bowed, and politely stepped into the gutter to let the fearsome ones pass. But although one warrior went by, the other remained rooted to the spot. He stroked a long black whisker that decorated his face, gnarled by the sun, and scarred by the sword. His eyes pierced through the tea master’s heart like an arrow.

He did not quite know what to make of the fellow who dressed like a fellow Samurai, yet who would willingly step aside into a gutter. What kind of warrior was this? He looked him up and down. Where were the broad shoulders and the thick neck of a man of force and muscle? Instinct told him that this was no soldier. He was an impostor who by ignorance or impudence had donned the uniform of a Samurai. He snarled: “Tell me, oh strange one, where are you from and what is your rank?”

The tea master bowed once more. “It is my honour to serve Lord Yamanouchi and I am his master of the way of the tea.”

“A tea-sprout who dares to wear the robes of Samurai?” exclaimed the rough warrior.

The tea master’s lip trembled. He pressed his hands together and said: “My lord has honoured me with the rank of a Samurai and he requires me to wear these robes.”

The warrior stamped the ground like a raging bull and exclaimed: “He who wears the robes of a Samurai must fight like a Samurai. I challenge you to a duel. If you die with dignity, you will bring honour to your ancestors. And if you die like a dog, at least you will no longer insult the rank of the Samurai!”

By now, the hairs on the tea master’s neck were standing on end like the feet of a helpless centipede that has been turned upside down. He imagined he could feel that edge of the Samurai blade against his skin. He thought that his last second on earth had come.

But the corner of the street was no place for a duel with honour. Death is a serious matter, and everything has to be arranged just so. The Samurai’s friend spoke to the tea master’s friends, and gave them the time and the place for the mortal contest.

When the fierce warriors had departed, the tea master’s friends fanned his face and treated his faint nerves with smelling salts. They steadied him as they took him into a nearby place of rest and refreshment. There they assured him that there was no need to fear for his life. Each one of them would give freely of money from his own purse, and they would collect a handsome enough sum to buy the warrior off and make him forget his desire to fight a duel. And if by chance the warrior was not satisfied with the bribe, then surely Lord Yamanouchi would give generously to save his much prized master of the way of the tea.

But these generous words brought no cheer to the tea master. He thought of his family, and his ancestors, and of Lord Yamanouchi himself, and he knew that he must not bring them any reason to be ashamed of him.

“No,” he said with a firmness that surprised his friends. “I have one day and one night to learn how to die with honour, and I will do so.”

And so speaking, he got up and returned alone to the court of Lord Yamanouchi. There he found his equal in rank, the master of fencing, who was skilled as no other in the art of fighting with a sword.

“Master,” he said, when he had explained his tale, “Teach me to die like a Samurai.”

But the master of fencing was a wise man, and he had a great respect for the master of the Tea ceremony. And so he said: “I will teach you all you require, but first, I ask that you perform the way of the Tea for me one last time.”

The tea master could not refuse this request. As he performed the ceremony, all trace of fear seemed to leave his face. He was serenely concentrated on the simple but beautiful cups and pots, and the delicate aroma of the leaves. There was no room in his mind for anxiety. His thoughts were focused on the ritual.

When the ceremony was complete, the fencing master slapped his thigh and exclaimed with pleasure: “There you have it. No need to learn anything of the way of death. Your state of mind when you perform the tea ceremony is all that is required. When you see your challenger tomorrow, imagine that you are about to serve tea for him. Salute him courteously, express regret that you could not meet him sooner, take off your coat and fold it as you did just now. Wrap your head in a silken scarf and do it with the same serenity as you dress for the tea ritual. Draw your sword, and hold it high above your head. Then close your eyes and ready yourself for combat.”

And that is exactly what the tea master did when, the following morning, at the crack of dawn, he met his opponent. The Samurai warrior had been expecting a quivering wreck, and he was amazed by the tea master’s presence of mind as he prepared himself for combat. The Samurai’s eyes were opened, and he saw a different man altogether. He thought he must have fallen victim to some kind of trick or deception, and now it was he who feared for his life. The warrior bowed, asked to be excused for his rude behaviour, and left the place of combat with as much speed and dignity as he could muster.

(excerpted from http://storynory.com/2011/03/27/the-samurai-and-the-tea-master/)

My name is Ted Neward. And I bow with respect to the "software laborers" of the world, who churn out quality code without concern for "craftsmanship", because their lives are more than just their code.


.NET | Android | C# | C++ | Conferences | Development Processes | F# | Industry | Java/J2EE | Languages | LLVM | Objective-C | Parrot | Personal | Reading | Ruby | Scala | Social | Visual Basic | Windows

Wednesday, January 23, 2013 9:06:24 PM (Pacific Standard Time, UTC-08:00)
 Monday, January 21, 2013
On Functional Programming in Java

Elliott Rusty Harold is blogging that functional programming in Java is dangerous. He's wrong, and he's way late to the party on this one--it's coming to Java whether he likes it or not.

Go read his post first, while I try to sum up the reasons he cites for saying it's dangerous:

  1. Java is not a lazily-evaluated language. Programmers in Java will screw up and create heap and stack errors as a result.
  2. See? Here's a naive implementation of Clojure code taken directly over to Java and look how it blows up.
  3. Programmers can do bad things with this idea, so therefore we should avoid it.
  4. Oh, and by the way, it's "dangerously inefficient" in Java/JVM, even though I offer no perf benchmarks or comparisons to back this statement, and I'm somehow ignoring that Clojure and Scala run on the JVM as well, apparently without problem.
That about sums it up, I think.

Look, as Elliott points out, Java is not Haskell. Neither is it Lisp. It's its own language, rooted in imperative and object-oriented history, but no less able to incorporate functional features into its development than Lisp could incorporate object-oriented features. However, if you do stupid things, like trying to re-create an infinite (implicitly lazily-evaluated) list in Clojure by creating an actualized list that stretches to infinity... you're going to blow the JVM up. Duh. Not even the supercomputer on the USS Enterprise five hundred years from now will be able to construct that list.

Porting code from one language to another is not a trivial exercise. If you attempt to port line-for-line and expression-for-expression, you can expect that your ported code will not be idiomatically correct. (I know this already, having done the exercise myself.) The root of the problem in his ported code is twofold. One, he (rather foolishly and in elegant strawman fashion) badly simulates what an infinite list would look like in Java--a commenter does the better job by showing how an Iterator can be made to perform the same thing that Haskell actually does under the hood, producing the next value on demand rather than trying to create a list of Integers stretching to infinity. For someone who professes to have some Haskell experience and love, it surprises me that Elliott makes this kind of mistake, which leads me to conclude that he's trying to create the strawman. Two, he assumes that anyone who programs in Java functionally will have to create all of their functional tools by hand, and frankly, using Guava or FJ here would make this code sample a LOT easier to swallow. The fact that he ignores both of these in his strawman again sort of reinforces the idea that he's deliberately crippling his strawman to make his point.
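(To make the commenter's point concrete, here's a minimal sketch of the idea--my own illustration, not the commenter's actual code, and the class name "Naturals" is mine: an Iterator that produces each value on demand, so nothing ever gets materialized beyond the values you actually consume.)

import java.util.Iterator;

// A lazily-produced "infinite" sequence of integers: hasNext() is always
// true, and each value is computed only when next() is called, so nothing
// beyond the values actually consumed ever exists in memory.
public class Naturals implements Iterable<Integer> {
    public Iterator<Integer> iterator() {
        return new Iterator<Integer>() {
            private int next = 1;
            public boolean hasNext() { return true; } // conceptually infinite
            public Integer next() { return next++; }
            public void remove() { throw new UnsupportedOperationException(); }
        };
    }

    public static void main(String[] args) {
        int taken = 0;
        for (int n : new Naturals()) {   // pulls values one at a time
            System.out.println(n);
            if (++taken == 10) break;    // "take 10", then stop producing
        }
    }
}

No infinite list, no blown heap--just a cursor that knows how to compute the next value when asked for it.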

His underlying point, though, seems to be simple: "I work with bad programmers, who don't seem to understand how to write code functionally in Java without screwing it up." Dude. Sucks to be you. "Bad programmers will move heaven and earth to do the wrong thing." --Glenn Vanderburg.

What really sucks for him is that these features are coming in Java 8, including lambda expressions and library support including a Stream interface that will allow for exactly this kind of code to be written without pain. Those programmers Elliott is working with are going to be even more on fire to use their functional approaches (and all the associated goodness of doing so, including composability and what-not) in their Java code. What might make Elliott more happy is that at least they won't have written it; it'll all be written by guys much smarter than any of them.
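(For the curious, here's roughly what that will look like--a sketch against the Java 8 Stream API as it stands in the current previews, so the details may yet shift before release:)

import java.util.stream.IntStream;

public class Take10 {
    public static void main(String[] args) {
        // An effectively-infinite, lazily-evaluated stream of integers;
        // limit() guarantees only the first ten values are ever produced.
        IntStream.iterate(1, i -> i + 1)
                 .limit(10)
                 .forEach(System.out::println);
    }
}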


Android | Java/J2EE | Languages | Personal | Review | Scala

Monday, January 21, 2013 2:10:14 PM (Pacific Standard Time, UTC-08:00)
 Monday, January 07, 2013
Thoughts on a CodeMash Gone By

A year ago today (roughly), I gave the opening keynote at CodeMash 2.0.1.2. For those of you who were there, I don't think I need to tell you what happened. For those of you who weren't there, you probably still heard about it, thanks to the Twitterstream of comments and counter-comments that followed. I've more or less tried to keep quiet about it since that time, trying to just let the furor die down (and it did, pretty quickly, I thought) out of respect to the conference organizers.

But with the show starting up again this week, and there having been a few people over the last twelve months who've asked me about "what the f*ck were you thinking" (whether that was in deliberate pun/jest or not, I can't always tell), and most importantly, now that I know that Jim and I are square with each other (thanks to a Twitter conversation a few days ago), I figure it's time to come clean and tell my side of the story.

TL;DR: If I had the chance to do the keynote over again, I'd do it differently.

(By the way, the rest of this post does have a few profanities in it, so if you're offended by that sort of thing, this is a good place to stop reading. Or, as the movies would say, this post is rated PG-13 for adult language.)

As a speaker, I have always sought to create a "persona" on stage that allowed me the maximum freedom of expression and opportunity to get my point across. A long time ago, when I started teaching at DevelopMentor, I learned from some of the best--one of those best being, of course, Don Box, but another of those was Ted Pattison. It was he who taught me that "If you can make 'em laugh, you can do whatever you want to them" (meaning the audience). He demonstrated this quite graphically by guest-lecturing in one of my classes once, early in my tenure as a DM instructor, and promptly castrated one of the students who was constantly irritating the class (and me) with off-topic questions. It was an eye-opening experience. Later, Don mentioned in passing that what we did was "equal parts education and entertainment". Education because, yes, it's what we do, but entertainment, too, because if the room falls asleep, then they're not getting educated.

And folks, I've sat in those chairs, I know how boring talks can be sometimes. And that sometimes, despite your best efforts, no matter how interesting the material, it can just be sooooo easy to pop open the laptop and do some email. Or write some code. Or even let the ambient warmth of the room in a post-lunch talk just... make... eyes... so heavy.... I get it, really.

So I decided, quite consciously, to develop a speaking persona that was a little on the edge, a little outrageous, a little "over the top", because then that persona gave me the freedom to do some of the crazy things that would keep the crowd awake and on its toes. I stand people up from the audience and use them in my demos. I write code on the fly based on their questions, and I try to use examples that allow for a certain amount of "Wow, that was weird, so I'll remember it better" in the demo itself. Case in point: when writing code to demonstrate delegates and events in C#, I would use the idea of a "Rock Band" and its fan club, which, of course, must include groupies.

Is it politically correct to talk about groupies in a professional programming classroom setting? Probably not. Did anybody complain? Never heard one, directly or indirectly. Part of that, I believe, was because they got the point of the demo, and that was the point. Not that I was advocating groupie-ism, or that rock bands were more interesting than programming, but that the domain was easy enough to grip in their heads, and that made the result (loose coupling between event generators and consumers, in the case of delegates and events) more easily understood.

Analogies, for me, are never gratuitous. I choose my analogies quite carefully, and try to be very clear about where and when they do break down, because all analogies break down eventually. Even my most famous analogy breaks down, as many people have pointed out: nobody has ever died from O/R-Ms. Yep. But your wife's eyes were never burning balls of superheated plasma billions of light years away, either.

Point is, I deliberately seek ways to keep you entertained. And you know what? Entertainment often comes, in this case, from making the room laugh, and humor most often derives from the unexpected. And what's more unexpected than a profanity dropped at the most unexpected moment?

You don't have to agree with that sentiment to realize that it's FUCKING true.

When I got up to speak at CodeMash, I wanted very badly for this to be the best damn keynote I'd ever done in my life up to that point. I wanted the room to rock. Buzzing. Yes, I wanted to succeed very, very badly. It was an early-morning keynote, first one of the show. People were still milling around, there was a lot of background noise. People were still eating breakfast and waking up. And when Keith Elder, just before he introduced me to the crowd, whispered (I'm paraphrasing here) "Put some energy into this crowd, would ya?", I said to myself, "Oh, yeah. I'm on it."

A little TOO on it, as it turns out. I went way overboard. Brian Prince counted 18 f-bombs that day. Others counted, as well; lowest total I heard was 13, highest was 23. Needless to say, it was a carpet bombing to rival anything we ever did to North Vietnam. Made Dresden look like a weenie roast. (There's probably a Hiroshima joke in there too somewhere, but you get the point.)

The interesting thing about profanity used like that, however, is that it loses its efficacy. Profanities have to be spaced out, chosen carefully, or they lose their impact. Which was, of course, exactly what happened. A profanity isn't going to have the 'unexpected' effect if it's coming every other minute or so, no matter how hard you try.

The result? Kind of predictable. Not my best results. For which I am most heartily sorry. I so wanted that keynote to go off so well, and it didn't, and I'm sorry.

For three hours after the keynote was over, as the Twitterstream was dissecting me for all that, I lay on the couch in my hotel room, bordering on tears. Seriously.

Had I the chance to do the keynote over again, you'd better damn well believe that I'd do it differently. Would I cut out all the profanity entirely? Nope. That's a part of my speaking persona, and anyone who brings me to a conference without knowing that probably didn't do their homework about me as a speaker beforehand. (It's not like there aren't ample opportunities to see me speaking in person, or videos of the same.) But somebody suggested not too long ago that maybe it wouldn't be a bad idea to warn people ahead of time, and yep, that's a great idea. Because (and for this, I am really even more sorry) sometimes kids are in the room, as was the case at CodeMash, and they shouldn't have to hear it unless their parents are OK with it, and I didn't give their parents (or any attendees that felt the same way) an opportunity to "opt out" if they so chose.

I could, I suppose, hide behind the excuse that "We were all adults, we should be able to handle that kind of language", but in the case of the kids, that wasn't the case. Even then, in the case of the adults, you still should be given an opportunity to opt out.

More critically, if the message got lost because of the messenger's choice of words, then I failed as a speaker. And that, my friends, is where the real frustration for me lies--not with the words I used in and of themselves, but in that the message--that we as an industry have to break out of our 'box-arrow-box-arrow-cylinder' habits and modes of thinking--got lost for so many people. That is how I failed most of all, and it is on those grounds that I say, once again, I am sorry.

To you, Jim, and to the rest of the CodeMash staff, I am particularly sorry. CodeMash is your baby, and I gave it a black eye.

To the attendees of CodeMash 2.0.1.2, I am sorry if my language offended you and distracted you from the message I was trying to deliver. I hope that you were able to get past it and enjoy the rest of the show. I think a lot of you did--many came up to me afterwards, but it was such a small fraction of the total I don't want to assume anything.

Enjoy CodeMash 2.0.1.3. With any luck, I'll see you there next year: hopefully a little wiser, but still just as FUCKING outrageous as I have always been, only this time, with an up-front disclaimer.

Flame away.


.NET | C# | Conferences | F# | Industry | Personal | Review | Social | Windows

Monday, January 07, 2013 3:23:42 PM (Pacific Standard Time, UTC-08:00)
Comments [4]  | 
 Wednesday, December 26, 2012
Thoughts on my new Surface

As a post-Christmas gift to myself, I took a bit of the money that my folks gave us and bought myself a 64GB Surface. Couple of thoughts came to mind as I've sat down to play with this thing:

  1. Microsoft doesn't sell a 64GB model with a Type keyboard? I know the touch-thing is, like, the new hotness with everyone, but frankly, having played with a friend's Surface and his (preferred) Touch keyboard cover, I think both he and Microsoft are smoking some serious crack if they think anyone can seriously touch-type on the touch keyboard. (To be fair, it's not just Microsoft, either--I can't effectively touch-type on my iPad or Galaxy Tab, either. I need the tactile feedback from the spring underneath the key and the edges of the keys themselves to know if I hit the key squarely or not.) More importantly, why on earth does Microsoft think that people buying the 64GB model won't want the Type cover? Or is this an insidious ploy to force me to accept a bundle (the 64GB model apparently only comes bundled with a Touch cover--you can't buy it with no cover at all) that I don't want? It certainly worked--I bought the 64GB with Touch cover for $699, then the Type cover by itself for another $129. (Let the conspiracists go crazy with that one.)
  2. The packaging is awfully reminiscent of the iPad/iPhone/iPod packaging style. Nice to see that Microsoft can leverage good ideas. ;-)
  3. So I fire this thing up, and the first thing I'm told is that there are 15 updates waiting. I'm all for keeping bits fresh and current and fixed, but this seems a bit excessive--why do so many apps need an update so quickly after the device's initial release? What's worse, the Store app doesn't tell you what these updates are for, as near as I can tell, so you can't tell which ones are crucial and which ones are just cosmetic. Kind of a fail there.
  4. Wait, how do I right-click on this thing? Or has Microsoft finally come to the realization that one mouse button is all you need right about the time that Apple seems ready to accept that two buttons are, in fact, a superior way of life?
  5. The form factor on this thing is a little bit larger than I expected for some reason. Not that I didn't really know how big it is (and it's not really all that big, at least not when compared to the Samsung tablet they gave us at //Build/ two years ago), but for some reason it just feels bigger than it is.
  6. The keyboard makes me think of it as a laptop, not a tablet. I find myself wanting to go download Visual Studio and put a stripped-down version of it on here. (I even asked my buddy who had a Surface if he'd managed to do that yet, and he--gently--reminded me that since this is Windows RT, and an ARM processor, it won't run on here.)
  7. Because I still wasn't convinced that this isn't a laptop, I tried to download Dropbox onto here. The Surface let me download the whole thing, then told me "This app cannot run on Surface". D'oh! Busted. I am an idiot.
  8. But no Dropbox on here? Really, Microsoft? This seems like a fairly major oversight. I know, Sinofsky was not a "team player", but he's gone now: Find the Dropbox team, give them a ton of money and a few "We're sorry, we won't shut you out again, we promise" mea culpas, and get one of the most popular productivity apps on the planet on this thing. Seriously.
  9. And while we're fixing things, can we please get the Store to be a little more responsive? I know the UX here is going for a "minimalist" vibe, but some part of me wants to see some whirlygigs or something going on while I'm downloading apps. (I, of course, will probably regret this in two years, and vehemently deny saying this when the whirlygigs make me long for a clean and simple interface after Microsoft jazzes it all up to the point of migraine-inducing snazziness.)
  10. And why did the Store hang in the middle of doing my 15 updates and 4 app downloads? It may have been the Internet connection (I'm sitting in a restaurant as I do this, and restaurant WiFi is on par with hotel WiFi in its reliability and bandwidth), but if it is, give me some kind of indication and don't lock me out of doing anything. (The screen became entirely unresponsive.) That's silly.
  11. Oh, and Evernote? After you install and start downloading my notes, same thing--don't get all silent on me and not tell me what's going on.
  12. Wait, Word and Excel and PowerPoint and OneNote are just Office 2013 previews? Not the real thing? Interesting--will I get a free update when those go live, or is this just another "play for free for 90 days, then we soak you for money" kind of arrangement? (And if so, will I be able to use an MVP MSDN key to update/upgrade/install them?)
  13. And now, post-reboot, Store won't launch--it just goes into the spinning circle of deathly dots. (Did I just coin that phrase? Can I copyright it?)
All in all, in the hour or so I've had it, it's not been a terrible experience, but I can't say it's been "sublime" or "world-changing". I'm glad I have it, because once I get a system worked out whereby I can easily share files back and forth between my Surface and the rest of my machines (yes, Mr. Ballmer, I know about SkyDrive, I just haven't been using mine and have to figure out how and where and when I would shift things back and forth between it and Dropbox), I look forward to giving this thing a spin for some of my upcoming blog entries and articles.

Which reminds me: whichever of BitBucket or GitHub manages to bring git or Mercurial over to the Surface (and iPad, and Android) will be a hell of a first-mover on integrating source control into peoples' daily lives. Can you imagine if GitHub and Dropbox joined forces? That would be interesting.


Conferences | Industry | Personal | Reading | Review | Social | Windows

Wednesday, December 26, 2012 6:02:26 PM (Pacific Standard Time, UTC-08:00)
Comments [0]  | 
 Thursday, December 20, 2012
Envoy (in Scala, JavaScript, and more)

A little over a decade ago, Eugene Wallingford wrote a paper for the PLoP '99 conference, describing the Envoy pattern language, "a pattern language for managing state in a functional program". It's a good read, but the implementation language for the paper is Scheme--which, being a Lisp dialect, often isn't particularly obvious or easy to understand at first--so I thought it might be interesting (both for me and any readers that wanted to follow along) to translate the implementation examples into a variety of different languages. In this case, I thought it would be relatively easy to do it in Scala and F#, given their hybrid object-functional nature, but it's also an interesting exercise to demonstrate it in JavaScript (I'll use NodeJS v0.8.15, running on my Mac, and Rhino, with the JVM), Yeti (an ML dialect that runs on the JVM), Jaskell (a Haskell dialect that also runs on the JVM), and, hey, what the heck, let's do it in C# while we're at it, just so the .NET guys don't feel too badly outnumbered.

(I'm posting this now with the intent of filling in the Yeti, Jaskell, F# and C# implementations later.)

Note that with lambdas coming in Java 8, it'll be possible to adapt this pattern language to work with Java, too--I'll leave that as an exercise for myself (and update this blog entry) once I get a Java8 build on the machine on which I'm writing this.

One reason for doing this in Yeti and Jaskell is to demonstrate the original purpose of the Envoy pattern language--that we can achieve object-like semantics even in languages that don't directly support object semantics (like Scheme). But for the other languages, it's fair to ask why anyone would bother doing this in languages that do directly support objects (a la Scala, F#, etc), since it would seem a lot easier to just use the object features directly. And, truth be told, it's true--when looking to model objects in a language that has first-class support for objects, just use that support and those features, and call it a day. The point of this exercise is, for me, to exercise the functional features of those languages, and see exactly how functional languages can provide some of the same benefits that an O-O language enjoys, without having to use the O-O features directly. (There's been a lot of people writing functional-isms in O-O languages, yours truly included, so it seems a good exercise to flip that on its head.) This will also help me figure out where/when/how to use these features, when the need arises.

If you've not yet read the Envoy pattern language, take a moment and do that now; I don't want to annoy Mr. Wallingford in any way by repeating his prose here (not to mention that I'm going to have enough to do as it is just translating the code into several different languages). But I will toss in a brief summary of each of the elements in the pattern language, just so we're all on the same page about what's happening in each of these code samples.

Implementation notes

These are a few notes for each of the implementation languages.

JavaScript

Because I want to be able to run the JavaScript code on either the Node platform directly or on the Rhino engine (via the Java JDK "jrunscript" command that installs on Java implementations starting with Java 6), and because those two environments provide different mechanisms for printing to the console ("console.log" in Node, and "println" in Rhino), I create a top-level function "out" that aliases to one or the other of those, depending on what's defined in the environment:

var out = (function() {
  if (typeof(console) !== "undefined" &&
      typeof(console.log) !== "undefined")
    return console.log
  else if (typeof(println) !== "undefined")
    return println
  else
    throw new Error("No idea what to use for output")
})();

(This actually gives away one of the punchlines in the first element of the pattern language, Function as Object, below, because here we're pretty clearly using "out" as a function-as-object.)

I used the Rhino engine that ships with Java6, and Node v0.8.15, for these.

Scala

I used Scala v2.9.2 running on Java6 for this.

Yeti

Yeti is an ML-based language that compiles to Java bytecode. Unlike Scala, it's a functional-only language (well, sort of), with Hindley-Milner type inference. As the Yeti home page describes, it supports polymorphic structure and variant types, property fields, lazy lists, pattern-matching on values, and a decent interop facility against Java code (meaning it can call Java classes, as well as compile to classes to be called from Java if desired.)

Yeti was at v0.9.7 at the time I wrote this, and again, running on the Java6 VM.

F#

I'm using the Visual Studio 2012 release to write the F# bits, which corresponds to F# 3.0. As far as I can tell, there's nothing really all that "3.0-specific" that I'm using, so it should work with F# 2.0, which shipped with Visual Studio 2010, and there's nothing Windows-specific here either, which means it should run fine on F# 3.0-on-Mono.

Note that, like what I'm doing with the JavaScript version, I'm binding each of the pattern elements into a function for execution, thus creating a scope block that is dissociated from the larger global scope:

let example = fun () ->
    Console.WriteLine "Howdy world"

If (like I tried once) we were to use the more naive approach:

let example =
    Console.WriteLine "Howdy world"

... then each of the functions is executed and the results bound to the name described ("example", in this case) at the time the compiler sees it; in other words, each is eagerly-evaluated, instead of waiting to be invoked in the main entry point of the program later. By binding an anonymous function literal, it essentially lazy-fies each of them, and won't execute them until they are deliberately invoked in Main, as in:

[<EntryPoint>]
let main argv = 
    example()
    // ... the others go here
    0 // return an integer exit code

With the platform (and prelude) details out of the way, let's begin.

C#

Wow, the C# version is going to be ugly. Let me explain what I mean.

Let's start with the syntax for an anonymous function literal (a lambda, in C# parlance):

() => { return 5; };

This is a function that takes no arguments and yields an int. (To be precise, this is a statement-bodied lambda; an expression-bodied lambda wouldn't need the explicit return or the curly braces, since the result of a single-expression body is implicitly returned.)

Ideally, we'd be able to capture this in an implicitly-typed local variable, like so:

var giveMeFive = () => { return 5; };

But unfortunately, C# doesn't allow this, saying that it "Cannot assign lambda to implicitly-typed local variable". (Doesn't get much more straightforward than that when it comes to an error message.) So, we have to explicitly-type the local variable, which is a Func<> of some type:

Func<int> giveMeFive = () => { return 5; };

Hold on to this thought, because things are going to get even uglier when we want to invoke an anonymous block like this later (when we get into the Closure parts of the pattern language).
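By way of contrast--and this comparison will come up repeatedly below--the equivalent binding in F# needs no explicit Func<> annotation at all, since F#'s type inference handles it. A one-line aside, purely for illustration:

let giveMeFive = fun () -> 5    // inferred as unit -> int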

Function as Object

In pure functional languages, it's actually difficult to keep state and data tied together--in fact, part of the whole point of a functional language is to write functions that operate on data, ideally on lots of different kinds of data. "Therefore, create a function that acts like an object. Such a function carries the data it needs along with the expression that operates on the data. More importantly, an object encapsulates its data, ensuring that only the allowed operations are applied to them." In other words, by writing a function and keeping the data buried inside of it, we achieve the same kind of encapsulation that object-orientation has traditionally reserved for itself as its principal advantage. This is done via a closure, which is the next element in the language.

Scheme:

The original Scheme implementation looked like this:

(define balance 0)
(define withdraw
  (lambda (balance amount)
    (if (<= amount balance)
      (- balance amount)
      (error "Insufficient funds" balance))
  ))
(define deposit
  (lambda (balance amount)
    (+ balance amount)
  ))
(define accrue-interest
  (lambda (balance interest-rate)
    (+ balance (* balance interest-rate))
  ))

There's a few things wrong with this approach, as Wallingford points out, but to be faithful, recreating this in our target languages is pretty straightforward: three functions, each of which operate on parameters passed in. "You could create new accounts simply by binding values to names. Operating on accounts involves passing the account to the appropriate procedure and binding the new value as appropriate."

JavaScript:

In JavaScript, we can bind function values to names just as we can in Scheme, so it's not actually all that different, once you get past the lack of parentheses and added curly braces. Thus, it looks like:

(function() {
  out("function-as-object =========")
  
  var balance = 0
  var withdraw = function(amount) {
    if (amount <= balance)
      balance = balance - amount
    else
      throw new Error("Insufficient funds")
  }
  var deposit = function(amount) {
    balance += amount
  }
  var accrueInterest = function(interestRate) {
    balance += (balance * interestRate)
  }
})()

Note that I wrap all of it into its own function so as to give the whole thing some scope--makes it easier to define in a single .js file and execute.

Scala:

Similarly, Scala allows us to bind functions to names, too:

  def functionAsObject() = {
    def withdraw(balance : Int, amount : Int) = {
      if (amount <= balance) balance - amount else throw new RuntimeException("Insufficient funds")
    }
    def deposit(balance: Int, amount : Int) = {
      balance + amount
    }
    def accrueInterest(balance : Int, rate : Float) = {
      balance + (balance * rate)
    }
  }

Again, all of it is wrapped into a function for easier (on me, while I was experimenting with all of this) scoping.

F#:

F#, like most functional/object hybrid languages, also offers the ability to bind functions to values, so this is also pretty straightforward. I choose to just operate against the "global" balance value, rather than do the more functional "pass the balance in" that the previous two use:

let functionAsObject = fun () ->
    let balance = ref 0
    let withdraw = 
        fun amt ->
            if amt <= !balance then
                balance := (!balance) - amt
                !balance
            else
                raise (Exception("Insufficient funds"))
    let deposit = 
        fun amt -> 
            balance := (!balance) + amt
            !balance
    let accrueInterest = 
        fun (intRate : float) -> 
            balance := (!balance) + (int (float !balance * intRate))
            !balance
    
    Console.WriteLine "=========> Function as Object"
    printfn "%d" (deposit 200)
    printfn "%d" (withdraw 50)

Yeti (ML):

Although Yeti supports a slightly more succinct syntax for defining a function, I choose to use the syntax that more closely matches what we're doing in the other examples--bind a function literal (do ... done;) to a name (withdraw, deposit and accrueInterest). Again, since this is running on top of the JVM, we have full access to the underlying Java library, which means we can make use of RuntimeException again as a cheap way of signaling a bad withdrawal.

withdraw = 
  do bal amt:
    if amt <= bal then
      bal - amt
    else
      throw new RuntimeException("Insufficient funds")
    fi
  done;

deposit =
  do bal amt: bal + amt done;
  
accrueInterest =
  do bal intRate:
    bal + (bal * intRate)
  done;

balance = 100;
println (withdraw balance 10)

Jaskell (Haskell):

C#:

This is a little more verbose than some of the other versions we've seen thus far: C# lacks the full type-inference that F# or Yeti or Scala has, and being a statically-typed language, it requires explicit typing in the places (like lambdas bound to local variables) where its inference gives out. Again, because the language explicitly forbids the assignment of a lambda/delegate to an implicitly-typed local variable, the local names "withdraw", "deposit", and "accrueInterest" have to be explicitly typed.

static void FunctionAsObject()
{
    var balance = 0;
    Func<int, int> withdraw = (amount) =>
    {
        if (amount <= balance)
        {
            balance = balance - amount;
            return balance;
        }
        else
            throw new Exception("Insufficient funds");
    };
    Func<int, int> deposit = (amount) => 
    {
        balance += amount; return balance;
    };
    Func<float, int> accrueInterest = (intRate) => 
    { 
        balance += (int)(intRate * balance); return balance; 
    };

    Console.WriteLine("=============> FunctionAsObject");
    Console.WriteLine("{0}", deposit(100));
    Console.WriteLine("{0}", withdraw(10));
}

Notice that again, I choose to operate on the "global" variable "balance", rather than pass it in. (It's fairly easy to imagine how it would look if "balance" were passed in.)

Closure

"You are writing a function with a free variable. How do you bundle a function with a data value defined outside the procedure's body?" If the data value is defined inside the procedure, remember, it gets reset to the same value each time, and obviously this isn't going to track state at all well. "So you might try defining the balance outside the function." But that doesn't work, because now the value isn't encapsulated anymore. "Therefore, create the function in an environment where its free variables are bound to local variables."

This is something that O-O folks won't see right away, but it's a powerful mechanism for reuse. Traditional O-O says to tuck the encapsulated value (balance) away as a private field, but in environments like JavaScript, which lack any sort of formal access control, or in environments like the JVM or CLR, both of which offer a means by which to bypass access control directives (via the Reflection libraries in both), what's marked as "private" often isn't as private as we might want. By creating a local variable that's outside the scope of the returned function object but inside of the scope of the function returning the function (see where "balance" is declared in the JavaScript version, for example), the language or platform has to "close over" that variable (hence the name "closure"), thus making it accessible to the returned function for use, but effectively hidden away from any prying eyes that might want to screw with it outside of permitted access channels.
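To make that last point concrete before diving in, here's a small F# sketch--the Account type and its "balance" field are my own hypothetical illustration, not part of the pattern language--showing CLR Reflection strolling right past a "private" directive, where a closure-captured local offers no such stable, documented name to target:

open System.Reflection

// a hypothetical class holding its state in a "private" field
type Account =
    val mutable private balance : int
    new(initial : int) = { balance = initial }
    member this.Withdraw(amt : int) =
        this.balance <- this.balance - amt
        this.balance

let acct = Account(100)
// "private" is just an access-control directive; Reflection ignores it
let field = typeof<Account>.GetField("balance",
                BindingFlags.NonPublic ||| BindingFlags.Instance)
field.SetValue(acct, 1000000)
printfn "%d" (acct.Withdraw 0)    // 1000000 -- so much for encapsulation

// the closed-over local, by contrast, lives in a compiler-generated
// closure type whose field names are undocumented implementation
// details, leaving prying code nothing stable to reflect against
let makeWithdraw initial =
    let balance = ref initial
    fun amt ->
        balance := !balance - amt
        !balance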

Scheme:

The only key thing to note here is that "withdraw" references a lambda, a function literal in Scheme. We'll try to keep this flavor in the other language implementations, just to be faithful:

(define withdraw
  (let ((balance 100)) ;; balance is defined here,
    (lambda (amount)
      (if (>= balance amount) ;; so this reference is bound
        (begin
          (set! balance (- balance amount))
          balance)
        (error "Insufficient funds" balance)))
    ))

JavaScript:

JavaScript is, surprisingly to some "old-school" JavaScript programmers, a full-fledged member of the family of languages that support closures, so all that's necessary here is to define a function that returns a function that "closes over" the local variable "balance". But, in order to make sure that balance isn't reset to its original value of 100 each time we call the function, we have to actually invoke the outer function to return the inner function, which is then bound to the name "withdraw"; that way, the variable "balance" is initialized once, yet still referenced:

(function() {
  out("closure ====================")

  var withdraw = function() {
    var balance = 100
    return function(amount) {
      if (balance >= amount) {
        balance -= amount
        return balance
      }
      else
        throw new Error("Insufficient funds")
    }
  }()
  out("withdraw 20 " + withdraw(20))
  out("withdraw 30 " + withdraw(30))
})()

Scala:

We can do the same thing in Scala, and the syntax looks somewhat similar to the JavaScript version--create a function literal, invoke it, and bind the result to the name "withdraw", where the return is another anonymous function literal:

  def closure() = {
    val withdraw = (() => {
      var balance = 100
      (amount: Int) => {
        if (amount <= balance) {
          balance -= amount
          balance
        }
        else
          throw new RuntimeException("Insufficient funds")
      }
    })()
    println(withdraw(20))
    println(withdraw(20))
  }

F#:

The F# version gets interesting because when we try to do the same thing that the JavaScript (or other languages) do--that is, "close over" a mutable local variable defined in the outer scope--the compiler immediately rejects that, saying point-blank that mutable variables cannot be captured by closures, and to use a heap-allocated reference cell (the ref keyword) instead:

let closure = fun () ->
    let withdraw =
        let balance = ref 100
        fun amt ->
            if amt <= !balance then
                balance := (!balance) - amt
                !balance
            else
                raise (Exception("Insufficient funds"))

    Console.WriteLine "=========> Closure"
    printfn "%d" (withdraw 20)
    printfn "%d" (withdraw 30)

What essentially we're doing, then, is capturing a pointer/reference to balance, and carrying that into the returned function literal, rather than letting the language capture that for us. The ref is dereferenced using the "!" operator, and assigned through using the ":=" operator, as can be seen above. Other than that, this is pretty much identical to the other languages' versions.

By the way, it should be noted that F#'s "printfn" function is actually type-safe, so attempts to write "printfn "%d" x" where "x" is a non-integer value will yield a compile-time error. That's an incredibly spiffy feature, and I wish it were something we could apply to our own F# APIs, but from what I understand from Don Syme (the F# language creator), it's something that's baked into the compiler somehow. :-/
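A trivial sketch makes the point:

printfn "%d" 42          // fine: %d expects an integer
// printfn "%d" "oops"   // rejected at compile time, not at runtime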

Yeti (ML):

Yeti works just as any of the others have, since we can define a function literal that returns a function literal, so just like the JavaScript and Scala versions, we can bind a variable (as opposed to a value, which is immutable) just outside the inner function literal, and Yeti will "close over" that variable and use it for modifiable state:

withdraw = 
  (do:
    var balance = 100;
    do amt:
      if amt <= balance then
        balance := balance - amt;
        balance
      else
        throw new RuntimeException("Insufficient funds")
      fi
    done;
  done;) ();

println (withdraw 10);  // prints 90
println (withdraw 10);  // prints 80
println (withdraw 10);  // prints 70

Jaskell (Haskell):

C#:

Brace yourself--things are about to get really ugly here. The other versions suggest that we obtain encapsulation by capturing the "balance" value inside an outer function scope which is then referenced from an inner function scope, that inner function scope being the returned function literal. But... C# doesn't let us invoke function literals directly, except if they're cast to Func<> instances:

static void Closure()
{
    Func<int, int> withdraw = ((Func<Func<int, int>>)(() => {
        var balance = 100;
        Func<int, int> result = delegate(int amount)
        {
            if (balance >= amount)
            {
                balance -= amount;
                return balance;
            }
            else
                throw new Exception("Insufficient funds");
        };
        return result;
    }))();
    Console.WriteLine("=============> Closure");
    Console.WriteLine("{0}", withdraw(20));
}

Did all that make sense? It might be clearer if I go back to the version I had to write in order to figure all this out on my own:

static void Closure()
{
    Func<Func<int, int>> withdrawMaker = (delegate {
        var balance = 100;
        Func<int, int> result = delegate(int amount)
        {
            if (balance >= amount)
            {
                balance -= amount;
                return balance;
            }
            else
                throw new Exception("Insufficient funds");
        };
        return result;
    });
    Func<int, int> withdraw = withdrawMaker();

    Console.WriteLine("=============> Closure");
    Console.WriteLine("{0}", withdraw(20));
}

Why bother with all of this--why not just write it as a generalized method like O-O folks have done since the beginning of time? Because we want that "balance" tucked away somewhere where Reflection can't find it. So the double level of function indirection is necessary; to cap things off, we don't want to have to write a one-use "maker" function every time.

Constructor Function

"You are creating a Function as Object using a Closure. How do you create instances of the object? [M]ake a function that returns your Function as Object. Give the function an Intention Revealing Name (Beck) such as make-object."

Scheme:

Now things get more interesting, because the Scheme code is defining "make-withdraw" to be a lambda that in turn nests a lambda inside of it. This makes the syntax a little weird--since the returned value from "make-withdraw" is a lambda, the bound lambda must be executed in order to do the actual withdrawal.

(define make-withdraw
  (lambda (balance)
    (lambda (amount)
      (if (>= balance amount)  ;; balance is still bound,
        (begin                 ;; but to a new object on each call
          (set! balance (- balance amount))
          balance)
        (error "Insufficient funds" balance)))
    ))
(define account-for-eugene (make-withdraw 100))
(account-for-eugene 20)    => 80
(define account-for-tom (make-withdraw 1000))
(account-for-tom 20)       => 980

JavaScript:

It's pretty common in JavaScript to create a function that returns a function, and that's the heart of what Constructor Function is doing: returning a function:

(function() {
  out("constructorFunction ========")

  var makeWithdraw = function(balance) {
    return function(amount) {
      if (balance >= amount) {
        balance -= amount
        return balance
      }
      else
        throw new Error("Insufficient funds")
    }
  }
  var acctForEugene = makeWithdraw(100)
  out(acctForEugene(20))
  var acctForTed = makeWithdraw(1000)
  out(acctForTed(20))
})()

Scala:

Ditto for Scala, though the idiom/pattern of function-literal-returning-function-literal isn't always quite this obvious in Scala:

  def constructorFunction() = {
    def makeWithdraw(bal : Int) = {
      var balance = bal
      (amt : Int) => {
        if (balance >= amt) {
          balance = (balance - amt) 
          balance
        }
        else 
          throw new RuntimeException("Insufficient funds")
      }
    }
    val acctForEugene = makeWithdraw(100)
    println(acctForEugene(20))
    val acctForTed = makeWithdraw(1000)
    println(acctForTed(20))
  }

F#:

Really, not any different from the other languages: a function binding that returns a function, with the passed-in "balance" captured as a reference (see the earlier pattern element discussion for why it's a ref) inside the outer function scope, and used from the inner function scope.

let constructorFunction = fun () ->
    let makeAccount =
        fun bal ->
            let balance = ref bal
            fun amt ->
                if amt <= !balance then
                    balance := (!balance) - amt
                    !balance
                else
                    raise (Exception("Insufficient funds"))                
            
    Console.WriteLine "=========> Constructor Function"
    let acctForEugene = makeAccount 100
    printfn "%d" (acctForEugene 20)

Yeti (ML):

Same exercise--a function binding that returns a function, with the passed-in "balance" stored as a variable (var) inside the outer function scope, such that it is closed over by the inner function scope.

makeWithdraw =
  (do bal:
    var balance = bal;
    do amt:
      if amt <= balance then
        balance := balance - amt;
        balance
      else
        throw new RuntimeException("Insufficient funds")
      fi
    done;
  done;);

acctForEugene = makeWithdraw 100;
println (acctForEugene 10);   // 90
println (acctForEugene 10);   // 80

Jaskell (Haskell):

C#:

The constructor function must be explicitly typed, again, but we gain a tiny bit of brevity by changing the "delegate" literals into (slightly) shorter C# lambdas:

static void ConstructorFunction()
{
    Func<int,Func<int, int>> makeAccount = 
        ((Func<int,Func<int,int>>)( (bal) => {
            var balance = bal;
            return (int amount) =>
            {
                if (balance >= amount)
                {
                    balance -= amount;
                    return balance;
                }
                else
                    throw new Exception("Insufficient funds");
            };
        }));

    Console.WriteLine("=============> Closure");
    var acctForEugene = makeAccount(100);
    Console.WriteLine("{0}", acctForEugene(20));
}

Were it not for the implicitly-typed local variable declaration syntax around "acctForEugene", it would be acutely obvious that "makeAccount" isn't creating any kind of object at all, but a function to be executed. Even so, the explicit typing requirement for the lambdas is kind of annoying, and will only get worse as we move through the pattern language.

Method Selector

"You are creating a Function as Object using a Closure. A Constructor Function creates new instances of the object. How do you provide shared access to the closure's state?" After all, an account can do more than just withdraw, but all of the operations on the account have to share the same state--the account balance--without violating encapsulation.

Scheme:

Again we see the nested lambdas, but now there's a third level of nesting; the first invocation (make-account) returns a second invocation that will take a single string, switch on the string, and return a third lambda that will do the actual work of manipulating the balance.

(define make-account
  (lambda (balance)
    (lambda (transaction)
      (case transaction
        ('withdraw
          (lambda (amount)
            (if (>= balance amount)
              (begin
                (set! balance (- balance amount))
                balance)
              (error "Insufficient funds" balance))))
        ('deposit
          (lambda (amount)
            (set! balance (+ balance amount))
            balance))
        ('balance
          (lambda ()
            balance))
        (else
          (error "Unknown request -- ACCOUNT"
            transaction))))
  ))
(define account-for-eugene (make-account 100))
((account-for-eugene 'withdraw) 10)  => 90
((account-for-eugene 'withdraw) 10)  => 80
((account-for-eugene 'deposit) 100)  => 180

JavaScript:

Doing this in JavaScript is, again, straightforward, though it does seem a little too subtle for idiomatic JavaScript:

(function() {
  out("methodSelector ========")

  var makeAccount = function(bal) {
    var balance = bal
    return function(transaction) {
      if (transaction === "withdraw") {
        return function(amount) {
          if (balance >= amount)
            return (balance = (balance - amount))
          else
            throw new Error("Insufficient funds")
        }
      }
      else if (transaction === "deposit") {
        return function(amount) {
          return (balance = (balance + amount))
        }
      }
      else if (transaction === "balance") {
        return function() {
          return balance
        }
      }
      else {
        throw new Error("Insufficient funds")
      }
    }
  }
  var acctForEugene = makeAccount(100)
  out(acctForEugene("withdraw")(20))
  out(acctForEugene("balance")())
})();

Scala:

This style of interface--passing in a string and a variable list of arguments--really isn't quite Scala's style, since (being a strongly-typed language) it prefers to be able to compile-time-check as much as it can, but that doesn't mean we can't build it when the need and opportunity mesh:

  def methodSelector() = {
    def makeAccount(bal : Int) = {
      var balance = bal
      (transaction : String) => {
        transaction match {
          case "withdraw" =>
            (amt : Int) => {
              if (balance >= amt) {
                balance = (balance - amt) 
                balance
              }
              else 
                throw new RuntimeException("Insufficient funds")
            }
          case "deposit" => {
            (amt : Int) => {
              balance += amt
              balance
            }
          }
          case _ => 
            throw new RuntimeException("Unknown request")
        }
      }
    }
    val acctForEugene = makeAccount(100)
    println(acctForEugene("deposit")(50))
    val acctForTed = makeAccount(100)
    println(acctForTed("withdraw")(50))
  }

F#:

This is, again, like the Yeti version and the Scala version, going to require some sacrifice in terms of flexibility in order to stay true to the original Scheme version--in F#, like in Scala and other statically-typed languages, we have to make sure that all branches of a pattern-match yield the same type of result, so the "balance" branch has to yield a function that takes a parameter (and it must be of the same type of parameter as the other two branches), even though "balance" never makes use of it. This also means that when calling the return value from "makeAccount", even for balance, we have to pass along some parameter that will be ignored.

let methodSelector = fun () ->
    let makeAccount =
        fun (bal : int) ->
            let balance = ref bal
            fun transaction ->
                match transaction with
                | "balance" ->
                    fun _ -> !balance
                | "deposit" ->
                    fun (amt : int) ->
                        balance := (!balance) + amt
                        !balance
                | "withdraw" ->
                    fun (amt : int) ->
                        if amt <= !balance then
                            balance := (!balance) - amt
                            !balance
                        else
                            raise (Exception("Insufficient funds"))
                | _ ->
                    raise (Exception("Unrecognized operation" + transaction))
                            
    Console.WriteLine "=========> Method Selector"
    let acctForEugene = makeAccount 100
    printfn "%d" ((acctForEugene "withdraw") 20)
    printfn "%d" ((acctForEugene "balance") 0)

We can address this required-uniformity-of-access a little bit more consistently with the next pattern element, but whether it's an improvement is debatable.

Yeti (ML):

Nothing new here: the makeAccount function now nests three function literals, just like the JavaScript and Scala ones do. Like the other languages, we use a pattern-match/switch-case construct to decide between the different action strings ("deposit", "withdraw", "balance") and then return the appropriate function literal for further execution. Note that Yeti, like JavaScript, actually has a way of returning an "object" here (a structure, which is a data type that contains one or more named fields, a la objects in JavaScript or case classes in Scala), but since the goal is to remain as faithful as possible to the original Scheme implementation, I stick with the more "functional-only" approach.

makeAccount =
  (do bal:
    var balance = bal;
    do action:
      case action of
        "withdraw": 
          do amt:
            if amt <= balance then
              balance := balance - amt;
              balance
            else
              throw new RuntimeException("Insufficient funds")
            fi
          done;
        "deposit": 
          do amt: 
            balance := balance + amt;
            balance;
          done;
        "balance": 
          do: 
            balance; 
          done;
        _ : throw new RuntimeException("Unknown operation")
      esac
    done;
  done;);

acctForEugene = makeAccount 100;
println ((acctForEugene "withdraw") 20);
println ((acctForEugene "deposit") 20);

Jaskell (Haskell):

C#:

If you stopped reading right here, I wouldn't blame you; this is some ugly C#, without question, particularly considering that there are other ways of accomplishing this same effect without requiring quite so much nesting.

static void MethodSelector()
{
    Func<int, Func<string, Func<int, int>>> makeAccount =
        ((Func<int, Func<string, Func<int, int>>>)((bal) =>
        {
            var balance = bal;
            return (string transaction) =>
                {
                    switch (transaction)
                    {
                        case "deposit":
                            return (int amount) =>
                                {
                                    if (balance >= amount)
                                    {
                                        balance -= amount;
                                        return balance;
                                    }
                                    else
                                        throw new Exception("Insufficient funds");
                                };
                        case "withdraw":
                            return (int amount) =>
                                {
                                    balance += amount;
                                    return balance;
                                };
                        case "balance":
                            return (int unused) =>
                                {
                                    return balance;
                                };
                        default:
                            throw new Exception("Illegal operation");
                    }
                };
        }));
    Console.WriteLine("=============> MethodSelector");
    var acctForEugene = makeAccount(100);
    Console.WriteLine("{0}", acctForEugene("deposit")(20));
    Console.WriteLine("{0}", acctForEugene("withdraw")(20));
    Console.WriteLine("{0}", acctForEugene("balance")(0));
}

Bear in mind, too, that there are some other ways to accomplish what the C# code here tries to do, one using dynamic types (from 4.0):

static void MethodSelector2()
{
    Func<int, dynamic> makeAccount = (int bal) =>
    {
        var balance = bal;
        dynamic result = new System.Dynamic.ExpandoObject();
        result.withdraw = (Func<int, int>)((amount) => {
            if (balance >= amount)
            {
                balance -= amount;
                return balance;
            }
            else
                throw new Exception("Insufficient funds");
        });
        result.deposit = (Func<int, int>)((amount) =>
        {
            balance += amount;
            return balance;
        });
        result.balance = (Func<int>)(() => balance);
        return result;
    };

    Console.WriteLine("=============> MethodSelector2");
    var acctForEugene = makeAccount(100);
    Console.WriteLine("{0}", acctForEugene.deposit(20));
    Console.WriteLine("{0}", acctForEugene.balance());
    var acctForTed = makeAccount(100);
    Console.WriteLine("{0}", acctForTed.withdraw(10));
    Console.WriteLine("{0}", acctForTed.balance());
}

... or even using ye old plain ol' Dictionary type, taking a string as a key and yielding Func<> as values for execution:

static void MethodSelector3()
{
    Func<int, Dictionary<string,Func<int,int>>> makeAccount = 
    (int bal) =>
    {
        var balance = bal;
        var result = new Dictionary<string, Func<int,int>>();
        result["withdraw"] = (Func<int, int>)((amount) =>
        {
            if (balance >= amount)
            {
                balance -= amount;
                return balance;
            }
            else
                throw new Exception("Insufficient funds");
        });
        result["deposit"] = (Func<int, int>)((amount) =>
        {
            balance += amount;
            return balance;
        });
        result["balance"] = (Func<int, int>)((unused) => balance);
        return result;
    };

    Console.WriteLine("=============> MethodSelector3");
    var acctForEugene = makeAccount(100);
    Console.WriteLine("{0}", acctForEugene["deposit"](20));
    Console.WriteLine("{0}", acctForEugene["balance"](0));
    var acctForTed = makeAccount(100);
    Console.WriteLine("{0}", acctForTed["withdraw"](10));
    Console.WriteLine("{0}", acctForTed["balance"](0));
}

The second of these two is closer to the strict intent of Method Selector from the Scheme example, but the first allows for flexible arity (numbers of parameters) in the functions handed back when dereferenced (so that "balance" doesn't have to take a bogus unused parameter). Frankly, had I to choose, I'd probably go with the dynamic version, just because of that flexibility.

Message-Passing Interface

"You have created a Method Selector for a Function as Object. You prefer to use your object in code that has an object-oriented feel. How do you invoke the methods of an object? [P]rovide a simple message-passing interface for using the closure."

Scheme:

Everything in a Lisp is a list, and the Scheme implementation uses that to full effect by taking the argument list passed in to "send" and splits it up into the object (the account), message (withdraw/deposit/etc), and the arguments (if any) that are left.

(define send
  (lambda argument-list
    (let ((object  (car argument-list))
          (message (car (cdr argument-list)))
          (args    (cdr (cdr argument-list))))
      (apply (get-method object message) args))
  ))
(define get-method
  (lambda (object selector)
    (object selector)
  ))
(define account-for-eugene (make-account 100))
(send account-for-eugene 'withdraw 50)  => 50
(send account-for-eugene 'deposit 100)  => 150
(send account-for-eugene 'balance)      => 150

JavaScript:

In JavaScript, peeling off the head and tail of the arguments reference is trickier here, because unlike Scheme, JavaScript sees "arguments" as an array-like object, not a list. While I could've created "car" and "cdr" functions in JavaScript to perform the relevant operations on an array, it felt more idiomatic to provide a function "slice" to do the "slicing" (which is actually a copy) of elements off the end of the array instead. More importantly, "slice" is a built-in method on true Array objects, but "arguments" isn't a true Array, so it doesn't carry the method directly; the conventional workaround is to borrow it via Array.prototype.slice.call(arguments, 2), which would likely be faster than my hand-rolled version.

The other interesting tidbit in here is that when I wrote it the first time, when doing a deposit, the "balance" became "8020", instead of the mathematically-correct "100". JavaScript's "promiscuous typing" thought that the "+" operator wanted to do a string concatenation, instead of a mathematical add of two numbers, so I had to convince it that the value coming out of arguments[1] was, in fact, a number, and the easiest way (it seemed to me at the time) was to just do a quick redundant math operation on it (multiply by 10, then divide by 10 again). A more idiomatic coercion would probably be a unary "+" or an explicit Number() conversion.

I also note that getMethod() in JavaScript is a bit unnecessary; we could inline its functionality directly inside of send().

(function() {
  out("messagePassingInterface ========")

  var slice = function(src, start, end) {
    var returnVal = []
    var j = 0
    if (end === undefined)
      end = src.length
    for (var i = start; i < end; i++) {
      if (src.length > i)
        returnVal[j++] = src[i]
    }
    return returnVal;
  }
  
  var makeAccount = function(bal) {
    var balance = bal
    return function(transaction) {
      if (transaction === "withdraw") {
        return function(amount) {
          if (balance >= amount)
            return (balance = (balance - amount))
          else
            throw new Error("Insufficient funds")
        }
      }
      else if (transaction === "deposit") {
        return function(amount) {
          return (balance = (balance + (amount * 10.0 / 10.0)))
        }
      }
      else if (transaction === "balance") {
        return function() {
          return balance
        }
      }
      else {
        throw new Error("Insufficient funds")
      }
    }
  }
  var getMethod = function(object, selector) {
    return object(selector)
  }
  var send = function(object, message) {
    return (getMethod(object, message))(slice(arguments, 2))
  }
  var acctForEugene = makeAccount(100)
  out(send(acctForEugene, "withdraw", 20)) // 80
  out(send(acctForEugene, "balance"))      // 80
  out(send(acctForEugene, "deposit", 20))  // 100
  out(send(acctForEugene, "balance"))      // 100
})();

Scala:

The Scala version of this follows the JavaScript version in that it works off of a variable-argument list, but since Scala doesn't give us the built-in "arguments" reference, we have to specify it at the method declaration:

  def messagePassingInterface() = {
    def makeAccount(bal : Int) = {
      var balance = bal
      def send(key:String, args:Any*) = {
        key match {
          case "withdraw" => {
            val amt = args.head.asInstanceOf[Int]
            if (balance >= amt) {
              balance = (balance - amt) 
              balance
            }
            else 
              throw new RuntimeException("Insufficient funds")
          }
          case "deposit" => {
            val amt = args.head.asInstanceOf[Int]
            balance += amt
            balance
          }
        }
      }
      send _
    }
    val acctForEugene = makeAccount(100)
    println(acctForEugene("withdraw", 10))
  }

F#:

By now taking an "obj list" for the parameters, we unify all of the calls to the account to take a consistent parameter list that still allows for a flexible number of parameters, but.... It still requires that callers that don't want to pass any arguments have to pass an empty list. And, on top of that, it doesn't really feel "F#-ish".

let messagePassingInterface = fun () ->
    let makeAccount =
        fun (bal : int) ->
            let balance = ref bal
            fun transaction ->
                match transaction with
                | "balance" ->
                    fun _ -> !balance
                | "deposit" ->
                    fun (arglist : obj list) ->
                        let amt = arglist.Head :?> int
                        balance := (!balance) + amt
                        !balance
                | "withdraw" ->
                    fun (arglist : obj list) ->
                        let amt = arglist.Head :?> int
                        if amt <= !balance then
                            balance := (!balance) - amt
                            !balance
                        else
                            raise (Exception("Insufficient funds"))
                | _ ->
                    raise (Exception("Unrecognized operation" + transaction))
    let getMethod = fun (acct : string -> obj list -> int) selector -> acct selector
    let send = 
        fun (acct : string -> obj list -> int) (message : string) (arglist : obj list) ->
            (getMethod acct message)(arglist)

    Console.WriteLine "=========> Message Passing Interface"
    let acctForEugene = makeAccount 100
    printfn "%d" (send acctForEugene "withdraw" [20])
    printfn "%d" (send acctForEugene "balance" [])

Note that F# does have language facilities for allowing a variable-argument list to be passed, but it only works on method members:

// From the MSDN documentation
open System

type X() =
    member this.F([<ParamArray>] args: Object[]) =
        for arg in args do
            printfn "%A" arg

[<EntryPoint>]
let main _ =
    // call a .NET method that takes a parameter array, passing values of various types
    Console.WriteLine("a {0} {1} {2} {3} {4}", 1, 10.0, "Hello world", 1u, true)

    let xobj = new X()
    // call an F# method that takes a parameter array, passing values of various types
    xobj.F("a", 1, 10.0, "Hello world", 1u, true)
    0

We could go back and rewrite all of the F# samples to be class member methods (that is, return actual objects), but that sort of gets away from the spirit of what the blog exercise is trying to do, so I'll leave that as an exercise to the reader. (Which, by the way, is author-speak for "I'm feeling lazy and I don't want to bother".)
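For the curious, though, here's a minimal sketch of what that class-member flavor might look like--this Account type is my own illustration, not part of Wallingford's pattern language:

open System

type Account(initial : int) =
    let mutable balance = initial
    member this.Balance = balance
    member this.Deposit(amt : int) =
        balance <- balance + amt
        balance
    member this.Withdraw(amt : int) =
        if amt <= balance then
            balance <- balance - amt
            balance
        else
            raise (Exception("Insufficient funds"))

let acctForEugene = Account(100)
printfn "%d" (acctForEugene.Withdraw 20)   // 80
printfn "%d" acctForEugene.Balance         // 80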

Yeti (ML):

Unfortunately, while Yeti (like most functional languages) has a built-in list type, it doesn't recognize arguments to a function as a list, so we either have to explicitly put the arguments in, or we have to explicitly state that the arguments to the returned function literal are a list. I choose the latter tactic, even though it's not the world's most impressive syntax:

makeAccount =
  (do bal:
    var balance = bal;
    do action:
      case action of
        "withdraw":
          do argList:
            amt = head argList;
            if amt <= balance then
              balance := balance - amt;
              balance;
            else
              throw new RuntimeException("Insufficient funds")
            fi
          done;
        "deposit": 
          do argList:
            amt = head argList; 
            balance := balance + amt;
            balance;
          done;
        "balance": 
          do: 
            balance; 
          done;
        _ : throw new RuntimeException("Unknown operation")
      esac
    done;
  done;);

acctForEugene = makeAccount 100;
println  ((acctForEugene "withdraw")[20]);  // 80

If there's a way to get a Yeti function to accept a variable number of arguments, I've not seen it in the language overview. I don't know if any ML-derivative has this, to be honest. Of course, the other thing to do, since this is a statically-typed environment, is to just return function literals that expect the proper number of arguments, which will get us the compile-time safety that these languages are supposed to provide; the below does exactly that--the last line will fail to compile if you uncomment it:

makeAccount =
  (do bal:
    var balance = bal;
    do action:
      case action of
        "withdraw":
          do amt:
            if amt <= balance then
              balance := balance - amt;
              balance;
            else
              throw new RuntimeException("Insufficient funds")
            fi
          done;
        "deposit": 
          do amt:
            balance := balance + amt;
            balance;
          done;
        "balance": 
          do: 
            balance; 
          done;
        _ : throw new RuntimeException("Unknown operation")
      esac
    done;
  done;);

acctForEugene = makeAccount 100;
println ((acctForEugene "withdraw") 20);      // 80
println ((acctForEugene "balance") 0);        // 80
//println ((acctForEugene "withdraw") "fred");  // won't compile

(Truthfully, we should do this for the Scala version, too.) This choice is going to cause us a little bit of heartache, though, because in order to use "balance", we have to pass in a number--if we leave off the "_" in the function literal returned from the "balance" arm of the selector, we don't need to pass "0" when we invoke it, but what's returned isn't a number, but a function. I can't figure out how to make Yeti take that function and just invoke it--the syntax guide doesn't seem to say out loud exactly how I can invoke that function without having to pass in a number argument. If I'd left it as taking a list, then I could pass an empty list and all would look consistent, if a little weird.

(Note that this is deliberately opposite what I chose to do for the F# version.)
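
(And since I admitted above that we should really do the same thing for the Scala version, here's a rough sketch of what that looks like--my own addition, not part of the original exercise. Note that it carries exactly the same wart as the Yeti version: to get all of the arms to unify under a single Int => Int type, "balance" has to swallow a throwaway number.)

def makeAccount(bal: Int): String => (Int => Int) = {
  var balance = bal
  (transaction: String) => transaction match {
    case "withdraw" => amt =>
      if (amt <= balance) { balance -= amt; balance }
      else throw new RuntimeException("Insufficient funds")
    case "deposit" => amt => { balance += amt; balance }
    case "balance" => _ => balance
    case _ => throw new RuntimeException("Unknown operation: " + transaction)
  }
}

val acctForEugene = makeAccount(100)
println(acctForEugene("withdraw")(20))   // 80
println(acctForEugene("balance")(0))     // 80, complete with the dummy argument
// println(acctForEugene("withdraw")("fred"))   // won't compile: type mismatch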

Jaskell (Haskell):

C#:

Generic Function

"You have created a Method Selector for a Function as Object. You want to take full advantage of the tools available in your functional language. How do you invoke the methods of an object? ... [P]rovide a simple interface to the Method Selector that more closely follows the functional style."

Scheme:

In the Scheme implementation, it's interesting that having written the send function in the last element of the pattern language, we don't really use it here, but instead just inline its functionality in each of the named functions (which, in turn, take the argument list, peel off the head of the argument list as the account object, and pass the remainder of the arguments on to the selected function):

(define withdraw
  (lambda argument-list
    (let ((object (car argument-list))
          (withdraw-arguments (cdr argument-list)))
      (apply (object 'withdraw) withdraw-arguments)
    )))
(define deposit
  (lambda argument-list
    (let ((object (car argument-list))
          (deposit-arguments (cdr argument-list)))
      (apply (object 'deposit) deposit-arguments)
    )))
(define balance
  (lambda (object)
    (object 'balance)
  ))
  
(define account-for-eugene (make-account 100))
(withdraw account-for-eugene 10)
(map  (lambda (account) (deposit account 10)) account-for-eugene)

Interestingly enough, I sort of expected the Scheme version to use "deposit" directly, rather than write a trampoline that calls "deposit", since we could've avoided the Generic Function part of the language just by using "send" directly, as well:

(map  (lambda (account) (send account 'deposit 10)) account-for-eugene)

And, to be honest, calling "map" on a single object doesn't really seem to be a profoundly functional experience, so in my examples I'm going to create a collection of accounts (called a "bank", naturally enough), and map across that collection.

JavaScript:

The JavaScript version of this is, again, pretty similar to the Scheme version. Again, ECMAScript 5 environments are supposed to have a "map" function natively built in, but previous environments don't, so I have to write one to verify that we can, in fact, use the named functions as the mapped operation. I also write a "map2", another version of map that takes the function to apply to the collection but also takes any additional arguments after that and passes them to the function being applied across the collection; it allows me to use "deposit" directly, instead of having to write a trampoline for it, and besides, it's trivial to write in JavaScript:

(function() {
  out("genericFunction ==============")

  var slice = function(src, start, end) {
    var returnVal = []
    var j = 0
    if (end === undefined)
      end = src.length
    for (var i = start; i < end; i++) {
      if (src.length > i)
        returnVal[j++] = src[i]
    }
    return returnVal
  }
  
  var map = function(fn, src) {
    var retVal = []
    for (var i in src)
      retVal[i] = fn(src[i])
    return retVal
  }
  var map2 = function(src, fn) {
    var retVal = []
    for (var i in src)
      retVal[i] = fn(src[i], slice(arguments, 2))
    return retVal
  }
  
  var makeAccount = function(bal) {
    var balance = bal
    return function(transaction) {
      if (transaction === "withdraw") {
        return function(amount) {
          if (balance >= amount)
            return (balance = (balance - amount))
          else
            throw new Error("Insufficient funds")
        }
      }
      else if (transaction === "deposit") {
        return function(amount) {
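          // (the multiply/divide below quietly coerces "amount" to a number;
          // the generic functions further down hand in an argument array, not a bare number)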
          return (balance = (balance + (amount * 10.0 / 10.0)))
        }
      }
      else if (transaction === "balance") {
        return function() {
          return balance
        }
      }
      else {
        throw new Error("Insufficient funds")
      }
    }
  }
  var withdraw = function() {
    var object = arguments[0]
    var argumentList = slice(arguments, 1)
    return object("withdraw")(argumentList)
  }
  var deposit = function() {
    var object = arguments[0]
    var argumentList = slice(arguments, 1)
    return object("deposit")(argumentList)
  }
  var balance = function(object) {
    return object("balance")()
  }

  var acctForEugene = makeAccount(100)
  out(withdraw(acctForEugene, 20))
  out(deposit(acctForEugene, 20))
  
  var bank = [
    makeAccount(100),  // acctForEugene
    makeAccount(1000)  // acctForTed
  ]
  map(function(it) { deposit(it, 20) }, bank)
  out(balance(bank[0]))
  out(balance(bank[1]))
  
  map2(bank, deposit, 20)
  out(balance(bank[0]))
  out(balance(bank[1]))
})();

Scala:

Scala, of course, has functional methods built into its List type (which we can use instead of an array, since Scala has much better support for lists than arrays):

  def genericFunction() = {
    def makeAccount(bal : Int) = {
      var balance = bal
      def send(key:String, args:Any*) = {
        key match {
          case "withdraw" => {
            val amt = args.head.asInstanceOf[Int]
            if (balance >= amt) {
              balance = (balance - amt) 
              balance
            }
            else 
              throw new RuntimeException("Insufficient funds")
          }
          case "deposit" => {
            val amt = args.head.asInstanceOf[Int]
            balance += amt
            balance
          }
          case "balance" => {
            balance
          }
          case _ =>
            throw new RuntimeException("Unknown request")
        }
      }
      send _
    }
    def withdraw(account : (String, Any*) => Int, amount : Int) = {
      account("withdraw", amount)
    }
    def deposit(account : (String, Any*) => Int, amount : Int) = {
      account("deposit", amount)
    }
    def balance(account : (String, Any*) => Int) = {
      account("balance")
    }
    val accounts = List(makeAccount(100), makeAccount(200), makeAccount(300))
    accounts.foreach(withdraw(_, 20))
    accounts.foreach((in) => { println(balance(in)) })
  }

F#:

The F# version helps clean up some of the syntax a little, sort of:

let genericFunction = fun () ->
    let makeAccount =
        fun (bal : int) ->
            let balance = ref bal
            fun transaction ->
                match transaction with
                | "balance" ->
                    fun _ -> !balance
                | "deposit" ->
                    fun (arglist : obj list) ->
                        let amt = arglist.Head :?> int
                        balance := (!balance) + amt
                        !balance
                | "withdraw" ->
                    fun (arglist : obj list) ->
                        let amt = arglist.Head :?> int
                        if amt <= !balance then
                            balance := (!balance) - amt
                            !balance
                        else
                            raise (Exception("Insufficient funds"))
                | _ ->
                    raise (Exception("Unrecognized operation" + transaction))
    let deposit = 
        fun amt acct->
            acct "deposit" [amt :> obj]
    let withdraw =
        fun amt acct ->
            acct "withdraw" [amt :> obj]
    let balance =
        fun acct ->
            acct "balance" []

    Console.WriteLine "=========> Generic Function"
    let bank = [ makeAccount 100; makeAccount 200; makeAccount 300 ]
    let balances = List.map (fun it -> deposit 20 it) bank
    List.iter (fun it -> printfn "%d" it) balances
    let balances = List.map (deposit 20) bank
    List.iter (printfn "%d") balances

Notice that by putting the account, counter-intuitively, as the last parameter to the generic "deposit" and "withdraw" functions, we can avoid having to write the "trampoline" function that we would've had to write when using "map"; the account gets curried from the List directly (as shown in the second example, and in the sketch below). We could do the same thing in the Scala version, too, and then we wouldn't have to use the explicit "_" syntax that Scala provides. Of course, if the desire is instead to pass the amount in a curried fashion, rather than the account, then the original ordering of the parameters is better.
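
(For the curious, here's roughly what that account-last, curried style looks like in Scala. This is my own sketch, not the original code, and it uses a simplified account shape--every operation takes a single Int--to stay self-contained:)

def makeAccount(bal: Int): String => (Int => Int) = {
  var balance = bal
  {
    case "deposit"  => amt => { balance += amt; balance }
    case "withdraw" => amt =>
      if (amt <= balance) { balance -= amt; balance }
      else throw new RuntimeException("Insufficient funds")
    case "balance"  => _ => balance
    case other      => throw new RuntimeException("Unknown operation: " + other)
  }
}

// Account-last and curried, so the List can supply the account itself:
def deposit(amt: Int)(acct: String => (Int => Int)): Int = acct("deposit")(amt)

val bank = List(makeAccount(100), makeAccount(200), makeAccount(300))
val balances = bank.map(deposit(20))   // no explicit "_" placeholder needed
balances.foreach(println)              // 120, 220, 320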

Yeti (ML):

Writing this in Yeti/ML is definitely trickier than it was in JavaScript, despite the built-in "map" and other functions, because getting the arguments to "trampoline" right is a little harder. Fortunately, the generic method hides the "balance 0" weirdness from the last pattern element, making it a tad easier to use:

makeAccount =
  (do bal:
    var balance = bal;
    do action:
      case action of
        "withdraw":
          do amt:
            if amt <= balance then
              balance := balance - amt;
              balance
            else
              throw new RuntimeException("Insufficient funds")
            fi
          done;
        "deposit": 
          do amt:
            balance := balance + amt;
            balance
          done;
        "balance": 
          do: 
            balance
          done;
        _ : throw new RuntimeException("Unknown operation")
      esac
    done;
  done;);

withdraw =
  (do acct amt:
    (acct "withdraw") amt;
  done;);
deposit =
  (do acct amt:
    (acct "deposit") amt;
  done;);
balance =
  (do acct:
    acct "balance" 0;
  done;);

acctForEugene = makeAccount 100;
println (withdraw acctForEugene 20);      // 80
println (deposit acctForEugene 20);       // 100
println (balance acctForEugene);          // 100

accounts = [(makeAccount 100), (makeAccount 200), (makeAccount 300)];
balances = map (do acct: (deposit acct 20) done) accounts;
for accounts do acct: println(deposit acct 20) done;

Yeti complained if I didn't bind the result of the "map" call to a value, hence the "balances" value there, even though the balances are also stored in the relevant closures for each account. Note that the "for" line that follows it does the same thing, and prints the results out to boot. In fact, it's high time people started to realize that the "for" loop in most imperative languages is just a non-functional way of doing a "map" without yielding a value. Languages like Scala and Yeti/ML blur that line enough that, if you ask me, we should just eschew "for" altogether and use "map".
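
(Scala actually makes that equivalence explicit in its syntax; the following illustration is mine, not part of the original exercise. A "for" with a "yield" desugars to "map", and a "for" without one desugars to "foreach":)

val balances = List(100, 200, 300)

// "for ... yield" IS a map: it produces a new value.
val bumped = for (bal <- balances) yield bal + 20   // same as balances.map(_ + 20)

// A bare "for" is a foreach: side effects only, nothing produced.
for (bal <- balances) println(bal + 20)             // same as balances.foreach(b => println(b + 20))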

Jaskell (Haskell):

C#:

Delegation

"You are creating a Function as Object. How do you create a new object that extends the behavior of an existing object? ... [U]se delegation. Make a Function as Object that has an instance variable an instance of the object you want to extend. Implement behaviors specific to the new object as methods in a Method Selector. Pass all other messages onto the instance variable."

Again, in a traditional O-O language, we'd just inherit, and in an object-functional hybrid, we could do the same. There's no real point not to, to be honest. But the interesting thing about this implementation is that it demonstrates the runtime relationship between a JavaScript object and its prototype: calling a function passing in the "derived" object causes the "derived" to try its "base" (its prototype) in the event that the method in question isn't defined on the "derived".

Note also that this particular trick is really only feasible because the "object" presents a uniform interface: all interaction with the "object" (whether it is a standard account or an interest-bearing one) is done through the Method Selector mechanism, which allows for this extension without having to modify any sort of base interface. This isn't so much a knock on O-O as a whole as it is on statically-typed traditional O-O.

Scheme:

This is pretty straightforward, if you understood the Message-Passing Interface implementation from earlier.

(define make-interest-bearing-account
  (lambda (balance interest-rate)
    (let ((my-account (make-account balance)))
      (lambda (transaction)
        (case transaction
          ('accrue-interest
            (lambda ()
              ((my-account 'deposit)
                (* ((my-account 'balance))
                   interest-rate)) ))
          (else
            (my-account transaction))
        )))
  ))
(define account-for-eugene (make-interest-bearing-account 100 0.05))
((account-for-eugene 'balance))         => 100
((account-for-eugene 'deposit) 100)     => 200
((account-for-eugene 'balance))         => 200
((account-for-eugene 'accrue-interest)) => 210
((account-for-eugene 'balance))         => 210

JavaScript:

Despite the fact that the JavaScript implementation just keeps getting longer and longer, it's actually not that much harder to add in this delegation functionality--again, as has been the case for a lot of the JavaScript code, it's almost a direct one-to-one port from the Scheme:

(function() {
  out("delegation =======")
  
  var slice = function(src, start, end) {
    var returnVal = []
    var j = 0
    if (end === undefined)
      end = src.length
    for (var i = start; i < end; i++) {
      if (src.length > i)
        returnVal[j++] = src[i]
    }
    return returnVal
  }
  
  var makeAccount = function(bal) {
    var balance = bal
    return function(transaction) {
      if (transaction === "withdraw") {
        return function(amount) {
          if (balance >= amount)
            return (balance = (balance - amount))
          else
            throw new Error("Insufficient funds")
        }
      }
      else if (transaction === "deposit") {
        return function(amount) {
          return (balance = (balance + (amount * 10.0 / 10.0)))
        }
      }
      else if (transaction === "balance") {
        return function() {
          return balance
        }
      }
      else {
        throw new Error("Insufficient funds")
      }
    }
  }
  var makeInterestBearingAccount = function(bal, intRate) {
    var myAccount = makeAccount(bal)
    return function(transaction) {
      if (transaction === "accrueInterest") {
        return function() {
          var balance = myAccount("balance")()
          var interest = (balance * intRate)
          return myAccount("deposit")(interest)
        }
      }
      else
        return myAccount(transaction)
    }
  }
  
  var acctForEugene = makeInterestBearingAccount(100, 0.05)
  out(acctForEugene("balance")())
  out(acctForEugene("deposit")(20))
  out(acctForEugene("accrueInterest")())
  out(acctForEugene("balance")())
})();

Scala:

The Scala version of this is tricky, because it relies on a very subtle bit of Scala syntax; specifically, when we try to pass the "args" sequence (which, in actual implementation, is a WrappedArray) from the "makeInterestBearingAccount" function to the "makeAccount" function (by which I mean, the functions returned from those two functions), if we don't use the peculiar ": _*" syntax, Scala interprets "args" to be a single parameter (a single parameter whose type is a collection), instead of the intended "pass the arguments through" behavior. (If you're a Java or C# developer, it's like having a varargs method calling another varargs method, and passing the array of arguments from the first as an array instead of each element on its own to form the array of arguments in the second. Yeah, I know--it's a little brain-twisty.)
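
(If that description makes your head spin, here's a minimal, self-contained illustration of my own--not from the account code--showing the difference between passing the sequence and passing it through:)

def inner(args: Any*) = args.length

// Without ": _*", the whole sequence goes across as ONE argument...
def outerWrong(args: Any*) = inner(args)
// ...while with ": _*", each element is passed through individually.
def outerRight(args: Any*) = inner(args: _*)

println(outerWrong(1, 2, 3))  // 1
println(outerRight(1, 2, 3))  // 3

With that squared away, the delegation version: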

  def delegation() = {
    def makeAccount(bal : Int) = {
      var balance = bal
      def send(key:String, args:Any*) = {
        key match {
          case "withdraw" => {
            val amt = args.head.asInstanceOf[Int]
            if (balance >= amt) {
              balance = (balance - amt) 
              balance
            }
            else 
              throw new RuntimeException("Insufficient funds")
          }
          case "deposit" => {
            val amt = args.head.asInstanceOf[Int]
            balance += amt
            balance
          }
          case "balance" => {
            balance
          }
          case _ =>
            throw new RuntimeException("Unknown request")
        }
      }
      send _
    }
    def makeInterestBearingAccount(bal : Int, intRate : Double) = {
      val account = makeAccount(bal)
      def send(key: String, args:Any*) = {
        key match {
          case "accrueInterest" => {
            val amt = (int2float(account("balance")) * intRate).toInt
            account("deposit", amt)
          }
          case _ =>
            account(key, args : _*)
        }
      }
      send _
    }
    val acctForEugene = makeInterestBearingAccount(100, 0.05)
    println(acctForEugene("deposit", 20))
    println(acctForEugene("accrueInterest"))
    println(acctForEugene("balance"))
  }

F#:

Aside from the aforementioned weirdness about the obj list as a generic parameter mechanism, this is really straightforward:

let delegation = fun () ->
    let makeAccount =
        fun (bal : int) ->
            let balance = ref bal
            fun transaction ->
                match transaction with
                | "balance" ->
                    fun _ -> !balance
                | "deposit" ->
                    fun (arglist : obj list) ->
                        let amt = arglist.Head :?> int
                        balance := (!balance) + amt
                        !balance
                | "withdraw" ->
                    fun (arglist : obj list) ->
                        let amt = arglist.Head :?> int
                        if amt <= !balance then
                            balance := (!balance) - amt
                            !balance
                        else
                            raise (Exception("Insufficient funds"))
                | _ ->
                    raise (Exception("Unrecognized operation" + transaction))
    let makeInterestBearingAccount =
        fun (bal : int) (intRate : float) ->
            let account = makeAccount bal
            fun transaction ->
                match transaction with
                | "accrueInterest" ->
                    fun _ ->
                        let balance = (account "balance" [])
                        let interest : float = float balance * intRate 
                        account "deposit" [int interest]
                | _ -> account transaction
    Console.WriteLine "=========> Delegation"
    let acctForEugene = makeInterestBearingAccount 100 0.05
    printfn "%d" (acctForEugene "deposit" [20])
    printfn "%d" (acctForEugene "accrueInterest" [])

Note the explicit casts in the "accrueInterest" code: this is because F#, like a lot of functional languages, won't do automatic type-promotion for you. So the "int"s have to be explicitly converted to "float"s, and back again.

Yeti (ML):

Since we didn't go down the path of trying to do the variable-argument list in Yeti, we don't have the same problems the Scala version presented, and the generic methods (the top-level "withdraw", "deposit" and "balance" functions) actually help hide the syntactic weirdness that we ran into in the last pattern element:

makeAccount =
  (do bal:
    var balance = bal;
    do action:
      case action of
        "withdraw":
          do amt:
            if amt <= balance then
              balance := balance - amt;
              balance
            else
              throw new RuntimeException("Insufficient funds")
            fi
          done;
        "deposit": 
          do amt:
            balance := balance + amt;
            balance
          done;
        "balance": 
          do: 
            balance
          done;
        _ : throw new RuntimeException("Unknown operation")
      esac
    done;
  done;);

withdraw =
  (do acct amt:
    (acct "withdraw") amt;
  done;);
deposit =
  (do acct amt:
    (acct "deposit") amt;
  done;);
balance =
  (do acct:
    acct "balance" 0;
  done;);

acctForEugene = makeAccount 100;
println (withdraw acctForEugene 20);      // 80
println (deposit acctForEugene 20);       // 100
println (balance acctForEugene);          // 100

accounts = [(makeAccount 100), (makeAccount 200), (makeAccount 300)];
balances = map (do acct: (deposit acct 20) done) accounts;
for accounts do acct: println(deposit acct 20) done;

Note the last two lines--the "for" construct in most imperative languages is actually akin to the "map" construct in most functional languages, except that in the imperative "for" there's no return value from the expression, and in a functional "map" there (usually) is. This is why we have to bind the result from the "map" to a name, and we don't have any results from the "for". (The "map" also insists on having a returned value--a list of Unit isn't acceptable, which is what would be returned if we used the "println" expression in the "map".)

Jaskell (Haskell):

C#:

Private Method

"You have created a Method Selector. How do you factor common behavior out of the methods in the Method Selector? ... [D]efine the common code in a Local Procedure (Wallingford). Invoke this procedure in place of the duplicated code within the Method Selector."

Scheme:

(define make-account
  (lambda (balance)
    (let* ((transaction-log '())
           (log-transaction
             (lambda (type amount)
               (set! transaction-log
                     (cons (list type amount)
                           transaction-log)))))
      (lambda (transaction)
        (case transaction
          ('withdraw
            (lambda (amount)
              (if (>= balance amount)
                (begin
                  (set! balance (- balance amount))
                  (log-transaction 'withdraw amount)
                  balance)
                (error "Insufficient funds" balance))))
          ('deposit
            (lambda (amount)
              (set! balance (+ balance amount))
              (log-transaction 'deposit amount)
              balance))
          ...))
    )))

JavaScript:

Again, in JavaScript, we rely on the fact that anything declared inside the "makeAccount" function but outside the function returned by "makeAccount" is encapsulated, and create both the "transactionLog" (an array, since JavaScript likes those better than lists) and the function to append to it ("logTransaction") within that "neutral zone". Just to prove that the transaction log is being written, I add another method to the method selector table, "viewLog", to return the contents of the transaction log.

(function() {
  out("privateMethod ===========")
  
  var makeAccount = function(bal) {
    var transactionLog = []
    var logTransaction = function(type, amount) {
      transactionLog.push("Action: " + type + " for " + amount)
    }
    
    var balance = bal
    return function(transaction) {
      if (transaction === "withdraw") {
        return function(amount) {
          if (balance >= amount) {
            logTransaction("withdraw", amount)
            return (balance = (balance - amount))
          }
          else
            throw new Error("Insufficient funds")
        }
      }
      else if (transaction === "deposit") {
        return function(amount) {
          logTransaction("deposit", amount)
          return (balance = (balance + (amount * 10.0 / 10.0)))
        }
      }
      else if (transaction === "balance") {
        return function() {
          logTransaction("balance", balance)
          return balance
        }
      }
      else if (transaction === "viewLog") {
        return function() {
          return (transactionLog)
        }
      }
      else {
        throw new Error("Insufficient funds")
      }
    }
  }
  var acctForEugene = makeAccount(100)
  out(acctForEugene("withdraw")(20))
  out(acctForEugene("balance")())
  out(acctForEugene("deposit")(20))
  out(acctForEugene("balance")())
  out(acctForEugene("viewLog")())
})();

Scala:

The Scala version is also pretty straightforward--we've already seen that Scala supports nested functions, so it is simply a matter of defining the logTransaction() function and an empty List[String] in the same "neutral zone" in which the "balance" variable lives. Instead of adding a new selector to the list, I chose this time to just print the transaction log as part of the "balance" operation.

  def privateMethod() = {
    def makeAccount(bal : Int) = {
      var balance = bal
      var transactionLog = List[String]()
      def logTransaction(action:String, amount:Int) = {
        val msg = ("Action: " + action + " for " + amount)
        transactionLog = transactionLog :+ msg
      }
      def send(key:String, args:Any*) = {
        key match {
          case "withdraw" => {
            val amt = args.head.asInstanceOf[Int]
            if (balance >= amt) {
              logTransaction("withdraw", amt)
              balance = (balance - amt) 
              balance
            }
            else 
              throw new RuntimeException("Insufficient funds")
          }
          case "deposit" => {
            val amt = args.head.asInstanceOf[Int]
            logTransaction("deposit", amt)
            balance += amt
            balance
          }
          case "balance" => {
            println(transactionLog)
            balance
          }
          case _ =>
            throw new RuntimeException("Unknown request")
        }
      }
      send _
    }
    val acctForEugene = makeAccount(100)
    println(acctForEugene("deposit", 20))
    println(acctForEugene("balance"))
  }

F#:

Binding a local function is, by this point, somewhat trivial and uninspiring, but it's just as easily done in F# as it is in any of the other languages:

let privateMethod = fun () ->
    let makeAccount =
        fun (bal : int) ->
            let transactionLog = ref []
            let logTransaction act (amt : int) =
                let message = "Action: " + act + " for " + amt.ToString()
                transactionLog := List.append !transactionLog [message]
            let balance = ref bal
            fun transaction ->
                match transaction with
                | "balance" ->
                    fun _ -> 
                        List.iter (printfn "%s") !transactionLog
                        !balance
                | "deposit" ->
                    fun (arglist : obj list) ->
                        let amt = arglist.Head :?> int
                        balance := (!balance) + amt
                        logTransaction "deposit" amt
                        !balance
                | "withdraw" ->
                    fun (arglist : obj list) ->
                        let amt = arglist.Head :?> int
                        if amt <= !balance then
                            balance := (!balance) - amt
                            logTransaction "withdraw" amt
                            !balance
                        else
                            raise (Exception("Insufficient funds"))
                | _ ->
                    raise (Exception("Unrecognized operation" + transaction))
    Console.WriteLine "=========> Private Method"
    let acctForEugene = makeAccount 100
    printfn "%d" (acctForEugene "deposit" [20])
    printfn "%d" (acctForEugene "withdraw" [50])
    printfn "%d" (acctForEugene "balance" [])

Yeti (ML):

The private method in Yeti is, again, just a nested function hiding out in the closure that is returned by "makeAccount"; the fact that Yeti supports expressions embedded inside of strings makes it easy to create the transaction log string:

makeAccount =
  (do bal:
    var balance = bal;
    var transactionLog is list<string> = [];
    logTransaction action amount = 
      transactionLog := "Action: \(action) for \(amount)" :: transactionLog;
    do action:
      case action of
        "withdraw":
          do amt:
            if amt <= balance then
              logTransaction "withdraw" amt;
              balance := balance - amt;
              balance
            else
              throw new RuntimeException("Insufficient funds")
            fi
          done;
        "deposit": 
          do amt:
            logTransaction "deposit" amt;
            balance := balance + amt;
            balance
          done;
        "balance": 
          do: 
            println transactionLog;
            balance
          done;
        _ : throw new RuntimeException("Unknown operation")
      esac
    done;
  done;);

Jaskell (Haskell):

C#:

Summary

JavaScript is, of course, the de-facto golden child right now.

And Scala is, undoubtedly, one of my favorites. Its syntax is a little quirky in places, but no more so than any other language I've used.

I like the Yeti code style and syntax, and could definitely see doing some small projects in it, particularly some service-y kinds of things, a la Web or REST services; the Yeti source code has some examples of how to create (for example) servlets and WARs, and it's a nice syntax. I don't know that I'd want to create a full-fledged MVC framework on top of Yeti, but for something that's basically taking input, doing processing and sending back JSON or XML results, it's not a bad approach. Considering you can also create classes in Yeti, which puts it on the same ground as F#, it's worth looking into if you've got some ML in your background and want to go back to it while staying on top of the JVM.

The F# version is a nice mix of ML and objects, though the casting operators are definitely a syntax that only a mother could love, and the distinction between what is allowed on functions vs. methods (such as the parameter arrays) feels a little arbitrary at times. (I'm sure there's good reasons for it, but it still feels a little arbitrary, at least to me.) The "cannot close over local variables, use refs instead" rule is also a little annoying, although it does make it explicitly clear that now you're closing over a reference, not the actual value, so now the "what happens if I modify the closed-over value" question becomes self-explanatory. (This sometimes trips people up in other languages that don't make the by-value or by-reference closing-over semantics explicit.)
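
(To see why that explicitness can be a virtue, consider this little Scala sketch of mine: Scala quietly closes over the variable itself, which is convenient, but it hides exactly the question that F#'s ref cells force you to answer out loud.)

var counter = 0
val bump = () => { counter += 1; counter }  // closes over the variable, not a snapshot of its value
bump()
bump()
println(counter)  // 2 -- the closure mutated the original variable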

Honestly, I don't really expect that anyone reading this piece is going to immediately turn around, abandon all their domain objects, and take up this approach as a replacement--in some cases, taking this "all functional" style creates more angst than it really provides benefits--but we can use parts of it to generate some really interesting new patterns.


.NET | C# | F# | Industry | Java/J2EE | Languages | Personal | Scala | Windows

Thursday, December 20, 2012 6:24:55 PM (Pacific Standard Time, UTC-08:00)
Comments [1]  | 
 Friday, November 30, 2012
On Uniqueness, and Difference

In my teenage formative years, which (I will have to admit) occurred during the 80s, educators and other people deeply involved in the formation of young people's psyches laid great emphasis on building and enhancing our self-esteem. Self-esteem, in fact, seems to have been the cause and cure of every major problem suffered by any young person in the 80s; if you caved to peer pressure, it was because you lacked self-esteem. If you dressed in the latest styles, it was because you lacked the self-esteem to differentiate yourself from the crowd. If you dressed contrary to the latest styles, it was because you lacked the self-esteem to trust in your abilities (rather than your fashion) to stand out. Everything, it seemed, centered around your self-esteem, or lack thereof. "Be yourself", they said. "Don't be what anyone else says you are", and so on.

In what I think was supposed to be a trump card for those who suffered from chronically low self-esteem, those who were trying to form us into highly-self-esteemed young adults stressed the fact that, since each of us owns a unique strand of DNA, each of us is unique, and therefore each of us is special. This was, I think, supposed to instill in each of us a sense of self-worth and self-value that could be relied upon in the event that our own internal processing and evaluation led us to believe that we weren't worth anything.

(There was a lot of this handed down at my high school, for example, particularly my freshman year when one of my swim team teammates committed suicide.)

With the benefit of thirty years' hindsight, I can pronounce this little experiment/effort something of a failure.

The reason I say this is because it has, it seems, spawned a generation of now-adults who are convinced that because they are unique, they are somehow different--that because of their uniqueness, the generalizations that we draw about people as a whole don't apply to them. I knew one woman (rather well) who told me, flat out, that she couldn't get anything out of going to therapy, because she was different from everybody else. "And if I'm different, then all of those things that the therapist thinks about everybody else won't apply to me." And before readers start thinking that she was a unique case, I've heard it in a variety of different forms from others, too, on a variety of different topics other than mental health. Toss in the study, quoted in a variety of different psych books, that something like 80% of the population thinks they are "above average", and you begin to get what I mean--somewhere, deep down, we've been led down this path that says "Because you are unique, you are different."

And folks, I hate to burst your bubble, but you're not.

Don't get me wrong, I understand that fundamentally, if you are unique, then by definition you are different from everybody else. But implicit in this discussion of the word "different" is an assumption that suggests that "different" means "markedly different", and it's in that distinction that the argument rests.

Consider this string of numbers for a second:

12345678901234567890123456789012345678901234567890
and this string of numbers:
12345678901234567890123456788012345678901234567890
These two strings are unique, but I would argue that they're not different--in fact, their contents differ by one digit (did you spot it?), but unless you're looking for the difference, they're basically the same sequential set of numbers. Contrast, then, the first string of numbers with this one:
19283746519283746519283746554637281905647382910000
Now, the fact that they are unique is so clear, it's obvious that they are different. Markedly different, I would argue.

If we look at your DNA, and we compare it to another human's DNA, the truth is (and I'm no biologist, so I'm trying to quote the numbers I was told back in high school biology), you and I share about 99% of the same DNA. Considering the first two strings above are exactly 98% the same (they differ by just one digit out of 50), if you didn't see those two strings as different, then I don't think you can claim to be markedly different from any other human, when you're half as different from them as those two strings are from each other.

(By the way, this is actually a very good thing, because medical science would be orders of magnitude more difficult, if not entirely impossible, to practice if we were all more different than that. Consider what life would be like if the MD had to study you, your body, for a few years before she could determine whether or not Tylenol would work on your biochemistry to relieve your headache.)

But maybe you're one of those who believes that the difference comes from your experiences--you're a "nurture over nature" kind of person. Leaving all the twins' research aside (the nature-ists' final trump card: a ton of research that shows twins engaging in similar actions and behaviors despite being raised in separate households, thus providing the best isolation of nature and nurture while still minimizing the variables), let's take a small quiz. How many of you have:

  1. kissed someone not in your family
  2. slept with someone not in your family
  3. been to a baseball game
  4. been to a bar
  5. had a one-night stand
  6. had a one-night stand that turned into "something more"
... we could go on, probably indefinitely. You can probably see where I'm going with this--if we look at the sum total of our experiences, we're going to find that a large percentage of our experiences are actually quite similar, particularly if we examine them at a high level. Certainly we can ask the questions at a specific enough level to force uniqueness ("How many of you have kissed Charlotte Neward on September 23rd 1990 in Davis, California?"), but doing so ignores a basic fact that despite the details, your first kiss with the man or woman you married has more in common with mine than not.

If you still don't believe me, go read your horoscope for yesterday, and see how much of that "prediction" came true. Then read the horoscope for yesterday for somebody born six months away from you, and see how much of that "prediction" came true. Or, if you really want to test this theory, find somebody who believes in horoscopes, and read them the wrong one, and see if they buy it as their own. (They will, trust me.) Our experiences share far more in common--possibly to the tune of somewhere in the high 90th percentiles.

The point to all of this? As much as you may not want to admit it, just because you are unique does not make you different. Your brain reacts the same ways as mine does, and your emotions lead you to make bad decisions in the same ways that mine does. Your uniqueness does not in any way exempt you from the generalizations that we can infer based on how all the rest of us act, behave, and interact.

This is both terrifying and reassuring: terrifying because it means that the last bastion of justification for self-worth, that you are unique, is no longer a place you can hide, and reassuring because it means that even if you are emotionally an absolute wreck, we know how to help you straighten your life out.

By the way, if you're a software dev and wondering how this applies in any way to software, all of this is true of software projects, as well. How could it not? It's a human exercise, and as a result it's going to be made up of a collection of experiences that are entirely human. Which again, is terrifying and reassuring: terrifying in that your project really isn't the unique exercise you thought it was (and therefore maybe there's no excuse for it being in such a deep hole), and reassuring in that if/when it goes off the rails into the land of dysfunction, it can be rescued.


Conferences | Development Processes | Industry | Personal | Reading | Social

Friday, November 30, 2012 10:03:48 PM (Pacific Standard Time, UTC-08:00)
Comments [2]  | 
 Wednesday, November 28, 2012
On Knowledge

Back during the Bush-Jr Administration, Donald Rumsfeld drew quite a bit of fire for his discussion of knowledge, in which he said (loosely paraphrasing) "There are three kinds of knowledge: what you know you know, what you know you don't know, and what you don't know you don't know". Lots of Americans, particularly those who were not kindly disposed towards "Rummy" in the first place, took this to be canonical Washington doublespeak, and berated him for it.

I actually think that was one of the few things Rumsfeld said that was worth listening to, and I have a slight amendment to the statement; but first, let's level-set and make sure we're all on the same page about what those first three categories mean, in real life, with a few assumptions along the way to simplify the discussion (as best we can, anyway):

  1. What you know you know. This is the category of information that the individual in question has studied to some level of depth: for a student of International Relations (as I was), this would be the various classes that they took and received (presumably) a passing grade in. For you, the reader of my blog, that would probably be some programming language and/or platform. This is knowledge that you have, in some depth, at a degree that most people would consider "factually accurate".
  2. What you know you don't know. This is the category of information that the individual in question has heard about, but has never studied to any level or degree: for the student of International Relations, this might be the subject of biochemistry or electrical engineering. For you, the reader of my blog, it might be certain languages that you've heard of, perhaps through this blog (Erlang, F#, Scala, Clojure, Haskell, etc) or data-storage systems (Cassandra, CouchDB, Riak, Redis, etc) that you've never investigated or even sat through a lecture about. This is knowledge that you realize you don't have.
  3. What you don't know you don't know. This is the category of information that the individual in question has never even heard about, and so therefore, by definition, has not only the lack of knowledge of the subject, but lacks the realization that they lack the knowledge of the subject. For the student of International Relations, this might be phrenology or Schrodinger's Cat. For you, the reader of my blog, it might be languages like Dylan, Crack, Brainf*ck, Ook, or Shakespeare (which I'm guessing is going to trigger a few Google searches) or platforms like BeOS (if you're in your early 20's now), AmigaOS (if you're in your early 30's now) or database tools/platforms/environments like Pick or Paradox. This is knowledge that you didn't realize you don't have (but, paradoxically, now that you know you don't have it, it moves into the "know you don't know" category).
Typically, this discussion comes up in my "Pragmatic Architecture" talk, because an architect needs to have a very clear realization of what technologies and/or platforms are in which of those three categories, and (IMHO) push as many of them from category #3 (don't know that you don't know) into category #2 (know you don't know) or, ideally, category #1 (know you know). Note that category #1 doesn't mean that you are the world's foremost expert on the thing, but you have some working knowledge of the thing in question--I don't consider myself to be an expert on Cassandra, for example, but I know enough that I can talk reasonably intelligently to it, and I know where I can get more in the way of details if that becomes important, so therefore I peg it in category #1.

But what if I'm wrong?

See, here's where I think there's a new level of knowledge, and it's one I think every software developer needs to admit exists, at least for various things in their own mind:

  • What you think you know. This is knowledge that you believe, in your heart of hearts, you have about a given subject.
Be honest with yourself: we've all met somebody in this industry who claims to have knowledge/expertise on a subject, and damn if they can't talk a good game. They genuinely believe, in fact, that they know the subject in question, and speak with the confidence and assurance that comes with that belief. (I'm assuming that the speaker in question isn't trying to deliberately deceive anyone, which may, in some cases, be a naive and/or false assumption, but I'm leaving that aside for now.) But, after a while, it becomes apparent, either to themselves or to the others around them, that the knowledge they have is either incorrect, out of date, out of context, or some combination of all three.

As much as "what you don't know you don't know" information is dangerous, "what you think you know" information is far, far more so, particularly because until you demonstrate to yourself that your information is actually correct, you're a danger and a liability to anyone who listens to you. Without regularly challenging yourself to some form of external review/challenge, you'll never exactly know whether what you know is real, or just made up from your head.

This is why, at every turn, your assumption should be that any information you have is partially or wholly incorrect until proven otherwise. Find out why you know something--what combination of facts/data leads you to believe that this is the case?--and you will quickly begin to discover whether that knowledge is real, or just some kind of elaborate self-deception.


Conferences | Development Processes | Industry | Personal | Review | Social

Wednesday, November 28, 2012 6:13:45 PM (Pacific Standard Time, UTC-08:00)
Comments [0]  | 
 Friday, November 23, 2012
On Tech, and Football

Today was Thanksgiving in the US, a holiday that is steeped in "tradition" (if a country with less than three hundred years of history can be said to have traditions, anyway). Americans gather in their homes with friends and family, prepare an absurdly large meal centered around a turkey, mashed potatoes, gravy, and "all the trimmings", and eat. Sometimes the guys go outside and play some football before the meal, while the gals drink wine and/or margaritas and prep the food, and the kids escape to video games or nerf gun wars outside, and so on.

One of these traditions commonly associated with this holiday is the National Football League (NFL, to those of you not familiar with American football): there is always a game on, and for whatever reason (tradition!), usually the game (or one of the games, if there's more than one--today there were three) is between the Dallas Cowboys and the Washington Redskins. I don't have the statistics handy, but I think those two teams have played on Thanksgiving like every year for the last four decades (or something like that).

This year, the Washington Redskins defeated the Dallas Cowboys 38-31. Apparently, it was quite the blowout in the second quarter, when Washington's rookie quarterback, Robert Griffin III, threw three touchdown passes in one quarter, then one more later in the game to become the first quarterback in Washington franchise history to throw back-to-back four-TD games. ESPN has all the details, if you're interested. What you won't find in that news report, however, is far more important than what you will find. For all the praise heaped on RGIII (as Mr. Griffin is known in sports circles), you will not hear one very interesting factoid:

RGIII is black.

So, it turns out, is Michael Vick (Philadelphia). So is Byron Leftwich (Pittsburgh's backup QB), as is Charlie Batch (the backup for Pittsburgh now that Leftwich is down for the season with an injury). In fact, despite the fact that no team in the NFL had a starting black quarterback just twenty or thirty years ago, the issue of race is pretty much "done" in the NFL: nobody cares what the race of the players is anymore, unless the player themselves makes an issue of it. After Doug Williams, the first black quarterback to win a Super Bowl, people just kinda... stopped caring.

What does this have to do with tech?

People have been making a big deal out of the lack of women (and minority, though women get better press) speakers in the software industry. This post, for example, implicitly suggests that somehow, women aren't getting the opportunities that they deserve:

Where are these opportunities? You don't see the opportunities that no one offers you. You don't see the suggestions, requests for collaboration, invitations to the user group, that didn't happen.

Where are these obstacles? Also invisible. They're a lack of inclusion, and of a single role model. They're not having your opinion asked for technical decisions. They're an absence of sponsorship -- of people who say in management meetings "Jason would make a great architect." Jason doesn't even know someone's speaking up for him, so how could Rokshana know she's missing this?

You can't see what isn't there. You can't fight for what you can't see.

I take issue with a couple of these points. Not everyone deserves the opportunity: sometimes an opportunity is not handed to you, not because you're a woman, but because you're not willing to go after it. Look, as much as we may want to pretend that everybody is equal, that everybody can produce the same results given the same inputs, if you put a football in my hand and ask me to make the throw 85 yards down the field into a target area that's about the diameter of your average trash can, I'm not going to generate the same results that RGIII can. He's bigger than me, stronger than me, faster than me, and so on. What's more, even if I put in the same kinds of hours into practicing and training and bodybuilding and so forth, he's still going to get the nod, because he's been aggressive about pursuing the opportunities that gave people the confidence to put the ball in his hands in the fourth quarter. Me? Not so much. It wasn't that I didn't have the opportunities, it's that I chose not to take them when those opportunities arose.

Some people choose to not see opportunities. Some people choose other opportunities--when the choice comes down to staying a few extra hours to get stuff done at work, versus going home to spend time with your family, regardless of which one you choose, that choice will have consequences. The IT worker who chooses to stay will often be rewarded by being given opportunities to pursue additional opportunities at work and/or promotions and/or recognition; the one who chooses to go home will often be rewarded by a deeper connection to their family. The one who stays gets labeled "workaholic"; the one who goes home gets labeled "selfish" or "not committed to the project". Toh-may-toh, toh-mah-toh.

I don't care what gender you are--this choice applies equally to you.

Contrary to what the other blogger seems to imply, there is no secret "Men's IT Success Club", identifying promising members and giving them the necessary secret training to succeed. Nobody ever held a hand out to me and said, "Dude, you're smart. You should get ahead in life--let me help you get there." I had to take risks. I had to put myself out there. I got lucky, in a lot of ways, but don't for a second think that it was all me or it was all luck, it was a combination of the two. When I was sitting in meetings, as just a Programmer I, I had to weigh very carefully the risks of speaking up in the meeting or keeping quiet. Speaking up gets you noticed--and if you're wrong, you get shot down very quickly. Staying quiet lets you fly under the radar and avoids humiliation, but also doesn't get your boss' attention or demonstrate that you have a strong grasp of the situation.

I don't care what gender you are--this choice applies equally to you.

Sure, maybe someone will notice you and offer you that hand up. Someone will recognize your talents and say, "Damn, I think you'd be good at this, are you interested?", and if you say yes, smooth the road for you and mentor you and give you opportunities that would've taken you years otherwise to create for yourself. But notice, at the front of that sentence, I said, "Someone will recognize your talents", and in the middle I said, "if you say yes". Your talents have to be on display, and you have to say yes. Neglecting either of these will remove those opportunities. Not taking the risk to show off your talents takes away the opportunity. Not taking the risk by saying yes takes away the opportunity.

Frankly, I'm appalled that she says we have to:

  1. Create explicit opportunities to make up for the implicit ones minorities aren't getting. Invite women to speak, create minority-specific scholarships, make extra effort to reach out to underrepresented people.
  2. Make conscious effort to think about including everyone on the team in decisions. Don't always go with your gut for whom to invite to the table.
  3. Don't interrupt a woman in a meeting. (I catch myself doing this, now that I know it's a problem.) Listen, and ask questions.
  4. If you are a woman, be the first woman in the room. We are the start of making others feel like they belong.
My thoughts in response, in order:
  1. I call bull. The call for speakers should always be color- and gender-blind. If a woman speaker wants to be taken seriously, she has to be asked to speak because she is a good speaker, not because she has boobies. To offer women speakers a lower bar means essentially that she's still not equal, that she's there only because she's a woman and "we need to have a few of those to liven the place up". Yep, that's 1950's sexism talking, and it horrifies me that someone could suggest that with a straight face. Particularly someone who hasn't had to scrabble her way into conferences like other speakers have had to.
  2. I call bull. There are some decisions that are appropriate for the entire team to make, there are some decisions that only the team leads and/or architects should make, and there are some decisions that are best made by someone within the team who has the technical background to make them--for example, asking me about CSS or which client-side Javascript library to use is rather foolish, since I don't really have the background to make a good call. RGIII doesn't ask the offensive linemen where he should throw the ball, and they don't ask him how they should react to the hand slap that the defensive end throws out as he tries to go around them. No one should be deliberately excluded from a conversation they can contribute to, no, but then again, no one should be included in meetings for which they have no expertise. Want to be in on that meeting? Develop the expertise first, then look for the chance to demonstrate it--they're always there, if you look for them.
  3. Don't interrupt a woman in a meeting? How about, don't interrupt ANYONE in a meeting? If interruptions are a sign of disrespect, then those signs should be removed regardless of gender. If interruptions are just a way that teams generate flow (and I believe they are, based on my own experiences), then artificially establishing that rule means that the woman is an artificial barrier to the "form/storm/norm" process.
  4. If you are a woman, then sure, keep an eye out for the other women in the room that may want to be where you are now. But if you're a man, keep an eye out for the other men in the room that seek the same opportunities, and help them. If you're black, keep an eye out for the other blacks, Asian for the other Asians, and... Well, wait, no, come to think of it, women could mentor men, and men could mentor women, and blacks for Asians and Asians for blacks, and... How about you just keep your eyes open for anyone that shows the talent and drive, and reward that with your offer of mentorship and aid?

Within the NFL, a rule was established demanding that teams interview at least one minority for any open coaching position; it was a rule designed to make sure that blacks and other minorities could make it into the very top rungs of coaching. Today, I'm guessing somewhere between a quarter and a third of the NFL teams are led by a minority head coach. But no such rule, to my knowledge, has ever been passed about which players are taken for which positions. Despite the adage a few decades ago that "blacks aren't cerebral enough to play quarterback", I'm guessing that about a quarter to a third of the quarterbacks in the league are black, and several have won a Super Bowl. This, despite absolutely no artificial aids designed to help them.

Women in IT don't need special rules or special favors. They don't need some kind of corporate return to chivalry--they're not some kind of "weaker sex" that needs special help. If a woman today wants to become a speaker, the opportunities are there. Maybe it's not a keynote session at a 20,000-person industry-spanning show, but hey, not a lot of men get those opportunities, either. Some opportunities are earned, not just offered. So rather than trying to force organizations to offer opportunities to women, maybe women should look to themselves and ask, "What do I need to do to earn that opportunity?" Instead of insisting that women be given a handout, insist that everyone be given the same chance, based on merit, not genital plumbing.

Because then, it's a choice, and one you can make for yourself.


Conferences | Industry | Personal | Social

Friday, November 23, 2012 12:51:12 AM (Pacific Standard Time, UTC-08:00)
Comments [5]  | 
 Saturday, November 03, 2012
Cloud legal

There's an interesting legal interpretation coming out of the Electronic Frontier Foundation (EFF) around the Megaupload case, and the EFF has said this:

"The government maintains that Mr. Goodwin lost his property rights in his data by storing it on a cloud computing service. Specifically, the government argues that both the contract between Megaupload and Mr. Goodwin (a standard cloud computing contract) and the contract between Megaupload and the server host, Carpathia (also a standard agreement), "likely limit any property interest he may have" in his data. (Page 4). If the government is right, no provider can both protect itself against sudden losses (like those due to a hurricane) and also promise its customers that their property rights will be maintained when they use the service. Nor can they promise that their property might not suddenly disappear, with no reasonable way to get it back if the government comes in with a warrant. Apparently your property rights "become severely limited" if you allow someone else to host your data under standard cloud computing arrangements. This argument isn't limited in any way to Megaupload -- it would apply if the third party host was Amazon's S3 or Google Apps or or Apple iCloud."
Now, one of the participants on the Seattle Tech Startup list, Jonathan Shapiro, wrote this as an interpretation of the government's brief and the EFF filing:

What the government actually says is that the state of Mr. Goodwin's property rights depends on his agreement with the cloud provider and their agreement with the infrastructure provider. The question ultimately comes down to: if I upload data onto a machine that you own, who owns the copy of the data that ends up on your machine? The answer to that question depends on the agreements involved, which is what the government is saying. Without reviewing the agreements, it isn't clear if the upload should be thought of as a loan, a gift, a transfer, or something else.

Lacking any physical embodiment, it is not clear whether the bits comprising these uploaded digital artifacts constitute property in the traditional sense at all. Even if they do, the government is arguing that who owns the bits may have nothing to do with who controls the use of the bits; that the two are separate matters. That's quite standard: your decision to buy a book from the bookstore conveys ownership to you, but does not give you the right to make further copies of the book. Once a copy of the data leaves the possession of Mr. Goodwin, the constraints on its use are determined by copyright law and license terms. The agreement between Goodwin and the cloud provider clearly narrows the copyright-driven constraints, because the cloud provider has to be able to make copies to provide their services, and has surely placed terms that permit this in their user agreement. The consequences for ownership are unclear. In particular: if the cloud provider (as opposed to Mr. Goodwin) makes an authorized copy of Goodwin's data in the course of their operations, using only the resources of the cloud provider, the ownership of that copy doesn't seem obvious at all. A license may exist requiring that copy to be destroyed under certain circumstances (e.g. if Mr. Goodwin terminates his contract), but that doesn't speak to ownership of the copy.

Because no sale has occurred, and there was clearly no intent to cede ownership, the Government's challenge concerning ownership has the feel of violating common sense. If you share that feeling, welcome to the world of intellectual property law. But while everyone is looking at the negative side of this argument, it's worth considering that there may be positive consequences of the Government's argument. In Germany, for example, software is property. It is illegal (or at least unenforceable) to write a software license in Germany that stops me from selling my copy of a piece of software to my friend, so long as I remove it from my machine. A copy of a work of software can be resold in the same way that a book can be resold because it is property. At present, the provisions of UCITA in the U.S. have the effect that you do not own a work of software that you buy. If the district court in Virginia determines that a recipient has property rights in a copy of software that they receive, that could have far-reaching consequences, possibly including a consequent right of resale in the United States.

Now, whether or not Jon's interpretation is correct, there are some huge legal implications of this interpretation of the cloud, because data "ownership" is going to be the defining legal issue of the next century.


.NET | Android | Azure | C# | C++ | Conferences | Development Processes | F# | Flash | Industry | iPhone | Java/J2EE | Languages | LLVM | Mac OS | Objective-C | Parrot | Personal | Python | Reading | Review | Ruby | Scala | Security | Social | Solaris | Visual Basic | VMWare | WCF | Windows | XML Services | XNA

Saturday, November 03, 2012 12:14:40 AM (Pacific Daylight Time, UTC-07:00)
Comments [0]  | 
 Sunday, October 21, 2012
On JDD2012

There aren't many times that I cancel out of a conference (fortunately), so when I do I often feel a touch of guilt, even if I have to cancel for the best of reasons. (I'd like to think that if I have to cancel my appearance at a conference, it's only for the best of reasons, but obviously there may be others who disagree--I won't get into that.)

The particular case that merits this blog post is my lack of appearance at the JDD 2012 show (JDD standing for "Java Developer Days") in Krakow, Poland. Don't get me wrong, I love that show--Krakow is a fun city, quickly establishing itself as a university town (hellooo night clubs and parties!) as well as something of a Polish Silicon Valley, or so I've been told. (Actually, I think Krakow has a history of being a university town, but the tech angle to it is fairly recent.) My previous trips there have always been wonderful experiences, and when the organizers and I discussed my attendance at this year's show back at the start of the calendar year, I was looking forward to it.

Unfortunately, my current employer took issue with my European travels, stating something to the effect that "three trips to Europe in five weeks' time is not a great value for us"; couple that with the fact that there was a US speaker going to the show (whom, ironically, I helped get there) that I didn't particularly want to be around, and the fact that I'd be just walking off the plane from London before I'd have to get back on a plane to Krakow... *shrug* It was just a little too much all at once. Regretfully, I emailed Slawomir (the organizer) and told him I was going to have to cancel.

Any one of these, I'd have bulled my way through. Two of them, I probably still would have shown up. But all three.... I just decided that the divine heavens had spoken, and I should just take the message and stay home. And let the message be very clear here, there was no fault or blame about this decision to be laid anywhere but at my feet--if you're at JDD now and you're pissed that I'm not there, then you should blame me, and not the organizers. (But honestly, with Rebecca Wirfs-Brock and Adam Bien there, you're getting some top-notch content, so you probably won't even miss me.)

And yes, assuming I haven't burned a bridge with the organizers (and I think we're all good on that score), I sincerely hope to be back there in 2013; Polish attendees and conference organizers are off the hook when it comes to making a speaker feel welcome.


Android | Conferences | Industry | Java/J2EE | Languages | Personal | Scala

Sunday, October 21, 2012 12:12:07 AM (Pacific Daylight Time, UTC-07:00)
Comments [0]  | 
 Friday, October 12, 2012
On Equality

Recently (over the last half-decade, so far as I know) there's been a concern about the numbers of women in the IT industry, and in particular the noticeable absence of women leaders and/or industry icons in the space. All of the popular languages (C, C++, Java, C#, Scala, Groovy, Ruby, you name it) have been invented by or are represented publicly by men. The industry speakers at conferences are nearly all men. The rank-and-file that populate the industry are men. And this strikes many as a bad thing.

Honestly, I used to be a lot more concerned than I am today. While I'm sure that many will see my statements and position that follows as misogynistic and/or discriminatory, let me be the first to suggest quite plainly that I have nothing against any woman who wants to be a programmer, who wants to be an industry speaker, or who wants to create a startup and/or language and/or library and/or framework and/or tool and/or any other role of leadership and authority within the industry. I have always felt that this industry is more merit-based than any other I have ever had direct or indirect contact with. There is no need for physical strength, there is no need for dexterity or mobility, there is no need for any sort of physical stress tolerances (such as the G forces fighter pilots incur during aerial combat which, by the way, women are actually scientifically better at handling than men), there really even is no reason that somebody who is physically challenged couldn't excel here. So long as you can type (or, quite frankly, have some other mechanism by which you can put characters into an IDE), you can program.

And no, I have no illusions that somehow men are biologically wired better to be leaders. In fact, I think that as time progresses, we will find that the stereotypical characteristics that we ascribe to each of the genders (male competitiveness and female nurturing) each serve incredibly useful purposes in the IT world. Cathi Gero, for example, was once referred to by a client in my presence as "the Mom of the IT department"--by which they meant, Cathi would simply not rest until everything was exactly as it should be, a characteristic that they found incredibly comforting and supportive. Exactly the kind of characteristic you would want from a highly-paid consultant: that they will stick with you through all the mess until the problem is solved.

And no, I also have no illusions that somehow I understand what it's been like to be a woman in IT. I've never experienced the kind of "automatic discrimination" that women sometimes describe, like being mistaken for a recruiter at a technical conference rather than a programmer. I won't even begin to try and pretend that I know what that's like.

Unless, of course, I can understand it by analogy, such as when a woman sees me walking down the street, and crosses the street ahead of me so that she won't have to share the sidewalk, for even a second, with a long-haired, goateed six-foot-plus stranger. She has no reason to assume I represent any threat to her other than my physical appearance, but still, her brain makes the association, and she chooses to avoid the potential possibility of threat. Still, that's probably not the same.

What I do think, quite bluntly, is that one of the reasons we don't have more women in IT is because women simply choose not to be here.

Yes, I know, there are dozens of stories of misogynistic behavior at conferences, and dozens more stories of discriminatory behavior. Dozens of stories of "good ol' boys" behavior making women feel isolated, and dozens of stories of women feeling like they had to over-compensate for their gender in order to be heard and respected. But for each conference story where a woman felt offended by a speaker's use of a sexual epithet or joke, there are dozens of conferences where no such story ever emerges.

I'm reminded of a story, perhaps an urban myth, of a speaker at a leadership conference who stood in front of a crowd, took a black marker, made a small circle in the middle of a flip board, and asked a person in the first row what they saw. "A black spot", they replied. A second person said the same thing, and a third. Finally, after about a half-dozen responses of "a black spot", the speaker said, "All of you said you saw the same thing: a black spot. I'm curious as to why none of you saw the white background behind it".

It's easy for us to focus on the outlier and give it our attention. It's even easier when we see several of them, and if they come in a cluster, we call it a "dangerous trend" and "something that must be addressed". But how easy it is, then, to miss the rest of the field, in the name of focusing on the outlier.

My ex-apprentice wants us to proactively hire women instead of men in order to address this lack:

Bring women to the forefront of the field. If you're selecting a leader and the best woman you can find is not as qualified as the best man you can find, (1) check your numbers to make sure unintentional bias isn't working against her, and (2) hire her anyway. She is smart and she will rise to the occasion. She is not as experienced because women haven't been given these opportunities in the past. So give it to her. Next round, she will be the most qualified. Am I advocating affirmative action in hiring? No, I'm advocating blind hiring as much as is feasible. This has worked for conferences that do blind session selection and seek out submissions from women. However, I am advocating deliberate bias in favor of a woman in promotions, committee selection, writing and speaking solicitation, all technical leadership positions. The small biases have multiplied until there are almost no women in the highest technical levels of the field.
But you can't claim that you're advocating "blind hiring" while you're saying "hire her anyway" if she "is not as qualified as the best man you can find". This is, by definition, affirmative action, and while it does put women into those positions, it doesn't address the underlying problem--that she isn't as qualified. There is no reason that she shouldn't be as qualified as the man, so why are we giving her a pass? Why is it this company's responsibility to fix the industry at a cost to themselves? (I'm assuming, of course, that there is a lost productivity or lost innovation or some other cost to not hiring the best candidate they can find; if such a loss doesn't exist, then there's no basis for assuming that she isn't equally qualified as the man.)

Did women routinely get "railroaded" out of technical directions (math and science) and into more "soft areas" (English and fine arts) in schools back when I was a kid? Yep. Studies prove that. My wife herself tells me that she was "strongly encouraged" to take more English classes than math or science back in junior high and high school, even when her grades in math and science were better than those in English. That bias happened. But does it happen with girls today? Studies I'm reading about third-hand suggest not appreciably. And even if you were discriminated against back then, what stops you now? If you're reading this, you have a computer, so what stops you now from pursuing that career path? Programming today is not about math and science--it's about picking up a book, downloading a free SDK and/or IDE, and diving in. My background was in International Relations--I was never formally trained, either. Has it held me back? You betcha--there are a few places that refused to hire me because I didn't have the formal CS background to be able to select the right algorithm or do big-O analysis. Didn't seem to stop me--I just went and interviewed someplace else.

Equality means equality. If a woman wants to be given the same respect as a man, then she has to earn it the same way he does, by being equally qualified and equally professional. It is this "we should strengthen the weak" mentality that leads to soccer games with no score kept, because "we're all winners". That in turn leads to children that then can't handle it when they actually do lose at something, which they must, eventually, because life is not fair. It never will be. Pretending otherwise just does a disservice to the women who have put in the blood, sweat, and tears to achieve the positions of prominence and respect that they earned.

Am I saying this because I worry that preferential treatment to women speakers at conferences and in writing will somehow mean there are fewer opportunities for me, a man? Some will accuse me of such, but those who do probably don't realize that I turn down more conferences than I accept these days, and more writing opportunities as well. In fact, regardless of your gender, there are dozens, if not hundreds, of online portals and magazines that are desperate for authors to write quality work--if you're at all stumped trying to write for somebody, then you're not trying very hard. And every week user groups across the country are canceled for lack of a speaker--if you're trying to speak and you're not, then you're either setting your bar too high ("If I don't get into TechEd, having never spoken before in my life, it must be because I'm a woman, not that I'm not a qualified speaker!") or you're really not trying ("Why aren't the conferences calling me about speaking there?").

If you're a woman, and you're thinking about a career in IT, more power to you. This industry offers more opportunity and room for growth than any other I've yet come across. There are dozens of meetings and meetups and conferences that are springing into place to encourage you and help you earn that distinction. Yes, as you go you will want and/or need help. So did I. You need people that will help you sharpen your skills and improve your abilities, yes. But a specific and concrete bias in your favor? No. You don't need somebody's charity.

Because if you do, then it means that you're admitting that you can't do it on your own, and you aren't really equal. And that, I think, would be the biggest tragedy of the whole issue.

Flame away.


Conferences | Development Processes | Industry | Personal | Reading | Security | Social

Friday, October 12, 2012 2:17:22 AM (Pacific Daylight Time, UTC-07:00)
Comments [2]  | 
 Friday, March 16, 2012
Just Say No to SSNs

Two things conspire to bring you this blog post.

Of Contracts and Contracts

First, a few months ago, I was asked to participate in an architectural review for a project being done for one of the states here in the US. It was a project dealing with some sensitive information (Child Welfare Services), and I was required to sign a document basically promising not to do anything bad with the data. Not a problem to sign, since I was going to be more focused on the architecture and code anyway, and would stay away from the production servers and data as much as I possibly could. But then the state agency asked for my social security number, and when I pushed back asking why, they told me it was “mandatory” in order to work on the project. I suspect it was for a background check—but when I asked how long they were going to hold on to the number and what their privacy policy was regarding my data, they refused to answer, and I never heard from them again. Which, quite frankly, was something of a relief.

Second, just tonight there was a thread on the Seattle Tech Startup mailing list about SSNs again. This time, a contractor who participates on the list was being asked by the contracting agency for his SSN, not for any tax document form, but… just because. This sounded fishy. It turned out that the contract was going to be with AT&T, and that they commonly use a contractor’s SSN as a way of identifying the contractor in their vendor database. It was also noted that many companies do this, and that it was likely that many more would do so in the future. One poster pointed out that when the state attorney general’s office was contacted about this practice, the answer that came back was that it isn’t illegal.

Folks, this practice has to stop. For both your sake, and the company’s.

Of Data and Integrity

Using SSNs in your database is just a bad idea from top to bottom. For starters, it makes your otherwise-unassuming enterprise application a ripe target for hackers, who seek to gather legitimate SSNs as part of the digital fingerprinting of potential victims for identity theft. What’s worse, any time I’ve ever seen any company store the SSNs, they’re almost always stored in plaintext form (“These aren’t credit cards!”), and they’re often used as a primary key to uniquely identify individuals.

There’s so many things wrong with this idea from a data management perspective, it’s shameful.

  • SSNs were never intended for identification purposes. Yeah, this is a weak argument now, given all the de facto uses to which they are put already, but when FDR signed the Social Security program into law back in the ’30s, he promised the country that the numbers would never be used for identification purposes. This is, in fact, why the card reads “This number not to be used for identification purposes” across the bottom. Granted, every financial institution with whom I’ve ever done business has ignored that promise for as long as I’ve been alive, but that doesn’t strike me as a reason to continue doing so.
  • SSNs are not unique. There are rumors of two different people being issued the same SSN, and while I can’t confirm or deny this based on personal experience, it doesn’t take a rocket scientist to figure out that if there are 300 million people living in the US, and the SSN is a nine-digit number, then there are at most 999,999,999 potential numbers. Even that ceiling isn’t reachable, because the first three digits are a stratification mechanism—for example, California-issued numbers are generally in the 5xx range, while East Coast-issued numbers are in the 0xx range. SSNs are also reportedly recycled, so your new baby may end up with some recently-deceased individual’s SSN. As we start to see databases extending to a second and possibly even third generation of individuals, these kinds of conflicts are going to become even more common. As the US population continues to rise, and immigration brings even more people into the country to work, how soon before we start seeing the US government sweat the problems associated with trying to go to a 10- or 11-digit SSN? It’s going to make the IPv4-to-IPv6 transition look trivial by comparison. (Look for that to be the moment when the US government formally adopts a hexadecimal system for SSNs.)
  • SSNs are sensitive data. You knew this already. But what you may not realize is that data not only has a tendency to escape the organization that gathered it (databases are often sold, acquired, or stolen), but that said data frequently lives far, far longer than it needs to. Look around in your own company—how many databases are still online, in use, even though the data isn’t really relevant anymore, just because “there’s no cost to keeping it”? More importantly, companies are increasingly being held accountable for sensitive information breaches, and it’s just a matter of time before a creative lawyer, seeking to tap into the public’s sensitivities to things they don’t understand, takes a company to court and sues for damages over such a breach. And there are very likely more than a few judges in the country sympathetic to the idea. Do you really want to be hauled up on the witness stand to defend your use of the SSN in your database?

Given that SSNs aren’t unique, and therefore fail as their primary purpose in a data management scheme, and that they represent a huge liability because of their sensitive nature, why on earth would you want them in your database?

A Call

But more importantly, companies aren’t going to stop using them for these kinds of purposes until we make them stop. Any time a company asks you for your SSN, challenge them. Ask them why they need it, whether the transaction can be completed without it, and, if they insist on having it, for a formal declaration of their sensitive-information policy and of what kind of notification and compensation you can expect when they suffer a data breach. It may take a while to find somebody within the company who can answer your questions at the places that legitimately need the information, but you’ll get there eventually. As for the rest of the companies, the ones that gather it “just in case”: if it starts turning into a huge PITA to get the number, they’ll find other ways to figure out who you are.

This is a call to arms, folks: Just say NO to handing over your SSN.


.NET | Android | Azure | C# | C++ | Conferences | Development Processes | F# | Flash | Industry | iPhone | Java/J2EE | Languages | LLVM | Mac OS | Objective-C | Parrot | Personal | Python | Reading | Review | Ruby | Scala | Security | Social | Solaris | Visual Basic | VMWare | WCF | Windows | XML Services | XNA

Friday, March 16, 2012 11:10:49 PM (Pacific Daylight Time, UTC-07:00)
Comments [1]  | 
 Wednesday, January 25, 2012
Is Programming Less Exciting Today?

As discriminatory as this is going to sound, this one is for the old-timers. If you started programming after the turn of the millennium, I don’t know if you’re going to be able to follow the trend of this post—not out of any serious deficiency on your part, hardly that. But I think this is something only the old-timers are going to identify with. (And thus do I alienate probably 80% of my readership, but so be it.)

Is it me, or is programming just less interesting today than it was two decades ago?

By all means, shake your smartphones and other mobile devices at me and say, “Dude, how can you say that?”, but in many ways programming for Android and iOS reminds me of programming for Windows and Mac OS two decades ago. HTML 5 and JavaScript remind me of ten years ago, the first time HTML and JavaScript came around. The discussions around programming languages remind me of the discussions around C++. The discussions around NoSQL remind me of the arguments both for and against relational databases. It all feels like we’ve been here before, with only the names having changed.

Don’t get me wrong—if any of you comment on the differences between HTML 5 now and HTML 3.2 then, or the degree of the various browser companies agreeing to the standard today against the “browser wars” of a decade ago, I’ll agree with you. This isn’t so much of a rational and logical discussion as it is an emotive and intuitive one. It just feels similar.

To be honest, I get this sense that across the entire industry right now, there’s a sort of malaise, a general sort of “Bah, nothing really all that new is going on anymore”. NoSQL is re-introducing storage ideas that had been around before but were discarded (perhaps injudiciously and too quickly) in favor of the relational model. Functional languages have obviously been in place since the 50’s (in Lisp). And so on.

More importantly, look at the Java community: what truly innovative ideas have emerged here in the last five years? Every new open-source project or commercial endeavor seems to be either a refinement of an idea that came before it (how many different times are we going to create a new Web framework, guys?) or an attempt to leverage an idea coming from somewhere else (be it from .NET or from Ruby or from JavaScript or….). With the upcoming .NET 4.5 release and Windows 8, Microsoft is holding out very few “new and exciting” bits for the community to invest emotionally in: we hear about “async” in C# 5 (something that F# has had already, thank you), and of course there is WinRT (another platform or virtual machine… sort of), and… well, honestly, didn’t we just do this a decade ago? Where are the WCFs, the WPFs, the Silverlights, the things that would get us fired up? Hell, even a new approach to data access might stir some excitement. Node.js feels like an attempt to reinvent the app server, but if you look back far enough you see that the app server itself was reinvented once already (in the Java world) by Spring and the other lightweight frameworks, and before that by people who actually thought to write their own web servers in straight Java. (And, for the record, the whole event-driven I/O thing is something that’s been done in both Java and .NET a long time before now.)
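
To put a finer point on that last parenthetical, here’s a minimal sketch (mine, purely for illustration, not production code) of event-driven I/O using nothing but the JDK’s own java.nio selectors, which shipped with JDK 1.4 back in 2002:

    import java.net.InetSocketAddress;
    import java.nio.channels.SelectionKey;
    import java.nio.channels.Selector;
    import java.nio.channels.ServerSocketChannel;
    import java.util.Iterator;

    // One thread, one loop, dispatch-on-readiness: the same model Node.js
    // is being celebrated for, in plain Java.
    public class TinyEventLoop {
        public static void main(String[] args) throws Exception {
            Selector selector = Selector.open();
            ServerSocketChannel server = ServerSocketChannel.open();
            server.configureBlocking(false);
            server.socket().bind(new InetSocketAddress(8080));
            server.register(selector, SelectionKey.OP_ACCEPT);

            while (true) {
                selector.select();  // block until some channel is ready
                Iterator<SelectionKey> keys = selector.selectedKeys().iterator();
                while (keys.hasNext()) {
                    SelectionKey key = keys.next();
                    keys.remove();
                    if (key.isAcceptable()) {
                        // accept the connection (and, for brevity, immediately close it)
                        server.accept().close();
                    }
                }
            }
        }
    }

Granted, the JavaScript version reads more nicely, but the model itself is a decade old.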

And as much as this is going to probably just throw fat on the fire, all the excitement around JavaScript as a language reminds me of the excitement about Ruby as a language. Does nobody remember that Sun did this once already, with Phobos? Or that Netscape did this with LiveScript? JavaScript on the server end is not new, folks. It’s just new to the people who’d never seen it before.

In years past, there has always seemed to be something deeper, something more exciting and more innovative that drives the industry in strange ways. Artificial Intelligence was one such thing: the search to try and bring computers to a state of human-like sentience drove a lot of interesting ideas and concepts forward, but over the last decade or two, AI seems to have lost almost all of its luster and momentum. User interfaces—specifically, GUIs—were another force for a while, until GUIs got to the point where they were so common and so deeply rooted in their chosen pasts (the single-button mouse of the Mac, the menubar-per-window of Windows, etc) that they left themselves little room to maneuver. At least this is one area where Microsoft is (maybe) putting the fatted sacred cow to the butcher’s knife, with their Metro UI moves in Windows 8… but only up to a point.

Maybe I’m just old and tired and should hang up my keyboard and go take up farming, then go retire to my front porch’s rocking chair and practice my Hey you kids! Getoffamylawn! or something. But before you dismiss me entirely, do me a favor and tell me: what gets you excited these days? If you’ve been programming for twenty years, what about the industry today gets your blood moving and your mind sharpened?


.NET | Android | Azure | C# | C++ | Development Processes | F# | Flash | Industry | iPhone | Java/J2EE | Languages | LLVM | Mac OS | Objective-C | Parrot | Personal | Python | Ruby | Scala | Security | Social | Solaris | Visual Basic | VMWare | WCF | Windows | XML Services | XNA

Wednesday, January 25, 2012 3:24:43 PM (Pacific Standard Time, UTC-08:00)
Comments [34]  | 
 Sunday, January 01, 2012
Tech Predictions, 2012 Edition

Well, friends, another year has come and gone, and it's time for me to put my crystal ball into place and see what the upcoming year has for us. But, of course, in the long-standing tradition of these predictions, I also need to put my spectacles on (I did turn 40 last year, after all) and have a look at how well I did in this same activity twelve months ago.

Let's see what unbelievable gobs of hooey I slung last year came even remotely to pass. For 2011, I said....

  • THEN: Android’s penetration into the mobile space is going to rise, then plateau around the middle of the year. Android phones, collectively, have outpaced iPhone sales. That’s a pretty significant statistic—and it means that there are fewer customers left to buy smartphones in the coming year. More importantly, the first generation of Android slates (including the Galaxy Tab, which I own), are less-than-sublime, and not really an “iPad Killer” device by any stretch of the imagination. And I think that will slow down people buying Android slates and phones, particularly since Google has all but promised that Android releases will start slowing down.
    • NOW: Well, I think I get a point for saying that Android's penetration will rise... but then I lose it for suggesting that it would slow down. Wow, was I wrong on that. Once Amazon put the Kindle Fire out, suddenly for the first time Android tablets began to appear in people's hands in record numbers. The drawback here is that most people using the Fire don't realize it's an Android tablet, which certainly hurts Google's brand-awareness (not that Amazon really seems to mind), but the upshot is simple: people are still buying devices, even though they may already own one. Which amazes me.
  • THEN: Windows Phone 7 penetration into the mobile space will appear huge, then slow down towards the middle of the year. Microsoft is getting some pretty decent numbers now, from what I can piece together, and I think that’s largely the “I love Microsoft” crowd buying in. But it’s a pretty crowded place right now with Android and iPhone, and I’m not sure if the much-easier Office and/or Exchange integration is enough to woo consumers (who care about Office) or business types (who care about Exchange) away from their Androids and iPhones.
    • NOW: Despite the catastrophic implosion of RIM (thus creating a huge market of people looking to trade their Blackberrys in for other mobile phones, ones which won't all go down when a RIM server implodes), WP7 has definitely not emerged as the "third player" in the mobile space; or, perhaps more precisely, they feel like a distant third, rather than a credible alternative to the other two. In fact, more and more it just feels like this is a two-horse race and Microsoft is in it still because they're willing to throw loss after loss to stay in it. (For what reason, I'm not sure--it's not clear to me that they can ever reach a point of profitability here, even once Nokia makes the transition to WP7, which is supposedly going to take years. On the order of a half-decade or so.) Even living here in Redmond, where I would expect the WP7 concentration to be much, much higher than anywhere else in the world, it's still more common to see iPhones and 'droids in people's hands than it is to see WP7 phones.
  • THEN: Android, iOS and/or Windows Phone 7 becomes a developer requirement. Developers, if you haven’t taken the time to learn how to program one of these three platforms, you are electing to remove yourself from a growing market that desperately wants people with these skills. I see the “mobile native app development” space as every bit as hot as the “Internet/Web development” space was back in 2000. If you don’t have a device, buy one. If you have a device, get the tools—in all three cases they’re free downloads—and start writing stupid little apps that nobody cares about, so you can have some skills on the platform when somebody cares about it.
    • NOW: Wow, yes. Right now, if you are a developer and you haven't spent at least a little time learning mobile development, you are excluding yourself from a development "boom" that rivals the one around Web sites in the mid-90's. Seriously: remember when everybody had to have a website? That's the mentality right now with a ton of different companies--"we have to have a mobile app!" "But we sell condom lubricant!" "Doesn't matter! We need a mobile app! Build us something! Go go go go go!"
  • THEN: The Windows 7 slates will suck. This isn’t a prediction, this is established fact. I played with an “ExoPC” 10” form factor slate running Windows 7 (Dell I think was the manufacturer), and it was a horrible experience. Windows 7, like most OSes, really expects a keyboard to be present, and a slate doesn’t have one—so the OS was hacked to put a “keyboard” button at the top of the screen that would slide out to let you touch-type on the slate. I tried to fire up Notepad and type out a haiku, and it was an unbelievably awkward process. Android and iOS clearly own the slate market for the foreseeable future, and if Dell has any brains in its corporate head, it will phone up Google tomorrow and start talking about putting Android on that hardware.
    • NOW: Yeah, that was something of a "gimme" point (but I'll take it). Windows7 on a slate was a Bad Idea, and I'm pretty sure the sales reflect that. Conduct your own anecdotal poll: see if you can find a store somewhere in your town or city that will actually sell you a Windows7 slate. Can't find one? I can--it's the Microsoft store in town, and I'm not entirely sure they still stock them. Certainly our local Best Buy doesn't.
  • THEN: DSLs mostly disappear from the buzz. I still see no strawman (no “pet store” equivalent), and none of the traditional builders-of-strawmen (Microsoft, Oracle, etc) appear interested in DSLs much anymore, so I think 2010 will mark the last year that we spent any time talking about the concept.
    • NOW: I'm going to claim a point here, too. DSLs have pretty much left us hanging. Without a strawman for developers to "get", the DSL movement has more or less largely died out. I still sometimes hear people refer to something that isn't a programming language but does something technical as a "DSL" ("That shipping label? That's a DSL!"), and that just tells me that the concept never really took root.
  • THEN: Facebook becomes more of a developer requirement than before. I don’t like Mark Zuckerberg. I don’t like Facebook’s privacy policies. I don’t particularly like the way Facebook approaches the Facebook Connect experience. But Facebook owns enough people to be the fourth-largest nation on the planet, and probably commands an economy of roughly that size to boot. If your app is aimed at the Facebook demographic (that is, everybody who’s not on Twitter), you have to know how to reach these people, and that means developing at least some part of your system to integrate with it.
    • NOW: Facebook, if anything, has become more important through 2011, particularly for startups looking to get some exposure and recognition. Facebook continues to screw with their user experience, though, and they keep screwing with their security policies, and as "big" a presence as they have, it's not invulnerable, and if they're not careful, they're going to find themselves on the other side of the relevance curve.
  • THEN: Twitter becomes more of a developer requirement, too. Anybody who’s not on Facebook is on Twitter. Or dead. So to reach the other half of the online community, you have to know how to connect out with Twitter.
    • NOW: Twitter's impact has become deeper, but more muted in some ways--people don't think of Twitter as a "new" channel, but one that they've come to expect and get used to. At the same time, how Twitter is supposed to factor into different applications isn't always clear, which hinders Twitter's acceptance and "must-have"-ness. Of course, Twitter couldn't care less, it seems, though it still confuses me how they actually make money.
  • THEN: XMPP becomes more of a developer requirement. XMPP hasn’t crossed a lot of people’s radar screen before, but Facebook decided to adopt it as their chat system communication protocol, and Google’s already been using it, and suddenly there’s a whole lotta traffic going over XMPP. More importantly, it offers a two-way communication experience that is in some scenarios vastly better than what HTTP offers, yet running in a very “Internet-friendly” way just as HTTP does. I suspect that XMPP is going to start cropping up in a number of places as a useful alternative and/or complement to using HTTP.
    • NOW: Well, unfortunately, XMPP still hides underneath other names and still doesn't come to mind when people are thinking about communication, leaving this one way unfulfilled. *sigh* Maybe someday we will learn that not everything has to go over HTTP, but it didn't happen in 2011.
  • THEN: “Gamification” starts making serious inroads into non-gaming systems. Maybe it’s just because I’ve been talking more about gaming, game design, and game implementation last year, but all of a sudden “gamification”—the process of putting game-like concepts into non-game applications—is cresting in a big way. FourSquare, Yelp, Gowalla, suddenly all these systems are offering achievement badges and scoring systems for people who want to play in their worlds. How long is it before a developer is pulled into a meeting and told that “we need to put achievement badges into the call-center support application”? Or the online e-commerce portal? It’ll start either this year or next.
    • NOW: Gamification is emerging, but slowly and under the radar. It's certainly not as strong as I thought it would be, but gamification concepts are sneaking their way into a variety of different scenarios (beyond games themselves). Probably can't claim a point here, no.
  • THEN: Functional languages will hit a make-or-break point. I know, I said it last year. But the buzz keeps growing, and when that happens, it usually means that it’s either going to reach a critical mass and explode, or it’s going to implode—and the longer the buzz grows, the faster it explodes or implodes, accordingly. My personal guess is that the “F/O hybrids”—F#, Scala, etc—will continue to grow until they explode, particularly since the suggested v.Next changes to both Java and C# have to be done as language changes, whereas futures for F# frequently are either built as libraries masquerading as syntax (such as asynchronous workflows, introduced in 2.0) or as back-end library hooks that anybody can plug in (such as type providers, introduced at PDC a few months ago), neither of which require any language revs—and no concerns about backwards compatibility with existing code. This makes the F/O hybrids vastly more flexible and stable. In fact, I suspect that within five years or so, we’ll start seeing a gradual shift away from pure O-O systems, into systems that use a lot more functional concepts—and that will propel the F/O languages into the center of the developer mindshare.
    • NOW: More than any of my other predictions (or subjects of interest), functional languages stump me the most. On the one hand, there doesn't seem to be a drop-off of interest in the subject, based on a variety of anecdotal evidence (books, articles, etc), but on the other hand, they don't seem to be crossing over into the "mainstream" programming worlds, either. At best, we can say that they are entering the mindset of senior programmers and/or project leads and/or architects, but certainly they don't seem to be turning in to the "go-to" language for projects being done in 2011.
  • THEN: The Microsoft Kinect will lose its shine. I hate to say it, but I just don’t see where the excitement is coming from. Remember when the Wii nunchucks were the most amazing thing anybody had ever seen? Frankly, after a slew of initial releases for the Wii that made use of them in interesting ways, the buzz has dropped off, and more importantly, the nunchucks turned out to be just another way to move an arrow around on the screen—in other words, we haven’t found particularly novel and interesting/game-changing ways to use the things. That’s what I think will happen with the Kinect. Sure, it’s really freakin’ cool that you can use your body as the controller—but how precise is it, how quickly can it react to my body movements, and most of all, what new user interface metaphors are people going to have to come up with in order to avoid the “me-too” dancing-game clones that are charging down the pipeline right now?
    • NOW: Kinect still makes for a great Christmas or birthday present, but nobody seems to be all that amazed by the idea anymore. Certainly we aren't seeing a huge surge in using Kinect as a general user interface device, at least not yet. Maybe it needed more time for people to develop those new metaphors, but at the same time, I would've expected at least a few more games to make use of it, and I haven't seen any this past year.
  • THEN: There will be no clear victor in the Silverlight-vs-HTML5 war. And make no mistake about it, a war is brewing. Microsoft, I think, finds itself in the unenviable position of having two very clearly useful technologies, each one’s “sphere of utility” (meaning, the range of answers to the “where would I use it?” question) very clearly overlapping. It’s sort of like being a football team with both Brett Favre and Tom Brady on your roster—both of them are superstars, but you know, deep down, that you have to cut one, because you can’t devote the same degree of time and energy to both. Microsoft is going to take most of 2011 and probably part of 2012 trying to support both, making a mess of it, offering up conflicting rationale and reasoning, in the end achieving nothing but confusing developers and harming their relationship with the Microsoft developer community in the process. Personally, I think Microsoft has no choice but to get behind HTML 5, but I like a lot of the features of Silverlight and think that it has a lot of mojo that HTML 5 lacks, and would actually be in favor of Microsoft keeping both—so long as they make it very clear to the developer community when and where each should be used. In other words, the executives in charge of each should be locked into a room and not allowed out until they’ve hammered out a business strategy that is then printed and handed out to every developer within a 3-continent radius of Redmond. (Chances of this happening: .01%)
    • NOW: Well, this was accurate all the way up until the last couple of months, when Microsoft made it fairly clear that Silverlight was being effectively "put behind" HTML 5, despite shipping another version of Silverlight. In the meantime, though, they've tried to support both (and some Silverlighters tell me that the Silverlight team is still looking forward to continuing supporting it, though I'm not sure at this point what is rumor and what is fact anymore), and yes, they confused the hell out of everybody. I'm surprised they pulled the trigger on it in 2011, though--I expected it to go a version or two more before they finally pulled the rug out.
  • THEN: Apple starts feeling the pressure to deliver a developer experience that isn’t mired in mid-90’s metaphor. Don’t look now, Apple, but a lot of software developers are coming to your platform from Java and .NET, and they’re bringing their expectations for what and how a developer IDE should look like, perform, and do, with them. Xcode is not a modern IDE, all the Apple fan-boy love for it notwithstanding, and this means that a few things will happen:
    • Eclipse gets an iOS plugin. Yes, I know, it wouldn’t work (for the most part) on a Windows-based Eclipse installation, but if Eclipse can have a native C/C++ developer experience, then there’s no reason why a Mac Eclipse install couldn’t have an Objective-C plugin, and that opens up the idea of using Eclipse to write iOS and/or native Mac apps (which will be critical when the Mac App Store debuts somewhere in 2011 or 2012).
    • Rumors will abound about Microsoft bringing Visual Studio to the Mac. Silverlight already runs on the Mac; why not bring the native development experience there? I’m not saying they’ll actually do it, and certainly not in 2011, but the rumors, they will be flyin….
    • Other third-party alternatives to Xcode will emerge and/or grow. MonoTouch is just one example. There’s opportunity here, just as the fledgling Java IDE market looked back in ‘96, and people will come to fill it.
    • NOW: Xcode 4 is "better", but it's still not what I would call comparable to the Microsoft Visual Studio or JetBrains IDEA experience. LLVM is definitely a better platform for the company's development efforts, long-term, and it's encouraging that they're investing so heavily into it, but I still wish the overall development experience was stronger. Meanwhile, though, no Eclipse plugin has emerged (that I'm aware of), which surprised me, and neither did we see Microsoft trying to step into that world, which doesn't surprise me, but disappoints me just a little. I realize that Microsoft's developer tools are generally designed to support the Windows operating system first, but Microsoft has to cut loose from that perspective if they're going to survive as a company. More on that later.
  • THEN: NoSQL buzz grows. The NoSQL movement, which sort of got started last year, will reach significant states of buzz this year. NoSQL databases have a lot to offer, particularly in areas that relational databases are weak, such as hierarchical kinds of storage requirements, for example. That buzz will reach a fever pitch this year, and the relational database moguls (Microsoft, Oracle, IBM) will start to fight back.
    • NOW: Well, the buzz certainly grew, and it surprised me that the big storage guys (Microsoft, IBM, Oracle) didn't do more to address it; I was expecting features to emerge in their database products to address some of the features present in MongoDB or CouchDB or some of the others, such as "schemaless" or map/reduce-style queries. Even just incorporating JavaScript into the engine somewhere would've generated a reaction.

Overall, it appears I'm running at about my usual 50/50 levels of prognostication. So be it. Let's see what the ol' crystal ball has in mind for 2012:

  • Lisps will be the languages to watch. With Clojure leading the way, Lisps (that is, languages that are more or less loosely based on Common Lisp or one of its variants) are slowly clawing their way back into the limelight. Lisps are both functional languages as well as dynamic languages, which gives them a significant reason for interest. Clojure runs on top of the JVM, which makes it highly interoperable with other JVM languages/systems, and Clojure/CLR is the version of Clojure for the CLR platform, though there seems to be less interest in it in the .NET world (which is a mistake, if you ask me).
  • Functional languages will.... I have no idea. As I said above, I'm kind of stymied on the whole functional-language thing and their future. I keep thinking they will either "take off" or "drop off", and they keep tacking to the middle, doing neither, just sort of hanging in there as a concept for programmers to take and run with. Mind you, I like functional languages, and I want to see them become mainstream, or at least more so, but I keep wondering if the mainstream programming public is ready to accept the ideas and concepts hiding therein. So this year, let's try something different: I predict that they will remain exactly where they are, neither "done" nor "accepted", but continue next year to sort of hang out in the middle.
  • F#'s type providers will show up in C# v.Next. This one is actually a "gimme", if you look across the history of F# and C#: for almost every version of F# v."N", features from that version show up in C# v."N+1". More importantly, F# 3.0's type provider feature is an amazing idea, and one that I think will open up language research in some very interesting ways. (Not sure what F#'s type providers are or what they'll do for you? Check out Don Syme's talk on it at BUILD last year.)
  • Windows8 will generate a lot of chatter. As 2012 progresses, Microsoft will try to force a lot of buzz around it by keeping things under wraps until various points in the year that feel strategic (TechEd, BUILD, etc). In doing so, though, they will annoy a number of people by not talking about them more openly or transparently. What's more....
  • Windows8 ("Metro")-style apps won't impress at first. The more I think about it, the more I'm becoming convinced that Metro-style apps on a desktop machine are going to collectively underwhelm. The UI simply isn't designed for keyboard-and-mouse kinds of interaction, and that's going to be the hardware setup that most people first experience Windows8 on--contrary to what (I think) Microsoft thinks, people do not just have tablets laying around waiting for Windows 8 to be installed on it, nor are they going to buy a Windows8 tablet just to try it out, at least not until it's gathered some mojo behind it. Microsoft is going to have to finesse the messaging here very, very finely, and that's not something they've shown themselves to be particularly good at over the last half-decade.
  • Scala will get bigger, thanks to Heroku. With the adoption of Scala and Play for their Java apps, Heroku is going to make Scala look attractive as a development platform, and the adoption of Play by Typesafe (the same people who brought you Akka) means that these four--Heroku, Scala, Play and Akka--will combine into a very compelling and interesting platform. I'm looking forward to seeing what comes of that.
  • Cloud will continue to whip up a lot of air. For all the hype and money spent on it, it doesn't really seem like cloud is gathering commensurate amounts of traction, across all the various cloud providers with the possible exception of Amazon's cloud system. But, as the different cloud platforms start to diversify their platform technology (Microsoft seems to be leading the way here, ironically, with the introduction of Java, Hadoop and some limited NoSQL bits into their Azure offerings), and as we start to get more experience with the pricing and costs of cloud, 2012 might be the year that we start to see mainstream cloud adoption, beyond "just" the usage patterns we've seen so far (as a backing server for mobile apps and as an easy way to spin up startups).
  • Android tablets will start to gain momentum. Amazon's Kindle Fire has hit the market strong, definitely better than any other Android-based tablet before it. The Nook (the Kindle's principal competitor, at least in the e-reader world) is also an Android tablet, which means that right now, consumers can get into the Android tablet world for far, far less than what an iPad costs. Apple rumors suggest that they may have a 7" form factor tablet that will price competitively (in the $200/$300 range), but that's just rumor right now, and Apple has never shown an interest in that form factor, which means the 7" world will remain exclusively Android's (at least for now), and that's a nice form factor for a lot of things. This translates well into more sales of Android tablets in general, I think.
  • Apple will release an iPad 3, and it will be "more of the same". Trying to predict Apple is generally a lost cause, particularly when it comes to their vaunted iOS lines, but somewhere around the middle of the year would be ripe for a new iPad, at the very least. (With the iPhone 4S out a few months ago, it's hard to imagine they'd cannibalize those sales by releasing a new iPhone, until the end of the year at the earliest.) Frankly, though, I don't expect the iPad 3 to be all that big of a boost, just a faster processor, more storage, and probably about the same size. Probably the only thing I'd want added to the iPad would be a USB port, but that conflicts with the Apple desire to present the iPad as a "device", rather than as a "computer". (USB ports smack of "computers", not self-contained "devices".)
  • Apple will get hauled in front of the US government for... something. Apple's recent foray into the legal world, effectively informing Samsung that they can't make square phones and offering advice as to what will avoid future litigation, smacks of such hubris and arrogance, it makes Microsoft look like a Pollyanna Pushover by comparison. It is pretty much a given, it seems to me, that a confrontation in the legal halls is not far removed, either with the US or with the EU, over anti-competitive behavior. (And if this kind of behavior continues, and there is no legal action, it'll be pretty apparent that Apple has a pretty good set of US Congressmen and Senators in their pocket, something they probably learned from watching Microsoft and IBM slug it out rather than just buy them off.)
  • IBM will be entirely irrelevant again. Look, IBM's main contribution to the Java world is/was Eclipse, and to a much lesser degree, Harmony. With Eclipse more or less "done" (aside from all the work on plugins being done by third parties), and with IBM abandoning Harmony in favor of OpenJDK, IBM more or less removes themselves from the game, as far as developers are concerned. Which shouldn't really be surprising--they've been more or less irrelevant pretty much ever since the mid-2000s or so.
  • Oracle will "screw it up" at least once. Right now, the Java community is poised, like a starving vulture, waiting for Oracle to do something else that befits their Evil Emperor status. The community has already been quick (far too quick, if you ask me) to highlight Oracle's supposed missteps, such as the JVM-crashing bug (which was already fixed in the _u1 release of Java7, a fix that garnered no attention from the various Java news sites) and the debacle around Hudson/Jenkins/whatever-the-heck-we-need-to-call-it-this-week. I'll grant you, the Hudson/Jenkins debacle was deserving of ire, but Oracle is hardly the Evil Emperor the community makes them out to be--at least, so far. (I'll admit it, though, I'm a touch biased, both because Brian Goetz is a friend of mine and because the Oracle Technology Network has asked me to write a column for them next year. Still, in the spirit of "innocent until proven guilty"....)
  • VMWare/SpringSource will start pushing their cloud solution in a major way. Companies like Microsoft and Google are pushing cloud solutions because Software-as-a-Service is a recurring revenue model, generating revenue even in years when the product hasn't shipped a new version. VMWare, being a product company, faces the same pressure--the only time they make money is when they sell a new copy of their product, unless they can start pushing their virtualization story onto hardware on behalf of clients--a.k.a. "the cloud". With SpringSource as the software stack, VMWare has a more-or-less complete cloud play, so it's surprising that they didn't push it harder in 2011; I suspect they'll start cramming it down everybody's throats in 2012. Expect to see Rod Johnson talking a lot about the cloud as a result.
  • JavaScript hype will continue to grow, and by year's end will be at near-backlash levels. JavaScript (more properly known as ECMAScript, not that anyone seems to care but me) is gaining all kinds of steam as a mainstream development language (as opposed to a just-in-the-browser language), particularly with the release of NodeJS. That hype will continue to escalate, and by the end of the year we may start to see a backlash against it. (Speaking personally, NodeJS is an interesting solution, but suggesting that it will replace your Tomcat or IIS server is a bit far-fetched; event-driven I/O is something both of those servers have been doing for years, and the rest of it is "just" a language discussion. We could pretty easily use JavaScript as the development language inside both servers, as Sun demonstrated years ago with their "Phobos" project--not that anybody really cared back then. For the curious, there's a minimal sketch of the NodeJS event-driven style just after this list.)
  • NoSQL buzz will continue to grow, and by year's end will start to generate a backlash. More and more companies are jumping into NoSQL-based solutions, and this trend will continue to accelerate, until some extremely public failure starts to generate a backlash against it. (This seems to be a pattern that shows up with a lot of technologies, so it seems entirely realistic that it'll happen here, too.) Mind you, I don't mean to suggest that the backlash will be factual or correct--usually these sorts of things come from misusing the tool, not from any intrinsic failure in it--but it'll generate some bad press.
  • Ted will thoroughly rock the house during his CodeMash keynote. Yeah, OK, that's more of a fervent wish than a prediction, but hey, keep a positive attitude and all that, right?
  • Ted will continue to enjoy his time working for Neudesic. So far, it's been great working for these guys, and I'm looking forward to a great 2012 with them. (Hopefully this will be a prediction I get to tack on for many years to come, too.)
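
For those who've heard the NodeJS hype but haven't actually seen it, here is a minimal sketch of the canonical NodeJS HTTP server; the port number and the response text are arbitrary choices of mine, not anything NodeJS mandates:

    // One process, one event loop: this callback runs once per incoming
    // request, and no thread sits parked on blocking I/O in between.
    var http = require('http');

    http.createServer(function (request, response) {
      response.writeHead(200, { 'Content-Type': 'text/plain' });
      response.end('Hello from the event loop\n');
    }).listen(8124);

    console.log('Server running at http://localhost:8124/');

Save it as server.js, run "node server.js", and point a browser at port 8124. Much of the appeal is in what's absent--no container, no deployment descriptor, no thread pool to tune--and whether that's a revolution or "just" a language discussion is exactly the debate I expect to heat up this year.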

I hope that all of you have enjoyed reading these, and I wish you and yours a very merry, happy, profitable and fulfilling 2012. Thanks for reading.


.NET | Android | C# | C++ | Conferences | Development Processes | F# | Flash | Industry | iPhone | Java/J2EE | Languages | LLVM | Mac OS | Objective-C | Parrot | Personal | Ruby | Scala | VMWare | Windows

Sunday, January 01, 2012 10:17:28 PM (Pacific Standard Time, UTC-08:00)
Comments [2]
 Tuesday, December 27, 2011
Changes, changes, changes

Many of you have undoubtedly noticed that my blogging has dropped off precipitously over the last half-year. The reason for that is multifold, ranging from the usual “I just don’t seem to have the time for it” rationale, up through the realization that I have a couple of regular (paid) columns (one with CoDe Magazine, one with MSDN) that consume a lot of my ideas that would otherwise go into the blog.

But most of all, I'm finding it harder these days to blog because, as of July of this year, I have joined forces with Neudesic, LLC, as a full-time employee, working as an Architectural Consultant for them.

Neudesic is a Microsoft partner (as a matter of fact, as I understand it we were Microsoft’s Partner of the Year not too long ago), with several different technology practices, including a Mobile practice, a User Experience practice, a Connected Systems practice, and a Custom Application Development practice, among others. The company is (as of this writing) about 400 consultants strong, with a number of Microsoft MVPs and Regional Directors on staff, including a personal friend of mine, Simon Guest, who heads up the Mobile Practice, and another friend, Rick Garibay, who is the Practice Director for Connected Systems. And that doesn’t include the other friends I have within the company, as well as the people within the company who are quickly becoming new friends. I’m even more tickled that I was instrumental in bringing Steven “Doc” List in, to bring his agile experience and perspective to our projects nationwide. (Plus I just like working with Doc.)

It’s been a great partnership so far: they ask me to continue doing the speaking and writing that I love to do, bringing fame and glory (I hope!) to the Neudesic name, and in turn I get to jump in on a variety of different projects as an architect and mentor. The people I’m working with are great, top-notch technology experts and just some of the nicest people I’ve met. Plus, yes, it’s nice to draw a regular paycheck and benefits after being an independent for a decade or so.

The fact that they’re principally a .NET shop may lead some to conclude that this is my farewell letter to the Java community, but in fact the opposite is the case. I’m actively engaged with our Mobile practice around Android (and iOS) development, and I’m subtly and covertly (sssh! Don’t tell the partners!) trying to subvert the company into expanding our technology practices into the Java (and Ruby/Rails) space.

With the coming new year, I think one of my upcoming responsibilities will be to blog more, so don’t be too surprised if you start to see more activity on a more regular basis here. But in the meantime, I’m working on my end-of-year predictions and retrospective, so keep an eye out for that in the next few days.

(Oh, and that link that appears across the bottom of my blog posts? Someday I’m going to remember how to change the text for that in the blog engine and modify it to read something more Neudesic-centric. But for now, it’ll work.)


.NET | Android | Azure | C# | C++ | Conferences | Development Processes | F# | Flash | Industry | iPhone | Java/J2EE | Languages | Mac OS | Personal | Ruby | Scala | Security | Social | Visual Basic | WCF | XML Services

Tuesday, December 27, 2011 1:53:14 PM (Pacific Standard Time, UTC-08:00)
Comments [0]
 Wednesday, October 05, 2011
God speed, Mr. Jobs

I received the news that Steve Jobs passed away today while packing my kit to fly down to LA tomorrow morning to attend the funeral of my step-grandmother (my father’s stepmother), Ruth Neward.

The reason I mention this is that Grandma Ruth is and will always be linked to the man she married, my father’s father and the man for whom I was named, Theodore Chester Neward, who died a few years ago after a short battle with cancer. Pancreatic cancer, if I’m not mistaken, the same disease that brought Steve Jobs down. Grandma Ruth lived for Grandpa Ted—she was his support structure, his moral backing, and his faithful companion all throughout the years that I knew them.

My grandfather, like Mr. Jobs, was an inventor. He invented several devices that, while bringing nowhere near the kind of income or world-changing impact that Mr. Jobs’ devices brought, still changed the world just a little. His principal invention was a handheld, hand-operated vacuum pump that he called the “Mityvac”, to which the Neward Enterprises, Inc. marketing department added the tagline, “It’s a useful little sucker!” because of its versatility. It had uses across a broad spectrum of industries, from automobile repair and maintenance (as a one-man brake bleeding kit) to medical emergency use (as an anti-choking device, one which then-Governor Reagan carried with him during state dinners, in case Nancy started to choke, which she apparently was prone to do), to pediatric use (as a replacement for forceps to deliver a child—pop a small cap on the baby’s head, draw a small vacuum, and the doctor now has a “handle” to help pull the baby out of the birth canal). Though the Mityvac (and the anti-choking “Throat-E-Vac”) will never reach the levels of world-shattering dominance that the iOS and MacOS devices will, there is a good chance that many of the readers of this blog (if they are under the age of 25) were in fact touched by this device in the very first few minutes of their lives, and don’t have the “conehead” shape to their heads that forceps inflict on newborns to prove it.

My grandfather, like Mr. Jobs, never stopped inventing things. To his grave, he was still “tinkering” in the shop, working on a more efficient carburetor for gasoline engines. And his was the only indoor pool in Banks, Oregon: not only a full-length Olympic-size pool, but one heated by a wood-fired, steam-powered system of his own design. In a frighteningly good demonstration of the dangers of custom-built systems, the only documentation to go along with it is the strange markings on the walls and pipes, which probably meant something to him but are pure gibberish to the rest of us. (Note to self: get a photo of that before they replace it with something boring and standard.)

Unlike Mr. Jobs, my grandfather never really understood what it was I did. When the volume on his TV was too loud when he turned it on, he was told that “that’s just how TVs work—they remember the volume from before you turned it off”, and he turned to my father and said, “You should get Teddy to work on that”.

I was always “Teddy” to him, and to Grandma Ruth, and to this day they are the only people in the world I allowed to call me that. Now they are both gone, and I will miss them terribly.

My grandfather built an amazing legacy in the plastics industry. In many ways, I hope I leave even a tenth as amazing a legacy within my own. You, readers, will have to be the judge of that.

To the family of Steve Jobs, and all of his friends and associates at Apple, I grieve with you.


Personal

Wednesday, October 05, 2011 11:58:41 PM (Pacific Daylight Time, UTC-07:00)
Comments [0]