 Saturday, April 13, 2013
Say that part about HTML standards, again?

In incarnations past, I have had debates, public and otherwise, with friends and colleagues who have asserted that HTML5 (by which we really mean HTML5/JavaScript/CSS3) will essentially become the platform of choice for all applications going forward—that this time, standards will win out, and companies that try to subvert the open nature of the web with their own implementations, extensions, and proprietary features that aren’t part of the standards will lose.

Then, I read the Wired news post about Google’s departure from WebKit, and I’m a little surprised that the Internet (and by “the Internet”, I mean “the very people who get up in arms about standards and subverting them and blah blah blah”) hasn’t taken more issue with some of the things cited therein:

Google’s decision is in tune with its overall efforts to improve the infrastructure of the internet. When it comes to browser software and other web technologies that directly affect how quickly and effectively your machine grabs and displays webpages, the company likes to use open source technologies. That way, it can feed their adoption outside the company — and ultimately improve the delivery of its many online services (including all important advertisements). But if it believes the rest of the web is moving too slowly, it has no problem starting up its own project.

Just to be clear, Google is happy to use open-source technologies, so it can feed adoption of those technologies, but if it’s something that Google thinks is being adopted too slowly—like, say, Google’s extensions to the various standards that aren’t being picked up by its competitors—then Google feels the need to kick off its own thing. Interesting.

… [T]he trouble with WebKit is that it used a different “multi-process architecture” than its Chrome browser, which basically means it didn’t handle concurrent tasks in the same way. When Chrome was first released in 2008, WebKit didn’t have a multi-process architecture, so Google had to build its own. WebKit2, released in 2010, adds multi-process features, but is quite different from what Google had already built. Apple and Google don’t see eye to eye on the project, and it became too difficult and too time-consuming for the company to juggle the two architectures. “Supporting multiple architectures over the years has led to increasing complexity for both [projects],” the post says. “This has slowed down the collective pace of innovation.”

So… Google tried to use some open-source software, but discovered that the project didn’t work the way they built the rest of their application to work. (I’m certain that’s the first time that has happened, ever.) When the custodians of the project did add the feature Google wanted, the feature was implemented in a manner that still wasn’t in lockstep with the way Google wanted things to work in their application. This meant that “innovation” was being “slowed down”.

(As an aside, I find it fascinating that whenever a company adopts open-source, it’s to “foster interoperability and open standards”, but when it abandons open-source, it’s to “foster innovation and faster evolution”. And I’m sure it’s entirely accidental that, most of the time, a company adopts “open standards” when it’s way behind on the technology curve for a given thing, and adopts “faster innovation” when it thinks it has closed the distance or surged ahead of its competitors in that space.)

Of course, a new implementation brings its own risks of bugs and incompatibilities, but Google has a plan for that:

“Throughout this transition, we’ll collaborate closely with other browser vendors to move the web forward and preserve the compatibility that made it a successful ecosystem,” the announcement reads.

Ah, there. See? By collaborating closely with their competitors, they will preserve compatibility. Because when Microsoft did that, everybody was totally OK with that…. uh, and… yeah… it worked pretty well, too, and….

Look, it seems pretty reasonable to assume that even if the tags and the DOM and the APIs are all 100% unchanged from Chrome v.Past to v.Next, there are still going to be places where Blink optimizes differently than WebKit does, which means developers will now need to learn (and implement) optimizations in their Web-based applications differently for each engine. And frankly, the assumption that Chrome’s Blink and WebKit will somehow be bug-for-bug compatible/identical with each other is a pretty steep bar to accept blindly, considering the history.

Once again, we see the cycle coming around: in the beginning, while a technology is still fleshing itself out, companies yearn for standards in order to create adoption. After a certain tipping point of adoption, however, the major players start to seek ways to avoid becoming a commodity, and start introducing “extensions” and “innovations” that, for some odd reason, their competitors in the standards meetings don’t seem all that inclined to adopt. That’s when they start forking and shying away from staying true to the standard, and eventually, the standard becomes either a least-common-denominator… or a joke.

Anybody want to bet on which outcome emerges for HTML5?

(Before you reach for the “Comment” link to flame me all to Hell, yes, even an HTML 5 standard that is 80% consistent across all the browsers is still pretty damn useful—just as a SQL standard that is 80% consistent across all the databases is useful. But this is a far cry from the utopia of interconnectedness and interoperability that was promised to us by the HTMLophiles, and it simply demonstrates that the Circle of TechnoLife continues, unabated, as it has ever since PC manufacturers—and the rest of us watching them—discovered what happens to them when they become a commodity.)


.NET | Android | Azure | C# | C++ | F# | Industry | iPhone | Java/J2EE | Mac OS | Objective-C | Reading | Ruby | Scala | Windows | XML Services

 Friday, April 05, 2013
"Craftsmanship", by another name

This blog, talking about the "1/10" developer as a sort of factored replacement for the "10x" developer, caught my eye over Twitter. Frankly, I'm not sure what to say about it, but there's a part of me that says I need to say something.

I don't like the terminology "1/10 developer". As the commenters on the author's blog suggest, it implies a denigration of the individual in question. I don't think that was the author's intent, but intentions don't matter--results do. You're still suggesting that this guy is effectively worthless, even if your intent is to say that his programming skills aren't great.

Some programmers shouldn't be. It's hard to say it, but yes, there are going to be some programmers at either end of the bell curve. (Assuming that skill in programming is a bell curve, and some have suggested that it's not, which is its own fascinating discussion, but for another day.) That means that some of the people writing code with you or for you are not going to be from the end of the curve you'd hope for. That doesn't necessarily mean they should all immediately retire and take up farming.

Be careful how you measure. The author assumed that because this programmer wasn't able to churn out code at the same rate that the author himself could, the programmer in question was therefore one of these "1/10" programmers. Hubris is a dangerous thing in a CTO, even a temporary one--assuming that you could write it in "like, 2 hours, tops" is a dangerous, dangerous path. Every programmer I've ever known has looked at a feature or a story, thought, "Oh, that should only take me, like, 2 hours, tops" and then discovered later, to his/her chagrin, that there's a lot more involved in that than first considered. It's very possible the author/CTO is a wunderkind programmer who could do everything he talked about in, like, 1 or 2 hours, tops. It's also very possible that this author/CTO misunderstood the problem (which he never once seems to consider).

The teacher isn't finished teaching until the student learns. From the sound of the blog post, the author/CTO wasn't really putting that much effort into teaching the programmer, just "leading him step by step" to the solution. Give a man a fish... teach a man to fish.... Not all wunderkind programmer/author/CTOs are great teachers.

Some students just don't learn very well. The sword of teaching swings both ways, though: sometimes, some teachers just can't reach some students. It sucks, but it's life.

This programmer was a PhD candidate? The programmer in question, by the way, was (according to the blog) studying for a PhD at the time. And couldn't grasp MVC? Something is off here. I believe it, on the surface of it, because I worked with a guy who had graduated university with a PhD, and couldn't understand C++ and MFC to save his life, and got fired (and I inherited his project, which was a mess, to be blunt), but he'd spent all his time in university studying artificial intelligence, and had written it all using straight C code because that's what the libraries and platform he was using for his research demanded. I don't think he was a "1/10" developer, I think he was woefully misplaced. Would you take an offensive lineman and put him at slot receiver? Would you take a catcher and put him at pitcher? Would you take a Marketing guy and put him on server support? We need to stop thinking that all programmers are skilled alike--this is probably creating more problems than we really realize. Sure, on the whole, it sounds great that "craftsmen" should be able to pick up any tool and be just as effective with that tool as they are with any other--just like a drywaller can pick up a wrench and be just as effective a plumber, and pick up a circuit breaker and be just as effective an electrician. Right?

In the final reckoning, I don't think the "1/10" vs "10x" designation really does a whole lot--I have a hard time caring where the decimal point goes in this particular home-spun tale of metrics. And I'll even give the author the benefit of the doubt and assume the programmer he had was, in fact, from the lower end of the bell curve, and just wasn't capable of putting together the necessary abstractions in his head to get from point "A" to point "B", figuratively and literally.

But to draw this conclusion from a data point of one person? Seems a little sketchy, to me.

Software development, once again, thy name is hubris.


Development Processes | Industry | Languages | Reading | Review | Social

 Thursday, March 21, 2013
On Sexism, Harassment, and Termination

Oh, boy. Diving into this whole Adria Richards/people-getting-fired thing is probably a mistake, but it’s reached levels at which I’m just too annoyed by everyone and everything in this to not say something. You have one of three choices: read the summary below and conclude I’m a misogynist without reading the rest; read the summary below and conclude I’m spot-on without reading the rest; or read the rest and draw your own conclusions after hearing the arguments.

TL;DR Adria Richards was right to be fired; the developer/s from PlayHaven shouldn’t have been fired; the developer/s from PlayHaven could very well be a pair of immature assholes; the rape and death threats against Adria Richards undermine the positions of those who support the developer/s formerly from PlayHaven; the content of the jokes doesn’t constitute sexism, nor should conferences overreact this way; half the Internet will label me a misogynist for these views; and none of this ends well.

The Facts, as I understand them

Three people are sitting in a keynote at a software conference. A presenter makes a comment on stage that leads two people sitting in the audience to start making jokes with all the emotional maturity of Beavis and Butthead. (Said developers are claiming that any and all sexual innuendo was inferred by the third, but frankly, let’s assume worst case here and assume they were, in fact, making cheap tawdry sex jokes out of “dongle” and “forking”.) A third person, after listening to it for a while, turns around, smiles, snaps a photo of the two of them, and Tweets it out, labeling them assholes. Conference staff approach the third person, ask her to identify the two perpetrators, and escort the developers out of the conference based on nothing but her word and (so far as I can tell) zero supporting evidence. Firestorm erupts over the Internet, and now all three (?) are jobless.

(UPDATE: Roberto Guerra mentioned, in private email, that PyCon has published their version of the events, which does not mention the developers being asked to leave; Roberto also tells me that the above link, which states they were, apparently got it wrong, and that the original source they used was mistaken. Apologies to PyCon if this is the case.)

My Interpretations

Note that with typical software developer hubris, I feel eminently qualified to comment on all of this. (Which is my way of saying, take all of this with a grain of salt—I have some experience with this, having been on the “accused” end of a sexual harassment complaint, and what I’m saying stems from my enforced “sit through the class” time from a decade or more ago, but I’m no lawyer, and like everybody else, I’m at the mercy of the reports since I wasn’t there.)

Developers who make “dongle” jokes and “forking” jokes are being stupid—and worse, being unoriginal, because those jokes have already been made. So they’re stupid twice over. C’mon, guys. New material. Seriously.

Making jokes in public that others might find offensive is taking a risk. Do it on stage, you run the risk of earning the wrath of the crowd. (Of course, nobody on this blog would, say, drop “the f-bomb” something like 23 times on stage in a keynote, right?) Do it in a crowd, you run the risk of pissing somebody off around you and looking/acting like a douche. Might be in your best interests to keep your voice down or just chuckle to yourself and have that conversation later.

Photos taken in public are considered public, if rude. If I walk out into the street and start filming you, I have a perfect right to do so, according to US law: what happens in public is considered public domain. Paparazzi depend on this for their “right” to follow and photograph movie stars, athletes, and other “public” figures. Adria was entirely within her rights to photograph those two and Tweet it. But if I snap a pic of a cute girl and Tweet it with “Wow, want to guess whether her code is hot too?”, it’s a douche move because I’m using her likeness without her permission. If I do that for profit, now I’m actually open to lawsuit. So photos taken in public are still something of a grey area, legally. Basic rule of thumb: if you want to be safe, ask before you put a photo of somebody else, taken in public or not, someplace other than on your own private device.

Third parties who overhear conversations could arguably be violating privacy. There’s a fine line here, but eavesdropping is rude. Now, I don’t know how loud they were making the jokes—shouting it out across the room is a very different scenario than whispering it to your seatmate and co-worker—but frankly, it’s usually pretty easy to tell when a joke is meant for general distribution in a room like that, and when it’s not. If it’s not meant for you, how about you just not hear it and concentrate on something else? Chalk up the commentary as “idiots being idiots”, and if there’s no implied threat to anybody going on, leave it be.

If you’re offended, you have an obligation to tell the parties in question and give them a choice to make good. Imagine this scenario: a guy sits down next to a girl on a bus. His leg brushes up against hers. She immediately stands up and shouts out “THIS MAN IS MAKING UNWANTED SEXUAL ADVANCES AT ME!” at the top of her lungs. Who’s the societally maladjusted person here? If, instead, she says, “Oh, please don’t make physical contact with me”, and he says, “But that’s my right as a human male”, and refuses to move his leg from pressing up against hers, then who’s the societally maladjusted one? Slice this one as finely as you like, but if you’re offended at something I do, it’s your responsibility to tell me so that I can make it right, by apologizing and/or ceasing the behavior in question, or telling you that I have Tourette’s, or by telling you you’re an uptight party-pooper, or however else this story can play out. If the party in question continues the behavior, then you’ve got grounds—moral and legal—to go to the authorities.

Just because you call it harassment doesn’t make it such. Legally, from what I remember, harassment is defined as “repeated acts of unwanted sexual attention”; in this case, I don’t see a history of repetition, nor do I see there being actual “attention” directed at Adria—this was a conversation being held between two individuals that didn’t include her.

Just because it involves sex doesn’t make it sexist. Two guys were making jokes about male genitalia. It may have been inappropriate, but honestly, unless somebody widened the definition of sexism (“making disparaging comments about someone based on their gender or sexual preferences”) when I wasn’t looking, this ain’t it. And for Adria to claim sexism in public is bad when she Tweeted just a few days prior about stuffing a sock down your shorts during a TSA patdown seems a little…. *shrug* You pick the word.

The conference needs to follow basic due process. You know—innocent until proven guilty, measured and proportional response, warnings, and so on. I don’t care what it says on the conference’s website by way of disclaimer—you have to figure out if what was said to happen actually happened before you respond to it. Nowhere in the facts above do I hear the conference taking any steps to protect the accused—a woman said a couple of guys said sexual things, so we must act quickly! This has “bad” written all over it for the next five conferences.

(UPDATE: Again, PyCon apparently didn’t escort the developer/s out of the conference, but instead according to their site, “Both parties were met with, in private. The comments that were made were in poor taste, and individuals involved agreed, apologized and no further actions were taken by the staff of PyCon 2013. No individuals were removed from the conference, no sanctions were levied.” It sounds like, contrary to what I first heard, PyCon handled it in a classy manner, so I apologize for perpetuating the image that they didn’t. Having said that, though, I find it curious that this storm blew up this way—did no one think to push those apologies to Twitter so everyone else knew that things had blown over, or did they in fact do that and we’re all too busy gawking and screaming “fight! fight! fight!” on the playground to notice?)

The material shouldn’t matter. I know we’re all being all sexually politically correct these days about women in IT, but this is a Pandora’s Box of a precedent that will eventually get way out of hand, if it isn’t already (and I think it is). Imagine how this story goes for the conference if a man Tweets out a picture of a woman and says, “This woman was talking to another woman and insulted my religion, and the conversation made me uncomfortable.” Is the conference now on the hook to escort those two women out of the building? How about programming language choice? How about race? How about sports teams? Where do we draw this line?

Adria was right to be fired. It’s harsh, but as any celebrity endorsement negotiator will tell you, when you represent a brand, you represent the brand even when the cameras aren’t rolling. (Just ask Tiger Woods about this.) Her actions brought a ton of unwanted negative attention (and a DDOS attack, apparently) to the company; that’s in direct contrast to the reasons they were paying her, and seeing as how her actions were something she did (as opposed to had done to her), her termination is entirely justified. You might see it as a bit harsh, but the company is well within boundaries here.

The PlayHaven developers weren’t right to be fired. Again, nowhere do we see them getting the opportunity to confront their accuser, or make restitution (apology). Now, you can argue that they, too, were representing their firm, but unless their job is to act as an evangelist and brand recognition activities are part of their job description, you can’t terminate them for gross negligence in this. Of course, most employment is “at-will”, meaning a company can fire you for any reason it likes, but this is sort of akin to getting fired for getting drunk and making lewd comments to the wait staff at Denny’s while wearing a company T-shirt.

Sexism in IT is bad. Duh. I don’t think I’ve met anyone who said otherwise. But this wasn’t sexism. Inappropriate, perhaps, but not sexism. By the way, racism in IT is bad too, and so are age-ism, role-ism (discounting somebody’s opinions just because they’re in Marketing or Sales), and technacism (discounting a technology based on no factual knowledge).

It’s politically correct to jump to attention when “women in IT” come up. This subject is gathering a lot of momentum, and most of it I think is of the bad variety. Hate speech should not be tolerated—the rape and death threats against Adria cannot, should not, and are not acceptable in any way, shape or form. The same goes for similar kinds of direct comments against gays, lesbians, transsexuals, blacks, Asians, Jews, or any of the other “other” groups out there. But this whole affair is a far cry from the discrimination and hate speech that some people actually go through: I have a friend who is lesbian and a school teacher, and she is receiving death threats for teaching at that school. She has dogs at the house, shotgun loaded, and she is waiting for the Mormons and news reporters to vacate her lawn so she can try to resume some kind of normal life. Putting up with a few lewd jokes in a crowd at a conference, I would guess, sounds pretty heavenly to her right now.

I think we have time for a patronizing plea, by the way: Ladies, I know you’ve had something of a rough time in the IT industry, but it’s pretty obvious that it’s getting better, and frankly, you run a big risk of ostracizing yourself and making it harder if every time a woman doesn’t get selected for something (a conference speaking slot, a tech lead role, or a particular job) the whole “women in IT” banner gets unfurled and raised. Don’t get me wrong—I don’t think there are many of you who are doing that. There are some, though, who do claim special privilege just for being female, and there’s enough of a correlation between these two things that I think before too long it’s going to lose its impact and the real good that could be done will be lost. Don’t demand that you get special privilege—earn it. Believe me, there are plenty of opportunities for you to do so, so if you get blocked on something, look for a way around it. Demand equality, not artificially-imposed advantage.

(As trends go, quite honestly, given the declining rates of men graduating college and actually making a life for themselves, before too long the shoe will be on the other foot anyway, just give it time.)

There is no happy ending here. Nobody can fix this; three lives have been forever affected, negatively, by all of this. The ones I feel truly sorry for? SendGrid and PlayHaven—they had nothing to do with it, and now their names are going to be associated with this whole crappy mess.

Call me a misogynist for not whole-heartedly backing the woman in this case, if you will, but frankly, it was a disaster from the moment she chose to snap the photo and Tweet to the world instead of saying, “Excuse me, can you not make those jokes here? I don’t think they’re particularly appropriate.” I could theorize why she chose the one route over the other, but that’s an essay for another day.

Let the flaming begin.

UPDATE: This post puts more context around Adria, and I think is the best-written commentary I've seen on this so far, particularly since it's a woman's point of view on the whole thing (assuming, of course, that "Amanda" is in this case applied to a human of the female persuasion).


Conferences | Industry | Personal | Python | Reading | Social

 Tuesday, March 19, 2013
Programming language "laws"

As is pretty typical for that site, Lambda the Ultimate has a great discussion on some insights that the creators of Mozart and Oz have come to, regarding the design of programming languages; I repeat the post here for convenience:

Now that we are close to releasing Mozart 2 (a complete redesign of the Mozart system), I have been thinking about how best to summarize the lessons we learned about programming paradigms in CTM. Here are five "laws" that summarize these lessons:
  1. A well-designed program uses the right concepts, and the paradigm follows from the concepts that are used. [Paradigms are epiphenomena]
  2. A paradigm with more concepts than another is not better or worse, just different. [Paradigm paradox]
  3. Each problem has a best paradigm in which to program it; a paradigm with less concepts makes the program more complicated and a paradigm with more concepts makes reasoning more complicated. [Best paradigm principle]
  4. If a program is complicated for reasons unrelated to the problem being solved, then a new concept should be added to the paradigm. [Creative extension principle]
  5. A program's interface should depend only on its externally visible functionality, not on the paradigm used to implement it. [Model independence principle]
Here a "paradigm" is defined as a formal system that defines how computations are done and that leads to a set of techniques for programming and reasoning about programs. Some commonly used paradigms are called functional programming, object-oriented programming, and logic programming. The term "best paradigm" can have different meanings depending on the ultimate goal of the programming project; it usually refers to a paradigm that maximizes some combination of good properties such as clarity, provability, maintainability, efficiency, and extensibility. I am curious to see what the LtU community thinks of these laws and their formulation.

This just so neatly calls out to me, based on my own very brief and very informal investigation into multi-paradigm programming (drawing on James Coplien's multi-paradigm C++ work from a decade-plus ago). I think they really have something interesting here.


.NET | Android | C# | C++ | Conferences | Development Processes | F# | Industry | Java/J2EE | Languages | LLVM | Objective-C | Parrot | Personal | Python | Ruby | Scala | Visual Basic | WCF | Windows

 Monday, March 18, 2013
Ted Neward on Java 8 adoption

Every once in a while, there is a moment in your life when inspiration just BAM! strikes out of nowhere, telling you what your next blog post is.

Then, there’s this one.

This blog post wasn’t inspired by any sort of bolt from the blue, or even a conversation with a buddy that led me to think, “Yeah, this is something that I should share with the world”. No, this one comes directly to you, from you. You see, I was cruising through my blog logs, and in particular looking at the Google Search queries that led to the blog site, and yesterday apparently two different Google Searches, both titled “Ted Neward on Java 8 adoption”, came in twice each.

I take that as a sign that y’all are kinda curious what my thoughts on Java 8 adoption are. Consider the message received: from your fingers to my eyes, as the old saying (slightly rephrased) goes.

Java 8: Overview

For those of you who’ve been too busy to track what’s going on with the Java language recently, the upcoming release of the JDK, the JavaSE 8 release, marks a fairly significant moment in Java’s history, one that ranks right up there with Java 5, in that the language is going to get a significant “bump” in functionality. Historically, Sun tried very hard to avoid such changes: Java 1.1 introduced inner classes, Java 1.4 introduced “assert”, and beyond that the language was the same language we’d been using since 1996 or so. The JVM grew by leaps and bounds, and the Java libraries grew exponentially, it seemed, but the language itself remained pretty static until Java 5. With Java 5 we got generics, enumerations, annotations, enhanced for loops, variable argument declarations, and a few other things besides; with Java 7 (the last release) we got a couple of trivial changes that really didn’t ruffle anybody’s hair, much less blow anybody’s socks off.

Java 8 represents another Java 5-like “sea change” kind of release. Not because there’s a ton of new features, like Java 5 had, but because the introduction of lambdas—anonymous function literals—will change a lot of the ways we can express concepts in Java, and that’s going to ripple throughout the language and the ecosystem. (Well, over time, it will—it’s hard to say exactly how much things will change in the days and months immediately following 8’s release.)

I won’t go into the details of Java 8’s new syntax—that’s not only still being finalized, but it’s also been pretty well-documented and discussed elsewhere (including a forthcoming Java Magazine issue from Oracle TechNet on the subject that’s been written by yours truly), and I only have a few minutes to write this in between flights home from a conference, to boot. For those who are familiar with lambdas, suffice to say that Java lambdas will look astonishingly like Scala or C# lambdas, partly because there’s really only a few ways you can make lambdas look in a C-style language, and partly because the folks writing the new features want the syntax to look familiar to programmers, and borrowing somebody else’s syntax (or at least big chunks of it) is a good way to do that.
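
(For the curious, here’s a quick sketch of what that syntax looks like. Huge caveat: the spec was still being finalized as of this writing, so treat this as an illustration of the draft syntax rather than gospel, and note that the class and variable names are mine, invented purely for the example:)

    import java.util.Arrays;
    import java.util.Collections;
    import java.util.Comparator;
    import java.util.List;

    public class LambdaSketch {
        public static void main(String[] args) {
            List<String> names = Arrays.asList("Charlotte", "Michael", "Ted");

            // Parameters, an arrow, a body: nearly the same shape as
            // C#'s (s1, s2) => ... or Scala's (s1, s2) => ...
            Comparator<String> byLength = (s1, s2) -> s1.length() - s2.length();

            Collections.sort(names, byLength);
            System.out.println(names); // [Ted, Michael, Charlotte]
        }
    }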

Java 8: Adoption

When we talk about “adoption” of a given Java release, there are a couple of different concepts we should tease out and examine individually: those customers who will deploy their non-Java8-written code on top of the Java8 JVM; those customers who will start using libraries written using Java8 features; and those customers who will start writing their own designs and implementations in the Java8 syntax and style.

Customers deploying Java8 for the JVM. Frankly, I expect this to happen relatively quickly, in line with the Java releases before this one. The JVM gets better and better with each release, and there’s no reason to assume that this release will be any different, and once Oracle and the JVM itself have demonstrated that there’s little to no risk to dropping the new JVM into the production data center and firing up your current version of JBoss or Tomcat or whatever on top of it, customers will begin to take a hard look at the risks involved in doing so (if any) and make that transition. It’s really a high-win-low-cost thing to do, again, once the Java8 JVM has some actual production miles under its belt, so to speak. (This isn’t a new rewrite of the JVM, by the way—customers just don’t want to be the first one to discover stupid bugs. My Dad once summarized this attitude this way: “Pilots never want to fly the ‘A’ model of any aircraft.”) I give it about a year, maybe as early as six months, after the Java8 release before customers start putting Java8 into production.

Customers using libraries written using Java8 features. And let’s be clear, by “Java8 features” we’re talking about lambdas and virtual extension methods (a.k.a. “defender methods” from earlier draft specs), and by “libraries”, we’re talking about major open-source favorites like Spring, Hibernate, Commons Collections and so on. Essentially, the reason this is important as a category centers around the idea that Java developers, like a lot of developers, aren’t going to adopt the language features of the new Java until they see them in action—passing lambdas in to Spring for executing inside a database transaction, for example, or passing a lambda in to a collection for execution across a collection. The timeline here will be somewhat dependent on the library, and on the commitment of the developers around those libraries, but I’m a little less optimistic here—many of the open-source committers have historically been the loudest to cry foul over some of the changes Sun made to the language, and I’m not convinced yet that they have come around to embrace Oracle’s intentions regarding the language’s evolution. (In many ways, the image that strikes me is that of a large number of grumpy old men sitting around the office, gruffly tossing off one-liners like “Didn’t work like that in MY day” and “Don’t these kids realize that sometimes the old ways are the best ways?”.) I’m guessing that this transition will take longer, like two years at the minimum, and some libraries will never actually make the transition at all, choosing instead to remain “pre-Java8 compatible”, in the same way that some libraries chose to remain “pre-Java5 compatible” (and, IMHO, essentially put themselves out to pasture as a result).
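
(To make the “passing a lambda in to a collection” example concrete, here’s a minimal sketch using the internal-iteration style that was being proposed for the Java8 libraries--the exact method names and signatures were still in flux at the time, so consider this illustrative rather than definitive; the class name and order IDs are mine:)

    import java.util.Arrays;
    import java.util.List;

    public class CollectionSketch {
        public static void main(String[] args) {
            List<Integer> orderIds = Arrays.asList(101, 102, 103);

            // Internal iteration: the collection does the looping, and we
            // hand it the behavior to execute across the collection.
            orderIds.forEach(id -> System.out.println("Processing order " + id));

            // The Spring-transaction case would read much the same way, since
            // any single-method interface can accept a lambda in place of an
            // anonymous inner class.
        }
    }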

Customers writing their own designs and implementations in Java8. And really, what I mean here is “how long before they start creating classes that utilize lambdas in the domain object design”? Interestingly enough, I think this is tangentially related to how quickly the open-source community adopts Java8 (the previous point), because then customers will begin to see some design patterns and idioms that they can copy/follow/embrace/extend, but even if the open-source community roundly rejects Java8, I still see customers starting to design and build code using lambdas by 2015 or ‘16. Some will jump on it early, or be able to transition their existing anonymous-inner-class-based (that is, “poor man’s lambda”) code over to lambdas within months of Java8’s release, but it will take longer to percolate through the rest of the industry—there are more than a few companies out there still running Java6, for example, and those folks aren’t going to accelerate their use of Java8 just to get lambdas.
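
(And for the “poor man’s lambda” point, the transition really is mostly mechanical, which is why that code can move over early; a sketch of my own devising:)

    public class PoorMansLambda {
        public static void main(String[] args) {
            // The pre-Java8 "poor man's lambda": an anonymous inner class.
            Runnable oldStyle = new Runnable() {
                @Override
                public void run() {
                    System.out.println("the old way");
                }
            };

            // The same behavior as a lambda--an almost line-for-line rewrite.
            Runnable newStyle = () -> System.out.println("the new way");

            oldStyle.run();
            newStyle.run();
        }
    }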

Java 8: Perception

Having said all that, though, I think the overall perception of Java8’s adoption will be entirely dependent on how well Oracle addresses some of the recent “security flaws” that have been coming out of Java in the press. Even though the security flaws all seem to be applet- or client-side Java related, the perception that Java is somehow insecure likely has Microsoft chuckling internally—it certainly has Microsoft’s community (of which I and a number of my friends are a part) giggling and roaring and engaging in a few “Neener-neener-neener” moments; after all the crap that Java guys gave the Microsoft community back in the days of Bill Gates’ famous Security Memo, I can’t say that it’s unwarranted.

Aside from that, though, I think there’s no real reason not to expect adoption of Java8 to follow the same broad strokes path that previous Java releases have enjoyed, and thus within three years I fully expect that widescale adoption will be well under way.


.NET | Android | C# | F# | Industry | Java/J2EE | Languages | Ruby | Scala

 Tuesday, March 05, 2013
That Thing They Call "Unemployment"

TL;DR: I'm "unemployed", I'm looking to land a position as a director of development or similar kind of development management role; I'm ridiculously busy in the meantime.

My employer, after having suffered the loss of close to a quarter of its consultant workforce on a single project when that project chose to "re-examine its current approach", has decided that (not surprisingly) given the blow to its current cash flow, it's a little expensive keeping an architectural consultant of my caliber on staff, particularly since they don't appear to have the projects lined up for all these people to go to. Today was my last day, the paperwork and final check are processing through the system, there were no tears nor angry accusations from either side, and tomorrow I get to wake up "unemployed".

It's a funny word, that word "unemployed", because it indicates both a state of emotion and existence that I don't really share. On the emotional front, I'm not upset. A number of people expressed condolences ("I'm so sorry, Ted"), but frankly, I'm not angry, upset, hurt, or any of those other emotions that so often come with that. Part of my reaction stems from the fact that I've been expecting this for a while--the company and I had lots of plans in the beginning of my tenure there, but those plans more or less never got past the planning stage, and the focus was clearly always on billability, which at the level I'm at usually implies travel, something I'm not willing to commit to at the 80%/100% level that consulting clients often demand. We just grew apart, the company and I, and I think we've both known it for a few months now; this is just putting the signatures on the divorce and splitting up the CD collection. On the "existence" front, unemployment often means "waking up with nothing to do" and "no more money coming in", which, honestly, doesn't really apply, either. While I'm not going to be drawing a salary on a twice-monthly basis like I was for the last twenty months, it's not like I have no income coming in or nothing to do: I've got my columns with MSDN, CoDe, and Oracle TechNet; I've got two conferences this month (33rd Degree in Warsaw, and VSLive! in Vegas); I've got a contract in place for doing some content work and research for JetBrains on MPS, their language workbench; and I've just commissioned a course with PluralSight, "JVM Fundamentals", which will essentially be an amalgamation of the conference talks I did at NFJS over the past five or six years (ClassLoaders, threading and concurrency, collections, and so on), with a few more PluralSight courses and JetBrains articles/vidcasts/etc sketched out after that. If I'm "unemployed", then it's the busiest damn unemployment I've ever heard of.

And in all honesty, this enforced change on my career is not unwelcome--I've been thinking now for the past few months that it's time for me to challenge myself again, and the chosen challenge I've laid out for myself is to run a team, not an architecture. I want to find a position where I can take a team, throw us at a project, and produce something awesome... or at least acceptable... to the customer. After so many years of making fun of managers at conferences and such, I find myself wanting to become one. I'm not naive, I know this isn't all rainbows and unicorns, and that there will be times I just want to go back to the editor and write code because at least code is deterministic (most of the time), but it's an entirely new set of challenges, and frankly, I've been bored the last few years, I just have to admit that out loud. And I may not like it and in a year or two say to myself, "What was I THINKING?!?", but at least I'll have given it a shot, gotten the experience, and learned a few new things. And it's not like I'm going to give up technology completely, because I'm still going to be writing, blogging, recording, speaking, and researching. I don't think I could give that up if I tried.

So if you know of a company in the Greater Seattle area that's looking for someone who's got a ton of technical skills and an intuitive sense of people to run a development team, drop me a note. Oh, and don't be too surprised if the website gets a face lift in the next month or two--the design is a little old, and I want to play around with Bootstrap and some static-HTML-plus-Javascript kinds of design/development. Should be fun, in all my copious spare time...


Conferences | Development Processes | Industry | Personal | Reading | Social

 Thursday, February 28, 2013
When Apple decides what email you get to see

According to this report, Apple is now not only spam-filtering out emails containing particular phraseology (in this case, "barely legal teens"), but deleting them entirely, whether they're being sent to your account, or from your account. And what's even more interesting, apparently iCloud users agreed to give Apple that kind of power.

The precedent here is dangerous, and one that needs to be carefully examined--if corporations are going to exercise the ability to investigate/examine (even from an automated tool) the email that you're sending or receiving, then technically privacy is being violated. This has always been an issue with email--corporations have always maintained that email sent on their servers to their employees is their property, and the legal world has held that up to be the case (which is the same rationale that then gives DOJ and other prosecutors the right to examine corporate email in order to see if there's been any wrongdoing taking place, so this is a good thing). But when you're not an employee of the corporation, does the fact that the email travels through their servers mean that they have the right to view your email, even through an algorithm? Does an ISP have the right to read its subscribers' email, too? The fact that iCloud users agree to allow Apple this power is an interesting twist, but frankly the courts have seen fit to throw out waivers that were deemed unenforceable or illegal, so that's something of a red herring, I think.

The much deeper issue here is one of privacy: how much privacy is really left to us these days? And, speaking for myself, why don't more people care?

This also has me wondering if, maybe, email and Internet services haven't reached a level of ubiquity that suggests that they should be considered part of the national or state infrastructure--as in, should local/city/state/federal government maintain an email infrastructure (servers) with the same degree of privacy guarantees that they held up for the US Postal Service? Or, maybe even, should the US Postal Service be that entity?


Industry | iPhone | Personal

 Tuesday, February 26, 2013
"We Accept Pull Requests"

There are times when the industry in which I find myself does things that I just don't understand.

Consider, for a moment, this blog by Jeff Handley, in which he essentially says that the phrase "We accept pull requests" is "cringe-inducing":

Why do the words “we accept pull requests” have such a stigma? Why were they cringe-inducing when I spoke them? Because too many OSS projects use these words as an easy way to shut people up. We (the collective of OSS project owners) can too easily jump to this phrase when we don’t want to do something ourselves. If we don’t see the value in a feature, but the requester persists, we can simply utter, “We accept pull requests,” and drop it until the end of days or when a pull request is submitted, whichever comes first. The phrase now basically means, “Buzz off!”

OK, I admit that I'm somewhat removed from the OSS community--I don't have any particular dogs in that race, as the old saying goes--and the idea that "We accept pull requests" is a "Buzz off!" phrase is news to me. But I understand what Jeff is saying: a phrase has taken on a meaning of its own, and as is often the case, it's a meaning that's contrary to its stated one:

At Microsoft, having open source projects that actually accept pull requests is a fairly new concept. I work on NuGet, which is an Outercurve project that accepts contributions from Microsoft and many others. I was the dev lead for Razor and Web Pages at the time it went open source through Microsoft Open Tech. I collaborate with teams that work on EntityFramework, SignalR, MVC, and several other open source projects. I spend virtually all my time thinking about projects that are open source. Just a few years ago, this was unimaginable at Microsoft. Sometimes I feel like it still hasn’t sunk in how awesome it is that we have gotten to where we are, and I think I’ve been trigger happy and I’ve said “We accept pull requests” too often. I typically use the phrase in jest, but I admit that I have said it when I was really thinking “Buzz off!”

Honestly, I've heard the same kind of thing from the mouths of Microsoft developers during Software Development Reviews (SDRs), in the form of the phrase "Thank you for your feedback"--it's usually at the end of a fervent discussion when one of the reviewers is commenting on a feature being done (or not being done) and the team is in some kind of disagreement about the feature's relative importance or the implementation used. It's usually uttered in a manner that gives the crowd a very clear intent: "You can stop talking now, because I've stopped listening."

The weekend after the MVP summit, I was still regretting having said what I said. I wished all week I could take the words back. And then I saw someone else fall victim. On a highly controversial NuGet issue, the infamous Phil Haack used a similar phrase as part of a response stating that the core team probably wouldn’t be taking action on the proposed changes, but that there was nothing stopping those affected from issuing a pull request. With my mistake still fresh in my mind, I read Phil’s words just as I’m sure everyone in the room at the MVP summit heard my own. It sounded flippant and it had the opposite effect from what Phil intended or what I would want people thinking of the NuGet core team. From there, the thread started turning nasty. We were stuck arguing opinions and we were no longer discussing the actual issue and how it could be solved.

As Jeff goes on to mention, I got involved in that Twitter conversation, along with a number of others, and as he says, the conversation moved on to JabbR, but without me--I bailed on it for a couple of reasons. Phil proposed a resolution to the problem, though, that seemed to satisfy at least a few folks:

With that many mentions on the tweets, we ran out of characters and eventually moved into JabbR. By the end of the conversation, we all agreed that the words “we accept pull requests” should never be used again. Phil proposed a great phrase to use instead: “Want to take a crack at it? We’ll help.”

But frankly, I don't care for this phraseology. Yes, I understand the intent--the owners of open-source projects shouldn't brush off people's suggestions about things to do with the project in the future and shouldn't reach for a handy phrase that will essentially serve the purpose of saying "Buzz off". And keeping an open ear to your community is a good thing, yes.

What I don't like about the new phrase is twofold. First, if people use the phrase casually enough, eventually it too will be overused and interpreted to mean "Buzz off!", just as "Thank you for your feedback" became. But secondly, where in the world did it somehow become a law that open source projects MUST implement every feature that their users suggest? This is part of the strange economics of open source--in a commercial product, if the developers stray too far away from what customers need or want, declining sales will serve as a corrective force to bring them back around (or, if they don't, bankruptcy of either the product or the company will eventually follow). But in an open-source project, there's no real visible marker to serve as that accountability and feedback--and so the project owners, those who want to try and stay in tune with their users anyway, feel a deeper responsibility to respond to user requests. And on its own, that's a good thing.

The part that bothers me, though, is that this new phraseology essentially implies that any open-source project has a responsibility to implement the features that its users ask for, and frankly, that's not sustainable. Open-source projects are, for the most part, maintained by volunteers, but even those that are backed by commercial firms (like Microsoft or GitHub) have finite resources--they simply cannot commit resources, even just "help", to every feature request that any user makes of them. This is why the "We accept pull requests" was always, to my mind, an acceptable response: loosely translated, to me at least, it meant, "Look, that's an interesting idea, but it either isn't on our immediate roadmap, or it takes the project in a different direction than we'd intended, or we're not even entirely sure that it's feasible or doable or easily managed or what-have-you. Why don't you take a stab at implementing it in your own fork of the code, and if you can get it to some point of implementation that you can show us, send us a copy of the code in the form of a pull request so we can take a look and see if it fits with how we see the project going." This is not an unreasonable response: if you care passionately about this feature, either because you think it should be there or because your company needs that feature to get its work done, then you have the time, energy and motivation to at least take a first pass at it and prove the concept (or, sometimes, prove to yourself that it's not such an easy request as you thought). Cultivating a sense of entitlement in your users is not a good practice--it's a step towards a completely unsustainable model that could, if not curbed, eventually lead to the death of the project as the maintainers essentially give up when faced with feature request after feature request.

I applaud the efforts on the part of project maintainers, particularly those at large commercial corporations involved in open source, to avoid "Buzz off" phrases. But it's not OK for project maintainers to feel like they are under a responsibility to implement any particular feature or idea suggested by a user. Some ideas are going to be good ones, some are going to be just "off the radar" of the project's core committers, and some are going to be just plain bad. You think your idea is one of those? Take a stab at it. Write the code. And if you've got it to a point where it seems to be working, then submit a pull request.

But please, let's not blow this out of proportion. Users need to cut the people who give them software for free some slack.

(EDIT: I accidentally referred to Jeff as "Anthony" in one place and "Andrew" in another. Not really sure how or why, but... Edited.)


.NET | Android | Azure | C# | C++ | Conferences | Development Processes | F# | Industry | iPhone | Java/J2EE | Languages | LLVM | Mac OS | Objective-C | Python | Reading | Ruby | Scala | Security | Solaris | Visual Basic | VMWare | XML Services

 Thursday, February 21, 2013
Java was not the first

Charlie Kindel blogs that he thinks James Gosling (and the rest of Sun) screwed us all with Java and its "Write Once, Run Anywhere" mantra. It's catchy, but it's wrong.

As with a lot of Charlie's blogs, he nails parts of this one squarely on the head:

WORA was, is, and always will be, a fallacy. ... It is the “Write once…“ part that’s the most dangerous. We all wish the world was rainbows and unicorns, and “Write once…” implies that there is a world where you can actually write an app once and it will run on all devices. But this is precisely the fantasy that the platform vendors will never allow to become reality. ...

And, given his current focus on building a mobile startup, he of course takes this lesson directly into the "native mobile app vs HTML 5 app" discussion that I've been a part of on way too many speaker panels and conference BOFs and keynotes and such:

HTML5 is awesome in many ways. If applied judiciously, it can be a great technology and tool. As a tool, it can absolutely be used to reduce the amount of platform specific code you have to write. But it is not a starting place. Starting with HTML5 is the most customer unfriendly thing a developer can do. ... Like many ‘solutions’ in our industry the “Hey, write it once in HTML5 and it will run anywhere” story didn’t actually start with the end-user customer. It started with idealistic thoughts about technology. It was then turned into snake oil for developers. Not only is the “build a mobile app that hosts a web view that contains HTML5” approach bass-ackwards, it is a recipe for execution disaster. Yes, there are examples of teams that have built great apps using this technique, but if you actually look at what they did, they focused on their experience first and then made the technology work. What happens when the shop starts with “we gotta use HTML5 running in a UIWebView” is initial euphoria over productivity, followed by incredible pain doing the final 20%.

And he's flat-out right about this: HTML 5, as an application development technology, takes you about 60-80% of the way home, depending on what you want your application to do.

In fact, about the only part of Charlie's blog post that I disagree with is the part where he blames Gosling and Java:

I blame James Gosling. He foisted Java on us and as a result Sun coined the term Write Once Run Anywhere. ... Developers really want to believe it is possible to “Write once…”. They also really want to believe that more threads will help. But we all know they just make the problems worse. Just as we’ve all grown to accept that starting with “make it multi-threaded” is evil, we need to accept “Write once…” is evil.

It didn't start with Java--it started well before that, with a set of cross-platform C++ toolkits that made the same kind of promise: write your application in platform-standard C++ to our API, and we'll have the libraries on all the major platforms (back in those days, it was Windows, Mac OS, Solaris OpenView, OSF/Motif, and a few others) and it will just work. Even Microsoft got into this game briefly (I worked at Intuit, and helped a consultant who was struggling to port QuickBooks, I think it was, over to the Mac using Microsoft's short-lived "MFC For Mac OS" release). And even before that, we had the discussions of "Standard C" and the #ifdef tricks we used to play to struggle to get one source file to compile on all the different platforms that C runs on.

And that, folks, is the heart of the matter: long before Gosling took his fledgling, failed set-top-box project (named Oak) and looked around for a space to apply it to next, developers... no, let's get that right, "developers and their managers who hate the idea of violating DRY by having the code in umpteen different codebases" have been looking for ways to have a single source base that runs across all the platforms. We've tried it with portable languages (see C, C++, Java, for starters), portable libraries (in the C++ space see Zinc, zApp, XVT, Tools.h++), portable containers (see EJB, the web browser), and now portable platforms (see PhoneGap/Cordova, Titanium, etc) and portable cross-compilers (see MonoTouch/MonoDroid, for recent examples), and I'm sure there will be other efforts along these lines for years and decades to come. It's a noble goal, but the major players in the space we are targeting--whether that be operating systems, browsers, mobile platforms, console game devices, or whatever comes next two decades from now--will not allow their systems to be commoditized that easily. Because at the heart of it, that's exactly what these "cross-platform" tools and languages and libraries are trying to do: reduce the underlying "thing" to a commodity that lacks interest or impact.

Interestingly enough, as a side-note, one thing I'm starting to notice is that the more pervasive mobile devices become and the more mobile applications we see reaching those devices, the less and less "device-standard" those interfaces are trying to look even as they try to achieve cross-platform similarities. Consider, for a moment, the Fly Delta app on iPhone: it doesn't really use any of the standard iOS UI metaphors (except for some of the basic ones), largely because they've defined their own look-and-feel across all the platforms they support (iOS and Android, at least so far). Ditto for the CNN and USA Today apps, as well as the ESPN app, and of course just about every game ever written for any of those platforms. So even as Charlie argues:

The problem is each major platform has its own UI model, its own model for how a web view is hosted, its own HTML rendering engine, and its own JavaScript engine. These inter-platform differences mean that not only is the platform-specific code unique, but the interactions between that code and the code running within the web view becomes device specific. And to make matters worse intra-platform fragmentation, particularly on the platform with the largest number of users, Android, is so bad that this “Write Once..” approach provides no help.
We are starting to see mobile app developers actually striving to define their own UI model entirely, with only a passing nod to the standards of the device on which they're running. Which then makes me wonder if we're going to start to see new portable toolkits that define their own unique UI model across each of these platforms, or that somehow allow developers to define their own--a UI-model toolkit, so to speak. Which would be an interesting development, but one that will eventually run into many of the same problems as the others did.


.NET | Android | Azure | C# | C++ | Development Processes | F# | Flash | Industry | iPhone | Java/J2EE | Languages | LLVM | Mac OS | Objective-C | Parrot | Review | Ruby | Windows

Thursday, February 21, 2013 4:08:04 PM (Pacific Standard Time, UTC-08:00)
Comments [6]  | 
 Thursday, February 14, 2013
Um... Security risk much?

While cruising through the Internet a few minutes ago, I wandered across Meteor, which looks like a really cool tool/system/platform/whatever for building modern web applications. JavaScript on the front, JavaScript on the back, Mongo backing, it's definitely something worth looking into, IMHO.

Thus emboldened, I decide to look at how to start playing with it, and lo and behold I discover that the instructions for installation are:

curl https://install.meteor.com | sh
Um.... Wat?

Now, I'm sure the Meteor folks are all nice people, and they're making sure (via the use of the https URL) that whatever gets piped into my shell is, in fact, coming from their servers, but I don't know these people from Adam or Eve, and that's taking an awfully big risk on my part, just letting them pipe whatever-the-hell-they-want into a Terminal shell. Hell, they don't even need root access to fill my hard drive with whatever random bits of goo they want.

I looked at the shell script, and it's all OK, mind you--the Meteor people definitely look trustworthy, I want to reassure everyone of that. But I'm really, really hoping that this is NOT their preferred mechanism for delivery... nor anyone's preferred mechanism for delivery... because that's got a gaping security hole in it about twelve miles wide. It's just begging for some random evil hacker to post a website saying, "Hey, all, I've got this really cool framework y'all should try..." and bury the malware inside the code somewhere.
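
For what it's worth, a more defensive way to handle this kind of install--and this is just a minimal sketch, using the same installer URL they publish--is to pull the script down into a file, read it, and only then run it:

curl -o meteor-install.sh https://install.meteor.com    # save the script instead of piping it to sh
less meteor-install.sh                                  # read what it's actually going to do
sh meteor-install.sh                                    # run it only after you've looked

It doesn't make the Meteor folks any more or less trustworthy, but at least you've seen what you're inviting in before you open the door.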

Which leads to today's Random Thought Experiment of the Day: How long would it take the open source community to discover malware buried inside of an open-source package, particularly one that's in widespread use, a la Apache or Tomcat or JBoss? (Assume all the core committers were in on it--how many people, aside from the core committers, actually look at the source of the packages we download and install, sometimes under root permissions?)

Not saying we should abandon open source; just saying we should be responsible citizens about who we let in our front door.
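
And "responsible" can be concrete: most big open-source projects publish checksums and GPG signatures alongside their releases, and verifying them at least proves that the bits you downloaded are the bits the committers published. A minimal sketch--the URLs and file names below are illustrative, not real, and note that this does nothing for the thought experiment above, where the committers themselves are the bad guys:

curl -O https://example.org/dist/coolframework-1.0.tar.gz        # the release itself
curl -O https://example.org/dist/coolframework-1.0.tar.gz.asc    # its detached GPG signature
curl https://example.org/dist/KEYS | gpg --import                # the project's signing keys (ideally obtained over a separate, trusted channel)
gpg --verify coolframework-1.0.tar.gz.asc coolframework-1.0.tar.gz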

UPDATE: Having done the install, I realize that it's a two-step download... the shell script just figures out which OS you're on, which tool (curl or wget) to use, and asks you for root access to download and install the actual distribution. Which, honestly, I didn't look at. So, here's hoping the Meteor folks are as good as I'm assuming them to be....

Still, it highlights that this is a huge security risk.


.NET | Android | Azure | C# | C++ | Development Processes | F# | Flash | Industry | iPhone | Java/J2EE | Languages | LLVM | Mac OS | Objective-C | Parrot | Personal | Python | Reading | Ruby | Scala | Security | Social | Solaris | Visual Basic | VMWare | WCF | Windows | XML Services | XNA

Thursday, February 14, 2013 8:25:38 PM (Pacific Standard Time, UTC-08:00)
Comments [4]  | 
 Saturday, February 02, 2013
Last Thoughts on "Craftsmanship"

TL;DR Live craftsmanship, don't preach it. The creation of a label serves no purpose other than to disambiguate and distinguish. If we want to hold people accountable to some sort of "professionalism", then we have to define what that means. I found Uncle Bob's treatment of my blog heavy-handed and arrogant. I don't particularly want to debate this anymore; this is my last take on the subject.


I will freely admit, I didn't want to do this. I really didn't. I had hoped that after my second posting on the subject, the discussion would kind of fade away, because I think we'd (or I'd, at least) wrung about the last few drops of discussion and insight and position out of it. The same memes were coming back around, the same reactions, and I really didn't want to perpetuate the whole thing ad infinitum, because I don't really think that's the best way to reach any kind of result or positive steps forward. I'd said my piece; I was happy about it.

Alas, such was not to be. Uncle Bob posted his thoughts, and quite frankly, I think he did a pretty bad job of hearing what I had to say, couching it in terms of populism (I stopped counting the number of times he used that word at six or so) even as he framed within it something of his own elitist argument.

Bob first points us all at the Manifesto for Software Craftsmanship. Because everyone who calls themselves a craftsman has to obey this manifesto. It's in the rules somewhere. Sort of like the Agile Manifesto--if you're not a signatory, you're doing it wrong.

(Oh, I know, to suggest that there is even the smallest thing wrong with the Agile Manifesto borders on heresy. Which, if that's the reaction you have, should be setting off a few warning bells in your head--something about replacing dogma with dogma.)

And you know what? I actually agree with most of the principles of the Craftsmanship Manifesto. It's couched in really positive, uplifting language: who doesn't want "well-crafted" software, or "steadily-increasing value", or "productive partnerships"? It's a wonderfully-worded document that unfortunately is way short on details, but hey, it should be intuitively obvious to anyone who is a craftsman, right?

See, this is part of my problem. Manifestos tend to be long on rhetoric, but very, very short on details. The Agile Manifesto is another example. It stresses "collaboration" and "working software" and "interactions" and "responding to change", but then people started trying to figure out how to apply it, and the knife-fights between people arguing XP vs. Scrum vs. Kanban vs. your-homebrewed-craptaculous-brand-of-"little-a"-agile turned into brushfire wars. It's wonderful to say what the end result should be, but putting that into practice is a whole different ball of wax. So I'm a little skeptical any time somebody points to a Manifesto and says, "I believe in that, and that should suffice for you".

Frankly, if we want this to have any weight whatsoever, I think we should model something off the Hippocratic Oath instead--it at least has prescriptive advice within it, telling doctors what they can and cannot (or, perhaps worded more accurately, should or should not) do. (I took something of a stab at this six years ago. It could probably use some work and some communal input; it was a first iteration.)

Besides (beware: here comes my attempt at a false-association argument, included purely for snarkiness purposes!), other manifestos haven't always worked out so well.

So by "proving [that I misinterpreted the event] by going to the Manifesto", you're kind of creating a circular argument: "What happened can't have been because of Software Craftsmanship, because look, there, in the Manifesto, it says we don't do that, so clearly, we can't have done that. It says it, right there! Seriously!"

The Supposed "Segregation"

Bob then says I'm clearly mistaken about "craftsmen" creating a segregation, because there's nothing about segregation in the manifesto:

any intimation of those who "get it" vs. those who don't; or any mention of the "right" tools or the "right" way. Indeed, what I see instead is a desire to steadily add value by writing well-crafted software while working in a community of professionals who behave as partners with their customers. That doesn't sound like "narcissistic, high-handed, high-minded" elitism to me.
Hold on to that thought for a bit.

Bob then makes an interesting leap of logical assumption here. He takes my definition of a "software laborer":

"somebody who comes in at 9, does what they're told, leaves at 5, and never gives a rat's ass about programming except for what they need to know to get their job done [...] who [crank] out one crappy app after another in (what else?) Visual Basic, [that] were [...] sloppy, bloated, ugly [...] cut-and-paste cobbled-together duct-tape wonders."
and interprets it as
Now let's look past the hyperbole, and the populist jargon, and see if we can identify just who Ted is talking about. Firstly, they work 9-5. Secondly, they get their job done. Thirdly, they crank out lots of (apparently useful) apps. And finally, they make a mess in the code. The implication is that they are not late, have no defects, and their projects never fail.
That's weird. I go back and read my definition over and over again, and nowhere do I see me suggesting that they are never late, no-defect, and never-fail projects. Is it possible that Bob is trying to set up his next argument by reductio ad absurdum, basically by saying, "These laborers that Ted sets up, they're all perfect! They walk on water! They must be the illegitimate offspring of Christ himself! Have you met them? No? Oh, then they must not exist, and therefore his entire definition of the 'laborer' is whack, as these young-un kids like to say."

(See what I did there? I make Bob sound old and cantankerous. Not that he would ever do the same thing himself, using his years of experience as a subtle bludgeon, in his argument, against anyone who's younger and therefore less experienced--less professional, by implication--right?

Programming is barely 60 years old. I, personally, have been programming for 43+ of those years.
Oh.)

Having sort of wrested my definition of the laborer away from me, Bob goes on:

I've never met these people. In my experience a mess in the code equates to lots of overtime, deep schedule overruns, intolerable defect rates, and frequent project failure -- not to mention eventual redesign.
Funny thing. I've seen "crafted" projects that fell victim to the same problems. Matter of fact, I had a ton of people (so it's not just my experience, folks; clearly there are a few more examples out there) email and comment to me that they saw "craftsmen" come in and take what could've been a one-week project and turn it into a six-month-or-more project by introducing a bunch of stuff that didn't really need to be there, but was added in order to "add value" to the code and make it "well-crafted". (I could toss off some of the software terms that were cited as the reasons behind the "adding of value"--decoupled design, dependency injection, reusability, encapsulation, and others--but since those aren't in the Manifesto either, it's easy to say in the abstract that the people who did those projects weren't really adding value, even though these same terms seem to show up on every single project during architecture and design, agile or otherwise.)

Bob goes on to sort of run with this theme:

Ted has created a false dichotomy that appeals to a populist ideology. There are the elite, condescending, self-proclaimed craftsmen, and then there are the humble, honorable, laborers. Ted then declares his allegiance to the latter... .
Well, last time I checked, all I have to do to be listed amongst the craftsmen is sign a web page, so "self-proclaimed" seems pretty accurate as a title. And "elite"? I dunno, can anyone become a craftsman? If so, then the term as a label has no meaning; if not, then yes, there's some kind of segregation, and it sure sounds like you're preaching from on high, particularly when you tell me that I've created a "false dichotomy" that appeals to a "populist ideology":
Generally, populists tend to claim that they side with "the people" against "the elites". While for much of the twentieth century, populism was considered to be a political phenomenon mostly affecting Latin America, since the 1980s populist movements and parties have enjoyed degrees of success in First World democracies such as the USA, Canada, Italy, the Netherlands and Scandinavian countries.
So apparently I'm trying to appeal to "the people", even though Bob will later tell us that we're all the same people. (Funny how there's a lot of programmers who feel like they're being looked down on by the elites--and this isn't my interpretation, read my blog's comments and the responses that have mushroomed on Twitter.) Essentially, Bob will argue later that there is no white-collar/blue-collar divide, even though according to him I'm clearly forming an ideology to appeal to people in the blue-collar camp.

So either I'm talking into a vacuum, or there's more of a divide than Bob thinks. You make the call on that one.

Shall we continue?

He strengthens his identity with, and affinity for, these laborers by telling a story about a tea master and a samurai (or was it some milk and a cow) which further extends and confuses the false dichotomy.
Nice non-sequitur there, Bob! By tossing in that "some milk and a cow", you neatly rob my Zen story of any power whatsoever! You just say it "extends and confuses the false dichotomy", without any real sort of analysis or discussion (that comes later, if you read through to the end), and because you're a craftsman, and I'm just appealing to populist ideology, my story no longer has any meaning! Because reductio ad make-fun-of-em is also a well-recognized and well-respected logical analysis in debating circles.

Oh, the Horror! ... of Ted's Psyche

Not content to analyze the argument, because clearly (he says this so many times, it must be true) my argument is so weak as to not stand on its own (even though I'm not sure, looking back at this point, that Bob has really attacked the argument itself at all, other than to say, "Look at the Manifesto!"), he decides to engage in a little personal attack:

I'm not a psychoanalyst; and I don't really want to dive deep into Ted's psyche to unravel the contradictions and false dichotomies in his blog. However, I will make one observation. In his blog Ted describes his own youthful arrogance as a C++ programmer... It seems to me that Ted is equating his own youthful bad behavior with "craftsmanship". He ascribes his own past arrogance and self-superiority with an entire movement. I find that very odd and very unfortunate. I'm not at all sure what prompted him to make such a large and disconnected leap in reasoning. While it is true that the Software Craftsmanship movement is trying to raise awareness about software quality; it is certainly not doing so by promoting the adolescent behavior that Ted now disavows.
Hmm. One could argue that I was just admitting that I'm not perfect, nor do I need to profess to be, but maybe that's not a "craftsman's" approach. Or that I was trying to show others my mistakes so they could learn from them. You know, as a way of trying to build a "community of professionals", so that others don't have to go through the mistakes I made. But that would be psychoanalyzing, and we don't want to do that. Others didn't seem to have any problem understanding the "very large and disconnected leap in reasoning", and I would hate to tell someone with over twice my years of programming experience how to understand a logical argument, so how about we frame the discussion this way: I tend to assume that someone behaving in a way that I used to behave (or still behave) is doing so for the same reasons that I do. (It's a philosophy of life that I've found useful at times.) So I assume that craftsmen take the path they take because they want to take pride in what they do--it's important to them that their code sparkle with elegance and beauty, because that's how code adds value.

Know what? I think one thing that got lost somewhere in all this debate is that value is only value if it's of value to the customer. And in a lot of the "craftsmanship" debates, I don't hear the customer's voice being brought up all that much.

You remember all those crappy VB apps that Bob maligned earlier? Was the customer happy? Did anybody stop to ask them? Or was the assumption that, since the code was crappy, the customer implicitly must be unhappy as well? Don't get me wrong, there's a lot of crappy code out there that doesn't make the customer happy. As a matter of fact, I'll argue that any code that doesn't make the customer happy is crap, regardless of what language it's written in, what patterns it uses, how decoupled or injected it is, or what shiny new databases it stores data into. Value isn't value unless it's value to the person who's paying for the code.

Bob Discusses the Dichotomy

Eh, I'm getting tired of writing all this, and I'm sure you're getting tired of reading it, so let's finish up and call it a day. Bob goes on to dissect my false dichotomy, starting with:

Elitism is not encouraged in the Software Craftsmanship community. Indeed we reject the elitist attitude altogether. Our goal is not to make others feel bad about their code. Our goal is to teach programmers how to write better code, and behave better as professionals. We feel that the software industry urgently needs to raise the bar of professionalism.
Funny thing is, Bob, one could argue that you're taking a pretty elitist stance yourself with your dissection of my blog post. Nowhere do I get the benefit of the doubt, nor is there an effort to try and bring yourself around to understand where I'm coming from; instead, I'm just plain wrong, and that's all there is to it. Perhaps you will take the stance that "Ted started it, so therefore I have to come back hard", but that doesn't strike me as humility; that strikes me as preaching from a pulpit. (I'd use a Zen story here to try and illustrate my point, but I'm afraid you'd characterize it as another "milk and a cow" story.)

But "raising the bar of professionalism", again, misses a crucial point, one that I've tried to raise earlier: Who defines what that "professionalism" looks like? Does the three-line Perl hack qualify as "professionalism" if it gets the job done for the customer so they can move on? Or does it need to be rewritten in Ruby, using convention over configuration, and a whole host of dynamic language/metaprogramming/internal DSL tricks? What defines professionalism in our world? In medicine, it's defined pretty simply: is the patient healthier or not after the care? In the legal profession, it's "did we represent the client to the best of our ability while remaining in compliance with the rules of ethics laid down by the bar and the laws of the entity in which we practice?" What defines "professionalism" in software? When you can tell me what that looks like, in concrete, without using words that allow for high degree of interpretation, then we can start to make progress towards whether or not my "laborers" are, in actuality, professionals.

We continue.

There are few "laborers" who fit the mold that Ted describes. While there are many 9-5 programmers, and many others who write cut-paste code, and still others who write big, ugly, bloated code, these aren't always the same people. I know lots of 12-12 programmers who work hellish hours, and write bloated, ugly, cut-paste code. I also know many 9-5 programmers who write clean and elegant code. I know 9-5ers who don't give a rat's ass, and I know 9-5ers who care deeply. I know 12-12ers who's only care is to climb the corporate ladder, and others who work long hours for the sheer joy of making something beautiful.
Of course there aren't, Bob; you took my description and sort of twisted it. (See above.) And yes, I'll agree with you: there are lots of 9-5 developers and lots of 12-12 developers, lots of developers who write great code and lots who write crap code--and what's even funnier about this discussion is that sometimes they're all the same person! (They do that just to defy this kind of stereotyping, I'm sure.) Maybe it's just the companies I've worked for compared to the companies you've worked for, but I can rattle off a vastly larger list of names who fit in the "9-5" category than who fit into the "12-12" category. All of them wanted to do a good job, I believe, but I believe that because I believe that every human being innately wants to do things they are proud of and can point to with a sense of accomplishment. Some will put more energy into it than others. Some will have more talent for it than others. Just like dancing. Or farming. Or painting. Or just about any endeavor.

The Real Problem

Bob goes on to talk about the youth of our industry, but I think the problem is a different one. Yes, we're a young industry, but frankly, so are Marketing and Sales (they've only really existed in their modern forms for about sixty or seventy years, maybe a hundred if you stretch the definitions a little), and ditto for medicine (remember, it was only about 150 years ago that surgeons were also barbers). Yes, we have a LOT to learn yet, and we're making a lot of mistakes, I think, because our youth is causing us to reach out to other, highly imperfect metaphor/role-model industries for terminology and inspiration. (Cue the discussion of "software architecture" vs. "building architecture" here.) Personally, I think we've learned a lot, we're continuing to learn more, and we're reaching a point where looking at other industries for metaphors is reaching a practical end in terms of utility to us.

The bigger problem? Economics. The supply and demand curve.

Neal Ford pointed out on an NFJS panel a few years back that the demand for software vastly exceeds the supply of programmers to build it. I don't know where he got that--whether he read it somewhere or it formed out of his own head--but he's absolutely spot-on right, and it seriously throws the whole industry out of whack.

If the software labor market were like painting, or car repair, or accounting, then the finite demand for people in those positions would mean that those who couldn't meet customer satisfaction would eventually starve and die. Or, more likely, take up some other career. It's a natural way to take the bottom 20% of the bell curve of potential practitioners (the low tail) and keep them from ruining some customers' lives. If you're a terrible painter, no customers will use you (at least, not twice), and while I suppose you could pick up and move to a new market every year or so until you're run out of town on a rail for crappy work, quite honestly, most people will just give up and go do something else. There are thousands--millions--of actors and actresses in Southern California who never make it to stage or screen, and they wait tables until they find a new thing to pursue that adds value to their customers' lives in such a way that they can make a living.

But software... right now, if you walk out into the middle of the street in San Francisco wearing a T-shirt that says, "I write Rails code", you will have job offers flying after you like the paper airplanes in Disney's just-released-to-the-Internet video short. IT departments are throwing huge amounts of cash into mechanisms, human or otherwise, working or otherwise, to help them find developers. Software engineering has been at the top of the list of "best jobs" for several years, commanding high salaries in a relatively stress-free environment, all in a period of time that many have equated to the worst economic cycle since the Great Depression. Don't believe me? Take a shot yourself: go to a Startup Weekend and sign up as a developer--there are hundreds of people with new app ideas (granted, most of them total fantasy) who are just looking for a "technical co-founder" to help them see their dream to reality. IT departments will take anybody right now, and I do mean anybody. I'm reasonably convinced that half the reason software development gets outsourced overseas is that it's a choice between putting up with doing the development overseas, even with all of the related problems and obstacles that come up, or not doing the development at all for lack of being able to staff the team to do it. (Which would you choose, if you were the CTO--some chance of success, or no chance at all?)

Wrapping up

Bob wraps up with this:

The result is that most programmers simply don't know where the quality bar is. They don't know what disciplines they should adopt. They don't know the difference between good and bad code. And, most importantly, they have not learned that writing good clean code in a disciplined manner is the fastest and best way get the job done well.

We, in the Software Craftsmanship movement are trying to teach those lessons. Our goal is to raise the awareness that software quality matters. That doing a good job means having pride in workmanship, being careful, deliberate, and disciplined. That the best way to miss a deadline, and lay the seeds of defeat, is to make a mess.

We, in the Software Craftsmanship movement are promoting software professionalism.
Frankly, Bob, you sort of reject your own "we're not elitists" argument by making it very clear here: "most programmers simply don't know where the quality bar is. They don't know .... They don't know.... They have not learned. ... We, in the Software Craftsmanship movement are trying to teach those lessons." You could not sound more elitist if you quoted the colonial powers "bringing enlightenment" to the "uncivilized" world back in the 1600s and 1700s. They are an ignorant, undisciplined lot, and you have taken this self-appointed messiah role to bring them into the light.

Seriously? You can't see how that comes across as elitist? And arrogant?

Look, I really don't mean to perpetuate this whole argument, and I'm reasonably sure that Uncle Bob is already firing up his blog editor to point out all the ways in which my "populist ideology" is falsely dichotomous or whatever. I'm tired of this argument, to be honest, so let me try to sum up my thoughts on this whole mess in what I hope will be a few, easy-to-digest bullet points:

  1. Live craftsmanship, don't preach it. If you hold the craftsman meme as a way of trying to improve yourself, then you and I have no argument. If you put "software craftsman" on your business cards or website, or write Manifestos that you try to use as a bludgeon in an argument, then it seems to me that you're trying to distinguish yourself from the rest, and that, to me, smacks of elitism. You may not think of yourself as elitist, but to a lot of the rest of the world, that's exactly how you're coming off. Sorry if that's not how you intended it.
  2. Value is only value if the customer sees it as value. And the customer gets to define what is valuable to them, not you. You can (and should) certainly try to work with them to understand what they see as value, and you can (and should) certainly try to help them see how there may be value in ways they don't see today. But at the end of the day, they are the customer, they are writing the checks, and even after advising them against it, if they want to prioritize quick-and-dirty over longer-and-elegant, then (IMHO) that's what you do. Because they may have reasons for choosing that approach that they simply don't care to share with you, and it's their choice.
  3. The creation of a label serves no purpose other than to disambiguate and distinguish. If there really is no blue-collar programming workforce, Bob, then I challenge you to drop the term "craftsman" from your bio, profile, and self-description anywhere it appears, and replace it with "programmer". Or else refer to all software developers as "craftsmen" (in which case the term becomes meaningless, and thus useless). Because, let's face it, how many doctors do you know who put "Hippocratic-sworn" somewhere on their business cards?
  4. If we want to hold people accountable to some sort of "professionalism", then we have to define what that means. The definition of the term "professional" is not really what we want, in practice, for it's usually defined as "somebody who got paid to do the job". The Craftsmanship Manifesto seems to want some kind of code of ethics or programmer equivalent to the Hippocratic Oath, so that the third precept isn't "a community of people who are paid to do what they do", but something deeper and more meaningful and concrete. (I don't have that definition handy, by the way, so don't look to me for it. But I will also roundly reject anyone who tries to use the Potter Stewart-esque "I can't define it but I know it when I see it" approach, because now we're back to individual interpretation.)
  5. I found Uncle Bob's treatment of my blog heavy-handed and arrogant. In case that wasn't obvious. And I reacted in a similar manner, something for which I will apologize now. By reacting in that way, I'm sure I perpetuate the blog war, and truthfully, I have a lot of respect for Bob's technical skills; I was an avid fan of his C++ articles for years, and there are a lot of good technical ideas and concepts in them that any programmer would be well-advised to learn. His technical skill is without question; his compassion and empathy, however, might be in question. (As are mine, for stooping to that same level.)
Peace out.


.NET | C# | C++ | Conferences | Development Processes | F# | Industry | Java/J2EE | Languages | Parrot | Personal | Reading | Review | Social | Visual Basic | Windows

Saturday, February 02, 2013 4:33:12 AM (Pacific Standard Time, UTC-08:00)
Comments [3]  | 
 Friday, January 25, 2013
More on "Craftsmanship"

TL;DR: To all those who dissented, you're right, but you're wrong. Craftsmanship is a noble meme when it's something that somebody holds as a personal goal, but it too often comes across as a way to beat up on and denigrate others who don't choose to invest significant time and energy into programming. The Zen Masters didn't walk around the countryside proclaiming, "I am a Zen Master!"

Wow. Apparently I touched a nerve.

It's been 48 hours since I posted On the Dark Side of 'Craftsmanship', and it's gotten a ton of interest, as well as a few syndicated re-posts (DZone and a few others). Comments to the blog included a response from Dave Thomas, other blog posts have been brought to my attention, and Twitter was on FIRE with people pinging me with their thoughts, which turn out to be across the spectrum, approving and dissenting. Not at all what I really expected to happen, to be honest--I kinda thought it would get lost in the noise of others commenting around the whole thing.

But for whatever reason, it's gotten a lot of attention, so I feel a certain responsibility to respond and explain to some of the dissenters who've responded. Not to defend, per se, but to at least demonstrate some recognition and attempt to clarify my position where I think it's gotten mis-heard. (To those who approved of the message, thank you for your support, and I'm happy to have vocalized something you felt unable, unwilling, unheard, or too busy to vocalize yourself. I hope my explanations here continue to represent your opinions, but if not, please feel free to let me know.)

A lot of the opinions centered around a few core ideas, it seems, so let me try and respond to those first.

You're confusing "craftsmanship" with a few people behaving badly. That may well be, but those who behaved badly included at least one who holds himself up as a leader of the craftsman movement and has held his actions up as indications of how "craftsmen" should behave. When you do this, you invite this kind of criticism and association. So if the movement is being given a black eye because of the actions of a single individual, well, now you know how a bunch of moderate Republicans feel about Paul Ryan.

Corey is a nice guy, he apologized, don't crucify him. Of course he is. Corey is a nice guy--and, speaking well to his character, he apologized almost immediately when it all broke. I learned a long time ago that "true sorry" means you (a) apologize for your actions, (b) seek to remedy the damage your actions have caused ("make it right", in other words), and (c) avoid making the same mistake in the future. From a distance, it seems like he feels contrition, and he has publicly apologized for his actions. I would hope he's reached out to Heather directly to try and make things right with her, but that's between the two of them. Whether he avoids this kind of activity in the future remains to be seen. I think he will, but that's because I think he's learned a harsh lesson about being in the spotlight--it tends to be a harsh place to be. The rest of this really isn't about Corey and Heather anymore, so as far as I'm concerned, that thread is complete.

You misunderstand the nature of "craftsmanship". Actually, no, I don't. At its heart, the original intent of "craftsmanship" was a constant striving to be better at what you do, and taking pride in the things that you do. It's related to the Japanese concept of kaizen--continuous improvement--which says, in essence, that we are constantly striving to get better. The samurai sought to become better swordsmen, constantly challenging each other to prove their mettle against one another, improving their skills and conditioning, but also their honor, by how they treated each other, their lord, their servants, and those they sought to protect. Kaizen is a wonderful code, and one I have tried to live by my entire life, even before I'd discovered it. Please don't assume that I misunderstand the teachings of your movement just because I don't go to the meetings.

Why pick on "craftsmanship", anyway? If I want to take pride in what I do, what difference does it make? This is me paraphrasing much of the dissent, and my response boils down to two basic thoughts:

  1. If you think your movement is "just about yourself", why invent a label to differentiate yourself from the rest?
  2. If you invent a label, it becomes almost automatic to draw a line between "us" and "them", and that in and of itself almost automatically leads to "us vs. them" behavior and mentality.
Look, I view this whole thing as kind of like religion: whatever you want to do behind closed doors, that's your business. But when you start waving it in other peoples' faces, then I have a problem with it. You want to spend time on the weekends improving your skills, go for it. You want to spend time at night learning a bunch of programming languages so you can improve your code and your ability to design systems, go for it. You want to study psychology and philosophy so you can understand other people better when it comes time to interact with them, go for it. And hey, you want to put some code up somewhere so people can point to it and help you get it better, go for it. But when you start waving all that time and dedication in my face, you're either doing it because you want recognition, or you want to suggest that I'm somehow not as good as you. Live the virtuous life, don't brag about it.

There were some specific blogs and comments that I think deserve discussion, too:

Dave Thomas was kind enough to comment on my blog:

I remember the farmer comment :) I think I said 30%, but I stand by what I said. And it isn't really an elitist stance. Instead, I feel that programming is hard work. At the end of a day of coding, I'm tired. And so I believe that if you are asking someone to do programming, then it is in both your and their interest that they are doing something they enjoy. Because if they don't enjoy it, then they are truly just a laborer, working hard at something that has no meaning to them. And as you spend 8 hours a day, 5 days a week doing it, that seems like an awful waste of an intelligent person's life.
Sure, programming is hard. So is house painting. They're different kinds of exhaustion, but it's exhaustion all the same. But, frankly, if somebody has chosen to take up a job that they do just because it's a job, that's their choice, and not ours to criticize, in my opinion. (And I remember it as 50%, because I very clearly remember saying the "way to insult half the room" crack after it, but maybe I misheard you. I do know others also heard it at 50%, because an attendee or two came up to talk about it after the panel. At least, that's how I remember it at the time. But the number itself is kinda meaningless, now that I think about it.)
The farming quote was a deliberate attempt at being shocking to make a point. But I still think it is valid. I'd guess that 30% of the developers I meet are not happy in their work. And I think those folks would be happier and more fulfilled doing something else that gave them more satisfaction.
Again, you and I are both in agreement, that people should be doing what they love, but that's a personal judgment that each person is permitted to make for themselves. There are aspects of our lives that we don't love, but we do because they make other people happy (Juliet and Charlotte driving the boys around to their various activities comes to mind, for example), and it is not our position to judge how others choose for themselves, IMHO.
No one should have to be a laborer.
And here, you and I will disagree quite fundamentally: as I believe it was Martin Luther King, Jr., who said, "If you are going to be a janitor, be the best janitor you know how to be." It seems by that statement that you are saying that people who labor with their bodies rather than their minds (and trust me, you may not be a laborer anymore, big publishing magnate that you are, but I know I sure still am) are somehow less well-off than those who have other people working for them. Some people don't want the responsibility of being the boss, or the owner. See the story of the Mexican fisherman at the end of this blog.

Nate commented:

You have a logical fallacy by lumping together the people that derided Heather's code and people that are involved in software craftmanship. It's actually a huge leap of logic to make that connection, and it really retracts from the article.
As I point out later, the people who derided Heather's code were some of the same folks who hold up software craftsmanship. That wasn't me making that up.

Now you realise that you are planting your flag firmly in the 'craftmanship' camp while propelling your position upwards by drawing a line in the sand to define another group of people as 'labourers'. Or in other words attempt to elevate yourself by patronising others with the position you think you are paying them a compliment. Maybe you do not realise this?
No, I realize it, and it's a fair critique, which is why I don't label myself as a "craftsman". I have more to say on this below.
However, have you considered that the craft is not how awesome and perfect you and your code are, but what is applicable for the task at hand. I think most people who you would put into either camp share the same mix of attributes whether good or bad. The important thing is if the solution created does what it is designed to do, is delivered on time for when it is needed and if the environment that the solution has been created for warrants it, that the code is easily understandable by yourself and others (that matter) so it can be developed further over time and maintained.
And the very people who call themselves "craftsmen" criticized a piece of code that, as near as I can tell, met all of those criteria. Hence my reaction that started this whole thing.
I don't wish to judge you, and maybe you are a great, smart guy who does good in the world, but like you I have not researched anything about you, I have simply read your assessment above and come to a conclusion, that's being human I guess.
Oh, people judge each other all the time, and it's high time we stopped beating them up for it. It's human to judge. And while it would be politically correct to say, "You shouldn't judge me before you know me", fact is, of course you're going to do exactly that, because you don't have time to get to know me. And the fact that you don't know me except through the blog is totally acceptable--you shouldn't have to research me in order to have an opinion. So we're all square on that point. (As to whether I'm a great smart guy who does good in the world, well, that's for others to judge, not for me to say.)
The above just sounds like more of the same 'elitism' that has been ripe in this world from playground to the workplace since the beginning.
It does, doesn't it? And hopefully I clarify the position more clearly later.

In It's OK to love your job, Chad McCallum says that

The basic premise (or at least the one the author start out with) is that because there’s a self-declared group of “software craftspeople”, there is going to be an egotistical divide between those who “get it” and those who don’t.
Like it or not, Chad, that egotistical divide is there. You can "call bullshit" all day long, but look at the reactions that have popped up over this--people feel that divide, and frankly, it's one that's been there for a long, long time. This isn't just me making this up.

Chad also says,

It’s true the feedback that Heather got was unnecessarily negative. And that it came from people who are probably considered “software craftspeople”. That said, correlation doesn’t equal causation. I’m guessing the negative feedback was more because those original offenders had a bad day and needed to vent. And maybe the comments after that one just jumped on the bandwagon because someone with lots of followers and/or respect said it.

These are both things that can and have happened to anyone, regardless of the industry they work in. It’s extremely unfair to associate “someone who’s passionate about software development” to “person who’s waiting to jump on you for your mistakes”.

Unfortunately, Chad, "others do it, too" is not an acceptable excuse. If everybody jumped off a cliff, would you do it, too? I understand the rationale--it's extremely hard being the one to go against the herd (I've got psychological studies I can cite at you that prove it)--but that doesn't make it OK or excuse it. Saying "it happens in other industries" is just an extension of that. In other industries, women are still explicitly discriminated against--does that make it OK for us to do that, too?

Chad closes his blog with "Stop calling us egotistical jerks just because we love what we do." To which I respond, "I am happy to do so, as soon as those 'craftsmen' who are acting like one, stop acting like one." If you're not acting like one, then there should be no argument here. If you're trying to tell me that your label is somehow immune to criticism, then I think we just have to agree to disagree.

Paul Pagel (on a site devoted to software craftsmanship, no less) responded as well with his Humble Pursuit of Mastery. He opens with:

I have been reading on blogs and tweets the sentiment that "software craftsmanship is elitism". This perception is formed around comments of code, process, or techniques. I understand a craftsman's earned sense of pride in their work can sometimes be inappropriately communicated.
I don't think I commented on code, process, or technique, so I can't be sure if this is directly refuting what I'm saying, but I note that Paul has already touched on the meme he wants to communicate in his last phrase: the craftsman's "earned sense of pride". I have no problem with the work being something that you take pride in; I note, however, that "pride goeth before a fall", and that, again, Ozymandias was justifiably proud of his accomplishments, too.

Paul then goes through a summation of his career, making sure to set in small caps certain terms with which I have no argument: "sacrifice", "listen", "practicing", "critique" and "teaching". And, in all honesty, these are things that I embrace as well. But I start getting a little dubious about the sanctity of your terminology, Paul, when it's being used pretty blatantly as an advertising slogan and theme all over the site--if you want the term to remain a Zen-like pursuit, then you need to keep the commercialism out of it, in my opinion, or you invite the kind of criticism that's coming here (explicit or implicit).

Paul's conclusion wraps up with:

Do sacrificing, listening, practice, critiquing, and teaching sound like elitist qualities to you? Software craftsmanship starts out as a humble endeavor moving towards mastery. I won't let 140 or 1000 characters redefine the hours and years spent working hard to become a craftsman. It gave me humility and the confidence to be a professional software developer. Sometimes I let confidence get the better of me, but I know when that happens I am not honoring the spirit of craftsmanship which I was trained.
Humility enough to trademark your phrase "Software is our craft"? Humility enough to call yourself a "driving force" behind software craftsmanship? Don't get me wrong, Paul, there is a certain amount of commercialism that any consultant must adopt in order to survive--but either please don't mix your life-guiding principles with your commercialism, or else don't be surprised when others take aim at your "humility" when you do. It's the same when ministers stand in a multi-million dollar building on a Sunday morning and talk about the parable of the widow giving away her last two coppers--that smacks of hypocrisy.

Finally, Matt van Horn wrote in Crafsmanship, a rebuttal that:

there is an allusion to software craftsmen as being an exclusive group who agre on the “right” tools and techniques. This could not be further from the truth. Anyone who is serious about their craft knows that for every job there are some tools that are better and some that are worse.
... but then he goes right into making that exact mistake:
Now, I may not have a good definition of elegant code, but I definitely know it when I see it – regardless of who wrote it. If you can’t see that
(1..10).each{|i| puts i}

is more elegant than
x = 0
while true do
  x = x + 1
  if x > 10
    break
  end
  puts x
end
then you must near the beginning of your journey towards mastery. Practicing your craft develops your ability to recognize these differences, just as a skilled tailor can more easily spot the difference between a bespoke suit and something from Men’s Wearhouse.
Matt, you kind of make my point for me. What makes it elegant? You take it as self-evident. I don't. As a matter of fact, I've been asking this question for some years now: "What makes code 'elegant', as opposed to 'ugly'?" Ironically, Elliott Rusty Harold just blogged about how this style of coding is dangerous in Java, and got crucified for it, but he has a point: the functional style (your first example) doesn't JIT as well as the more imperative style right now on the JVM (or on the CLR, from what I can tell). Are you assuming that this will be running on a native Ruby implementation, on JRuby, IronRuby, ...? You have judged the code in the second example based on an intrinsic value system that you may have never questioned. To judge, you have to be able to explain your judgments in terms of that value system. And the fact that you judge without any context kind of speaks directly to the point I was trying to make: "craftsmen", it seems, have this tendency to judge in the absence of context, because they are clearly "further down their journey towards mastery", to use your own metaphor.

Or, to put it much more succinctly, "Beauty is in the eye of the beholder".

Matt then tells me I missed the point of the samurai and tea master story:

Finally, he closes with a famous zen story, but he entirely misses the point of it. The story concerns a tea master, and a samurai, who get into a duel. The tea master prevails by bringing the same concentration to the duel that he brings to his tea ceremony. The point that Ted seems to miss here is that the tea master is a craftsman of the highest order. A master of cha-do (the way of tea) is able to transform the simple act of making and pouring a cup of tea into something transcendant by bringing to this simple act a clear mind, a good attitude, and years of patient, humble practice. Arguably he prevails because he has perfected his craft to a higher degree than the samurai has perfected his own. That is why he has earned the right to wear the garb of a samurai, and why he is able to face down his opponent.
Which, again, I find funny, because most Zen masters will tell you that the story--any Zen story, in fact--has no "definitive" meaning, but has meaning based on how you interpret it. (There are a few Zen parables that reinforce this point, but it gets a little meta to justify my understanding of a Zen story by quoting another Zen story.) How Matt chooses to interpret that parable is, of course, up to him. I choose to interpret the story thusly: the insulted samurai felt that his "earned sense of pride" in his sword mastery was affronted because the tea master wore robes of a rank and honor that he had not earned. And clearly, the tea master was no swordsman. But what the tea master learned from his peer was not how to use concentration and discipline to improve his own swordsmanship, but how to demonstrate that he had, in fact, earned his mark of mastery through an entirely different discipline than the insulted samurai's. The tea master still has no mastery of the sword, but in his own domain, he is an expert. This was all the insulted samurai needed to see: the badge of honor had been earned, not just imposed by a capricious (and disrespectful) lord. Put a paintbrush and canvas into the hands of a house painter, and you get pretty much a mess--but put a spray painter in the hands of Leonardo, and you still get a mess. In fact, to really do the parable justice, we should see how much "craft" Matt can bring when asked to paint a house, because that's about how much relevance swordsmanship and house painting have in relationship to one another. (All analogies fail eventually, by the way, and we're probably reaching the boundaries of this one.)

Billy Hollis is a master with VB, far more than I ever will be; I know C++ far better than he ever will. I respect his abilities, and he, mine. There is no argument here. But more importantly, there are friends I've worked with in the past who are masters with neither VB nor C++, nor any other programming language, but chose instead to sink their time and energy into skiing, pottery, or being a fan of a television show. They chose to put their energies--energies the "craftsmen" seem to say should be put towards their programming--towards things that bring them joy, which happen to not be programming.

Which brings me to another refrain that came up over and over again: You criticize the craftsman, but then you draw a distinction between "craftsman" and "laborer". You're confusing (or confused). First of all, I think it important to disambiguate along two axes: one, whether someone chooses to invest their time into learning to write better software or looks at writing code as "just" a job; and two, the degree to which they have mastered programming. By your own definitions, "craftsmen": can one be early in one's mastery of programming and still be a "craftsman"? Can one be a master bowler who's just picked up programming and be considered a "craftsman"? Is the nature of "craftsmanship" a measure of your skill, or is it your dedication to programming, or is it your dedication to something in your life, period? (Remember, the tea master parable says that a master C++ developer will see the master bowler and respect his mastery of bowling, even though he can't code worth a crap. Would you call him a "craftsman"?)

Frankly, I will say, for the record, that I think there are people programming who don't want to put a ton of time and energy into learning how to be better programmers. (I suspect that most of them won't ever read this blog, either.) They see the job as "just a job", and are willing to be taught how to do things, but aren't willing to go off and learn how to do them on their own. They want to do the best job they can, because they, like any human being, want to bring value to the world, but don't have that passion for programming. They want to come in at 9, do their job, and go home at 5. These are those whom I call "laborers". They are the "fisherman" in the following story:

The businessman was at the pier of a small coastal Mexican village when a small boat with just one fisherman docked. Inside the small boat were several large yellowfin tuna. The businessman complimented the Mexican on the quality of his fish and asked how long it took to catch them. The Mexican replied only a little while.

The businessman then asked why he didn't stay out longer and catch more fish? The Mexican said he had enough to support his family's immediate needs. The businessman then asked, but what do you do with the rest of your time? The Mexican fisherman said, "I sleep late, fish a little, play with my children, take a siesta with my wife, Maria, stroll into the village each evening where I sip wine and play guitar with my amigos; I have a full and busy life, señor."

The businessman scoffed, "I am a Harvard MBA and I could help you. You should spend more time fishing and with the proceeds buy a bigger boat. With the proceeds from the bigger boat you could buy several boats; eventually you would have a fleet of fishing boats. Instead of selling your catch to a middleman, you would sell directly to the processor and eventually open your own cannery. You would control the product, processing and distribution. You would need to leave this small coastal fishing village and move to Mexico City, then LA and eventually New York City where you would run your expanding enterprise."

The Mexican fisherman asked, "But señor, how long will this all take?" To which the businessman replied, "15-20 years." "But what then, señor?" The businessman laughed and said, "That's the best part! When the time is right you would announce an IPO and sell your company stock to the public and become very rich. You would make millions." "Millions, señor? Then what?" The businessman said, "Then you would retire. Move to a small coastal fishing village where you would sleep late, fish a little, play with your kids, take a siesta with your wife, stroll to the village in the evenings where you could sip wine and play your guitar with your amigos."

What makes all of this (this particular subject, craftsmanship) particularly hard for me is that I like the message that craftsmanship brings, in terms of how you conduct yourself. I love the book Apprenticeship Patterns, for example, and think that anyone, novice or master, should read this book. I have taken on speaking apprentices in the past, and will continue to do so well into the future. The message that underlies the meme of craftsmanship--the constant striving to improve--is a good one, and I don't want to throw the baby out with the bathwater. If you have adopted "craftsmanship" as a core value of yours, then please, by all means, continue to practice it! Myself, I choose to do so, as well. I have mentored programmers, I have taken speaking apprentices, and I strive to learn more about my craft by branching my studies out well beyond software--I am reading books on management, psychology, building architecture, and business, because I think there is more to software than just the choice of programming language or style.

But be aware that if you start telling people how you're living your life, there is an implicit criticism or expectation that they should be doing that, as well. And when you start criticizing other peoples' code as being "inelegant" or "unbeautiful" or "unclean", you'd better be able to explain your value system and why you judged it so. Humility is a hard, hard path to tread, and one whose outlines I have only recently started to see; I am guilty of just about every sin imaginable when it comes to this subject. I have created "elegant" systems that failed their original intent. I have criticized "ugly" code that, in fact, served the purpose well. I have bragged of my own accomplishments to those who accomplished a lot more than I did, or ever will. And I consider it amazing that my friends who've been with me since long before I started to eat my justly-deserved humble pie are still with me. (And those friends are some amazing people in their own right; if a man is judged by the company he keeps, then by looking around at my friends, I am judged to be a king.) I will continue to strive to be better than I am now, though, even within this discussion right now: those of you who took issue with my post, you have good points, all of you, and I certainly don't want to stop you from continuing on your journeys of self-discovery, either.

And if we ever cross paths in person, I will buy you a beer so that we can sit down, and we can continue this discussion in person.


.NET | C# | C++ | Conferences | Development Processes | F# | Industry | Java/J2EE | Languages | Objective-C | Parrot | Personal | Reading | Review | Ruby | Scala | Social | Windows

Friday, January 25, 2013 10:24:27 PM (Pacific Standard Time, UTC-08:00)
Comments [7]  | 
 Wednesday, January 23, 2013
On the Dark Side of "Craftsmanship"

I don't know Heather Arthur from Eve. Never met her, never read an article by her, seen a video she's in or shot, or seen her code. Matter of fact, I don't even know that she is a "she"--I'm just guessing from the name.

But apparently she got quite an ugly reaction from a few folks when she open-sourced some code:

So I went to see what people were saying about this project. I searched Twitter and several tweets came up. One of them, I guess the original one, was basically like “hey, this is cool”, but then the rest went like this:
"I cannot even make this stuff up." --@steveklabnik
"Ever wanted to make sed or grep worse?" --@zeeg
"@steveklabnik or just point to the actual code file. eyes bleeding!" --@coreyhaines
At this point, all I know is that by creating this project I’ve done something very wrong. It seemed like I’d done something fundamentally wrong, so stupid that it flabbergasts someone. So wrong that it doesn’t even need to be explained. And my code is so bad it makes people’s eyes bleed. So of course I start sobbing.
Now, to be fair, Corey later apologized. But I'm still going to criticize the response. Not because Heather's a "she" and we should be more supportive of women in IT. Not because somebody took something they found interesting and put it up on github for anyone to take a look at and use if they found it useful. Not even because it's good code that they called bad, or vice versa. (To be honest, I haven't even looked at the code--that's how immaterial it is to my point.)

I'm criticizing because this is what "software craftsmanship" gets us: an imposed segregation of those who "get it" from those who "don't" based on somebody's arbitrary criteria of what we should or shouldn't be doing. And if somebody doesn't use the "right" tools or code it in the "right" way, then bam! You clearly aren't a "craftsman" (or "craftswoman"?) and you clearly don't care about your craft and you clearly aren't worth the time or energy necessary to support and nourish and grow and....

Frankly, I've not been a fan of this movement since its inception. Dave Thomas (Ruby Dave) was on a software panel with me at a No Fluff Just Stuff show about five years ago when we got on to this subject, and Dave said, point blank, "About half of the programmers in the world should just go take up farming." He paused, and in the moment that followed, I said, "Wow, Dave, way to insult half the room." He immediately pointed out that the people in the room were part of the first half, since they were at a conference, but it just sort of underscored to me how high-handed and high-minded that kind of talk and position can be.

Not all of us writing code have to be artists. Frankly, in the world of painting, there are those who will spend hours and days and months, tiny brushes in hand, jars of pigment just one lumen different from one another, laboring over the finest details, creating just one piece... and then there are those who paint houses with paint-sprayers, out of cans of mass-produced "Cream Beige" found at your local Lowe's. And you know what? We need both of them.

I will now coin a term that I consider to be the opposite of "software craftsman": the "software laborer". In my younger days, believing myself to be one of those "craftsmen", a developer who knew C++ in and out, who understood memory management and pointers, who could create elegant and useful solutions in templates and classes and inheritance, I turned up my nose at those "laborers" who cranked out one crappy app after another in (what else?) Visual Basic. My app was tight, lean, and well-tuned; their apps were sloppy, bloated, and ugly. My app was a paragon of reused code; their apps were cut-and-paste cobbled-together duct-tape wonders. My app was a shining beacon on a hill for all the world to admire; their apps were mindless drones, slogging through the mud.... Yeah, OK, so you get the idea.

But the funny thing was, those "laborers" were going home at 5 every day. Me, I was staying sometimes until 9pm, wallowing in the wonderment of my code. And, I have to wonder, how much of that was actually not the wonderment of my code, but the wonderment of "me" over the wonderment of "code".

Speaking of which, by the way, there appear to be the makings of another such false segregation, in the area of "functional programming". In defense of Elliott Rusty Harold's post the other day (which I criticized, and I stand behind that criticism for the reasons I cited there), there are a lot of programmers falling into the trap of thinking that "all the cool kids are using functional programming, so if I want to be a cool kid, I have to use functional programming too, even though I'm not sure what I'm doing....". Not all the cool kids are using FP. Some aren't even using OOP. Some are just happily humming along using good ol' fashioned C. And producing some really quality stuff doing so.

See, I have to wonder just how much of the software "craftsmanship" being touted isn't really a narcissistic "Look at me, world! Look at how much better I am because I care about what I do! Look upon my works, ye mighty, and despair!" kind of mentality. Too much of software "craftsmanship" seems to be about the "me" part of "my code". And when I think about why that is, I come to an interesting assertion: that if we take the name away from the code, and just look at the code, we can't really tell what's "elegant" code, what's "hack" code, and what was "elegant hack because there were all these other surrounding constraints outside the code". Without the context, we can't tell.

A few years after my high point as a C++ "craftsman", I was asked to do a short, one-week programming gig/assignment, and the more I looked at it, the more it screamed "VB" at me. And I discovered that what would've taken me probably a month to do in C++ was easily accomplished in a few days in VB. I remember looking at the code, and feeling this sickening, sinking sense of despair at how stupid I must've looked, crowing. VB isn't a bad language--and neither is C++. Or Java. Or C#. Or Groovy, or Scala, or Python, or, heck, just about any language you choose to name. (Except Perl. I refuse to cave on that point. Mostly for comedic effect.)

But more importantly, take somebody who comes in at 9, does what they're told, leaves at 5, and never gives a rat's ass about programming except for what they need to know to get their job done: I have respect for them. Yes, some people will want to hold themselves up as "painters", and others will just show up at your house at 8 in the morning with drop cloths. Both have their place in the world. Neither should be denigrated for their choices about how they live their lives or manage their careers. (Yes, there's a question of professional ethics--I want the house painters to make sure they do a good job, too, but quality can come just as easily from the nozzle of a spray painter as it does from the tip of a paintbrush.)

I end this with one of my favorite parables from Japanese lore:

Several centuries ago, a tea master worked in the service of Lord Yamanouchi. No-one else performed the way of the tea to such perfection. The timing and the grace of his every move, from the unfurling of the mat, to the setting out of the cups, and the sifting of the green leaves, was beauty itself. His master was so pleased with his servant that he bestowed upon him the rank and robes of a Samurai warrior.

When Lord Yamanouchi travelled, he always took his tea master with him, so that others could appreciate the perfection of his art. On one occasion, he went on business to the great city of Edo, which we now know as Tokyo.

When evening fell, the tea master and his friends set out to explore the pleasure district, known as the floating world. As they turned the corner of a wooden pavement, they found themselves face to face with two Samurai warriors.

The tea master bowed, and politely stepped into the gutter to let the fearsome ones pass. But although one warrior went by, the other remained rooted to the spot. He stroked a long black whisker that decorated his face, gnarled by the sun, and scarred by the sword. His eyes pierced through the tea master’s heart like an arrow.

He did not quite know what to make of this fellow who dressed like a fellow Samurai, yet who would willingly step aside into a gutter. What kind of warrior was this? He looked him up and down. Where were the broad shoulders and the thick neck of a man of force and muscle? Instinct told him that this was no soldier. He was an impostor who by ignorance or impudence had donned the uniform of a Samurai. He snarled: “Tell me, oh strange one, where are you from and what is your rank?”

The tea master bowed once more. “It is my honour to serve Lord Yamanouchi and I am his master of the way of the tea.”

“A tea-sprout who dares to wear the robes of Samurai?” exclaimed the rough warrior.

The tea master’s lip trembled. He pressed his hands together and said: “My lord has honoured me with the rank of a Samurai and he requires me to wear these robes.”

The warrior stamped the ground like a raging bull and exclaimed: “He who wears the robes of a Samurai must fight like a Samurai. I challenge you to a duel. If you die with dignity, you will bring honour to your ancestors. And if you die like a dog, at least you will no longer insult the rank of the Samurai!”

By now, the hairs on the tea master’s neck were standing on end like the feet of a helpless centipede that has been turned upside down. He imagined he could feel that edge of the Samurai blade against his skin. He thought that his last second on earth had come.

But the corner of the street was no place for a duel with honour. Death is a serious matter, and everything has to be arranged just so. The Samurai’s friend spoke to the tea master’s friends, and gave them the time and the place for the mortal contest.

When the fierce warriors had departed, the tea master’s friends fanned his face and treated his faint nerves with smelling salts. They steadied him as they took him into a nearby place of rest and refreshment. There they assured him that there was no need to fear for his life. Each one of them would give freely of money from his own purse, and they would collect a handsome enough sum to buy the warrior off and make him forget his desire to fight a duel. And if by chance the warrior was not satisfied with the bribe, then surely Lord Yamanouchi would give generously to save his much prized master of the way of the tea.

But these generous words brought no cheer to the tea master. He thought of his family, and his ancestors, and of Lord Yamanouchi himself, and he knew that he must not bring them any reason to be ashamed of him.

“No,” he said with a firmness that surprised his friends. “I have one day and one night to learn how to die with honour, and I will do so.”

And so speaking, he got up and returned alone to the court of Lord Yamanouchi. There he found his equal in rank, the master of fencing, who was skilled as no other in the art of fighting with a sword.

“Master,” he said, when he had explained his tale, “Teach me to die like a Samurai.”

But the master of fencing was a wise man, and he had a great respect for the master of the Tea ceremony. And so he said: “I will teach you all you require, but first, I ask that you perform the way of the Tea for me one last time.”

The tea master could not refuse this request. As he performed the ceremony, all trace of fear seemed to leave his face. He was serenely concentrated on the simple but beautiful cups and pots, and the delicate aroma of the leaves. There was no room in his mind for anxiety. His thoughts were focused on the ritual.

When the ceremony was complete, the fencing master slapped his thigh and exclaimed with pleasure: “There you have it. No need to learn anything of the way of death. Your state of mind when you perform the tea ceremony is all that is required. When you see your challenger tomorrow, imagine that you are about to serve tea for him. Salute him courteously, express regret that you could not meet him sooner, take off your coat and fold it as you did just now. Wrap your head in a silken scarf, and do it with the same serenity as you dress for the tea ritual. Draw your sword, and hold it high above your head. Then close your eyes and ready yourself for combat.”

And that is exactly what the tea master did when, the following morning, at the crack of dawn, he met his opponent. The Samurai warrior had been expecting a quivering wreck and he was amazed by the tea master’s presence of mind as he prepared himself for combat. The Samurai’s eyes were opened and he saw a different man altogether. He thought he must have fallen victim to some kind of trick or deception, and now it was he who feared for his life. The warrior bowed, asked to be excused for his rude behaviour, and left the place of combat with as much speed and dignity as he could muster.

(excerpted from http://storynory.com/2011/03/27/the-samurai-and-the-tea-master/)

My name is Ted Neward. And I bow with respect to the "software laborers" of the world, who churn out quality code without concern for "craftsmanship", because their lives are more than just their code.


.NET | Android | C# | C++ | Conferences | Development Processes | F# | Industry | Java/J2EE | Languages | LLVM | Objective-C | Parrot | Personal | Reading | Ruby | Scala | Social | Visual Basic | Windows

Wednesday, January 23, 2013 9:06:24 PM (Pacific Standard Time, UTC-08:00)
Comments [14]  | 
 Monday, January 21, 2013
On Functional Programming in Java

Elliott Rusty Harold is blogging that functional programming in Java is dangerous. He's wrong, and he's way late to the party on this one--it's coming to Java whether he likes it or not.

Go read his post first, while I try to sum up the reasons he cites for saying it's dangerous:

  1. Java is not a lazy-evaluated language. Programmers in Java will screw up and create heap and stack errors as a result.
  2. See? Here's a naive implementation of Clojure code taken directly over to Java and look how it blows up.
  3. Programmers can do bad things with this idea, so therefore we should avoid it.
  4. Oh, and by the way, it's "dangerously inefficient" in Java/JVM, even though I offer no perf benchmarks or comparisons to back this statement, and I'm somehow ignoring that Clojure and Scala run on the JVM as well, apparently without problem.
That about sums it up, I think.

Look, as Elliott points out, Java is not Haskell. Neither is it Lisp. It's its own language, rooted in imperative and object-oriented history, but no less able to incorporate functional features into its development than Lisp could incorporate object-oriented features. However, if you do stupid things, like trying to re-create, in Java, the kind of infinite (implicitly lazily-evaluated) list Clojure gives you, by building an actualized list that stretches to infinity... you're going to blow the JVM up. Duh. Not even the supercomputer on the USS Enterprise five hundred years from now will be able to construct that list.

Porting code from one language to another is not a trivial exercise. If you attempt to port line-for-line and expression-for-expression, you can expect that your ported code will not be idiomatically correct. (I know this already, having done the exercise myself.) The root of the problem in his ported code is twofold. One, he (rather foolishly, and in elegant strawman fashion) badly simulates what an infinite list would look like in Java--a commenter does a better job by showing how an Iterator can be made to perform the same thing that Haskell actually does under the hood, producing the next value on demand rather than trying to create a list of Integers stretching to infinity. For someone who professes to have some Haskell experience and love, it surprises me that Elliott makes this kind of mistake, which leads me to conclude that he's trying to create the strawman. Two, he assumes that anyone who programs in Java functionally will have to create all of their functional tools by hand, when frankly, using Guava or FJ here would make this code sample a LOT easier to swallow. The fact that he ignores both of these in his strawman again sort of reinforces the idea that he's deliberately crippling his strawman to make his point.
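To make the commenter's point concrete, here's a minimal sketch of that Iterator approach--the Naturals class and its names are mine, purely for illustration, not anything from Elliott's post or the comments. The sequence is "infinite" only in the sense that each value is produced on demand; nothing is ever materialized into an actual list:

    import java.util.Iterator;

    // A lazy, effectively infinite sequence of integers: each value is
    // produced only when asked for, so no list is ever materialized.
    public class Naturals implements Iterator<Integer>, Iterable<Integer> {
        private int current = 1;

        public boolean hasNext() { return true; }   // never exhausted
        public Integer next() { return current++; } // produced on demand
        public void remove() { throw new UnsupportedOperationException(); }
        public Iterator<Integer> iterator() { return this; }

        public static void main(String[] args) {
            int count = 0;
            for (int n : new Naturals()) { // pulls one value at a time
                System.out.println(n);
                if (++count == 10) break;  // consume only what we need
            }
        }
    }

The for loop just pulls ten values and walks away, which is the same on-demand evaluation that Clojure's lazy sequences (and Haskell's lists) give you for free.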

His underlying point, though, seems to be simple: "I work with bad programmers, who don't seem to understand how to write code functionally in Java without screwing it up." Dude. Sucks to be you. "Bad programmers will move heaven and earth to do the wrong thing." --Glenn Vanderburg.

What really sucks for him is that these features are coming in Java 8, including lambda expressions and library support (including a Stream interface) that will allow for exactly this kind of code to be written without pain. Those programmers Elliott is working with are going to be even more on fire to use their functional approaches (and all the associated goodness of doing so, including composability and what-not) in their Java code. What might make Elliott happier is that at least they won't have written it; it'll all be written by guys much smarter than any of them.
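For the curious, here's roughly what that will look like--a sketch against the lambda syntax and Stream interface as currently proposed for Java 8, so treat the details as illustrative rather than final:

    import java.util.stream.Stream;

    public class LazyStreams {
        public static void main(String[] args) {
            // An infinite, lazily-evaluated stream: safe, because limit()
            // bounds how much of it ever actually gets computed.
            Stream.iterate(1, n -> n + 1)  // 1, 2, 3, ...
                  .filter(n -> n % 2 == 0) // composable, side-effect-free steps
                  .limit(10)
                  .forEach(System.out::println);
        }
    }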


Android | Java/J2EE | Languages | Personal | Review | Scala

Monday, January 21, 2013 2:10:14 PM (Pacific Standard Time, UTC-08:00)
Comments [1]  | 
 Monday, January 07, 2013
Modifications to the Tech 2013 Predictions

I just added this to my Tech 2013 Predictions; I think they're important enough to re-post here as new content too:

  • Hardware is the new platform. A buddy of mine (Scott Davis) pointed out on a mailing list we share that "hardware is the new platform", and with Microsoft's Surface out now, there are three major players (Apple, Google, Microsoft) in this game. It's becoming apparent that more and more companies are starting to see opportunities in going the Apple route of owning not just the OS and the store, but the hardware underneath it. More companies are going to start playing this game, too, I think, and we're going to see Amazon take some shots here, and probably a few others. Of course, already announced is the Ubuntu Phone, and a new Android-like player, Tizen, but I'm not thinking about new players--there are always new players--but about some of the big standouts. And look for companies like Dell and HP to start looking for ways to play in this game, too, either through partnerships or acquisitions. (Hello, Oracle, I'm looking at you.... And Adobe, too.)
  • APIs for lots of things are going to come out. Ford just did this. This is not going away--this is going to proliferate. And the startup community is going to lap it up like kittens attacking a bowl of cream. If you're looking for a play in the startup world, pursue this.




Monday, January 07, 2013 7:04:13 PM (Pacific Standard Time, UTC-08:00)
Comments [0]  | 
Thoughts on a CodeMash Gone By

A year ago today (roughly), I gave the opening keynote at CodeMash 2.0.1.2. For those of you who were there, I don't think I need to tell you what happened. For those of you who weren't there, you probably still heard about it, thanks to the Twitterstream of comments and counter-comments that followed. I've more or less tried to keep quiet about it since that time, trying to just let the furor die down (and it did, pretty quickly, I thought) out of respect to the conference organizers.

But with the show starting up again this week, and there having been a few people over the last twelve months who've asked me about "what the f*ck were you thinking" (whether that was in deliberate pun/jest or not, I can't always tell), and most importantly, now that I know that Jim and I are square with each other (thanks to a Twitter conversation a few days ago), I figure it's time to come clean and tell my side of the story.

TL;DR: If I had the chance to do the keynote over again, I'd do it differently.

(By the way, the rest of this post does have a few profanities in it, so if you're offended by that sort of thing, this is a good place to stop reading. Or, as the movies would say, this post is rated PG-13 for adult language.)

As a speaker, I have always sought to create a "persona" on stage that allowed me the maximum freedom of expression and opportunity to get my point across. A long time ago, when I started teaching at DevelopMentor, I learned from some of the best--one of those best being, of course, Don Box, but another of those was Ted Pattison. It was he who taught me that "If you can make 'em laugh, you can do whatever you want to them" (meaning the audience). He demonstrated this quite graphically by guest-lecturing in one of my classes once, early in my tenure as a DM instructor, and promptly castrated one of the students who was constantly irritating the class (and me) with off-topic questions. It was an eye-opening experience. Later, Don mentioned in passing that what we did was "equal parts education and entertainment". Education because, yes, it's what we do, but entertainment, too, because if the room falls asleep, then they're not getting educated.

And folks, I've sat in those chairs, I know how boring talks can be sometimes. And that sometimes, despite your best efforts, no matter how interesting the material, it can just be sooooo easy to pop open the laptop and do some email. Or write some code. Or even let the ambient warmth of the room in a post-lunch talk just... make... eyes... so heavy.... I get it, really.

So I decided, quite consciously, to develop a speaking persona that was a little on the edge, a little outrageous, a little "over the top", because then that persona gave me the freedom to do some of the crazy things that would keep the crowd awake and on its toes. I stand people up from the audience and use them in my demos. I write code on the fly based on their questions, and I try to use examples that allow for a certain amount of "Wow, that was weird, so I'll remember it better" in the demo itself. Case in point: when writing code to demonstrate delegates and events in C#, I would use the idea of a "Rock Band" and its fan club which, of course, must include groupies.

Is it politically correct to talk about groupies in a professional programming classroom setting? Probably not. Did anybody complain? Never heard one, directly or indirectly. Part of that, I believe, was because they got the point of the demo, and that was the point. Not that I was advocating groupie-ism, or that rock bands were more interesting than programming, but that the domain was easy enough for them to grasp, and that made the result (loose coupling between event generators and consumers, in the case of delegates and events) more easily understood.
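For flavor, here's a rough analogue of that demo. The original was C# delegates and events; this sketch is Java (to match the other code in these posts), with a hypothetical Band and its fans standing in for the event source and its consumers. The coupling story is the point: the band never knows who, or what, is listening.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.function.Consumer;

    public class Band {
        // The event source holds opaque callbacks; it knows nothing
        // about the fans (the event consumers). That's loose coupling.
        private final List<Consumer<String>> fans = new ArrayList<>();

        public void subscribe(Consumer<String> fan) { fans.add(fan); }

        public void play(String song) {
            for (Consumer<String> fan : fans) fan.accept(song); // "raise" the event
        }

        public static void main(String[] args) {
            Band band = new Band();
            band.subscribe(song -> System.out.println("Fan club hears " + song));
            band.subscribe(song -> System.out.println("Groupies swoon over " + song));
            band.play("Free Bird");
        }
    }

In the C# version, the list of callbacks and the play loop collapse into an event declaration and a raise, but the decoupling between generator and consumers is identical.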

Analogies, for me, are never gratuitous. I choose my analogies quite carefully, and try to be very clear about where and when they do break down, because all analogies break down eventually. Even my most famous analogy breaks down, as many people have pointed out: nobody has ever died from O/R-Ms. Yep. But your wife's eyes were never burning balls of superheated plasma billions of light years away, either.

Point is, I deliberately seek ways to keep you entertained. And you know what? Entertainment often comes, in this case, from making the room laugh, and humor most often derives from the unexpected. And what's more unexpected than a profanity dropped at the most unexpected moment?

You don't have to agree with that sentiment to realize that it's FUCKING true.

When I got up to speak at CodeMash, I wanted very badly for this to be the best damn keynote I'd ever done in my life up to that point. I wanted the room to rock. Buzzing. Yes, I wanted to succeed very, very badly. It was an early-morning keynote, first one of the show. People were still milling around, there was a lot of background noise. People were still eating breakfast and waking up. And when Keith Elder, just before he introduced me to the crowd, whispered (I'm paraphrasing here) "Put some energy into this crowd, would ya?", I said to myself, "Oh, yeah. I'm on it."

A little TOO on it, as it turns out. I went way overboard. Brian Prince counted 18 f-bombs that day. Others counted, as well; lowest total I heard was 13, highest was 23. Needless to say, it was a carpet bombing to rival anything we ever did to North Vietnam. Made Dresden look like a weenie roast. (There's probably a Hiroshima joke in there too somewhere, but you get the point.)

The interesting thing about profanity used like that, however, is that it loses its efficacy. Profanities have to be spaced out, chosen carefully, or they lose their impact. Which was, of course, exactly what happened. It's not going to have the 'unexpected' effect if it's coming every other minute or so. No matter how hard you try.

The result? Kind of predictable. Not my best results. For which I am most heartily sorry. I so wanted that keynote to go off so well, and it didn't, and I'm sorry.

For three hours after the keynote was over, as the Twitterstream was dissecting me for all that, I lay on the couch in my hotel room, bordering on tears. Seriously.

Had I the chance to do the keynote over again, you'd better damn well believe that I'd do it differently. Would I cut out all the profanity entirely? Nope. That's a part of my speaking persona, and anyone who brings me to a conference that doesn't know that probably didn't do their homework about me as a speaker beforehand. (It's not like there aren't ample opportunities to see me speaking in person, or videos of the same.) But somebody suggested not too long ago that maybe it wouldn't be a bad idea to warn people ahead of time, and yep, that's a great idea. Because (and for this, I am really even more sorry) sometimes kids are in the room, such as was the case for CodeMash, and they shouldn't have to hear it unless their parents are OK with it, and I didn't give their parents (or any attendees that felt the same way) an opportunity to "opt out" if they so chose.

I could, I suppose, hide behind the excuse that "We were all adults, we should be able to handle that kind of language", but in the case of the kids, that wasn't the case. Even then, in the case of the adults, you still should be given an opportunity to opt out.

More critically, if the message got lost because of the messenger's choice of words, then I failed as a speaker. And that, my friends, is where the real frustration for me lies--not with the words I used in and of themselves, but in that the message--that we as an industry have to break out of our 'box-arrow-box-arrow-cylinder' habits and modes of thinking--got lost for so many people. That is how I failed most of all, and it is on those grounds that I say, once again, I am sorry.

To you, Jim, and to the rest of the CodeMash staff, I am particularly sorry. CodeMash is your baby, and I gave it a black eye.

To the attendees of CodeMash 2.0.1.2, I am sorry if my language offended you and distracted you from the message I was trying to deliver. I hope that you were able to get past it and enjoy the rest of the show. I think a lot of you did--many came up to me afterwards, but it was such a small fraction of the total I don't want to assume anything.

Enjoy CodeMash 2.0.1.3. With any luck, I'll see you there next year: hopefully a little wiser, but still just as FUCKING outrageous as I have always been, only this time, with an up-front disclaimer.

Flame away.


.NET | C# | Conferences | F# | Industry | Personal | Review | Social | Windows

Monday, January 07, 2013 3:23:42 PM (Pacific Standard Time, UTC-08:00)
Comments [4]  | 
 Saturday, January 05, 2013
Review (in advance): F# Deep Dives

F# Deep Dives, by Tomas Petricek and Phillip Trelford, Manning Publications

As many readers of my writing will already know, I've been kind of "involved" with F# (and its cousin on the JVM, Scala) for a few years now, to the degree that I and a couple of really smart guys wrote a book on the subject. Now, assuming you're one of the .NET developers who've heard of F# and functional programming, and took a gander at the syntax, and maybe even bought a book on it (my publisher and I both thank you if you bought ours), but weren't quite sure what to do with it, a book has come along to help get you past that.

As of this writing, the early-access (what Manning calls their MEAP) version had only Chapter 3 ("Parsing text-based languages") and Chapter 11 ("Creating games using XNA"), but the other topics ("Integrating external data into the F# language", "Handling dirty data with machine learning" and "Functional programming in the cloud" are just three of the other chapters listed) are juicy and meaty, and both Tomas and Phillip are recognized names in the F# space. Neither is a stranger to the subject material or to writing, and the prose from the MEAP edition is pretty easy to read already, despite the fact that it's early-access material. In particular, the Markdown parser they implement in chapter 3 is a great example of a non-trivial language parser, which is not an easy task to approach but certainly a lot easier to do in a functional language. (For the record, I built a custom parser of my own for generating slides, and the blog entries that described the early implementations are here, and yes, I really should finish that series out, I know. I got more interested in extending the system, then realized I needed a full-fledged parser, and got distracted trying to integrate... surprise, surprise... Tomas' Markdown parser that he made available online.)

This book looks really promising, and I'm really hopeful Manning will send me a copy when it comes out, so I can level up my F# myself.


.NET | F# | Industry | Languages | Reading | Review | Windows | XNA

Saturday, January 05, 2013 2:10:05 AM (Pacific Standard Time, UTC-08:00)
Comments [0]  | 
Review: Metaprogramming in .NET

Metaprogramming in .NET, by Kevin Hazzard and Jason Bock, Manning Publications

TL;DR: This is a great book (not perfect), but not an easy read for everyone, not because the writing is bad, but because the subject is a whole new level of abstraction above what most developers deal with.

Full disclosure: Manning Publications is a publisher I've published with before, and Kevin and Jason are both friends of mine in the .NET community. I write a column for MSDN Magazine, and metaprogramming was one of the topics in one of the series I've written ("Metaparadigmatic Programming") for the column, so this subject is not unfamiliar to me.

Kevin and Jason have done a great job covering a pretty diverse subject, in my opinion. Because metaprogramming is "programming about programming", it's sometimes a hard concept for people who've never really investigated it to wrap their heads around, but Kevin and Jason do a great job opening with some concepts first, then exploring .NET Reflection, which is most developers' first introduction to metaprogramming. If you can understand how Reflection is programming against code and code metadata, then you're in a good place to start exploring metaprogramming in further depth.
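If reflection is brand-new to you, the core idea fits in a few lines. Here's a quick sketch--in Java rather than .NET, to match the other code in these posts; the book itself covers System.Reflection--of code inspecting and invoking other code purely through runtime metadata:

    import java.lang.reflect.Method;

    public class ReflectionDemo {
        public static void main(String[] args) throws Exception {
            // Discover a type and its methods from metadata at runtime...
            Class<?> cls = Class.forName("java.lang.String");
            for (Method m : cls.getDeclaredMethods()) {
                System.out.println(m.getName());
            }
            // ...then invoke one without any compile-time reference to it.
            Method toUpper = cls.getMethod("toUpperCase");
            System.out.println(toUpper.invoke("metaprogramming"));
        }
    }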

And explore it Kevin and Jason do. From code generation with T4, CodeDOM and Reflection.Emit to code-level Expressions to low-level IL munging, they take you through a lot of the metaprogramming tools. They've also tried to include some practical places where these techniques are useful, though I do wish the examples had been a bit "larger", meaning they were integrated into the larger picture of a "real-world" system, but that's hard to do sometimes, and most readers sufficiently senior to read this book should be able to see how to apply them to their own problems. I also wish they'd approached generics a bit more thoroughly, since that's another metaprogrammatic technique that often doesn't get much love from developers (most of whom seem to view generics as a necessary evil, not a huge opportunity for design power), but maybe that would've been too much head-exploding for one book. Writing a LINQ provider would've been a good enhancement to the book, but again, that may have been a little too much for one book. I also wish they had put an IL overview into its own chapter, since it comes up in several chapters at once and would've been good as a reference, but there are books out there on IL, which hasn't changed much since .NET 2.0 days, so readers finding IL challenging should pick up one of those if they're finding their heads spinning a little on the IL syntax.

Having taken you through those techniques, though, they then take a different tack and take you through scripting languages and the Microsoft Dynamic Language Runtime (DLR), as well as into a few "alternative" languages for the CLR, which is an entirely different way of approaching metaprogrammatic techniques. Nemerle, for example, is a language that supports macros defined within the language, a technique that generally is limited to Lisps. (I admit it, Nemerle is one of my favorite CLR languages, and should be something every .NET developer plays with for at least a weekend.) They also include the first published coverage that I'm aware of on Roslyn, the Compiler-as-a-Service project under way at Microsoft, so readers intrigued by how they might use the compiler as part of their development efforts in v.Next of Visual Studio should definitely have a look.

Overall, the writing style is crisp, clean, not too academic but not too folksy, and entirely representative of two men I've been privileged to meet, have interesting technical conversations with, and have over to my house for dinner. Both are extremely approachable, and their text reflects this. Every .NET developer that wants to claim "senior" or "guru" level status should read this book and experiment with one or more of these techniques; these are the things that the "cool kids" in the .NET world know how to do, and if you want to hang with the best, this is the book you'll read cover to cover.

(This review was posted to Amazon at the above link on 5 Jan 2013, then copy-and-pasted here because I like posting reviews to my blog as well as to Amazon.)


.NET | C# | F# | Languages | Reading | Review | Visual Basic | Windows

Saturday, January 05, 2013 1:54:53 AM (Pacific Standard Time, UTC-08:00)
Comments [0]  | 
 Tuesday, January 01, 2013
Tech Predictions, 2013

Once again, it's time for my annual prognostication and review of last year's efforts. For those of you who've been long-time readers, you know what this means, but for those two or three of you who haven't seen this before, let's set the rules: if I got a prediction right from last year, you take a drink, and if I didn't, you take a drink. (Best. Drinking game. EVAR!)

Let's begin....

Recap: 2012 Predictions

THEN: Lisps will be the languages to watch.

With Clojure leading the way, Lisps (that is, languages that are more or less loosely based on Common Lisp or one of its variants) are slowly clawing their way back into the limelight. Lisps are both functional languages as well as dynamic languages, which gives them a significant reason for interest. Clojure runs on top of the JVM, which makes it highly interoperable with other JVM languages/systems, and Clojure/CLR is the version of Clojure for the CLR platform, though there seems to be less interest in it in the .NET world (which is a mistake, if you ask me).

NOW: Clojure is definitely cementing itself as a "critic's darling" of a language among the digital cognoscenti, but I don't see its uptake increasing--or decreasing. It seems that, like so many critic's darlings, those who like it are using it, and those who aren't have either never heard of it (the far more likely scenario) or don't care for it. Datomic, a NoSQL database written by the creator of Clojure (Rich Hickey), is interesting, but I've not heard of many folks taking it up, either. And Clojure/CLR is all but dead, it seems. I score myself a "0" on this one.

THEN: Functional languages will....

I have no idea. As I said above, I'm kind of stymied on the whole functional-language thing and their future. I keep thinking they will either "take off" or "drop off", and they keep tacking to the middle, doing neither, just sort of hanging in there as a concept for programmers to take and run with. Mind you, I like functional languages, and I want to see them become mainstream, or at least more so, but I keep wondering if the mainstream programming public is ready to accept the ideas and concepts hiding therein. So this year, let's try something different: I predict that they will remain exactly where they are, neither "done" nor "accepted", but continue next year to sort of hang out in the middle.

NOW: Functional concepts are slowly making their way into the mainstream of programming topics, but in some cases, programmers seem to be picking-and-choosing which of the functional concepts they believe in. I've heard developers argue vehemently about "lazy values" but go "meh" about lack-of-side-effects, or vice versa. Moreover, it seems that developers are still taking an "object-first, functional-when-I-need-it" kind of approach, which seems a little object-heavy, if you ask me. So, since the concepts seem to be taking some sort of shallow root, I don't know that I get the point for this one, but at the same time, it's not like I was wildly off. So, let's say "0" again.

THEN: F#'s type providers will show up in C# v.Next.

This one is actually a "gimme", if you look across the history of F# and C#: for almost every version of F# v."N", features from that version show up in C# v."N+1". More importantly, F# 3.0's type provider feature is an amazing idea, and one that I think will open up language research in some very interesting ways. (Not sure what F#'s type providers are or what they'll do for you? Check out Don Syme's talk on it at BUILD last year.)

NOW: C# v.Next hasn't been announced yet, so I can't say that this one has come true. We should start hearing some vague rumors out of Redmond soon, though, so maybe 2013 will be the year that C# gets type providers (or some scaled-back version thereof). Again, a "0".

THEN: Windows8 will generate a lot of chatter.

As 2012 progresses, Microsoft will try to force a lot of buzz around it by keeping things under wraps until various points in the year that feel strategic (TechEd, BUILD, etc). In doing so, though, they will annoy a number of people by not talking about them more openly or transparently.

NOW: Oh, my, did they. Windows8 was announced with a bang, but Microsoft (and Sinofsky, who ran the OS division up until recently) decided that they could go it alone and leave critical partners (like Dropbox!) out of the loop entirely. As a result, the Windows8 Store didn't have a lot of apps in it that people (including myself) really expected would be there. And THEN, there was Surface... which took everybody by surprise, as near as I can tell. Totally under wraps. I'm scoring myself "+2" for that one.

THEN: Windows8 ("Metro")-style apps won't impress at first.

The more I think about it, the more I'm becoming convinced that Metro-style apps on a desktop machine are going to collectively underwhelm. The UI simply isn't designed for keyboard-and-mouse kinds of interaction, and that's going to be the hardware setup that most people first experience Windows8 on--contrary to what (I think) Microsoft thinks, people do not just have tablets lying around waiting for Windows 8 to be installed on them, nor are they going to buy a Windows8 tablet just to try it out, at least not until it's gathered some mojo behind it. Microsoft is going to have to finesse the messaging here very, very finely, and that's not something they've shown themselves to be particularly good at over the last half-decade.

NOW: I find myself somewhat at a loss how to score this one--on the one hand, the "used-to-be-called-Metro"-style applications aren't terrible, and I haven't really heard anyone complain about them tremendously, but at the same time, I haven't heard anyone really go wild and ga-ga over them, either. Part of that, I think, is because there just aren't a lot of apps out there for it yet, aside from a rather skimpy selection of games (compared to the iOS App Store and Android Play Store). Again, I think Microsoft really screwed themselves with this one--keeping it all under wraps helped them make a big "Oh, WOW" kind of event buzz within the conference hall when they announced Surface, for example, but that buzz sort of left the room (figuratively) when people started looking for their favorite apps so they could start using that device. (Which, by the way, isn't a bad piece of hardware, I'm finding.) I'll give myself a "+1" for this.

THEN: Scala will get bigger, thanks to Heroku.

With the adoption of Scala and Play for their Java apps, Heroku is going to make Scala look attractive as a development platform, and the adoption of Play by Typesafe (the same people who brought you Akka) means that these four--Heroku, Scala, Play and Akka--will combine into a very compelling and interesting platform. I'm looking forward to seeing what comes of that.

NOW: We're going to get to cloud in a second, but on the whole, Heroku is now starting to make Scala/Play attractive, arguably as attractive as Ruby/Rails is. Play 2.0 unfortunately is not backwards-compatible with Play 1.x modules, which hurts it, but hopefully the Play community brings that back up to speed fairly quickly. "+1"

THEN: Cloud will continue to whip up a lot of air.

For all the hype and money spent on it, it doesn't really seem like cloud is gathering commensurate amounts of traction, across all the various cloud providers with the possible exception of Amazon's cloud system. But, as the different cloud platforms start to diversify their platform technology (Microsoft seems to be leading the way here, ironically, with the introduction of Java, Hadoop and some limited NoSQL bits into their Azure offerings), and as we start to get more experience with the pricing and costs of cloud, 2012 might be the year that we start to see mainstream cloud adoption, beyond "just" the usage patterns we've seen so far (as a backing server for mobile apps and as an easy way to spin up startups).

NOW: It's been whipping up air, all right, but it's starting to look like tornadoes and hurricanes--the talk of 2012 seems to have been more around notable cloud outages instead of notable cloud successes, capped off by a nationwide Netflix outage on Christmas Eve that seemed to dominate my Facebook feed that night. Later analysis suggested that the outage was with Amazon's AWS cloud, on which Netflix resides, and boy, did that make a few heads spin. I suspect we haven't yet (as of this writing) seen the last of that discussion. Overall, it seems like lots of startups and other greenfield apps are being deployed to the cloud, but it seems like corporations are hesitating to pull the trigger on an "all-in" kind of cloud adoption, because of some of the fears surrounding cloud security and now (of all things) robustness. "+1"

THEN: Android tablets will start to gain momentum.

Amazon's Kindle Fire has hit the market strong, definitely better than any other Android-based tablet before it. The Nook (the Kindle's principal competitor, at least in the e-reader world) is also an Android tablet, which means that right now, consumers can get into the Android tablet world for far, far less than what an iPad costs. Apple rumors suggest that they may have a 7" form factor tablet that will price competitively (in the $200/$300 range), but that's just rumor right now, and Apple has never shown an interest in that form factor, which means the 7" world will remain exclusively Android's (at least for now), and that's a nice form factor for a lot of things. This translates well into more sales of Android tablets in general, I think.

NOW: Google's Nexus 7 came to dominate the discussion of the 7" tablet, until...

THEN: Apple will release an iPad 3, and it will be "more of the same".

Trying to predict Apple is generally a lost cause, particularly when it comes to their vaunted iOS lines, but somewhere around the middle of the year would be ripe for a new iPad, at the very least. (With the iPhone 4S out a few months ago, it's hard to imagine they'd cannibalize those sales by releasing a new iPhone, until the end of the year at the earliest.) Frankly, though, I don't expect the iPad 3 to be all that big of a boost, just a faster processor, more storage, and probably about the same size. Probably the only thing I'd want added to the iPad would be a USB port, but that conflicts with the Apple desire to present the iPad as a "device", rather than as a "computer". (USB ports smack of "computers", not self-contained "devices".)

NOW: ... the iPad Mini. Which, I'd like to point out, is just an iPad in a 7" form factor. (Actually, I think it's a little bit bigger than most 7" tablets--it looks to be a smidge wider than the other 7" tablets I have.) And the "new iPad" (not the iPad 3, which I call a massive FAIL on the part of Apple marketing) is exactly that: same iPad, just faster. And still no USB port on either the iPad or iPad Mini. So between this one and the previous one, I score myself at "+3" across both.

THEN: Apple will get hauled in front of the US government for... something.

Apple's recent foray into the legal world, effectively informing Samsung that they can't make square phones and offering advice as to what will avoid future litigation, smacks of such hubris and arrogance, it makes Microsoft look like a Pollyanna Pushover by comparison. It is pretty much a given, it seems to me, that a confrontation in the legal halls is not far removed, either with the US or with the EU, over anti-competitive behavior. (And if this kind of behavior continues, and there is no legal action, it'll be pretty apparent that Apple has a pretty good set of US Congressmen and Senators in their pocket, something they probably learned from watching Microsoft and IBM slug it out rather than just buy them off.)

NOW: Congress has started to take a serious look at the patent system and how it's being used by patent trolls (of which, folks, I include Apple these days) to stifle innovation and create this Byzantine system of cross-patent licensing that only benefits the big players, which was exactly what the patent system was designed to avoid. (Patents were supposed to be a way to allow inventors, who are often independents, to avoid getting crushed by bigger, established, well-monetized firms.) Apple hasn't been put squarely in the crosshairs, but the Economist's article on Apple, Google, Microsoft and Amazon in the Dec 11th issue definitely points out that all four are squarely in the sights of governments on both sides of the Atlantic. Still, no points for me.

THEN: IBM will be entirely irrelevant again.

Look, IBM's main contribution to the Java world is/was Eclipse, and to a much lesser degree, Harmony. With Eclipse more or less "done" (aside from all the work on plugins being done by third parties), and with IBM abandoning Harmony in favor of OpenJDK, IBM more or less removes themselves from the game, as far as developers are concerned. Which shouldn't really be surprising--they've been more or less irrelevant pretty much ever since the mid-2000s or so.

NOW: IBM who? Wait, didn't they used to make a really kick-ass laptop, back when we liked using laptops? "+1"

THEN: Oracle will "screw it up" at least once.

Right now, the Java community is poised, like a starving vulture, waiting for Oracle to do something else that demonstrates and befits their Evil Emperor status. The community has already been quick (far too quick, if you ask me) to highlight Oracle's supposed missteps, such as the JVM-crashing bug (which has already been fixed in the _u1 release of Java7, which garnered no attention from the various Java news sites) and the debacle around Hudson/Jenkins/whatever-the-heck-we-need-to-call-it-this-week. I'll grant you, the Hudson/Jenkins debacle was deserving of ire, but Oracle is hardly the Evil Emperor the community makes them out to be--at least, so far. (I'll admit it, though, I'm a touch biased, both because Brian Goetz is a friend of mine and because the Oracle Technology Network has asked me to write a column for them next year. Still, in the spirit of "innocent until proven guilty"....)

NOW: It is with great pleasure that I mark this one as a miss. Oracle's been pretty good about things, sticking with the OpenJDK approach to developing software and talking very openly about what they're trying to do with Java8. They're not entirely innocent, mind you--the fact that a Java install tries to monkey with my browser bar by installing some plugin or other and so on is not something I really appreciate--but they're not acting like Ming the Merciless, either. Matter of fact, they even seem to be going out of their way to be community-inclusive, in some circles. I give myself a "-1" here, and I'm happy to claim it. Good job, guys.

THEN: VMWare/SpringSource will start pushing their cloud solution in a major way.

Companies like Microsoft and Google are pushing cloud solutions because Software-as-a-Service is a recurring revenue model, generating revenue even in years when the product hasn't incremented. VMWare, being a product company, is in the same boat--the only time they make money is when they sell a new copy of their product, unless they can start pushing their virtualization story onto hardware on behalf of clients--a.k.a. "the cloud". With SpringSource as the software stack, VMWare has a more-or-less complete cloud play, so it's surprising that they didn't push it harder in 2011; I suspect they'll start cramming it down everybody's throats in 2012. Expect to see Rod Johnson talking a lot about the cloud as a result.

NOW: Again, I got this one wrong, and frankly, I'm shocked to be admitting it. I really thought this one was a no-brainer. CloudFoundry seemed like a pretty straightforward play, and VMWare already owned a significant share of the virtualization story, so.... And yet, I really haven't seen much by way of significant marketing, advertising, or developer outreach around their cloud story. It's much the same as what it was in 2011; it almost feels like the parent corporation (EMC) either doesn't "get" why they should push a cloud play, doesn't see it as worth the cost, or else doesn't care. Count me confused. "0"

THEN: JavaScript hype will continue to grow, and by years' end will be at near-backlash levels.

JavaScript (more properly known as ECMAScript, not that anyone seems to care but me) is gaining all kinds of steam as a mainstream development language (as opposed to just-a-browser language), particularly with the release of NodeJS. That hype will continue to escalate, and by the end of the year we may start to see a backlash against it. (Speaking personally, NodeJS is an interesting solution, but suggesting that it will replace your Tomcat or IIS server is a bit far-fetched; event-driven I/O is something both of those servers have been doing for years, and the rest of it is "just" a language discussion. We could pretty easily use JavaScript as the development language inside both servers, as Sun demonstrated years ago with their "Phobos" project--not that anybody really cared back then.)

NOW: JavaScript frameworks are exploding everywhere like fireworks at a Disney theme park. Douglas Crockford is getting more invites to conference keynote opportunities than James Gosling ever did. You can get a job if you know how to spell "NodeJS". And yet, I'm starting to hear the same kinds of rumblings about "how in the hell do we manage a 200K LOC codebase written in JavaScript" that I heard people gripe about Ruby/Rails a few years ago. If the backlash hasn't started, then it's right on the cusp. "+1"

THEN: NoSQL buzz will continue to grow, and by years' end will start to generate a backlash.

More and more companies are jumping into NoSQL-based solutions, and this trend will continue to accelerate, until some extremely public failure will start to generate a backlash against it. (This seems to be a pattern that shows up with a lot of technologies, so it seems entirely realistic that it'll happen here, too.) Mind you, I don't mean to suggest that the backlash will be factual or correct--usually these sorts of things come from misusing the tool, not from any intrinsic failure in it--but it'll generate some bad press.

NOW: Recently, I heard that NBC was thinking about starting up a new comedy series called "Everybody Hates Mongo", with Chris Rock narrating. And I think that's just the beginning--lots of companies, particularly startups, decided to run with a NoSQL solution before seriously contemplating how they were going to make up for the things that a NoSQL doesn't provide (like a schema, for a lot of these), and suddenly find themselves wishing they had spent a little more time thinking about that back in the early days. Again, if the backlash isn't already started, it's about to. "+1"

THEN: Ted will thoroughly rock the house during his CodeMash keynote.

Yeah, OK, that's more of a fervent wish than a prediction, but hey, keep a positive attitude and all that, right?

NOW: Welllll..... Looking back at it with almost a year's worth of distance, I can freely admit I dropped a few too many "F"-bombs (a buddy of mine counted 18), but aside from a (very) vocal minority, my takeaway is that a lot of people enjoyed it. Still, I do wish I'd throttled it back some--InfoQ recorded it, and the fact that it hasn't yet seen public posting on the website implies (to me) that they found it too much work to "bleep" out all the naughty words. Which I call "my bad" on, because I think they were really hoping to use that as part of their promotional activities (not that they needed it, selling out again in minutes). To all those who found it distasteful, I apologize, and to those who chafe at the fact that I'm apologizing, I apologize. I take a "-1" here.

2013 Predictions:

Having thus scored myself at a "9" (out of 17) for last year, let's take a stab at a few for next year:

  • "Big data" and "data analytics" will dominate the enterprise landscape. I'm actually pretty late to the ballgame to talk about this one, in fact--it was starting its rapid climb up the hype wave already this year. And, part and parcel with going up this end of the hype wave this quickly, it also stands to reason that companies will start marketing the hell out of the term "big data" without being entirely too precise about what they mean when they say "big data".... By the end of the year, people will start building services and/or products on top of Hadoop, which appears primed to be the "Big Data" platform of choice, thus far.
  • NoSQL buzz will start to diversify. The various "NoSQL" vendors are going to start wanting to differentiate themselves from each other, and will start using "NoSQL" in their marketing and advertising talking points less and less. Some of this will be because Pandora's Box on data storage has already been opened--nobody's just assuming a relational database all the time, every time, anymore--but some of this will be because the different NoSQL vendors, who are at different stages in the adoption curve, will want to differentiate themselves from the vendors that are taking on the backlash. I predict Mongo, who seems to be leading the way of the NoSQL vendors, will be the sacrificial scapegoat for a lot of the NoSQL backlash that's coming down the pike.
  • Desktops increasingly become niche products. Look, does anyone buy a desktop machine anymore? I have three sitting next to me in my office, and none of the three has been turned on in probably two years--I'm exclusively laptop-bound these days. Between tablets as consumption devices (slowly obsoleting the laptop), and cloud offerings becoming more and more varied (slowly obsoleting the server), there's just no room for companies that sell desktops--or the various Mom-and-Pop shops that put them together for you. In fact, I'm starting to wonder if all those parts I used to buy at Fry's Electronics and swap meets will start to disappear, too. Gamers keep desktops alive, and I don't know if there's enough money in that world to keep lots of those vendors alive. (I hope so, but I don't know for sure.)
  • Home servers will start to grow in interest. This may seem paradoxical to the previous point, but I think techno-geek leader-types are going to start looking into "servers-in-a-box" that they can set up at home and have all their devices sync to and store to. Sure, all the media will come through there, and the key here will be "turnkey", since most folks are getting used to machines that "just work". Lots of friends, for example, seem to be using Mac Minis for exactly this purpose, and there's a vendor here in Redmond that sells a ridiculously-powered server in a box for a couple thousand. (This is on my birthday list, right after I get my maxed-out 13" MacBook Air and iPad 3.) This is also going to be fueled by...
  • Private cloud is going to start getting hot. The great advantage of cloud is that you don't have to have an IT department; the great disadvantage of cloud is that when things go bad, you don't have an IT department. Too many well-publicized cloud failures are going to drive corporations to try and find a solution that is the best-of-both-worlds: the flexibility and resiliency of cloud provisioning, but staffed by IT resources they can whip and threaten and cajole when things fail. (And, by the way, I fully understand that most cloud providers have better uptimes than most private IT organizations--this is about perception and control and the feelings of powerlessness and helplessness when things go south, not reality.)
  • Oracle will release Java8, and while several Java pundits will decry "it's not the Java I love!", most will actually come to like it. Let's be blunt, Java has long since moved past being the flower of fancy and a critic's darling, and it's moved squarely into the battleship-gray of slogging out code and getting line-of-business apps done. Java8 adopting function literals (aka "closures") and retrofitting the Collection library to use them will be a subtle, but powerful, extension to the lifetime of the Java language, but it's never going to be sexy again. Fortunately, it doesn't need to be.
  • Microsoft will start courting the .NET developers again. Windows8 left a bad impression in the minds of many .NET developers, with the emphasis on HTML/JavaScript apps and C++ apps, leaving many .NET developers to wonder if they were somehow rendered obsolete by the new platform. Despite numerous attempts in numerous ways to tell them no, developers still seem to have that opinion--and Microsoft needs to go on the offensive to show them that .NET and Windows8 (and WinRT) do, in fact, go very well together. Microsoft can't afford for their loyal developer community to feel left out or abandoned. They know that, and they'll start working on it.
  • Samsung will start pushing themselves further and further into the consumer market. They already have started gathering more and more of a consumer name for themselves, they just need to solidify their tablet offerings and get closer in line with either Google (for Android tablets) or even Microsoft (for Windows8 tablets and/or Surface competitors) to compete with Apple. They may even start looking into writing their own tablet OS, which would be something of a mistake, but an understandable one.
  • Apple's next release cycle will, again, be "more of the same". iPhone 6, iPad 4, iPad Mini 2, MacBooks, MacBook Airs, none of them are going to get much in the way of innovation or new features. Apple is going to run squarely into the Innovator's Dilemma soon, and their products are going to be "more of the same" for a while. Incremental improvements along a couple of lines, perhaps, but nothing Earth-shattering. (Hey, Apple, how about opening up Siri to us to program against, for example, so we can hook into her command structure and hook our own apps up? I can do that with Android today, why not her?)
  • Visual Studio 2014 features will start being discussed at the end of the year. If Microsoft is going to hit their every-two-year-cycle with Visual Studio, then they'll start talking/whispering/rumoring some of the v.Next features towards the middle to end of 2013. I fully expect C# 6 will get some form of type providers, Visual Basic will be a close carbon copy of C# again, and F# 4 will have something completely revolutionary that anyone who sees it will be like, "Oh, cool! Now, when can I get that in C#?"
  • Scala interest wanes. As much as I don't want it to happen, I think interest in Scala is going to slow down, and possibly regress. This will be the year that Typesafe needs to make a major splash if they want to show the world that they're serious, and I don't know that the JVM world is really all that interested in seeing a new player. Instead, I think Scala will be seen as what "the 1%" of the Java community uses, and the rest will take some ideas from there and apply them (poorly, perhaps) to Java.
  • Interest in native languages will rise. Just for kicks, developers will start experimenting with some of the new compile-to-native-code languages (Go, Rust, Slate, Haskell, whatever) and start finding some of the joys (and heartaches) that come with running "on the metal". More importantly, they'll start looking at ways to use these languages with platforms where running "on the metal" is more important, like mobile devices and tablets.

As always, folks, thanks for reading. See you next year.

UPDATE: Two things happened this week (7 Jan 2013) that made me want to add to this list:
  • Hardware is the new platform. A buddy of mine (Scott Davis) pointed out on a mailing list we share that "hardware is the new platform", and with Microsoft's Surface out now, there are three major players (Apple, Google, Microsoft) in this game. It's becoming apparent that more and more companies are starting to see opportunities in going the Apple route of owning not just the OS and the store, but the hardware underneath it. More companies are going to start playing this game, too, I think, and we're going to see Amazon take some shots here, and probably a few others. Of course, already announced are the Ubuntu Phone and a new Android-like player, Tizen, but I'm not thinking about new players--there are always new players--but about some of the big standouts. And look for companies like Dell and HP to start looking for ways to play in this game, too, either through partnerships or acquisitions. (Hello, Oracle, I'm looking at you.... And Adobe, too.)
  • APIs for lots of things are going to come out. Ford just did this. This is not going away--this is going to proliferate. And the startup community is going to lap it up like kittens attacking a bowl of cream. If you're looking for a play in the startup world, pursue this.

.NET | Android | Azure | C# | C++ | Conferences | Development Processes | F# | Flash | Industry | iPhone | Java/J2EE | Languages | LLVM | Mac OS | Objective-C | Parrot | Python | Reading | Review | Ruby | Scala | Security | Solaris | Visual Basic | VMWare | WCF | Windows | XML Services | XNA

Tuesday, January 01, 2013 1:22:30 AM (Pacific Standard Time, UTC-08:00)
Comments [2]  | 
 Wednesday, December 26, 2012
Thoughts on my new Surface

As a post-Christmas gift to myself, I took a bit of the money that my folks gave us and bought myself a 64GB Surface. A couple of thoughts came to mind as I sat down to play with this thing:

  1. Microsoft doesn't sell a 64GB model with a Type keyboard? I know the touch-thing is, like, the new hotness with everyone, but frankly, having played with a friend's Surface and his (preferred) Touch keyboard cover, I think both he and Microsoft are smoking some serious crack if they think anyone can seriously touch-type on the touch keyboard. (To be fair, it's not just Microsoft, either--I can't effectively touch-type on my iPad or Galaxy Tab, either. I need the tactile feedback from the spring underneath the key and the edges of the keys themselves to know if I hit the key squarely or not.) More importantly, why on earth does Microsoft think that people buying the 64GB model won't want the Type cover? Or is this an insidious ploy to force me to accept a bundle (the 64GB model apparently comes only with a Touch cover--there's no option to buy it with no cover at all) that I don't want? It certainly worked--I bought the 64GB with Touch cover for $699, then the Type cover by itself for another $129. (Let the conspiracists go crazy with that one.)
  2. The packaging is awfully reminiscent of the iPad/iPhone/iPod packaging style. Nice to see that Microsoft can leverage good ideas. ;-)
  3. So I fire this thing up, and the first thing I'm told is that there are 15 updates waiting. I'm all for keeping bits fresh and current and fixed, but this seems a bit excessive--why do so many apps need an update so quickly after the device's initial release? What's worse, the Store app doesn't tell you what these updates are for, as near as I can tell, so you can't tell which ones are crucial and which ones are just cosmetic. Kind of a fail there.
  4. Wait, how do I right-click on this thing? Or has Microsoft finally come to the realization that one mouse button is all you need right about the time that Apple seems ready to accept that two buttons are, in fact, a superior way of life?
  5. The form factor on this thing is a little bit larger than I expected for some reason. Not that I didn't really know how big it is (and it's not really all that big, at least not when compared to the Samsung tablet they gave us at //Build/ two years ago), but for some reason it just feels bigger than it is.
  6. The keyboard makes me think of it as a laptop, not a tablet. I find myself wanting to go download Visual Studio and put a stripped-down version of it on here. (I even asked my buddy who had a Surface if he'd managed to do that yet, and he--gently--reminded me that since this is Windows RT, and an ARM processor, it won't run on here.)
  7. Because I still wasn't convinced that this isn't a laptop, I tried to download Dropbox onto here. The Surface let me download the whole thing, then told me "This app cannot run on Surface". D'oh! Busted. I am an idiot.
  8. But no Dropbox on here? Really, Microsoft? This seems like a fairly major oversight. I know, Sinofsky was not a "team player", but he's gone now: Find the Dropbox team, give them a ton of money and a few "We're sorry, we won't shut you out again, we promise" mea culpas, and get one of the most popular productivity apps on the planet on this thing. Seriously.
  9. And while we're fixing things, can we please get the Store to be a little more responsive? I know the UX here is going for a "minimalist" vibe, but some part of me wants to see some whirlygigs or something going on while I'm downloading apps. (I, of course, will probably regret this in two years, and vehemently deny saying this when the whirlygigs make me long for a clean and simple interface after Microsoft jazzes it all up to the point of migraine-inducing snazziness.)
  10. And why did the Store hang in the middle of doing my 15 updates and 4 app downloads? It may have been the Internet connection (I'm sitting in a restaurant as I do this, and restaurant WiFi is on par with hotel WiFi in its reliability and bandwidth), but if it is, give me some kind of indication and don't lock me out of doing anything. (The screen became entirely unresponsive.) That's silly.
  11. Oh, and Evernote? After you install and start downloading my notes, same thing--don't get all silent on me and not tell me what's going on.
  12. Wait, Word and Excel and PowerPoint and OneNote are just Office 2013 previews? Not the real thing? Interesting--will I get a free update when those go live, or is this just another "play for free for 90 days then we soak you for money" kind of arrangement? (And if so, will I be able to use an MVP MSDN key to update/upgrade/install them?)
  13. And now, post-reboot, Store won't launch--it just goes into the spinning circle of deathly dots. (Did I just coin that phrase? Can I copyright it?)
All in all, in the hour or so I've had it, it's not been a terrible experience, but I can't say it's been "sublime" or "world-changing". I'm glad I have it, because once I get a system worked out whereby I can easily share files back and forth between my Surface and the rest of my machines (yes, Mr. Ballmer, I know about SkyDrive, I just haven't been using mine and have to figure out how and where and when I would shift things back and forth between it and Dropbox), I look forward to giving this thing a spin for some of my upcoming blog entries and articles.

Which reminds me: whichever of BitBucket or GitHub manages to bring git or Mercurial over to the Surface (and iPad, and Android) will be a hell of a first-mover on integrating source control into peoples' daily lives. Can you imagine if GitHub and Dropbox joined forces? That would be interesting.


Conferences | Industry | Personal | Reading | Review | Social | Windows

Wednesday, December 26, 2012 6:02:26 PM (Pacific Standard Time, UTC-08:00)
Comments [0]  | 
 Thursday, December 20, 2012
Envoy (in Scala, JavaScript, and more)

A little over a decade ago, Eugene Wallingford wrote a paper for the PloP '99 conference, describing the Envoy pattern language, "a pattern language for managing state in a functional program". It's a good read, but the implementation language for the paper is Scheme, which--being a Lisp dialect--often isn't particularly obvious or easy to understand at first, so I thought it might be interesting (both for me and any readers who want to follow along) to translate the implementation examples into a variety of different languages. In this case, I thought it would be relatively easy to do it in Scala and F#, given their hybrid object-functional nature, but it's also an interesting exercise to demonstrate it in JavaScript (I'll use NodeJS v0.8.15, running on my Mac, and Rhino on the JVM), Yeti (an ML dialect that runs on the JVM), Jaskell (a Haskell dialect that also runs on the JVM), and, hey, what the heck, let's do it in C# while we're at it, just so the .NET guys don't feel too badly outnumbered.

(I'm posting this now with the intent of filling in the Yeti, Jaskell, F# and C# implementations later.)

Note that with lambdas coming in Java 8, it'll be possible to adapt this pattern language to work with Java, too--I'll leave that as an exercise for myself (and update this blog entry) once I get a Java8 build on the machine on which I'm writing this.

One reason for doing this in Yeti and Jaskell is to demonstrate the original purpose of the Envoy pattern language--that we can achieve object-like semantics even in languages that don't directly support object semantics (like Scheme). But for the other languages, it's fair to ask why anyone would bother doing this in languages that do directly support objects (a la Scala, F#, etc), since it would seem a lot easier to just use the object features directly. And, truth be told, it's true--when looking to model objects in a language that has first-class support for objects, just use that support and those features, and call it a day. The point of this exercise is, for me, to exercise the functional features of those languages, and see exactly how functional languages can provide some of the same benefits that an O-O language enjoys, without having to use the O-O features directly. (There have been a lot of people writing functional-isms in O-O languages, yours truly included, so it seems a good exercise to flip that on its head.) This will also help me figure out where/when/how to use these features, when the need arises.

If you've not yet read the Envoy pattern language, take a moment and do that now; I don't want to annoy Mr. Wallingford in any way by repeating his prose here (not to mention that I'm going to have enough to do as it is just translating the code into several different languages). But I will toss in a brief summary of each of the elements in the pattern language, just so we're all on the same page about what's happening in each of these code samples.

Implementation notes

These are a few notes for each of the implementation languages.

JavaScript

Because I want to be able to run the JavaScript code on either the Node platform directly or on the Rhino engine (via the Java JDK "jrunscript" command that installs on Java implementations starting with Java 6), and because those two environments provide different mechanisms for printing to the console ("console.log" in Node, and "println" in Rhino), I create a top-level function "out" that aliases to one or the other of those, depending on what's defined in the environment:

var out = (function() {
  if (typeof(console) !== "undefined" &&
      typeof(console.log) !== "undefined")
    return console.log
  else if (typeof(println) !== "undefined")
    return println
  else
    throw new Error("No idea what to use for output")
})();

(This actually gives away one of the punchlines in the first element of the pattern language, Function as Object, below, because here we're pretty clearly using "out" as a function-as-object.)

I used the Rhino engine that ships with Java6, and node v0.8.15, for these.

Scala

I used Scala v2.9.2 running on Java6 for this.

Yeti

Yeti is an ML-based language that compiles to Java bytecode. Unlike Scala, it's a functional-only language (well, sort of), with Hindley-Milner type inference. As the Yeti home page describes, it supports polymorphic structure and variant types, property fields, lazy lists, pattern-matching on values, and a decent interop facility against Java code (meaning it can call Java classes, as well as compile to classes to be called from Java if desired.)

Yeti was at v0.9.7 at the time I wrote this, and again, running on the Java6 VM.

F#

I'm using the Visual Studio 2012 release to write the F# bits, which corresponds to F# 3.0. As far as I can tell, there's nothing really all that "3.0-specific" that I'm using, so it should work with F# 2.0, which shipped with Visual Studio 2010, and there's nothing Windows-specific here either, which means it should run fine on F# 3.0-on-Mono.

Note that, like what I'm doing with the JavaScript version, I'm binding each of the pattern elements into a function for execution, thus creating a scope block that is dissociated from the larger global scope:

let example = fun () ->
    Console.WriteLine "Howdy world"

If (like I tried once) we were to use the more naive approach:

let example =
    Console.WriteLine "Howdy world"

... then each of the functions is executed and the results bound to the name described ("example", in this case) at the time the compiler sees it; in other words, each is eagerly-evaluated, instead of waiting to be invoked in the main entry point of the program later. By binding an anonymous function literal, it essentially lazy-fies each of them, and won't execute them until they are deliberately invoked in Main, as in:

[<EntryPoint>]
let main argv = 
    example()
    // ... the others go here
    0 // return an integer exit code
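
(A quick aside of my own, not part of the original F# discussion: Scala draws exactly the same line. A plain val executes its right-hand side eagerly, at definition time, while binding a function literal defers the body until it's actually invoked--a minimal sketch:)

object EagerVsDeferred extends App {
  // Runs at definition time, as the enclosing object initializes:
  val eager = { println("eager: runs at definition"); 5 }

  // Binding a function literal defers the body until invocation:
  val deferred = () => { println("deferred: runs when called"); 5 }

  println("about to invoke deferred...")
  println(deferred())   // only now does the deferred body execute
}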

With the platform (and prelude) details out of the way, let's begin.

C#

Wow, the C# version is going to be ugly. Let me explain what I mean.

Let's start with the syntax for an anonymous function literal (a lambda, in C# parlance):

() => { return 5; };

This is a function that takes no arguments and yields an int. (To be exact, this is a statement lambda--an expression lambda wouldn't need the explicit return or the curly braces, since the result of a single-expression body is implicitly returned.)

Ideally, we'd be able to capture this in an implicitly-typed local variable, like so:

var giveMeFive = () => { return 5; };

But unfortunately, C# doesn't allow this, saying that it "Cannot assign lambda to implicitly-typed local variable". (Doesn't get much more straightforward than that when it comes to an error message.) So, we have to explicitly type the local variable, which is a Func<> of some type:

Func<int> giveMeFive = () => { return 5; };

Hold on to this thought, because things are going to get even uglier when we want to invoke an anonymous block like this later (when we get into the Closure parts of the pattern language).
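
(For contrast--an aside of mine, not part of the original prelude--Scala, which features heavily below, infers the type of a function value bound to a name without any of this ceremony:)

val giveMeFive = () => 5    // inferred as () => Int; no Func<> machinery needed
println(giveMeFive())       // 5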

Function as Object

In pure functional languages, it's actually difficult to keep state and data tied together--in fact, part of the whole point of a functional language is to write functions that operate on data, ideally on lots of different kinds of data. "Therefore, create a function that acts like an object. Such a function carries the data it needs along with the expression that operates on the data. More importantly, an object encapsulates its data, ensuring that only the allowed operations are applied to them." In other words, by writing a function and keeping the data buried inside of it, we achieve the same kind of encapsulation that object-orientation has traditionally reserved for itself as its principal advantage. This is done via a closure, which is the next element in the language.

Scheme:

The original Scheme implementation looked like this:

(define balance 0)
(define withdraw
  (lambda (balance amount)
    (if (<= amount balance)
      (- balance amount)
      (error "Insufficient funds" balance))
  ))
(define deposit
  (lambda (balance amount)
    (+ balance amount)
  ))
(define accrue-interest
  (lambda (balance interest-rate)
    (+ balance (* balance interest-rate))
  ))

There are a few things wrong with this approach, as Wallingford points out, but to be faithful, recreating this in our target languages is pretty straightforward: three functions, each of which operates on parameters passed in. "You could create new accounts simply by binding values to names. Operating on accounts involves passing the account to the appropriate procedure and binding the new value as appropriate."

JavaScript:

In JavaScript, we can bind function values to names just as we can in Scheme, so it's not actually all that different, once you get past the lack of parentheses and added curly braces. Thus, it looks like:

(function() {
  out("function-as-object =========")
  
  var balance = 0
  var withdraw = function(amount) {
    if (amount <= balance)
      balance = balance - amount
    else
      throw new Error("Insufficient funds")
  }
  var deposit = function(amount) {
    balance += amount
  }
  var accrueInterest = function(interestRate) {
    balance += (balance * interestRate)
  }
})()

Note that I wrap all of it into its own function so as to give the whole thing some scope--makes it easier to define in a single .js file and execute.

Scala:

Similarly, Scala allows us to bind functions to names, too:

  def functionAsObject() = {
    def withdraw(balance : Int, amount : Int) = {
      if (amount <= balance) balance - amount else throw new RuntimeException("Insufficient funds")
    }
    def deposit(balance: Int, amount : Int) = {
      balance + amount
    }
    def accrueInterest(balance : Int, rate : Float) = {
      balance + (balance * rate)
    }
  }

Again, all of it is wrapped into a function for easier (on me, while I was experimenting with all of this) scoping.

F#:

F#, like most functional/object hybrid languages, also offers the ability to bind functions to values, so this is also pretty straightforward. I choose to just operate against the "global" balance value, rather than do the more functional "pass the balance in" that the previous two use:

let functionAsObject = fun () ->
    let balance = ref 0
    let withdraw = 
        fun amt ->
            if amt <= !balance then
                balance := (!balance) - amt
                !balance
            else
                raise (Exception("Insufficient funds"))
    let deposit = 
        fun amt -> 
            balance := (!balance) + amt
            !balance
    let accrueInterest = 
        fun (intRate : float) -> 
            balance := (!balance) + (int (float !balance * intRate))
            !balance
    
    Console.WriteLine "=========> Function as Object"
    printfn "%d" (deposit 200)
    printfn "%d" (withdraw 50)

Yeti (ML):

Although Yeti supports a slightly more succinct syntax for defining a function, I choose to use the syntax that more closely matches what we're doing in the other examples--bind a function literal (do ... done;) to a name (withdraw, deposit and accrueInterest). Again, since this is running on top of the JVM, we have full access to the underlying Java library, which means we can make use of RuntimeException again as a cheap way of signaling a bad withdrawal.

withdraw = 
  do bal amt:
    if amt <= bal then
      bal - amt
    else
      throw new RuntimeException("Insufficient funds")
    fi
  done;

deposit =
  do bal amt: bal + amt done;
  
accrueInterest =
  do bal intRate:
    bal + (bal * intRate)
  done;

balance = 100;
println (withdraw balance 10)

Jaskell (Haskell):

C#:

This is a little more verbose than some of the other versions we've seen thus far, because C# lacks the type-inference that F# or Yeti or Scala has, yet requires explicit typing (in some places) because it is a statically-typed language. Again, because the language explicitly forbids the assignment of a lambda/delegate to an implicitly-typed local variable, the local names "withdraw", "deposit", and "accrueInterest" have to be explicitly typed.

static void FunctionAsObject()
{
    var balance = 0;
    Func<int, int> withdraw = (amount) =>
    {
        if (amount <= balance)
        {
            balance = balance - amount;
            return balance;
        }
        else
            throw new Exception("Insufficient funds");
    };
    Func<int, int> deposit = (amount) => 
    {
        balance += amount; return balance;
    };
    Func<float, int> accrueInterest = (intRate) => 
    { 
        balance += (int)(intRate * balance); return balance; 
    };

    Console.WriteLine("=============> FunctionAsObject");
    Console.WriteLine("{0}", deposit(100));
    Console.WriteLine("{0}", withdraw(10));
}

Notice that again, I choose to operate on the "global" variable "balance", rather than pass it in. (It's fairly easy to imagine how it would look if "balance" were passed in.)

Closure

"You are writing a function with a free variable. How do you bundle a function with a data value defined outside the procedure's body?" If the data value is defined inside the procedure, remember, it gets reset to the same value each time, and obviously this isn't going to track state at all well. "So you might try defining the balance outside the function." But that doesn't work, because now the value isn't encapsulated anymore. "Therefore, create the function in an environment where its free variables are bound to local variables."

This is something that O-O folks won't see right away, but it's a powerful mechanism for reuse. Traditional O-O says to tuck the encapsulated value (balance) away as a private field, but in environments like JavaScript, which lack any sort of formal access control, or in environments like the JVM or CLR, both of which offer a means by which to bypass access control directives (via the Reflection libraries in both), what's marked as "private" often isn't as private as we might want. By creating a local variable that's outside the scope of the returned function object but inside of the scope of the function returning the function (see where "balance" is declared in the JavaScript version, for example), the language or platform has to "close over" that variable (hence the name "closure"), thus making it accessible to the returned function for use, but effectively hidden away from any prying eyes that might want to screw with it outside of permitted access channels.
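
(To make that concrete, here's a quick JVM sketch of my own--in Scala, not from Wallingford's paper: Reflection waltzes right past a private field, while the closed-over local never appears as a field with a stable, documented name that prying code could ask for:)

object PrivateVersusClosure extends App {
  class Account { private var balance = 100 }

  val acct = new Account
  val field = acct.getClass.getDeclaredField("balance")
  field.setAccessible(true)        // waves off the "private" marker
  field.setInt(acct, 1000000)
  println(field.getInt(acct))      // 1000000 -- so much for "private"

  // The closure version: balance lives in a compiler-generated cell,
  // not in a named field of any class you'd normally go looking for.
  def makeWithdraw(bal: Int): Int => Int = {
    var balance = bal
    (amt: Int) => { balance -= amt; balance }
  }
  val withdraw = makeWithdraw(100)
  println(withdraw(20))            // 80
}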

Scheme:

The only key thing to note here is that "withdraw" references a lambda, a function literal in Scheme. We'll try to keep this flavor in the other language implementations, just to be faithful:

(define withdraw
  (let ((balance 100)) ;; balance is defined here,
    (lambda (amount)
      (if (>= balance amount) ;; so this reference is bound
        (begin
          (set! balance (- balance amount))
          balance)
        (error "Insufficient funds" balance)))
    ))

JavaScript:

JavaScript is, surprisingly to some "old-school" JavaScript programmers, a full-fledged member of the family of languages that support closures, so all that's necessary here is to define a function that returns a function that "closes over" the local variable "balance". But, in order to make sure that balance isn't reset to its original value of 100 each time we call the function, we have to actually invoke the outer function to return the inner function, which is then bound to the name "withdraw"; that way, the variable "balance" is initialized once, yet still referenced:

(function() {
  out("closure ====================")

  var withdraw = function() {
    var balance = 100
    return function(amount) {
      if (balance >= amount) {
        balance -= amount
        return balance
      }
      else
        throw new Error("Insufficient funds")
    }
  }()
  out("withdraw 20 " + withdraw(20))
  out("withdraw 30 " + withdraw(30))
})()

Scala:

We can do the same thing in Scala, and the syntax looks somewhat similar to the JavaScript version--create a function literal, invoke it, and bind the result to the name "withdraw", where the return is another anonymous function literal:

  def closure() = {
    val withdraw = (() => {
      var balance = 100
      (amount: Int) => {
        if (amount <= balance) {
          balance -= amount
          balance
        }
        else
          throw new RuntimeException("Insufficient funds")
      }
    })()
    println(withdraw(20))
    println(withdraw(20))
  }

F#:

The F# version gets interesting because when we try to do the same thing that the JavaScript (or other languages) do--that is, "close over" a local variable defined in the outer scope--the compiler immediately rejects that, saying point-blank that the language does not support that, and to use "reference variables" (the ref keyword) instead:

let closure = fun () ->
    let withdraw =
        let balance = ref 100
        fun amt ->
            if amt <= !balance then
                balance := (!balance) - amt
                !balance
            else
                raise (Exception("Insufficient funds"))

    Console.WriteLine "=========> Closure"
    printfn "%d" (withdraw 20)
    printfn "%d" (withdraw 30)

What essentially we're doing, then, is capturing a pointer/reference to balance, and carrying that into the returned function literal, rather than letting the language capture that for us. The ref is dereferenced using the "!" operator, and assigned through using the ":=" operator, as can be seen above. Other than that, this is pretty much identical to the other languages' versions.
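
(Worth noting, as an aside: Scala is quietly doing what F#'s ref makes explicit--a var captured by a closure gets hoisted into a compiler-generated mutable cell (a scala.runtime.IntRef, in this case); the language just hides the dereferencing. A minimal sketch of mine:)

def makeCounter(): () => Int = {
  var n = 0             // a captured var; scalac boxes it into a mutable cell
  () => { n += 1; n }   // both scopes share that cell
}

val tick = makeCounter()
println(tick())   // 1
println(tick())   // 2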

By the way, it should be noted that F#'s "printfn" function is actually type-safe, so attempts to write "printfn "%d" x" where "x" is a non-integer value will yield a compile-time error. That's an incredibly spiffy feature, and I wish it were something we could apply to our own F# APIs, but from what I understand from Don Syme (the F# language creator), it's something that's baked into the compiler somehow. :-/
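
(For what it's worth, Scala gets something similar in 2.10 via the "f" string interpolator, which type-checks format specifiers at compile time--newer than the 2.9.2 I'm using elsewhere in this entry, but the idea is the same:)

val n = 42
println(f"$n%d")     // compiles: %d matches an Int
// val s = "hello"
// println(f"$s%d")  // rejected at compile time: %d requires an Int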

Yeti (ML):

Yeti works just as any of the others have, since we can define a function literal that returns a function literal, so just like the JavaScript and Scala versions, we can bind a variable (as opposed to a value, which is immutable) just outside the inner function literal, and Yeti will "close over" that variable and use it for modifiable state:

withdraw = 
  (do:
    var balance = 100;
    do amt:
      if amt <= balance then
        balance := balance - amt;
        balance
      else
        throw new RuntimeException("Insufficient funds")
      fi
    done;
  done;) ();

println (withdraw 10);  // prints 90
println (withdraw 10);  // prints 80
println (withdraw 10);  // prints 70

Jaskell (Haskell):

C#:

Brace yourself--things are about to get really ugly here. The other versions suggest that we obtain encapsulation by capturing the "balance" value inside an outer function scope which is then referenced from an inner function scope, that inner function scope being the returned function literal. But... C# doesn't let us invoke function literals directly, unless they're cast to Func<> instances:

static void Closure()
{
    Func<int, int> withdraw = ((Func<Func<int, int>>)(() => {
        var balance = 100;
        Func<int, int> result = delegate(int amount)
        {
            if (balance >= amount)
            {
                balance -= amount;
                return balance;
            }
            else
                throw new Exception("Insufficient funds");
        };
        return result;
    }))();
    Console.WriteLine("=============> Closure");
    Console.WriteLine("{0}", withdraw(20));
}

Did all that make sense? It might be clearer if I show the version I originally wrote in order to figure all this out on my own:

static void Closure()
{
    Func<Func<int, int>> withdrawMaker = (delegate {
        var balance = 100;
        Func<int, int> result = delegate(int amount)
        {
            if (balance >= amount)
            {
                balance -= amount;
                return balance;
            }
            else
                throw new Exception("Insufficient funds");
        };
        return result;
    });
    Func<int, int> withdraw = withdrawMaker();

    Console.WriteLine("=============> Closure");
    Console.WriteLine("{0}", withdraw(20));
}

Why bother with all of this--why not just write it as a generalized method like O-O folks have done since the beginning of time? Because we want that "balance" tucked away somewhere where Reflection can't find it. So the double level of function indirection is necessary; to cap things off, we don't want to have to write a one-use "maker" function every time.

Constructor Function

"You are creating a Function as Object using a Closure. How do you create instances of the object? [M]ake a function that returns your Function as Object. Give the function an Intention Revealing Name (Beck) such as make-object."

Scheme:

Now things get more interesting, because the Scheme code is defining "make-withdraw" to be a lambda that in turn nests a lambda inside of it. This makes the syntax a little weird--since the returned value from "make-withdraw" is a lambda, the bound lambda must be executed in order to do the actual withdrawal.

(define make-withdraw
  (lambda (balance)
    (lambda (amount)
      (if (>= balance amount)  ;; balance is still bound,
        (begin                 ;; but to a new object on each call
          (set! balance (- balance amount))
          balance)
        (error "Insufficient funds" balance)))
    ))
(define account-for-eugene (make-withdraw 100))
(account-for-eugene 20)    => 80
(define account-for-tom (make-withdraw 1000))
(account-for-tom 20)       => 980

JavaScript:

It's pretty common in JavaScript to create a function that returns a function, and that's the heart of what Constructor Function is doing: returning a function:

(function() {
  out("constructorFunction ========")

  var makeWithdraw = function(balance) {
    return function(amount) {
      if (balance >= amount) {
        balance -= amount
        return balance
      }
      else
        throw new Error("Insufficient funds")
    }
  }
  var acctForEugene = makeWithdraw(100)
  out(acctForEugene(20))
  var acctForTed = makeWithdraw(1000)
  out(acctForTed(20))
})()

Scala:

Ditto for Scala, though the idiom/pattern of function-literal-returning-function-literal isn't always quite this obvious in Scala:

  def constructorFunction() = {
    def makeWithdraw(bal : Int) = {
      var balance = bal
      (amt : Int) => {
        if (balance >= amt) {
          balance = (balance - amt) 
          balance
        }
        else 
          throw new RuntimeException("Insufficient funds")
      }
    }
    val acctForEugene = makeWithdraw(100)
    println(acctForEugene(20))
    val acctForTed = makeWithdraw(1000)
    println(acctForTed(20))
  }

F#:

Really, not any different from the other languages: a function binding that returns a function, with the passed-in "balance" captured as a reference (see the earlier pattern element discussion for why it's a ref) inside the outer function scope, and used from the inner function scope.

let constructorFunction = fun () ->
    let makeAccount =
        fun bal ->
            let balance = ref bal
            fun amt ->
                if amt <= !balance then
                    balance := (!balance) - amt
                    !balance
                else
                    raise (Exception("Insufficient funds"))                
            
    Console.WriteLine "=========> Constructor Function"
    let acctForEugene = makeAccount 100
    printfn "%d" (acctForEugene 20)

Yeti (ML):

Same exercise--a function binding that returns a function, with the passed-in "balance" stored as a variable (var) inside the outer function scope, such that it is closed over by the inner function scope.

makeWithdraw =
  (do bal:
    var balance = bal;
    do amt:
      if amt <= balance then
        balance := balance - amt;
        balance
      else
        throw new RuntimeException("Insufficient funds")
      fi
    done;
  done;);

acctForEugene = makeWithdraw 100;
println (acctForEugene 10);   // 90
println (acctForEugene 10);   // 80

Jaskell (Haskell):

C#:

The constructor function must be explicitly typed, again, but we gain a tiny bit of brevity by changing the "delegate" literals into (slightly) shorter C# lambdas:

static void ConstructorFunction()
{
    Func<int,Func<int, int>> makeAccount = 
        ((Func<int,Func<int,int>>)( (bal) => {
            var balance = bal;
            return (int amount) =>
            {
                if (balance >= amount)
                {
                    balance -= amount;
                    return balance;
                }
                else
                    throw new Exception("Insufficient funds");
            };
        }));

    Console.WriteLine("=============> Closure");
    var acctForEugene = makeAccount(100);
    Console.WriteLine("{0}", acctForEugene(20));
}

Were it not for the implicitly-typed local variable declaration syntax around "acctForEugene", it would be acutely obvious that "makeAccount" isn't creating any kind of object at all, but a function to be executed. Even so, the explicit typing requirement for the lambdas is kind of annoying, and will only get worse as we move through the pattern language.

Method Selector

"You are creating a Function as Object using a Closure. A Constructor Function creates new instances of the object. How do you provide shared access to the closure's state?" After all, an account can do more than just withdraw, but all of the operations on the account have to share the same state--the account balance--without violating encapsulation.

Scheme:

Again we see the nested lambdas, but now there's a third level of nesting; the first lambda (make-account) returns a second lambda that takes a single string, switches on the string, and returns a third lambda that does the actual work of manipulating the balance.

(define make-account
  (lambda (balance)
    (lambda (transaction)
      (case transaction
        ('withdraw
          (lambda (amount)
            (if (>= balance amount)
              (begin
                (set! balance (- balance amount))
                balance)
              (error "Insufficient funds" balance))))
        ('deposit
          (lambda (amount)
            (set! balance (+ balance amount))
            balance))
        ('balance
          (lambda ()
            balance))
        (else
          (error "Unknown request -- ACCOUNT"
            transaction))))
  ))
(define account-for-eugene (make-account 100))
((account-for-eugene 'withdraw) 10)  => 90
((account-for-eugene 'withdraw) 10)  => 80
((account-for-eugene 'deposit) 100)  => 180

JavaScript:

Doing this in JavaScript is, again, straightforward, though it does seem a little too subtle for idiomatic JavaScript:

(function() {
  out("methodSelector ========")

  var makeAccount = function(bal) {
    var balance = bal
    return function(transaction) {
      if (transaction === "withdraw") {
        return function(amount) {
          if (balance >= amount)
            return (balance = (balance - amount))
          else
            throw new Error("Insufficient funds")
        }
      }
      else if (transaction === "deposit") {
        return function(amount) {
          return (balance = (balance + amount))
        }
      }
      else if (transaction === "balance") {
        return function() {
          return balance
        }
      }
      else {
        throw new Error("Unknown request")
      }
    }
  }
  var acctForEugene = makeAccount(100)
  out(acctForEugene("withdraw")(20))
  out(acctForEugene("balance")())
})();

Scala:

This style of interface--passing in a string and a variable list of arguments--really isn't quite Scala's style, since (being a strongly-typed language) it prefers to be able to compile-time-check as much as it can, but that doesn't mean we can't build it when the need and opportunity mesh:

  def methodSelector() = {
    def makeAccount(bal : Int) = {
      var balance = bal
      (transaction : String) => {
        transaction match {
          case "withdraw" =>
            (amt : Int) => {
              if (balance >= amt) {
                balance = (balance - amt) 
                balance
              }
              else 
                throw new RuntimeException("Insufficient funds")
            }
          case "deposit" => {
            (amt : Int) => {
              balance += amt
              balance
            }
          }
          case _ => 
            throw new RuntimeException("Unknown request")
        }
      }
    }
    val acctForEugene = makeAccount(100)
    println(acctForEugene("deposit")(50))
    val acctForTed = makeAccount(100)
    println(acctForTed("withdraw")(50))
  }

F#:

This is, again, like the Scala version above (and the Yeti version below), going to require some sacrifice in terms of flexibility in order to stay true to the original Scheme version--in F#, as in Scala and other statically-typed languages, we have to make sure that all branches of a pattern-match yield the same type of result, so the "balance" branch has to yield a function that takes a parameter (of the same type as in the other two branches), even though "balance" never makes use of it. This also means that when calling the return value from "makeAccount", even for balance, we have to pass along some parameter that will be ignored.

let methodSelector = fun () ->
    let makeAccount =
        fun (bal : int) ->
            let balance = ref bal
            fun transaction ->
                match transaction with
                | "balance" ->
                    fun _ -> !balance
                | "deposit" ->
                    fun (amt : int) ->
                        balance := (!balance) + amt
                        !balance
                | "withdraw" ->
                    fun (amt : int) ->
                        if amt <= !balance then
                            balance := (!balance) - amt
                            !balance
                        else
                            raise (Exception("Insufficient funds"))
                | _ ->
                    raise (Exception("Unrecognized operation" + transaction))
                            
    Console.WriteLine "=========> Method Selector"
    let acctForEugene = makeAccount 100
    printfn "%d" ((acctForEugene "withdraw") 20)
    printfn "%d" ((acctForEugene "balance") 0)

We can address this required-uniformity-of-access a little bit more consistently with the next pattern element, but whether it's an improvement is debatable.

Yeti (ML):

Nothing new here: the makeAccount function now nests three function literals, just like the JavaScript and Scala ones do. Like the other languages, we use a pattern-match/switch-case construct to decide between the different action strings ("deposit", "withdraw", "balance") and then return the appropriate function literal for further execution. Note that Yeti, like JavaScript, actually has a way of returning an "object" here (a structure, which is a data type that contains one or more named fields, a la objects in JavaScript or case classes in Scala), but since the goal is to remain as faithful as possible to the original Scheme implementation, I stick with the more "functional-only" approach.

makeAccount =
  (do bal:
    var balance = bal;
    do action:
      case action of
        "withdraw": 
          do amt:
            if amt <= balance then
              balance := balance - amt;
              balance
            else
              throw new RuntimeException("Insufficient funds")
            fi
          done;
        "deposit": 
          do amt: 
            balance := balance + amt;
            balance;
          done;
        "balance": 
          do: 
            balance; 
          done;
        _ : throw new RuntimeException("Unknown operation")
      esac
    done;
  done;);

acctForEugene = makeAccount 100;
println ((acctForEugene "withdraw") 20);
println ((acctForEugene "deposit") 20);

Jaskell (Haskell):

C#:

If you stopped reading right here, I wouldn't blame you; this is some ugly C#, without question, particularly considering that there are other ways of accomplishing this same effect without requiring quite so much nesting.

static void MethodSelector()
{
    Func<int, Func<string, Func<int, int>>> makeAccount =
        ((Func<int, Func<string, Func<int, int>>>)((bal) =>
        {
            var balance = bal;
            return (string transaction) =>
                {
                    switch (transaction)
                    {
                        case "deposit":
                            return (int amount) =>
                                {
                                    if (balance >= amount)
                                    {
                                        balance -= amount;
                                        return balance;
                                    }
                                    else
                                        throw new Exception("Insufficient funds");
                                };
                        case "withdraw":
                            return (int amount) =>
                                {
                                    balance += amount;
                                    return balance;
                                };
                        case "balance":
                            return (int unused) =>
                                {
                                    return balance;
                                };
                        default:
                            throw new Exception("Illegal operation");
                    }
                };
        }));
    Console.WriteLine("=============> MethodSelector");
    var acctForEugene = makeAccount(100);
    Console.WriteLine("{0}", acctForEugene("deposit")(20));
    Console.WriteLine("{0}", acctForEugene("withdraw")(20));
    Console.WriteLine("{0}", acctForEugene("balance")(0));
}

Bear in mind, too, that there are some other ways to accomplish what the C# code here tries to do, one using dynamic types (introduced in C# 4.0):

static void MethodSelector2()
{
    Func<int, dynamic> makeAccount = (int bal) =>
    {
        var balance = bal;
        dynamic result = new System.Dynamic.ExpandoObject();
        result.withdraw = (Func<int, int>)((amount) => {
            if (balance >= amount)
            {
                balance -= amount;
                return balance;
            }
            else
                throw new Exception("Insufficient funds");
        });
        result.deposit = (Func<int, int>)((amount) =>
        {
            balance += amount;
            return balance;
        });
        result.balance = (Func<int>)(() => balance);
        return result;
    };

    Console.WriteLine("=============> MethodSelector2");
    var acctForEugene = makeAccount(100);
    Console.WriteLine("{0}", acctForEugene.deposit(20));
    Console.WriteLine("{0}", acctForEugene.balance());
    var acctForTed = makeAccount(100);
    Console.WriteLine("{0}", acctForTed.withdraw(10));
    Console.WriteLine("{0}", acctForTed.balance());
}

... or even using ye old plain ol' Dictionary type, taking a string as a key and yielding Func<> as values for execution:

static void MethodSelector3()
{
    Func<int, Dictionary<string,Func<int,int>>> makeAccount = 
    (int bal) =>
    {
        var balance = bal;
        var result = new Dictionary<string, Func<int,int>>();
        result["withdraw"] = (Func<int, int>)((amount) =>
        {
            if (balance >= amount)
            {
                balance -= amount;
                return balance;
            }
            else
                throw new Exception("Insufficient funds");
        });
        result["deposit"] = (Func<int, int>)((amount) =>
        {
            balance += amount;
            return balance;
        });
        result["balance"] = (Func<int, int>)((unused) => balance);
        return result;
    };

    Console.WriteLine("=============> MethodSelector3");
    var acctForEugene = makeAccount(100);
    Console.WriteLine("{0}", acctForEugene["deposit"](20));
    Console.WriteLine("{0}", acctForEugene["balance"](0));
    var acctForTed = makeAccount(100);
    Console.WriteLine("{0}", acctForTed["withdraw"](10));
    Console.WriteLine("{0}", acctForTed["balance"](0));
}

The second of these two is closer to the strict intent of the Method Selector from the Scheme example, but the first allows for flexible arity (numbers of parameters) in the functions handed back when dereferenced (so that "balance" doesn't have to take a bogus unused parameter). Frankly, had I to choose, I'd probably go with the dynamic version, just because of that flexibility.
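
(Incidentally, the Dictionary flavor translates to Scala almost mechanically--a sketch of my own, not part of the original lineup: a Map from selector strings to closures, all sharing the same captured balance:)

def makeAccount(bal: Int): Map[String, Int => Int] = {
  var balance = bal
  Map(
    "withdraw" -> ((amt: Int) =>
      if (amt <= balance) { balance -= amt; balance }
      else throw new RuntimeException("Insufficient funds")),
    "deposit"  -> ((amt: Int) => { balance += amt; balance }),
    "balance"  -> ((_: Int) => balance)   // same bogus-parameter wart
  )
}

val acctForEugene = makeAccount(100)
println(acctForEugene("deposit")(20))   // 120
println(acctForEugene("balance")(0))    // 120

An unknown selector simply fails with Map's own NoSuchElementException, which plays the role of the default branch.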

Message-Passing Interface

"You have created a Method Selector for a Function as Object. You prefer to use your object in code that has an object-oriented feel. How do you invoke the methods of an object? [P]rovide a simple message-passing interface for using the closure."

Scheme:

Everything in a Lisp is a list, and the Scheme implementation uses that to full effect, taking the argument list passed in to "send" and splitting it up into the object (the account), the message (withdraw/deposit/etc), and the arguments (if any) that are left.

(define send
  (lambda argument-list
    (let ((object  (car argument-list))
          (message (car (cdr argument-list)))
          (args    (cdr (cdr argument-list))))
      (apply (get-method object message) args))
  ))
(define get-method
  (lambda (object selector)
    (object selector)
  ))
(define account-for-eugene (make-account 100))
(send account-for-eugene 'withdraw 50)  => 50
(send account-for-eugene 'deposit 100)  => 150
(send account-for-eugene 'balance)      => 150

JavaScript:

In JavaScript, peeling off the head and tail of the arguments reference is trickier here, because unlike Scheme, JavaScript sees "arguments" as array-like, not as a list. While I could've created "car" and "cdr" functions in JavaScript to perform the relevant operations on an array, it felt more idiomatic to provide a function "slice" to do the "slicing" (which is actually a copy) of elements off the end of the array instead. Arrays do have a built-in "slice" method, but "arguments" isn't a true Array--it's only an array-like object--which is why "arguments.slice" isn't available directly; the standard workaround is to borrow the real method via "Array.prototype.slice.call(arguments, 2)", which would likely be faster than my hand-rolled version.

The other interesting tidbit in here is that when I wrote it the first time, when doing a deposit, the "balance" became "8020", instead of the mathematically-correct "100". Because "send" hands the sliced argument array itself to the returned function, "amount" is actually a one-element array, and JavaScript's "promiscuous typing" decided that the "+" operator wanted to do a string concatenation, instead of a mathematical add of two numbers. So I had to convince it that the value was, in fact, a number, and the easiest way (it seemed to me at the time) was to just do a quick redundant math operation on it (multiply by 10, then divide by 10 again). The more idiomatic fix would be an explicit "Number(amount)" conversion--or having "send" spread the arguments out with "Function.prototype.apply", much as the Scheme version uses "apply".

I also note that getMethod() in JavaScript is a bit unnecessary; we could inline its functionality directly inside of send().

(function() {
  out("messagePassingInterface ========")

  var slice = function(src, start, end) {
    var returnVal = []
    var j = 0
    if (end === undefined)
      end = src.length
    for (var i = start; i < end; i++) {
      if (src.length > i)
        returnVal[j++] = src[i]
    }
    return returnVal;
  }
  
  var makeAccount = function(bal) {
    var balance = bal
    return function(transaction) {
      if (transaction === "withdraw") {
        return function(amount) {
          if (balance >= amount)
            return (balance = (balance - amount))
          else
            throw new Error("Insufficient funds")
        }
      }
      else if (transaction === "deposit") {
        return function(amount) {
          return (balance = (balance + (amount * 10.0 / 10.0)))
        }
      }
      else if (transaction === "balance") {
        return function() {
          return balance
        }
      }
      else {
        throw new Error("Unknown request")
      }
    }
  }
  var getMethod = function(object, selector) {
    return object(selector)
  }
  var send = function(object, message) {
    return (getMethod(object, message))(slice(arguments, 2))
  }
  var acctForEugene = makeAccount(100)
  out(send(acctForEugene, "withdraw", 20)) // 80
  out(send(acctForEugene, "balance"))      // 80
  out(send(acctForEugene, "deposit", 20))  // 100
  out(send(acctForEugene, "balance"))      // 100
})();

Scala:

The Scala version of this follows the JavaScript version in that it works off of a variable-argument list, but since Scala doesn't give us the built-in "arguments" reference, we have to specify it at the method declaration:

  def messagePassingInterface() = {
    def makeAccount(bal : Int) = {
      var balance = bal
      def send(key:String, args:Any*) = {
        key match {
          case "withdraw" => {
            val amt = args.head.asInstanceOf[Int]
            if (balance >= amt) {
              balance = (balance - amt) 
              balance
            }
            else 
              throw new RuntimeException("Insufficient funds")
          }
          case "deposit" => {
            val amt = args.head.asInstanceOf[Int]
            balance += amt
            balance
          }
        }
      }
      send _
    }
    val acctForEugene = makeAccount(100)
    println(acctForEugene("withdraw", 10))
  }

F#:

By now taking an "obj list" for the parameters, we unify all of the calls to the account to take a consistent parameter list that still allows for a flexible number of parameters, but.... It still requires that callers that don't want to pass any arguments have to pass an empty list. And, on top of that, it doesn't really feel "F#-ish".

let messagePassingInterface = fun () ->
    let makeAccount =
        fun (bal : int) ->
            let balance = ref bal
            fun transaction ->
                match transaction with
                | "balance" ->
                    fun _ -> !balance
                | "deposit" ->
                    fun (arglist : obj list) ->
                        let amt = arglist.Head :?> int
                        balance := (!balance) + amt
                        !balance
                | "withdraw" ->
                    fun (arglist : obj list) ->
                        let amt = arglist.Head :?> int
                        if amt <= !balance then
                            balance := (!balance) - amt
                            !balance
                        else
                            raise (Exception("Insufficient funds"))
                | _ ->
                    raise (Exception("Unrecognized operation" + transaction))
    let getMethod = fun (acct : string -> obj list -> int) selector -> acct selector
    let send = 
        fun (acct : string -> obj list -> int) (message : string) (arglist : obj list) ->
            (getMethod acct message)(arglist)

    Console.WriteLine "=========> Message Passing Interface"
    let acctForEugene = makeAccount 100
    printfn "%d" (send acctForEugene "withdraw" [20 :> obj])
    printfn "%d" (send acctForEugene "balance" [])

Note that F# does have language facilities for allowing a variable-argument list to be passed, but it only works on method members:

// From the MSDN documentation
open System

type X() =
    member this.F([<ParamArray>] args: Object[]) =
        for arg in args do
            printfn "%A" arg

[<EntryPoint>]
let main _ =
    // call a .NET method that takes a parameter array, passing values of various types
    Console.WriteLine("a {0} {1} {2} {3} {4}", 1, 10.0, "Hello world", 1u, true)

    let xobj = new X()
    // call an F# method that takes a parameter array, passing values of various types
    xobj.F("a", 1, 10.0, "Hello world", 1u, true)
    0

We could go back and rewrite all of the F# samples to be class member methods (that is, return actual objects), but that sort of gets away from the spirit of what the blog exercise is trying to do, so I'll leave that as an exercise to the reader. (Which, by the way, is author-speak for "I'm feeling lazy and I don't want to bother".)

Yeti (ML):

Unfortunately, while Yeti (like most functional languages) has a built-in list type, it doesn't recognize arguments to a function as a list, so we either have to explicitly put the arguments in, or we have to explicitly state that the arguments to the returned function literal are a list. I choose the latter tactic, even though it's not the world's most impressive syntax:

makeAccount =
  (do bal:
    var balance = bal;
    do action:
      case action of
        "withdraw":
          do argList:
            amt = head argList;
            if amt <= balance then
              balance := balance - amt;
              balance;
            else
              throw new RuntimeException("Insufficient funds")
            fi
          done;
        "deposit": 
          do argList:
            amt = head argList; 
            balance := balance + amt;
            balance;
          done;
        "balance": 
          do: 
            balance; 
          done;
        _ : throw new RuntimeException("Unknown operation")
      esac
    done;
  done;);

acctForEugene = makeAccount 100;
println  ((acctForEugene "withdraw")[20]);  // 80

If there's a way to get a Yeti function to accept a variable number of arguments, I've not seen it in the language overview. I don't know if any ML-derivative has this, to be honest. Of course, the other thing to do, since this is a statically-typed environment, is to just return function literals that expect the proper number of arguments, which will get us the compile-time safety that these languages are supposed to provide; the below does exactly that--the last line will fail to compile if you uncomment it:

makeAccount =
  (do bal:
    var balance = bal;
    do action:
      case action of
        "withdraw":
          do amt:
            if amt <= balance then
              balance := balance - amt;
              balance;
            else
              throw new RuntimeException("Insufficient funds")
            fi
          done;
        "deposit": 
          do amt:
            balance := balance + amt;
            balance;
          done;
        "balance": 
          do: 
            balance; 
          done;
        _ : throw new RuntimeException("Unknown operation")
      esac
    done;
  done;);

acctForEugene = makeAccount 100;
println ((acctForEugene "withdraw") 20);      // 80
println ((acctForEugene "balance") 0);        // 80
//println ((acctForEugene "withdraw") "fred");  // won't compile

(Truthfully, we should do this for the Scala version, too.) This choice is going to cause us a little bit of heartache, though, because in order to use "balance", we have to pass in a number--if we leave off the "_" in the function literal returned from the "balance" arm of the selector, we don't need to pass "0" when we invoke it, but what's returned isn't a number, but a function. I can't figure out how to make Yeti take that function and just invoke it--the syntax guide doesn't seem to say out loud exactly how I can invoke that function without having to pass in a number argument. If I'd left it as taking a list, then I could pass an empty list and all would look consistent, if a little weird.

(Note that this is deliberately opposite what I chose to do for the F# version.)
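
(For contrast, here's a quick hedged sketch in Scala of the zero-argument approach I couldn't coax out of Yeti: the "balance" arm returns a thunk, and the call site just applies it to an empty argument list. Only the "balance" arm is shown, and the names are illustrative, not part of the original pattern.)

def makeBalanceOnlyAccount(bal : Int) : String => (() => Int) = {
  var balance = bal
  (action : String) => action match {
    case "balance" => () => balance
    case _ => throw new RuntimeException("Unknown operation")
  }
}
val acct = makeBalanceOnlyAccount(100)
println(acct("balance")())   // 100 -- no dummy argument needed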

Jaskell (Haskell):

C#:

Generic Function

"You have created a Method Selector for a Function as Object. You want to take full advantage of the tools available in your functional language. How do you invoke the methods of an object? ... [P]rovide a simple interface to the Method Selector that more closely follows the functional style."

Scheme:

In the Scheme implementation, it's interesting that having written the send function in the last element of the pattern language, we don't really use it here, but instead just inline its functionality in each of the named functions (which, in turn, take the argument list, peel off the head of the argument list as the account object, and pass the remainder of the arguments on to the selected function):

(define withdraw
  (lambda argument-list
    (let ((object (car argument-list))
          (withdraw-arguments (cdr argument-list)))
      (apply (object 'withdraw) withdraw-arguments)
    )))
(define deposit
  (lambda argument-list
    (let ((object (car argument-list))
          (deposit-arguments (cdr argument-list)))
      (apply (object 'deposit) deposit-arguments)
    )))
(define balance
  (lambda (object)
    (object 'balance)
  ))
  
(define account-for-eugene (make-account 100))
(withdraw account-for-eugene 10)
(map  (lambda (account) (deposit account 10)) account-for-eugene)

Interestingly enough, I sort of expected the Scheme version to just use "send" directly, rather than write named trampolines like "withdraw" and "deposit" around it--we could have avoided the Generic Function part of the pattern language entirely:

(map  (lambda (account) (send account 'deposit 10)) account-for-eugene)

And, to be honest, calling "map" on a single object doesn't really seem to be a profoundly functional experience, so in my examples I'm going to create a collection of accounts (called a "bank", naturally enough), and map across that collection.

JavaScript:

The JavaScript version of this is, again, pretty similar to the Scheme version. Again, ECMAScript 5 environments are supposed to have a "map" function natively built in, but previous environments don't, so I have to write one to verify that we can, in fact, use the named functions as the mapped operation. I also write a "map2", another version of map that takes the function to apply to the collection but also takes any additional arguments after that and passes them to the function being applied across the collection; it allows me to use "deposit" directly, instead of having to write a trampoline for it, and besides, it's trivial to write in JavaScript:

(function() {
  out("genericFunction ==============")

  var slice = function(src, start, end) {
    var returnVal = []
    var j = 0
    if (end === undefined)
      end = src.length
    for (var i = start; i < end; i++) {
      if (src.length > i)
        returnVal[j++] = src[i]
    }
    return returnVal
  }
  
  var map = function(fn, src) {
    var retVal = []
    for (var i in src)
      retVal[i] = fn(src[i])
    return retVal
  }
  var map2 = function(src, fn) {
    var retVal = []
    for (var i in src)
      retVal[i] = fn(src[i], slice(arguments, 2))
    return retVal
  }
  
  var makeAccount = function(bal) {
    var balance = bal
    return function(transaction) {
      if (transaction === "withdraw") {
        return function(amount) {
          if (balance >= amount)
            return (balance = (balance - amount))
          else
            throw new Error("Insufficient funds")
        }
      }
      else if (transaction === "deposit") {
        return function(amount) {
          return (balance = (balance + (amount * 10.0 / 10.0)))
        }
      }
      else if (transaction === "balance") {
        return function() {
          return balance
        }
      }
      else {
        throw new Error("Unknown transaction: " + transaction)
      }
    }
  }
  var withdraw = function() {
    var object = arguments[0]
    var argumentList = slice(arguments, 1)
    return object("withdraw")(argumentList)
  }
  var deposit = function() {
    var object = arguments[0]
    var argumentList = slice(arguments, 1)
    return object("deposit")(argumentList)
  }
  var balance = function(object) {
    return object("balance")()
  }

  var acctForEugene = makeAccount(100)
  out(withdraw(acctForEugene, 20))
  out(deposit(acctForEugene, 20))
  
  var bank = [
    makeAccount(100),  // acctForEugene
    makeAccount(1000)  // acctForTed
  ]
  map(function(it) { deposit(it, 20) }, bank)
  out(balance(bank[0]))
  out(balance(bank[1]))
  
  map2(bank, deposit, 20)
  out(balance(bank[0]))
  out(balance(bank[1]))
})();

Scala:

Scala, of course, has functional methods built onto its List type (which we can use instead of an array, since Scala has much better support for lists than arrays):

  def genericFunction() = {
    def makeAccount(bal : Int) = {
      var balance = bal
      def send(key:String, args:Any*) = {
        key match {
          case "withdraw" => {
            val amt = args.head.asInstanceOf[Int]
            if (balance >= amt) {
              balance = (balance - amt) 
              balance
            }
            else 
              throw new RuntimeException("Insufficient funds")
          }
          case "deposit" => {
            val amt = args.head.asInstanceOf[Int]
            balance += amt
            balance
          }
          case "balance" => {
            balance
          }
          case _ =>
            throw new RuntimeException("Unknown request")
        }
      }
      send _
    }
    def withdraw(account : (String, Any*) => Int, amount : Int) = {
      account("withdraw", amount)
    }
    def deposit(account : (String, Any*) => Int, amount : Int) = {
      account("deposit", amount)
    }
    def balance(account : (String, Any*) => Int) = {
      account("balance")
    }
    val accounts = List(makeAccount(100), makeAccount(200), makeAccount(300))
    accounts.foreach(withdraw(_, 20))
    accounts.foreach((in) => { println(balance(in)) })
  }

F#:

The F# version helps clean up some of the syntax a little, sort of:

let genericFunction = fun () ->
    let makeAccount =
        fun (bal : int) ->
            let balance = ref bal
            fun transaction ->
                match transaction with
                | "balance" ->
                    fun _ -> !balance
                | "deposit" ->
                    fun (arglist : obj list) ->
                        let amt = arglist.Head :?> int
                        balance := (!balance) + amt
                        !balance
                | "withdraw" ->
                    fun (arglist : obj list) ->
                        let amt = arglist.Head :?> int
                        if amt <= !balance then
                            balance := (!balance) - amt
                            !balance
                        else
                            raise (Exception("Insufficient funds"))
                | _ ->
                    raise (Exception("Unrecognized operation: " + transaction))
    let deposit = 
        fun amt acct->
            acct "deposit" [amt :> obj]
    let withdraw =
        fun amt acct ->
            acct "withdraw" [amt :> obj]
    let balance =
        fun acct ->
            acct "balance" []

    Console.WriteLine "=========> Generic Function"
    let bank = [ makeAccount 100; makeAccount 200; makeAccount 300 ]
    let balances = List.map (fun it -> deposit 20 it) bank
    List.iter (fun it -> printfn "%d" it) balances
    let balances = List.map (deposit 20) bank
    List.iter (printfn "%d") balances

Notice that by putting the account counter-intuitively as the last parameter to the generic "deposit" and "withdraw" functions, we can avoid having to write the "trampoline" function that we would've had to write when using "map"; the account gets curried from the List directly (as shown in the second example). We could do the same thing in the Scala version, too, and then wouldn't have to use the explicit "_" syntax that Scala provides. Of course, if the desire is instead to pass the amount in a curried fashion, instead of the account, then the original ordering of the parameters is better.
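
(Here's a hedged Scala sketch of that rework, reusing makeAccount from the Scala sample above; "depositLast" is an illustrative name, not part of the pattern. With the account as the trailing curried parameter, map can supply it directly, no placeholder "_" required.)

def depositLast(amount : Int)(account : (String, Any*) => Int) : Int =
  account("deposit", amount)

val bank = List(makeAccount(100), makeAccount(200), makeAccount(300))
bank.map(depositLast(20)).foreach(println)   // 120, 220, 320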

Yeti (ML):

Writing this in Yeti/ML is definitely trickier than it was in JavaScript, despite the built-in "map" and other functions, because getting the arguments to "trampoline" right is a little harder. Fortunately, the generic method hides the "balance 0" weirdness from the last pattern element, making it a tad easier to use:

makeAccount =
  (do bal:
    var balance = bal;
    do action:
      case action of
        "withdraw":
          do amt:
            if amt <= balance then
              balance := balance - amt;
              balance
            else
              throw new RuntimeException("Insufficient funds")
            fi
          done;
        "deposit": 
          do amt:
            balance := balance + amt;
            balance
          done;
        "balance": 
          do: 
            balance
          done;
        _ : throw new RuntimeException("Unknown operation")
      esac
    done;
  done;);

withdraw =
  (do acct amt:
    (acct "withdraw") amt;
  done;);
deposit =
  (do acct amt:
    (acct "deposit") amt;
  done;);
balance =
  (do acct:
    acct "balance" 0;
  done;);

acctForEugene = makeAccount 100;
println (withdraw acctForEugene 20);      // 80
println (deposit acctForEugene 20);       // 100
println (balance acctForEugene);          // 100

accounts = [(makeAccount 100), (makeAccount 200), (makeAccount 300)];
balances = map (do acct: (deposit acct 20) done) accounts;
for accounts do acct: println(deposit acct 20) done;

Yeti complained if I didn't bind the result of the "map" call to a value, hence the "balances" value there, even though the balances are actually also stored in the relevant closures for each account. Note that the "for" line that follows it does the same thing, and prints the results out, to boot. In fact, it's high time people started to realize that the "for" loop in most imperative languages is just a non-functional way of doing a "map" without yielding a value. Languages like Scala and Yeti/ML blur that line enough that we should just eschew "for" altogether and use "map", if you ask me.
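
(A quick hedged illustration in Scala of just how thin that line is: "for" without "yield" desugars to "foreach", and "for" with "yield" desugars to "map".)

val xs = List(1, 2, 3)
for (x <- xs) println(x * 2)               // no value yielded; sugar for xs.foreach(...)
val doubled = for (x <- xs) yield x * 2    // List(2, 4, 6); sugar for xs.map(...)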

Jaskell (Haskell):

C#:

Delegation

"You are creating a Function as Object. How do you create a new object that extends the behavior of an existing object? ... [U]se delegation. Make a Function as Object that has an instance variable an instance of the object you want to extend. Implement behaviors specific to the new object as methods in a Method Selector. Pass all other messages onto the instance variable."

Again, in a traditional O-O language, we'd just inherit, and in an object-functional hybrid, we could do the same. There's no real reason not to, to be honest. But the interesting thing about this implementation is that it demonstrates the runtime relationship between a JavaScript object and its prototype: calling a function passing in the "derived" object causes the "derived" to try its "base" (its prototype) in the event that the method in question isn't defined on the "derived".

Note also that this particular trick is really only feasible because the "object" presents a uniform interface: all interaction with the "object" (whether it is a standard account or an interest-bearing one) is done through the Method Selector mechanism, which allows for this extension without having to modify any sort of base interface. This isn't so much a knock on O-O as a whole as it is on statically-typed traditional O-O.
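
(For comparison, a hedged sketch of what "just inherit" looks like in an object-functional hybrid like Scala; the class and member names here are illustrative, not part of the pattern.)

class Account(protected var balance : Int) {
  def withdraw(amt : Int) : Int =
    if (balance >= amt) { balance -= amt; balance }
    else throw new RuntimeException("Insufficient funds")
  def deposit(amt : Int) : Int = { balance += amt; balance }
  def currentBalance : Int = balance
}
// the subclass adds behavior; everything else comes along for free
class InterestBearingAccount(bal : Int, intRate : Double) extends Account(bal) {
  def accrueInterest() : Int = deposit((balance * intRate).toInt)
}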

Scheme:

This is pretty straightforward, if you understood the Message-Passing Interface implementation from earlier.

(define make-interest-bearing-account
  (lambda (balance interest-rate)
    (let ((my-account (make-account balance)))
      (lambda (transaction)
        (case transaction
          ('accrue-interest
            (lambda ()
              ((my-account 'deposit)
                (* ((my-account 'balance))
                   interest-rate)) ))
          (else
            (my-account transaction))
        )))
  ))
(define account-for-eugene (make-interest-bearing-account 100 0.05))
((account-for-eugene 'balance))         => 100
((account-for-eugene 'deposit) 100)     => 200
((account-for-eugene 'balance))         => 200
((account-for-eugene 'accrue-interest)) => 210
((account-for-eugene 'balance))         => 210

JavaScript:

Despite the fact that the JavaScript implementation just keeps getting longer and longer, it's actually not that much harder to add in this delegation functionality--again, as has been the case for a lot of the JavaScript code, it's almost a direct one-to-one port from the Scheme:

(function() {
  out("delegation =======")
  
  var slice = function(src, start, end) {
    var returnVal = []
    var j = 0
    if (end === undefined)
      end = src.length
    for (var i = start; i < end; i++) {
      if (src.length > i)
        returnVal[j++] = src[i]
    }
    return returnVal
  }
  
  var makeAccount = function(bal) {
    var balance = bal
    return function(transaction) {
      if (transaction === "withdraw") {
        return function(amount) {
          if (balance >= amount)
            return (balance = (balance - amount))
          else
            throw new Error("Insufficient funds")
        }
      }
      else if (transaction === "deposit") {
        return function(amount) {
          return (balance = (balance + (amount * 10.0 / 10.0)))
        }
      }
      else if (transaction === "balance") {
        return function() {
          return balance
        }
      }
      else {
        throw new Error("Unknown transaction: " + transaction)
      }
    }
  }
  var makeInterestBearingAccount = function(bal, intRate) {
    var myAccount = makeAccount(bal)
    return function(transaction) {
      if (transaction === "accrueInterest") {
        return function() {
          var balance = myAccount("balance")()
          var interest = (balance * intRate)
          return myAccount("deposit")(interest)
        }
      }
      else
        return myAccount(transaction)
    }
  }
  
  var acctForEugene = makeInterestBearingAccount(100, 0.05)
  out(acctForEugene("balance")())
  out(acctForEugene("deposit")(20))
  out(acctForEugene("accrueInterest")())
  out(acctForEugene("balance")())
})();

Scala:

The Scala version of this is tricky, because it relies on a very subtle bit of Scala syntax; specifically, when we try to pass the "args" sequence (which, in actual implementation, is a WrappedArray) from the "makeInterestBearingAccount" function to the "makeAccount" function (by which I mean, the functions returned from those two functions), if we don't use the peculiar ": _*" syntax, Scala interprets "args" to be a single parameter (a single parameter whose type is a collection), instead of the intended "pass the arguments through" behavior. (If you're a Java or C# developer, it's like having a varargs method calling another varargs method, and passing the array of arguments from the first as an array instead of each element on its own to form the array of arguments in the second. Yeah, I know--it's a little brain-twisty.)
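
(A minimal hedged sketch of just that distinction, with illustrative names:)

def inner(args : Any*) = args.length
def passThrough(args : Any*) = inner(args : _*)   // each element forwarded
def asOneArg(args : Any*) = inner(args)           // whole sequence as a single argument

println(passThrough(1, 2.0, "3"))   // 3
println(asOneArg(1, 2.0, "3"))      // 1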

  def delegation() = {
    def makeAccount(bal : Int) = {
      var balance = bal
      def send(key:String, args:Any*) = {
        key match {
          case "withdraw" => {
            val amt = args.head.asInstanceOf[Int]
            if (balance >= amt) {
              balance = (balance - amt) 
              balance
            }
            else 
              throw new RuntimeException("Insufficient funds")
          }
          case "deposit" => {
            val amt = args.head.asInstanceOf[Int]
            balance += amt
            balance
          }
          case "balance" => {
            balance
          }
          case _ =>
            throw new RuntimeException("Unknown request")
        }
      }
      send _
    }
    def makeInterestBearingAccount(bal : Int, intRate : Double) = {
      val account = makeAccount(bal)
      def send(key: String, args:Any*) = {
        key match {
          case "accrueInterest" => {
            val amt = (int2float(account("balance")) * intRate).toInt
            account("deposit", amt)
          }
          case _ =>
            account(key, args : _*)
        }
      }
      send _
    }
    val acctForEugene = makeInterestBearingAccount(100, 0.05)
    println(acctForEugene("deposit", 20))
    println(acctForEugene("accrueInterest"))
    println(acctForEugene("balance"))
  }

F#:

Aside from the aforementioned weirdness about the obj list as a generic parameter mechanism, this is really straightforward:

let delegation = fun () ->
    let makeAccount =
        fun (bal : int) ->
            let balance = ref bal
            fun transaction ->
                match transaction with
                | "balance" ->
                    fun _ -> !balance
                | "deposit" ->
                    fun (arglist : obj list) ->
                        let amt = arglist.Head :?> int
                        balance := (!balance) + amt
                        !balance
                | "withdraw" ->
                    fun (arglist : obj list) ->
                        let amt = arglist.Head :?> int
                        if amt <= !balance then
                            balance := (!balance) - amt
                            !balance
                        else
                            raise (Exception("Insufficient funds"))
                | _ ->
                    raise (Exception("Unrecognized operation: " + transaction))
    let makeInterestBearingAccount =
        fun (bal : int) (intRate : float) ->
            let account = makeAccount bal
            fun transaction ->
                match transaction with
                | "accrueInterest" ->
                    fun _ ->
                        let balance = (account "balance" [])
                        let interest : float = float balance * intRate 
                        account "deposit" [int interest]
                | _ -> account transaction
    Console.WriteLine "=========> Delegation"
    let acctForEugene = makeInterestBearingAccount 100 0.05
    printfn "%d" (acctForEugene "deposit" [20 :> obj])
    printfn "%d" (acctForEugene "accrueInterest" [])

Note the explicit casts in the "accrueInterest" code: this is because F#, like a lot of functional languages, won't do automatic type-promotion for you. So the "int"s have to be explicitly converted to "float"s, and back again.
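
(By way of contrast, a hedged Scala sketch: Scala will silently widen an Int to a Double in mixed arithmetic, but narrowing back down still has to be explicit, which is why the Scala version needed ".toInt" as well.)

val balance : Int = 100
val interest : Double = balance * 0.05   // Int widened to Double implicitly
val deposited : Int = interest.toInt     // narrowing back must be explicit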

Yeti (ML):

Since we didn't go down the path of trying to do the variable-argument list in Yeti, we don't have the same problems the Scala version presented, and the generic methods (the top-level "withdraw", "deposit" and "balance" functions) actually help hide the syntactic weirdness that we ran into in the last pattern element:

makeAccount =
  (do bal:
    var balance = bal;
    do action:
      case action of
        "withdraw":
          do amt:
            if amt <= balance then
              balance := balance - amt;
              balance
            else
              throw new RuntimeException("Insufficient funds")
            fi
          done;
        "deposit": 
          do amt:
            balance := balance + amt;
            balance
          done;
        "balance": 
          do: 
            balance
          done;
        _ : throw new RuntimeException("Unknown operation")
      esac
    done;
  done;);

withdraw =
  (do acct amt:
    (acct "withdraw") amt;
  done;);
deposit =
  (do acct amt:
    (acct "deposit") amt;
  done;);
balance =
  (do acct:
    acct "balance" 0;
  done;);

acctForEugene = makeAccount 100;
println (withdraw acctForEugene 20);      // 80
println (deposit acctForEugene 20);       // 100
println (balance acctForEugene);          // 100

accounts = [(makeAccount 100), (makeAccount 200), (makeAccount 300)];
balances = map (do acct: (deposit acct 20) done) accounts;
for accounts do acct: println(deposit acct 20) done;

Note the last two lines--the "for" construct in most imperative languages is actually akin to the "map" construct in most functional languages, except that in the imperative "for" there's no return value from the expression, and in a functional "map" there (usually) is. This is why we have to bind the result from the "map" to a name, and we don't have any results from the "for". (The "map" also insists on having a returned value--a list of Unit isn't acceptable, which is what would be returned if we used the "println" expression in the "map".)

Jaskell (Haskell):

C#:

Private Method

"You have created a Method Selector. How do you factor common behavior out of the methods in the Method Selector? ... [D]efine the common code in a Local Procedure (Wallingford). Invoke this procedure in place of the duplicated code within the Method Selector."

Scheme:

(define make-account
  (lambda (balance)
    (let* ((transaction-log '())
           (log-transaction
             (lambda (type amount)
               (set! transaction-log
                     (cons (list type amount)
                           transaction-log)))))
      (lambda (transaction)
        (case transaction
          ('withdraw
            (lambda (amount)
              (if (>= balance amount)
                (begin
                  (set! balance (- balance amount))
                  (log-transaction 'withdraw amount)
                  balance)
                (error "Insufficient funds" balance))))
          ('deposit
            (lambda (amount)
              (set! balance (+ balance amount))
              (log-transaction 'deposit amount)
              balance))
          ...))
    )))

JavaScript:

Again, in JavaScript, we rely on the fact that anything declared inside the "makeAccount" function but outside the function returned by "makeAccount" is encapsulated, and create both the "transactionLog" (an array, since JavaScript likes those better than lists) and the function to append to it ("logTransaction") within that "neutral zone". Just to prove that the transaction log is being written, I add another method to the method selector table, "viewLog", to return the contents of the transaction log.

(function() {
  out("privateMethod ===========")
  
  var makeAccount = function(bal) {
    var transactionLog = []
    var logTransaction = function(type, amount) {
      transactionLog.push("Action: " + type + " for " + amount)
    }
    
    var balance = bal
    return function(transaction) {
      if (transaction === "withdraw") {
        return function(amount) {
          if (balance >= amount) {
            logTransaction("withdraw", amount)
            return (balance = (balance - amount))
          }
          else
            throw new Error("Insufficient funds")
        }
      }
      else if (transaction === "deposit") {
        return function(amount) {
          logTransaction("deposit", amount)
          return (balance = (balance + (amount * 10.0 / 10.0)))
        }
      }
      else if (transaction === "balance") {
        return function() {
          logTransaction("balance", balance)
          return balance
        }
      }
      else if (transaction === "viewLog") {
        return function() {
          return (transactionLog)
        }
      }
      else {
        throw new Error("Unknown transaction: " + transaction)
      }
    }
  }
  var acctForEugene = makeAccount(100)
  out(acctForEugene("withdraw")(20))
  out(acctForEugene("balance")())
  out(acctForEugene("deposit")(20))
  out(acctForEugene("balance")())
  out(acctForEugene("viewLog")())
})();

Scala:

The Scala version is also pretty straightforward--we've already seen that Scala supports nested functions, so it is simply a matter of defining the logTransaction() function and an empty List[String] in the same "neutral zone" in which the "balance" variable lives. Instead of adding a new selector to the list, I chose this time to just print the transaction log as part of the "balance" operation.

  def privateMethod() = {
    def makeAccount(bal : Int) = {
      var balance = bal
      var transactionLog = List[String]()
      def logTransaction(action:String, amount:Int) = {
        val msg = ("Action: " + action + " for " + amount)
        transactionLog = transactionLog :+ msg
      }
      def send(key:String, args:Any*) = {
        key match {
          case "withdraw" => {
            val amt = args.head.asInstanceOf[Int]
            if (balance >= amt) {
              logTransaction("withdraw", amt)
              balance = (balance - amt) 
              balance
            }
            else 
              throw new RuntimeException("Insufficient funds")
          }
          case "deposit" => {
            val amt = args.head.asInstanceOf[Int]
            logTransaction("deposit", amt)
            balance += amt
            balance
          }
          case "balance" => {
            println(transactionLog)
            balance
          }
          case _ =>
            throw new RuntimeException("Unknown request")
        }
      }
      send _
    }
    val acctForEugene = makeAccount(100)
    println(acctForEugene("deposit", 20))
    println(acctForEugene("balance"))
  }

F#:

Binding a local function is, by this point, somewhat trivial and uninspiring, but it's just as easily done in F# as it is in any of the other languages:

let privateMethod = fun () ->
    let makeAccount =
        fun (bal : int) ->
            let transactionLog = ref []
            let logTransaction act (amt : int) =
                let message = "Action: " + act + " for " + amt.ToString()
                transactionLog := List.append !transactionLog [message]
            let balance = ref bal
            fun transaction ->
                match transaction with
                | "balance" ->
                    fun _ -> 
                        List.iter (printfn "%s") !transactionLog
                        !balance
                | "deposit" ->
                    fun (arglist : obj list) ->
                        let amt = arglist.Head :?> int
                        balance := (!balance) + amt
                        logTransaction "deposit" amt
                        !balance
                | "withdraw" ->
                    fun (arglist : obj list) ->
                        let amt = arglist.Head :?> int
                        if amt <= !balance then
                            balance := (!balance) - amt
                            logTransaction "withdraw" amt
                            !balance
                        else
                            raise (Exception("Insufficient funds"))
                | _ ->
                    raise (Exception("Unrecognized operation: " + transaction))
    Console.WriteLine "=========> Private Method"
    let acctForEugene = makeAccount 100
    printfn "%d" (acctForEugene "deposit" [20 :> obj])
    printfn "%d" (acctForEugene "withdraw" [50 :> obj])
    printfn "%d" (acctForEugene "balance" [])

Yeti (ML):

The private method in Yeti is, again, just a nested function hiding out in the closure that is returned by "makeAccount"; the fact that Yeti supports expressions embedded inside of strings makes it easy to create the transaction log string:

makeAccount =
  (do bal:
    var balance = bal;
    var transactionLog is list<string> = [];
    logTransaction action amount = 
      transactionLog := "Action: \(action) for \(amount)" :: transactionLog;
    do action:
      case action of
        "withdraw":
          do amt:
            if amt <= balance then
              logTransaction "withdraw" amt;
              balance := balance - amt;
              balance
            else
              throw new RuntimeException("Insufficient funds")
            fi
          done;
        "deposit": 
          do amt:
            logTransaction "deposit" amt;
            balance := balance + amt;
            balance
          done;
        "balance": 
          do: 
            println transactionLog;
            balance
          done;
        _ : throw new RuntimeException("Unknown operation")
      esac
    done;
  done;);

Jaskell (Haskell):

C#:

Summary

JavaScript is, of course, the de-facto golden child right now.

And Scala is, undoubtedly, one of my favorites. Its syntax is a little quirky in places, but no more so than any other language I've used.

I like the Yeti code style and syntax, and could definitely see doing some small projects in it, particularly some service-y kinds of things, a la Web or REST services; the Yeti source code has some examples of how to create (for example) servlets and WARs, and it's a nice syntax. I don't know that I'd want to create a full-fledged MVC framework on top of Yeti, but as something that's basically taking input, doing processing and sending back JSON or XML results, it's not a bad approach. Considering you can also create classes in Yeti, which puts it on the same ground as F#, it's worth looking into if you've got some ML in your background and want to go back to it while staying on top of the JVM.

The F# version is a nice mix of ML and objects, though the casting operators are definitely a syntax that only a mother could love, and the distinction between what is allowed on functions vs. methods (such as the parameter arrays) feels a little arbitrary at times. (I'm sure there's good reasons for it, but it still feels a little arbitrary, at least to me.) The "cannot close over local variables, use refs instead" rule is also a little annoying, although it does make it explicitly clear that now you're closing over a reference, not the actual value, so now the "what happens if I modify the closed-over value" question becomes self-explanatory. (This sometimes trips people up in other languages that don't make the by-value or by-reference closing-over semantics explicit.)
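
(A hedged Scala sketch of the difference: Scala lets a closure capture a local var directly, so the "ref" cell is implicit and mutation through the closure just works--convenient, but it hides exactly the by-reference semantics that F# forces you to spell out.)

def makeCounter() : () => Int = {
  var count = 0               // captured by the closure; no explicit ref cell
  () => { count += 1; count }
}
val next = makeCounter()
println(next())   // 1
println(next())   // 2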

Honestly, I don't really expect that anyone reading this piece is going to immediately turn around, abandon all their domain objects, and take up this approach as a replacement--in some cases, taking this "all functional" style creates more angst than it really provides benefits--but we can use parts of it to generate some really interesting new patterns.


.NET | C# | F# | Industry | Java/J2EE | Languages | Personal | Scala | Windows

Thursday, December 20, 2012 6:24:55 PM (Pacific Standard Time, UTC-08:00)
 Saturday, December 08, 2012
Scala syntax bug?

I'm running into a weird situation in some Scala code I'm writing (more on why in a later post), and I'm curious to know from my Scala-ish followers if this is a bug or intentional/"by design".

First of all, I can define a function that takes a variable argument list, like so:

    def varArgs(key:String, args:Any*) = {
      println(key)
      println(args)
      true
    }
    varArgs("Howdy")

And this is good.

I can also write a function that returns a function, to be bound and invoked, like so:

    val good1 = (key:String) => {
      println(key)
      true
    }
    good1("Howdy")

And this also works.

But when I try to combine these two, I get an interesting error:

    val bad3 = (key:String, args:Any*) => {
        println(key)
        println(args)
        true
    }
    bad3("Howdy", 1, 2.0, "3")

... which yields the following compilation error:

Envoy.scala:169: error: ')' expected but identifier found.
    val bad3 = (key:String, args:Any*) => {
                                    ^
one error found
... where the "^" is lined up on the "*" in the "args" parameter, in case the formatting isn't clear.

Now, I can get around this by using a named function and returning it as a partially-applied function:

    val good2 = {
      def inner(key:String, args:Any*) = {
        println(key)
        println(args)
        true
      }
      inner _
    }
    good2("Howdy", 1, 2.0, "3")

... but it's a pain. Can somebody tell me why "bad3", above, refuses to compile? Am I not getting the syntax right here, or is this a legit bug in the compiler?


Java/J2EE | Languages | Reading | Scala

Saturday, December 08, 2012 12:20:34 AM (Pacific Standard Time, UTC-08:00)
 Friday, November 30, 2012
On Uniqueness, and Difference

In my teenage formative years, which (I will have to admit) occurred during the 80s, educators and other people deeply involved in the formation of young peoples' psyches laid great emphasis on building and enhancing our self-esteem. Self-esteem, in fact, seems to have been the cause and cure of every major problem suffered by any young person in the 80s; if you caved to peer pressure, it was because you lacked self-esteem. If you dressed in the latest styles, it was because you lacked the self-esteem to differentiate yourself from the crowd. If you dressed contrary to the latest styles, it was because you lacked the self-esteem to trust in your abilities (rather than your fashion) to stand out. Everything, it seemed, centered around your self-esteem, or lack thereof. "Be yourself", they said. "Don't be what anyone else says you are", and so on.

In what I think was supposed to be a trump card for those who suffered from chronically low self-esteem, those who were trying to form us into highly-self-esteemed young adults stressed the fact that because each of us owns a unique strand of DNA, each of us is unique, and therefore each of us is special. This was, I think, supposed to impose on each of us a sense of self-worth and self-value that could be relied upon in the event that our own internal processing and evaluation led us to believe that we weren't worth anything.

(There was a lot of this handed down at my high school, for example, particularly my freshman year when one of my swim team teammates committed suicide.)

With the benefit of thirty years' hindsight, I can pronounce this little experiment/effort something of a failure.

The reason I say this is because it has, it seems, spawned a generation of now-adults who are convinced that because they are unique, that they are somehow different--that because of their uniqueness, the generalizations that we draw about people as a whole don't apply to them. I knew one woman (rather well) who told me, flat out, that she couldn't get anything out of going to therapy, because she was different from everybody else. "And if I'm different, then all of those things that the therapist thinks about everybody else won't apply to me." And before readers start thinking that she was a unique case, I've heard it in a variety of different forms from others, too, on a variety of different topics other than mental health. Toss in the study, quoted in a variety of different psych books, that something like 80% of the population thinks they are "above average", and you begin to get what I mean--somewhere, deep down, we've been led down this path that says "Because you are unique, you are different."

And folks, I hate to burst your bubble, but you're not.

Don't get me wrong, I understand that fundamentally, if you are unique, then by definition you are different from everybody else. But implicit in this discussion of the word "different" is an assumption that suggests that "different" means "markedly different", and it's in that distinction that the argument rests.

Consider this string of numbers for a second:

12345678901234567890123456789012345678901234567890

and this string of numbers:

12345678901234567890123456788012345678901234567890

These two strings are unique, but I would argue that they're not different--in fact, their contents differ by one digit (did you spot it?), but unless you're looking for the difference, they're basically the same sequential set of numbers. Contrast, then, the first string of numbers with this one:

19283746519283746519283746554637281905647382910000

Now, the fact that they are unique is so clear, it's obvious that they are different. Markedly different, I would argue.

If we look at your DNA, and we compare it to another human's DNA, the truth is (and I'm no biologist, so I'm trying to quote the numbers I was told back in high school biology), you and I share about 99% of the same DNA. Considering the first two strings above differ by just 2% (one digit in 50), if you didn't see those two strings as different, then I don't think you can claim to be markedly different from any other human, when you're only half as different from me as those two strings are from each other.

(By the way, this is actually a very good thing, because medical science would be orders of magnitude more difficult, if not entirely impossible, to practice if we were all more different than that. Consider what life would be like if the MD had to study you, your body, for a few years before she could determine whether or not Tylenol would work on your biochemistry to relieve your headache.)

But maybe you're one of those who believes that the difference comes from your experiences--you're a "nurture over nature" kind of person. Leaving all the twins' research aside (the nature-ists' final trump card, a ton of research that shows twins engaging in similar actions and behaviors despite being raised in separate households, thus providing the best isolation of nature and nurture while still minimizing the variables), let's take a small quiz. How many of you have:

  1. kissed someone not in your family
  2. slept with someone not in your family
  3. been to a baseball game
  4. been to a bar
  5. had a one-night stand
  6. had a one-night stand that turned into "something more"
... we could go on, probably indefinitely. You can probably see where I'm going with this--if we look at the sum total of our experiences, we're going to find that a large percentage of our experiences are actually quite similar, particularly if we examine them at a high level. Certainly we can ask the questions at a specific enough level to force uniqueness ("How many of you have kissed Charlotte Neward on September 23rd 1990 in Davis, California?"), but doing so ignores a basic fact that despite the details, your first kiss with the man or woman you married has more in common with mine than not.

If you still don't believe me, go read your horoscope for yesterday, and see how much of that "prediction" came true. Then read the horoscope for yesterday for somebody born six months away from you, and see how much of that "prediction" came true. Or, if you really want to test this theory, find somebody who believes in horoscopes, and read them the wrong one, and see if they buy it as their own. (They will, trust me.) Our experiences share far more in common--possibly to the tune of somewhere in the high 90th percentiles.

The point to all of this? As much as you may not want to admit it, just because you are unique does not make you different. Your brain reacts the same ways as mine does, and your emotions lead you to make bad decisions in the same ways that mine does. Your uniqueness does not in any way exempt you from the generalizations that we can infer based on how all the rest of us act, behave, and interact.

This is both terrifying and reassuring: terrifying because it means that the last bastion of justification for self-worth, that you are unique, is no longer a place you can hide, and reassuring because it means that even if you are emotionally an absolute wreck, we know how to help you straighten your life out.

By the way, if you're a software dev and wondering how this applies in any way to software, all of this is true of software projects, as well. How could it not? It's a human exercise, and as a result it's going to be made up of a collection of experiences that are entirely human. Which again, is terrifying and reassuring: terrifying in that your project really isn't the unique exercise you thought it was (and therefore maybe there's no excuse for it being in such a deep hole), and reassuring in that if/when it goes off the rails into the land of dysfunction, it can be rescued.


Conferences | Development Processes | Industry | Personal | Reading | Social

Friday, November 30, 2012 10:03:48 PM (Pacific Standard Time, UTC-08:00)
 Wednesday, November 28, 2012
On Knowledge

Back during the Bush-Jr Administration, Donald Rumsfeld drew quite a bit of fire for his discussion of knowledge, in which he said (loosely paraphrasing) "There are three kinds of knowledge: what you know you know, what you know you don't know, and what you don't know you don't know". Lots of Americans, particularly those who were not kindly disposed towards "Rummy" in the first place, took this to be canonical Washington doublespeak, and berated him for it.

I actually think that was one of the few things Rumsfeld said that was worth listening to, and I have a slight amendment to the statement; but first, let's level-set and make sure we're all on the same page about what those first three categories mean, in real life, with a few assumptions along the way to simplify the discussion (as best we can, anyway):

  1. What you know you know. This is the category of information that the individual in question has studied to some level of depth: for a student of International Relations (as I was), this would be the various classes that they took and received (presumably) a passing grade in. For you, the reader of my blog, that would probably be some programming language and/or platform. This is knowledge that you have, in some depth, at a degree that most people would consider "factually accurate".
  2. What you know you don't know. This is the category of information that the individual in question has heard about, but has never studied to any level or degree: for the student of International Relations, this might be the subject of biochemistry or electrical engineering. For you, the reader of my blog, it might be certain languages that you've heard of, perhaps through this blog (Erlang, F#, Scala, Clojure, Haskell, etc) or data-storage systems (Cassandra, CouchDB, Riak, Redis, etc) that you've never investigated or even sat through a lecture about. This is knowledge that you realize you don't have.
  3. What you don't know you don't know. This is the category of information that the individual in question has never even heard about, and so therefore, by definition, has not only the lack of knowledge of the subject, but lacks the realization that they lack the knowledge of the subject. For the student of International Relations, this might be phrenology or Schrodinger's Cat. For you, the reader of my blog, it might be languages like Dylan, Crack, Brainf*ck, Ook, or Shakespeare (which I'm guessing is going to trigger a few Google searches) or platforms like BeOS (if you're in your early 20's now), AmigaOS (if you're in your early 30's now) or database tools/platforms/environments like Pick or Paradox. This is knowledge that you didn't realize you don't have (but, paradoxically, now that you know you don't have it, it moves into the "know you don't know" category).

Typically, this discussion comes up in my "Pragmatic Architecture" talk, because an architect needs to have a very clear realization of what technologies and/or platforms are in which of those three categories, and (IMHO) push as many of them from category #3 (don't know that you don't know) into category #2 (know you don't know) or, ideally, category #1 (know you know). Note that category #1 doesn't mean that you are the world's foremost expert on the thing, but you have some working knowledge of the thing in question--I don't consider myself to be an expert on Cassandra, for example, but I know enough that I can talk reasonably intelligently to it, and I know where I can get more in the way of details if that becomes important, so therefore I peg it in category #1.

But what if I'm wrong?

See, here's where I think there's a new level of knowledge, and it's one I think every software developer needs to admit exists, at least for various things in their own mind:

  • What you think you know. This is knowledge that you believe, in your heart of hearts, you have about a given subject.

Be honest with yourself: we've all met somebody in this industry who claims to have knowledge/expertise on a subject, and damn if they can't talk a good game. They genuinely believe, in fact, that they know the subject in question, and speak with the confidence and assurance that comes with that belief. (I'm assuming that the speaker in question isn't trying to deliberately deceive anyone, which may, in some cases, be a naive and/or false assumption, but I'm leaving that aside for now.) But, after a while, it becomes apparent, either to themselves or to the others around them, that the knowledge they have is either incorrect, out of date, out of context, or some combination of all three.

As much as "what you don't know you don't know" information is dangerous, "what you think you know" information is far, far more so, particularly because until you demonstrate to yourself that your information is actually correct, you're a danger and a liability to anyone who listens to you. Without regularly challenging yourself to some form of external review/challenge, you'll never exactly know whether what you know is real, or just made up from your head.

This is why, at every turn, your assumption should be that any information you have is partly or wholly incorrect until proven otherwise. Find out why you know something--what combination of facts/data led you to believe that this is the case?--and you will quickly begin to discover whether that knowledge is real, or just some kind of elaborate self-deception.


Conferences | Development Processes | Industry | Personal | Review | Social

Wednesday, November 28, 2012 6:13:45 PM (Pacific Standard Time, UTC-08:00)
 Friday, November 23, 2012
On Tech, and Football

Today was Thanksgiving in the US, a holiday that is steeped in "tradition" (if a country with less than three hundred years of history can be said to have traditions, anyway). Americans gather in their homes with friends and family, prepare an absurdly large meal centered around a turkey, mashed potatoes, gravy, and "all the trimmings", and eat. Sometimes the guys go outside and play some football before the meal, while the gals drink wine and/or margaritas and prep the food, and the kids escape to video games or nerf gun wars outside, and so on.

One of these traditions commonly associated with this holiday is the National Football League (NFL, to those of you not familiar with American football): there is always a game on, and for whatever reason (tradition!), usually the game (or one of the games, if there's more than one--today there were three) is between the Dallas Cowboys and the Washington Redskins. I don't have the statistics handy, but I think those two teams have played on Thanksgiving like every year for the last four decades (or something like that).

This year, the Washington Redskins defeated the Dallas Cowboys 38-31. Apparently, it was quite the blowout in the second quarter, when Washington's rookie quarterback, Robert Griffin III, threw three touchdown passes in one quarter, then one more later in the game to become the first quarterback in Washington franchise history to throw back-to-back four-TD games. ESPN has all the details, if you're interested. What you won't find in that news report, however, is far more important than what you will find. For all the praise heaped on RGIII (as Mr. Griffin is known in sports circles), you will not hear one very interesting factoid:

RGIII is black.

So, it turns out, is Michael Vick (Philadelphia). So is Byron Leftwich (Pittsburgh's backup QB), as is Charlie Batch (the backup for Pittsburgh now that Leftwich is down for the season with an injury). In fact, despite the fact that no team in the NFL had a starting black quarterback just twenty or thirty years ago, the issue of race is pretty much "done" in the NFL: nobody cares what the race of the players is anymore, unless the player themselves makes an issue of it. After Doug Williams, the first black quarterback to win a Super Bowl, people just kinda... stopped caring.

What does this have to do with tech?

People have been making a big deal out of the lack of women (and minority, though women get better press) speakers in the software industry. This post, for example, implicitly suggests that somehow, women aren't getting the opportunities that they deserve:

Where are these opportunities? You don't see the opportunities that no one offers you. You don't see the suggestions, requests for collaboration, invitations to the user group, that didn't happen.

Where are these obstacles? Also invisible. They're a lack of inclusion, and of a single role model. They're not having your opinion asked for technical decisions. They're an absence of sponsorship -- of people who say in management meetings "Jason would make a great architect." Jason doesn't even know someone's speaking up for him, so how could Rokshana know she's missing this?

You can't see what isn't there. You can't fight for what you can't see.

I take issue with a couple of these points. Not everyone deserves the opportunity: sometimes an opportunity isn't handed to you not because you're a woman, but because you're not willing to go after it. Look, as much as we may want to pretend that everybody is equal, that everybody can produce the same results given the same inputs, if you put a football in my hand and ask me to make the throw 85 yards down the field into a target area that's about the diameter of your average trash can, I'm not going to generate the same results that RGIII can. He's bigger than me, stronger than me, faster than me, and so on. What's more, even if I put the same kinds of hours into practicing and training and bodybuilding and so forth, he's still going to get the nod, because he's been aggressive about pursuing the opportunities that gave people the confidence to put the ball in his hands in the fourth quarter. Me? Not so much. It wasn't that I didn't have the opportunities, it's that I chose not to take them when those opportunities arose.

Some people choose to not see opportunities. Some people choose other opportunities--when the choice comes down to staying a few extra hours to get stuff done at work, versus going home to spend time with your family, regardless of which one you choose, that choice will have consequences. The IT worker who chooses to stay will often be rewarded by being given opportunities to pursue additional opportunities at work and/or promotions and/or recognition; the one who chooses to go home will often be rewarded by a deeper connection to their family. The one who stays gets labeled "workaholic"; the one who goes home gets labeled "selfish" or "not committed to the project". Toh-may-toh, toh-mah-toh.

I don't care what gender you are--this choice applies equally to you.

Contrary to what the other blogger seems to imply, there is no secret "Men's IT Success Club", identifying promising members and giving them the necessary secret training to succeed. Nobody ever held a hand out to me and said, "Dude, you're smart. You should get ahead in life--let me help you get there." I had to take risks. I had to put myself out there. I got lucky, in a lot of ways, but don't for a second think that it was all me or it was all luck, it was a combination of the two. When I was sitting in meetings, as just a Programmer I, I had to weigh very carefully the risks of speaking up in the meeting or keeping quiet. Speaking up gets you noticed--and if you're wrong, you get shot down very quickly. Staying quiet lets you fly under the radar and avoids humiliation, but also doesn't get your boss' attention or demonstrate that you have a strong grasp of the situation.

I don't care what gender you are--this choice applies equally to you.

Sure, maybe someone will notice you and offer you that hand up. Someone will recognize your talents and say, "Damn, I think you'd be good at this, are you interested?", and if you say yes, smooth the road for you and mentor you and give you opportunities that would've taken you years otherwise to create for yourself. But notice, at the front of that sentence, I said, "Someone will recognize your talents", and in the middle I said, "if you say yes". Your talents have to be on display, and you have to say yes. Neglecting either of these will remove those opportunities. Not taking the risk to show off your talents takes away the opportunity. Not taking the risk by saying yes takes away the opportunity.

Frankly, I'm appalled that she says we have to:

  1. Create explicit opportunities to make up for the implicit ones minorities aren't getting. Invite women to speak, create minority-specific scholarships, make extra effort to reach out to underrepresented people.
  2. Make conscious effort to think about including everyone on the team in decisions. Don't always go with your gut for whom to invite to the table.
  3. Don't interrupt a woman in a meeting. (I catch myself doing this, now that I know it's a problem.) Listen, and ask questions.
  4. If you are a woman, be the first woman in the room. We are the start of making others feel like they belong.
My thoughts in response, in order:
  1. I call bull. The call for speakers should always be color- and gender-blind. If a woman speaker wants to be taken seriously, she has to be chosen to speak because she is a good speaker, not because she has boobies. To offer women speakers a lower bar means essentially that she's still not equal, that she's there only because she's a woman and "we need to have a few of those to liven the place up". Yep, that's 1950's sexism talking, and it horrifies me that someone could suggest that with a straight face. Particularly someone who hasn't had to scrabble her way into conferences like other speakers have had to.
  2. I call bull. There are some decisions that are appropriate for the entire team to make, there are some decisions that only the team leads and/or architects should make, and there are some decisions that are best made by someone within the team who has the technical background to make them--for example, asking me about CSS or which client-side JavaScript library to use is rather foolish, since I don't really have the background to make a good call. RGIII doesn't ask the offensive linemen where he should throw the ball, and they don't ask him how they should react to the hand slap that the defensive end throws out as he tries to go around them. No one should be deliberately excluded from a conversation they can contribute to, no, but then again, no one should be included in meetings for which they have no expertise. Want to be in on that meeting? Develop the expertise first, then look for the chance to demonstrate it--those chances are always there, if you look for them.
  3. Don't interrupt a woman in a meeting? How about, don't interrupt ANYONE in a meeting? If interruptions are a sign of disrespect, then those signs should be removed regardless of gender. If interruptions are just a way that teams generate flow (and I believe they are, based on my own experiences), then artificially establishing that rule turns the woman into an artificial barrier to the "form/storm/norm" process.
  4. If you are a woman, then sure, keep an eye out for the other women in the room who may want to be where you are now. But if you're a man, keep an eye out for the other men in the room who seek the same opportunities, and help them. If you're black, keep an eye out for the other blacks, Asians for the other Asians, and... Well, wait, no, come to think of it, women could mentor men, and men could mentor women, and blacks could mentor Asians and Asians blacks, and... How about you just keep your eyes open for anyone who shows the talent and drive, and reward that with your offer of mentorship and aid?

Within the NFL, a rule (the "Rooney Rule") was established demanding that teams interview at least one minority candidate for any open head coaching position; it was a rule designed to make sure that blacks and other minorities could make it into the very top rungs of coaching. Today, I'm guessing somewhere between a quarter and a third of the NFL teams are led by a minority head coach. But no such rule, to my knowledge, has ever been passed about which players are taken for which positions. Despite the adage of a few decades ago that "blacks aren't cerebral enough to play quarterback", I'm guessing that about a quarter to a third of the quarterbacks in the league are black, and several have won a Super Bowl. This, despite absolutely no artificial aids designed to help them.

Women in IT don't need special rules or special favors. They don't need some kind of corporate return to chivalry--they're not some kind of "weaker sex" that needs special help. If a woman today wants to become a speaker, the opportunities are there. Maybe it's not a keynote session at a 20,000-person industry-spanning show, but hey, not a lot of men get those opportunities, either. Some opportunities are earned, not just offered. So rather than trying to force organizations to offer opportunities to women, maybe women should look to themselves and ask, "What do I need to do to earn that opportunity?" Instead of insisting that women be given a handout, insist that everyone be given an equal chance, based on merit, not genital plumbing.

Because then, it's a choice, and one you can make for yourself.


Conferences | Industry | Personal | Social

Friday, November 23, 2012 12:51:12 AM (Pacific Standard Time, UTC-08:00)
Comments [5]  | 
 Saturday, November 03, 2012
Cloud legal

There's an interesting legal interpretation coming out of the Electronic Frontier Foundation (EFF) around the Megaupload case; the EFF has said this:

"The government maintains that Mr. Goodwin lost his property rights in his data by storing it on a cloud computing service. Specifically, the government argues that both the contract between Megaupload and Mr. Goodwin (a standard cloud computing contract) and the contract between Megaupload and the server host, Carpathia (also a standard agreement), "likely limit any property interest he may have" in his data. (Page 4). If the government is right, no provider can both protect itself against sudden losses (like those due to a hurricane) and also promise its customers that their property rights will be maintained when they use the service. Nor can they promise that their property might not suddenly disappear, with no reasonable way to get it back if the government comes in with a warrant. Apparently your property rights "become severely limited" if you allow someone else to host your data under standard cloud computing arrangements. This argument isn't limited in any way to Megaupload -- it would apply if the third party host was Amazon's S3 or Google Apps or or Apple iCloud."
Now, one of the participants on the Seattle Tech Startup list, Jonathan Shapiro, wrote this as an interpretation of the government's brief and the EFF filing:

What the government actually says is that the state of Mr. Goodwin's property rights depends on his agreement with the cloud provider and their agreement with the infrastructure provider. The question ultimately comes down to: if I upload data onto a machine that you own, who owns the copy of the data that ends up on your machine? The answer to that question depends on the agreements involved, which is what the government is saying. Without reviewing the agreements, it isn't clear if the upload should be thought of as a loan, a gift, a transfer, or something else.

Lacking any physical embodiment, it is not clear whether the bits comprising these uploaded digital artifacts constitute property in the traditional sense at all. Even if they do, the government is arguing that who owns the bits may have nothing to do with who controls the use of the bits; that the two are separate matters. That's quite standard: your decision to buy a book from the bookstore conveys ownership to you, but does not give you the right to make further copies of the book. Once a copy of the data leaves the possession of Mr. Goodwin, the constraints on its use are determined by copyright law and license terms. The agreement between Goodwin and the cloud provider clearly narrows the copyright-driven constraints, because the cloud provider has to be able to make copies to provide their services, and has surely placed terms that permit this in their user agreement. The consequences for ownership are unclear. In particular: if the cloud provider (as opposed to Mr. Goodwin) makes an authorized copy of Goodwin's data in the course of their operations, using only the resources of the cloud provider, the ownership of that copy doesn't seem obvious at all. A license may exist requiring that copy to be destroyed under certain circumstances (e.g. if Mr. Goodwin terminates his contract), but that doesn't speak to ownership of the copy.

Because no sale has occurred, and there was clearly no intent to cede ownership, the Government's challenge concerning ownership has the feel of violating common sense. If you share that feeling, welcome to the world of intellectual property law. But while everyone is looking at the negative side of this argument, it's worth considering that there may be positive consequences of the Government's argument. In Germany, for example, software is property. It is illegal (or at least unenforceable) to write a software license in Germany that stops me from selling my copy of a piece of software to my friend, so long as I remove it from my machine. A copy of a work of software can be resold in the same way that a book can be resold because it is property. At present, the provisions of UCITA in the U.S. have the effect that you do not own a work of software that you buy. If the district court in Virginia determines that a recipient has property rights in a copy of software that they receive, that could have far-reaching consequences, possibly including a consequent right of resale in the United States.

Now, whether or not Jon's interpretation is correct, this reading of the cloud carries some huge legal implications, because data "ownership" is going to be the defining legal issue of the next century.


.NET | Android | Azure | C# | C++ | Conferences | Development Processes | F# | Flash | Industry | iPhone | Java/J2EE | Languages | LLVM | Mac OS | Objective-C | Parrot | Personal | Python | Reading | Review | Ruby | Scala | Security | Social | Solaris | Visual Basic | VMWare | WCF | Windows | XML Services | XNA

Saturday, November 03, 2012 12:14:40 AM (Pacific Daylight Time, UTC-07:00)
Comments [0]  | 
 Thursday, November 01, 2012
Vietnam... in Bulgarian

I received an email from Dimitar Teykiyski a few days ago, asking if he could translate the "Vietnam of Computer Science" essay into Bulgarian, and no sooner had I replied in the affirmative than he sent me the link to it. If you're Bulgarian, enjoy. I'll try to make a few moments to put the link to the translation directly on the original blog post itself, but it'll take a little bit--I have a few other things higher up in the priority queue. (And would somebody please tell me how to say "Thank you" in Bulgarian, so I can do that properly for Dimitar?)


.NET | Android | C# | Conferences | Development Processes | F# | Industry | Java/J2EE | Languages | Objective-C | Python | Reading | Review | Ruby | Scala | Visual Basic | WCF | XML Services

Thursday, November 01, 2012 4:17:58 PM (Pacific Daylight Time, UTC-07:00)
Comments [1]  | 
 Sunday, October 21, 2012
On JDD2012

There aren't many times that I cancel out of a conference (fortunately), so when I do I often feel a touch of guilt, even if I have to cancel for the best of reasons. (I'd like to think that if I have to cancel my appearance at a conference, it's only for the best of reasons, but obviously there may be others who disagree--I won't get into that.)

The particular case that merits this blog post is my lack of appearance at the JDD 2012 show (JDD standing for "Java Developer Days") in Krakow, Poland. Don't get me wrong, I love that show--Krakow is a fun city, quickly establishing itself as a university town (hellooo night clubs and parties!) as well as something of a Polish Silicon Valley, or so I've been told. (Actually, I think Krakow has a long history as a university town; it's the tech angle that's fairly recent.) My previous trips there have always been wonderful experiences, and when the organizers and I discussed my attendance at this year's show back at the start of the calendar year, I was looking forward to it.

Unfortunately, my current employer took issue with my European travels, stating something to the effect that "three trips to Europe in five weeks' time is not a great value for us". Couple that with the fact that a US speaker I didn't particularly want to be around was going to the show (a speaker I helped get there, ironically), and the fact that I'd be walking off the plane from London just before having to get back on a plane to Krakow... *shrug* It was just a little too much all at once. Regretfully, I emailed Slawomir (the organizer) and told him I was going to have to cancel.

Any one of these, I'd have bulled my way through. Two of them, I probably still would have shown up. But all three... I just decided that the divine heavens had spoken, and I should take the message and stay home. And let the message be very clear here: there is no fault or blame for this decision to be laid anywhere but at my feet--if you're at JDD now and you're pissed that I'm not there, then you should blame me, not the organizers. (But honestly, with Rebecca Wirfs-Brock and Adam Bien there, you're getting some top-notch content, so you probably won't even miss me.)

And yes, assuming I haven't burned a bridge with the organizers (and I think we're all good on that score), I sincerely hope to be back there in 2013; Polish attendees and conference organizers are off the hook when it comes to making a speaker feel welcome.


Android | Conferences | Industry | Java/J2EE | Languages | Personal | Scala

Sunday, October 21, 2012 12:12:07 AM (Pacific Daylight Time, UTC-07:00)
Comments [0]  | 
 Tuesday, October 16, 2012
On NFJS

As the calendar year comes to a close, it's time (it's well past time, in fact) that I comment publicly on my obvious absence from the No Fluff, Just Stuff tour.

In January, when I emailed Jay Zimmerman, the organizer of the conference, to talk about topics for the coming year, I got no response. This is pretty typical Jay--he is notoriously difficult to reach over email, unless he has something he wants from you. In his defense, that's not an uncommon modus operandi for a lot of people, and it's pretty common to have to email him several times to get his attention. It's something I wish he were a little more professional about, but... *shrug* The point is, when I emailed him and got no response, I didn't think much of it.

However, as soon as the schedule for the early part of the year came out, a friend of mine on the tour emailed me to ask why I wasn't scheduled for any of the shows--I responded with a rather shocked "Wat?" and checked for myself--sure enough, I was nowhere on the tour. I emailed Jay, and... cue the "Sounds of Silence" melody.

Apparently, my participation was no longer desired.

Now, in all fairness, last year I joined Neudesic, LLC as a full-time employee, working as an Architectural Consultant, and I mentioned to Jay that I was interested in scaling back my participation from all the shows (25 or so across the year) to maybe 15 or so, but at no point did I intend to give him the impression that I wanted to pull off the tour entirely. Granted, the travel schedule is brutal--last year (calendar year 2011) it wasn't uncommon for me to be doing three talks each day (Friday, Saturday and Sunday), and living in Seattle usually meant that I had to use all day Thursday to fly out to wherever the show was being held, and could sometimes return on Sunday night but more often had to fly back on Monday, making for a pretty long weekend. But I enjoyed hanging with my speaker buddies, I enjoyed engaging with the crowds, and I definitely enjoyed the "aha" moments that would fire off inside my head while speaking. (I'm an "external processor", so talking out loud is actually a very effective way for me to think about things.)

Across the year, I got a few emails and Tweets from people asking about my absence, and I always tried to respond to those as fairly and politely as I could without hiding the fact that I wished I were still there. In truth, folks, I have to admit, I enjoy having my weekends back. I miss the tour, but being off of it has made me realize just how much family time I was missing when I was off gallivanting across the country to various hotel conference rooms to talk about JVMs or languages or APIs. I miss hanging with my speaker friends, but friends remain friends regardless of circumstance, and I'm happy to say that holds true here as well. I miss the chance to hone my ideas and talks, but that in and of itself isn't enough to justify missing out on my 13-year-old's football games or just enjoying a quiet Saturday with my wife on the back porch.

All in all, though I didn't ask for it, my rather unceremonious "boot" to the backside off the tour has worked out quite well. Yes, I'd love to come back to the tour and talk again, but that's up to Jay, not me. I wouldn't mind coming back, but I don't mind not being there, either. And, quite honestly, I think there are probably more than a few attendees who are a bit relieved that I'm not there, since sitting in on my sessions always meant running the risk of being singled out publicly, which I've been told is something of a "character-building experience". *grin*

Long story short, if enough NFJS attendee alumni make the noise to Jay to bring me back, and he offers it, I'd take it. But it's not something I need to do, so if the crowds at NFJS are happy without me, then I'm happy to stay home, sip my Diet Coke, blog a little more, and just bask in the memories of almost a full decade of the NFJS experience. It was a hell of a run, and I'm very content with having been there from almost the very beginning and helping to make that into one of the best conference experiences anyone's ever had.


Android | Conferences | Development Processes | F# | Industry | iPhone | Java/J2EE | Languages | LLVM | Mac OS | Scala | Social | Solaris | Windows

Tuesday, October 16, 2012 3:11:31 AM (Pacific Daylight Time, UTC-07:00)
Comments [0]  | 
 Friday, October 12, 2012
On Equality

Recently (over the last half-decade, so far as I know) there's been a concern about the numbers of women in the IT industry, and in particular the noticeable absence of women leaders and/or industry icons in the space. All of the popular languages (C, C++, Java, C#, Scala, Groovy, Ruby, you name it) have been invented by or are represented publicly by men. The industry speakers at conferences are nearly all men. The rank-and-file that populate the industry are men. And this strikes many as a bad thing.

Honestly, I used to be a lot more concerned than I am today. While I'm sure that many will see the statements and position that follow as misogynistic and/or discriminatory, let me be the first to suggest quite plainly that I have nothing against any woman who wants to be a programmer, who wants to be an industry speaker, or who wants to create a startup and/or language and/or library and/or framework and/or tool and/or fill any other role of leadership and authority within the industry. I have always felt that this industry is more merit-based than any other I have ever had direct or indirect contact with. There is no need for physical strength, there is no need for dexterity or mobility, there is no need for any sort of physical stress tolerance (such as the G forces fighter pilots incur during aerial combat--which, by the way, women are actually scientifically better at handling than men), and there is really no reason that somebody who is physically challenged couldn't excel here. So long as you can type (or, quite frankly, have some other mechanism by which you can put characters into an IDE), you can program.

And no, I have no illusions that somehow men are biologically wired to be better leaders. In fact, I think that as time progresses, we will find that the stereotypical characteristics we ascribe to each of the genders (male competitiveness and female nurturing) each serve incredibly useful purposes in the IT world. Cathi Gero, for example, was once referred to by a client in my presence as "the Mom of the IT department"--by which they meant, Cathi would simply not rest until everything was exactly as it should be, a characteristic that they found incredibly comforting and supportive. Exactly the kind of characteristic you would want from a highly-paid consultant: that they will stick with you through all the mess until the problem is solved.

And no, I also have no illusions that I somehow understand what it's been like to be a woman in IT. I've never experienced the kind of "automatic discrimination" that women sometimes describe, like being mistaken for a recruiter at a technical conference rather than a programmer. I won't even begin to try and pretend that I know what that's like.

Unless, of course, I can understand it by analogy, such as when a woman sees me walking down the street, and crosses the street ahead of me so that she won't have to share the sidewalk, for even a second, with a long-haired, goateed six-foot-plus stranger. She has no reason to assume I represent any threat to her other than my physical appearance, but still, her brain makes the association, and she chooses to avoid even the possibility of a threat. Still, that's probably not the same.

What I do think, quite bluntly, is that one of the reasons we don't have more women in IT is because women simply choose not to be here.

Yes, I know, there are dozens of stories of misogynistic behavior at conferences, and dozens more stories of discriminatory behavior. Dozens of stories of "good ol' boys" behavior making women feel isolated, and dozens of stories of women feeling like they had to over-compensate for their gender in order to be heard and respected. But for each conference story where a woman felt offended by a speaker's use of a sexual epithet or joke, there are dozens of conferences where no such story ever emerges.

I'm reminded of a story, perhaps an urban myth, of a speaker at a leadership conference who stood in front of a crowd, took a black marker, made a small circle in the middle of a flip chart, and asked a person in the first row what they saw. "A black spot", they replied. A second person said the same thing, and a third. Finally, after about a half-dozen responses of "a black spot", the speaker said, "All of you said you saw the same thing: a black spot. I'm curious as to why none of you saw the white background behind it."

It's easy for us to focus on the outlier and give that attention. It's even easier when we see several of them, and if they come in a cluster, we call it a "dangerous trend" and "something that must be addressed". But how easy it is, then, to miss the rest of the field, in the name of focusing on the outlier.

My ex-apprentice wants us to proactively hire women instead of men in order to address this lack:

Bring women to the forefront of the field. If you're selecting a leader and the best woman you can find is not as qualified as the best man you can find, (1) check your numbers to make sure unintentional bias isn't working against her, and (2) hire her anyway. She is smart and she will rise to the occasion. She is not as experienced because women haven't been given these opportunities in the past. So give it to her. Next round, she will be the most qualified. Am I advocating affirmative action in hiring? No, I'm advocating blind hiring as much as is feasible. This has worked for conferences that do blind session selection and seek out submissions from women. However, I am advocating deliberate bias in favor of a woman in promotions, committee selection, writing and speaking solicitation, all technical leadership positions. The small biases have multiplied until there are almost no women in the highest technical levels of the field.
But you can't claim that you're advocating "blind hiring" while you're saying "hire her anyway" if she "is not as qualified as the best man you can find". This is, by definition, affirmative action, and while it does put women into those positions, it doesn't address the underlying problem--that she isn't as qualified. There is no reason that she shouldn't be as qualified as the man, so why are we giving her a pass? Why is it this company's responsibility to fix the industry at a cost to itself? (I'm assuming, of course, that there is a lost-productivity or lost-innovation or some other cost to not hiring the best candidate they can find; if no such loss exists, then there's no basis for assuming that she isn't as qualified as the man.)

Did women routinely get "railroaded" out of technical directions (math and science) and into "softer" areas (English and fine arts) in schools back when I was a kid? Yep. Studies prove that. My wife herself tells me that she was "strongly encouraged" to take more English classes than math or science back in junior high and high school, even when her grades in math and science were better than those in English. That bias happened. But does it happen with girls today? Studies I'm reading about third-hand suggest not appreciably. And even if you were discriminated against back then, what stops you now? If you're reading this, you have a computer, so what stops you from pursuing that career path? Programming today is not about math and science--it's about picking up a book, downloading a free SDK and/or IDE, and diving in. My background was in International Relations--I was never formally trained, either. Has it held me back? You betcha--there are a few places that refused to hire me because I didn't have the formal CS background to be able to select the right algorithm or do big-O analysis. Didn't seem to stop me--I just went and interviewed someplace else.

Equality means equality. If a woman wants to be given the same respect as a man, then she has to earn it the same way he does, by being equally qualified and equally professional. It is this "we should strengthen the weak" mentality that leads to soccer games with no score kept, because "we're all winners". That in turn leads to children who can't handle it when they actually do lose at something, which they must, eventually, because life is not fair. It never will be. Pretending otherwise just does a disservice to the women who have put in the blood, sweat, and tears to achieve the positions of prominence and respect that they have earned.

Am I saying this because I worry that preferential treatment for women speakers at conferences and in writing will somehow mean there are fewer opportunities for me, a man? Some will accuse me of such, but those who do probably don't realize that I turn down more conferences than I accept these days, and more writing opportunities as well. In fact, regardless of your gender, there are dozens, if not hundreds, of online portals and magazines that are desperate for authors to write quality work--if you're at all stumped trying to write for somebody, then you're not trying very hard. And every week user groups across the country are canceled for lack of a speaker--if you're trying to speak and you're not, then you're either setting your bar too high ("If I don't get into TechEd, having never spoken before in my life, it must be because I'm a woman, not that I'm not a qualified speaker!") or you're really not trying ("Why aren't the conferences calling me about speaking there?").

If you're a woman, and you're thinking about a career in IT, more power to you. This industry offers more opportunity and room for growth than any other I've yet come across. There are dozens of meetings and meetups and conferences springing up to encourage you and help you earn that distinction. Yes, as you go you will want and/or need help. So did I. You need people who will help you sharpen your skills and improve your abilities, yes. But a specific and concrete bias in your favor? No. You don't need somebody's charity.

Because if you do, then it means that you're admitting that you can't do it on your own, and you aren't really equal. And that, I think, would be the biggest tragedy of the whole issue.

Flame away.


Conferences | Development Processes | Industry | Personal | Reading | Security | Social

Friday, October 12, 2012 2:17:22 AM (Pacific Daylight Time, UTC-07:00)
Comments [2]  | 
Blogging Again

Readers of this blog will not be surprised when I say that I've neglected it recently--partly because I've been busy, partly because I've got other opportunities to give volume to my voice through the back-cover editorial in CoDe Magazine. But I feel a little guilty about it, and yes, I've noticed that my readership numbers have gone down, which, I must admit, bothers me. Fortunately, there is an easy remedy--blog more.

And, it sort of goes without saying, if anybody out there is still listening and has particular subjects they'd like to see me address, take a shot and let me know, via email or comments. After all, sometimes even the most experienced authors can use a little inspiration.




Friday, October 12, 2012 12:39:52 AM (Pacific Daylight Time, UTC-07:00)
Comments [1]  | 
 Thursday, May 10, 2012
Microsoft is to Monopolist as Apple is to….

Remember the SAT and its ridiculous analogy questions? “Apple : Banana as Steak : ???”, where you have to figure out the relationship between the first pair in order to guess what the relationship in the second pair should be? (Of course, the SAT guys give you a multiple-choice answer, whereas I’m leaving it open to your interpretation.)

What triggers today’s blog post is this article that showed up in GeekWire, about how Mozilla is accusing Microsoft of anticompetitive behavior, claiming IE will have an unfair advantage on Microsoft’s new ARM-based machines.

Anderson says the situation has antitrust implications. Microsoft has agreed to abide by a set of principles to maintain a level playing field on Windows for competitors despite the expiration of its consent decree with the U.S. Justice Department.

OK, wait a second here. Last time I checked, there’s another operating system out there that completely and entirely prevents any third-party browser engine from being deployed on it, which strikes me as grossly anticompetitive, and yet Mozilla chooses to fire its guns at Microsoft, who is attempting to take a shot at the ARM market?

Seems to me like somebody’s either not getting the point of “anticompetitive”, or else they’re just taking a potshot at the company that everybody loves to hate because it’s an easy shot. If Mozilla is really serious about anticompetitive concerns, they will ask the DOJ to investigate Apple’s iOS (which owns, what, 2500% of the tablet market) and App Store, not Microsoft’s IE on a market that doesn’t even exist yet.

Otherwise, I call bullshit.


.NET | Android | C# | C++ | Industry | iPhone | Java/J2EE | Mac OS | Windows

Thursday, May 10, 2012 11:58:12 AM (Pacific Daylight Time, UTC-07:00)
Comments [4]  |