 Friday, January 03, 2014
Tech Predictions, 2014

Here we go again: the annual review of last year's predictions, and a set of new ones for the new year.

2013 Retrospective

Without further ado, first we examine last year's Gregorian prognostications:

  • THEN:"Big data" and "data analytics" will dominate the enterprise landscape.

    NOW: Yeah, it was a bit of a slam dunk breakaway kind of call, but it clearly counts. Vendors and consulting companies were climbing all over themselves to talk about "big data", and startups basing their existence on gathering, analyzing, displaying and (theoretically) offering insight from "big data" were all the rage in the startup community, such as local startup Predixion (CTO'ed by a buddy of mine). If you live anywhere in the Pacific Northwest, chances are there's a similar kind of startup within spitting distance of you right now. 1-0.

  • THEN: NoSQL buzz will start to diversify.

    NOW: It didn't happen quite as much as I'd expected, but the various vendors are, in fact, starting to use terms other than "NoSQL" to define themselves. In particular, we're seeing database vendors (MongoDB, Neo4J, Cassandra being my principal examples) talking about being a "document database" or a "graph database" instead of being a "NoSQL" database, though they're fairly quick to claim the NoSQL tag when it comes to differentiating against the traditional relational database. Since I said "start" to diversify, I'm going to take the win. 2-0.

  • THEN: Desktops increasingly become niche products.

    NOW: Well, this one is hard to call. Yes, desktop sales have plummeted, but it's hard to see what those remaining sales are being used for. I will point out that the Mac Pro, with its radically-different internal construction, definitely puts a new spin on the desktop, but I'm not sure that this counts. Since I took the benefit of the doubt on the last one, I'll forgo it on this one. 2-1.

  • THEN: Home servers will start to grow in interest.

    NOW: I wish I had sales numbers to go with some of this stuff, as hard evidence, but the fact that many people are using their console devices (XBox, XBoxOne, PS3, PS4, etc) as media servers means I missed the boat on this one. I think we may still see home servers continue to rise, but the clear trend has been to make the console gaming device into a server, and not purchase servers on their own to serve as media servers. 2-2.

  • THEN: Private cloud is going to start getting hot.

    NOW: Meh. I see certain cloud vendors talking about private cloud, but for the most part the emphasis is still on public cloud. 2-3. Not looking good for the home team.

  • THEN: Oracle will release Java8, and while several Java pundits will decry "it's not the Java I love!", most will actually come to like it.

    NOW: Well, let's start with the fact that Java8 actually didn't ship this year. And even that delay, which I would have guessed would be a hugely-debated and hotly-contested choice, really went by without much fanfare or complaint, except from some of the usual hard-liner complaint sources. Which means one of two things: either (a) it's finally come to pass that most of the people developing on top of the JVM really don't care about the Java language's growth anymore, or (b) the community felt as Oracle's top engineering brass did, that getting this release "right" was far better than getting it out on the promised deadline. And while I agree with the (b) group on that, it still means that the prediction was way off. 2-4.

  • THEN: Microsoft will start courting the .NET developers again.

    NOW: Quite frankly, this one got left in the dust almost the moment that Ballmer's retirement was announced. Whatever emphasis the company as a whole might have put into courting .NET developers back into the fold was immediately shelved, at least until a successor comes in to take Ballmer's place and decide what kind of strategy the company as a whole will pursue. Granted, the individual divisions within Microsoft, most notably DevDiv, continue to try and woo the developer community, but that was always going to be the case. However, the lack of any central "push" from the company effectively meant that the perceived "push" against .NET in favor of WinRT was almost immediately left behind, and the subsequent declaration of the Surface's failure (and Surface was by far the most public and prominent of the WinRT-based devices) from most corners meant that most .NET developers who cared about this breathed a sigh of relief and no longer felt those Microsoft cyclical Darwinian crosshairs (the same ones that claimed first C programmers, then C++ programmers, then COM programmers) on their back. Still, no points. 2-5.

  • THEN: Samsung will start pushing themselves further and further into the consumer market.

    NOW: And boy, howdy, did they. Samsung not only released several new versions of their various devices into the market, but they've also really pushed their consumer electronics in other form factors, too, such as their TVs and such. If there is a rival to Apple in the consumer electronics department, it is clearly Samsung, and the various court cases and patent violation filings are obvious verification of that. 3-5.

  • THEN: Apple's next release cycle will, again, be "more of the same".

    NOW: Can you say "iPhone 5c", and "iPad Air", boys and girls? Even iOS7 is basically the same OS, with a skinnier font and--oh, wow, innovation!--nested folders. 4-5.

  • THEN: Visual Studio 2014 features will start being discussed at the end of the year.

    NOW: Microsoft tossed me a major curve ball with their announcement of quarterly releases, and the subsequent release of Visual Studio 2013, and as a result, we haven't really seen the traditional product hype cycle out of the Microsoft DevDiv that we're used to. Again, how much of that is due to internal confusion over how to project their next-gen products out into the world without a clear Ballmer successor, and how much of that was planned from the beginning isn't clear, but either way, we ain't heard a peep outta nobody about C# 6 at all in 2013, so... 4-6.

  • THEN: Scala interest wanes.

    NOW: If anything, the opposite took place--Typesafe, Scala's owner/pimp/corporate backer, made some pretty splashy headlines within the JVM community, and lots of people talked a lot about it in places where Scala wasn't being discussed before. We can argue about whether that indicates just a strong marketing effort (where before Typesafe's formation there really was none) or actual growth in acceptance, but either way, I can't claim that it "waned", so the score becomes 4-7.

  • THEN: Interest in native languages will rise.

    NOW: Again, this one is hard to judge. There's been some uptick in interest in those native languages (Rust, Go, etc), and certainly there's been some interesting buzz around some kind of Phoenix-like rise of C++, but none of it has really made waves within the mainstream that I can see. (Then again, I don't really spend a lot of time in those areas where native languages would have made a larger mark, so this could be observer's contextual bias at work here.) That said, more native-based languages are emerging, and certainly Apple's interest and support in LLVM (which, contrary to its name, is not really a "virtual machine", per se) can be seen as such, but not enough to make me feel comfortable saying I got this one right. 4-8.

  • THEN: Hardware is the new platform.

    NOW: Surface was a bust. Chromebooks hardly registered on anybody's radar. Dell threw out an arguable Surface-killer tablet, but for most consumer-minded folks it never even crossed their minds, it seems. Hardware may be the new platform, and certainly we're seeing a lot of non-x86-based machines continuing their race into consumers' hands, but most consumers don't think twice about the hardware as much as they do the visible projection of that hardware choice, in the operating system. (Think about it this way: when you go buy a device, do you care about the CPU, or the OS--iOS, Android, Windows8--running it?) 4-9.

  • THEN: APIs for lots of things are going to come out.

    NOW: Oh, my, yes. More on this later, but for now... 5-9.

Well, with a final tally of 5 "rights" to 9 "wrongs", clearly my 2013 record was about as win-filled as the Baltimore Ravens' 2013 record. *sigh* Oh, well, can't win 'em all every year, right?

2014 Predictions

Now, though, let's do the fun part: What does Ted think 2014 has in store for us geeky types?

  • iOS, Android and Windows8 start to move into your car. Audi has already announced this. Ford announced this last year with their SDK release. Frankly, with all the emphasis on "wearable tech" and "alternative tech", this seems a natural progression, considering how much time Americans, at least, spend in their cars. What, exactly, people will want software developers to do with this capability remains entirely unclear to me (and, it seems, to everybody else, given the lack of apps for the Ford SDK so far), but auto manufacturers will put it into their 2015 models just because their competitors are starting to, and the auto industry is one place where you cannot be seen as not keeping up with the neighbors.
  • Wearable tech hypes up (with little to no actual adoption or innovation). The Samsung Smart Watch is out, one of nearly a dozen models introduced in 2013. Then there was Google Glass. And given that the tech industry is a frequent "hype it before we even barely know it's going to work" kind of place, this one seems like another fast breakaway layup kind of claim. Note that I fully expect that what we see offered will, in time, be as hip and as cool as the original Newton, meaning that these first iterations will be stumblin', fumblin', bumblin' attempts to try and see what anybody can do with these things to make them even remotely useful, and that unless you like living on the very edge of techno-geekery, there'll be absolutely zero reason for anyone to get one for at least calendar year 2014.
  • Apple's gadgets will be more of the same. Same one as last year: iPhone, iPad, iPod, MacBook, they're all going to be incremental refinements on what we see already. There will be no AppleTV, there will be no iWatch, there will be no radical new laptop-ish thing. Apple is clearly the market leader, and they are clearly in the grips of the Innovator's Dilemma, and they have no clear challenger (yet) that threatens to dethrone them, leaving them with no reason to shake up the status quo.
  • Android market consolidates further around Samsung and Motorola. The Android consumer market has slowly been collapsing around those two manufacturers, and I don't see any reason for that trend to change. Yes, other carriers will continue to offer Android on their devices, and yes, other device manufacturers will continue to put Android on their devices, and yes, Android will continue to appear on things other than tablets and phones, but as far as the consumer electronics world goes, the Android market will be classified as Samsung, Motorola, and everybody else.
  • We'll see one iOS release, two minor Android releases, and maybe two Windows8 minor releases. The players are basically set, the game plans are already in play, and nobody appears to have any kind of major game-changing feature set in the wings. 2014 will be a year of minor releases, tweaks to the existing systems and UIs, minor software improvements, and so on. I can't see the mobile market getting any kind of major shock or surprise this year.
  • Windows 8/8.1/9/whatever gains a little respect, but no market share gains. Windows8 as a tablet OS has been quietly gathering some converts, particularly among those who didn't find themselves on the WindowsStore-only SurfaceRTs, and as such, I think the "Windows line" will begin to gather more "critics' choice" kinds of respect, but that's not going to translate into much in the way of consumer sales. Unfortunately for the Microsoftians, Windows as of yet doesn't demonstrate any kind of compelling reason to choose it over the other two market leaders (iOS and Android), and until that happens, Windows8, as a device OS, remains a distant third and always will.
  • UI/UX emphasis is going to start moving to "alternate" input streams. Microsoft's Kinect has demonstrated that gesture is a viable input technology. Google Glass demonstrated that eyeballs can drive a UI. Voice commands are making their way into console gaming/media devices (XBox, etc). This year, enterprise and business developers, looking for ways to make a splash and justify greater research budgets, are going to start experimenting with how those "alternative" kinds of input can be utilized in non-gaming scenarios. Particularly when combined with the rise of automobiles offering programmable SDKs/platforms (see above), this represents a huge, rich area for exploration.
  • Java-the-language starts to see a resurgence of "mojo". Java8 will ship this year--not even God Himself could stop that at this point. And once it does, Java-the-language will see a revitalization as developers who've been flirting with Groovy, Scala, Clojure, and other lambda-supporting languages but can't use them on the job start to bring those ideas into Java APIs. Google's already been doing this with Guava, but now many of those ideas--already percolating in some libraries--will explode into common usage. (For a concrete taste of the shift I'm describing, see the first sketch just after this list.)
  • Meanwhile, this will be a quiet year for C#. The big news coming out of Microsoft, "Roslyn", the "compiler-as-a-service" rewrite of the C# and Visual Basic compilers, won't be of much use to most developers on a practical level, and as a result, this will likely be a pretty quiet year for C# and VB.
  • Functional languages will remain "hipster" tools that most people can't use. Haskell remains far out of reach for most developers, and that's the most approachable of the various functional languages discussed. (Don't even get me started on Julia, Pure, Clean, or any of the others.) As much as I wish to the contrary, this is also likely to remain true for several of the "hybrid" languages, like Scala, F#, and Clojure, though I do think they will see some modest growth as some of the upper-echelon development community begins to grok them. Those who understand them will continue to do some amazing things with them, but this is not the year I would suggest starting a business with anything "functional" as part of its business model, because not only will it be difficult to find developers who can use those tools, but trying to sell developer-facing things with those tools at the core will find a pretty dry and dusty market.
  • Dynamic languages will see continued growth and success. Ruby doesn't look to be slowing down, Node/JavaScript only looks to get more hyped, and Objective-C remains the dominant language for doing iOS development, which itself doesn't look to be slowing down. More importantly, I think we're going to start to see a rise in hybrid "static/dynamic" languages, wherein developers can choose (based on the way they write their code) compiler enforcement as they wish. Between the introduction of "invokedynamic" in Java7 (and its deeper use in Java8), and "dynamic" in C# getting some serious exercise in the Oak framework, I'm becoming more and more convinced that having a language that supports both static and dynamic typing capabilities represents the best compromise between those two poles of software development languages. (The second sketch after this list shows the kind of runtime method resolution I'm talking about.) That said, neither Java nor C# "gets it all the way right" on this topic, and I suspect that somewhere out there, there's a language hacker who's got a few ideas that he or she will trot out and make us all go "Doh!"
  • HTML 5 "fragmentation" will start to echo in the industry. Unfortunately, HTML 5 is not the same thing to all browsers, and those who are looking to HTML 5 as a way to save them from platform differences are going to start to feel some pain. That, in turn, is going to lead to some backlash as they are forced to deal with something they thought they were going to be saved from.
  • "Mobile browsers" become just "browsers". With the explosive growth of devices (tablets and phones) and the explosive growth of the capabilities of those devices (processor(s), memory, and so on), the need for a "crippled" or "low-end-optimized" browser has effectively gone the way of the Dodo bird. As a result...
  • "Mobile web" starts a slow, steady slide into irrelevancy. ... sites optimized for "mobile" browsing experiences--which represents a non-trivial development effort in most cases--will start to drop away, mostly due to neglect. Instead...
  • "Responsive web" becomes the new black. ... we'll see web sites using CSS frameworks (among other tools) to build user interfaces that adjust themselves to the physical viewsizes and input capabilities of the target browser. Bootstrap is an obvious frontrunner here for building said kinds of user interfaces, but don't be surprised if a number of other CSS and JavaScript frameworks to achieve the same ends start to spring up.
  • Microsoft fails to name a Ballmer successor. Yeah, this one's a stretch. It's absolutely inconceivable that they wouldn't. And yet, in all honesty, I can't see the Microsoft board finding somebody that meets Bill's approval from outside of the company, and I can't imagine anyone inside of the company who isn't somehow "tainted" by the various internecine wars that have been fought since Bill's departure. It is, quite frankly, a mess, and I don't know that it'll be cleaned up before this time next year. It would be a horrible result were that to be the case, by the way, but... *shrug* I dunno. Pretty clearly, whoever it is, is going to have a monumental task in front of them.
  • "Programmable Web" becomes an even bigger thing, leading companies to develop APIs that make no sense to anybody. Right now, as we spin up 2014, it's become the fashionable thing to build your website not as an HTML-based presentation layer website, but as a series of APIs. For some companies, that makes sense; for others, though, that is going to be a seductive Siren song that leads them to a huge development effort for little payoff. Note, I think almost all companies have interesting data and/or resources that can be exposed as APIs that would lead to some powerful mashups--I'm not arguing otherwise. But what I think we're about to stumble into is the cargo-culting blind obedience to the letter of the idea that so many companies undertake when a new concept hits the industry hard, as "Web APIs" are doing now.
  • Five new single-page JavaScript MVC application frameworks will ship and gather interest. For those of you who know me from the Java world, remember the 2000s and the huge glut of open-source Web frameworks that led us all into analysis paralysis for a half-decade or more? I see absolutely no reason why the exact same thing isn't already under way in the JavaScript Web framework world, with the added delicious twist that in the JavaScript world, we can do it on BOTH the client AND the server. Let the forking begin.
  • Apple's MacPro machine inspires dozens of knock-off clones. When the MacBook came out, silver-metal cases with chiclet keyboards suddenly appeared all over the PC clone market. When the MacBook Air came out, suddenly thin was in. What on Earth makes us think that the trashcan-sized MacPro desktop/server isn't going to have exactly the same effect?
  • Desktop machine sales creep slightly higher. Work this through with me before you shoot it down out of hand: Tablet sales are continuing to skyrocket, and nothing seems to change that. But people still need to produce stuff (reports, articles, etc), and that really requires a keyboard. But if tablets are easier to consume data on the road, you're more likely to carry your tablet instead of your laptop (and most people--myself wildly excluded--don't like carrying more than one or at most two devices with them). Assuming that your mobile workload is light enough that you can "get by" using your tablet, and you don't want to carry a laptop *and* a tablet, you're more likely to leave your laptop at home or at work. Once your laptop is a glorified workstation, why pay that added premium for a laptop at all? In other words, I think people are going to start doing this particular math, and while tablets will continue to eat away at the "I need a mobile computing solution" sales, desktops are going to start to eat away at the "I need a computing solution for my desk" sales. Don't believe me? Look around the office at all the workstations powered by laptops already, and then start looking at whether those laptops are actually being used as laptops, and whether that mobility need could, in fact, be replaced by a far lighter tablet. It's a stretch, and it may not hit in 2014, but I really think that the world is going to slowly stratify into an 80/20 split of tablets and desktops.
  • Dozens of new "cloud" platforms will be introduced, and most of them will remain entirely irrelevant behind the "Big Three". Lots of the big players are going to start tossing out their version of a cloud platform if they haven't already (HP, Oracle, IBM, I'm looking at you), and smaller players are going to start offering "cloud" platforms of their own (a la Rackspace), but fundamentally, the cloud will remain a "Big Three" place: Amazon's AWS, Microsoft's Azure, and Google's Cloud Platform.
  • We will never see any kind of official announcement, much less actual working prototypes, around Amazon's "Drone Delivery" program ever again. Sure, Jeff made a splash when he announced it. Sure, it resonates with the geek crowd. Sure, it seems like a spiffy idea on paper. Do you have any idea of how much infrastructure and overhead (and potential for failure that has nothing to do with geeks deploying "anti-drone defenses") would be involved? No way. What's more, Amazon is not really in the shipping business (as the all-but-failed Amazon "deliver groceries to your front door" program highlights), but in the "We'll sell it to you and ship it through somebody else" business. It's a cool idea, but it'll never, ever, EVER, see the light of day.
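
A quick aside, to make that Java8 bullet concrete: below is a minimal, hypothetical sketch (the class and variable names are invented for illustration) contrasting the Guava-era anonymous-class idiom with the lambda idiom I expect to explode into common usage once Java8 lands.

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;
    import java.util.stream.Collectors;

    public class LambdaSketch {
        // The shape of Guava's Function<F,T>, redeclared here to keep this self-contained.
        interface Function<F, T> { T apply(F input); }

        public static void main(String[] args) {
            List<String> names = Arrays.asList("ted", "bob", "charlie");

            // Pre-Java8, Guava-era idiom: wrap the behavior in an anonymous class.
            Function<String, String> shout = new Function<String, String>() {
                public String apply(String input) { return input.toUpperCase(); }
            };
            List<String> before = new ArrayList<>();
            for (String n : names) before.add(shout.apply(n));

            // Java8 idiom: the same behavior as a method reference over a Stream.
            List<String> after = names.stream()
                .map(String::toUpperCase)
                .collect(Collectors.toList());

            System.out.println(before.equals(after)); // prints "true"
        }
    }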
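
And for the hybrid static/dynamic bullet, a rough illustration of the JVM machinery I'm referring to: "invokedynamic" itself lives at the bytecode level, but the java.lang.invoke API underneath it can be driven directly. This is just a sketch, not any particular dynamic language's actual implementation, of resolving and calling a method by name at runtime:

    import java.lang.invoke.MethodHandle;
    import java.lang.invoke.MethodHandles;
    import java.lang.invoke.MethodType;

    public class DynamicCallSketch {
        public static void main(String[] args) throws Throwable {
            // A dynamic language doesn't know at compile time that the receiver
            // is a String; it resolves "length" against the actual runtime type.
            Object receiver = "hello";

            MethodHandle length = MethodHandles.lookup().findVirtual(
                receiver.getClass(),                // resolved at runtime: String
                "length",                           // the method name is just data here
                MethodType.methodType(int.class));  // signature: ()int

            // invoke() applies the standard conversions, so the Object-typed
            // receiver is fine; invokeExact() would demand exact static types.
            System.out.println((int) length.invoke(receiver)); // prints "5"
        }
    }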
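
Finally, on the "Programmable Web" bullet: the kind of "expose your data as an API" effort I'm talking about can start astonishingly small. Here's a hedged sketch using the JDK's built-in com.sun.net.httpserver server; the /api/books endpoint and its one-element payload are invented purely for illustration.

    import com.sun.net.httpserver.HttpServer;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;
    import java.nio.charset.StandardCharsets;

    public class ApiSketch {
        public static void main(String[] args) throws Exception {
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

            // Instead of rendering an HTML presentation layer, hand back raw data.
            server.createContext("/api/books", exchange -> {
                byte[] body = "[{\"title\":\"Effective Enterprise Java\"}]"
                    .getBytes(StandardCharsets.UTF_8);
                exchange.getResponseHeaders().set("Content-Type", "application/json");
                exchange.sendResponseHeaders(200, body.length);
                try (OutputStream out = exchange.getResponseBody()) {
                    out.write(body);
                }
            });

            server.start(); // then: GET http://localhost:8080/api/books
        }
    }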

As always, thanks for reading, and keep this channel open--I've got some news percolating about my next new adventure that I'm planning to "splash" in mid-January. It won't be too surprising, but it's exciting (at least to me), and hopefully represents an adventure that I can still be... uh... adventuring... for years to come.


.NET | Android | Azure | C# | C++ | Conferences | Development Processes | F# | Flash | Industry | iPhone | Java/J2EE | Languages | LLVM | Mac OS | Objective-C | Parrot | Personal | Python | Reading | Review | Ruby | Scala | Security | Social | Solaris | Visual Basic | VMWare | WCF | Windows | XML Services | XNA

Friday, January 03, 2014 12:35:25 AM (Pacific Standard Time, UTC-08:00)
 Monday, August 19, 2013
Programming Interviews

Apparently I have become something of a resource on programming interviews: I've had three people tell me they read the last two blog posts, one because his company is hiring and he wants his people to be doing interviews right, and two more expressing shock that I still get interviewed--which I don't really think is all that fair, more on that in a moment--and relief that it's not just them getting grilled on areas that they don't believe to be relevant to the job--and more on that in a moment, too.

A couple of things have emerged in the last few weeks since the saga described earlier, so I thought I'd wrap the thing up with a final post. Besides, I like things that come in threes.

First, go see this video. Jonathan pinged me about it shortly after the second blog post came out, and damn if he and Mitch don't nail a bunch of things directly on the head. Specifically, I want to call out two lists they put into their slides (which I can't find online, or I'd include a link, sorry).

One, what are the things you're trying to answer in an interview? They call it out as three questions an interviewer or interview team is seeking to answer:

  1. Can they do the job?
  2. Will they be motivated?
  3. Would they get along with the team?
Personally, #2 to me is a red herring--frankly, I expect that if you, the candidate, take a job with my company, then either you have determined that you will be motivated to work here, or else you can force yourself to be. I don't really expect you to be the company cheerleader (unless, of course, I'm hiring you for that role), but I do expect professionalism: that you will be at work when you are scheduled or expected to be, that you will do quality work while you are there, and that you will look to make the best decisions possible given the information you have at the time. Motivation is not something I should be interviewing for; it's something you should be bringing.

But the other two? Spot-on.

And this brings me to my interim point: I'm not opposed to a programming test. I think I gave the impression to a number of readers that I think that I'm too good or too famous or whatever to be tested on my skills; that's the furthest thing from the truth. I think you most certainly should be verifying that I have the technical chops to do the job you want me to do; what I do want to suggest, however, is that for a number of candidates (myself included), there are ways to determine my technical chops without forcing me to stand at a whiteboard and code with a pen. For some candidates, you can examine their GitHub profile and see how many repos they have that're public (and have a look through some of the code they wrote). In fact, what I think would be a great interview question would be to look at a repo they haven't touched in a year, find some element of the code inside there, and ask them to explain what they were thinking when they wrote it. If it's well-documented, or if it's simple code, they'll be able to do that fairly quickly (once they context-swap to the codebase--got to give them time to remember, after all). If it's a complex or tricky bit, and they can't explain it...

... well, you just learned something about the code they write, now didn't you?

In my case, I have no public GitHub profile to choose from, but I'm an edge case, in that you can also watch my videos, and/or read my books and articles. Granted, there's a chance that I have amazing editors who save me from incredible stupidity and make me look good... but what are the chances that somebody is doing that for over a decade, across several technology platforms, and all without any credit? Probably pretty close to nil, IMHO. I'm not unique in this case--there's others whose work more or less speaks for itself, and I think you're disrespecting the candidate if you don't do your homework on the interview first.

Which, by the way, brings up another point: As an interviewer, you have a responsibility to do your homework on the candidate before they walk in the door, particularly if you're expecting them to have done their homework on your firm. Don't waste my time (and yours, particularly since yours is probably a LOT more expensive than mine, considering that a lot of companies are doing "interview loops" these days with a team of people, and all of their time adds up). If you're not going to take my candidacy seriously, why should I take your job or job offer or interview seriously?

The second list Jon and Mitch call out is their "interviewing antipatterns" list:

  • The Riddler
  • The Disorienter
  • The Stone Tablet
  • The Knuth Fanatic
  • The Cram Session
  • Groundhog Day
  • The Gladiator
  • Hear No Evil
I want you to watch the video, so I'm not going to summarize each here; go watch it. If you're in a position of doing hiring, ask yourself how many of those you yourself are perpetrating.

Second, go read this article. I don't like that he has "Dig into algorithms, data structures, code organization, simplicity" as one of his takeaways, because I think most interviewers are going to see "algorithms" and "data structures" and stop there, but the rest seems pretty spot-on.

Third, ask yourself the critical question: What, exactly, are we doing wrong? You think you're an agile organization? Then ask yourself how much feedback you get on your interviewing process, and how you would know if you screwed it up. Yes, you will know if you hire a bad candidate, but how will you know if you're letting good candidates go? Maybe you're the hot company that everybody wants to work at, and you can afford to throw some wheat out with the chaff a few times, but you're not going to be in that position for long if you do, and more importantly, you're not going to be in that position forever, period. If you don't start trying to improve your hiring process now, by the time you need to, it'll be too late.

Fourth, practice! When unit-testing came out, many programmers said, "I don't need to test my code, my code is great!", and then everybody had a good laugh at their expense. Yet I see a lot of companies say essentially the same thing about their hiring and interview practices. How do you test an interview process? Easy--interview yourselves. Work with known-good conditions (people you know, people who work with you already, and so on), and run them through the process, but with the critical stipulation that you must treat them exactly as you would a candidate. If you look at your tech lead and say, "Yeah, this is where I'd ask you a technical question, but I already know...", then unless you're prepared to do that for your candidates, you're cheating yourself on the feedback. It's exactly like saying, "Yeah, this is where I'd write a test checking to see how we handle a null in that second parameter, but I already know...". If you're not prepared to do the latter, don't do the former. (And if you are prepared to do the latter, then I probably don't want to work with you anyway.)
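
Since I leaned on that null-in-the-second-parameter test as the analogy, here's exactly the kind of thing I mean, as a minimal JUnit 4 sketch; the Greeter class is hypothetical, made up purely so the test has something to poke at.

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class GreeterTest {
        // Hypothetical class under test, invented for illustration.
        static class Greeter {
            String greet(String salutation, String name) {
                return salutation + ", " + (name == null ? "stranger" : name) + "!";
            }
        }

        // The "how do we handle a null in that second parameter" check:
        // if you'd skip writing this in a real codebase, you'd skip the
        // equivalent step in your interview dry-run, too.
        @Test
        public void handlesNullInSecondParameter() {
            assertEquals("Hello, stranger!", new Greeter().greet("Hello", null));
        }
    }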

Fifth, remember: Interviewing is not easy! It's not easy on the candidates, and it shouldn't be on you. It would be great if you could just test somebody on one dimension of themselves and call it good, but as much as people want to pretend that a programmer is just a code-spewing cog in a machine, they're not. If you want well-rounded candidates, then you must interview all aspects of that well-roundedness to determine if they are or not.

Whatever you interview for, that's what you will get.


.NET | Android | Azure | C# | C++ | Conferences | Development Processes | F# | Flash | Industry | iPhone | Java/J2EE | Languages | LLVM | Mac OS | Objective-C | Parrot | Personal | Python | Reading | Review | Ruby | Scala | Security | Social | Solaris | Visual Basic | VMWare | WCF | Windows | XML Services

Monday, August 19, 2013 9:30:55 PM (Pacific Daylight Time, UTC-07:00)
On "Exclusive content"

Although it seems to have dipped somewhat in recent years, periodically I get requests from conferences or webinars or other presentation-oriented organizations/events that demand that the material I present be "exclusive", usually meaning that I've never delivered said content at any other organized event (conference or what-have-you). And, almost without exception, I refuse to speak at those events, or else refuse to abide by the "exclusive" tag (and let them decide whether they still want me to speak for them).

People (by which I mean "organizers"--most speakers seem to get it intuitively if they've spoken at more than five or so conferences in their life) have expressed some surprise and shock at my attitude. So, I decided to answer some of the more frequently-asked questions that I get in response to this, partly so that I don't have to keep repeating myself (yeah, right, as if said organizers are going to read my blog) and partly because putting something into a blog is a curious form of sanity-check, in that if I'm way off, commenters will let me know posthaste.

Thus...:

  • "Nobody will come to our conference/listen to our webinar if the content is the same as elsewhere." This is, by far, the first and most-used reaction I get, and let me be honest: if people came to your conference or fired up your webinar solely because of the information contained, they would never come to your conference or listen to your webinar. The Internet is huge. Mind-staggeringly huge. Anything you could possibly ever want about any topic you could ever possibly imagine, it's captured it somewhere. (There's a corollary to that, too; I call it "Whittington's Law", which states, "Anything you can possibly imagine, the Internet not only has it, but a porn site version of it, as well".) You will never have exclusive content, because unless I invented the damn thing, and I've never shown it to anybody or ever used it before, somebody will likely have used it, written a blog post or a video tutorial or what-have-you, and posted it to the Internet. Therefore, by definition, it can't be exclusive.
  • But even on top of that first point, no presentation given by the same guy using the same slides is ever exactly the same. Anybody who's ever seen me give a talk twice knows that a lot of how I give my presentations is extremely ad-hoc; I like to write code on the fly, incorporate audience feedback and participation, and sometimes I even get caught up in a tangent that we explore along the way. None of my presentations are ever scripted, such that if you filmed two of them and played them side-by-side, you'll see marked and stark differences between them. And frankly, if you're a conference organizer, you should be quite happy about this, because one of the first rules of presenting is to "Know thy audience", but if you can't know your audience ahead of time, what course is left to you but to poll the audience when you first get started, and adjust your presentation based on that?
  • "Sure, the experience won't be as great as if they were in the room at the time, but if they can get the content elsewhere, why should they come to our conference?" Well.... Honestly, that question really needs to be rephrased: "Given all the vast amounts of information out there on the Internet, why should someone come to your conference, period?" If you and your fellow organizers can't answer that question, then my content isn't going to help you in the slightest. TechEd and other big conferences that stream all of their content to the Web seem to be coming to the realization that there is something about the in-person experience that still creates value for attendees, so maybe you should be thinking about that, instead. Yes, you will likely lose a few ticket sales from people watching the content online, but if those numbers are staggeringly large, it means that your conference offered nothing but content in the first place, and you were going to see those numbers drop off significantly anyway once the majority of your audience figured out that the content is available elsewhere. And for free, no less.
  • "But why is this so important to you?" Because, my friends, everything gets better with practice, and that includes presentations. When I taught for DevelopMentor lo those many years ago, one of the fundamental rules was that "You don't really know a deck until you've delivered it five times". (I call it "Sumida's Law", after the guy who trained me there.) What's more, the more often you've presented on a subject, the more easily you see the "right" order to the topics, and better ways of explaining and analogizing those topics occur to you over time. ("Halloway's Corollary to Sumida's Law": "Once you've delivered a deck five times, you immediately want to rewrite it all".) To be quite honest with you all, the first time I give a talk is much like the beta release of any software product: it takes user interaction and feedback before you start to see the non-obvious bugs.

I still respect the conference or webinar host that insists on exclusive content, and I wish you well finding your next speaker.


.NET | C# | C++ | Conferences | Industry | Java/J2EE | Languages | Personal | Review | Social | WCF | Windows

Monday, August 19, 2013 7:17:56 PM (Pacific Daylight Time, UTC-07:00)
 Friday, April 05, 2013
"Craftsmanship", by another name

This blog post, talking about the "1/10" developer as a sort of factored replacement for the "10x" developer, caught my eye over Twitter. Frankly, I'm not sure what to say about it, but there's a part of me that says I need to say something.

I don't like the terminology "1/10 developer". As the commenters on the author's blog suggest, it implies a denigration of the individual in question. I don't think that was the author's intent, but intentions don't matter--results do. You're still suggesting that this guy is effectively worthless, even if your intent is to say that his programming skills aren't great.

Some programmers shouldn't be. It's hard to say it, but yes, there are going to be some programmers at either end of the bell curve. (Assuming that skill in programming is a bell curve, and some have suggested that it's not, which is its own fascinating discussion, but for another day.) That means that some of the people writing code with you or for you are not going to be from the end you'd hope them to be from. That doesn't necessarily mean they should all immediately retire and take up farming.

Be careful how you measure. The author assumed that because this programmer wasn't able to churn out code at the same rate that the author himself could, the programmer in question was therefore one of these "1/10" programmers. Hubris is a dangerous thing in a CTO, even a temporary one--assuming that you could write it in "like, 2 hours, tops" is a dangerous, dangerous path. Every programmer I've ever known has looked at a feature or a story, thought, "Oh, that should only take me, like, 2 hours, tops" and then discovered later, to his/her chagrin, that there's a lot more involved in that than first considered. It's very possible the author/CTO is a wunderkind programmer who could do everything he talked about in, like, 1 or 2 hours, tops. It's also very possible that this author/CTO misunderstood the problem (which he never once seems to consider).

The teacher isn't finished teaching until the student learns. From the sound of the blog post, it doesn't sound like the author/CTO was really putting that much of an effort into teaching the programmer, but just "leading him step by step" to the solution. Give a man a fish... teach a man to fish.... Not all wunderkind programmer/author/CTOs are great teachers.

Some students just don't learn very well. The sword of teaching swings both ways, though: sometimes, some teachers just can't reach some students. It sucks, but it's life.

This programmer was a PhD candidate? The programmer in question, by the way, was (according to the blog) studying for a PhD at the time. And couldn't grasp MVC? Something is off here. I believe it, on the surface of it, because I worked with a guy who had graduated university with a PhD, and couldn't understand C++ and MFC to save his life, and got fired (and I inherited his project, which was a mess, to be blunt), but he'd spent all his time in university studying artificial intelligence, and had written it all using straight C code because that's what the libraries and platform he was using for his research demanded. I don't think he was a "1/10" developer, I think he was woefully mis-placed. Would you take an offensive lineman and put him at slot receiver? Would you take a catcher and put him at pitcher? Would you take a Marketing guy and put him on server support? We need to stop thinking that all programmers are skilled alike--this is probably creating more problems than we really realize. Sure, on the whole, it sounds great that "craftsmen" should be able to pick up any tool and be just as effective with that tool as they are with any other--just like a drywaller can pick up a wrench and be just as effective a plumber, and pick up a circuit breaker and be just as effective an electrician. Right?

In the end reckoning, I don't think the "1/10" vs "10x" designation really does a whole lot--I have a hard time caring where the decimal point goes in this particular home-spun tale of metrics. And I'll even give the author the benefit of the doubt and assume the programmer he had was, in fact, from the lower end of the bell curve, and just wasn't capable of putting together the necessary abstractions in his head to get from point "A" to point "B", figuratively and literally.

But to draw this conclusion from a data point of one person? Seems a little sketchy, to me.

Software development, once again, thy name is hubris.


Development Processes | Industry | Languages | Reading | Review | Social

Friday, April 05, 2013 1:35:47 AM (Pacific Daylight Time, UTC-07:00)
 Thursday, February 21, 2013
Java was not the first

Charlie Kindel blogs that he thinks James Gosling (and the rest of Sun) screwed us all with Java and its "Write Once, Run Anywhere" mantra. It's catchy, but it's wrong.

Like a lot of Charlie's blogs, he nails parts of this one squarely on the head:

WORA was, is, and always will be, a fallacy. ... It is the “Write once…“ part that’s the most dangerous. We all wish the world was rainbows and unicorns, and “Write once…” implies that there is a world where you can actually write an app once and it will run on all devices. But this is precisely the fantasy that the platform vendors will never allow to become reality. ...
And, given his current focus on building a mobile startup, he of course takes this lesson directly into the "native mobile app vs HTML 5 app" discussion that I've been a part of on way too many speaker panels and conference BOFs and keynotes and such:
HTML5 is awesome in many ways. If applied judiciously, it can be a great technology and tool. As a tool, it can absolutely be used to reduce the amount of platform specific code you have to write. But it is not a starting place. Starting with HTML5 is the most customer unfriendly thing a developer can do. ... Like many ‘solutions’ in our industry the “Hey, write it once in in HTML5 and it will run anywhere” story didn’t actually start with the end-user customer. It started with idealistic thoughts about technology. It was then turned into snake oil for developers. Not only is the “build a mobile app that hosts a web view that contains HTML5″ approach bass-ackwards, it is a recipe for execution disaster. Yes, there are examples of teams that have built great apps using this technique, but if you actually look at what they did, they focused on their experience first and then made the technology work. What happens when the shop starts with “we gotta use HTML5 running in a UIWebView” is initial euphoria over productivity, followed by incredible pain doing the final 20%.
And he's flat-out right about this: HTML 5, as an application development technology, takes you about 60 - 80% of the way home, depending on what you want your application to do.

In fact, about the only part of Charlie's blog post that I disagree with is the part where he blames Gosling and Java:

I blame James Gosling. He foisted Java on us and as a result Sun coined the term Write Once Run Anywhere. ... Developers really want to believe it is possible to “Write once…”. They also really want to believe that more threads will help. But we all know they just make the problems worse. Just as we’ve all grown to accept that starting with “make it multi-threaded” is evil, we need to accept “Write once…” is evil.
It didn't start with Java--it started well before that, with a set of cross-platform C++ toolkits that made the same kind of promise: write your application in platform-standard C++ to our API, and we'll have the libraries on all the major platforms (back in those days, it was Windows, Mac OS, Solaris OpenView, OSF/Motif, and a few others) and it will just work. Even Microsoft got into this game briefly (I worked at Intuit, and helped a consultant who was struggling to port QuickBooks, I think it was, over to the Mac using Microsoft's short-lived "MFC For Mac OS" release). And, even before that, we had the discussions of "Standard C" and the #ifdef tricks we used to play to struggle to get one source file to compile on all the different platforms that C runs on.

And that, folks, is the heart of the matter: long before Gosling took his fledgling failed set-top box Oak-named project and looked around for a space to which to apply it next, developers... no, let's get that right, "developers and their managers who hate the idea of violating DRY by having the code in umpteen different codebases" have been looking for ways to have a single source base that runs across all the platforms. We've tried it with portable languages (see C, C++, Java, for starters), portable libraries (in the C++ space see Zinc, zApp, XVT, Tools.h++), portable containers (see EJB, the web browser), and now portable platforms (see PhoneGap/Cordova, Titanium, etc), portable cross-compilers (see MonoTouch/MonoDroid, for recent examples), and I'm sure there will be other efforts along these lines for years and decades to come. It's a noble goal, but the major players in the space to which we are targeting--whether that be operating systems, browsers, mobile platforms, console game devices, or whatever comes next two decades from now--will not allow their systems to be commoditized that easily. Because at the heart of it, that's exactly what these "cross-platform" tools and languages and libraries are trying to do: reduce the underlying "thing" to a commodity that lacks interest or impact.

Interestingly enough, as a side-note, one thing I'm starting to notice is that the more pervasive mobile devices become and the more mobile applications we see reaching those devices, the less and less "device-standard" those interfaces are trying to look even as they try to achieve cross-platform similarities. Consider, for a moment, the Fly Delta app on iPhone: it doesn't really use any of the standard iOS UI metaphors (except for some of the basic ones), largely because they've defined their own look-and-feel across all the platforms they support (iOS and Android, at least so far). Ditto for the CNN and USA Today apps, as well as the ESPN app, and of course just about every game ever written for any of those platforms. So even as Charlie argues:

The problem is each major platform has its own UI model, its own model for how a web view is hosted, its own HTML rendering engine, and its own JavaScript engine. These inter-platform differences mean that not only is the platform-specific code unique, but the interactions between that code and the code running within the web view becomes device specific. And to make matters worse intra-platform fragmentation, particularly on the platform with the largest number of users, Android, is so bad that this “Write Once..” approach provides no help.
We are starting to see mobile app developers actually striving to define their own UI model entirely, with only passing nod to the standards of the device on which they're running. Which then makes me wonder if we're going to start to see new portable toolkits that define their own unique UI model on each of these platforms, or will somehow allow developers to define their own UI model on each of these platforms--a UI model toolkit, so to speak. Which would be an interesting development, but one that will eventually run into many of the same problems as the others did.


.NET | Android | Azure | C# | C++ | Development Processes | F# | Flash | Industry | iPhone | Java/J2EE | Languages | LLVM | Mac OS | Objective-C | Parrot | Review | Ruby | Windows

Thursday, February 21, 2013 4:08:04 PM (Pacific Standard Time, UTC-08:00)
 Saturday, February 02, 2013
Last Thoughts on "Craftsmanship"

TL;DR Live craftsmanship, don't preach it. The creation of a label serves no purpose other than to disambiguate and distinguish. If we want to hold people accountable to some sort of "professionalism", then we have to define what that means. I found Uncle Bob's treatment of my blog heavy-handed and arrogant. I don't particularly want to debate this anymore; this is my last take on the subject.


I will freely admit, I didn't want to do this. I really didn't. I had hoped that after my second posting on the subject, the discussion would kind of fade away, because I think we'd (or I'd, at least) wrung out just about the last few drops of discussion and insight and position on it. The same memes were coming back around, the same reactions, and I really didn't want to perpetuate the whole thing ad infinitum because I don't really think that's the best way to reach any kind of result or positive steps forward. I'd said my piece, I was happy about it.

Alas, such was not to be. Uncle Bob posted his thoughts, and quite frankly, I think he did a pretty bad job of hearing what I had to say, couching it in terms of populism (I stopped counting the number of times he used that word at six or so) even as he framed it in something of his own elitist argument.

Bob first points us all at the Manifesto for Software Craftsmanship. Because everyone who calls themselves a craftsman has to obey this manifesto. It's in the rules somewhere. Sort of like the Agile Manifesto--if you're not a signatory, you're doing it wrong.

(Oh, I know, to suggest that there is even the smallest thing wrong with the Agile Manifesto borders on heresy. Which, if that's the reaction you have, should be setting off a few warning bells in your head--something about replacing dogma with dogma.)

And you know what? I actually agree with most of the principles of the Craftsmanship Manifesto. It's couched in really positive, uplifting language: who doesn't want "well-crafted" software, or "steadily-increasing value", or "productive partnerships"? It's a wonderfully-worded document that unfortunately is way short on details, but hey, it should be intuitively obvious to anyone who is a craftsman, right?

See, this is part of my problem. Manifestos tend to be long on rhetoric, but very, very short on details. The Agile Manifesto is another example. It stresses "collaboration" and "working software" and "interactions" and "responding to change", but then people started trying to figure out how to apply this, and the knife-fights between people arguing XP vs. Scrum vs. Kanban vs. your-homebrewed-craptaculous-brand-of-"little-a"-agile turned into brushfire wars. It's wonderful to say what the end result should be, but putting that into practice is a whole different ball of wax. So I'm a little skeptical any time somebody points to a Manifesto and says, "I believe in that, and that should suffice for you".

Frankly, if we want this to have any weight whatsoever, I think we should model something off the Hippocratic Oath, instead--it at least has prescriptive advice within it, telling doctors what they can and cannot (or, perhaps worded more accurately, should or should not) do. (I took something of a stab at this six years ago. It could probably use some work and some communal input; it was a first iteration.)

Besides (beware the accusation coming of my attempt at a false-association argument here, this is just for snarkiness purposes!), other manifestos haven't always worked out so well.

So by "proving [that I misinterpreted the event] by going to the Manifesto", you're kind of creating a circular argument: "What happened can't have been because of Software Craftsmanship, because look, there, in the Manifesto, it says we don't do that, so clearly, we can't have done that. It says it, right there! Seriously!"

The Supposed "Segregation"

Bob then says I'm clearly mistaken about "craftsmen" creating a segregation, because there's nothing about segregation in the manifesto:

any intimation of those who "get it" vs. those who don't; or any mention of the "right" tools or the "right" way. Indeed, what I see instead is a desire to steadily add value by writing well-crafted software while working in a community of professionals who behave as partners with their customers. That doesn't sound like "narcissistic, high-handed, high-minded" elitism to me.
Hold on to that thought for a bit.

Bob then goes on an interesting leap of logical assumption here. He takes my definition of a "software laborer":

"somebody who comes in at 9, does what they're told, leaves at 5, and never gives a rat's ass about programming except for what they need to know to get their job done [...] who [crank] out one crappy app after another in (what else?) Visual Basic, [that] were [...] sloppy, bloated, ugly [...] cut-and-paste cobbled-together duct-tape wonders."
and interprets it as
Now let's look past the hyperbole, and the populist jargon, and see if we can identify just who Ted is talking about. Firstly, they work 9-5. Secondly, they get their job done. Thirdly, they crank out lots of (apparently useful) apps. And finally, they make a mess in the code. The implication is that they are not late, have no defects, and their projects never fail.
That's weird. I go back and read my definition over and over again, and nowhere do I see me suggesting that they are never late, no-defect, and never-fail projects. Is it possible that Bob is trying to set up his next argument by reductio ad absurdum, basically by saying, "These laborers that Ted sets up, they're all perfect! They walk on water! They must be the illegitimate offspring of Christ himself! Have you met them? No? Oh, then they must not exist, and therefore his entire definition of the 'laborer' is whack, as these young-un kids like to say."

(See what I did there? I make Bob sound old and cantankerous. Not that he would do the same to himself, trying to use his years of experience as a subtle bludgeon to anyone who's younger and therefore less experienced--less professional, by implication--in his argument, right?

Programming is barely 60 years old. I, personally, have been programming for 43+ of those years.
Oh.)

Having sort of wrested my definition of the laborer away from me, Bob goes on:

I've never met these people. In my experience a mess in the code equates to lots of overtime, deep schedule overruns, intolerable defect rates, and frequent project failure -- not to mention eventual redesign.
Funny thing. I've seen "crafted" projects that fell victim to the same fates. Matter of fact, I had a ton of people (so it's not just my experience, folks, clearly there's a few more examples out there) email and comment to me that they saw "craftsmen" come in and take what could've been a one-week project and turn it into a six-month-or-more project by introducing a bunch of stuff into the project that didn't really need to be there, but was added in order to "add value" to the code and make it "well-crafted". (I could toss off some of the software terms that were cited as the reasons behind the "adding of value"--decoupled design, dependency injection, reusability, encapsulation, and others--but since those aren't in the Manifesto either, it's easy to say in the abstract that the people who did those projects weren't really adding value, even though these same terms seem to show up on every single project during architecture and design, agile or otherwise.)

Bob goes on to sort of run with this theme:

Ted has created a false dichotomy that appeals to a populist ideology. There are the elite, condescending, self-proclaimed craftsmen, and then there are the humble, honorable, laborers. Ted then declares his allegiance to the latter... .
Well, last time I checked, all I have to do to be listed amongst the craftsmen is sign a web page, so "self-proclaimed" seems pretty accurate as a title. And "elite"? I dunno, can anyone become a craftsman? If so, then the term as a label has no meaning; if not, then yes, there's some kind of segregation, and it sure sounds like you're preaching from on high, particularly when you tell me that I've created a "false dichotomy" that appeals to a "populist ideology":
Generally, populists tend to claim that they side with "the people" against "the elites". While for much of the twentieth century, populism was considered to be a political phenomenon mostly affecting Latin America, since the 1980s populist movements and parties have enjoyed degrees of success in First World democracies such as the USA, Canada, Italy, the Netherlands and Scandinavian countries.
So apparently I'm trying to appeal to "the people", even though Bob will later tell us that we're all the same people. (Funny how there's a lot of programmers who feel like they're being looked down on by the elites--and this isn't my interpretation, read my blog's comments and the responses that have mushroomed on Twitter.) Essentially, Bob will argue later that there is no white-collar/blue-collar divide, even though according to him I'm clearly forming an ideology to appeal to people in the blue-collar camp.

So either I'm talking into a vacuum, or there's more of a divide than Bob thinks. You make the call on that one.

Shall we continue?

He strengthens his identity with, and affinity for, these laborers by telling a story about a tea master and a samurai (or was it some milk and a cow) which further extends and confuses the false dichotomy.
Nice non-sequitur there, Bob! By tossing in that "some milk and a cow", you neatly rob my Zen story of any power whatsoever! You just say it "extends and confuses the false dichotomy", without any real sort of analysis or discussion (that comes later, if you read through to the end), and because you're a craftsman, and I'm just appealing to populist ideology, my story no longer has any meaning! Because reductio ad make-fun-of-em is also a well-recognized and well-respected logical analysis in debating circles.

Oh, the Horror! ... of Ted's Psyche

Not content to analyze the argument, because clearly (he says this so many times, it must be true) my argument is so weak as to not stand on its own (even though I'm not sure, looking back at this point, that Bob has really attacked the argument itself at all, other than to say, "Look at the Manifesto!"), he decides to engage in a little personal attack:

I'm not a psychoanalyst; and I don't really want to dive deep into Ted's psyche to unravel the contradictions and false dichotomies in his blog. However, I will make one observation. In his blog Ted describes his own youthful arrogance as a C++ programmer... It seems to me that Ted is equating his own youthful bad behavior with "craftsmanship". He ascribes his own past arrogance and self-superiority with an entire movement. I find that very odd and very unfortunate. I'm not at all sure what prompted him to make such a large and disconnected leap in reasoning. While it is true that the Software Craftsmanship movement is trying to raise awareness about software quality; it is certainly not doing so by promoting the adolescent behavior that Ted now disavows.
Hmm. One could argue that I'm just throwing out that I'm not perfect nor do I need to profess to be, but maybe that's not a "craftsman's" approach. Or that I was trying to show others my mistakes so they could learn from them. You know, as a way of trying to build a "community of professionals", so that others don't have to go through the mistakes I made. But that would be psychoanalyzing, and we don't want to do that. Others didn't seem to have the problem understanding the "very large and disconnected leap in reasoning", and I would hate to tell someone with over twice my years of experience programming how to understand a logical argument, so how about let's frame the discussion this way: I tend to assume that someone behaving in a way that I used to behave (or still behave) is doing so for the same reasons that I do. (It's a philosophy of life that I've found useful at times.) So I assume that craftsmen take the path they take because they want to take pride in what they do--it's important to them that their code sparkle with elegance and beauty, because that's how code adds value.

Know what? I think one thing that got lost somewhere in all this debate is that value is only value if it's of value to the customer. And in a lot of the "craftsmanship" debates, I don't hear the customer's voice being brought up all that much.

You remember all those crappy VB apps that Bob maligned earlier? Was the customer happy? Did anybody stop to ask them? Or was the assumption that, since the code was crappy, the customer implicitly must be unhappy as well? Don't get me wrong, there's a lot of crappy code out there that doesn't make the customer happy. As a matter of fact, I'll argue that any code that doesn't make the customer happy is crap, regardless of what language it's written in, what patterns it uses, how decoupled or injected it is, or what shiny new databases it stores data into. Value isn't value unless it's value to the person who's paying for the code.

Bob Discusses the Dichotomy

Eh, I'm getting tired of writing all this, and I'm sure you're getting tired of reading it, so let's finish up and call it a day. Bob goes on to start dissecting my false dichotomy, starting with:

Elitism is not encouraged in the Software Craftsmanship community. Indeed we reject the elitist attitude altogether. Our goal is not to make others feel bad about their code. Our goal is to teach programmers how to write better code, and behave better as professionals. We feel that the software industry urgently needs to raise the bar of professionalism.
Funny thing is, Bob, one could argue that you're taking a pretty elitist stance yourself with your dissection of my blog post. Nowhere do I get the benefit of the doubt, nor do you make any effort to understand where I'm coming from; instead, I'm just plain wrong, and that's all there is to it. Perhaps you will take the stance that "Ted started it, so therefore I have to come back hard", but that doesn't strike me as humility; that strikes me, in tone, as preaching from a pulpit. (I'd use a Zen story here to try and illustrate my point, but I'm afraid you'd characterize it as another "milk and a cow" story.)

But "raising the bar of professionalism", again, misses a crucial point, one that I've tried to raise earlier: Who defines what that "professionalism" looks like? Does the three-line Perl hack qualify as "professionalism" if it gets the job done for the customer so they can move on? Or does it need to be rewritten in Ruby, using convention over configuration, and a whole host of dynamic language/metaprogramming/internal DSL tricks? What defines professionalism in our world? In medicine, it's defined pretty simply: is the patient healthier or not after the care? In the legal profession, it's "did we represent the client to the best of our ability while remaining in compliance with the rules of ethics laid down by the bar and the laws of the entity in which we practice?" What defines "professionalism" in software? When you can tell me what that looks like, in concrete, without using words that allow for high degree of interpretation, then we can start to make progress towards whether or not my "laborers" are, in actuality, professionals.

We continue.

There are few "laborers" who fit the mold that Ted describes. While there are many 9-5 programmers, and many others who write cut-paste code, and still others who write big, ugly, bloated code, these aren't always the same people. I know lots of 12-12 programmers who work hellish hours, and write bloated, ugly, cut-paste code. I also know many 9-5 programmers who write clean and elegant code. I know 9-5ers who don't give a rat's ass, and I know 9-5ers who care deeply. I know 12-12ers who's only care is to climb the corporate ladder, and others who work long hours for the sheer joy of making something beautiful.
Of course there aren't, Bob, you took my description and sort of twisted it. (See above.) And yes, I'll agree with you, there's lots of 9-5 developers, and lots of 12-12 developers, lots of developers who write great code, and lots of developers who write crap code--and what's even funnier about this discussion is that sometimes they're all the same person! (They do that just to defy this kind of stereotyping, I'm sure.) Maybe it's just the companies I've worked for compared to the companies you've worked for, but I can rattle off a vastly larger list of names who fit in the "9-5" category than those who fit into the "12-12" category. All of them wanted to do a good job, I believe, but I believe that because I believe that every human being innately wants to do things they are proud of and can point to with a sense of accomplishment. Some will put more energy into it than others. Some will have more talent for it than others. Just like dancing. Or farming. Or painting. Or just about any endeavor.

The Real Problem

Bob goes on to talk about the youth of our industry, but I think the problem is a different one. Yes, we're a young industry, but frankly, so is Marketing and Sales (they've only really existed in their modern forms for about sixty or seventy years, maybe a hundred if you stretch the definitions a little), and ditto for medicine (remember, it was only about 150 years ago that surgeons were also barbers). Yes, we have a LOT to learn yet, and we're making a lot of mistakes, I think, because our youth is causing us to reach out to other, highly imperfect metaphor/role-model industries for terminology and inspiration. (Cue the discussion of "software architecture" vs "building architecture" here.) Personally, I think we've learned a lot, we're continuing to learn more, and we're reaching a point where looking at other industries for metaphors is reaching a practical end in terms of utility to us.

The bigger problem? Economics. The supply and demand curve.

Neal Ford pointed out on an NFJS panel a few years back that the demand for software vastly exceeds the supply of programmers to build it. I don't know where he got that--whether he read it somewhere or formed it out of his own head--but he's absolutely spot-on right, and it seriously throws the whole industry out of whack.

If the software labor market were like painting, or car repair, or accounting, then the finite demand for people in those positions would mean that those who couldn't meet customer satisfaction would eventually starve and die. Or, more likely, take up some other career. It's a natural way to take the bottom 20% of the bell curve (the trailing tail of potential practitioners) and keep them from ruining some customers' lives. If you're a terrible painter, no customers will use you (at least, not twice), and while I suppose you could pick up and move to a new market every year or so until you're run out of town on a rail for crappy work, quite honestly, most people will just give up and go do something else. There are thousands--millions--of actors and actresses in Southern California that never make it to stage or screen, and they wait tables until they find a new thing to pursue that adds value to their customers' lives in such a way that they can make a living.

But software... right now, if you walk out into the middle of the street in San Francisco wearing a T-shirt that says, "I write Rails code", you will have job offers flying after you like the paper airplanes in Disney's just-released-to-the-Internet video short. IT departments are throwing huge amounts of cash into mechanisms, human or otherwise, working or otherwise, to help them find developers. Software engineering has been at the top of the list of "best jobs" for several years, commanding high salaries in a relatively stress-free environment, all in a period of time that many have equated to the worst economic cycle since the Great Depression. Don't believe me? Take a shot yourself: go to a Startup Weekend and sign up as a developer--there are hundreds of people with new app ideas (granted, most of them total fantasy) who are just looking for a "technical co-founder" to help them see their dream to reality. IT departments will take anybody right now, and I do mean anybody. I'm reasonably convinced that half the reason software development outsourcing overseas happens is because it's a choice between putting up with doing the development overseas, even with all of the related problems and obstacles that come up, or not doing the development at all for lack of being able to staff the team to do it. (Which would you choose, if you were the CTO--some chance of success, or no chance at all?)

Wrapping up

Bob wraps up with this:

The result is that most programmers simply don't know where the quality bar is. They don't know what disciplines they should adopt. They don't know the difference between good and bad code. And, most importantly, they have not learned that writing good clean code in a disciplined manner is the fastest and best way get the job done well.

We, in the Software Craftsmanship movement are trying to teach those lessons. Our goal is to raise the awareness that software quality matters. That doing a good job means having pride in workmanship, being careful, deliberate, and disciplined. That the best way to miss a deadline, and lay the seeds of defeat, is to make a mess.

We, in the Software Craftsmanship movement are promoting software professionalism.
Frankly, Bob, you sort of reject your own "we're not elitists" argument by making it very clear here: "most programmers simply don't know where the quality bar is. They don't know .... They don't know.... They have not learned. ... We, in the Software Craftsmanship movement are trying to teach those lessons." You could not sound more elitist if you quoted the colonial powers "bringing enlightenment" to the "uncivilized" world back in the 1600s and 1700s. They are an ignorant, undisciplined lot, and you have taken this self-appointed messiah role to bring them into the light.

Seriously? You can't see how that comes across as elitist? And arrogant?

Look, I really don't mean to perpetuate this whole argument, and I'm reasonably sure that Uncle Bob is already firing up his blog editor to point out all the ways in which my "populist ideology" is falsely dichotomous or whatever. I'm tired of this argument, to be honest, so let me try to sum up my thoughts on this whole mess in what I hope will be a few, easy-to-digest bullet points:

  1. Live craftsmanship, don't preach it. If you hold the craftsman meme as a way of trying to improve yourself, then you and I have no argument. If you put "software craftsman" on your business cards, or website, or write Manifestos that you try to use as a bludgeon in an argument, then it seems to me that you're trying to distinguish yourself from the rest, and that to me smacks of elitism. You may not think of yourself as elitist, but to a lot of the rest of the world, that's exactly how you're coming off. Sorry if that's not how you intended it.
  2. Value is only value if the customer sees it as value. And the customer gets to define what is valuable to them, not you. You can (and should) certainly try to work with them to understand what they see as value, and you can (and should) certainly try to help them see how there may be value in ways they don't see today. But at the end of the day, they are the customer, they are paying the checks, and even after advising them against it, if they want to prioritize quick-and-dirty over longer-and-elegant, then (IMHO) that's what you do. Because they may have reasons for choosing that approach that they simply don't care to share with you, and it's their choice.
  3. The creation of a label serves no purpose other than to disambiguate and distinguish. If there really is no blue-collar programming workforce, Bob, then I challenge you to drop the term "craftsman" from your bio, profile, and self-description anywhere it appears, and replace it with "programmer". Or else refer to all software developers as "craftsmen" (in which case the term becomes meaningless, and thus useless). Because, let's face it, how many doctors do you know who put "Hippocratic-sworn" somewhere on their business cards?
  4. If we want to hold people accountable to some sort of "professionalism", then we have to define what that means. The definition of the term "professional" is not really what we want, in practice, for it's usually defined as "somebody who got paid to do the job". The Craftsmanship Manifesto seems to want some kind of code of ethics or programmer equivalent to the Hippocratic Oath, so that the third precept isn't "a community of people who are paid to do what they do", but something deeper and more meaningful and concrete. (I don't have that definition handy, by the way, so don't look to me for it. But I will also roundly reject anyone who tries to use the Potter Stewart-esque "I can't define it but I know it when I see it" approach, because now we're back to individual interpretation.)
  5. I found Uncle Bob's treatment of my blog heavy-handed and arrogant. In case that wasn't obvious. And I reacted in similar manner, something for which I will apologize now. By reacting in that way, I'm sure I perpetuate the blog war, and truthfully, I have a lot of respect for Bob's technical skills; I was an avid fan of his C++ articles for years, and there's a lot of good technical ideas and concepts that any programmer would be well-advised to learn. His technical skill is without question; his compassion and empathy, however, might be. (As are mine, for stooping to that same level.)
Peace out.


.NET | C# | C++ | Conferences | Development Processes | F# | Industry | Java/J2EE | Languages | Parrot | Personal | Reading | Review | Social | Visual Basic | Windows

Saturday, February 02, 2013 4:33:12 AM (Pacific Standard Time, UTC-08:00)
Comments [3]  | 
 Friday, January 25, 2013
More on "Craftsmanship"

TL;DR: To all those who dissented, you're right, but you're wrong. Craftsmanship is a noble meme, when it's something that somebody holds as a personal goal, but it often comes across as a way to beat up on and denigrate others who don't choose to invest significant time and energy into programming. The Zen Masters didn't walk around the countryside, proclaiming "I am a Zen Master!"

Wow. Apparently I touched a nerve.

It's been 48 hours since I posted On the Dark Side of 'Craftsmanship', and it's gotten a ton of interest, as well as a few syndicated re-posts (DZone and a few others). Comments to the blog included a response from Dave Thomas, other blog posts have been brought to my attention, and Twitter was on FIRE with people pinging me with their thoughts, which turned out to be across the spectrum, approving and dissenting. Not at all what I really expected to happen, to be honest--I kinda thought it would get lost in the noise of others commenting around the whole thing.

But for whatever reason, it's gotten a lot of attention, so I feel a certain responsibility to respond and explain to some of the dissenters who've responded. Not to defend, per se, but to at least demonstrate some recognition and attempt to clarify my position where I think it's gotten mis-heard. (To those who approved of the message, thank you for your support, and I'm happy to have vocalized something you felt unable, unwilling, unheard, or too busy to vocalize yourself. I hope my explanations here continue to represent your opinions, but if not, please feel free to let me know.)

A lot of the opinions centered around a few core ideas, it seems, so let me try and respond to those first.

You're confusing "craftsmanship" with a few people behaving badly. That may well be, but those who behaved badly included at least one who holds himself up as a leader of the craftsman movement and has held his actions up as indications of how "craftsmen" should behave. When you do this, you invite this kind of criticism and association. So if the movement is being given a black eye because of the actions of a single individual, well, now you know how a bunch of moderate Republicans feel about Paul Ryan.

Corey is a nice guy, he apologized, don't crucify him. Of course he is. Corey is a nice guy--and, speaking well to his character, he apologized almost immediately when it all broke. I learned a long time ago that "true sorry" means you (a) apologize for your actions, (b) seek to remedy the damage your actions have caused ("make it right", in other words), and (c) avoid making the same mistake in the future. From a distance, it seems like he feels contrition, and has publicly apologized for his actions. I would hope he's reached out to Heather directly to try and make things right with her, but that's between the two of them. Whether he avoids this kind of activity in the future remains to be seen. I think he will, but that's because I think he's learned a harsh lesson about being in the spotlight--it tends to be a harsh place to be. The rest of this really isn't about Corey and Heather anymore, so as far as I'm concerned, that thread is complete.

You misunderstand the nature of "craftsmanship". Actually, no, I don't. At its heart, the original intent of "craftsmanship" was a constant striving to be better at what you do, and taking pride in the things that you do. It's related to the Japanese notion of kaizen (continuous improvement), which says, in essence, that we are constantly striving to get better. The samurai, under their code of bushido, sought to become better swordsmen, constantly challenging each other to prove their mettle against one another, improving their skills and conditioning, but also their honor, by how they treated each other, their lord, their servants, and those they sought to protect. Kaizen is a wonderful principle, and one I have tried to live by my entire life, even before I'd discovered it. Please don't assume that I misunderstand the teachings of your movement just because I don't go to the meetings.

Why you pick on "craftsmanship", anyway? If I want to take pride in what I do, what difference does it make? This is me paraphrasing on much of the dissent, and my response boils down to two basic thoughts:

  1. If you think your movement is "just about yourself", why invent a label to differentiate yourself from the rest?
  2. If you invent a label, it becomes almost automatic to draw a line between "us" and "them", and that in and of itself almost automatically leads to "us vs. them" behavior and mentality.
Look, I view this whole thing as kind of like religion: whatever you want to do behind closed doors, that's your business. But when you start waving it in other peoples' faces, then I have a problem with it. You want to spend time on the weekends improving your skills, go for it. You want to spend time at night learning a bunch of programming languages so you can improve your code and your ability to design systems, go for it. You want to study psychology and philosophy so you can understand other people better when it comes time to interact with them, go for it. And hey, you want to put some code up somewhere so people can point to it and help you get it better, go for it. But when you start waving all that time and dedication in my face, you're either doing it because you want recognition, or you want to suggest that I'm somehow not as good as you. Live the virtuous life, don't brag about it.

There were some specific blogs and comments that I think deserve discussion, too:

Dave Thomas was kind enough to comment on my blog:

I remember the farmer comment :) I think I said 30%, but I stand by what I said. And it isn't really an elitist stance. Instead, I feel that programming is hard work. At the end of a day of coding, I'm tired. And so I believe that if you are asking someone to do programming, then it is in both your and their interest that they are doing something they enjoy. Because if they don't enjoy it, then they are truly just a laborer, working hard at something that has no meaning to them. And as you spend 8 hours a day, 5 days a week doing it, that seems like an awful waste of an intelligent person's life.
Sure, programming is hard. So is house painting. They're different kinds of exhaustion, but it's exhaustion all the same. But, frankly, if somebody has chosen to take up a job that they do just because it's a job, that's their choice, and not ours to criticize, in my opinion. (And I remember it as 50%, because I very clearly remember saying the "way to insult half the room" crack after it, but maybe I misheard you. I do know others also heard it at 50%, because an attendee or two came up to talk about it after the panel. At least, that's how I remember it at the time. But the number itself is kinda meaningless, now that I think about it.)
The farming quote was a deliberate attempt at being shocking to make a point. But I still think it is valid. I'd guess that 30% of the developers I meet are not happy in their work. And I think those folks would be happier and more fulfilled doing something else that gave them more satisfaction.
Again, you and I are both in agreement, that people should be doing what they love, but that's a personal judgment that each person is permitted to make for themselves. There are aspects of our lives that we don't love, but we do because they make other people happy (Juliet and Charlotte driving the boys around to their various activities comes to mind, for example), and it is not our position to judge how others choose for themselves, IMHO.
No one should have to be a laborer.
And here, you and I will disagree quite fundamentally: as I believe it was Martin Luther King, Jr., who said, "If you are going to be a janitor, be the best janitor you know how to be." It seems by that statement that you are saying that people who labor with their bodies rather than their minds (and trust me, you may not be a laborer anymore, big publishing magnate that you are, but I know I sure still am) are somehow less well-off than those who have other people working for them. Some people don't want the responsibility of being the boss, or the owner. See the story of the Mexican fisherman at the end of this blog.

Nate commented:

You have a logical fallacy by lumping together the people that derided Heather's code and people that are involved in software craftmanship. It's actually a huge leap of logic to make that connection, and it really retracts from the article.
As I point out later, the people who derided Heather's code were some of the same folks who hold up software craftsmanship. That wasn't me making that up.

Now you realise that you are planting your flag firmly in the 'craftmanship' camp while propelling your position upwards by drawing a line in the sand to define another group of people as 'labourers'. Or in other words attempt to elevate yourself by patronising others with the position you think you are paying them a compliment. Maybe you do not realise this?
No, I realize it, and it's a fair critique, which is why I don't label myself as a "craftsman". I have more to say on this below.
However, have you considered that the craft is not how awesome and perfect you and your code are, but what is applicable for the task at hand. I think most people who you would put into either camp share the same mix of attributes whether good or bad. The important thing is if the solution created does what it is designed to do, is delivered on time for when it is needed and if the environment that the solution has been created for warrants it, that the code is easily understandable by yourself and others (that matter) so it can be developed further over time and maintained.
And the very people who call themselves "craftsmen" criticized a piece of code that, as near as I can tell, met all of those criteria. Hence my reaction that started this whole thing.
I don't wish to judge you, and maybe you are a great, smart guy who does good in the world, but like you I have not researched anything about you, I have simply read your assessment above and come to a conclusion, that's being human I guess.
Oh, people judge each other all the time, and it's high time we stopped beating them up for it. It's human to judge. And while it would be politically correct to say, "You shouldn't judge me before you know me", fact is, of course you're going to do exactly that, because you don't have time to get to know me. And the fact that you don't know me except but through the blog is totally acceptable--you shouldn't have to research me in order to have an opinion. So we're all square on that point. (As to whether I'm a great smart guy who does good in the world, well, that's for others to judge in my opinion, not mine.)
The above just sounds like more of the same 'elitism' that has been ripe in this world from playground to the workplace since the beginning.
It does, doesn't it? And hopefully I clarify the position more clearly later.

In It's OK to love your job, Chad McCallum says that

The basic premise (or at least the one the author start out with) is that because there’s a self-declared group of “software craftspeople”, there is going to be an egotistical divide between those who “get it” and those who don’t.
Like it or not, Chad, that egotistical divide is there. You can "call bullshit" all day long, but look at the reactions that have popped up over this--people feel that divide, and frankly, it's one that's been there for a long, long time. This isn't just me making this up.

Chad also says,

It’s true the feedback that Heather got was unnecessarily negative. And that it came from people who are probably considered “software craftspeople”. That said, correlation doesn’t equal causation. I’m guessing the negative feedback was more because those original offenders had a bad day and needed to vent. And maybe the comments after that one just jumped on the bandwagon because someone with lots of followers and/or respect said it.

These are both things that can and have happened to anyone, regardless of the industry they work in. It’s extremely unfair to associate “someone who’s passionate about software development” to “person who’s waiting to jump on you for your mistakes”.

Unfortunately, Chad, "others do it, too" is not an acceptable excuse. If everybody jumped off a cliff, would you do it, too? I understand the rationale--it's extremely hard being the one to go against the herd (I've got the psychological studies I can cite at you that prove it), but that doesn't make it OK or excuse it. Saying "it happens in other industries" is just an extension of that. In other industries, women are still explicitly discriminated against--does that make it OK for us to do that, too?

Chad closes his blog with "Stop calling us egotistical jerks just because we love what we do." To which I respond, "I am happy to do so, as soon as those 'craftsmen' who are acting like one, stop acting like one." If you're not acting like one, then there should be no argument here. If you're trying to tell me that your label is somehow immune to criticism, then I think we just have to agree to disagree.

Paul Pagel (on a site devoted to software craftsmanship, no less) responded as well with his Humble Pursuit of Mastery. He opens with:

I have been reading on blogs and tweets the sentiment that "software craftsmanship is elitism". This perception is formed around comments of code, process, or techniques. I understand a craftsman's earned sense of pride in their work can sometimes be inappropriately communicated.
I don't think I commented on code, process or technique, so I can't be sure if this is directly refuting what I'm saying, but I note that Paul has already touched on the meme he wants to communicate in his last phrase: the craftsman's "earned sense of pride". I have no problem with the work being something that you take pride in; I note, however, that "pride goeth before a fall", and note that, again, Ozymandias was justifiably proud of his accomplishments, too.

Paul then goes through a summation of his career, making sure to smallcaps certain terms with which I have no argument: "sacrifice", "listen", "practicing", "critique" and "teaching". And, in all honesty, these are things that I embrace, as well. But I start getting a little dubious about the sanctity of your terminology, Paul, when it's being used pretty blatantly as an advertising slogan and theme all over the site--if you want the term to remain a Zen-like pursuit, then you need to keep the commercialism out of it, in my opinion, or you invite the kind of criticism that's coming here (explicit or implicit).

Paul's conclusion wraps up with:

Do sacrificing, listening, practice, critiquing, and teaching sound like elitist qualities to you? Software craftsmanship starts out as a humble endeavor moving towards mastery. I won't let 140 or 1000 characters redefine the hours and years spent working hard to become a craftsman. It gave me humility and the confidence to be a professional software developer. Sometimes I let confidence get the better of me, but I know when that happens I am not honoring the spirit of craftsmanship which I was trained.
Humility enough to trademark your phrase "Software is our craft"? Humility enough to call yourself a "driving force" behind software craftsmanship? Don't get me wrong, Paul, there is a certain amount of commercialism that any consultant must adopt in order to survive--but either please don't mix your life-guiding principles with your commercialism, or else don't be surprised when others take aim at your "humility" when you do. It's the same when ministers stand in a multi-million dollar building on a Sunday morning and talk about the parable of the widow giving away her last two coppers--that smacks of hypocrisy.

Finally, Matt van Horn wrote a rebuttal, Craftsmanship, in which he says that:

there is an allusion to software craftsmen as being an exclusive group who agre on the “right” tools and techniques. This could not be further from the truth. Anyone who is serious about their craft knows that for every job there are some tools that are better and some that are worse.
... but then he goes right into making that exact mistake:
Now, I may not have a good definition of elegant code, but I definitely know it when I see it – regardless of who wrote it. If you can’t see that
(1..10).each{|i| puts i}

is more elegant than
x = 0
while true do
  x = x + 1
  if x > 10
    break
  end
  puts x
end
then you must near the beginning of your journey towards mastery. Practicing your craft develops your ability to recognize these differences, just as a skilled tailor can more easily spot the difference between a bespoke suit and something from Men’s Wearhouse.
Matt, you kind of make my point for me. What makes it elegant? You take it as self-evident. I don't. As a matter of fact, I've been asking this question for some years now: "What makes code 'elegant', as opposed to 'ugly'?" Ironically, Elliott Rusty Harold just blogged about how this style of coding is dangerous in Java, and got crucified for it, but he has the point that functional style (your first example) doesn't JIT as well as the more imperative style right now on the JVM (or on the CLR, from what I can tell). Are you assuming that this will be running on a native Ruby implementation, on JRuby, IronRuby, ...? You have judged the code in the second example based on an intrinsic value system that you may have never questioned. To judge, you have to be able to explain your judgments in terms of the value system. And the fact that you judge without any context kind of speaks directly to the point I was trying to make: "craftsmen", it seems, have this tendency to judge in absence of context, because they are clearly "further down their journey towards mastery", to use your own metaphor.

Or, to put it much more succinctly, "Beauty is in the eye of the beholder".

Matt then tells me I missed the point of the samurai and tea master story:

Finally, he closes with a famous zen story, but he entirely misses the point of it. The story concerns a tea master, and a samurai, who get into a duel. The tea master prevails by bringing the same concentration to the duel that he brings to his tea ceremony. The point that Ted seems to miss here is that the tea master is a craftsman of the highest order. A master of cha-do (the way of tea) is able to transform the simple act of making and pouring a cup of tea into something transcendant by bringing to this simple act a clear mind, a good attitude, and years of patient, humble practice. Arguably he prevails because he has perfected his craft to a higher degree than the samurai has perfected his own. That is why he has earned the right to wear the garb of a samurai, and why he is able to face down his opponent.
Which, again, I find funny, because most Zen masters will tell you that the story--any Zen story, in fact--has no "definitive" meaning, but has meaning based on how you interpret it. (There are a few Zen parables that reinforce this point, but it gets a little meta to justify my understanding of a Zen story by quoting another Zen story.) How Matt chooses to interpret that parable is, of course, up to him. I choose to interpret the story thusly: the insulted samurai felt that his "earned sense of pride" in his sword mastery was insulted because the tea master--clearly no swordsman, as it says in the story--wore robes of a rank and honor that he had not earned. And clearly, the tea master was no swordsman. But what the tea master learned from his peer was not how to use his concentration and discipline to improve his own swordsmanship, but how to demonstrate that he had, in fact, earned a mark of mastery through an entirely different discipline than the insulted samurai's. The tea master still has no mastery of the sword, but in his own domain, he is an expert. This was all the insulted samurai needed to see: that the badge of honor had been earned, and not just imposed by a capricious (and disrespectful) lord. Put a paintbrush and canvas into the hands of a house painter, and you get pretty much a mess--but put a spray painter in the hands of Leonardo, and you still get a mess. In fact, to really do the parable justice, we should see how much "craft" Matt can bring when asked to paint a house, because that's about how much relevance swordsmanship and house painting have in relation to one another. (All analogies fail eventually, by the way, and we're probably reaching the boundaries of this one.)

Billy Hollis is a master with VB, far more than I ever will be; I know C++ far better than he ever will. I respect his abilities, and he, mine. There is no argument here. But more importantly, there are friends I've worked with in the past who are masters with neither VB nor C++, nor any other programming language, but chose instead to sink their time and energy into skiing, pottery, or being a fan of a television show. They chose to put their energies--energies the "craftsmen" seem to say should be put towards their programming--towards things that bring them joy, which happen to not be programming.

Which brings me to another refrain that came up over and over again: You criticize the craftsman, but then you draw a distinction between "craftsman" and "laborer". You're confusing (or confused). First of all, I think it important to disambiguate along two axes: one axis being whether someone chooses to invest their time into learning to write better software or chooses to look at writing code as "just" a job, and the second axis being the degree to which they have mastered programming. By your own definitions, "craftsmen", can one be early in one's mastery of programming and still be a "craftsman"? Can one be a master bowler who's just picked up programming and be considered a "craftsman"? Is the nature of "craftsmanship" a measure of your skill, or is it your dedication to programming, or is it your dedication to something in your life, period? (Remember, the tea master parable says that a master C++ developer will see the master bowler and respect his mastery of bowling, even though he can't code worth a crap. Would you call him a "craftsman"?)

Frankly, I will say, for the record, that I think there are people programming who don't want to put a ton of time and energy into learning how to be better programmers. (I suspect that most of them won't ever read this blog, either.) They see the job as "just a job", and are willing to be taught how to do things, but aren't willing to go off and learn how to do them on their own. They want to do the best job they can, because they, like any human being, want to bring value to the world, but don't have that passion for programming. They want to come in at 9, do their job, and go home at 5. These are those whom I call "laborers". They are the "fisherman" in the following story:

The businessman was at the pier of a small coastal Mexican village when a small boat with just one fisherman docked. Inside the small boat were several large yellowfin tuna. The businessman complimented the Mexican on the quality of his fish and asked how long it took to catch them. The Mexican replied only a little while.

The businessman then asked why he didn't stay out longer and catch more fish? The Mexican said he had enough to support his family's immediate needs. The businessman then asked, but what do you do with the rest of your time? The Mexican fisherman said, "I sleep late, fish a little, play with my children, take a siesta with my wife, Maria, stroll into the village each evening where I sip wine and play guitar with my amigos; I have a full and busy life, señor."

The businessman scoffed, "I am a Harvard MBA and I could help you. You should spend more time fishing and with the proceeds buy a bigger boat. With the proceeds from the bigger boat you could buy several boats; eventually you would have a fleet of fishing boats. Instead of selling your catch to a middleman, you would sell directly to the processor and eventually open your own cannery. You would control the product, processing and distribution. You would need to leave this small coastal fishing village and move to Mexico City, then LA and eventually New York City where you would run your expanding enterprise."

The Mexican fisherman asked, "But señor, how long will this all take?" To which the businessman replied, "15-20 years." "But what then, señor?" The businessman laughed and said, "That's the best part! When the time is right you would announce an IPO and sell your company stock to the public and become very rich. You would make millions." "Millions, señor? Then what?" The businessman said, "Then you would retire. Move to a small coastal fishing village where you would sleep late, fish a little, play with your kids, take a siesta with your wife, stroll to the village in the evenings where you could sip wine and play your guitar with your amigos."

What makes all of this (this particular subject, craftsmanship) particularly hard for me is that I like the message that craftsmanship brings, in terms of how you conduct yourself. I love the book Apprenticeship Patterns, for example, and think that anyone, novice or master, should read this book. I have taken on speaking apprentices in the past, and will continue to do so well into the future. The message that underlies the meme of craftsmanship--the constant striving to improve--is a good one, and I don't want to throw the baby out with the bathwater. If you have adopted "craftsmanship" as a core value of yours, then please, by all means, continue to practice it! Myself, I choose to do so, as well. I have mentored programmers, I have taken speaking apprentices, and I strive to learn more about my craft by branching my studies out well beyond software--I am reading books on management, psychology, building architecture, and business, because I think there is more to software than just the choice of programming language or style.

But be aware that if you start telling people how you're living your life, there is an implicit criticism or expectation that they should be doing that, as well. And when you start criticizing other peoples' code as being "unelegant" or "unbeautiful" or "unclean", you'd better be able to explain your value system and why you judged it as so. Humility is a hard, hard path to tread, and one that I have only recently started to see the outlines of; I am guilty of just about every sin imaginable when it comes to this subject. I have created "elegant" systems that failed their original intent. I have criticized "ugly" code that, in fact, served the purpose well. I have bragged of my own accomplishments to those who accomplished a lot more than I did, or ever will. And I consider it amazing to me that my friends who've been with me since long before I started to eat my justly-deserved humble pie are still with me. (And those friends are some amazing people in their own right; if a man is judged by the company he keeps, then by looking around at my friends, I am judged to be a king.) I will continue to strive to be better than I am now, though, even within this discussion right now: those of you who took issue with my post, you have good points, all of you, and I certainly don't want to stop you from continuing on your journeys of self-discovery, either.

And if we ever cross paths in person, I will buy you a beer so that we can sit down, and we can continue this discussion in person.


.NET | C# | C++ | Conferences | Development Processes | F# | Industry | Java/J2EE | Languages | Objective-C | Parrot | Personal | Reading | Review | Ruby | Scala | Social | Windows

Friday, January 25, 2013 10:24:27 PM (Pacific Standard Time, UTC-08:00)
Comments [7]  | 
 Monday, January 21, 2013
On Functional Programming in Java

Elliott Rusty Harold is blogging that functional programming in Java is dangerous. He's wrong, and he's way late to the party on this one--it's coming to Java whether he likes it or not.

Go read his post first, while I try to sum up the reasons he cites for saying it's dangerous:

  1. Java is not a lazy-evaluated language. Programmers in Java will screw up and create heap and stack errors as a result.
  2. See? Here's a naive implementation of Clojure code taken directly over to Java and look how it blows up.
  3. Programmers can do bad things with this idea, so therefore we should avoid it.
  4. Oh, and by the way, it's "dangerously inefficient" in Java/JVM, even though I offer no perf benchmarks or comparisons to back this statement, and I'm somehow ignoring that Clojure and Scala run on the JVM as well, apparently without problem.
That kind of about sums it up, I think.

Look, as Elliott points out, Java is not Haskell. Neither is it Lisp. It's its own language, rooted in imperative and object-oriented history, but no less able to incorporate functional features into its development than Lisp could incorporate object-oriented features. However, if you do stupid things, like trying to re-create an infinite (implicitly lazily-evaluated) list in Clojure by creating an actualized list that stretches to infinity... you're going to blow the JVM up. Duh. Not even the supercomputer on the USS Enterprise five hundred years from now will be able to construct that list.

Porting code from one language to another is not a trivial exercise. If you attempt to port line-for-line and expression-for-expression, you can expect that your ported code will not be idiomatically correct. (I know this already, having done the exercise myself.) The root of the problem in his ported code is twofold. One, he (rather foolishly and in elegant strawman fashion) badly simulates what an infinite list would look like in Java--a commenter does the better job by showing how an Iterator can be made to perform the same thing that Haskell actually does under the hood by producing the next value on demand, rather than trying to create a list of Integers stretching to infinity. For someone who professes to have some Haskell experience and love, it surprises me that Elliott makes this kind of mistake, which leads me to conclude that he's trying to create the strawman. Two, he assumes that anyone who programs in Java functionally will have to create all of their functional tools by hand, and frankly, using Guava or FJ here would make this code sample a LOT easier to swallow. The fact that he ignores both of these in his strawman again sort of reinforces the idea that he's deliberately crippling his strawman to make his point.
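To make the commenter's point concrete, here's a minimal sketch (mine, not Elliott's code nor the commenter's exact version--the class name and details are my own invention) of what a lazy "infinite" sequence looks like in pre-lambda Java: an Iterator that produces each value on demand, which is essentially what Haskell does under the hood, instead of trying to actualize the whole list up front:

import java.util.Iterator;

// A lazy "infinite" sequence of naturals: nothing is materialized
// up front; each value is computed only when next() is called,
// so memory use stays constant no matter how far you iterate.
public class Naturals implements Iterable<Integer> {
    public Iterator<Integer> iterator() {
        return new Iterator<Integer>() {
            private int current = 0;
            public boolean hasNext() { return true; } // conceptually infinite
            public Integer next() { return current++; }
            public void remove() { throw new UnsupportedOperationException(); }
        };
    }

    public static void main(String[] args) {
        // The caller decides when to stop; the sequence itself never ends.
        for (int n : new Naturals()) {
            if (n >= 10) break;
            System.out.println(n);
        }
    }
}

No heap explosion, no stack explosion, and nothing exotic about it--it's the same trick every java.util collection's iterator already plays, just without a backing store.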

His underlying point, though, seems to be simple: "I work with bad programmers, who don't seem to understand how to write code functionally in Java without screwing it up." Dude. Sucks to be you. "Bad programmers will move heaven and earth to do the wrong thing." --Glenn Vanderburg.

What really sucks for him is that these features are coming in Java 8, including lambda expressions and library support including a Stream interface that will allow for exactly this kind of code to be written without pain. Those programmers Elliott is working with are going to be even more on fire to use their functional approaches (and all the associated goodness of doing so, including composability and what-not) in their Java code. What might make Elliott more happy is that at least they won't have written it; it'll all be written by guys much smarter than any of them.
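(For the curious, here's roughly what that looks like--a sketch based on the Stream API as it stands in the Java 8 previews, so treat the exact shape as provisional: the infinite sequence becomes a library concern, lazy by construction.)

import java.util.stream.Stream;

public class StreamDemo {
    public static void main(String[] args) {
        // An infinite, lazily-evaluated stream; limit() keeps the
        // evaluation finite, and nothing is materialized up front.
        Stream.iterate(0, n -> n + 1)
              .limit(10)
              .forEach(System.out::println);
    }
}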


Android | Java/J2EE | Languages | Personal | Review | Scala

Monday, January 21, 2013 2:10:14 PM (Pacific Standard Time, UTC-08:00)
Comments [1]  | 
 Monday, January 07, 2013
Thoughts on a CodeMash Gone By

A year ago today (roughly), I gave the opening keynote at CodeMash 2.0.1.2. For those of you who were there, I don't think I need to tell you what happened. For those of you who weren't there, you probably still heard about it, thanks to the Twitterstream of comments and counter-comments that followed. I've more or less tried to keep quiet about it since that time, trying to just let the furor die down (and it did, pretty quickly, I thought) out of respect to the conference organizers.

But with the show starting up again this week, and there having been a few people over the last twelve months who've asked me about "what the f*ck were you thinking" (whether that was in deliberate pun/jest or not, I can't always tell), and most importantly, now that I know that Jim and I are square with each other (thanks to a Twitter conversation a few days ago), I figure it's time to come clean and tell my side of the story.

TL;DR: If I had the chance to do the keynote over again, I'd do it differently.

(By the way, the rest of this post does have a few profanities in it, so if you're offended by that sort of thing, this is a good place to stop reading. Or, as the movies would say, this post is rated PG-13 for adult language.)

As a speaker, I have always sought to create a "persona" on stage that allowed me the maximum freedom of expression and opportunity to get my point across. A long time ago, when I started teaching at DevelopMentor, I learned from some of the best--one of those best being, of course, Don Box, but another of those was Ted Pattison. It was he who taught me that "If you can make 'em laugh, you can do whatever you want to them" (meaning the audience). He demonstrated this quite graphically by guest-lecturing in one of my classes once, early in my tenure as a DM instructor, and promptly castrated one of the students who was constantly irritating the class (and me) with off-topic questions. It was an eye-opening experience. Later, Don mentioned in passing that what we did was "equal parts education and entertainment". Education because, yes, it's what we do, but entertainment, too, because if the room falls asleep, then they're not getting educated.

And folks, I've sat in those chairs, I know how boring talks can be sometimes. And that sometimes, despite your best efforts, no matter how interesting the material, it can just be sooooo easy to pop open the laptop and do some email. Or write some code. Or even let the ambient warmth of the room in a post-lunch talk just... make... eyes... so heavy.... I get it, really.

So I decided, quite consciously, to develop a speaking persona that was a little on the edge, a little outrageous, a little "over the top", because then that persona gave me the freedom to do some of the crazy things that would keep the crowd awake and on its toes. I stand people up from the audience and use them in my demos. I write code on the fly based on their questions, and I try to use examples that allow for a certain amount of "Wow, that was weird, so I'll remember it better" in the demo itself. Case in point: when writing code to demonstrate delegates and events in C#, I would use the idea of a "Rock Band" and its fan club which, of course, must include groupies.

Is it politically correct to talk about groupies in a professional programming classroom setting? Probably not. Did anybody complain? Never heard one, directly or indirectly. Part of that, I believe, was because they got the point of the demo, and that was the point. Not that I was advocating groupie-ism, or that rock bands were more interesting than programming, but that the domain was easy enough to grip in their heads, and that made the result (loose coupling between event generators and consumers, in the case of delegates and events) more easily understood.

Analogies, for me, are never gratuitous. I choose my analogies quite carefully, and try to be very clear about where and when they do break down, because all analogies break down eventually. Even my most famous analogy breaks down, as many people have pointed out: nobody has ever died from O/R-Ms. Yep. But your wife's eyes were never burning balls of superheated plasma billions of light years away, either.

Point is, I deliberately seek ways to keep you entertained. And you know what? Entertainment often comes, in this case, from making the room laugh, and humor most often derives from the unexpected. And what's more unexpected than a profanity dropped at the most unexpected moment?

You don't have to agree with that sentiment to realize that it's FUCKING true.

When I got up to speak at CodeMash, I wanted very badly for this to be the best damn keynote I'd ever done in my life up to that point. I wanted the room to rock. Buzzing. Yes, I wanted to succeed very, very badly. It was an early-morning keynote, first one of the show. People were still milling around, there was a lot of background noise. People were still eating breakfast and waking up. And when Keith Elder, just before he introduced me to the crowd, whispered (I'm paraphrasing here) "Put some energy into this crowd, would ya?", I said to myself, "Oh, yeah. I'm on it."

A little TOO on it, as it turns out. I went way overboard. Brian Prince counted 18 f-bombs that day. Others counted, as well; lowest total I heard was 13, highest was 23. Needless to say, it was a carpet bombing to rival anything we ever did to North Vietnam. Made Dresden look like a weenie roast. (There's probably a Hiroshima joke in there too somewhere, but you get the point.)

The interesting thing about profanity used like that, however, is that it loses its efficacy. Profanities have to be spaced out, chosen carefully, or they lose their impact. Which was, of course, exactly what happened. It's not going to have the 'unexpected' effect if it's coming every other minute or so. No matter how hard you try.

The result? Kind of predictable. Not my best results. For which I am most heartily sorry. I so wanted that keynote to go off so well, and it didn't, and I'm sorry.

For three hours after the keynote was over, as the Twitterstream was dissecting me for all that, I lay on the couch in my hotel room, bordering on tears. Seriously.

Had I the chance to do the keynote over again, you'd better damn well believe that I'd do it differently. Would I cut out all the profanity entirely? Nope. That's a part of my speaking persona, and anyone who brings me to a conference that doesn't know that probably didn't do their homework about me as a speaker beforehand. (It's not like there aren't ample opportunities to see me speaking in person, or videos of the same.) But somebody suggested not too long ago that maybe it wouldn't be a bad idea to warn people ahead of time, and yep, that's a great idea. Because (and for this, I am really even more sorry) sometimes kids are in the room, such as was the case for CodeMash, and they shouldn't have to hear it unless their parents are OK with it, and I didn't give their parents (or any attendees that felt the same way) an opportunity to "opt out" if they so chose.

I could, I suppose, hide behind the excuse that "We were all adults, we should be able to handle that kind of language", but in the case of the kids, that wasn't the case. Even then, in the case of the adults, you still should be given an opportunity to opt out.

More critically, if the message got lost because of the messenger's choice of words, then I failed as a speaker. And that, my friends, is where the real frustration for me lies--not with the words I used in and of themselves, but in that the message--that we as an industry have to break out of our 'box-arrow-box-arrow-cylinder' habits and modes of thinking--got lost for so many people. That is how I failed most of all, and it is on those grounds that I say, once again, I am sorry.

To you, Jim, and to the rest of the CodeMash staff, I am particularly sorry. CodeMash is your baby, and I gave it a black eye.

To the attendees of CodeMash 2.0.1.2, I am sorry if my language offended you and distracted you from the message I was trying to deliver. I hope that you were able to get past it and enjoy the rest of the show. I think a lot of you did--many came up to me afterwards, but it was such a small fraction of the total I don't want to assume anything.

Enjoy CodeMash 2.0.1.3. With any luck, I'll see you there next year: hopefully a little wiser, but still just as FUCKING outrageous as I have always been, only this time, with an up-front disclaimer.

Flame away.


.NET | C# | Conferences | F# | Industry | Personal | Review | Social | Windows

Monday, January 07, 2013 3:23:42 PM (Pacific Standard Time, UTC-08:00)
Comments [4]  | 
 Saturday, January 05, 2013
Review (in advance): F# Deep Dives

F# Deep Dives, by Tomas Petricek and Phillip Trelford, Manning Publications

As many readers of my writing will already know, I've been kind of "involved" with F# (and its cousin on the JVM, Scala) for a few years now, to the degree that I and a couple of really smart guys wrote a book on the subject. Now, assuming you're one of the .NET developers who've heard of F# and functional programming, and took a gander at the syntax, and maybe even bought a book on it (my publisher and I both thank you if you bought ours), but weren't quite sure what to do with it, a book has come along to help get you past that.

As of this writing, the early-access (what Manning calls their MEAP) version had only Chapter 3 ("Parsing text-based languages") and Chapter 11 ("Creating games using XNA"), but the other topics ("Integrating external data into the F# language", "Handling dirty data with machine learning" and "Functional programming in the cloud" are just three of the other chapters listed) are juicy and meaty, and both Tomas and Phillip are recognized names in the F# space. Neither is a stranger to the subject material or to writing, and the prose in the MEAP edition is pretty easy to read already, despite the fact that it's early-access material. In particular, the Markdown parser they implement in chapter 3 is a great example of a non-trivial language parser, which is not an easy task to approach but certainly a lot easier to do in a functional language. (For the record, I built a custom parser of my own for generating slides, and the blog entries that described the early implementations are here--and yes, I really should finish that series out, I know. I got more interested in extending the system, then realized I needed a full-fledged parser, and got distracted trying to integrate... surprise, surprise... Tomas' Markdown parser that he made available online.)

This book looks really promising, and I'm really hopeful Manning will send me a copy when it comes out, so I can level up my F# myself.


.NET | F# | Industry | Languages | Reading | Review | Windows | XNA

Saturday, January 05, 2013 2:10:05 AM (Pacific Standard Time, UTC-08:00)
Comments [0]  | 
Review: Metaprogramming in .NET

Metaprogramming in .NET, by Kevin Hazzard and Jason Bock, Manning Publications

TL;DR: This is a great book (not perfect), but not an easy read for everyone, not because the writing is bad, but because the subject is a whole new level of abstraction above what most developers deal with.

Full disclosure: Manning Publications is a publisher I've published with before, and Kevin and Jason are both friends of mine in the .NET community. I write a column for MSDN Magazine, and metaprogramming was one of the topics in one of the series I've written ("Metaparadigmatic Programming") for the column, so this subject is not unfamiliar to me.

Kevin and Jason have done a great job covering a pretty diverse subject, in my opinion. Because metaprogramming is "programming about programming", it's sometimes a hard concept for people who've never really investigated it to wrap their heads around, but Kevin and Jason do a great job opening with some concepts first, then exploring .NET Reflection, which is most developers' first introduction to metaprogramming. If you can understand how Reflection is programming against code and code metadata, then you're in a good place to start exploring metaprogramming in further depth.
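(To make that concrete for readers who haven't played with it: here's a minimal sketch of my own--not an excerpt from the book--showing Reflection as "code examining code", inspecting a type's metadata and then invoking a method without compile-time knowledge of it.)

    using System;
    using System.Linq;
    using System.Reflection;

    public static class ReflectionDemo
    {
        public static void Main()
        {
            Type t = typeof(string);

            // Code examining code: enumerate a few of the type's public instance methods.
            foreach (MethodInfo m in t.GetMethods(BindingFlags.Public | BindingFlags.Instance).Take(5))
                Console.WriteLine(m.ReturnType.Name + " " + m.Name);

            // Code invoking code: call ToUpper() as if we didn't know the type at compile time.
            object result = t.GetMethod("ToUpper", Type.EmptyTypes).Invoke("hello", null);
            Console.WriteLine(result); // HELLO
        }
    }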

And explore it they do. From code generation with T4, CodeDOM and Reflection.Emit to code-level Expressions to low-level IL munging, they take you through a lot of the metaprogramming tools. They've also tried to include some practical places where these techniques are useful, though I do wish the examples had been a bit "larger", meaning integrated into the bigger picture of a "real-world" system; that's hard to do sometimes, though, and most readers senior enough to read this book should be able to see how to apply the techniques to their own problems.

I also wish they'd approached generics a bit more thoroughly, since that's another metaprogrammatic technique that often doesn't get much love from developers (most of whom seem to view generics as a necessary evil, not a huge opportunity for design power), but maybe that would've been too much head-exploding for one book. Writing a LINQ provider would've been a good enhancement, but again, that may have been a little too much for one book. I also wish they had put an IL overview into its own chapter, since IL comes up in several chapters at once and the material would've been good as a reference; but there are books out there on IL, which hasn't changed much since the .NET 2.0 days, so readers finding their heads spinning a little on the IL syntax should pick one of those up.
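(To give a flavor of that territory, here's a tiny sketch of my own--not the book's code--that uses System.Linq.Expressions to build a function as data at runtime and then compile it into a callable delegate.)

    using System;
    using System.Linq.Expressions;

    public static class ExpressionDemo
    {
        public static void Main()
        {
            // Build the expression tree for x => (x + 1) * 2 by hand...
            ParameterExpression x = Expression.Parameter(typeof(int), "x");
            Expression body = Expression.Multiply(
                Expression.Add(x, Expression.Constant(1)),
                Expression.Constant(2));

            // ...then compile it into real, executable code.
            Func<int, int> f = Expression.Lambda<Func<int, int>>(body, x).Compile();
            Console.WriteLine(f(20)); // 42
        }
    }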

Having taken you through those techniques, though, they then take a different tack and take you through scripting languages and the Microsoft Dynamic Language Runtime (DLR), as well as into a few "alternative" languages for the CLR, which is an entirely different way of approaching metaprogrammatic techniques. Nemerle, for example, is a language that supports macros defined within the language, a technique that generally is limited to Lisps. (I admit it, Nemerle is one of my favorite CLR languages, and should be something every .NET developer plays with for at least a weekend.) They also include the first published coverage that I'm aware of on Roslyn, the Compiler-as-a-Service project under way at Microsoft, so readers intrigued by how they might use the compiler as part of their development efforts in v.Next of Visual Studio should definitely have a look.
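(Again, a small sketch of my own, not theirs, of the DLR flavor of metaprogramming: with ExpandoObject and dynamic, members don't exist until runtime, and binding happens call-by-call rather than at compile time.)

    using System;
    using System.Dynamic;

    public static class DlrDemo
    {
        public static void Main()
        {
            dynamic person = new ExpandoObject();
            person.Name = "Grace";   // this member is created on the fly, not declared anywhere
            person.Greet = (Func<string>)(() => "Hello, " + person.Name + "!");

            // The call to Greet is resolved at runtime by the DLR, not by the C# compiler.
            Console.WriteLine(person.Greet()); // Hello, Grace!
        }
    }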

Overall, the writing style is crisp, clean, not too academic but not too folksy, and entirely representative of two men I've been privileged to meet, have interesting technical conversations with, and have over to my house for dinner. Both are extremely approachable, and their text reflects this. Every .NET developer that wants to claim "senior" or "guru" level status should read this book and experiment with one or more of these techniques; these are the things that the "cool kids" in the .NET world know how to do, and if you want to hang with the best, this is the book you'll read cover to cover.

(This review was posted to Amazon at the above link on 5 Jan 2013, then copy-and-pasted here because I like posting reviews to my blog as well as to Amazon.)


.NET | C# | F# | Languages | Reading | Review | Visual Basic | Windows

Saturday, January 05, 2013 1:54:53 AM (Pacific Standard Time, UTC-08:00)
Comments [0]  | 
 Tuesday, January 01, 2013
Tech Predictions, 2013

Once again, it's time for my annual prognostication and review of last year's efforts. For those of you who've been long-time readers, you know what this means, but for those two or three of you who haven't seen this before, let's set the rules: if I got a prediction right from last year, you take a drink, and if I didn't, you take a drink. (Best. Drinking game. EVAR!)

Let's begin....

Recap: 2012 Predictions

THEN: Lisps will be the languages to watch.

With Clojure leading the way, Lisps (that is, languages that are more or less loosely based on Common Lisp or one of its variants) are slowly clawing their way back into the limelight. Lisps are both functional and dynamic languages, which gives them a significant reason for interest. Clojure runs on top of the JVM, which makes it highly interoperable with other JVM languages/systems, and Clojure/CLR is the version of Clojure for the CLR platform, though there seems to be less interest in it in the .NET world (which is a mistake, if you ask me).

NOW: Clojure is definitely cementing itself as a "critic's darling" of a language among the digital cognoscenti, but I don't see its uptake increasing--or decreasing. It seems that, like so many critic's darlings, those who like it are using it, and those who aren't have either never heard of it (the far more likely scenario) or don't care for it. Datomic, a NoSQL database written by the creator of Clojure (Rich Hickey), is interesting, but I've not heard of many folks taking it up, either. And Clojure/CLR is all but dead, it seems. I score myself a "0" on this one.

THEN: Functional languages will....

I have no idea. As I said above, I'm kind of stymied on the whole functional-language thing and its future. I keep thinking they will either "take off" or "drop off", and they keep tacking to the middle, doing neither, just sort of hanging in there as a concept for programmers to take and run with. Mind you, I like functional languages, and I want to see them become mainstream, or at least more so, but I keep wondering if the mainstream programming public is ready to accept the ideas and concepts hiding therein. So this year, let's try something different: I predict that they will remain exactly where they are, neither "done" nor "accepted", but continue next year to sort of hang out in the middle.

NOW: Functional concepts are slowly making their way into the mainstream of programming topics, but in some cases, programmers seem to be picking-and-choosing which of the functional concepts they believe in. I've heard developers argue vehemently about "lazy values" but go "meh" about lack-of-side-effects, or vice versa. Moreover, it seems that developers are still taking an "object-first, functional-when-I-need-it" kind of approach, which seems a little object-heavy, if you ask me. So, since the concepts seem to be taking some sort of shallow root, I don't know that I get the point for this one, but at the same time, it's not like I was wildly off. So, let's say "0" again.
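(For anyone who hasn't bumped into those two concepts, here's a quick C# illustration of my own of what's getting cherry-picked: a lazy value, whose computation is deferred until first use, and a side-effect-free function, whose result depends only on its input.)

    using System;

    public static class FunctionalBits
    {
        // Side-effect-free: the result depends only on the input; nothing else is touched.
        static int Square(int n) { return n * n; }

        public static void Main()
        {
            // Lazy: the factory doesn't run until .Value is first read, and runs only once.
            var answer = new Lazy<int>(() =>
            {
                Console.WriteLine("computing...");
                return Square(6) + Square(2) + 2;
            });

            Console.WriteLine("before");
            Console.WriteLine(answer.Value); // "computing..." prints here, then 42
            Console.WriteLine(answer.Value); // cached; no recomputation, no extra side effects
        }
    }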

THEN: F#'s type providers will show up in C# v.Next.

This one is actually a "gimme", if you look across the history of F# and C#: for almost every version of F# v."N", features from that version show up in C# v."N+1". More importantly, F# 3.0's type provider feature is an amazing idea, and one that I think will open up language research in some very interesting ways. (Not sure what F#'s type providers are or what they'll do for you? Check out Don Syme's talk on it at BUILD last year.)

NOW: C# v.Next hasn't been announced yet, so I can't say that this one has come true. We should start hearing some vague rumors out of Redmond soon, though, so maybe 2013 will be the year that C# gets type providers (or some scaled-back version thereof). Again, a "0".

THEN: Windows8 will generate a lot of chatter.

As 2012 progresses, Microsoft will try to force a lot of buzz around it by keeping things under wraps until various points in the year that feel strategic (TechEd, BUILD, etc). In doing so, though, they will annoy a number of people by not talking about them more openly or transparently.

NOW: Oh, my, did they. Windows8 was announced with a bang, but Microsoft (and Sinofsky, who ran the OS division up until recently) decided that they could go it alone and leave critical partners (like Dropbox!) out of the loop entirely. As a result, the Windows8 Store didn't have a lot of apps in it that people (including myself) really expected would be there. And THEN, there was Surface... which took everybody by surprise, as near as I can tell. Totally under wraps. I'm scoring myself "+2" for that one.

THEN: Windows8 ("Metro")-style apps won't impress at first.

The more I think about it, the more I'm becoming convinced that Metro-style apps on a desktop machine are going to collectively underwhelm. The UI simply isn't designed for keyboard-and-mouse kinds of interaction, and that's going to be the hardware setup that most people first experience Windows8 on--contrary to what (I think) Microsoft thinks, people do not just have tablets lying around waiting for Windows 8 to be installed on them, nor are they going to buy a Windows8 tablet just to try it out, at least not until it's gathered some mojo behind it. Microsoft is going to have to finesse the messaging here very, very finely, and that's not something they've shown themselves to be particularly good at over the last half-decade.

NOW: I find myself somewhat at a loss how to score this one--on the one hand, the "used-to-be-called-Metro"-style applications aren't terrible, and I haven't really heard anyone complain about them tremendously, but at the same time, I haven't heard anyone really go wild and ga-ga over them, either. Part of that, I think, is because there just aren't a lot of apps out there for it yet, aside from a rather skimpy selection of games (compared to the iOS App Store and Android Play Store). Again, I think Microsoft really screwed themselves with this one--keeping it all under wraps helped them make a big "Oh, WOW" kind of event buzz within the conference hall when they announced Surface, for example, but that buzz sort of left the room (figuratively) when people started looking for their favorite apps so they could start using that device. (Which, by the way, isn't a bad piece of hardware, I'm finding.) I'll give myself a "+1" for this.

THEN: Scala will get bigger, thanks to Heroku.

With the adoption of Scala and Play for their Java apps, Heroku is going to make Scala look attractive as a development platform, and the adoption of Play by Typesafe (the same people who brought you Akka) means that these four--Heroku, Scala, Play and Akka--will combine into a very compelling and interesting platform. I'm looking forward to seeing what comes of that.

NOW: We're going to get to cloud in a second, but on the whole, Heroku is now starting to make Scala/Play attractive, arguably as attractive as Ruby/Rails is. Play 2.0 unfortunately is not backwards-compatible with Play 1.x modules, which hurts it, but hopefully the Play community brings that back up to speed fairly quickly. "+1"

THEN: Cloud will continue to whip up a lot of air.

For all the hype and money spent on it, it doesn't really seem like cloud is gathering commensurate amounts of traction, across all the various cloud providers with the possible exception of Amazon's cloud system. But, as the different cloud platforms start to diversify their platform technology (Microsoft seems to be leading the way here, ironically, with the introduction of Java, Hadoop and some limited NoSQL bits into their Azure offerings), and as we start to get more experience with the pricing and costs of cloud, 2012 might be the year that we start to see mainstream cloud adoption, beyond "just" the usage patterns we've seen so far (as a backing server for mobile apps and as an easy way to spin up startups).

NOW: It's been whipping up air, all right, but it's starting to look like tornadoes and hurricanes--the talk of 2012 seems to have been more around notable cloud outages instead of notable cloud successes, capped off by a nationwide Netflix outage on Christmas Eve that seemed to dominate my Facebook feed that night. Later analysis suggested that the outage was with Amazon's AWS cloud, on which Netflix resides, and boy, did that make a few heads spin. I suspect we haven't yet (as of this writing) seen the last of that discussion. Overall, it seems like lots of startups and other greenfield apps are being deployed to the cloud, but it seems like corporations are hesitating to pull the trigger on an "all-in" kind of cloud adoption, because of some of the fears surrounding cloud security and now (of all things) robustness. "+1"

THEN: Android tablets will start to gain momentum.

Amazon's Kindle Fire has hit the market strong, definitely better than any other Android-based tablet before it. The Nook (the Kindle's principal competitor, at least in the e-reader world) is also an Android tablet, which means that right now, consumers can get into the Android tablet world for far, far less than what an iPad costs. Apple rumors suggest that they may have a 7" form factor tablet that will price competitively (in the $200/$300 range), but that's just rumor right now, and Apple has never shown an interest in that form factor, which means the 7" world will remain exclusively Android's (at least for now), and that's a nice form factor for a lot of things. This translates well into more sales of Android tablets in general, I think.

NOW: Google's Nexus 7 came to dominate the discussion of the 7" tablet, until...

THEN: Apple will release an iPad 3, and it will be "more of the same".

Trying to predict Apple is generally a lost cause, particularly when it comes to their vaunted iOS lines, but somewhere around the middle of the year would be ripe for a new iPad, at the very least. (With the iPhone 4S out a few months ago, it's hard to imagine they'd cannibalize those sales by releasing a new iPhone, until the end of the year at the earliest.) Frankly, though, I don't expect the iPad 3 to be all that big of a boost, just a faster processor, more storage, and probably about the same size. Probably the only thing I'd want added to the iPad would be a USB port, but that conflicts with the Apple desire to present the iPad as a "device", rather than as a "computer". (USB ports smack of "computers", not self-contained "devices".)

NOW: ... the iPad Mini. Which, I'd like to point out, is just an iPad in a 7" form factor. (Actually, I think it's a little bit bigger than most 7" tablets--it looks to be a smidge wider than the other 7" tablets I have.) And the "new iPad" (not the iPad 3, which I call a massive FAIL on the part of Apple marketing) is exactly that: same iPad, just faster. And still no USB port on either the iPad or iPad Mini. So between this one and the previous one, I score myself at "+3" across both.

THEN: Apple will get hauled in front of the US government for... something.

Apple's recent foray in the legal world, effectively informing Samsung that they can't make square phones and offering advice as to what will avoid future litigation, smacks of such hubris and arrogance, it makes Microsoft look like a Pollyanna Pushover by comparison. It is pretty much a given, it seems to me, that a confrontation in the legal halls is not far removed, either with the US or with the EU, over anti-competitive behavior. (And if this kind of behavior continues, and there is no legal action, it'll be pretty apparent that Apple has a pretty good set of US Congressmen and Senators in their pocket, something they probably learned from watching Microsoft and IBM slug it out rather than just buy them off.)

NOW: Congress has started to take a serious look at the patent system and how it's being used by patent trolls (of which, folks, I include Apple these days) to stifle innovation and create this Byzantine system of cross-patent licensing that only benefits the big players, which was exactly what the patent system was designed to avoid. (Patents were supposed to be a way to allow inventors, who are often independents, to avoid getting crushed by bigger, established, well-monetized firms.) Apple hasn't been put squarely in the crosshairs, but the Economist's article on Apple, Google, Microsoft and Amazon in the Dec 11th issue definitely points out that all four are squarely in the sights of governments on both sides of the Atlantic. Still, no points for me.

THEN: IBM will be entirely irrelevant again.

Look, IBM's main contribution to the Java world is/was Eclipse, and to a much lesser degree, Harmony. With Eclipse more or less "done" (aside from all the work on plugins being done by third parties), and with IBM abandoning Harmony in favor of OpenJDK, IBM more or less removes themselves from the game, as far as developers are concerned. Which shouldn't really be surprising--they've been more or less irrelevant pretty much ever since the mid-2000s or so.

NOW: IBM who? Wait, didn't they used to make a really kick-ass laptop, back when we liked using laptops? "+1"

THEN: Oracle will "screw it up" at least once.

Right now, the Java community is poised, like a starving vulture, waiting for Oracle to do something else that demonstrates and befits their Evil Emperor status. The community has already been quick (far too quick, if you ask me) to highlight Oracle's supposed missteps, such as the JVM-crashing bug (which has already been fixed in the _u1 release of Java7, which garnered no attention from the various Java news sites) and the debacle around Hudson/Jenkins/whatever-the-heck-we-need-to-call-it-this-week. I'll grant you, the Hudson/Jenkins debacle was deserving of ire, but Oracle is hardly the Evil Emperor the community makes them out to be--at least, so far. (I'll admit it, though, I'm a touch biased, both because Brian Goetz is a friend of mine and because Oracle TechNet has asked me to write a column for them next year. Still, in the spirit of "innocent until proven guilty"....)

NOW: It is with great pleasure that I mark this prediction a miss. Oracle's been pretty good about things, sticking with the OpenJDK approach to developing software and talking very openly about what they're trying to do with Java8. They're not entirely innocent, mind you--the fact that a Java install tries to monkey with my browser bar by installing some plugin or other and so on is not something I really appreciate--but they're not acting like Ming the Merciless, either. Matter of fact, they even seem to be going out of their way to be community-inclusive, in some circles. I give myself a "-1" here, and I'm happy to claim it. Good job, guys.

THEN: VMWare/SpringSource will start pushing their cloud solution in a major way.

Companies like Microsoft and Google are pushing cloud solutions because Software-as-a-Service is a recurring revenue model, generating revenue even in years when the product hasn't incremented. VMWare, being a product company, is in the same boat--the only time they make money is when they sell a new copy of their product, unless they can start pushing their virtualization story onto hardware on behalf of clients--a.k.a. "the cloud". With SpringSource as the software stack, VMWare has a more-or-less complete cloud play, so it's surprising that they didn't push it harder in 2011; I suspect they'll start cramming it down everybody's throats in 2012. Expect to see Rod Johnson talking a lot about the cloud as a result.

NOW: I have to mark this one a miss, too, and frankly, I'm shocked to be doing it. I really thought this one was a no-brainer. CloudFoundry seemed like a pretty straightforward play, and VMWare already owned a significant share of the virtualization story, so.... And yet, I really haven't seen much by way of significant marketing, advertising, or developer outreach around their cloud story. It's much the same as what it was in 2011; it almost feels like the parent corporation (EMC) either doesn't "get" why they should push a cloud play, doesn't see it as worth the cost, or else doesn't care. Count me confused. "0"

THEN: JavaScript hype will continue to grow, and by year's end will be at near-backlash levels.

JavaScript (more properly known as ECMAScript, not that anyone seems to care but me) is gaining all kinds of steam as a mainstream development language (as opposed to just-a-browser language), particularly with the release of NodeJS. That hype will continue to escalate, and by the end of the year we may start to see a backlash against it. (Speaking personally, NodeJS is an interesting solution, but suggesting that it will replace your Tomcat or IIS server is a bit far-fetched; event-driven I/O is something both of those servers have been doing for years, and the rest of it is "just" a language discussion. We could pretty easily use JavaScript as the development language inside both servers, as Sun demonstrated years ago with their "Phobos" project--not that anybody really cared back then.)

NOW: JavaScript frameworks are exploding everywhere like fireworks at a Disney theme park. Douglas Crockford is getting more invites to conference keynote opportunities than James Gosling ever did. You can get a job if you know how to spell "NodeJS". And yet, I'm starting to hear the same kinds of rumblings about "how in the hell do we manage a 200K LOC codebase written in JavaScript" that I heard people gripe about Ruby/Rails a few years ago. If the backlash hasn't started, then it's right on the cusp. "+1"
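(As an aside, for anyone wondering what I mean by event-driven I/O in the THEN paragraph above, here's a minimal C# sketch--my illustration, nothing to do with NodeJS or Phobos internals--of serving requests without parking a blocked thread on each client.)

    using System;
    using System.Net;
    using System.Text;
    using System.Threading.Tasks;

    public static class EventedServer
    {
        public static async Task Main()
        {
            var listener = new HttpListener();
            listener.Prefixes.Add("http://localhost:8080/");
            listener.Start();
            Console.WriteLine("listening on :8080");

            while (true)
            {
                // Asynchronous wait: no thread blocks while the server sits idle.
                HttpListenerContext ctx = await listener.GetContextAsync();
                _ = HandleAsync(ctx); // fire and forget; loop right back for the next request
            }
        }

        static async Task HandleAsync(HttpListenerContext ctx)
        {
            byte[] body = Encoding.UTF8.GetBytes("hello, evented world");
            ctx.Response.ContentLength64 = body.Length;
            await ctx.Response.OutputStream.WriteAsync(body, 0, body.Length);
            ctx.Response.Close();
        }
    }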

THEN: NoSQL buzz will continue to grow, and by year's end will start to generate a backlash.

More and more companies are jumping into NoSQL-based solutions, and this trend will continue to accelerate, until some extremely public failure will start to generate a backlash against it. (This seems to be a pattern that shows up with a lot of technologies, so it seems entirely realistic that it'll happen here, too.) Mind you, I don't mean to suggest that the backlash will be factual or correct--usually these sorts of things come from misusing the tool, not from any intrinsic failure in it--but it'll generate some bad press.

NOW: Recently, I heard that NBC was thinking about starting up a new comedy series called "Everybody Hates Mongo", with Chris Rock narrating. And I think that's just the beginning--lots of companies, particularly startups, decided to run with a NoSQL solution before seriously contemplating how they were going to make up for the things that a NoSQL doesn't provide (like a schema, for a lot of these), and suddenly find themselves wishing they had spent a little more time thinking about that back in the early days. Again, if the backlash isn't already started, it's about to. "+1"

THEN: Ted will thoroughly rock the house during his CodeMash keynote.

Yeah, OK, that's more of a fervent wish than a prediction, but hey, keep a positive attitude and all that, right?

NOW: Welllll..... Looking back at it with almost a year's worth of distance, I can freely admit I dropped a few too many "F"-bombs (a buddy of mine counted 18), but aside from a (very) vocal minority, my takeaway is that a lot of people enjoyed it. Still, I do wish I'd throttled it back some--InfoQ recorded it, and the fact that it hasn't yet seen public posting on the website implies (to me) that they found it too much work to "bleep" out all the naughty words. Which I call "my bad" on, because I think they were really hoping to use that as part of their promotional activities (not that they needed it, selling out again in minutes). To all those who found it distasteful, I apologize, and to those who chafe at the fact that I'm apologizing, I apologize. I take a "-1" here.

2013 Predictions:

Having thus scored myself at a "9" (out of 17) for last year, let's take a stab at a few for next year:

  • "Big data" and "data analytics" will dominate the enterprise landscape. I'm actually pretty late to the ballgame to talk about this one, in fact--it was starting its rapid climb up the hype wave already this year. And, part and parcel with going up this end of the hype wave this quickly, it also stands to reason that companies will start marketing the hell out of the term "big data" without being entirely too precise about what they mean when they say "big data".... By the end of the year, people will start building services and/or products on top of Hadoop, which appears primed to be the "Big Data" platform of choice, thus far.
  • NoSQL buzz will start to diversify. The various "NoSQL" vendors are going to start wanting to differentiate themselves from each other, and will start using "NoSQL" in their marketing and advertising talking points less and less. Some of this will be because Pandora's Box on data storage has already been opened--nobody's just assuming a relational database all the time, every time, anymore--but some of this will be because the different NoSQL vendors, who are at different stages in the adoption curve, will want to distance themselves from the vendors that will bear the brunt of the backlash. I predict Mongo, which seems to be leading the way among the NoSQL vendors, will be the sacrificial scapegoat for a lot of the NoSQL backlash that's coming down the pike.
  • Desktops increasingly become niche products. Look, does anyone buy a desktop machine anymore? I have three sitting next to me in my office, and none of the three has been turned on in probably two years--I'm exclusively laptop-bound these days. Between tablets as consumption devices (slowly obsoleting the laptop), and cloud offerings becoming more and more varied (slowly obsoleting the server), there's just no room for companies that sell desktops--or the various Mom-and-Pop shops that put them together for you. In fact, I'm starting to wonder if all those parts I used to buy at Fry's Electronics and swap meets will start to disappear, too. Gamers keep desktops alive, and I don't know if there's enough money in that world to keep lots of those vendors alive. (I hope so, but I don't know for sure.)
  • Home servers will start to grow in interest. This may seem paradoxical to the previous point, but I think techno-geek leader-types are going to start looking into "servers-in-a-box" that they can set up at home and have all their devices sync to and store to. Sure, all the media will come through there, and the key here will be "turnkey", since most folks are getting used to machines that "just work". Lots of friends, for example, seem to be using Mac Minis for exactly this purpose, and there's a vendor here in Redmond that sells a ridiculously-powered server in a box for a couple thousand. (This is on my birthday list, right after I get my maxed-out 13" MacBook Air and iPad 3.) This is also going to be fueled by...
  • Private cloud is going to start getting hot. The great advantage of cloud is that you don't have to have an IT department; the great disadvantage of cloud is that when things go bad, you don't have an IT department. Too many well-publicized cloud failures are going to drive corporations to try and find a solution that is the best-of-both-worlds: the flexibility and resiliency of cloud provisioning, but staffed by IT resources they can whip and threaten and cajole when things fail. (And, by the way, I fully understand that most cloud providers have better uptimes than most private IT organizations--this is about perception and control and the feelings of powerlessness and helplessness when things go south, not reality.)
  • Oracle will release Java8, and while several Java pundits will decry "it's not the Java I love!", most will actually come to like it. Let's be blunt, Java has long since moved past being the flower of fancy and a critic's darling, and it's moved squarely into the battleship-gray of slogging out code and getting line-of-business apps done. Java8 adopting function literals (aka "closures") and retrofitting the Collection library to use them will be a subtle, but powerful, extension to the lifetime of the Java language, but it's never going to be sexy again. Fortunately, it doesn't need to be.
  • Microsoft will start courting the .NET developers again. Windows8 left a bad impression in the minds of many .NET developers, with the emphasis on HTML/JavaScript apps and C++ apps, leaving many .NET developers to wonder if they were somehow rendered obsolete by the new platform. Despite numerous attempts in numerous ways to tell them no, developers still seem to have that opinion--and Microsoft needs to go on the offensive to show them that .NET and Windows8 (and WinRT) do, in fact, go very well together. Microsoft can't afford for their loyal developer community to feel left out or abandoned. They know that, and they'll start working on it.
  • Samsung will start pushing themselves further and further into the consumer market. They already have started gathering more and more of a consumer name for themselves, they just need to solidify their tablet offerings and get closer in line with either Google (for Android tablets) or even Microsoft (for Windows8 tablets and/or Surface competitors) to compete with Apple. They may even start looking into writing their own tablet OS, which would be something of a mistake, but an understandable one.
  • Apple's next release cycle will, again, be "more of the same". iPhone 6, iPad 4, iPad Mini 2, MacBooks, MacBook Airs, none of them are going to get much in the way of innovation or new features. Apple is going to run squarely into the Innovator's Dilemma soon, and their products are going to be "more of the same" for a while. Incremental improvements along a couple of lines, perhaps, but nothing Earth-shattering. (Hey, Apple, how about opening up Siri to us to program against, for example, so we can hook into her command structure and hook our own apps up? I can do that with Android today, why not her?)
  • Visual Studio 2014 features will start being discussed at the end of the year. If Microsoft is going to hit their every-two-year-cycle with Visual Studio, then they'll start talking/whispering/rumoring some of the v.Next features towards the middle to end of 2013. I fully expect C# 6 will get some form of type providers, Visual Basic will be a close carbon copy of C# again, and F# 4 will have something completely revolutionary that anyone who sees it will be like, "Oh, cool! Now, when can I get that in C#?"
  • Scala interest wanes. As much as I don't want it to happen, I think interest in Scala is going to slow down, and possibly regress. This will be the year that Typesafe needs to make a major splash if they want to show the world that they're serious, and I don't know that the JVM world is really all that interested in seeing a new player. Instead, I think Scala will be seen as what "the 1%" of the Java community uses, and the rest will take some ideas from there and apply them (poorly, perhaps) to Java.
  • Interest in native languages will rise. Just for kicks, developers will start experimenting with some of the new compile-to-native-code languages (Go, Rust, Slate, Haskell, whatever) and start finding some of the joys (and heartaches) that come with running "on the metal". More importantly, they'll start looking at ways to use these languages with platforms where running "on the metal" is more important, like mobile devices and tablets.

As always, folks, thanks for reading. See you next year.

UPDATE: Two things happened this week (7 Jan 2013) that made me want to add to this list:
  • Hardware is the new platform. A buddy of mine (Scott Davis) pointed out on a mailing list we share that "hardware is the new platform", and with Microsoft's Surface out now, there are three major players (Apple, Google, Microsoft) in this game. It's becoming apparent that more and more companies are starting to see opportunities in going the Apple route of owning not just the OS and the store, but the hardware underneath it. More and more companies are going to start playing this game, too, I think, and we're going to see Amazon take some shots here, and probably a few others. Of course, the Ubuntu Phone has already been announced, as has a new Android-like player, Tizen, but I'm thinking less about the new players--there are always new players--and more about the big standouts. And look for companies like Dell and HP to start looking for ways to play in this game, too, either through partnerships or acquisitions. (Hello, Oracle, I'm looking at you.... And Adobe, too.)
  • APIs for lots of things are going to come out. Ford just did this. This is not going away--this is going to proliferate. And the startup community is going to lap it up like kittens attacking a bowl of cream. If you're looking for a play in the startup world, pursue this.

.NET | Android | Azure | C# | C++ | Conferences | Development Processes | F# | Flash | Industry | iPhone | Java/J2EE | Languages | LLVM | Mac OS | Objective-C | Parrot | Python | Reading | Review | Ruby | Scala | Security | Solaris | Visual Basic | VMWare | WCF | Windows | XML Services | XNA

Tuesday, January 01, 2013 1:22:30 AM (Pacific Standard Time, UTC-08:00)
Comments [2]  | 
 Wednesday, December 26, 2012
Thoughts on my new Surface

As a post-Christmas gift to myself, I took a bit of the money that my folks gave us and bought myself a 64GB Surface. A couple of thoughts came to mind as I sat down to play with this thing:

  1. Microsoft doesn't sell a 64GB model with a Type keyboard? I know the touch-thing is, like, the new hotness with everyone, but frankly, having played with a friend's Surface and his (preferred) Touch keyboard cover, I think both he and Microsoft are smoking some serious crack if they think anyone can seriously touch-type on the touch keyboard. (To be fair, it's not just Microsoft, either--I can't effectively touch-type on my iPad or Galaxy Tab, either. I need the tactile feedback from the spring underneath the key and the edges of the keys themselves to know if I hit the key squarely or not.) More importantly, why on earth does Microsoft think that people buying the 64GB model won't want the Type cover? Or is this an insidious ploy to force me to accept a bundle I don't want? (The 64GB model apparently comes only with a Touch cover; you can't buy it without one.) It certainly worked--I bought the 64GB with Touch cover for $699, then the Type cover by itself for another $129. (Let the conspiracists go crazy with that one.)
  2. The packaging is awfully reminiscent of the iPad/iPhone/iPod packaging style. Nice to see that Microsoft can leverage good ideas. ;-)
  3. So I fire this thing up, and the first thing I'm told is that there are 15 updates waiting. I'm all for keeping bits fresh and current and fixed, but this seems a bit excessive--why do so many apps need an update so quickly after the device's initial release? What's worse, the Store app doesn't tell you what these updates are for, as near as I can tell, so you can't tell which ones are crucial and which ones are just cosmetic. Kind of a fail there.
  4. Wait, how do I right-click on this thing? Or has Microsoft finally come to the realization that one mouse button is all you need right about the time that Apple seems ready to accept that two buttons are, in fact, a superior way of life?
  5. The form factor on this thing is a little bit larger than I expected for some reason. Not that I didn't really know how big it is (and it's not really all that big, at least not when compared to the Samsung tablet they gave us at //Build/ two years ago), but for some reason it just feels bigger than it is.
  6. The keyboard makes me think of it as a laptop, not a tablet. I find myself wanting to go download Visual Studio and put a stripped-down version of it on here. (I even asked my buddy who had a Surface if he'd managed to do that yet, and he--gently--reminded me that since this is Windows RT, and an ARM processor, it won't run on here.)
  7. Because I still wasn't convinced that this isn't a laptop, I tried to download Dropbox onto here. The Surface let me download the whole thing, then told me "This app cannot run on Surface". D'oh! Busted. I am an idiot.
  8. But no Dropbox on here? Really, Microsoft? This seems like a fairly major oversight. I know, Sinofsky was not a "team player", but he's gone now: Find the Dropbox team, give them a ton of money and a few "We're sorry, we won't shut you out again, we promise" mea culpas, and get one of the most popular productivity apps on the planet on this thing. Seriously.
  9. And while we're fixing things, can we please get the Store to be a little more responsive? I know the UX here is going for a "minimalist" vibe, but some part of me wants to see some whirlygigs or something going on while I'm downloading apps. (I, of course, will probably regret this in two years, and vehemently deny saying this when the whirlygigs make me long for a clean and simple interface after Microsoft jazzes it all up to the point of migraine-inducing snazziness.)
  10. And why did the Store hang in the middle of doing my 15 updates and 4 app downloads? It may have been the Internet connection (I'm sitting in a restaurant as I do this, and restaurant WiFi is on par with hotel WiFi in its reliability and bandwidth), but if it is, give me some kind of indication and don't lock me out of doing anything. (The screen became entirely unresponsive.) That's silly.
  11. Oh, and Evernote? After you install and start downloading my notes, same thing--don't get all silent on me and not tell me what's going on.
  12. Wait, Word and Excel and PowerPoint and OneNote are just Office 2013 previews? Not the real thing? Interesting--will I get a free update when those go live, or is this just another "play for free for 90 days then we soak you for money" kind of arrangement? (And if so, will I be able to use an MVP MSDN key to update/upgrade/install them?)
  13. And now, post-reboot, Store won't launch--it just goes into the spinning circle of deathly dots. (Did I just coin that phrase? Can I copyright it?)
All in all, in the hour or so I've had it, it's not been a terrible experience, but I can't say it's been "sublime" or "world-changing". I'm glad I have it, because once I get a system worked out whereby I can easily share files back and forth between my Surface and the rest of my machines (yes, Mr. Ballmer, I know about SkyDrive, I just haven't been using mine and have to figure out how and where and when I would shift things back and forth between it and Dropbox), I look forward to giving this thing a spin for some of my upcoming blog entries and articles.

Which reminds me: whichever of BitBucket or GitHub manages to bring git or Mercurial over to the Surface (and iPad, and Android) will be a hell of a first-mover on integrating source control into peoples' daily lives. Can you imagine if GitHub and Dropbox joined forces? That would be interesting.


Conferences | Industry | Personal | Reading | Review | Social | Windows

Wednesday, December 26, 2012 6:02:26 PM (Pacific Standard Time, UTC-08:00)
Comments [0]  | 
 Wednesday, November 28, 2012
On Knowledge

Back during the Bush-Jr Administration, Donald Rumsfeld drew quite a bit of fire for his discussion of knowledge, in which he said (loosely paraphrasing) "There are three kinds of knowledge: what you know you know, what you know you don't know, and what you don't know you don't know". Lots of Americans, particularly those who were not kindly disposed towards "Rummy" in the first place, took this to be canonical Washington doublespeak, and berated him for it.

I actually think that was one of the few things Rumsfeld said that was worth listening to, and I have a slight amendment to the statement; but first, let's level-set and make sure we're all on the same page about what those first three categories mean, in real life, with a few assumptions along the way to simplify the discussion (as best we can, anyway):

  1. What you know you know. This is the category of information that the individual in question has studied to some level of depth: for a student of International Relations (as I was), this would be the various classes that they took and received (presumably) a passing grade in. For you, the reader of my blog, that would probably be some programming language and/or platform. This is knowledge that you have, in some depth, at a degree that most people would consider "factually accurate".
  2. What you know you don't know. This is the category of information that the individual in question has heard about, but has never studied to any level or degree: for the student of International Relations, this might be the subject of biochemistry or electrical engineering. For you, the reader of my blog, it might be certain languages that you've heard of, perhaps through this blog (Erlang, F#, Scala, Clojure, Haskell, etc) or data-storage systems (Cassandra, CouchDB, Riak, Redis, etc) that you've never investigated or even sat through a lecture about. This is knowledge that you realize you don't have.
  3. What you don't know you don't know. This is the category of information that the individual in question has never even heard about, and so therefore, by definition, has not only the lack of knowledge of the subject, but lacks the realization that they lack the knowledge of the subject. For the student of International Relations, this might be phrenology or Schrodinger's Cat. For you, the reader of my blog, it might be languages like Dylan, Crack, Brainf*ck, Ook, or Shakespeare (which I'm guessing is going to trigger a few Google searches) or platforms like BeOS (if you're in your early 20's now), AmigaOS (if you're in your early 30's now) or database tools/platforms/environments like Pick or Paradox. This is knowledge that you didn't realize you don't have (but, paradoxically, now that you know you don't have it, it moves into the "know you don't know" category).
Typically, this discussion comes up in my "Pragmatic Architecture" talk, because an architect needs to have a very clear realization of what technologies and/or platforms are in which of those three categories, and (IMHO) push as many of them from category #3 (don't know that you don't know) into category #2 (know you don't know) or, ideally, category #1 (know you know). Note that category #1 doesn't mean that you are the world's foremost expert on the thing, but that you have some working knowledge of the thing in question--I don't consider myself to be an expert on Cassandra, for example, but I know enough that I can talk reasonably intelligently about it, and I know where I can get more in the way of details if that becomes important, so I peg it in category #1.

But what if I'm wrong?

See, here's where I think there's a new level of knowledge, and it's one I think every software developer needs to admit exists, at least for various things in their own mind:

  • What you think you know. This is knowledge that you believe, in your heart of hearts, you have about a given subject.
Be honest with yourself: we've all met somebody in this industry who claims to have knowledge/expertise on a subject, and damn if they can't talk a good game. They genuinely believe, in fact, that they know the subject in question, and speak with the confidence and assurance that comes with that belief. (I'm assuming that the speaker in question isn't trying to deliberately deceive anyone, which may, in some cases, be a naive and/or false assumption, but I'm leaving that aside for now.) But, after a while, it becomes apparent, either to themselves or to the others around them, that the knowledge they have is either incorrect, out of date, out of context, or some combination of all three.

As much as "what you don't know you don't know" information is dangerous, "what you think you know" information is far, far more so, particularly because until you demonstrate to yourself that your information is actually correct, you're a danger and a liability to anyone who listens to you. Without regularly challenging yourself to some form of external review/challenge, you'll never exactly know whether what you know is real, or just made up from your head.

This is why, at every turn, your assumption should be that any information you have is partly or wholly incorrect until proven otherwise. Find out why you know something--what combination of facts and data led you to believe that this is the case?--and you will quickly begin to discover whether that knowledge is real, or just some kind of elaborate self-deception.


Conferences | Development Processes | Industry | Personal | Review | Social

Wednesday, November 28, 2012 6:13:45 PM (Pacific Standard Time, UTC-08:00)
Comments [0]  | 
 Saturday, November 03, 2012
Cloud legal

There's an interesting legal interpretation coming out of the Electronic Frontier Foundation (EFF) around the Megaupload case; the EFF has said this:

"The government maintains that Mr. Goodwin lost his property rights in his data by storing it on a cloud computing service. Specifically, the government argues that both the contract between Megaupload and Mr. Goodwin (a standard cloud computing contract) and the contract between Megaupload and the server host, Carpathia (also a standard agreement), "likely limit any property interest he may have" in his data. (Page 4). If the government is right, no provider can both protect itself against sudden losses (like those due to a hurricane) and also promise its customers that their property rights will be maintained when they use the service. Nor can they promise that their property might not suddenly disappear, with no reasonable way to get it back if the government comes in with a warrant. Apparently your property rights "become severely limited" if you allow someone else to host your data under standard cloud computing arrangements. This argument isn't limited in any way to Megaupload -- it would apply if the third party host was Amazon's S3 or Google Apps or or Apple iCloud."
Now, one of the participants on the Seattle Tech Startup list, Jonathan Shapiro, wrote this as an interpretation of the government's brief and the EFF filing:

What the government actually says is that the state of Mr. Goodwin's property rights depends on his agreement with the cloud provider and their agreement with the infrastructure provider. The question ultimately comes down to: if I upload data onto a machine that you own, who owns the copy of the data that ends up on your machine? The answer to that question depends on the agreements involved, which is what the government is saying. Without reviewing the agreements, it isn't clear if the upload should be thought of as a loan, a gift, a transfer, or something else.

Lacking any physical embodiment, it is not clear whether the bits comprising these uploaded digital artifacts constitute property in the traditional sense at all. Even if they do, the government is arguing that who owns the bits may have nothing to do with who controls the use of the bits; that the two are separate matters. That's quite standard: your decision to buy a book from the bookstore conveys ownership to you, but does not give you the right to make further copies of the book. Once a copy of the data leaves the possession of Mr. Goodwin, the constraints on its use are determined by copyright law and license terms. The agreement between Goodwin and the cloud provider clearly narrows the copyright-driven constraints, because the cloud provider has to be able to make copies to provide their services, and has surely placed terms that permit this in their user agreement. The consequences for ownership are unclear. In particular: if the cloud provider (as opposed to Mr. Goodwin) makes an authorized copy of Goodwin's data in the course of their operations, using only the resources of the cloud provider, the ownership of that copy doesn't seem obvious at all. A license may exist requiring that copy to be destroyed under certain circumstances (e.g. if Mr. Goodwin terminates his contract), but that doesn't speak to ownership of the copy.

Because no sale has occurred, and there was clearly no intent to cede ownership, the Government's challenge concerning ownership has the feel of violating common sense. If you share that feeling, welcome to the world of intellectual property law. But while everyone is looking at the negative side of this argument, it's worth considering that there may be positive consequences of the Government's argument. In Germany, for example, software is property. It is illegal (or at least unenforceable) to write a software license in Germany that stops me from selling my copy of a piece of software to my friend, so long as I remove it from my machine. A copy of a work of software can be resold in the same way that a book can be resold because it is property. At present, the provisions of UCITA in the U.S. have the effect that you do not own a work of software that you buy. If the district court in Virginia determines that a recipient has property rights in a copy of software that they receive, that could have far-reaching consequences, possibly including a consequent right of resale in the United States.

Now, whether or not Jon's interpretation is correct, there are some huge legal implications of this interpretation of the cloud, because data "ownership" is going to be the defining legal issue of the next century.


.NET | Android | Azure | C# | C++ | Conferences | Development Processes | F# | Flash | Industry | iPhone | Java/J2EE | Languages | LLVM | Mac OS | Objective-C | Parrot | Personal | Python | Reading | Review | Ruby | Scala | Security | Social | Solaris | Visual Basic | VMWare | WCF | Windows | XML Services | XNA

Saturday, November 03, 2012 12:14:40 AM (Pacific Daylight Time, UTC-07:00)
Comments [0]  | 
 Thursday, November 01, 2012
Vietnam... in Bulgarian

I received an email from Dimitar Teykiyski a few days ago, asking if he could translate the "Vietnam of Computer Science" essay into Bulgarian, and no sooner had I replied in the affirmative than he sent me the link to it. If you're Bulgarian, enjoy. I'll try to make a few moments to put the link to the translation directly on the original blog post itself, but it'll take a little bit--I have a few other things higher up in the priority queue. (And somebody please tell me how to say "Thank you" in Bulgarian, so I may do that right for Dimitar?)


.NET | Android | C# | Conferences | Development Processes | F# | Industry | Java/J2EE | Languages | Objective-C | Python | Reading | Review | Ruby | Scala | Visual Basic | WCF | XML Services

Thursday, November 01, 2012 4:17:58 PM (Pacific Daylight Time, UTC-07:00)
Comments [1]  | 
 Friday, March 16, 2012
Just Say No to SSNs

Two things conspire to bring you this blog post.

Of Contracts and Contracts

First, a few months ago, I was asked to participate in an architectural review for a project being done for one of the states here in the US. It was a project dealing with some sensitive information (Child Welfare Services), and I was required to sign a document basically promising not to do anything bad with the data. Not a problem to sign, since I was going to be more focused on the architecture and code anyway, and would stay away from the production servers and data as much as I possibly could. But then the state agency asked for my social security number, and when I pushed back asking why, they told me it was “mandatory” in order to work on the project. I suspect it was for a background check—but when I asked how long they were going to hold on to the number and what their privacy policy was regarding my data, they refused to answer, and I never heard from them again. Which, quite frankly, was something of a relief.

Second, just tonight there was a thread on the Seattle Tech Startup mailing list about SSNs again. This time, a contractor who participates on the list was being asked by the contracting agency for his SSN, not for any tax document form, but… just because. This sounded fishy. It turned out that the contract was going to be with AT&T, and that they commonly use a contractor’s SSN as a way of identifying the contractor in their vendor database. It was also noted that many companies do this, and that it was likely that many more would do so in the future. One poster pointed out that when the state attorney general’s office was contacted about this practice, the answer came back that it isn’t illegal.

Folks, this practice has to stop. For both your sake, and the company’s.

Of Data and Integrity

Using SSNs in your database is just a bad idea from top to bottom. For starters, it makes your otherwise-unassuming enterprise application a ripe target for hackers, who seek to gather legitimate SSNs as part of the digital fingerprinting of potential victims for identity theft. What’s worse, any time I’ve ever seen any company store the SSNs, they’re almost always stored in plaintext form (“These aren’t credit cards!”), and they’re often used as a primary key to uniquely identify individuals.

There’s so many things wrong with this idea from a data management perspective, it’s shameful.

  • SSNs were never intended for identification purposes. Yeah, this is a weak argument now, given all the de facto uses to which they are put already, but when FDR passed the Social Security program back in the 30s, he promised the country that they would never be used for identification purposes. This is, in fact, why the card reads “This number not to be used for identification purposes” across the bottom. Granted, every financial institution with whom I’ve ever done business has ignored that promise for as long as I’ve been alive, but that doesn’t strike me as a reason to continue doing so.
  • SSNs are not unique. There are rumors of two different people being issued the same SSN, and while I can’t confirm or deny this based on personal experience, it doesn’t take a rocket scientist to figure out that if there are 300 million people living in the US, and the SSN is a nine-digit number, the pool tops out at a hair under a billion numbers in the absolute best case (which isn’t achievable, because the first three digits are a stratification mechanism—for example, California-issued numbers are generally in the 5xx range, while East Coast-issued numbers are in the 0xx range). What I can say for certain is that duplicates do happen in practice—through clerical error and fraud, if nothing else—and if the pool ever runs dry, recycling deceased individuals’ numbers is the obvious pressure valve, meaning your new baby could end up with somebody else’s SSN. As we start to see databases extending to a second and possibly even third generation of individuals, these kinds of conflicts are going to become even more common. As the US population continues to rise, and immigration brings even more people into the country to work, how soon before we start seeing the US government sweat the problems associated with trying to go to a 10- or 11-digit SSN? It’s going to make the IPv4-to-IPv6 transition look trivial by comparison. (Look for that to be the moment when the US government formally adopts a hexadecimal system for SSNs.)
  • SSNs are sensitive data. You knew this already. But what you may not realize is that data not only has a tendency to escape the organization that gathered it (databases are often sold, acquired, or stolen), but that said data frequently lives far, far longer than it needs to. Look around in your own company—how many databases are still online, in use, even though the data isn’t really relevant anymore, just because “there’s no cost to keeping it”? More importantly, companies are increasingly being held accountable for sensitive information breaches, and it’s just a matter of time before a creative lawyer, seeking to tap into the public’s sensitivity to things it doesn’t understand, takes a company to court and sues for damages over such a breach. And there are very likely more than a few judges in the country sympathetic to the idea. Do you really want to be hauled up on the witness stand to defend your use of the SSN in your database?

Given that SSNs aren’t unique, and therefore fail at their primary purpose in a data management scheme, and that they represent a huge liability because of their sensitive nature, why on earth would you want them in your database?
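If you absolutely must capture an SSN (a regulatory requirement, say), here's a minimal sketch of the kind of alternative I'm arguing for, in C#: key the record on a surrogate value, and keep at most a salted hash of the number. The class and member names are mine, invented for illustration, and this is a sketch, not production-grade crypto guidance.

    using System;
    using System.Security.Cryptography;
    using System.Text;

    class PersonRecord
    {
        // Surrogate primary key: unique, meaningless, and safe to hand around.
        public Guid PersonId = Guid.NewGuid();

        // A salted hash can answer "is this the same number we saw before?"
        // without the raw SSN ever sitting in a table.
        public byte[] SsnSalt;
        public byte[] SsnHash;

        public void CaptureSsn(string ssn)
        {
            SsnSalt = new byte[16];
            using (var rng = new RNGCryptoServiceProvider())
                rng.GetBytes(SsnSalt); // per-record salt defeats precomputed tables

            using (var sha = SHA256.Create())
            {
                byte[] ssnBytes = Encoding.UTF8.GetBytes(ssn);
                byte[] buffer = new byte[SsnSalt.Length + ssnBytes.Length];
                SsnSalt.CopyTo(buffer, 0);
                ssnBytes.CopyTo(buffer, SsnSalt.Length);
                SsnHash = sha.ComputeHash(buffer);
            }
            // Caveat: with fewer than a billion possible SSNs, a fast hash only
            // slows a determined attacker down--one more argument for not
            // storing the number in any form at all.
        }
    }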

A Call to Arms

But more importantly, companies aren’t going to stop using them for these kinds of purposes until we make them stop. Any time a company asks you for your SSN, challenge them. Ask them why they need it, whether the transaction can be completed without it, and—if they insist on having it—for a formal declaration of their sensitive-information policy and what kind of notification and compensation you can expect when they suffer a sensitive data breach. At the places that legitimately need the information, it may take a while to find somebody within the company who can answer your questions, but you’ll get there eventually. And as for the rest of the companies, the ones that gather it “just in case”: well, if it starts turning into a huge PITA to get the number, they’ll find other ways to figure out who you are.

This is a call to arms, folks: Just say NO to handing over your SSN.


.NET | Android | Azure | C# | C++ | Conferences | Development Processes | F# | Flash | Industry | iPhone | Java/J2EE | Languages | LLVM | Mac OS | Objective-C | Parrot | Personal | Python | Reading | Review | Ruby | Scala | Security | Social | Solaris | Visual Basic | VMWare | WCF | Windows | XML Services | XNA

Friday, March 16, 2012 11:10:49 PM (Pacific Daylight Time, UTC-07:00)
Comments [1]  | 
 Saturday, January 01, 2011
Tech Predictions, 2011 Edition

Long-time readers of this blog know what’s coming next: it’s time for Ted to prognosticate on what the coming year of tech will bring us. But I believe strongly in accountability, even in my offered-up-for-free predictions, so one of the traditions of this space is to go back and revisit my predictions from this time last year. So, without further ado, let’s look back at Ted’s 2010 predictions, and see how things played out; 2010 predictions are prefixed with “THEN”, and my thoughts on my predictions are prefixed with “NOW”:

For 2010, I predicted....

  • THEN: ... I will offer 3- and 4-day training classes on F# and Scala, among other things. OK, that's not fair—yes, I have the materials, I just need to work out locations and times. Contact me if you're interested in a private class, by the way.
    • NOW: Well, I offered them… I just didn’t do much to advertise them or sell them. I got plenty busy just with the other things I had going on. Besides, this and the next prediction were pretty much all advertisement anyway, so I don’t know if anybody really counts these two.
  • THEN: ... I will publish two books, one on F# and one on Scala. OK, OK, another plug. Or, rather, more of a resolution. One will be the "Professional F#" I'm doing for Wiley/Wrox, the other isn't yet finalized. But it'll either be published through a publisher, or self-published, by JavaOne 2010.
    • NOW: “Professional F# 2.0” shipped in Q3 of 2010; the other Scala book I decided not to pursue—too much stuff going on to really put the necessary time into it. (Cue sad trombone.)
  • THEN: ... DSLs will either "succeed" this year, or begin the short slide into the dustbin of obscure programming ideas. Domain-specific language advocates have to put up some kind of strawman for developers to learn from and poke at, or the whole concept will just fade away. Martin's book will help, if it ships this year, but even that might not be enough to generate interest if it doesn't have some kind of large-scale applicability in it. Patterns and refactoring and enterprise containers all had a huge advantage in that developers could see pretty easily what the problem was they solved; DSLs haven't made that clear yet.
    • NOW: To be honest, this one is hard to call. Martin Fowler published his DSL book, which many people consider to be a good sign of what’s happening in the world, but really, the DSL buzz seems to have dropped off significantly. The strawman hasn’t appeared in any meaningful public way (I still don’t see an example being offered up from anybody), and that leads me to believe that the fading-away has started.
  • THEN: ... functional languages will start to see a backlash. I hate to say it, but "getting" the functional mindset is hard, and there's precious few resources making it easy for mainstream (read: O-O) developers to make that adjustment, far fewer than there were during the procedural-to-object shift. If the functional community doesn't want to become mainstream, then mainstream developers will find ways to take functional's most compelling gateway use-case (parallel/concurrent programming) and find a way to "git 'er done" in the traditional O-O approach, probably through software transactional memory, and functional languages like Haskell and Erlang will be relegated to the "What Might Have Been" of computer science history. Not sure what I mean? Try this: walk into a functional language forum, and ask what a monad is. Nobody yet has been able to produce an answer that doesn't involve math theory, or that does involve a practical domain-object-based example. In fact, nobody has really said why (or if) monads are even still useful. Or catamorphisms. Or any of the other dime-store words that the functional community likes to toss around.
    • NOW: I think I have to admit that this hasn’t happened—at least, there’s been no backlash that I’ve seen. In fact, what’s interesting is that there’s been some movement to bring those functional concepts—including monads, which surprised me completely—into other languages like C# or Java for discussion and use. That being said, though, I don’t see Haskell and Erlang taking center stage as application languages—instead, I see them taking supporting-cast kinds of roles building other infrastructure that applications in turn make use of, a la CouchDB (written in Erlang). Monads still remain a mostly-opaque subject for most developers, however, and it’s still unclear if monads are something that people should think about applying in code, or if they are one of those “in theory” kinds of concepts. (You know, one of those ideas that change your brain forever, but you never actually use directly in code.) And since I keep complaining that nobody offers a practical example, I've taken a stab at one myself, just after this list.
  • THEN: ... Visual Studio 2010 will ship on time, and be one of the buggiest and/or slowest releases in its history. I hate to make this prediction, because I really don't want to be right, but there's just so much happening in the Visual Studio refactoring effort that it makes me incredibly nervous. Widespread adoption of VS2010 will wait until SP1 at the earliest. In fact....
    • NOW: Wow, did I get a few people here in Redmond annoyed with me about that one. And, as it turned out, I was pretty off-base about its stability. (It shipped pretty close to, if not exactly on, the ship date Microsoft promised, as I recall, though I admit I wasn’t paying too much attention to it.) I’ve been using VS 2010 for a lot of .NET work in the last six months, and I’ve yet (knock on wood) to have it crash on me. /bow Visual Studio team.
  • THEN: ... Visual Studio 2010 SP 1 will ship within three months of the final product. Microsoft knows that people wait until SP 1 to think about upgrading, so they'll just plan for an eager SP 1 release, and hope that managers will be too hung over from the New Year (still) to notice that the necessary shakeout time hasn't happened.
    • NOW: Uh…. nope. In fact, SP 1 has just reached a beta/CTP state. As for managers being too hung over, well…
  • THEN: ... Apple will ship a tablet with multi-touch on it, and it will flop horribly. Not sure why I think this, but I just don't think the multi-touch paradigm that Apple has cooked up for the iPhone will carry over to a tablet/laptop device. That won't stop them from shipping it, and it won't stop Apple fan-boiz from buying it, but that's about where the interest will end.
    • NOW: Oh, WOW did I come so close and yet missed the mark by a mile. Of course, the “tablet” that Apple shipped was the iPad, and it did pretty much everything except flop horribly. Apple fan-boys bought it… and then about 24 hours later, so did everybody else. My mom got one, for crying out loud. And folks, the iPad—along with the whole “slate” concept—is pretty clearly here to stay.
  • THEN: ... JDK 7 closures will be debated for a few weeks, then become a fait accompli as the Java community shrugs its collective shoulders. Frankly, I think the Java community has exhausted its interest in debating new language features for Java. Recent college grads and open-source groups with an axe to grind will continue to try and make an issue out of this, but I think the overall Java community just... doesn't... care. They just want to see JDK 7 ship someday.
    • NOW: Pretty close—except that closures won’t ship as part of JDK 7, largely due to the Oracle acquisition in the middle of the year here. And I was spot-on vis-à-vis the “they want to see JDK 7 ship someday”; when given the chance to wait for a year or so for a Java-with-closures to ship, the community overwhelmingly voted to get something sooner rather than later.
  • THEN: ... Scala either "pops" in 2010, or begins to fall apart. By "pops", I mean reaches a critical mass of developers interested in using it, enough to convince somebody to create a company around it, a la G2One.
    • NOW: … and by “somebody”, it turns out I meant Martin Odersky. Scala is pretty clearly a hot topic in the Java space, its buzz being disturbed only by Clojure. Scala and/or Clojure, plus Groovy, makes a really compelling JVM-based stack.
  • THEN: ... Oracle is going to make a serious "cloud" play, probably by offering an Oracle-hosted version of Azure or AppEngine. Oracle loves the enterprise space too much, and derives too much money from it, to not at least appear to have some kind of offering here. Now that they own Java, they'll marry it up against OpenSolaris, the Oracle database, and throw the whole thing into a series of server centers all over the continent, and call it "Oracle 12c" (c for Cloud, of course) or something.
    • NOW: Oracle made a play, but it was to continue to enhance Java, not build a cloud space. It surprises me that they haven’t made a more forceful move in this space, but I suspect that a huge amount of time and energy went into folding Sun into their corporate environment.
  • THEN: ... Spring development will slow to a crawl and start to take a left turn toward cloud ideas. VMWare bought SpringSource for a reason, and I believe it's entirely centered around VMWare's movement into the cloud space—they want to be more than "just" a virtualization tool. Spring + Groovy makes a compelling development stack, particularly if VMWare does some interesting hooks-n-hacks to make Spring a virtualization environment in its own right somehow. But from a practical perspective, any community-driven development against Spring is all but dead. The source may be downloadable later, like the VMWare Player code is, but making contributions back? Fuhgeddabowdit.
    • NOW: The Spring One show definitely played up Cloud stuff, and springsource.com seems to be emphasizing cloud more in a couple of subtle ways. Not sure if I call this one a win or not for me, though.
  • THEN: ... the explosion of e-book readers brings the Kindle 2009 edition way down to size. The era of the e-book reader is here, and honestly, while I'm glad I have a Kindle, I'm expecting that it'll be gathering dust on a shelf in a few years. Kinda like my iPods from a few years ago.
    • NOW: Honestly, can’t say that I’m using my Kindle a lot, but I am reading using the Kindle app on non-Kindle hardware more than I thought I would be. That said, I am eyeing the new Kindle hardware generation with an acquisitive eye…
  • THEN: ... "social networking" becomes the "Web 2.0" of 2010. In other words, using the term will basically identify you as a tech wannabe and clearly out of touch with the bleeding edge.
    • NOW: Um…. yeah.
  • THEN: ... Facebook becomes a developer platform requirement. I don't pretend to know anything about Facebook—I'm not even on it, which amazes my family to no end—but clearly Facebook is one of those mechanisms by which people reach each other, and before long, it'll start showing up as a developer requirement for companies looking to hire. If you're looking to build out your resume to make yourself attractive to companies in 2010, mad Facebook skillz might not be a bad investment.
    • NOW: I’m on Facebook, I’ve written some code for it, and given how much the startup scene loves the “Like” button, I think developers who knew Facebook in 2010 did pretty well for themselves.
  • THEN: ... Nintendo releases an open SDK for building games for its next-gen DS-based device. With the spectacular success of games on the iPhone, Nintendo clearly must see that they're missing a huge opportunity every day developers can't write games for the Nintendo DS that are easily downloadable to the device for playing. Nintendo is not stupid—if they don't open up the SDK and promote "casual" games like those on the iPhone and those that can now be downloaded to the Zune or the XBox, they risk being marginalized out of existence.
    • NOW: Um… yeah. Maybe this was me just being hopeful.
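And, since I keep griping that nobody offers a practical monad example, here's my own stab at one, in C#. All the names here are mine, invented for illustration; the point is just that a monad is a wrapper type plus a "bind" operation that chains together steps which might not produce a value, so the null-checking plumbing disappears from the call site.

    using System;

    static class MaybeExtensions
    {
        // "Bind": run the next step only if the previous step produced a value.
        public static TOut? Bind<TIn, TOut>(this TIn? value, Func<TIn, TOut?> next)
            where TIn : struct
            where TOut : struct
        {
            return value.HasValue ? next(value.Value) : null;
        }
    }

    class MonadExample
    {
        // A step that can fail: parse a port number out of a config string.
        static int? ParsePort(string text)
        {
            int port;
            return int.TryParse(text, out port) ? (int?)port : null;
        }

        // Another step that can fail: reject ports outside the valid range.
        static int? ValidatePort(int port)
        {
            return (port > 0 && port < 65536) ? (int?)port : null;
        }

        static void Main()
        {
            // The happy path reads straight through; the "might be missing"
            // bookkeeping lives entirely inside Bind.
            int? port = ParsePort("8080").Bind(ValidatePort);
            Console.WriteLine(port.HasValue ? port.Value.ToString() : "no usable port");
        }
    }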

In general, it looks like I was more right than wrong, which is not a bad record to have. Of course, a couple of those “wrong”s were “giving up the big play” kind of wrongs, so while I may have a winning record, I still may have a defense that’s given up too many points to be taken seriously. *shrug* Oh, well.

What portends for 2011?

  • Android’s penetration into the mobile space is going to rise, then plateau around the middle of the year. Android phones, collectively, have outpaced iPhone sales. That’s a pretty significant statistic—and it also means the pool of first-time smartphone buyers left to win over is shrinking as the market saturates. More importantly, the first generation of Android slates (including the Galaxy Tab, which I own) is less-than-sublime, and not really an “iPad Killer” device by any stretch of the imagination. And I think that will slow down people buying Android slates and phones, particularly since Google has all but promised that Android releases will start slowing down.
  • Windows Phone 7 penetration into the mobile space will appear huge, then slow down towards the middle of the year. Microsoft is getting some pretty decent numbers now, from what I can piece together, and I think that’s largely the “I love Microsoft” crowd buying in. But it’s a pretty crowded place right now with Android and iPhone, and I’m not sure if the much-easier Office and/or Exchange integration is enough to woo consumers (who care about Office) or business types (who care about Exchange) away from their Androids and iPhones.
  • Android, iOS and/or Windows Phone 7 becomes a developer requirement. Developers, if you haven’t taken the time to learn how to program one of these three platforms, you are electing to remove yourself from a growing market that desperately wants people with these skills. I see the “mobile native app development” space as every bit as hot as the “Internet/Web development” space was back in 2000. If you don’t have a device, buy one. If you have a device, get the tools—in all three cases they’re free downloads—and start writing stupid little apps that nobody cares about, so you can have some skills on the platform when somebody cares about it.
  • The Windows 7 slates will suck. This isn’t a prediction, this is established fact. I played with an “ExoPC” 10” form-factor slate running Windows 7 (Dell, I think, was the manufacturer), and it was a horrible experience. Windows 7, like most OSes, really expects a keyboard to be present, and a slate doesn’t have one—so the OS was hacked to put a “keyboard” button at the top of the screen that would slide out to let you touch-type on the slate. I tried to fire up Notepad and type out a haiku, and it was an unbelievably awkward process. Android and iOS clearly own the slate market for the foreseeable future, and if Dell has any brains in its corporate head, it will phone up Google tomorrow and start talking about putting Android on that hardware.
  • DSLs mostly disappear from the buzz. I still see no strawman (no “pet store” equivalent), and none of the traditional builders-of-strawmen (Microsoft, Oracle, etc) appear interested in DSLs much anymore, so I think 2010 will mark the last year that we spent any time talking about the concept.
  • Facebook becomes more of a developer requirement than before. I don’t like Mark Zuckerberg. I don’t like Facebook’s privacy policies. I don’t particularly like the way Facebook approaches the Facebook Connect experience. But Facebook owns enough people to be the fourth-largest nation on the planet, and probably commands an economy of roughly that size to boot. If your app is aimed at the Facebook demographic (that is, everybody who’s not on Twitter), you have to know how to reach these people, and that means developing at least some part of your system to integrate with it.
  • Twitter becomes more of a developer requirement, too. Anybody who’s not on Facebook is on Twitter. Or dead. So to reach the other half of the online community, you have to know how to connect out with Twitter.
  • XMPP becomes more of a developer requirement. XMPP hasn’t crossed a lot of people’s radar screens before, but Facebook decided to adopt it as their chat system communication protocol, and Google’s already been using it, and suddenly there’s a whole lotta traffic going over XMPP. More importantly, it offers a two-way communication experience that is in some scenarios vastly better than what HTTP offers, yet running in a very “Internet-friendly” way just as HTTP does. I suspect that XMPP is going to start cropping up in a number of places as a useful alternative and/or complement to using HTTP. (If you’ve never seen XMPP on the wire, there’s a quick sketch of what a stanza looks like just after this list.)
  • “Gamification” starts making serious inroads into non-gaming systems. Maybe it’s just because I’ve been talking more about gaming, game design, and game implementation last year, but all of a sudden “gamification”—the process of putting game-like concepts into non-game applications—is cresting in a big way. FourSquare, Yelp, Gowalla, suddenly all these systems are offering achievement badges and scoring systems for people who want to play in their worlds. How long is it before a developer is pulled into a meeting and told that “we need to put achievement badges into the call-center support application”? Or the online e-commerce portal? It’ll start either this year or next.
  • Functional languages will hit a make-or-break point. I know, I said it last year. But the buzz keeps growing, and when that happens, it usually means that it’s either going to reach a critical mass and explode, or it’s going to implode—and the longer the buzz grows, the faster it explodes or implodes, accordingly. My personal guess is that the “F/O hybrids”—F#, Scala, etc—will continue to grow until they explode, particularly since the suggested v.Next changes to both Java and C# have to be done as language changes, whereas futures for F# frequently are either built as libraries masquerading as syntax (such as asynchronous workflows, introduced in 2.0) or as back-end library hooks that anybody can plug in (such as type providers, introduced at PDC a few months ago), neither of which require any language revs—and no concerns about backwards compatibility with existing code. This makes the F/O hybrids vastly more flexible and stable. In fact, I suspect that within five years or so, we’ll start seeing a gradual shift away from pure O-O systems, into systems that use a lot more functional concepts—and that will propel the F/O languages into the center of the developer mindshare.
  • The Microsoft Kinect will lose its shine. I hate to say it, but I just don’t see where the excitement is coming from. Remember when the Wii remote-and-nunchuk combo was the most amazing thing anybody had ever seen? Frankly, after a slew of initial releases for the Wii that made use of them in interesting ways, the buzz dropped off, and more importantly, the controllers turned out to be just another way to move an arrow around on the screen—in other words, we haven’t found particularly novel and interesting/game-changing ways to use the things. That’s what I think will happen with the Kinect. Sure, it’s really freakin’ cool that you can use your body as the controller—but how precise is it, how quickly can it react to my body movements, and most of all, what new user interface metaphors are people going to have to come up with in order to avoid the “me-too” dancing-game clones that are charging down the pipeline right now?
  • There will be no clear victor in the Silverlight-vs-HTML5 war. And make no mistake about it, a war is brewing. Microsoft, I think, finds itself in the unenviable position of having two very clearly useful technologies, each one’s “sphere of utility” (meaning, the range of answers to the “where would I use it?” question) very clearly overlapping the other’s. It’s sort of like being a football team with both Brett Favre and Tom Brady on your roster—both of them are superstars, but you know, deep down, that you have to cut one, because you can’t devote the same degree of time and energy to both. Microsoft is going to take most of 2011 and probably part of 2012 trying to support both, making a mess of it, offering up conflicting rationale and reasoning, in the end achieving nothing but confusing developers and harming their relationship with the Microsoft developer community in the process. Personally, I think Microsoft has no choice but to get behind HTML 5, but I like a lot of the features of Silverlight and think that it has a lot of mojo that HTML 5 lacks, and would actually be in favor of Microsoft keeping both—so long as they make it very clear to the developer community when and where each should be used. In other words, the executives in charge of each should be locked into a room and not allowed out until they’ve hammered out a business strategy that is then printed and handed out to every developer within a 3-continent radius of Redmond. (Chances of this happening: .01%)
  • Apple starts feeling the pressure to deliver a developer experience that isn’t mired in mid-90’s metaphor. Don’t look now, Apple, but a lot of software developers are coming to your platform from Java and .NET, and they’re bringing with them their expectations of what a developer IDE should look like, how it should perform, and what it should do. Xcode is not a modern IDE, all the Apple fan-boy love for it notwithstanding, and this means that a few things will happen:
    • Eclipse gets an iOS plugin. Yes, I know, it wouldn’t work (for the most part) on a Windows-based Eclipse installation, but if Eclipse can have a native C/C++ developer experience, then there’s no reason why a Mac Eclipse install couldn’t have an Objective-C plugin, and that opens up the idea of using Eclipse to write iOS and/or native Mac apps (which will be critical when the Mac App Store debuts somewhere in 2011 or 2012).
    • Rumors will abound about Microsoft bringing Visual Studio to the Mac. Silverlight already runs on the Mac; why not bring the native development experience there? I’m not saying they’ll actually do it, and certainly not in 2011, but the rumors, they will be flyin….
    • Other third-party alternatives to Xcode will emerge and/or grow. MonoTouch is just one example. There’s opportunity here, just as the fledgling Java IDE market looked back in ‘96, and people will come to fill it.
  • NoSQL buzz grows. The NoSQL movement, which sort of got started last year, will reach significant states of buzz this year. NoSQL databases have a lot to offer, particularly in areas where relational databases are weak, such as hierarchical storage requirements. That buzz will reach a fever pitch this year, and the relational database moguls (Microsoft, Oracle, IBM) will start to fight back.
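As promised in the XMPP item above, here's the shape of the thing, sketched with LINQ to XML in C#. The addresses are invented, and a real client would have to open an XML stream and authenticate before sending anything; this just builds the stanza.

    using System;
    using System.Xml.Linq;

    class XmppStanzaSketch
    {
        static void Main()
        {
            // An XMPP chat message is a small XML "stanza" pushed over a
            // long-lived TCP connection; either side can send one at any time,
            // which is the two-way advantage over HTTP's request/response cycle.
            XNamespace ns = "jabber:client";
            var stanza = new XElement(ns + "message",
                new XAttribute("to", "alice@example.com"),
                new XAttribute("from", "bob@example.com/home"),
                new XAttribute("type", "chat"),
                new XElement(ns + "body", "Ping--you around?"));

            Console.WriteLine(stanza);
        }
    }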

I could probably go on making a few more, but I think these are enough to get me into trouble for the year.

To all of you who’ve been readers of this blog for the past year, I thank you—blog-gathered statistics tell me that I get, on average, about 7,000 hits a day, which just stuns me—and it is a New Year’s resolution that I blog more and give you even more reason to stick around. Happy New Year, and may your 2011 be just as peaceful, prosperous, and eventful as you want it to be.


.NET | Android | Azure | C# | C++ | Conferences | Development Processes | F# | Flash | Industry | iPhone | Java/J2EE | Languages | LLVM | Mac OS | Objective-C | Parrot | Python | Reading | Review | Ruby | Scala | Security | Social | Solaris | Visual Basic | VMWare | WCF | Windows | XML Services | XNA

Saturday, January 01, 2011 2:27:11 AM (Pacific Standard Time, UTC-08:00)
Comments [6]  | 
 Monday, December 13, 2010
Thoughts on my first Startup Weekend

Startup Weekend came to Redmond this weekend, and as I write this it is all of three hours over. In the spirit of capturing post-mortem thoughts as quickly as possible, I thought I’d blog my reactions and thoughts from it, both as a reference for myself for the next one, and as a guide/warning/data point for others considering doing it.

A few weeks ago, emails started crossing the Seattle Tech Startup mailing list about this thing called “Startup Weekend”. I didn’t do a whole lot of research around it—just glanced at the website, and asked a few questions of the organizer in an email. Specifically, I wanted to know that as a tech guy, with no specific startup ideas, I would find something to do. I was reassured immediately that, in fact, as a tech guy, I would be “heavily recruited” by others at the event who were business types.

First takeaway: I can’t speak for all the events, this being my first, but it was a surprise, nay, a shock, to me just how many “business” and/or “marketing” types were at this event. I seriously expected that tech folks would outnumber the non-tech folks by a substantial margin, but it was exactly the opposite, probably on the order of 2 to 1. As a developer, I was definitely being courted, rather than hunting for a team to find a way to make room for me. It was refreshing, exciting and a little overwhelming at the same time.

The format of the event is interesting: anybody can pitch an idea, then everybody in the room is free to “attach” themselves to that idea to form a team to implement it somehow, sort of a “Law of Two Feet” applied to team-building.

Second takeaway: Have a pretty clear idea of what you want to do here. The ideas initially all sound pretty good, and choosing between them can actually be quite painful and difficult. Have a clear goal for yourself for what you want out of the weekend—to socialize, to stretch yourself, to build a business, whatever. Mine were (1) just to be here and experience the event, (2) to socialize and network more deeply with the startup scene, (3) to hack on some code and try to ship something, and (4) to learn some new tech that I hadn’t had the chance to use beyond a “Hello World” demo before. There was always the chance I wouldn’t get any of those things, in which case I’d accept the consolation prize of simply watching how the event was structured and run, since it operates in many ways on the same basic concept that GiveCamp does, which is something I want to see done in Seattle sooner rather than later. So just going and watching the event as an uninvolved observer was worth the price of admission; once I’d walked through the door, I’d already met my #1 win condition.

I realized as I was choosing which team to join that I didn’t want to be paired alone with the project-pitching person (whoever that would be), since I had no idea how this event worked or what we were going for, so I deliberately turned away from a few projects that sounded interesting. I ended up as part of a team that was pretty well spread out in terms of skillsets/interests (Chris, developer and “original idea guy”; Libby, business development; Maizer, also business development; Mohammed, small businessman; and Aaron, graphic designer), working on an idea around “social bar gaming”. In other words, we had a nebulous, fuzzy idea about using games on a mobile device to help people in bars connect to other people in bars via some kind of “scavenger hunt” or similar social-engagement game. I had suggested that maybe one feature or idea would be to help groups of hard-drinking souls chart their path between bars (something like a Traveling Salesman’s Problem meets a pub crawl), and Chris thought that was definitely something to consider. We laid out a brief idea of what it was we wanted to build, then parted ways Friday night about midnight or so, except for Chris and myself, who headed out to Denny’s to mull over some technology ideas until about 3 AM.

Third takeaway: Hoard the nighttime hours on Friday, so you can spend them working on the app the rest of the weekend. Even though you’re full of energy Friday night, rarin’ to go, bank that energy—you’ll need to be well-rested for the marathon that is Saturday and Sunday.

Chris and I briefly discussed the technology approaches we could use, and settled in on using Azure for the backplane, mostly because I felt it would be the quickest way to get us provisioned on a server, and it was an excuse for me to play with Azure, something I haven’t had much of a chance to do beyond simple demos. We also thought that having some kind of Facebook integration would be a good idea, depending on what we actually wanted to do with the idea. I thought to myself, “OK, so this is going to be interesting for me—I’m going to be actually ‘stretching’ on three things simultaneously: Azure, Facebook, and whatever Web framework we use to build this”, since I haven’t done much Web work in .NET in many, many years, and don’t consider myself “up to speed” on either ASP.NET or ASP.NET MVC. Chris was a “front to middle tier” guy, though, so I figured I’d focus on the Azure back-end parts—storage, queueing, etc—and maybe the Facebook integration, and we’d be good.

By Saturday morning, thanks to a few other things I had to do after Chris left, I got there a bit late—about 10:30—fully expecting that the team had a pretty clear app vision laid out and ready for Chris and me to execute on. Alas, not quite—we were still sort of thrashing on what exactly we wanted to build—specifically, we kept bouncing back and forth between what the game would be and how it would be monetized. If we wanted to sell to bars as a way to get more bodies in the door, then we needed some kind of “check-in” game where people earned points for bringing friends to the bar. Or we could sell to bars by creating a game that was a kind of “scavenger hunt”, forcing patrons to discover things about the bar or about new drinks the bar sells, and so on. But we also wanted a game that was intrinsically social, forcing people’s eyes away from the screens and up towards the other patrons—otherwise why play the game?

Aaron, a two-time veteran of Startup Weekend, suggested that we finalize our vision by 11 AM so we could start hacking. By 11 AM, we had a vision… until about an hour later, when I realized that Libby, Chris, Maizer, and Mohammed were changing the game to suit new monetization ideas. We set another deadline for 2 PM, at which point we had a vision… until about an hour later, when I looked up and found them once again discussing what kind of game we wanted to build. In the end, it wasn’t until 7 or 8 PM Saturday that we finally nailed down some kind of game app idea—and then only because Aaron came out of his shell a little and politely yelled at the group for wasting all of our time.

Fourth takeaway: Know what’s clear and unclear about your vision/idea. I think we didn’t realize how nebulous our ideas were until we started trying to put game mechanics around them, and that was what led to all the thrashing on ideas.

Fifth takeaway: Put somebody in charge. Have a dictator in place. Yes, everybody wants to be polite and yes, choosing a leader can be a bit uncomfortable, but having that final unambiguous deciding vote—a leader—who can make decisions and isn’t afraid to do so would have saved us a lot of headache and gotten us much more quickly down the path. Libby said it best in our little post-mortem at the bar afterwards: Don’t you dare leave Friday night until everybody is 100% clear on what you’re building.

Meanwhile, on the technical front, another warm front was meeting another cold front and developing into a storm. When we’d decided to use Azure, I had suggested it over Google App Engine because Chris had said he’d done some development with it before, so I figured he was comfortable with it and ready to go. As we started pulling out laptops to start working, though, Chris mentioned that he needed to spin up a virtual machine with Windows 7, Visual Studio, and the Azure tools in it. No worries—I needed some time to read up on Azure provisioning, data storage, and so on.

Unfortunately, setting up the VM took until about 8 PM Saturday night, meaning we lost 11 of our 15 hours (9 AM to midnight) for that day.

Sixth takeaway: Have your tools ready to go before you get there. Find a hosting provider—come on, everybody is going to need a hosting provider, even if you build a mobile app—and have a virtual machine or laptop configured with every dev tool you can think of, ready to go. Getting stuff downloaded and installed is burning a very precious commodity that you don’t have nearly enough of: time.

Seventh takeaway: Be clear about your personal motivation/win conditions for the weekend. Yes, I wanted to explore a new tech, but Chris took that to mean that I wasn’t going to succeed if we abandoned Azure, and as a result, we burned close to 50% of our development cycles prepping a VM just so I could put Azure on my resume. I would’ve happily redacted that line on my resume in exchange for getting us up and running by 11 AM Saturday morning, particularly because it became clear to me that others in the group were running with win conditions of “spin up a legitimate business startup idea”, and I had already met most of my win conditions for the weekend by this point. I should’ve mentioned this much earlier, but didn’t realize what was happening until a chance comment Chris made in passing Saturday night when we left for the night.

Sunday I got in about noonish, owing to a long day, short night, and forgotten cell phone (my alarm clock) in the car. By the time I got there, tempers were starting to flare because we were clearly well behind the curve. Chris had been up all night working on HTML forms for the game, Aaron had been up all night creating some (amazing!) graphics for the game, I had been up a significant part of the night diving into Facebook APIs, and I think we all sensed that this was in real danger of falling apart. Unfortunately, we couldn’t even split the work between Chris and me, because we had (foolishly) not bothered to get some kind of source-control server going for the code so we could work in parallel.

See the sixth takeaway. It applies to source-control servers, too. And source-control clients, while we’re at it.

We were slotted to present our app and business idea first, as it turned out, which was my preference—I figured that if we went first, we might set a high bar that other groups would have a hard time matching. (That turned out to be a vain hope—the other groups’ work was amazing.) The group asked me to make the pitch, which was fine with me—when have I ever turned down the chance to work a crowd?

But our big concern was the demo—we’d originally called for a “feature freeze” at 4 PM, so we would have time to put the app on the server and test it, but by 4:15 Chris was still stitching pages together and putting images on pages. In fact, the push to the Azure server for v0.1 of our app happened at about 5:15, a full 30 seconds before I started the pitch.

The pitch itself was deliberately simple: we put Libby on a bar stool facing the crowd, Mohammed standing against a wall, and said, “Ever been in a bar, wanting to strike up a conversation with that cute girl at the far table? With Pubbn, we give you an excuse—a social scavenger hunt—to strike up a conversation with her, or earn some points, or win a discount from the bar, or more. We want to take the usual social networking premise—pushing socialization into the network—and instead flip it on its ear—using the network to make it easier to socialize.” It was a nice pitch, but I forgot to tell people to download the app and try it during the demo, which left some people thinking we never actually finished anything. ACK.

Pubbn, by the way, was our app name, derived (obviously) from “going pubbing”, as in, going out to drink and socialize. I loved this name. It’s up at http://www.pubbn.com, but I’ll warn you now, it’s a static mockup and a far cry from what we wanted to end up with—in fact, I’ll go out on a limb and say that of all the projects, ours was by far the weakest technical achievement, and I lay the blame for that at my feet. I should’ve stepped up and taken more firm control of the development so Chris could focus more on the overall picture.

The eventual winners for the weekend were “Doodle-A-Doodle”, a fantastic learn-to-draw app for kids on the iPad; “Hold It!”, a game about standing in line in the men’s room; and “CamBadge”, a brilliant little iPhone app for creating a conference badge on your phone, hanging your phone around your neck, and snapping a picture of the person standing in front of you with a single touch to the screen (assuming, of course, you have an iPhone 4 with its front-facing camera).

“CamBadge” was one of the apps I thought about working on, but I passed on it because it didn’t seem challenging enough technologically. Clearly that was a foolish choice from a business perspective, but this is why knowing your win conditions for the weekend is so important—I didn’t necessarily want to build a new business out of this weekend, and, to me, the more important “win” was to make a social connection with the people who looked like good folks to know in this space—and the “CamBadge” principal, Adam, clearly fit that bill. Drinking with him was far more important—to me—than building an app with him. Next Startup Weekend, my win conditions might be different, and if so, I’d make an entirely different decision.

In the end, Startup Weekend was a blast, and something I thoroughly recommend to every developer who’s ever thought of going independent. The cost is well, well worth the experience, and if you fail miserably, well, better to do that here, with so little invested, than to fail later in a “real” startup with millions attached.

By the way, Startup Weekend Redmond was/is #swred on Twitter, if you want to see the buzz that came out of it. Particularly good reading are the Tweets starting at about 5 PM tonight, because that’s when the presentations started.


.NET | Android | Azure | C# | Conferences | Development Processes | Industry | iPhone | Java/J2EE | Mac OS | Objective-C | Review | Ruby | VMWare | XML Services | XNA

Monday, December 13, 2010 1:53:24 AM (Pacific Standard Time, UTC-08:00)
Comments [0]  | 
 Friday, October 22, 2010
Doing it Twice… On Different Platforms

Short version: Matthew Baxter-Reynolds has written an intriguing book, Multimobile Development, about writing the same application over and over again on different mobile platforms. On the one hand, I applaud the effort, and think this is a great resource for developers who want to get started on mobile development, particularly since this book means you won’t have to choose a platform first. On the other hand, there’s a few things about the book I didn’t like, such as the fact that it misses the third platform in the room (Windows Phone 7) and that it could go out of date fairly quickly. On the other hand, there were a lot of things I did like about the book, like the use of OData and the sample app “in the cloud”. On the other hand…. wait, I ran out of hands. A while ago, in fact. Regardless, the book is worth picking up.

Long version: One of the interesting things about being me is that publishers periodically send me books to review, on the hopes that I’ll find it interesting and blog about it, and you, faithful blog readers that you are, will be so overwhelmed by my implicit endorsement that you’ll immediately drop what you’re doing and run (don’t walk!) to the nearest bookstore or Web browser and engage in that capitalist activity known as “impulsive consumption”. Now, I don’t know if that latter part of the equation actually takes shape, but I do like to get books, so….

(What publishers don’t like about me is when they send me a book and I don’t write a review about it. Usually that’s because I didn’t like it, didn’t think it covered the material well, or my cat is sitting on the laptop keyboard and I’m too lazy to move him off of it. Or I’m just too busy to blog about it. Or any of dozens of other reasons that have nothing to do with the book. Sometimes I’m just too busy eating pie and don’t want to get crumbs on the keyboard. Mmmm, pie. Wait. Where was I? Ah, right. Sorry.)

As many of you who’ve seen me present over the last couple of years know, I’ve been getting steadily deeper and deeper into the mobile space, predominantly aimed at three platforms: iOS, Android and Windows Phone 7. I own an iPhone 3GS that I use for day-to-day stuff, an iPhone 3G (a recycled hand-me-down from when a family member bought an iPhone 4) that I feel free to abuse since it’s not my “business line phone”, an iPod Touch that I feel free to abuse for the same reason, an iPad WiFi that I just bought a few weeks ago that I’ll eventually feel like I can abuse, a Motorola Droid that my friends refer to as my “skank phone” (it has a live phone # associated with it), a Palm Pre that I rarely touch anymore, and a few other devices even older than that lying around. And yes, I will be buying a Windows Phone 7 device when it comes out here in the US, and I probably will replace my Droid with a Droid X or Samsung Galaxy before long, and get an Android tablet/slate/whatever when they start to come out (I’m guessing around Christmas).

Yeah, OK, so it’s probably an addiction by this point. I’m fine. I can stop whenever I want. Really.

All of that is by way of establishing that I’m very interested in writing software for the mobile device market, and I’ve got a few ideas (games, utilities, whatever) that I tinker with when I have the chance, and I always have this little voice in the back of my head whispering that “It’s such a pain that I have to write different client apps for each one of the mobile devices—wouldn’t it be cool if I could reuse code across the different platforms….?”

(Honesty compels me to say that’s totally not true—the little voice part, I mean. Well, no, I mean, I do hear voices, but they don’t say anything about reusing code. I write these little knick-knacks because I want to learn about writing code for all the different platforms. But I can imagine somebody asking me that question at a conference, so I pretend. And the voices? Well, they tell me lots of things, but I only listen to some of them and then only some of the time. Usually when the body is easily disposable.)

Baxter-Reynolds’ book, then, really caught my eye—if there’s a way to do development across these different platforms in any sort of capturable way, then hell, yes, I want to have a look.

Except…. That’s not really what this book is about. Well, sort of.

Put bluntly, Multimobile Development is about writing two client apps for a “cloud”-based service, “Six Bookmarks”. (It’s an app that lets you jump to a URL from one of the six buttons exposed in the app—in other words, a fixed number of URL bookmarks. The point is not the usefulness of the service, it’s to have a relatively useful backplane against which to develop the mobile apps, and this one is “just right” in terms of complexity for mobile app clients.) He then writes the same client app twice, once for Android and then once for iPhone, quite literally as duplicates of one another. The chapters even line up one-for-one: Chapters 4 and 8 are about “Installing the Toolset”, the first for Android and the second for iOS; Chapters 5 and 9 are both “Building the Logon Form and Consuming REST Services”; Chapters 6 and 10 are “An ORM Layer Over SQLite”; and so on. It’s not about trying to reuse code between the two clients, but he does do a great job of demonstrating how the server is written to support both (using, not surprisingly, OData as the “wire protocol” for data between the two), thus facilitating a small amount of effective reuse by doing so.
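To make the "OData as the wire protocol" bit concrete, here's roughly what one of those client calls looks like from C#. This is my own sketch against an invented endpoint, not code from the book; the service URL and entity set name are made up.

    using System;
    using System.IO;
    using System.Net;

    class ODataClientSketch
    {
        static void Main()
        {
            // OData exposes entity sets as plain URLs; $orderby and $top are
            // standard query options, so any HTTP-capable client can play.
            string url = "http://example.com/SixBookmarks.svc/Bookmarks?$orderby=Ordinal&$top=6";

            var request = (HttpWebRequest)WebRequest.Create(url);
            request.Accept = "application/atom+xml"; // OData v2 answers in AtomPub by default

            using (var response = request.GetResponse())
            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                // One Atom <entry> per bookmark comes back; a real client would
                // parse this with LINQ to XML or a generated service proxy.
                Console.WriteLine(reader.ReadToEnd());
            }
        }
    }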

The prose is pretty clear, although he does, from time to time, resort to the use of exclamation points, which is a pet peeve of mine in technical writing; to me, it just doesn’t read well, almost like the faked enthusiasm you see from a late-night product-pitch man. (“The Sham-Wow! It’s great! You’ll love it! Somebody, please stop me from yelling like this!”) But it’s rare enough that I can blow past it, and I generally write it off as just an aesthetic difference between me and the author. Beyond that, he does a good job of laying down clear explanations, objectives, and rationale.

A couple of concerns I have about this book, both of which can be corrected in a future edition, stand out as “must be mentioned”. First, this space is clearly a moving target, something Baxter-Reynolds highlights in the very first two pages of the book itself. He chooses to use the Xcode 4 Developer Preview for the iOS code, which obviously won’t be the latest bits by the time you get your hands on the book—he admits as much in the prose, but relies on the idea that the production/shipping version of Xcode 4 won’t be that different from the beta (which may or may not be a viable assumption to make).

The other concern is a bit more far-reaching: I kinda wish the book had Windows Phone 7 in here, too.

I mean, if he’s OK with using the developer preview of Xcode for the iOS parts, it would seem reasonable to do the same for the WP7 Developer Tools, which have been out in a relatively stable form for quite a few months. Granted, he (probably) wouldn’t have been able to test his software on an actual device, since they appear to be rarer than software architects who still write code, but I don’t know that this would’ve changed his point whatsoever, either. Still, if he’s working on a second edition with WP7 as an additional client platform and another five or so chapters for comparison, it’d be a near-flawless keeper, at least for the next two or three years.

(Granted, he does do the .NET world a little justice by including a final chapter on MonoTouch, but that feels a little “thrown in” at the end, almost as if he felt the need to make up for the missing WP7 coverage by reminding the .NET developers: “Don’t worry, guys, someday, real soon now, you’ll be able to write mobile apps, too! And then clients will love you! And women will flock to you at cocktail parties! Somebody, please stop me from yelling like this!”.)

Overall, it’s a good book, and I like the fact that somebody’s taking on the obvious topic of “Multi-Mobile” development without falling into the “one source base, multiple platforms” trap. (I groan at the thought of how many developers are now going to go off and try to write cross-platform mobile frameworks, by the way. I don’t think it’ll be pretty.) It’s not a complete reference to all things iOS or Android—you probably want a good reference to go with each platform you write to—but as a “getting started with mobile development”, this is actually a great resource to have for both of the major platforms.


.NET | Android | C# | iPhone | Java/J2EE | Languages | Objective-C | Review | Visual Basic

Friday, October 22, 2010 9:39:36 PM (Pacific Daylight Time, UTC-07:00)
Comments [2]  | 
 Wednesday, September 08, 2010
VMWare help

Hey, anybody who’s got significant VMWare mojo, help out a bro?

I’ve got a Win7 VM (one of many) that appears to be exhibiting weird disk behavior—the vmdk, a growable single-file VMDK, is almost precisely twice the used space. It’s a 120GB growable disk, and the Win7 guest reports about 35GB used, but the VMDK takes about 70GB on the host disk. CHKDSK inside Windows says everything’s good, and the VMWare “Disk Cleanup” doesn’t change anything, either. It doesn’t seem to be a Windows 7 thing, because I’ve got a half-dozen other Win7 VMs that operate… well, normally (by which I mean, 30GB used in the VMDK means 30GB used on disk). It’s a VMWare Fusion host, if that makes any difference. Any other details that might be relevant, let me know and I’ll post.

Anybody got any ideas what the heck is going on inside this disk?


.NET | Android | C# | C++ | Conferences | Development Processes | F# | Flash | Industry | iPhone | Java/J2EE | Languages | LLVM | Mac OS | Objective-C | Parrot | Python | Reading | Review | Ruby | Scala | Security | Social | Solaris | Visual Basic | VMWare | WCF | Windows | XML Services | XNA

Wednesday, September 08, 2010 8:53:01 PM (Pacific Daylight Time, UTC-07:00)
Comments [5]  | 
 Wednesday, March 24, 2010
Another Gartner prediction...

Let's see if this one holds: Gartner says that by 2012, Android will have a larger percentage of the worldwide mobile phone market than the iPhone, 14.5% against 13.7%.

Reasons to doubt this particular bit of prescience? Gartner also predicts that "Windows Mobile" will have "12.8 percent" of the market. This despite the fact that at MIX last week, Microsoft basically canned Windows Mobile in favor of a complete reboot called "Windows Phone 7 Series", based on ideas from Silverlight and XNA.

Huh.


.NET | Android | C# | Industry | iPhone | Java/J2EE | Languages | Reading | Review | Windows | XNA

Wednesday, March 24, 2010 12:15:23 AM (Pacific Daylight Time, UTC-07:00)
Comments [1]  | 
 Thursday, January 14, 2010
2010 TechEd PreCon: Multiparadigmatic C#

I'm excited to say that TechEd has accepted my pre-conference proposal, Multiparadigmatic C#, where the abstract reads:

C# has grown from “just” an object-oriented language into a language that is capable of expressing several different paradigms of software development: object-oriented, functional, and dynamic. In this session, developers will learn how to approach programming in C# to use each of these approaches, and when.
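To give a tiny taste of what "multiparadigmatic" means in practice, here's the same trivial task approached three ways; this is my own quick sketch, not material from the session itself.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    class Multiparadigm
    {
        static void Main()
        {
            var words = new List<string> { "tech", "ed", "precon" };

            // Object-oriented/imperative: explicit state, explicit loop.
            int ooTotal = 0;
            foreach (string w in words)
                ooTotal += w.Length;

            // Functional: compose the answer from lambdas; nothing mutates.
            int fnTotal = words.Select(w => w.Length).Sum();

            // Dynamic (new in C# 4.0): member lookup deferred until runtime.
            dynamic something = "precon";
            int dynLength = something.Length;

            Console.WriteLine("{0} {1} {2}", ooTotal, fnTotal, dynLength);
        }
    }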

If you're interested in seeing C# used in a variety of different ways, come on out.

And if you're not going to TechEd.... why not? It's in New Orleans, folks!


.NET | C# | C++ | Conferences | F# | Industry | Languages | Python | Reading | Review | Ruby | Visual Basic | WCF | Windows | XML Services

Thursday, January 14, 2010 11:49:53 PM (Pacific Standard Time, UTC-08:00)
Comments [1]  | 
 Tuesday, January 05, 2010
2010 Predictions, 2009 Predictions Revisited

Here we go again—another year, another set of predictions revisited and offered up for the next 12 months. And maybe, if I'm feeling really ambitious, I'll take that shot I thought about last year and try predicting for the decade. Without further ado, I'll go back and revisit, unedited, my predictions for 2009 ("THEN"), and pontificate on those subjects for 2010 before adding any new material/topics. Just for convenience, here's a link back to last year's predictions.

Last year's predictions went something like this (complete with basketball-scoring):

  • THEN: "Cloud" will become the next "ESB" or "SOA", in that it will be something that everybody will talk about, but few will understand and even fewer will do anything with. (Considering the widespread disparity in the definition of the term, this seems like a no-brainer.) NOW: Oh, yeah. Straight up. I get two points for this one. Does anyone have a working definition of "cloud" that applies to all of the major vendors' implementations? Ted, 2; Wrongness, 0.
  • THEN: Interest in Scala will continue to rise, as will the number of detractors who point out that Scala is too hard to learn. NOW: Two points for this one, too. Not a hard one, mind you, but one of those "pass-and-shoot" jumpers from twelve feet out. James Strachan even tweeted about this earlier today, pointing out this comparison. As more Java developers who think of themselves as smart people try to pick up Scala and fail, the numbers of sour grapes responses like "Scala's too complex, and who needs that functional stuff anyway?" will continue to rise in 2010. Ted, 4; Wrongness, 0.
  • THEN: Interest in F# will continue to rise, as will the number of detractors who point out that F# is too hard to learn. (Hey, the two really are cousins, and the fortunes of one will serve as a pretty good indication of the fortunes of the other, and both really seem to be on the same arc right now.) NOW: Interestingly enough, I haven't heard as many F# detractors as Scala detractors, possibly because I think F# hasn't really reached the masses of .NET developers the way that Scala has managed to find its way in front of Java developers. I think that'll change mighty quickly in 2010, though, once VS 2010 hits the streets. Ted, 4; Wrongness 2.
  • THEN: Interest in all kinds of functional languages will continue to rise, and more than one person will take a hint from Bob "crazybob" Lee and liken functional programming to AOP, for good and for ill. People who took classes on Haskell in college will find themselves reaching for their old college textbooks again. NOW: Yep, I'm claiming two points on this one, if only because a bunch of Haskell books shipped this year, and they'll be the last to do so for about five years after this. (By the way, does anybody still remember aspects?) But I'm going the opposite way with this one now; yes, there's Haskell, and yes, there's Erlang, and yes, there's a lot of other functional languages out there, but who cares? They're hard to learn, they don't always translate well to other languages, and developers want languages that work on the platform they use on a daily basis, and that means F# and Scala or Clojure, or it's simply not an option. Ted 6; Wrongness 2.
  • THEN: The iPhone is going to be hailed as "the enterprise development platform of the future", and companies will be rolling out apps to it. Look for Quicken iPhone edition, PowerPoint and/or Keynote iPhone edition, along with connectors to hook the iPhone up to a presentation device, and (I'll bet) a World of Warcraft iPhone client (legit or otherwise). iPhone is the new hotness in the mobile space, and people will flock to it madly. NOW: Two more points, but let's be honest—this was a fast-break layup, no work required on my part. Ted 8; Wrongness 2.
  • THEN: Another Oslo CTP will come out, and it will bear only a superficial resemblance to the one that came out in October at PDC. Betting on Oslo right now is a fool's bet, not because of any inherent weakness in the technology, but just because it's way too early in the cycle to be thinking about for anything vaguely resembling production code. NOW: If you've worked at all with Oslo, you might argue with me, but I'm still taking my two points. The two CTPs were pretty different in a number of ways. Ted 10; Wrongness 2.
  • THEN: The IronPython and IronRuby teams will find some serious versioning issues as they try to manage the DLR versioning story between themselves and the CLR as a whole. An initial hack will result, which will be codified into a standard practice when .NET 4.0 ships. Then the next release of IPy or IRb will have to try and slip around its restrictions in 2010/2011. By 2012, IPy and IRb will have to be shipping as part of Visual Studio just to put the releases back into lockstep with one another (and the rest of the .NET universe). NOW: Pressure is still building. Let's see what happens by the time VS 2010 ships, and then see what the IPy/IRb teams start to do to adjust to the versioning issues that arise. Ted 8; Wrongness 2.
  • THEN: The death of JSR-277 will spark an uprising among the two leading groups hoping to foist it off on the Java community--OSGi and Maven--while the rest of the Java world will breathe a huge sigh of relief and look to see what "modularity" means in Java 7. Some of the alpha geeks in Java will start using--if not building--JDK 7 builds just to get a heads-up on its impact, and be quietly surprised and, I dare say, perhaps even pleased. NOW: Ah, Ted, you really should never underestimate the community's willingness to take a bad idea, strip all the goodness out of it, and then cycle it back into the mix as something completely different yet somehow just as dangerous and crazy. I give you Project Jigsaw. Ted 10; Wrongness 2.
  • THEN: The invokedynamic JSR will leapfrog in importance to the top of the list. NOW: The invokedynamic JSR begat interest in other languages on the JVM. The interest in other languages on the JVM begat the need to start thinking about how to support them in the Java libraries. The need to start thinking about supporting those languages begat a "Holy sh*t moment" somewhere inside Sun and led them to (re-)propose closures for JDK 7. And in local sports news, Ted notched up two more points on the scoreboard. Ted 12; Wrongness 2.
  • THEN: Another Windows 7 CTP will come out, and it will spawn huge media interest that will eventually be remembered as Microsoft promises, that will eventually be remembered as Microsoft guarantees, that will eventually be remembered as Microsoft FUD and "promising much, delivering little". Microsoft ain't always at fault for the inflated expectations people have--sometimes, yes, perhaps even a lot of times, but not always. NOW: And then, just when the game started to turn into a runaway, airballs started to fly. The Windows7 release shipped, and contrary to what I expected, the general response to it was pretty warm. Yes, there were a few issues that emerged, but overall the media liked it, the masses liked it, and Microsoft seemed to have dodged a bullet. Ted 12; Wrongness 5.
  • THEN: Apple will begin to legally threaten the clone market again, except this time somebody's going to get the DOJ involved. (Yes, this is the iPhone/iTunes prediction from last year, carrying over. I still expect this to happen.) NOW: What clones? The only people trying to clone Macs are those who are building Hackintosh machines, and Apple can't sue them so long as they're using licensed copies of Mac OS X (as far as I know). Which has never stopped them from trying, mind you, and I still think Steve has some part of his brain whispering to him at night, calculating all the hardware sales lost to Hackintosh netbooks out there. But in any event, that's another shot missed. Ted 12; Wrongness 7.
  • THEN: Alpha-geek developers will start creating their own languages (even if they're obscure or bizarre ones like Shakespeare or Ook#) just to have that listed on their resume as the DSL/custom language buzz continues to build. NOW: I give you Ioke. If I'd extended this to include outdated CPU interpreters, I'd have made that three-pointer from half-court instead of just the top of the key. Ted 14; Wrongness 7.
  • THEN: Roy Fielding will officially disown most of the "REST"ful authors and software packages available. Nobody will care--or worse, somebody looking to make a name for themselves will proclaim that Roy "doesn't really understand REST". And they'll be right--Roy doesn't understand what they consider to be REST, and the fact that he created the term will be of no importance anymore. Being "REST"ful will equate to "I did it myself!", complete with expectations of a gold star and a lollipop. NOW: Does anybody in the REST community care what Roy Fielding wrote way back when? I keep seeing "REST"ful systems that seem to have designers who've never heard of Roy, or his thesis. Roy hasn't officially disowned them, but damn if he doesn't seem close to it. Still.... No points. Ted 14; Wrongness 9.
  • THEN: The Parrot guys will make at least one more minor point release. Nobody will notice or care, except for a few doggedly stubborn Perl hackers. They will find themselves having nightmares of previous lives carrying around OS/2 books and Amiga paraphernalia. Perl 6 will celebrate its seventh... or is it eighth?... anniversary of being announced, and nobody will notice. NOW: Does anybody still follow Perl 6 development? Has the spec even been written yet? Google on "Perl 6 release", and you get varying reports: "It'll ship 'when it's ready'", "There are no such dates because this isn't a commercially-backed effort", and "Spring 2010". Swish—nothin' but net. Ted 16; Wrongness 9.
  • THEN: The debate around "Scrum Certification" will rise to a fever pitch as short-sighted money-tight companies start looking for reasons to cut costs and either buy into agile at a superficial level and watch it fail, or start looking to cut the agilists from their company in order to replace them with cheaper labor. NOW: Agile has become another adjective meaning "best practices", and as such, has essentially lost its meaning. Just ask Scott Bellware. Ted 18; Wrongness 9.
  • THEN: Adobe will continue to make Flex and AIR look more like C# and the CLR even as Microsoft tries to make Silverlight look more like Flash and AIR. Web designers will now get to experience the same fun that back-end web developers have enjoyed for near-on a decade, as shops begin to artificially partition themselves up as either "Flash" shops or "Silverlight" shops. NOW: Not sure how to score this one—I haven't seen the explicit partitioning happen yet, but the two environments definitely still seem to be looking to start tromping on each other's turf, particularly when we look at the rapid releases coming from the Silverlight team. Ted 16; Wrongness 11.
  • THEN: Gartner will still come knocking, looking to hire me for outrageous sums of money to do nothing but blog and wax prophetic. NOW: Still no job offers. Damn. Ah, well. Ted 16; Wrongness 13.

A close game. Could've gone either way. *shrug* Ah, well. It was silly to try and score it with a basketball metaphor, anyway—that's the last time I watch ESPN before writing this.

For 2010, I predict....

  • ... I will offer 3- and 4-day training classes on F# and Scala, among other things. OK, that's not fair—yes, I have the materials, I just need to work out locations and times. Contact me if you're interested in a private class, by the way.
  • ... I will publish two books, one on F# and one on Scala. OK, OK, another plug. Or, rather, more of a resolution. One will be the "Professional F#" I'm doing for Wiley/Wrox, the other isn't yet finalized. But it'll either be published through a publisher, or self-published, by JavaOne 2010.
  • ... DSLs will either "succeed" this year, or begin the short slide into the dustbin of obscure programming ideas. Domain-specific language advocates have to put up some kind of strawman for developers to learn from and poke at, or the whole concept will just fade away. Martin's book will help, if it ships this year, but even that might not be enough to generate interest if it doesn't have some kind of large-scale applicability in it. Patterns and refactoring and enterprise containers all had a huge advantage in that developers could see pretty easily what the problem was they solved; DSLs haven't made that clear yet.
  • ... functional languages will start to see a backlash. I hate to say it, but "getting" the functional mindset is hard, and there are precious few resources making it easy for mainstream (read: O-O) developers to make that adjustment, far fewer than there were during the procedural-to-object shift. If the functional community doesn't want to become mainstream, then mainstream developers will take functional's most compelling gateway use-case (parallel/concurrent programming) and find a way to "git 'er done" in the traditional O-O approach, probably through software transactional memory, and functional languages like Haskell and Erlang will be relegated to the "What Might Have Been" of computer science history. Not sure what I mean? Try this: walk into a functional language forum, and ask what a monad is. Nobody yet has been able to produce an answer that doesn't involve math theory, or that does involve a practical domain-object-based example (I'll take my own stab at one in the sketch just after this list). In fact, nobody has really said why (or if) monads are even still useful. Or catamorphisms. Or any of the other dime-store words that the functional community likes to toss around.
  • ... Visual Studio 2010 will ship on time, and be one of the buggiest and/or slowest releases in its history. I hate to make this prediction, because I really don't want to be right, but there's just so much happening in the Visual Studio refactoring effort that it makes me incredibly nervous. Widespread adoption of VS2010 will wait until SP1 at the earliest. In fact....
  • ... Visual Studio 2010 SP 1 will ship within three months of the final product. Microsoft knows that people wait until SP 1 to think about upgrading, so they'll just plan for an eager SP 1 release, and hope that managers will be too hung over from the New Year (still) to notice that the necessary shakeout time hasn't happened.
  • ... Apple will ship a tablet with multi-touch on it, and it will flop horribly. Not sure why I think this, but I just don't think the multi-touch paradigm that Apple has cooked up for the iPhone will carry over to a tablet/laptop device. That won't stop them from shipping it, and it won't stop Apple fan-boiz from buying it, but that's about where the interest will end.
  • ... JDK 7 closures will be debated for a few weeks, then become a fait accompli as the Java community shrugs its collective shoulders. Frankly, I think the Java community has exhausted its interest in debating new language features for Java. Recent college grads and open-source groups with an axe to grind will continue to try and make an issue out of this, but I think the overall Java community just... doesn't... care. They just want to see JDK 7 ship someday.
  • ... Scala either "pops" in 2010, or begins to fall apart. By "pops", I mean reaches a critical mass of developers interested in using it, enough to convince somebody to create a company around it, a la G2One.
  • ... Oracle is going to make a serious "cloud" play, probably by offering an Oracle-hosted version of Azure or AppEngine. Oracle loves the enterprise space too much, and derives too much money from it, to not at least appear to have some kind of offering here. Now that they own Java, they'll marry it up against OpenSolaris, the Oracle database, and throw the whole thing into a series of server centers all over the continent, and call it "Oracle 12c" (c for Cloud, of course) or something.
  • ... Spring development will slow to a crawl and start to take a left turn toward cloud ideas. VMWare bought SpringSource for a reason, and I believe it's entirely centered around VMWare's movement into the cloud space—they want to be more than "just" a virtualization tool. Spring + Groovy makes a compelling development stack, particularly if VMWare does some interesting hooks-n-hacks to make Spring a virtualization environment in its own right somehow. But from a practical perspective, any community-driven development against Spring is all but basically dead. The source may be downloadable later, like the VMWare Player code is, but making contributions back? Fuhgeddabowdit.
  • ... the explosion of e-book readers brings the Kindle 2009 edition way down to size. The era of the e-book reader is here, and honestly, while I'm glad I have a Kindle, I'm expecting that it'll be gathering dust on a shelf in a few years. Kinda like my iPods from a few years ago.
  • ... "social networking" becomes the "Web 2.0" of 2010. In other words, using the term will basically identify you as a tech wannabe and clearly out of touch with the bleeding edge.
  • ... Facebook becomes a developer platform requirement. I don't pretend to know anything about Facebook—I'm not even on it, which amazes my family to no end—but clearly Facebook is one of those mechanisms by which people reach each other, and before long, it'll start showing up as a developer requirement for companies looking to hire. If you're looking to build out your resume to make yourself attractive to companies in 2010, mad Facebook skillz might not be a bad investment.
  • ... Nintendo releases an open SDK for building games for its next-gen DS-based device. With the spectacular success of games on the iPhone, Nintendo clearly must see that they're missing a huge opportunity every day developers can't write games for the Nintendo DS that are easily downloadable to the device for playing. Nintendo is not stupid—if they don't open up the SDK and promote "casual" games like those on the iPhone and those that can now be downloaded to the Zune or the XBox, they risk being marginalized out of existence.
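
Since my complaint above is that nobody ever shows a monad with a practical, domain-object-based example, let me at least take my own stab at one. This is a minimal sketch, written in plain Python for neutrality's sake, with hypothetical names throughout; it is not code from any particular functional language or library. The entire "monad" idea here boils down to the bind() rule, which decides whether the next step in a chain of maybe-failing lookups runs at all:

    # A bare-bones "Maybe" container; my illustration, not a real library.
    class Maybe:
        def __init__(self, value):
            self.value = value

        def bind(self, fn):
            # The monadic rule: if an earlier step already failed (None),
            # skip fn entirely; otherwise hand the raw value to fn.
            return self if self.value is None else fn(self.value)

    # Hypothetical domain data, purely for the sake of the example.
    users = {"ted": {"address": {"zip": "98101"}}}

    def find_user(name):
        return Maybe(users.get(name))

    def find_address(user):
        return Maybe(user.get("address"))

    def find_zip(address):
        return Maybe(address.get("zip"))

    # Three maybe-failing steps chain flatly, with no nested None checks:
    print(find_user("ted").bind(find_address).bind(find_zip).value)   # 98101
    print(find_user("bob").bind(find_address).bind(find_zip).value)   # None

Whether that settles the "why are monads still useful" question is another matter, but at least there's no category theory in it.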

And for the next decade, I predict....

  • ... colleges and universities will begin issuing e-book reader devices to students. It's a helluva lot cheaper than issuing laptops or netbooks, and besides....
  • ... netbooks and e-book readers will merge before the decade is out. Let's be honest—if the e-book reader could do email and browse the web, you'd have almost the perfect paperback-sized mobile device. As for the credit-card sized mobile device....
  • ... mobile phones will all but disappear as they turn into what PDAs tried to be. "The iPhone makes calls? Really? You mean Voice-over-IP, right? No, wait, over cell signal? It can do that? Wow, there's really an app for everything, isn't there?"
  • ... wireless formats will skyrocket in importance all around the office and home. Combine the iPhone's Bluetooth (or something similar yet lower-power-consuming) with an equally-capable (Bluetooth or otherwise) projector, and suddenly many executives can leave their netbook or laptop at home for a business presentation. Throw in the Whispersync-aware e-book reader/netbook-thing, and now most executives have absolutely zero reason to carry anything but their e-book/netbook and their phone/PDA. The day somebody figures out an easy way to combine Bluetooth with PayPal on the iPhone or Android phone, we will have more or less made pocket change irrelevant. And believe me, that day will happen before the end of the decade.
  • ... either Android or Windows Mobile will gain some serious market share against the iPhone the day they figure out how to support an open and unrestricted AppStore-like app acquisition model. Let's be honest, the attraction of iTunes and AppStore is that I can see an "Oh, cool!" app on a buddy's iPhone, and have it on mine less than 30 seconds later. If Android or WinMo can figure out how to offer that same kind of experience without the draconian AppStore policies to go with it, they'll start making up lost ground on iPhone in a hurry.
  • ... Apple becomes the DOJ target of the decade. Microsoft was it in the 2000's, and Apple's stunning rise is going to put it squarely in the sights of monopolist accusations before long. Coupled with the unfortunate health distractions that Steve Jobs has to deal with, Apple's going to get hammered pretty hard by the end of the decade, but it will have amassed enough market share and mindshare to weather it, as Microsoft has.
  • ... Google becomes the next Microsoft. It won't be anything the founders do, but Google will do "something evil", and it will be loudly and screechingly pointed out by all of Google's corporate opponents, and the star will have fallen.
  • ... Microsoft finds its way again. Microsoft, as a company, has lost its way. This is a company that's not used to losing, and like Bill Belichick's Patriots, they will adapt and adjust to the changed circumstances of their position and find a way to win again. What that'll be, I have no idea, but the last decade notwithstanding, betting against Microsoft has historically been a bad idea. My gut tells me they'll figure out something new to get that mojo back.
  • ... a politician will make himself or herself famous by standing up to the TSA. The scene will play out like this: during a Congressional hearing on airline security, after some nut/terrorist tries to blow up another plane through nitroglycerine-soaked underwear, the TSA director will suggest all passengers should fly naked in order to preserve safety, the congressman/woman will stare open-mouthed at this suggestion, proclaim, "Have you no sense of decency, sir?" and immediately get a standing ovation and never have to worry about re-election again. Folks, if we want to prevent any chance of loss of life from a terrorist act on an airplane, we have to prevent passengers from getting on them. Otherwise, just accept that it might happen, do a reasonable job of preventing it from happening, and let private insurance start offering flight insurance against the possibility to reassure the paranoid.

See you all next year.


.NET | C# | C++ | Conferences | Development Processes | F# | Flash | Industry | Java/J2EE | Languages | LLVM | Mac OS | Parrot | Python | Reading | Review | Ruby | Scala | Security | Social | Solaris | Visual Basic | VMWare | WCF | Windows | XML Services

Tuesday, January 05, 2010 1:45:59 AM (Pacific Standard Time, UTC-08:00)
 Sunday, November 29, 2009
Thoughts from the (Java)Edge 2009

These are the things I think as I sit here in my resort hotel on the edge of the Dead Sea in Israel after the JavaEdge 2009 conference on Thursday:

  • The JavaEdge hosts (Alpha CSP) are, without a doubt, the most gracious hosts I think I've ever had at a conference. And considering the wonderful treatment I've had at the hands of the 4Developers and JDD hosts in Krakow (Proidea) and the SDN hosts in Amsterdam, this is saying a lot. But the Alpha CSP folks have simply floored me, top to bottom, with their generosity and warmth.
  • The JavaEdge crowd is a great one. I wasn't quite sure what to expect, because in the US we don't hear much about the tech going on in Israel, so I was a bit concerned that (a) my English was going to be difficult to grasp or that (b) my humor was going to sail over their heads due to the language barrier, or worse, (c), the developers at the conference wouldn't be ready to hear the keynote message ("Why the Next Five Years Will Be About Languages"). I shouldn't have been concerned on any of those points—this crowd understood me perfectly, laughed at most of my jokes (hey, not even my family gets all of them), and more importantly, not only accepted the thrust of the message but also came up to me afterwards and either sought clarification, challenged one or more points, or simply said they enjoyed the keynote. It was as engaged and enthusiastic a crowd as just about any I've had.
  • Fan(tom) is something worth looking into. Some of the speakers at the conference were talking with me about Fan (recently renamed to Fantom, to make it easier to Google/Bing), and I've realized that Fan's too interesting a language for the amount of press that it gets. I think this is something I'm going to pursue in the coming calendar year, maybe put together some presentations and/or workshops on it.
  • Israel is ready for Groovy, Scala, and closures in Java. These folks were chomping at the bit at the thought of using one or all of these, at least based on the comments and questions I got after the keynote.
  • Swimming in the Dead Sea is a truly bizarre experience. To be honest, one doesn't really "swim" in the Dead Sea—one just rests on top of the water, because the salt content in the water is so high that it is (quite literally) impossible to go under the water. It's like lounging on an inflatable raft in the water, except without the raft. It borders on the creepy. Still, my skin is much softer now than it was before. ;-)
  • Jerusalem is a fascinating city. Alpha CSP set me up with a tour guide (Ido Notman), and we toured Jerusalem yesterday: all four quarters of the Old City (the Christian quarter, the Jewish quarter, the Moslem quarter and the Armenian quarter), the "Tomb" of King David, the Holy Sepulchre (where Christ was supposedly crucified and buried), the Western Wall, and then back to Tel Aviv for the night. Throughout the entire day, Ido kept up a running commentary about the history of the city and the three religions that are centered there (Christianity, Judaism and Islam) and the stories/legends that each holds about the city's place in their religious beliefs. I came away just flat overwhelmed, and, once we got back, flat on my back—we walked for most of the day, and Jerusalem is not a flat city like you might expect—it's nestled in some serious mountains, which makes it a bit rough on the calves. But it was well worth it, because there's nothing like standing and looking at pillars right in front of you—excavated from beneath a high-rise apartment building, just there for anybody to stroll up to and see and touch and take photos with—that were built back when Rome meant the center of civilization. Wow.
  • The Palestinian-Israeli and Arab-Israeli conflict(s) are a lot more "real" when you're in the middle of it (geographically). Seeing armed Israeli guards, driving through security checkpoints, even just driving past the wall that Israel is building to keep a physical barrier between them and Hamas/Hezbollah is all a vivid reminder that the nine-o'clock news is more than just something that's happening "over there" when you're "over there" too. The highway we took (the road from Jerusalem to Jericho, the same one mentioned in the parable of the Good Samaritan—and, yes, we passed the Inn of the Good Samaritan on the way here, which was just a little creepy and exciting and weird all at the same time) drove right alongside that wall for a stretch of about five or so kilometers, and I couldn't help but wonder if somebody in one of those apartment buildings over there, who had a clear line of sight to our car zipping by on the freeway, was looking at us through the scope of a sniper rifle. It's a creepy feeling, and even worse knowing that there may well have been an Israeli sniper looking back across the wall as well, into somebody's apartment. I won't weigh in on one side or the other here, because that's not my point; my point is that we in the US take our physical security way too much for granted, compared to some other parts of the world where it's not such a given.
  • And no, in case you were wondering, I was never concerned for my safety. Yes, it's something I thought about. But you have a better chance of dying on a New York street corner from a runaway ice cream truck than you do from a rocket attack or a terrorist suicide bomb (or something like that). I'd come back in a heartbeat.
  • Israelis really know how to party. First the after-conference party on Thursday night, then a quieter speaker dinner last night, but each time, the company was excellent, the food was amazing, and the wine/beer/liquor-of-choice was flowing fast. I don't know if it's just the Alpha CSP folks or Israelis in general, but these people really have a work-hard-play-hard mentality that I just love.

Thanks again to Miya, Ety, Shlomi, Roi, Alex and Ido for a wonderful combination work/vacation trip.


Conferences | Java/J2EE | Languages | Review | Social

Sunday, November 29, 2009 9:08:46 AM (Pacific Standard Time, UTC-08:00)
 Sunday, November 22, 2009
Book Review: Debug It! (Paul Butcher, Pragmatic Bookshelf)

Paul asked me to review this, his first book, and my comment to him was that he had a pretty high bar to match; being of the same "series" as Release It!, Mike Nygard's take on building software ready for production (and, in my repeatedly stated opinion, the most important-to-read book of the decade), Debug It! had some pretty impressive shoes to fill. Paul's comment was pretty predictable: "Thanks for keeping the pressure to a minimum."

My copy arrived in the mail while I was at the NFJS show in Denver this past weekend, and with a certain amount of dread and excitement, I opened the envelope and sat down to read for a few minutes. I managed to get halfway through it before deciding I had to post a review before I get too caught up in my next trip and forget.

Short version

Debug It! is a great resource for anyone looking to learn the science of good debugging. It is entirely language- and platform-agnostic, preferring to focus entirely on the process and mindset of debugging, rather than on edge cases or command-line switches in a tool or language. Overall, the writing is clear and straightforward without being preachy or judgmental, and is liberally annotated with real-life case stories from both the authors' and the Pragmatic Programmers' own history, which keeps the tone light while still driving home the point of the text. Highly recommended for the junior developers on the team; senior developers will likely find some good tidbits in here as well.

Long version

Debug It! is an excellently-written and to-the-point description of the process of not only identifying and fixing defects in software, but also of the attitudes required to keep software from failing. Rather than simply tossing off old maxims or warming them over with new terminology ("You should always verify the parameters to your procedure calls" replaced with "You should always verify the parameters entering a method and ensure the fields follow the invariants established in the specification"), Paul ensures that when making a point, his prose is clear, the rationale carefully explained, and the consequences of not following this advice are clearly spelled out. His advice is pragmatic, and takes into account that developers can't always follow the absolute rules we'd like to—he talks about some of his experiences with "bug priorities" and how users pretty quickly figured out to always set the bug's priority at the highest level in order to get developer attention, for example, and some ways to try and address that all-too-human failing of bug-tracking systems.

It needs to be said, right from the beginning, that Debug It! will not teach you how to use the debugging features of your favorite IDE, however. This is because Paul (deliberately, it seems) takes a platform- and language-agnostic approach to the book—there are no examples of how to set breakpoints in gdb, or how to attach the Visual Studio IDE to a running Windows service, for example. This will likely weed out those readers who are looking for "Google-able" answers to their common debugging problems, and that's a shame, because those are probably the very readers that need to read this book. Having said that, however, I like this agnostic approach, because these ideas and thought processes, the ones that are entirely independent of the language or platform, are exactly the kinds of things that senior developers carry over with them from one platform to the next. Still, the junior developer who picks this book up is still going to need a reference manual or the user manual for their IDE or toolchain, and will need to practice some with both books in hand if they want to maximize the effectiveness of what's in here.

One of the things I like most about this book is that it is liberally adorned with real-life discussions of various scenarios the author team has experienced; the reason I say "author team" here is because although the stories (for the most part) remain unattributed, there are obvious references to "Dave" and "Andy", which I assume pretty obviously refer to Dave Thomas and Andy Hunt, the Pragmatic Programmers and the owners of Pragmatic Bookshelf. Some of the stories are humorous, and some of them probably would be humorous if they didn't strike so close to my own bitterly-remembered experiences. All of them do a good job of reinforcing the point, however, thus rendering the prose more effective in communicating the idea without getting to be too preachy or bombastic.

The book obviously intends to target a junior developer audience, because most senior developers have already intuitively (or experientially) figured out many of the processes described in here. But, quite frankly, I think it would be a shame for senior developers to pass on this one; though the temptation will be to simply toss it aside and say, "I already do all this stuff", senior developers should resist that urge and read it through cover to cover. If nothing else, it'll help reinforce certain ideas, bring some of the intuitive process more to light and allow us to analyze what we do right and what we do wrong, and perhaps most importantly, give us a common backdrop against which we can mentor junior developers in the science of debugging.

One chapter I like in particular, "Chapter 7: Pragmatic Zero Tolerance", is especially good reading for those shops that currently suffer from a deficit of management support for writing good software. In it, Paul talks specifically about some of the triage process around bugs ("When to fix bugs"), the mental approach developers should have to fixing bugs ("The debugging mind-set") and how to get started on creating good software out of bad ("How to dig yourself out of a quality hole"). These are techniques that a senior developer can bring to the team and implement at a grass-roots level, in many cases without management even being aware of what's going on. (It's a sad state of affairs that we sometimes have to work behind management's back to write good-quality code, but I know that some developers out there are in exactly that situation, and simply saying, "Quit and find a new job", although pithy and good for a laugh on a panel, doesn't really offer much in the way of help. Paul doesn't take that route here, and that alone makes this book worth reading.)

Another of the chapters that resonates well with me is the first one in Part III ("Debug Fu"), Chapter 8, entitled "Special Cases", in which he tackles a number of "advanced" debugging topics, such as "Patching Existing Releases" and "Heisenbugs" (concurrency-related bugs). I won't spoil the punchline for you, but suffice it to say that I wish I'd had that chapter on hand to give out to teammates on a few projects I've worked on in the past.
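
To give a flavor of that last one (and this is my own toy example, not one from the book), the classic shape of a concurrency Heisenbug looks something like this in Python:

    import threading

    counter = 0

    def increment(times):
        global counter
        for _ in range(times):
            # Read-modify-write is not atomic: another thread can slip in
            # between the read and the write, silently losing updates.
            counter += 1

    threads = [threading.Thread(target=increment, args=(100000,))
               for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # Expected: 200000. Some runs come up short; others don't.
    print(counter)

Run it a few times and the final count will sometimes come up short and sometimes not; the moment you add print statements inside the loop to watch it happen, the timing shifts and the bug hides. That's the "Heisen" part, and it's exactly the sort of thing the chapter arms you against.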

Overall, this book is going to be a huge win, and I think it's a worthy successor to the Release It! reputation. Development managers and team leads should get a copy for the junior developers on their team as a Christmas gift, but only after the senior developers have read through it as well. (Senior devs, don't despair—at 190 pages, you can rip through this in a single night, and I can almost guarantee that you'll learn a few ideas you can put into practice the next morning to boot.)


.NET | C# | C++ | Development Processes | F# | Industry | Java/J2EE | Languages | LLVM | Mac OS | Parrot | Python | Reading | Review | Ruby | Scala | Solaris | Visual Basic | Windows | XML Services

Sunday, November 22, 2009 11:24:41 PM (Pacific Standard Time, UTC-08:00)
 Tuesday, July 28, 2009
More on journalistic integrity: Sys-Con, Ulitzer, theft and libel

Recently, an email crossed my Inbox from a friend who was concerned about some questionable practices involving my content (as well as a few others'): apparently, I have been listed as an "author" for Sys-Con, I have a "domain" with them, and I have been writing for them since 10 January, 2003, including two articles, "Effective Enterprise Java" and "Java/.NET Interoperability".

Given that both of those "articles" are summaries from presentations I've done at conferences past, I'm a touch skeptical. In fact, it feels like those summaries were scraped from conferences I've done in the past, and I certainly don't remember ever giving Sys-Con (or any other conference) the right to reprint my presentation as an article.

Then it turns out that apparently I'm not the only one suffering this problem. Go. Read that article, then come back. I promise, I'll wait.

(Seriously, go read it.)

Wow. Just... wow. If even half of Aral's story is true (and I'm inclined to believe at least part of it, given that he's done some pretty meticulous documentation of at least his side of the story), then this is beyond outrageous, and squarely into "completely unethical".

Now, I'll be the first to admit, I've not heard back from Sys-Con about any of this, so if I get any sort of response I'll be sure to update this blog post. But...

Calling anyone a "homosexual son of a bitch", "terrorist" or "fag" is so unbelievably offensive it staggers the mind. Normally, I'd be a bit hesitant to just give either party the benefit of the doubt on that one, given just how ludicrous the accusation sounds, but Aral includes screen shots of the articles, which in and of itself lends an air of credibility to the accusation—either Aral is the world's worst Turkish translator, or Sys-Con's translation into Turkish is a bit on the "edgy" side, or Sys-Con really did call him that. Whichever way this goes, it doesn't look good for one of the two parties. But even if we leave that to one side....

Sys-Con is playing with fire by collecting my content and claiming me as an author. Sys-Con never contacted me about becoming a part of their "Ulitzer" website. They never asked me for permission to reprint my articles, though I'll admit I can't find where the articles actually exist, so maybe they didn't actually reprint them but just linked to them... except I can't find the links to the articles or the presentations, either. They never asked me for an updated bio or photo, and in fact, they pretty clearly grabbed bio, photo and "summaries" from an old location, because that bio lists me as a DevelopMentor instructor (which I haven't been for two years or so), and as living in Sacramento, CA (which I haven't been for about three years or so). Let me be very clear about this: I do not write for Sys-Con Media. I never have. They have never asked permission to reuse any of the content I have produced. I am appalled at being included in such a fashion.

Note that I'm not opposed to being linked to, mind you—if I put material on my blog, I generally expect (and hope) that people will link to it, and I don't demand permission or even notification when it happens. But to claim that I've written material for an entity does mean I expect to at least be asked if it's OK to use my likeness, name, or material. No such request was ever made of me, so far as I can remember or find (through my own email archives, which stretch back to 2001).

And I can say that I've thought about this issue before, from the other side of the story—back when I was editor at TheServerSide.NET, we began a "blogger's program" that would take interesting blog posts from around the Internet and "collect" them in some fashion for TSS.NET readers. Originally, the thought was to simply reproduce the content directly on our site, and I hated that idea, for the same reasons as I dislike it when somebody does it to me. Regardless of the licensing model the blog entries are published under, to me, a publication or media firm owes the author at least the right of refusal, and a chance to be notified when their material is reused. (In the end, we chose to ask authors if we could reproduce their material in the program, and we never (to my knowledge) had an author refuse.) It doesn't take a real rocket scientist's brain to figure out that asking permission is never a bad thing to do if you want to maintain good will with your sources of material.

This is an open and public request to Sys-Con media: either contact me about using my name, likeness and material on your website, or remove it. (I have emailed their editorial and asked them to acknowledge receipt of my request.)

In the meantime, I will be making every effort to make sure that other content-producers I know are aware of Sys-Con's practices, so they can act as they see fit.

If you are a reader, and find this distasteful as well, then I suggest you follow some of the suggestions mentioned in Aral's blog post:

    • Tell everyone you know about what Sys-Con is doing (but don't link to them so as not to give them Google Juice). If tweeting, leave out the http:// bit so that your URL is not automatically made into a link.
    • Sys-Con feeds upon the work of authors and speakers to live. If all authors had their content removed from Sys-Con and Ulitzer, they would not have pages to put ads on. So go through their list of authors and notify the ones you know. If they are unaware that they're listed there, they will most likely want themselves removed. Update: I've created a single list of all Sys-Con's Ulitzer authors. More information and the full list are in this post. The original list of authors is at http://www.ulitzer.com/?q=authors. You can ask for your Ulitzer/Sys-Con author page to be removed by emailing editorial@sys-con.com.
    • Contact their advertisers and tell them what you think of their association with Sys-Con.
    • If you know any speakers speaking at Sys-Con events, make sure they know the kind of company they are associating themselves with. Do the same with anyone you know who is thinking of attending one of their events. Raise awareness about their events at your place of work.
    • Make sure Google knows that Sys-Con/Ulitzer is spamming Google with tons of duplicate content. Report them on Google's spam page for posting duplicate content. According to their terms and conditions, Google should stop indexing Sys-Con/Ulitzer. See this comment for a template you can use when reporting them.
    • Make sure Google News knows that they are syndicating libelous articles from Sys-Con. Use the Google News Report an Issue form to report the following articles: http://internetvideo.sys-con.com/node/1017038, http://internetvideo.sys-con.com/node/1028923, http://www.sys-con.com/node/1035252, http://air.ulitzer.com/node/1038383, http://openwebdeveloper.sys-con.com/node/1039556, and http://cloudcomputing.sys-con.com/node/1047589

Meanwhile, I'm going to be talking about this to everybody I know at Microsoft, desperately seeking to find out which department engaged the advertising with Sys-Con, and looking to convince them that they don't need this kind of press or association. Ditto for the contacts (far fewer in number) I have with IBM, and any other Sys-Con advertiser I find.


.NET | C# | C++ | Conferences | F# | Industry | Java/J2EE | Reading | Review | Ruby | Security | Social | VMWare | WCF | Windows | XML Services

Tuesday, July 28, 2009 6:58:00 PM (Pacific Daylight Time, UTC-07:00)
 Wednesday, July 15, 2009
What is "news", and what is "unethical"?

This post from TechCrunch crossed my inbox today, and I find myself quite flummoxed about how I should react.

Assume you have managed, through no overt work on your part (meaning, you didn't explicitly solicit, ask, or otherwise endeavor to obtain), to get ownership of "hundreds of confidential corporate and personal documents" for a company. Assume further that these documents are genuine—there is little to no chance that they could have been forged or fabricated. The documents span a range of sensitivity, from documents that are "somewhat embarrassing to various individuals, but not otherwise interesting", to documents that "show floorplans and security passcodes to get into the Twitter offices", to documents "showing financial projections, product plans and notes from executive strategy meetings". In other words, documents that yes, could create a certain amount of havoc to the corporate entity in question, could embarrass individuals within (and not within) that company, and documents that could lead to a competitive advantage for the entity's competitors.

Now also assume, for the purpose of the discussion, that you are an entity whose business model or raison d'être is to publish—you are a blogger, a "social networking maven", a media outlet, whatever.

Is it unethical to publish these documents? Is it simply trolling for hits? Is there a "journalistic responsibility" to publish this material?

The people from TechCrunch feel like they have a right/responsibility to publish at least some of the documents, and are unswayed by the arguments in the blog's comments about the morality of such a move, including such comments as "This is an a**hole move" and "there's still an appearance of lapse of ethics here" (and that's just within the first half-dozen comments or so). What is particularly interesting is the response from (someone I assume to be) one of the blog's owners:

lol. if we only posted things that companies gave us permission to post this would be a press release site and none of you would be here. News is stuff someone doesn’t want you to write. The rest is advertising.

This comment disturbs me on several levels—it's only news if it's "stuff someone doesn't want you to write"? That's a pretty shallow and narrowly-defined sense of the term, if you ask me, and it puts periodicals like National Enquirer and Star magazines on the same level as the New York Times and CNN. (Although, and I'll freely admit this, having just come through the Michael Jackson media blitz, sometimes it feels difficult to tell the difference between all four of those.)

At the same time, though, it's clear from our own history that journalism has served the public good by shining a bright light into shady corners that some powers-that-be would prefer left unexposed. The abuses described by Upton Sinclair in the turn-of-the-century factories, the rampant sexual harassment in the military exposed by the Tailhook scandal, and certainly the outright blatantly violent suppression of the Civil Rights movement of the 60's in the South were all shining examples of journalism at its finest, showing off dark and ugly parts of the world and—either implicitly or explicitly—demanding that society acknowledge it and either openly accept it or strive to change it (with all three of my examples seeing society choose the latter).

What is "journalistic responsibility" here?

In our chosen field—that of computer science and software—there is clearly a responsibility for those "in the know" to reveal scenarios where information is being purloined or made available that violates individuals' rights to privacy. It's one thing if I trade my personal sales habits to a grocery store chain in exchange for a percentage off the final sale. That's a choice I'm making, consciously and knowingly. (By this point, if you haven't figured that out, you're just deliberately hiding from the fact.) But for somebody else to disclose my purchasing history without my consent to another party, that's brushing up against a very ugly moral gray area. And if a company is choosing to take its customers' personal data and make it available for anyone else to use as they see fit—for whatever purpose that third party can imagine—then cheers and kudos to the whistle-blower who brings media attention on that behavior.

But Twitter doesn't have much of my personal data, and they certainly didn't give it away to anybody—it was stolen from them, according to what I've read so far. What's more, I don't really have that much personal data stored with them—certainly no credit cards, birthdates, financial or medical information, or even family notes. What's there is actually pretty tame, as a Twitter customer.

(Twitter employees are a totally different matter. Admittedly. But let's just stick with the Twitter customer data for now.)

So where is the "journalistic responsibility" in publishing this material?

And are bloggers journalists? Should they be held to the same standards as journalists? And if not, then with all these formerly print-only media moving to the Internet and putting more and more of their material online, where do we draw that line? What's the difference between Fareed Zakaria writing a column on Middle East affairs for Newsweek.com on a monthly basis and Joe Sixpack posting a monthly rant on the illegal and illicit activities of his hometown rival's sports team? Is it just the domain name? And if Joe Sixpack decides to say, point blank, "TechCrunch paid for that material, they hired the guy who broke into the Twitter offices and stole it" on his blog, what avenues does TechCrunch have to decry and/or refute that accusation?

For the record, I oppose what TechCrunch is doing except where there is some blatant legal violation of consumers' privacy. Frankly, if the hacker had approached me with those documents, I'd be working with the FBI to see the guy tossed in jail, because folks, if he did it to them, he could just as easily do it to you.

But this still leaves the deeper question about where bloggers sit in the journalistic continuum, and I admit, I have a lot of mixed feelings on the subject.


Industry | Reading | Review | Social

Wednesday, July 15, 2009 1:35:50 AM (Pacific Daylight Time, UTC-07:00)
 Saturday, July 11, 2009
Thoughts on the Chrome OS announcement

Google made the announcement on Tuesday: Chrome OS, an "open source, lightweight operating system that will initially be targeted at netbooks."

Huh?

I'm sorry, but from a number of perspectives, this move makes no sense to me.

Don't get me wrong—on a number of levels, the operating system needs a little shaking up. Windows7 looks good, granted, Mac OS is a strong contender, and both are clearly popular with the consuming public, but innovation in the operating system seems pretty limited right now to eye-candy window-opening/window-closing effects, different window decorations (title bars and minimize/maximize buttons), and areas along the edges of the screen to store icons. At no point have any of the last three or four OS releases from any of the major vendors—Microsoft, Apple, or the various Linux distros—really introduced anything novel, just infinite variations on a theme. Filesystems are still hierarchical, users still install and manage applications, and so on. In fact, arguably the most interesting development in operating systems has been the iPhone, and most of its innovations center around two things: the two-finger interface, and the complete mental reboot of what a user interface looks and acts like.

Seriously, that's the best we can do?

I see a lot of room for improvements in the operating system experience; for starters, let's do away with the "browser" and just call Firefox, IE and Chrome what they're (far too slowly) evolving into: a generic application host. Get that story right—the acquisition of applications onto the device, the updating of those applications when new versions are available, the offline application experience, and so on—and the operating system and the browser will mesh into a seamless whole. But we're not there yet, not by a long ways, and the first competitor to create such an environment will have a huge advantage over its rivals. Arguably Apple got there first with the iPhone and AppStore, and yet the iPhone still needs iTunes running on a computer to make the experience seamless, and iTunes is definitely not what I call a seamless user experience.

(Besides, the iPhone is hamstrung on a number of levels—I would absolutely despise trying to write this blog post on it, for example.)

Despite the clear window of opportunity for an innovative operating system to step in and make some serious waves in the industry, Google producing an OS really doesn't make sense to me, for a number of reasons.

  • Challenging your opponent on your opponent's turf is never a good idea. A maxim of battle says that one should only battle on favorable terrain, yet Google's deliberately choosing to "cross the line", as it were, into territory that is clearly foreign to them. They have no expertise in marketing it, selling it, researching it, or developing it, while their competitors in this—Microsoft and Apple being the principal two—have been doing it for decades. Literally. I realize that Google has a number of smart people working for them, but it seems pretty presumptuous and arrogant to think they can get this story right, particularly in any kind of short term.
  • This is a difficult problem to tackle. Microsoft's known it for decades, Apple is discovering it all over again, and Linuxers have either wallowed in it as a sign of prowess or just accepted the problem as intractable—it's really hard to get an operating system to recognize the billions of different devices out there. Apple solved it by jealously and zealously chasing anyone who ever tried to run Mac OS on non-Apple hardware. Linux consumers found themselves recompiling kernels or in some cases, having to build device drivers themselves. Microsoft just suffered through it. For a new OS, the only path possible in the beginning is to support the 20% of the devices that 80% of the people use, and hope that nobody else tries a device that isn't on that list and blogs to tell about it. Unfortunately, the chosen target market (consumer netbooks) works against them here in a big way. With developers, it's pretty easy to say, "Sorry, guys, you know how it is, give us a few years, or contribute the patch yourself!"; with consumers, if their BuyMart-bargain-bin web cam doesn't work, it's Google's fault and they'll be up in the acne-spackled BuyMart counter boy's face about it. This will not persuade BuyMart to stock the Chrome-installed netbook for much longer.
  • Is this really the company that swore to "do no evil"? Google's announcement is vague on so many levels, it's almost a FUD play, or else they're trying to blatantly cash in on their "geek cred" to convince investors and analysts that they've finally found a new source of revenue to supplement AdWords. (Well, modulo the fact that this new OS will be open-source, which means it's not really a revenue play, but I'm sure they've got that figured out somehow, too.) Seriously, this doesn't make sense: if you're doing an open-source OS, then where is the source? Where is the transparency? Where is my ability to contribute despite my status as a non-Google developer? What part of this project is open-source in any sense of the term?
  • Netbooks? I realize that netbooks are the new hotness to a lot of people, a compromise between a phone/PDA and a laptop, and that the price point of the netbook means that for the first time, consumers can get into computing for under $250 (rivalling the price of game consoles) that addresses their fundamental needs—email, web surfing and maybe an application or two—but the timing here is just too late. Google's announcement says that "netbooks running Google Chrome OS will be available for consumers in the second half of 2010". Which means that the major competitors (mostly Windows) will have twelve months to convince netbook consumers that Windows (and Windows7, in particular) is the right choice to run the netbook, and Google will be starting from some distance behind the 8-ball. Chrome needs to be available now if they're going to avoid a long and entrenched battle starting from a position of weakness.
  • It's a distraction from their strength. Abraham Lincoln is famous for saying, "You cannot strengthen the weak by weakening the strong", but this represents Google's third or fourth effort into a space that really isn't leveraging their core strength (their ability to scale). Even if the money and resources spent on Chrome (and Android, for that matter) have zero effect on the budgeting and resourcing for Google App Engine and other server plays, the message and story that Google presents to the world is now as disjoint and multifaceted (and therefore harder to grasp) as Microsoft's.
  • Haven't we seen this before? Wasn't it almost a decade ago when another company announced a plan to unify the browser and the desktop? In that case, the world either yawned, rejected it outright ("I don't want to browse my desktop, damnit" was how one friend of mine put it), or sued them over it. Even if Google doesn't run afoul of the DOJ directly, Microsoft is going to love pointing to Chrome OS as clear indication of non-monopoly status the next time DOJ comes calling. If Google does manage somehow to annoy the DOJ antitrust personalities, well... let IBM and Microsoft tell you all about how much fun it is to try to innovate and bring products to market with lawyers looking over your shoulders.
  • Haven't we seen this before? Not too long ago, another vendor tried to go after the "you don't need an operating system" story... except they called it "The Network Is the Computer". All you Java developers, raise your hand. Anybody who doesn't have their hand raised, ask what happened to that vendor from any of the people with their hand in the air. Or ask an Oracle DBA.
  • Haven't we seen this before? Even more recently, another vendor made a play for the netbook+cloud story. All those who've heard of Cloud OS, raise your hand. Anybody who doesn't have their hand raised.... well, I wish I could tell you to go talk to the people with their hand raised, except I don't think anybody does.

This whole idea just feels badly planned and not well thought out. Let's see how it executes; we'll meet back here in a year and compare notes. In the meantime, I'm not hanging up my Java or .NET tools any time soon.


.NET | Industry | Java/J2EE | Languages | Review | VMWare | WCF | Windows | XML Services

Saturday, July 11, 2009 1:37:01 AM (Pacific Daylight Time, UTC-07:00)
 Wednesday, July 01, 2009
Review: "Iron Python in Action" by Michael Foord and Christian Muirhead

OK, OK, I admit it. Maybe significant whitespace isn't all bad. (But don't ever let me catch you quoting me on that.)

The reason for my (maybe) shift in thinking? Manning Publications sent me a copy of Iron Python in Action, and I have to say, I like the book and its approach. Getting me to like Python as a primary language for development will probably take more than any one book can give, but... *shrug* Who knows?

Bear in mind, I have plenty of reasons to like IronPython (Microsoft's Python implementation for the .NET environment):

  • A good friend of mine, Harry Pierson (aka @DevHawk), is the PM on the IPy project, and I'm generally prejudiced in favor of those things that people I know and respect.
  • I'm generally a fan of dynamic languages, particularly those that let you do strange and twisted things to the type system and its instances at runtime. (Yes, I'm looking at you, ECMAScript...)
  • I spent some quality time with IronPython Studio last year while researching a Visual Studio Extensibility "Deep Dive" paper.
  • I've known Jim Hugunin (the creator of IronPython, and Jython before that) for some years, ever since his days working on AspectJ, and he's one of those scary-smart guys that, despite knowing they're scary-smart, still render me stunned when I listen to them.
  • I'm a huge fan of the DLR. It's like having Parrot, but without having to wait a decade (give or take).

But, just to counterbalance the scales, I have plenty of good reasons to dislike IronPython, too:

  • Significant whitespace.
  • The "There's only one way to do it" oath that Pythonistas seem to hold as religion. (Somebody told me that building C-Python—the original implementation—only works for you if you swear a holy oath to The One True Way on the One True Way Bible. Needless to say, I believe them, and have never tried to build C-Python from sources as a result.)
  • Significant whitespace.
  • Uh.... did I mention significant whitespace yet?

I admit, it was with some hesitation that I cracked open the book. Actually, to be honest, I was really ready to just take out all my dislike of significant whitespace and pour it into a heated, vitriolic diatribe on everything that was just wrong with Python.

And...?

Well, OK, I admit it. Maybe significant whitespace isn't all bad.

But this is a review of the book, not the technology. So, on we go.

What I liked about the book

  • The focus is on both .NET and Python, and doesn't try to short-change either the "Python"-ness or the ".NET"-ness by trying to be a "Python book (that happens to run on .NET)" or a ".NET book (that happens to use Python for code samples)". The authors, I think, did a very good job of balancing the two, making this the book to get if you're in that area on the Venn diagram where "Python" overlaps with ".NET".
  • Part 2, "Core development techniques", starts down the "feed you the Python Kool-Ade" pretty quickly, heading straight into Chapter 4 ("Writing an application and design patterns with IronPython") without much of a pause for breath. The authors get into duck typing, protocols, and Model-View-Controller within the first four pages, and begin working on a running example to highlight some of the ideas. (Interestingly enough, they also take a few moments to point out that IronPython on Mono works, and include a couple of screen shots to that effect as we go, though I personally wonder just how many people are really going down this path.) I like the no-holds-barred, show-you-the-code style, but only because they also take time throughout the prose to talk about some of the concepts at work underneath and laced throughout the code. "Show me then tell me" is a time-honored tradition, but too many authors forget the "tell me" part and stop with code. These guys do a good job of following through.
  • The chapters in Part 3, "IronPython and advanced .NET", form an interesting collection of how IronPython can fit into the rest of the .NET stack, demonstrating how to use IronPython with WPF, ASP.NET, and IronPython's crowning glory, Silverlight. If you're into front-end stuff, this is the section where I think you're going to have the most fun.
  • The chapters in Part 4, "Reaching out with IronPython", are, I think, the most important part of the book, showing how to extend IronPython (chapter 14) with C#/VB extensions (similar to how a C-Python developer would extend Python by writing C code, but much, much simpler) and the opposite—how to embed IronPython inside existing C#/VB applications (chapter 15), which is really an exercise in using the DLR Hosting APIs; a minimal sketch of that embedding follows this list. While the discussion in chapter 15 is good, I wish it had a more thorough discussion of how the DLR can be hosted regardless of the scripting language, though I admit that's pretty far beyond the scope of this book (which is focused, after all, entirely on IronPython, and as a result should stay focused on how to host IPy).
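
(For the curious, here's a minimal sketch of the kind of embedding chapter 15 walks through, using the DLR Hosting APIs from C#; the Python snippet and the names in it are my own invention, not the book's:)

    using System;
    using IronPython.Hosting;              // Python engine factory
    using Microsoft.Scripting.Hosting;     // ScriptEngine, ScriptScope

    class Embedding
    {
        static void Main()
        {
            // Spin up an IronPython engine inside this CLR process
            ScriptEngine engine = Python.CreateEngine();
            ScriptScope scope = engine.CreateScope();

            // Run some Python source against the scope...
            engine.Execute("def greet(name): return 'Hello, ' + name", scope);

            // ...then pull the function back out and call it from C#
            var greet = scope.GetVariable<Func<string, string>>("greet");
            Console.WriteLine(greet("IronPython"));   // "Hello, IronPython"
        }
    }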

What I found "Meh" about the book

  • Part 1 ("A new language for .NET", "Introduction to Python", and ".NET objects and IronPythong") does a good job of bringing the rank beginner up to speed, getting some basic Python ideas across in the same breath that they bring .NET home. The only problem is, it only works well if you're neither a Python programmer nor a .NET programmer. Chapter 1, for example, does a sort of Cannonball-into-the-pool kind of dive into Python, but dives equally into the "Iron" parts as it does the "Python" parts. If you're either a Pythonista or a .NETter, I suspect you're going to be tempted to flip pages pretty quickly, and (I suspect) miss a few things. Chapter 2 is all about Python (meaning .NETters will probably spend some time here), but it certainly doesn't feel like an exhaustive reference, nor does Chapter 3 stand as an exhaustive discussion about all things .NET, either. I almost wish all three chapters had been collapsed into one—suffice it to say, I don't feel like I know the Python language, and don't feel like this book could be my Python reference next to me as I learn it, and I know that it's not a great .NET reference, either. Fortunately, the goal of these three chapters feels pretty clearly to be "Teach you just enough to make you dangerous (and able to understand the rest of the book)", and once we hit Part 2, rubber meets road pretty quickly.
  • By the time you hit Chapter 7, less than halfway through the book, the authors have created a fairly nice, if simplistic, application for later dissection, but it's not until then that they begin unit-testing, even though they insist (on page 17) that "Dynamic language programmers are often proponents of strong testing rather than strong typing" (a quote they attribute to Bruce Eckel, though I'm relatively certain I heard Dave Thomas and Neal Ford say it with respect to Ruby, long before Eckel started "Thinking in Python... or Flex... or whatever"). If unit-testing is that important, why wait three chapters into the application's development before writing a single unit test? This doesn't jibe with me, somehow.
  • If you're into back-end stuff, chapter 12 on "Databases and web services" is pretty bland. The fact that the two are combined into a single chapter is indicative, all by itself, of how deep or intensive the coverage goes, and there's zero mention of anything beyond basic ADO.NET. The coverage on web services covers REST relatively well, but there's zero coverage of WCF, and the whole of SOAP-based services is all of four or five pages. And Workflow? Doesn't exist, isn't even mentioned (except for an appearance in a table, "The major new APIs of .NET 3.0"). Yikes.

What I actively disliked about the book

Actually, not much. Manning did their usual superb job of arrowed callouts to point out particular concepts in the code listings, the copyediting is professional (meaning there are no obvious typos or misspellings that break up the flow of prose, something that not all publishers seem to take seriously), and the graphics flow nicely alongside the prose, not dominating the page but accentuating it.

In fact, about the only thing I'd care to criticize is the huge number of footnotes, particularly in the first chapter. (By page 20 in the book, there have already been 30 footnotes.) When you have three footnotes per page on average (and sometimes more), it does tend to distract, at least for me. It feels like most of them could have been folded into the main prose, or left out entirely, but that could just be a difference of writing style, too.

Summation

If you're a .NET developer interested in learning/using IronPython on your next project, this is a definite winner. If you're a Python developer looking to see how to break into .NET, I'm not so sure this is your book, but I say that mostly because I'm not a Pythonista and can't really speak to how that mindset will find this as an introduction to the .NET space. My intuition tells me that this would be a good springboard into another book on .NET for the Python programmer, but I'll have to leave that to Pythonistas who've read this book to comment one way or another.


.NET | C# | Languages | Python | Reading | Review | Visual Basic | WCF | Windows | XML Services

Wednesday, July 01, 2009 2:00:14 AM (Pacific Daylight Time, UTC-07:00)
Comments [3]  | 
 Saturday, June 27, 2009
Review: "Programming Clojure", by Stu Halloway

(Disclaimer: In the spirit of full disclosure, Stu is a friend, fellow NFJS speaker, and former co-worker of mine from DevelopMentor.)

I present this review to you in two parts.

Short version: If you want to learn Clojure, and you're familiar with at least one programming language, you'll find this a great resource. If you don't already know a programming language, or if you already know Clojure, or if you're looking for "best practices" to cut-and-paste, you're going to be disappointed.

Long version: Recently, fellow NFJS speaker Stu Halloway decided to take up a new language, and came to Clojure. He found the language interesting enough to write a book on it, something he hasn't done since his Java days, and the result is a nice walk through the language and its environment for experienced Java developers who want to understand Clojure's language, concurrency concepts, and programming model.

Now, let's be 100% honest about this: if you're coming at this book expecting it to be a language reference, you will probably be disappointed (as this guy obviously is). Stu's not like that—he's not going to re-create material that's available elsewhere, or that can be found with an easy Google search. Stu will not waste your time that way—he wants to tell you a story, one that takes you from "I'm a Java guy, but clueless about Lisp, dynamic languages, functional programming, concurrency, or macros" to "Wow. I know kung-fu." by the shortest path possible, but without trying to lobotomize you. He wants—no, expects—the readers of his book to be propping the text open with a cell phone on one side and the dinner plate on the other, craning their necks over to scan the pages, typing the examples into the REPL shell to try them out, seeing them work, then spending a few minutes experimenting with them before moving on to the next paragraph or page.

(Oh, I suppose you could just cut and paste them from the PDF version of the book, but where's the fun in that?)

The fact is, the concepts behind Clojure make up what's important to learn here, and readers of this book will come away like the panda from the movie, realizing that "There is no Secret Ingredient", that the power of Clojure comes not from its super-secret language sauce or special libraries, but in the way Clojure programmers approach problems and think about programming. And for that reason, if you're a programmer—even if you don't program on the JVM—you really want to take a look at what Stu's talking about (and Rich Hickey is creating).

Just remember, cellphone and dinner plate. Otherwise you'll be missing out on so much.


.NET | C# | C++ | F# | Java/J2EE | Languages | Reading | Review | Ruby | Scala | Visual Basic

Saturday, June 27, 2009 10:34:56 PM (Pacific Daylight Time, UTC-07:00)
Comments [0]  | 
 Sunday, June 14, 2009
The "controversy" continues

Apparently the Rails community isn't the only one pursuing that ephemeral goal of "edginess"—another blatantly sexist presentation came off without a hitch, this time at a Flash conference, and if anything, it was worse than the Rails/CouchDB presentation. I excerpt a few choice tidbits from an eyewitness here, but be warned—if you're not comfortable with strong language, skip the quoted block that follows.

Yesterday's afternoon keynote is this guy named Hoss Gifford — I believe his major claim to fame is that viral "spank the monkey" thing that went around a few years back.  Highlights of his talk:

  • He opens his keynote with one of those "Ignite"-esque presentations — where you have 5-minutes and 20 slides to tell a story — and the first and last are a close-up of a woman's lower half, her legs spread (wearing stilettos, of course) and her shaved vagina visible through some see-thru panties that say "drink me," with Hoss's Photoshopped, upward-looking face placed below it.
  • He later demos a drawing tool he has created (admittedly with someone else's code) and invites a woman to come up to try it.  After she sits back down, he points out that in her doodles she's drawn a "cock."
  • Then he decides he wants to give a try at using the tool to draw a "cock" (he loves this word) — and draws a face, then a giant dick (he redraws it three times) that ultimately cums all over the face.
  • A multitude of references to penises and lots of swearing — and also "If you are easily offended, fuck you!"
  • And then, to top it off, a self-made flash movie of an animated woman's face, positioned as if she's having sex with you, who gradually orgasms based on the speed of your mouse movement on the page.

Wow. Just... wow. To call this unprofessional smacks of calling Hitler a "socially awkward individual"... or using a euphemism like "mild medical condition" to refer to death. This is so far "over the line" that it's unbelievable. Even Mr. Aimonetti's "CouchDB" presentation, as bad as it was, at least tried to tie the analogy together in a meaningful, if offensive, way. This is just male posturing at its worst. (I'm shocked Hoss didn't whip off his pants and demand the women in the room bow down in worship to his obviously superior manhood.)

Fortunately, according to the source, the conference organizer seems to be pretty responsive, so kudos to the one adult in the room, but....

What's worse, apparently the presenter and more than a few of his pals are (in the best traditions of assholery) blatantly unrepentant about the whole thing, claiming the moral high ground in much the same way that the Rails idiots did—it's all in good fun, if you don't find it funny you're a prude, and so on:

I checked Twitter (hashtag #flashbelt) to see what the responses were.  Here are some notable remarks:

  • Fonx is reading the #flashbelt rants on Hoss offending the ladies w/ a few swear words & a penis drawing - r u really that prudish & sexist?
  • nthitz lol @hoss69 "If you are easily offended, fuck you" #flashbelt
  • livenootrac Ladies of #flashbelt , I am sorry for the Hoss preso, but in the flash community he gets a pass, kinda like Don Rickles - that's just Hoss.
  • CujoJpn @livenootrac And there were many ladies at #flashbelt who were offended by Hoss' Preso some were thick skinned and took it as is.

So, if you didn't like it then
a) you are a prude - and sexist (?)
b) fuck you
c) suck it because Hoss gets a pass here in the boy's club known as "the flash community" and
d) you are a wimpy girl who isn't strong enough / man enough / "thick-skinned" enough  to deal with it.

Even more... wow. Talk about justification and marginalization. Amazing.

Before I figuratively smack this Hoss guy around the blog for a while, let's take a brief moment for reflection—what's going on here? Why all the misogynistic presentations recently? Is this reflective of a general trend in the programming industry? Of society in general? Is the world coming to an end?

A few possibilities present themselves:

  • The lack of women in the IT industry means there's nobody around to act as a "gender filter" to keep things on an even keel. In other words, the genders constantly filter themselves based on the company they keep, and because the boys who put these presentations together don't have female input, they simply don't know where to draw the line for mixed company. This theory also presumes that an industry made up primarily of women would lack such a filter in the other direction, and "girls will be girls" as a result. Unfortunately I have no good counterexamples at hand to examine—anybody know of an industry populated primarily by women, and can weigh in with experience there? The closest I get is my brief experience working in a restaurant with an almost-all-women serving staff, and from that stint, yep, the theory holds. Solution? Easy: get more women in IT, and things will re-balance themselves naturally.
  • Programmers are principally males who have no redeeming social skills. In other words, the industry gathers up exactly the kind of men who find objectifying women and reveling in late-acquired testosterone overdoses to be gratifying, and this kind of behavior is the result. If true, it leads to the conclusion that programmers are no more evolved than the Navy sailors involved in the Tailhook scandal of a few years ago. So go ahead, smack your wives and girlfriends around a little if they get a little "uppity", it's OK, 'cuz u r a l33t d00d. Personally? I find the idea ludicrous—there is definitely a strong antisocial streak that runs through the IT ecosystem (how many of you met your friends via World of Warcraft again?), but like all stereotypes, there are some elements of truth to it, and a lot of exaggeration. And frankly, anybody who believes in this theory is welcome to come with me to dinner at a No Fluff Just Stuff show and meet the other speakers, and listen in on our "boys club" conversations, including questions like, "Which movie best represents the book it was made after?" and "If given a mandate to create a programming language, what language would your language most resemble?". Oh, and the odd fart joke. We are boys, after all.
  • We're hypersensitive to the subject right now. In other words, these kinds of presentations have always been going on, and it's just that we notice them now, in the same way that you notice a particular brand of car on the road a lot more when you're thinking about buying that brand and model of car. Frankly, I don't buy this argument—I've been to a lot of presentations over the past decade, and I've never seen any that were anything like this.
  • This is the YouTube generation, with access to everything the Internet has to offer, and this is "just how they do things". After all, how much maturity, sexual discretion and adult behavior can we expect of the generation that gave us "Girls Gone Wild" and its ilk? It's just a "generation gap" thing, and we old fogies who didn't grow up with Internet porn just a browser-click away just don't "get it". Hmm.... somehow, I just don't buy it. Sure, there may be some elements of this involved here (I'm really curious to see what all these "Girls Gone Wild" girls are going to say to their own daughters in a decade or so...), but I think that's too easy an answer, and an eminently unhelpful one.
  • We have copycatters out there trying to follow the path of people they respect. If you're looking up at this Hoss character and thinking, "I want to be just like him!", you really should see a therapist and develop a sense of self, before you find yourself without friends. Hoss gets a pass because of your misguided fan-boi hero-worship. So does Paris Hilton. You want to be the Paris Hilton of your social circle? Go for it. After all, she's highly respected and loved, right? Take a clue from the next car wreck you drive past—everybody's slowing to look not because they wish they were in the body bag, folks, but because we have a ghoulish fascination with it. In the case of Ms. Hilton, that ghoulish fascination is with those who self-destruct in spectacular fashion. (Me, I'd love to be the fly on the wall at the Hoss residence when he tries to explain this whole thing to his daughter or his date/girlfriend/wife, if he ever finds one.)
  • The presenters taking this tack are looking for an easy path to fame. In the grand traditions of Andrew Dice Clay ("Oh!"), the easiest way for a presenter to "stand out" from the rest of the crowd of presenters is to do something outrageous and call it "edgy", and stake out a claim on the edge of civilization, rather than try to integrate with the rest of the crowd and build something up slowly. Don Box has already claimed "HTTP is dead", I made the analogy between a technology and a military conflict, and Matt Aimonetti claimed a data storage framework "performs like a pr0n star", so what's left but to stake out ground even further out on the fringe and just be misogynistic? Fortunately, history suggests that people with content-free/shock-heavy presentations (or even content-heavy/shock-heavy ones) don't go the distance, so to speak, and that once there's nowhere more shocking left to go, the audience comes back to the content-heavy/shock-light discussions and stays there for a while. Unfortunately, this means we're going to have to suffer through somebody's "Live YouPorn filming" talk first, which I'm not looking forward to.

And now for the smacking around... but you know, I suddenly realize that the volume of comments on the original post leaves me with nothing to do or say that's not already being said, so to just "pile on" would only serve to let me vent, and I have other outlets for that. But it would be inappropriate to just "walk away", so to speak, so with that in mind....

Hoss, you're an idiot. Like any sprinter, you're going to lead the pack for a bit, but soon enough, your "shtick" is going to flame out and you'll be left behind with all the other "shock jocks" of the 80's who found their material unwelcome after a while. So enjoy the spotlight (such as it is) while you can. In the meantime, I'm off to revise a few presentations, sticking with solid ideas and analogies, and maybe dropping the odd F-bomb when I want to make a point, just for emphasis, because I know something you apparently don't:

Shock makes a point because of the contrast to the rest of the talk, not because of its inherent "edginess".

Meanwhile, by all means, continue to be an idiot. You just make me look better by comparison, for which I thank you.


.NET | C# | C++ | Conferences | F# | Flash | Industry | Java/J2EE | Languages | Mac OS | Review | Ruby | Scala | Social | Windows

Sunday, June 14, 2009 3:17:44 PM (Pacific Daylight Time, UTC-07:00)
Comments [22]  | 
 Friday, May 15, 2009
TechEd 2009 Thoughts

These are the things I think as I wing my way out of LA fresh from this year's TechEd 2009 conference:

  • I think I owe the attendees at DTL309 ("Busy .NET Developer's Guide to F#") an explanation. It's always embarrassing when your brain freezes during a presentation, and that's precisely what happened during the F# talk—I completely spaced on the syntax for implementing an interface on a class in F#. (To the attendees who commented "consider preparing a bit better so you dont forget the sintax :)" and "Not remembering the language syntax sorta comes across bad doesn't it?", you're absolutely right, which prompts this next sentence.) I apologize profusely to those who were there—I just blew it. For the record, the missing syntax looks like this:
    #light

    type IStudy =
        abstract Study: string -> unit

    type Person(firstName : string, lastName : string, age : int) =
        member p.FirstName = firstName
        member p.LastName = lastName
        member p.Age = age
        override p.ToString() =
            System.String.Format("[Person: firstName={0}, lastName={1}, age={2}]",
                                 p.FirstName, p.LastName, p.Age)

    type Student(firstName : string, lastName : string, age : int, subject : string) =
        inherit Person(firstName, lastName, age)
        interface IStudy with
            member s.Study(sub : string) =
                System.Console.WriteLine("Hey, Ma, I'm studying {0}!", sub)
        member s.Subject = subject
        override s.ToString() =
            System.String.Format("[Student: " + base.ToString() + " subject={0}]", s.Subject)

    Truth is, though, right now not a lot of people (myself included) are writing types that formally implement a given interface—the current common practice appears to be an object expression instead, something along these lines:
    let monkey =
        { new IStudy with
            member p.Study(subject : string) =
                System.Console.WriteLine("Oook eeek aah aah {0}!", subject) }
    monkey.Study("Visual Basic")

    In this way, the object handed back still implements the interface type that the client wants to call through, but the defined type remains anonymous (and thus provides an extra layer of encapsulation against implementation details leaking out). The most frustrating part about that particular snafu? I had a Notepad window open with some prepared code snippets waiting for me (a fully-defined Person type, a fully-defined Student type inheriting from Person, and so on) if I needed to grab that code because typing it out was taking too long. Why didn't I use it? I just forgot. Oy.....
  • Clearly Microsoft is thinking big things about Azure. There were a lot of sessions around Azure and cloud computing, far more than I'd honestly expected, given how new (and unreleased) the Azure bits are. This is a subject I would have expected to see covered this deeply at PDC, not TechEd.
  • TechEd Speaker Idol is a definite win, to me. I watched the final round of Speaker Idol on Thursday night (before catching the redeye out to Atlanta for the NFJS show there this weekend), and quite honestly, I was blown away by the quality of the presentations—every one of them was better than some of the TechEd sessions I'd seen. It was great to hear that not only will the winner, who did a great presentation on legacy application support in Windows7 (and whose name I didn't catch, sorry), be guaranteed a slot at TechEd, but that the runner-up, a Polish security expert who demoed how to break Process Explorer (in front of Mark Russinovich, no less!), will also be speaking at TechEd Berlin this year.
  • As always, the parties at TechEd were where the real value lay. This may seem like an odd statement to those whose heads are a bit full right now from five days' worth of material (six, if you attended a pre-con), but remember that I'm a speaker, so the sessions aren't always as useful as they are to people who've not seen this content before (or have the kind of easy access to the people building it and/or presenting it that I'm fortunate and privileged to have). Any future attendees should take serious note, though: networking is a serious part of this business, and if you're not going out to the parties (or creating a few of your own while you're there) and handing out business cards left and right, you're missing a valuable opportunity.
  • I'm looking forward to TechEd 2010. Particularly because, thanks to a few technical snafus, I had the chance to sit down with the folks who organize and run TechEd and vent for a little bit about everything I found annoying (as a speaker). Not only were my comments not blown off, but it started a really productive discussion about how to make the behind-the-scenes experience for the TechEd speakers a more pleasant and streamlined one. What's more, we're planning to revisit some of these discussions in the months to come as they start their preparations for TechEd 2010 (in New Orleans). I'm looking forward to those conversations and (hopefully) helping them eliminate some of the awkwardness that I've seethed over in the past.

New Orleans in the summer will not be an entirely wonderful experience (I'm told it gets monstrously humid there in the summers, but it can't be any worse than Orlando is/was), but I'm honestly very curious to get back there to see what post-Katrina New Orleans looks and feels like, and to maybe do my (very little) part to help the area claw its way back by maybe staying an extra day or two and taking in some of the sights. (I'm hoping that Sara Ford will be willing to act as tour guide.....)

In the meantime, thanks to all of you who came, and remember—if you attended a talk and you want to say "thanks" to the speaker who gave it, the best way is to take the five minutes to fill out the evals for that talk. (Speaking personally, I don't even care so much about the scores you give me, but the comments are absolutely invaluable.)

See y'all next year!


.NET | C# | Conferences | F# | Languages | Review | Visual Basic | WCF | Windows | XML Services

Friday, May 15, 2009 8:18:19 PM (Pacific Daylight Time, UTC-07:00)
Comments [5]  | 
 Saturday, May 02, 2009
Windows 7 RC install experience

Since a number of people have been connecting to my blog via my last post on installing Windows 7 into a VMWare image, I thought since the Windows7 RC is now available, I'd update my experiences with installing it.

I downloaded the Windows7 RC ISO image (a freakishly hideous name containing every character on my US keyboard, plus a few in Klingon, I think.... if you can stand it, the full name of the ISO is 7100.0.090421-1700_x86fre_client_en-us_Retail_Ultimate-GRC1CULFRER_EN_DVD) from the Microsoft CONNECT website, not bothering with any of the other images (x64, ia64, and a "server" image I've not explored yet), using Microsoft's File Transfer Manager. (I know, I know, somebody's going to complain again about the ISOs not being available via a straight HTTP download or Torrent, but this is just an RC release, folks, and this is ostensibly to Microsoft-friendly customers who already have the FTM utility installed.) Took about 3+ hours to download on my home connection... or so it claimed. I went to bed after starting it last night. It was done when I woke up. What more do you want from me?

I created a new VMWare image, as a "Windows Vista" VM with 1GB RAM, a 60GB IDE hard disk (by default Fusion wants to create a 40 GB SCSI disk, but IDE seems to play nicer with the early betas of Microsoft OS'es, and I made it all one file rather than Fusion's default "Split into 2GB files" option), with the experimental 3D graphics turned on, battery status turned off, and (this is HUGE) the "Allow your Mac to open applications in the virtual machine" option turned OFF. Can't repeat this enough: for ANY VMWare VM containing Windows inside of it, turn off that option—leaving it on sucks up HUGE amounts of CPU time. (It's barely documented, and only determined Googling found that this was what was rendering my VMWare Fusion 2 images all but unusable.)

I attached the ISO to the VMWare CD and turned the thing loose. It takes a while, but so long as the ISO file and the VMWare VMDK disk file are on separate drives, the perf isn't too bad—roughly twenty minutes later (or, as I measure things, one randomly-generated map game of Pax Galaxia later), the image had installed all the core files on the VM disk, restarted itself, finished the installation, and restarted itself again. (I have no idea why Win7 wants to reboot itself twice during the install—if I remember the Vista installs correctly, it only restarts once.) As I write this, I'm staring at the "Setup is preparing your computer for first use" screen with the funky Cylon-like flashing bar underneath the text (I'm serious, it really looks like the graphic artists at Microsoft are paying homage to BSG during that Setup screen). Whoops, I take it back—got through that screen rather quickly, and now we're into the username/password/product key stage. Plug that in, set the Update policy, the date and time, the network defaults (Public Location for all my VMs, just because), and.... "Welcome".

There's no Step Four. Although, according to Windows Update, there's already an update for Windows7 that should be downloaded and installed. *grin* Actually, it seems like the driver it installed was for the VMWare virtual sound device, which normally doesn't kick in until I install the VMWare Tools. It tells me that this is an "Unsupported Creative Sound Device", however, so maybe it's an older driver. *shrug* Not sure, don't care, because my next step is....

Install the VMWare Tools. I install VMWare Tools in the image, after the Update is complete. (No restart was required, so why not?) Actually, let me rephrase that—I tried to install the VMWare tools, but when I selected it from the Fusion menu bar... nothing happened. Hmm. OK, let's do the restart and see what happens. VM shuts down quickly enough (no having to wait for updates to finish, which was somewhat annoying with Vista), and when I restart, it seems to restart quickly enough (again, no obvious updates to be installed), so I get to a working desktop (640x480, how did we ever think this was reasonable?!?), and try the Install VMWare Tools option from the Fusion menubar again. It thinks for a bit, and the cursor flashes to the "pointer-with-CD" icon for a second before flashing back, but after a few seconds, the "What do you want to do?" (Autoplay) menu pops up as if I'd slipped the CD into the drive, so all looks good. Go through the UAC "Continue/Cancel" dialog (see below), choose "Complete" for the VMWare Tools install options, and let 'er rip. Disks spin, lights flicker, and a "VMWare Shared Tools" network folder shows up on the Win7 desktop, indicating that it's suddenly discovered the Shared Folder (to my MacOS user account) is there. But now we're back to the Windows-display-exercise program, which leads me to believe that it's the VMWare driver that's doing the exercise, not Windows itself. (VMWare? Anybody listening and care to comment?)

And now I'm into Win7 desktop customization steps, things like display sizing and desktop icon selection, background image, and all that other jazz that you probably don't care about. (If you do, then I'm a bit worried about you—be an individual! Choose your own settings!) All in all, pretty flawless and smooth.

Thoughts on the process:

  • It feels like we're getting away from the "minimal install" process that Vista tried to create. For a while, there was a meme that said that installing Windows was too hard for the average person, and Microsoft promised to reduce the number of steps needed to install the OS. Take the date/time screen, for example: it picked up the defaults from the underlying (virtual) hardware, so why not just assume those and skip that step? Users can always change it later.
  • I still have to set an Administrator password. I know that Microsoft is trying to find that sweet-spot balance between "too secure" and "insecure" for desktop operating systems, but I have to hand it to the Ubuntu folks here—the "passwordless root" idea that they use is pretty slick. MacOS uses it (for the most part) in places, as well. I like the balance that approach achieves: it forces the user to enter "superuser" mode to do something sensitive, but it doesn't challenge for a password (unless the superuser sets one) every time.
  • It's not going through display-screen calisthenics on each startup with this build. My previous Win7 image, every time I restart the VM, goes through every possible video/monitor size combination before settling in on the resolution I established in the last session. That was a bit disconcerting, until I realized that it's Windows trying to get some exercise in to be less overweight. *grin*
  • What, no PowerShell installed by default? Either it's not there, or it's buried pretty deeply. Command Prompt (cmd.exe) is right where it's always been, under Accessories, but no PowerShell.... Whoops, no, I take it back, it's in a folder underneath Accessories, forcing one more click to get to it. Hey Microsoft: do me a favor and pin that guy to the Start Menu. Make it easy for me to use, if you really want me to believe that this is supposed to replace Command Prompt someday.
  • On that note, though, the PowerShell "ISE" (Integrated Scripting Environment) is an interesting new toy to play with.
  • "Pin to Taskbar" is an interesting option that I'm going to have to play around with. Not being a huge MacOS Dock fan (which is pretty clearly the inspiration for the new Taskbar), I'm not sure how well I'll like the new "it's the QuickLaunch and the Taskbar combined" idea.

Overall, I'm looking forward to putting a few things into this image (VS 2008, VS 2010, Office, and so on) and seeing how it reacts. As always, your mileage may vary, no implied warranties with this blog post, blah blah blah, but if you do anything with the Windows OS, you really should get hold of the RC (build 7100) and put it into a Virtual PC, VMWare, VirtualBox, Xen or some other virtualized box to play with. Like it or not, it's entirely reasonable to believe that Windows7 is going to win a few folks back from the Vista "less-than-I-expected" crowd.

As always, caveat emptor, and feel free to comment....


.NET | C# | F# | Industry | Review | Windows

Saturday, May 02, 2009 12:18:20 AM (Pacific Daylight Time, UTC-07:00)
Comments [2]  | 
 Tuesday, January 13, 2009
Windows7 VM, pre-built

I'm getting *hammered* by the Google "Windows7 VMware" hits, which I can only assume is from people looking for hints and advice on installing Windows7 into a VMWare image, and I feel compelled to point out that there's already a pre-built VMWare VM available from the "Virtual Appliance" pages at VMware.com; currently, it resides here. Note that you will need to BitTorrent it down; I haven't found a straight HTTP download link from that (off-vmware.com) site.

I wish I knew the full legalities of making said VM available; I only hope that the guys who are doing it have checked. But if you're at all concerned about such things, trust me, it's pretty painless to install Win7 into a VM of your own making.


Review | VMWare | Windows

Tuesday, January 13, 2009 2:55:08 AM (Pacific Standard Time, UTC-08:00)
Comments [2]  | 
 Sunday, January 04, 2009
"Pragmatic Architecture", in book form

For a couple of years now, I've been going around the world and giving a talk entitled "Pragmatic Architecture", talking both about what architecture is (and what architects really do), and ending the talk with my own "catalog" of architectural elements and ideas, in an attempt to take some of the mystery and "cloud" nature of architecture out of the discussion. If you've read Effective Enterprise Java, then you've read the first version of that discussion, where Pragmatic Architecture was a second-generation thought process.

Recently, the patterns & practices group at Microsoft went back and refined their Application Architecture Guide, and while there's a lot about it that I wish they'd done differently (less of a Microsoft-centric focus, for one), I think it's a great book for Microsoft-centric architects to pick up and have nearby. In a lot of ways, this is something similar to what I had in mind when I thought about the architectural catalog, though I'll admit that I'd prefer to go one level "deeper" and find more of the "atoms" that make up an architecture.

Nevertheless, I think this is a good PDF to pull down and put somewhere on your reference list.

Notes and caveats: Firstly, this is a book for solution architects; if you're the VP or CTO, don't bother with it, just hand it to somebody further on down the food chain. Secondly, if you're not an architect, this is not the book to pick up to learn how to be one. It's more in the way of a reference guide for existing architects. In fact, my vision is that an architect faced with a new project (that is, a new architecture to create) will think about the problem, sketch out a rough solution in his head, then look at the book to find both potential alternatives (to see if they fit better or worse than the one s/he has in her/his head), and potential consequences (to the one s/he has in her/his head). Thirdly, even if you're a Java or Ruby architect, most of the book is pretty technology-neutral. Just take a black Sharpie to the parts that have the Microsoft trademark around them, and you'll find it a pretty decent reference, too. Fourthly, in the spirit of full disclosure, the p&p guys brought me in for a day of discussion on the Guide, so I can't say that I'm completely unbiased, but I can honestly say that I didn't write any of it, just offered critique (in case that matters to any potential readers).


.NET | C# | C++ | F# | Flash | Java/J2EE | Languages | Reading | Review | Ruby | Visual Basic | Windows | XML Services

Sunday, January 04, 2009 6:30:53 PM (Pacific Standard Time, UTC-08:00)
Comments [2]  | 
 Tuesday, November 25, 2008
Dustin Campbell on the Future of VB in VS2010

Dustin Campbell, a self-professed "IDE guy", is speaking at the .NET Developer's Association of Redmond this evening, on the future of Visual Basic in Visual Studio 2010, and I feel compelled, based on my earlier "dissing" of VB in my thoughts-on-PDC post, to give VB a little love here.

First of all, he notes publicly that the VB and C# teams have been brought together under one roof, organizationally, so that the two languages can evolve in parallel to one another. I have my concerns about this. Frankly, I think the Managed Languages team at Microsoft is making a mistake by making these two languages mirror images of one another, no matter what their customers are telling them; it's creating an artificial competition between them, because if you can't differentiate between the two on a technical level, then the only thing left to differentiate them on is an aesthetic level (do you prefer curly braces and semicolons, or keywords?). Unfortunately, the market has already done so, to the tune of "C# developers make more than VB developers do (on average)", leaving little doubt in the minds of VB developers where they'd rather be... and even less doubt in the minds of C# developers where they'd rather the VB developers remain, lest the supply and demand curves shift and move the equilibrium point of C# developer salaries further south.

Besides, think about this for a moment: how much time and energy has Microsoft (and other .NET authors) had to invest in making sure that every SDK and every article ever written has both C# and VB sample code? All because Microsoft refuses to simply draw a line in the sand and say, once and for all, "C# is the best statically-typed object-oriented language for the CLR on the planet, and Visual Basic is the best dynamically-typed object-oriented language for the CLR on the planet", and run with it. Then at least there would be solid technical reasons for using one or the other, and at least we could take this out of the realm of aesthetics.

Or, contrarily, do the logical thing and create one language with two parsers, switching between them based on the file extension. That guarantees that the two evolve in parallel, and releases resources from the languages team to work on other things.

Next, he shows some simple spin-off-a-thread code, with the Thread constructor taking the name of a method to run, traditional delegate kinds of stuff, then notes the disjoint nature of referencing a method defined elsewhere in the class but only to be used once. Yes, he's setting up for the punchline: VB gets anonymous methods, and "VB's support for lambda (expressions) reaches parity with C#'s" in this next release. I don't know if this was a feature that VB really needed to get, since I don't know that the target audience for VB is really one that cares about such things (and, before the VB community tries to lynch me, let me be honest and say that I'm not sure the target audience for C# does, either), but at least it's nice that such a powerful feature is now present in the VB language. Subject to the concerns of the last paragraph, of course.
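
To make the before-and-after concrete, here's the C# equivalent (a trivial example of my own, not Dustin's demo code): the named-method version pushes the logic away from its single call site, where the lambda keeps it inline.

    using System;
    using System.Threading;

    class ThreadDemo
    {
        // The "disjoint" style: the logic lives in a method defined elsewhere,
        // even though it's only ever used in one place
        static void DoWork()
        {
            Console.WriteLine("Working...");
        }

        static void Main()
        {
            Thread t1 = new Thread(DoWork);   // delegate to the named method
            t1.Start();
            t1.Join();

            // The anonymous-method/lambda style keeps the logic at the point of use
            Thread t2 = new Thread(() => Console.WriteLine("Working..."));
            t2.Start();
            t2.Join();
        }
    }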

Look, at the end of the day, I want C# and VB to be full-featured languages each with their own raison d'être, as the French say, their own "reason to be". Having these two "evolve in parallel" or "evolve in concert" with one another is only bound to keep the C#-vs-VB language wars going for far too long.

Along the way, he's showing off some IDE features, which presumably will be in place for both C# and VB (since the teams are now unified under a single banner), what he's calling "highlights": they'll do the moral equivalent of brace matching/highlighting, for both method names (usage as well as declaration/definition) and blocks of code. There's also "pretty listing", where the IDE will format code appropriately, particularly for the anonymous methods syntax. Nice, but not something I'm personally going to get incredibly excited about--to me, IDE features like this aren't as important as language features, but I realize I'm in something of the minority there, and that's OK. :-)

He demonstrates VB calling PLINQ (Parallel LINQ), pointing out some of the inherent benefits (and drawbacks) to parallelism. This isn't really a VB "feature" per se.

Now he gets into some more interesting stuff: he begins by saying, "Now let's talk about the Dynamic Language Runtime (DLR)." He shows some VB code hosting the IronPython runtime, simple boilerplate to get the IronPython bits up and running inside this CLR process. (See the DLR Hosting Spec for details; it's pretty straightforward stuff: call IronPython.Hosting.Python.CreateRuntime, then call GetEngine("python") and SetSearchPaths() to tell IPy where to find the Python libs and code.) Where he's going with this is to demonstrate using VB's late-binding capabilities to get hold of a Python file ("random.py", using the DLR UseFile() call), and he dynamically calls the "shuffle" function from that Python file against the array of Ints he set up earlier.
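
(Rendered in C# rather than VB, and with a hypothetical library path standing in for whatever Dustin used, the shape of that hosting code looks roughly like this:)

    using System;
    using System.Collections.Generic;
    using IronPython.Hosting;              // Python.CreateRuntime
    using Microsoft.Scripting.Hosting;     // ScriptRuntime, ScriptEngine, ScriptScope

    class DlrHostDemo
    {
        static void Main()
        {
            // Boilerplate: get the IronPython bits up inside this process
            ScriptRuntime runtime = Python.CreateRuntime();
            ScriptEngine engine = runtime.GetEngine("python");

            // Tell IPy where to find the Python libs (path is hypothetical)
            engine.SetSearchPaths(new[] { @"C:\IronPython\Lib" });

            // Load the Python file and fish the function out of its scope;
            // this assumes random.py defines a shuffle() that mutates its argument
            ScriptScope scope = runtime.UseFile("random.py");
            object shuffle = scope.GetVariable("shuffle");

            // Late-bound invocation, roughly what VB's late binding does under the hood
            var ints = new List<object> { 1, 2, 3, 4, 5 };
            engine.Operations.Invoke(shuffle, ints);

            foreach (var i in ints)
                Console.Write("{0} ", i);
        }
    }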

(We get into a discussion as to why the IDE can't give Intellisense on the methods he's calling in the Python code. I won't go into the details, but essentially, no, VS isn't going to be able to do that, at least not for this scenario, any time soon. Maybe if the Python code was used directly from within VS, but not in this hosted sense--that would be a bit much for the IDE to analyze and understand.)

Next he points out some of the "ceremony" remaining in Visual Basic, essentially showing how VB's type inferencing is getting better, such as with array literals, including a background compilation warning where the VB compiler finds that it can't find a common type in the array literal declaration and assumes it to be an array of Object (which is a nice "catch" when the wrong type shows up in the array by accident or typo). He shows off multidimensional array literal and jagged array literal syntax (which requires the internal array literals in the jagged array to be wrapped up in parentheses, a la "{({1,2,3}), ({1, 2, 3, 4, 5})}", which I find a touch awkward and counterintuitive, quite frankly), while he's at it.

(We get into a discussion of finer-granularity color syntax highlighting options, such as colorizing different keywords differently, as well as colorizing different identifiers based on their type. Now that's an interesting idea.)

By the way, one thing that I've always found interesting about VB is its "With" keyword, a la "New Student With {.Id=101, .Name="bart", .Score=53, .Gender="male"}".

He then shows how VB 10 has auto-implemented properties: "Property Gender As String" does exactly what .NET programmers have had to do by hand for so long: create a field, generate simple Get and Set blocks and so on. Another nice feature of this: the autogenerated properties can have defaults, as in, "Public Property Age As Integer = 1". That's kinda nice, and something that VB should have had years ago. :-)

And wahoo! THE UNDERSCORE IS (almost) HISTORY! "Implicit line completion" is a feature of VB 10. This has always plagued me like... well... the plague... when writing VB code. It's not gone completely (there are a few cases where ambiguity would reign without it), but it appears to be gone for 95% of the cases. Because this is such a radical change, they've even gone out and created a website to help the underscores that no longer find themselves necessary: www.unemployedunderscores.com.

He goes into a bit about co- and contravariance in generic types, which VB now supports more readily. (His example is about trying to pass a List(Of Student) into a method taking a List(Of Person); for the record, that's covariance: using a more-derived type argument where a less-derived one is expected.) The solution is to change the method to take an IEnumerable(Of Person) instead, which works because IEnumerable(Of T) is covariant in T. Not a great solution, but not a bad one, either.
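
(A quick C# 4 sketch of the same situation, with hypothetical Person/Student types, shows why the IEnumerable version works:)

    using System;
    using System.Collections.Generic;

    class Person { public string Name; }
    class Student : Person { }

    class VarianceDemo
    {
        // List<T> is invariant, so a parameter typed List<Person> would
        // reject a List<Student>. IEnumerable<T> is covariant (declared
        // IEnumerable<out T> as of .NET 4), so a sequence of Students is
        // usable wherever a sequence of Persons is expected.
        static void PrintAll(IEnumerable<Person> people)
        {
            foreach (Person p in people)
                Console.WriteLine(p.Name);
        }

        static void Main()
        {
            var students = new List<Student>
            {
                new Student { Name = "Alice" },
                new Student { Name = "Bob" }
            };
            PrintAll(students);   // fine: covariance at work
        }
    }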


.NET | C# | Conferences | Languages | Review | Visual Basic | Windows

Tuesday, November 25, 2008 12:23:48 AM (Pacific Standard Time, UTC-08:00)
Comments [3]  | 
 Monday, November 03, 2008
More PDC 2008 bits exploration: VisualStudio_2010

Having created a Windows7 VMWare image (which I then later cloned and installed the Windows7 SDK into, successfully, wahoo!), I turned to the Visual Studio 2010 bits they provided on the hard drive. Not surprisingly, though a bit frustratingly, they didn't give us an install image that I could put into a VMWare image of my own creation, but instead gave us a VPC with everything pre-installed in it.

I know that Microsoft prefers to promote its own products, and that it's probably a bit much to ask them to provide both a VMWare image and a VirtualPC image for these kinds of pre-alpha releases, but it's a bit of a pain considering that Virtual PC no longer runs on the Mac, as far as I'm aware. Please, Microsoft: a lot of .NET devs are carrying around MacBookPro machines these days, and if you're really focused on getting bits into the hands of developers, it would be quite the bold move to provide a VMWare image right next to the VPC image. Particularly since over half the drive was unused.

So... I don't want to have to carry around a PC (though I do at the moment) just to run VirtualPC just to be able to explore VS 2010, but fortunately VMWare provides a Converter application that can take a VPC image and flip it over to a VMWare image. Sounds like a plan. I fire up the Converter, point it at the VPC, and after the world's... slowest... wizard... takes... my... settings... and... begins... I discover that it will take upwards of 3 hours to convert. Dear God.

I decided to go to bed at that point. :-)

When I woke up, the image had been converted successfully, but I wasn't quite finished yet. First of all, fire it up to make sure it runs, which it does without a problem, but at 640x480 in black-and-white mode (no, seriously, it's not much more than that). Install the VMWare Tools, reboot, and...

... the mouse cursor disappears. WTF?!?

Turns out this has been a nagging problem with several versions of VMWare over the years, and I vaguely remember running into the problem the last time I tried to create a Windows Server 2003/2008 image, too. Ugh. Hunting around the Web doesn't reveal an easy solution, but a couple of things do show up a few times: disconnect the CD-ROM, change the mouse pointer acceleration, delete the VMWare Mouse driver and let Windows rediscover the standard PS/2 mouse driver, or change the display hardware acceleration.

Not being really interested in debugging the problem (I know, my chance at making everybody's life better is now forever lost), I decided to take a bit of a shotgun approach to the problem. I explicitly deleted the VMWare Mouse driver, fiddled with the display settings (including resizing it to a more respectable 1400x1050), turned display hardware acceleration down, couldn't find mouse hardware acceleration settings, allowed it to reboot, and...

... yay. I have a mouse pointer again.

Now I have a VS2010 image on my Drive-o'-Virtual-Machines, and with it I plan on exploring the VS2010/C# 4.0/C++ 10/VB 10 bits some more. I fire up Visual Studio 2010, intending to poke around C# 4.0's new "dynamic" keyword and see if and how it builds on top of the DLR (as a few people have suggested in comments in prior posts). VS comes up pretty quickly (not bad for a pre-alpha), the new interface seems snappy, and I create the ubiquitous "ConsoleApplicationX" C# app.

Wait a minute...

Something niggled at the back of my head, and I went back to File | New Project, and ... something's missing.

There's no "Visual F#" tab. There's an item in the "Project types:" box on the left for Visual Basic, Visual C#, Visual C++, WiX, Modeling Projects, Database Projects, Other Project Types, and Test Projects, but no Visual F#. (And no, it doesn't show up under "Other Project Types" either, I checked.) Considering that my understanding was that F# was going to ship with VS 2010, I'm a little puzzled as to its absence. Hopefully this is just a temporary oversight.

In the meantime, I'm off to play with "dynamic" a bit more and see what comes out of it. But guys, please, let's see some F# love out of the box? Surely, if you can ship WiX with it, shipping F# can't be hard?
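
(For the curious, a minimal sketch of the sort of thing I'll be poking at, assuming the bits on this image match the announced C# 4.0 syntax:)

    using System;

    class DynamicDemo
    {
        static void Main()
        {
            // With 'dynamic', member lookup is deferred until runtime
            dynamic thing = "hello, world";
            Console.WriteLine(thing.Length);   // resolved at runtime: 12

            // Same call site, different runtime type
            thing = new[] { 1, 2, 3 };
            Console.WriteLine(thing.Length);   // 3

            // The flip side: a typo like thing.Lenght compiles fine
            // and only blows up at runtime with a binder exception
        }
    }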


.NET | C++ | Conferences | F# | Languages | Review | Visual Basic | VMWare | Windows | XML Services

Monday, November 03, 2008 5:19:06 PM (Pacific Standard Time, UTC-08:00)
Comments [4]  | 
 Saturday, November 01, 2008
Windows 7 + VMWare 6/VMWare Fusion 2

So the first thing I do when I get back from PDC? After taking my youngest trick-or-treating at the Redmond Town Center, and settling down into the weekend, I pull out the PDC hard drive and have a look around.

Obviously, I'm going to eventually spend a lot of time in the "Developer" subdirectory--lots of yummy PDC goodness in there, like the "Oslo_Dublin_WF_WCF_4" subdirectory in which we'll find a Virtual PC image of the latest CSD bits pre-installed, or the Visual_Studio_2010 subdirectory (another VirtualPC image), but before I start trying to convert those over to VMWare images (so I can run them on my Mac), I figured I'd take a wild shot at playing with Windows 7.

That, of course, means installing it into a VMWare image. So here goes.

First step, create the VMWare virtual machine. Because this is clearly not going to be a stock install, I choose the custom option, and set the operating system to be "Windows Server 2008 (experimental)". Not because I think there's anything really different about that option (except the default options that follow), but because it feels like the right homage to the pre-alpha nature of Windows 7. I set RAM to 512MB, chose to give it a 24GB IDE disk (not SCSI, as the default suggested--Windows sometimes has a tentative relationship with SCSI drives, and this way it's just one less thing to worry about), chose a single network adapter set to NAT, pointed the CD to the smaller of the two ISO images on the drive (which I believe to be the non-checked build version), and fired 'er up, not expecting much.

Kudos to the Windows 7 team.

The CD ISO boots, and I get the install screen, and bloody damn fast, at that. I choose the usual options, choose to do a Custom install (since I'm not really doing an Upgrade), and off it starts to churn. As I write this, it's 74% through the "Expanding files" step of the install, but for the record, Vista never got this far installing into VMWare with its first build. As a matter of fact, if I remember correctly, Vista (then Longhorn) didn't even boot to the first installation screen, and then when it finally did it took about a half-hour or so.

I'll post this now, and update it as I find more information as I go, but if you were curious about installing Windows 7 into VMWare, so far the prognosis looks good. Assuming this all goes well, the next step will be to install the Windows 7 SDK and see what I can build with it. After that, probably either VS 2008 or VS 2010, depending on what ISOs they've given me. (I think VS 2010 is just a VHD, so it'll probably have to be 2008.) But before I do any of that, I'll make a backup, just so that I can avoid having to start over from scratch in the event that there's some kind of dependency between the two that I haven't discovered so far.

Update: Well, it got through "Expanding files", and going into "Starting Windows...", and now "Setup is starting services".... So far this really looks good.

Update: Uh, oh, possible snag: "Setup is checking video performance".... Nope! Apparently it's OK with whatever crappy video perf numbers VMWare is going to put up. (No, I didn't enable the experimental DirectX support for VMWare--I've had zero luck with that so far, in any VMWare image.)

Update: Woo-hoo! I'm sitting at the "Windows 7 Ultimate" screen, choosing a username and computername for the VM. This was so frickin flawless, I'm waiting for the shoe to drop. Choosing password, time zone, networking setting (Public), and now we're at the final lap....

Update: Un-FRICKIN-believable. Flawless. Absolutely flawless. I'm in the "System and Security" Control Panel applet, and of course the first thing I select is "User Account Control settings", because I want to see what they did here, and it's brilliant--they set up a 4-point slider to control how much you want UAC to bug you when you or another program changes Windows settings. I select the level that says, "Only notify me when programs try to make changes to my computer", which has as a note to it, "Don't notify me when I make changes to Windows settings. Note: You will still be notified if a program tries to make changes to your computer, including Windows settings", which seems like the right level to work from.

But that's beyond the point right now--the point is, folks, Windows 7 installs into a VMWare image flawlessly, which means it's trivial to start playing with this now. Granted, it still kinda looks like Vista at the moment, which may turn some folks off who didn't like its look and feel, but remember that Longhorn went through a few iterations at the UI level before it shipped as Vista, too, and that this is a pre-alpha release of Win7, so....

I tip my hat to the Windows 7 team, at least so far. This is a great start.

Update: Even better--VMWare Tools (the additions to the guest OS that enable better video, sound, etc) installs and works flawlessly, too. I am impressed. Really, really impressed.


.NET | C++ | Conferences | F# | Java/J2EE | Review | Visual Basic | Windows

Saturday, November 01, 2008 6:09:48 PM (Pacific Daylight Time, UTC-07:00)
Comments [1]  | 
 Monday, September 15, 2008
Apparently I'm #25 on the Top 100 Blogs for Development Managers

The full list is here. It's a prestigious group--and I'm totally floored to find myself listed next to some pretty big names.

In homage to Ms. Sally Field, of so many years ago... "You like me, you really like me". Having somebody come up to me at a conference and tell me how much they like my blog is second on my list of "fun things to happen to me at a conference", right behind having somebody come up to me at a conference and tell me how much they like my blog, except for that one entry, where I said something totally ridiculous (and here's why)....

What I find most fascinating about the list is the means by which it was constructed--the various calculations behind page rank, Technorati rating, and so on. Very cool stuff.

Perhaps it's trite to say it, but it's still true: readers are what make writing blogs worthwhile. Thanks to all of you.


.NET | C++ | Conferences | Development Processes | F# | Flash | Java/J2EE | Languages | LLVM | Mac OS | Parrot | Reading | Review | Ruby | Security | Solaris | Visual Basic | VMWare | Windows | XML Services

Monday, September 15, 2008 4:29:19 AM (Pacific Daylight Time, UTC-07:00)
Comments [4]  | 
 Friday, August 01, 2008
From the "Team-Building Exercise" Department

This crossed my Inbox, and I have to say, I'm stunned at this incredible display of teamwork. Frankly... well, see for yourself.


.NET | Development Processes | Review | Windows

Friday, August 01, 2008 12:23:11 AM (Pacific Daylight Time, UTC-07:00)
Comments [1]  | 
 Sunday, June 01, 2008
Best Java Resources: A Call

I've been asked to put together a list of the "best" Java resources that every up-and-coming Java developer should have, and I'd like this list to be as comprehensive as possible and, more importantly, reflect more than just my own opinion. So, either through comments or through email, let me know what you think the best Java resources are in the following categories:

  • Websites and developer Web portals
  • Weblogs/RSS feeds. (Not all have to be hand-authored blogs--if you find an RSS feed for news on java.net projects, for example, that would count as well.)
  • Java packages and/or libraries. (Either those within Java Standard Edition--a la Reflection or the Scripting API--or from Enterprise Edition--a la JMS--or even third-party packages, a la Spring. For a taste of what I mean, see the quick sketch just after this list.)
  • Conferences, even including those that I don't speak at. ;-)
  • Books.
  • Tools. (IDEs, build tools, static analysis tools, either commercial or open source.)
  • Future trends you think bear watching.
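
To give a flavor of the sort of JSE package I have in mind, here's a minimal sketch of the Scripting API (JSR 223, in the JDK as of Java 6)--just enough to show its shape, nothing more:

    import javax.script.ScriptEngine;
    import javax.script.ScriptEngineManager;
    import javax.script.ScriptException;

    public class ScriptingDemo {
        public static void main(String[] args) throws ScriptException {
            // Look up the JavaScript engine bundled with the JDK (Rhino, in this era)
            ScriptEngineManager manager = new ScriptEngineManager();
            ScriptEngine engine = manager.getEngineByName("JavaScript");
            // Share a Java-side variable with the script, then evaluate it
            engine.put("who", "Java developers");
            Object result = engine.eval("'Hello, ' + who + '!'");
            System.out.println(result); // prints: Hello, Java developers!
        }
    }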

There is, of course, no prize to be won here, and I'd ask the vendors (commercial or open source) who watch my blog to please avoid outright advertisements in comments (though you are free to rattle off the various advantages of your product in an email to me), in order to avoid turning this weblog into a gigantic row of billboards along the freeway. I am interested in people's opinions, however, and more importantly, why you think X should be on that list, or even why Y shouldn't. Keep it civil, though, please--I'll delete any comments that get too vindictive or offensive. (That doesn't mean that you have to agree with me--just avoid calling anybody names. Basic 'Netiquette.)

Oh, and if you want to be mentioned in the article (which will be published on an international developer site), please indicate how you'd like to be credited. Or not. Whatever you prefer.


Java/J2EE | Languages | Mac OS | Reading | Review | XML Services

Sunday, June 01, 2008 9:18:03 PM (Pacific Daylight Time, UTC-07:00)
Comments [9]  | 
 Friday, May 16, 2008
Blogs I'm currently reading

Recently, a former student asked me,

I was in a .NET web services training class that you gave probably 4 or so years ago on-site at a [company name] office in [city], north of Atlanta. At that time I asked you for a list of the technical blogs that you read, and I am curious which blogs you are reading now. I am now with a small company where I have to be a jack of all trades; in the last year I have worked on C++ and Perl backend-type projects and web frontend projects with Java, C#, and RoR, so I find your perspective interesting, since you also work with various technologies and aren't a zealot for a specific one.

Anyway, please either respond by email or in your blog, because I think that others may be interested in the list also.

As one might expect, my blog list is a bit eclectic, but I suppose that's part of the charm of somebody looking to study Java, .NET, C++, Smalltalk, Ruby, Parrot, LLVM, and other languages and environments. So, without further ado, I've pasted in the contents of my OPML file below, for easy cut-and-paste import.

Having said that, though, I would strongly suggest not just blindly importing the whole set of feeds into your nearest RSS reader, but taking a moment to go visit each one before you add it. It takes longer, granted, but the time spent is a worthy investment--you don't want to have to declare "blog bankruptcy".

Editor's note: We pause here as readers look at each other and go... "WTF?!?"

"Blog bankruptcy" is a condition similar to "email bankruptcy", when otherwise perfectly high-functioning people give up on trying to catch up to the flood of messages in their email client's Inbox and delete the whole mess (usually with some kind of public apology explaining why and asking those who've emailed them in the past to resend something if it was really important), effectively trying to "start over" with their email in much the same way that Chapter Seven or Chapter Eleven allows companies to "start over" with their creditors, or declaring bankruptcy allows private citizens to do the same with theirs. "Blog bankruptcy" is a similar kind of condition: your RSS reader becomes so full of stuff that you can't keep up, and you can't even remember which blogs were the interesting ones, so you nuke the whole thing and get away from the blog-reading thing for a while.

This happened to me, in fact: a few years ago, when I became the editor-in-chief of TheServerSide.NET, I asked a few folks for their OPML lists, so that I could quickly and easily build a list of blogs that would "tune me in" to the software industry around me, and many of them quite agreeably complied. I took my RSS reader (Newsgator, at the time) and dutifully imported all of them, and ended up with a collection that easily ran into the hundreds of feeds. And, over time, I found myself reading fewer and fewer blogs, mostly because the whole set was so... intimidating. I would pick at the list of blogs and their entries the same way I picked at vegetables on my plate as a child--half-heartedly, with no real enthusiasm, as if this were something my parents were forcing me to do. That just ruined the experience of blog-reading for me, and eventually (after I left TSS.NET for other pastures), I nuked the whole thing--even going so far as to uninstall my copy of Newsgator--and gave up.

Naturally, I missed it, and slowly began to rebuild the list, this time taking each feed one at a time, carefully weighing what value the feed offered me and selecting only those I thought had a high signal-to-noise ratio. (This is partly why I don't include much "personal" info in this blog--I found myself routinely stripping away those blogs that had more personal content and less technical content, and I figured if I didn't want to read it, others probably felt the same way.) Over the last year or two, I've rebuilt the list to the point where I probably need to prune a bit and close a few of them back down, but for now, I'm happy with the list I've got.

And speaking of which....

    <?xml version="1.0"?>
    <opml version="1.0">
     <head>
      <title>OPML exported from Outlook</title>
      <dateCreated>Thu, 15 May 2008 20:55:19 -0700</dateCreated>
      <dateModified>Thu, 15 May 2008 20:55:19 -0700</dateModified>
     </head>
     <body>
      <outline text="If broken it is, fix it you should" type="rss" xmlUrl="http://blogs.msdn.com/tess/rss.xml"/>
      <outline text="Artima Developer Buzz" type="rss" xmlUrl="http://www.artima.com/news/feeds/news.rss"/>
      <outline text="Artima Weblogs" type="rss" xmlUrl="http://www.artima.com/weblogs/feeds/weblogs.rss"/>
      <outline text="Artima Chapters Library" type="rss" xmlUrl="http://www.artima.com/chapters/feeds/chapters.rss"/>
      <outline text="Neal Gafter's blog" type="rss" xmlUrl="http://gafter.blogspot.com/feeds/posts/default"/>
      <outline text="Room 101" type="rss" xmlUrl="http://gbracha.blogspot.com/feeds/posts/default"/>
      <outline text="Kelly O'Hair's Blog" type="rss" xmlUrl="http://weblogs.java.net/blog/kellyohair/index.rdf"/>
      <outline text="John Rose @ Sun" type="rss" xmlUrl="http://blogs.sun.com/jrose/feed/entries/atom"/>
      <outline text="The Daily WTF" type="rss" xmlUrl="http://syndication.thedailywtf.com/TheDailyWtf"/>
      <outline text="Brad Wilson" type="rss" xmlUrl="http://feeds.feedburner.com/BradWilson"/>
      <outline text="Mike Stall's .NET Debugging Blog" type="rss" xmlUrl="http://blogs.msdn.com/jmstall/rss.xml"/>
      <outline text="Stevey's Blog Rants" type="rss" xmlUrl="http://steve-yegge.blogspot.com/atom.xml"/>
      <outline text="Brendan's Roadmap Updates" type="rss" xmlUrl="http://weblogs.mozillazine.org/roadmap/index.rdf"/>
      <outline text="pl patterns" type="rss" xmlUrl="http://plpatterns.blogspot.com/feeds/posts/default"/>
      <outline text="Joel Pobar's weblog" type="rss" xmlUrl="http://feeds.feedburner.com/callvirt"/>
      <outline text="Let's Kill Dave!" type="rss" xmlUrl="http://letskilldave.com/rss.aspx"/>
      <outline text="Why does everything suck?" type="rss" xmlUrl="http://whydoeseverythingsuck.com/feeds/posts/default"/>
      <outline text="cdiggins.com" type="rss" xmlUrl="http://cdiggins.com/feed"/>
      <outline text="LukeH's WebLog" type="rss" xmlUrl="http://blogs.msdn.com/lukeh/rss.xml"/>
      <outline text="Jomo Fisher -- Sharp Things" type="rss" xmlUrl="http://blogs.msdn.com/jomo_fisher/rss.xml"/>
      <outline text="Chance Coble" type="rss" xmlUrl="http://leibnizdream.wordpress.com/feed/"/>
      <outline text="Don Syme's WebLog on F# and Other Research Projects" type="rss" xmlUrl="http://blogs.msdn.com/dsyme/rss.xml"/>
      <outline text="David Broman's CLR Profiling API Blog" type="rss" xmlUrl="http://blogs.msdn.com/davbr/rss.xml"/>
      <outline text="JScript Blog" type="rss" xmlUrl="http://blogs.msdn.com/jscript/rss.xml"/>
      <outline text="Yet Another Language Geek" type="rss" xmlUrl="http://blogs.msdn.com/wesdyer/rss.xml"/>
      <outline text=".NET Languages Weblog" type="rss" xmlUrl="http://www.dotnetlanguages.net/DNL/Rss.aspx"/>
      <outline text="DevHawk" type="rss" xmlUrl="http://feeds.feedburner.com/Devhawk"/>
      <outline text="The Cobra Programming Language" type="rss" xmlUrl="http://cobralang.blogspot.com/feeds/posts/default"/>
      <outline text="Code Miscellany" type="rss" xmlUrl="http://codemiscellany.blogspot.com/feeds/posts/default"/>
      <outline text="Fred, Let it go!" type="rss" xmlUrl="http://freddy33.blogspot.com/feeds/posts/default"/>
      <outline text="Codedependent" type="rss" xmlUrl="http://graphics-geek.blogspot.com/feeds/posts/default"/>
      <outline text="Presentation Zen" type="rss" xmlUrl="http://www.presentationzen.com/presentationzen/index.rdf"/>
      <outline text="The Extreme Presentation(tm) Method" type="rss" xmlUrl="http://extremepresentation.typepad.com/blog/index.rdf"/>
      <outline text="ZapThink" type="rss" xmlUrl="http://feeds.feedburner.com/zapthink"/>
      <outline text="Chris Smith's completely unique view" type="rss" xmlUrl="http://feeds.feedburner.com/ChrisSmithsCompletelyUniqueView"/>
      <outline text="Code Commit" type="rss" xmlUrl="http://feeds.codecommit.com/codecommit"/>
      <outline text="Comments on Ola Bini: Programming Language Synchronicity: A New Hope: Polyglotism" type="rss" xmlUrl="http://ola-bini.blogspot.com/feeds/5778383724683099288/comments/default"/>
     </body>
    </opml>
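
If you'd rather skim that list programmatically before subscribing to anything, a quick-and-dirty sketch like the following (the file name is hypothetical) will print each feed's title and URL, using nothing beyond the JDK:

    import java.io.File;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;

    public class OpmlSkim {
        public static void main(String[] args) throws Exception {
            // Parse the OPML export and walk its <outline> elements
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new File("feeds.opml")); // hypothetical file name
            NodeList outlines = doc.getElementsByTagName("outline");
            for (int i = 0; i < outlines.getLength(); i++) {
                Element outline = (Element) outlines.item(i);
                // Print title and feed URL so each can be vetted in a browser first
                System.out.println(outline.getAttribute("text") + " -> "
                        + outline.getAttribute("xmlUrl"));
            }
        }
    }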

Happy reading.....


.NET | C++ | Conferences | F# | Java/J2EE | Languages | LLVM | Mac OS | Parrot | Reading | Review | Ruby | Security | Solaris | Visual Basic | Windows | XML Services

Friday, May 16, 2008 12:08:07 AM (Pacific Daylight Time, UTC-07:00)
Comments [0]  | 
 Friday, March 28, 2008
Rules for Review

Apparently, I'm drawing enough of an audience through this blog that various folks have started to send me press releases and notifications and requests for... well, I dunno exactly, but I'm assuming some blogging love of some kind. I'm always a little leery about that particular subject, because it always has this dangerous potential to turn the blog into a less-credible marketing device, but people at conferences have suggested that they really are interested in what I think about various products and tools, so perhaps it's time to amend my stance on this.

With that in mind, if you are a vendor and have a product that you'd like me to take a look at and (possibly) review here, here are the basic rules:

  1. No guarantees. Sending me something will in no way guarantee that I will review your product, for several reasons, two of which are (a) I get really busy sometimes, and (b) I may have no interest whatsoever in your product, and I refuse to feign interest. (Readers can usually tell when the reviewer isn't all that excited about the subject, I've found.)
  2. If you're not going to send me a "real" version (meaning not the time-locked or feature-crippled demo), don't bother. I have no idea when I will get around to a review, and I have no desire to review something that isn't "the real deal". I will in turn promise that the licensed version you send me (if necessary) will not be used for any purpose other than my own research and exploration (signing a contract if necessary to give you that "fresh-from-the-lawyer's-office" warm and fuzzy feeling).
  3. I say what I think, pro and con. I will not edit my review to suit your marketing purpose, and if you ask me to do so I will simply note in the review that you have asked me to do so. I retain full editorial control over what I say about your product.
  4. Having established #3, I will try to be as fair as I can about your product, and point out things that I liked and things that I didn't. (Of course, if I hated it from top to bottom, I may end up with the only positive thing being "It didn't set the atmosphere on fire when I started the app", but hey, that's something positive, right?)
  5. Also in the spirit of #3, if you send me mail answering questions or complaints in my review, I will of course amend the review with your comments. You are always welcome to post comments to the blog entry itself, too. If you insult my grandmother, though, I will have to get all DELETE-key on you.

The reason I'm posting this here is twofold: one, so my faithful audience of four blog readers will know the rules under which I'm looking at these products and (hopefully) realize that I have no financial stake in any of them, and two, so the various vendor folks can read this and know the rules up front before even asking.

I know it sounds a little cheeky to lay this out. The image I get in my head is that of the kid at Christmas declaring to his grandparents as they walk through the door, presents in hand, "Make sure it's not a scratchy sweater, I hate scratchy sweaters. And G.I. Joe was only popular when my Dad was a kid. And if you give me another lunchbox I will scream until you buy me something cool, like a new GameBoy." Ugh. But I value the trust that people seem to have in me, and so I risk the perception of cheekiness for this tiny window in time in order to (hopefully) establish full disclosure for the reviews that come to pass (which, by the way, will always have the category "review" applied to them, so you know which is an official review and which is just me exploring, like the recent LLVM and Parrot posts).

We now return you to the regularly-scheduled blog.


.NET | C++ | Flash | Java/J2EE | Languages | LLVM | Mac OS | Parrot | Reading | Review | Ruby | Security | Solaris | VMWare | Windows | XML Services

Friday, March 28, 2008 4:18:12 AM (Pacific Daylight Time, UTC-07:00)
Comments [0]  |