 Sunday, May 18, 2008
Guide you, the Force should

Steve Yegge posted the transcript from a talk on dynamic languages that he gave at Stanford.

Cedric Beust posted a response to Steve's talk, espousing statically-typed languages.

Numerous comments and flamewars erupted, not to mention a Star Wars analogy (which always makes things more fun).

This is my feeble attempt to play galactic peacemaker. Or at least to offer galactic color commentary and play-by-play. I have no illusions about its efficacy--it will probably only fan the flames, for that's how these things work. Still, I feel a certain perverse pleasure in pretending, so....

Enjoy the carnage that results.


First of all, let me be very honest: I like Steve's talk. I think he does a pretty good job of representing the negatives and positives of dynamic languages, though there are obviously places where I'm going to disagree:

  • "Because we all know that C++ has some very serious problems, that organizations, you know, put hundreds of staff years into fixing. Portability across compiler upgrades, across platforms, I mean the list goes on and on and on. C++ is like an evolutionary sort of dead-end. But, you know, it's fast, right?" Funny, I doubt Bjarne Stroustrup or Herb Sutter would agree with the "evolutionary dead-end" statement, but they're biased, so let's put that aside for a moment. Have organizations put hundreds of staff years into fixing the problems of C++? Possibly--it would be good to know what Steve considers the "very serious problems" of C++, because that list he does give (compiler/platform/language upgrades and portability across platforms) seems problematic regardless of the langauge or platform you choose--Lord knows we saw that with Java, and Lord knows we see it with ECMAScript in the browser, too. The larger question should be, can, and does, the language evolve? Clearly, based on the work in the Boost libraries and the C++0X standards work, the answer is yes, every bit as much as Java or C#/.NET is, and arguably much more so than what we're seeing in some of the dynamic languages. C++ is getting a standardized memory model, which will make a portable threading package possible, as well as lambda expressions, which is a far cry from the language that I grew up with. That seems evolutionary to me. What's more, Bjarne has said, point-blank, that he prefers taking a slow approach to adopting new features or ideas, so that it can be "done right", and I think that's every bit a fair position to take, regardless of whether I agree with it or not. (I'd probably wish for a faster adoption curve, but that's me.) Oh, and if you're thinking that C++'s problems stem from its memory management approach, you've never written C++ with a garbage collector library.
  • "And so you ask them, why not use, like, D? Or Objective-C. And they say, "well, what if there's a garbage collection pause?" " Ah, yes, the "we fear garbage collection" argument. I would hope that Java and C#/.NET have put that particular debate to rest by now, but in the event that said dragon's not yet slain, let's do so now: GC does soak up some cycles, but for the most part, for most applications, the cost is lost in the noise of everything else. As with all things performance related, however, profile.
  • "And so, you know, their whole argument is based on these fallacious, you know, sort of almost pseudo-religious... and often it's the case that they're actually based on things that used to be true, but they're not really true anymore, and we're gonna get to some of the interesting ones here." Steve, almost all of these discussions are pseudo-religious in nature. For some reason, programmers like to identify themselves in terms of the language they use, and that just sets up the religious nature of the debate from the get-go.
  • "You know how there's Moore's Law, and there are all these conjectures in our industry that involve, you know, how things work. And one of them is that languages get replaced every ten years. ... Because that's what was happening up until like 1995. But the barriers to adoption are really high." I can't tell from the transcript of Steve's talk if this is his opinion, or that this is a conjecture/belief of the industry; in either case, I thoroughly disagree with this sentiment--the barriers to entry to create your own language have never been lower than today, and various elements of research work and available projects just keep making it easier and easier to do, particularly if you target one of the available execution engines. Now, granted, if you want your language to look different from the other languages out there, or if you want to do some seriously cool stuff, yes, there's a fair amount of work you still have to do... but that's always going to be the case. As we find ways to make it easier to build what's "cool" today, the definition of what's "cool" rises in result. (Nowhere is this more clear than in the game industry, for example.) Moore's Law begets Ballmer's Corollary: User expectations double every eighteen months, requiring us to use up all that power trying to meet those expectations with fancier ways of doing things.
  • It's a section that's too long to quote directly here, but Steve goes on to talk about how programmers aren't using these alternative languages, and that if you even suggest trying to use D or Scala or [fill in the blank], you're going to get "lynched for trying to use a language that the other engineers don't know. ... And [my intern] is, like, "well I understand the argument" and I'm like "No, no, no! You've never been in a company where there's an engineer with a Computer Science degree and ten years of experience, an architect, who's in your face screaming at you, with spittle flying on you, because you suggested using, you know... D. Or Haskell. Or Lisp, or Erlang, or take your pick." " Steve, with all due respect to your experience, I know plenty of engineers and companies who are using some of these "alternative" languages, and they're having some good success. But if you work in a company where an architect is "in your face screaming at you, with spittle flying on you", frankly, it's time to move on, because that company is never going to try anything new. Period. I don't care if we're talking about languages, Spring, agile approaches, or trying a new place for lunch today. Companies get into a rut just as much as individuals do, and if the company doesn't challenge that rut every so often, they're going to get bypassed. Period, end of story. That doesn't mean trying every new thing under the sun on your next "mission-critical" project, but for God's sake, Mr. CTO, do you really want to wait until your competition has bypassed you before adopting something new? There's a lot of project work that goes on that has room for some experimentation and experience-gathering before utilizing something on the next big project.
  • "I made the famously, horribly, career-shatteringly bad mistake of trying to use Ruby at Google, for this project. ... And I became, very quickly, I mean almost overnight, the Most Hated Person At Google. And, uh, and I'd have arguments with people about it, and they'd be like Nooooooo, WHAT IF... And ultimately, you know, ultimately they actually convinced me that they were right, in the sense that there actually were a few things. There were some taxes that I was imposing on the systems people, where they were gonna have to have some maintenance issues that they wouldn't have [otherwise had]. Those reasons I thought were good ones." Recognizing the cost of deploying a new platform into the IT sphere is a huge deal that programmers frequently try to ignore in their zeal to adopt something new, and as a result, IT departments frequently swing the other way, resisting all change until it becomes inevitable. This is where running on top of one of the existing execution environments (the JVM or the CLR in particular) becomes so powerful--the actual deployment platform doesn't change, and the IT guys remain more or less disconnected from the whole scenario. This is the principal advantage JRuby and IronPython and Jython and IronRuby will have over their native-interpreted counterparts. As for maintenance issues, aside from the "somebody's gonna have to learn this language" tax (which is a real tax but far less costly, I believe, than most people think it to be), I'm not sure what issues would crop up--the IT guys don't usually change your Java or C# or Visual Basic code in production, do they?
  • Steve then gets into the discussion about tools around dynamic languages, and I heartily agree with him: the tool vendors have a much deeper toolchest than we (non-tool vendor programmers) give them credit for, and they're proving it left and right as IDEs get better and better for dynamic languages like Groovy and Ruby. In some areas, though, I think we as developers lean too heavily on our tools, expecting them to be able to do the thinking for us, and getting all grumpy when they can't or don't. Granted, I don't want to give up my IntelliJ any time soon, but let's think about this for a second: if I can't program Java today without IntelliJ, then is that my fault, the language's fault, the industry's fault, or some combination thereof? Or is it maybe just a fact of progress? (Would anybody consider building assembly language in Notepad today? Does that make assembly language wrong? Or just the wrong tool for the job?)
  • Steve's point about how Java IDEs miss the Reflective case is a good one, and one that every Java programmer should consider. How much of your Java (or C# or C++) code actually isn't capturable directly in the IDE?
  • Steve then goes into the ubiquitous Java-generics rant, and I'll have to admit, he's got some good points here--why didn't we (Java, though this applies equally to C#) just let the runtime throw the exception when the cast fails, and otherwise just let things go? My guess is that there's probably some good rationale that presumes you already accept the necessity of more verbose syntax in exchange for knowing where the cast might potentially fail, even though there's plenty of other places in the language where exceptions can be thrown without that verbose syntax warning you of that fact, array indexers being a big one. (There's a small sketch of this cast-at-runtime point just after this list.) One thing I will point out, however, in what I believe is a refutation of what Steve's suggesting in this discussion: from my research in the area and my memory about the subject from way back when, the javac compiler really doesn't do much in the way of optimizations, and hasn't tried since about JDK 1.1, for the precise reason he points out: the JITter's going to optimize all this stuff anyway, so it's easier to just relax and let the JITter do the heavy lifting.
  • The discussion about optimizations is interesting, and while I think he glosses over some issues and hyper-focuses on others, two points stand out, in my mind: performance hits often come from places you don't expect, and micro-benchmarks generally don't prove much of anything. Sometimes that hit will come from the language, and sometimes that hit will come from something else entirely. Profile first. Don't let your intuition get in the way, because your intuition sucks. Mine does, too, by the way--there are just too many moving parts to be able to keep it all straight in your head.
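
To make that cast-at-runtime point concrete, here's a minimal sketch on the JVM--written in Scala, which erases generic type arguments exactly the way javac does, and with names that are entirely made up for illustration. The bad cast itself succeeds, because after erasure the runtime only sees a List; the ClassCastException shows up later, at the point of use, where the compiler quietly inserted the cast:

```scala
// Minimal sketch of erasure and compiler-inserted casts on the JVM.
object ErasureSketch {
  def main(args: Array[String]): Unit = {
    val strings: List[String] = List("one", "two")

    // Deliberately defeat the type checker: after erasure, List[Int] and
    // List[String] are the same runtime class, so this cast is a no-op.
    val smuggled: List[String] = List(1, 2).asInstanceOf[List[String]]

    println(strings.head.length)  // fine: prints 3
    println(smuggled.head.length) // ClassCastException here, at the use site, not at the cast
  }
}
```

Either way, the failure is a runtime failure; the more verbose generics syntax mostly buys you knowing where to look when it happens.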

Steve then launches into a series of Q&A with the audience, but we'll let the light dim on that stage, and turn our attention over to Cedric's response.

  • "... the overall idea is that dynamically typed languages are on the rise and statically typed languages are on their way out." Actually, the transcript I read seemed to imply that Steve thought that dynamically typed languages are cool but that nobody will use them for a variety of reasons, some of which he agreed with. I thoroughly disagree with Steve's conclusion there, by the way, but so be it ...
  • "I'm happy to be the Luke Skywalker to his Darth Vader. ... Evil shall not prevail." Yes, let's not let this debate fall into the pseudo-religious category, shall we? Fully religious debates have such a better track record of success, so let's just make it "good vs evil", in order to ensure emotions get all neatly wrapped throughout. Just remember, Cedric, even Satan can quote the Bible... and it was Jesus telling us that, so if you disagree with anything I say below you must be some kind of Al-Qaeda terrorist. Or something.
    • [Editor's note: Oh, shit, he did NOT just call Cedric a terrorist and a Satanist and invoke the name of Christ in all this. Time to roll out the disclaimer... "Ladies and gentlemen, the views and opinions expressed in this blog entry...."]
    • [Author's note: For the humor-challenged in the crowd, no I do not think Cedric is a terrorist. I like Cedric, and hopefully he still likes me, too. Of course, I have also been accused of being the Antichrist, so what that says about Cedric I'm not sure.]
  • Cedric on Scala:
    • "Have you taken a look at implicits? Seriously? Just when I thought we were not just done realizing that global variables are bad, but we have actually come up with better ways to leverage the concept with DI frameworks such as Guice, Scala knocks the wind out of us with implicits and all our hardly earned knowledge about side effects is going down the drain again." Umm.... Cedric? One reaction comes to mind here, and it's best expressed as.... WTF?!? Implicits are not global variables or DI, they're more a way of doing conversions, a la autoboxing but more flexible. I agree that casual use of implicits can get you in trouble, but I'd have thought Scala's "there are no operators just methods with funny names" would be the more disconcerting of the two.
    • "As for pattern matching, it makes me feel as if all the careful data abstraction that I have built inside my objects in order to isolate them from the unforgiving world are, again, thrown out of the window because I am now forced to write deconstructors to expose all this state just so my classes can be put in a statement that doesn't even have the courtesy to dress up as something that doesn't smell like a switch/case..." I suppose if you looked at pattern-matching and saw nothing more than a switch/case, then I'd agree with you, but it turns out that pattern-matching is a lot more powerful than just being a switch/case. I think what Cedric's opposing is the fact that pattern-matching can actually bind to variables expressed in the individual match clauses, which might look like deconstructors exposing state... but that's not the way they get used, from what I've seen thus far. But, hey, just because the language offers it, people will use it wrongly, right? So God forbid a language's library should allow me to, say, execute private methods or access private fields....
  • Cedric on the difficulty to impose a non-mainstream language in the industry: "Let me turn the table on you and imagine that one of your coworkers comes to you and tells you that he really wants to implement his part of the project in this awesome language called Draco. How would you react? Well, you're a pragmatic kind of guy and even though the idea seems wacky, I'm sure you would start by doing some homework (which would show you that Draco was an awesome language used back in the days on the Amiga). Reading up on Draco, you realize that it's indeed a very cool language that has some features that are a good match for the problem at hand. But even as you realize this, you already know what you need to tell that guy, right? Probably something like "You're out of your mind, go back to Eclipse and get cranking". And suddenly, you've become *that* guy. Just because you showed some common sense." If, I suppose, we equate "common sense" with "thinking the way Cedric does", sure, that makes sense. But you know, if it turned out that I was writing something that targeted the Amiga, and Draco did, in fact, give us a huge boost on the competition, and the drawbacks of using Draco seemed to pale next to the advantages of using it, then... Well, gawrsh, boss, it jus' might make sense to use 'dis har Draco thang, even tho it ain't Java. This is called risk mitigation, and frankly, it's something too few companies go through because they've "standardized" on a language and API set across the company that's hardly applicable to the problem at hand. Don't get me wrong--you don't want the opposite extreme, which is total anarchy in the operations center as people use any and all languages/platforms available to them on a willy-nilly basis, but the funny thing is, this is a continuum, not a binary switch. This is where languages-on-execution-engines (like the JVM or CLR) create such a great win-win situation: IT can just think in terms of supporting the JVM or CLR, and developers can then think in whatever language they want, so long as it compiles/runs on those platforms.
  • Cedric on building tools for dynamic languages: "I still strongly disagree with that. It is different *and* harder (and in some cases, impossible). Your point regarding the fact that static refactoring doesn't cover 100% of the cases is well taken, but it's 1) darn close to 100% and 2) getting closer to it much faster than any dynamic tool ever could. By the way, Java refactorings correcting comments, XML and property files are getting pretty common these days, but good luck trying to perform a reliable method renaming in 100 Ruby files." I'm not going to weigh in here, since I don't write tools for either dynamic or static languages, but watching what the IntelliJ IDEA guys are doing with Groovy, and what the NetBeans guys are doing with Ruby, I'm more inclined to believe in what Steve thinks than what Cedric does. As for the "reliable method renaming in 100 Ruby files", I don't know this for a fact, but I'll be willing to bet that we're a lot closer to that than Cedric thinks we are. (I'd love to hear comments from somebody neck-deep in the Ruby crowd who's done this and their experience doing so.)
  • Cedric on generics: "I no longer bother trying to understand why complex Generic examples are so... well, darn complex. Yes, it's pretty darn hard to follow sometimes, but here are a few points for you to ponder:
    • 90% of the Java programmers (including myself) only ever use Generics for Collections.
    • These same programmers never go as far as nesting two Generic declarations.
    • For API developers and users alike, Generics are a huge progress.
    • Scala still requires you to understand covariance and contravariance (but with different rules. People seem to say that Scala's rules are simpler, I'm not so sure, but not interested in finding out for the aforementioned reasons)."
    Honestly, Cedric, the fact that 90% of the Java programmers are only using generics for collections doesn't sway me in the slightest. 90% of the world's population doesn't use Calculus, either, but that doesn't mean that it's not useful, or that we shouldn't be trying to improve our understanding of it and how to do useful things with it. After looking at what the C++ community has done with templates (the Boost libraries) and what .NET is doing with its generic system (LINQ and F# to cite two examples), I think Java missed a huge opportunity with generics. Type erasure may have made sense in a world where Java was the only practical language on top of the JVM, but in a world that's coming to accept Groovy and JRuby and Scala as potential equals on the JVM, it makes no sense whatsoever. Meanwhile, when thinking about Scala, let's take careful note that a Scala programmer can go a long way with the language before having to think about covariance, contravariance, upper and lower type bounds, simpler or not. (For what it's worth, I agree with you, I'm not sure if they're simpler, either.)
  • Cedric on dynamic language performance: "What will keep preventing dynamically typed languages from displacing statically typed ones in large scale software is not performance, it's the simple fact that it's impossible to make sense of a giant ball of typeless source files, which causes automatic refactorings to be unreliable, hence hardly applicable, which in turn makes developers scared of refactoring. And it's all downhill from there. Hello bit rot." There's a certain circular logic here--if we presume that IDEs can't make sense of "typeless source files" (I wasn't aware that any source file was statically typed, honestly--this must be something Google teaches), then it follows that refactoring will be impossible or at least unreliable, and thus a giant ball of them will be unmanageable. I disagree with Cedric's premise--that IDEs can't make sense of dynamic language code--so therefore I disagree with the entire logical chain as a result. What I don't disagree with is the implicit presumption that the larger the dynamic language source base, the harder it is to keep straight in your head. In fact, I'll even amend that statement further: the larger the source base (dynamic or otherwise), the harder it is to keep straight in your head. Abstractions are key to the long-term success of any project, so the language I work with had best be able to help me create those abstractions, or I'm in trouble once I cross a certain threshold. That's true regardless of the language: C++, Java, C#, Ruby, or whatever. That's one of the reasons I'm spending time trying to get my head around Lisp and Scheme, because those languages were all about building abstractions upon abstractions upon abstractions, but in libraries, rather than in the language itself, so they could be swapped out and replaced with something else when the abstractions failed or needed evolution.
  • Cedric on program unmaintainability: "I hate giving anecdotal evidence to support my points, but that won't stop me from telling a short story that happened to me just two weeks ago: I found myself in this very predicament when trying to improve a Ruby program that 1) I just wrote a few days before and 2) is 200 lines long. I was staring at an object, trying to remember what it does, failing, searching manually in emacs where it was declared, found it as a "Hash", and then realized I still had no idea what the darn thing is. You see my point..." Ain't nothing wrong with anecdotal evidence, Cedric. We all have it, and if we all examine it en masse, some interesting patterns can emerge. Funny thing is, I've had exactly the same experience with C++ code, Java code, and C# code. What does that tell you? It tells me that I probably should have cooked up some better abstractions for those particular snippets, and that's what I ended up doing. As a matter of fact, I just helped a buddy of mine untangle some Ruby code to turn it into C#, and despite the fact that he's never written (or read) a Ruby program in his life, we managed to flip it over to C# in a couple of hours, including the execution of Ruby code blocks (I love anonymous methods) stored in a string-keyed hash within an array. And this was Ruby code that neither of us had ever seen before, much less written a few days prior.
  • Cedric (and Steve) on error messages: "[Steve said] And the weird thing is, I realized early in my career that I would actually rather have a runtime error than a compile error. [Cedric responded] You probably already know this, but you drew the wrong conclusion. You didn't want a runtime error, you wanted a clear error. One that doesn't lie to you, like CFront (and a lot of C++ compilers even today, I hear) used to spit in our faces. And once I have a clear error message, I much prefer to have it as early as possible, thank you very much." Honestly, I agree with Cedric here: I would much prefer errors before execution, as early as possible, so that there's less chance of my users finding the errors I haven't found yet. And I agree that some of the error messages we sometimes get are pretty incomprehensible, particularly from the C++ compiler during template expansion. But how is that different from the ubiquitous Java "ClassCastException: Cannot cast Person to Person" that arises from time to time? Once you know what the message is telling you, it's easy to know how to fix it, but getting to the point of knowing what the error message is telling you requires a good working understanding of Java ClassLoaders. Do we really expect that any tool--static or dynamic, compiler or runtime--is going to be able to produce error messages that somehow preclude the need to have the necessary background to understand them? All errors are relative to the context from which they are born. If you lack that context, the error message, no matter how well-written or phrased, is useless.
  • Cedric on "The dynamic nuclear winter": "[Steve said] And everybody else went and chased static. And they've been doing it like crazy. And they've, in my opinion, reached the theoretical bounds of what they can deliver, and it has FAILED. [Cedric responded] Quite honestly, words fail me here." Wow. Just... wow. I can't agree with Steve at all, that static(ically typed languages) has FAILED, or that they've reached the theoretical bounds of what they can deliver, but neither can I say with complete confidence that statically-typed languages are The Way Forward, either. I think, for the time, chasing statically-typed languages was the right thing to do, because for a long time we were in a position where programmer time was cheaper than computer time; now, I believe that this particular metric has flipped, and that it's time we started thinking about what the costs of programmer time really are. (Frankly, I'd love to see a double-blind study on this, but I've no idea how one would carry that out in a scientific manner.)

So.... what's left?

Oh, right: if Steve/Vader is Cedric/Luke's father, then who is Cedric/Luke's sister, and why is she wearing a copper-wire bikini while strangling the Haskell/ML crowd/Jabba the Hutt?

Maybe this whole Star Wars analogy thing was a bad idea.


Look, at the end of the day, the whole static-vs-dynamic thing is a red herring. It doesn't matter. The crucial question is whether or not the language being used does two things, and how well it does them:

  1. Provide the ability to express the concept in your head, and
  2. Provide the ability to evolve as the concepts in your head evolve

There are certain things that are just impossible to do in C++, for example. I cannot represent the C++ AST inside the program itself. (Before you jump all over me, C++ers of the world, take careful note: I'm not saying that C++ cannot represent an AST, but an AST of itself, at the time it is executing.) This is something dynamic languages--most notably Lisp, but also other languages, including Ruby--do pretty well, because they're building the AST at runtime anyway, in order to execute the code in the first place. Could C++ do this? Perhaps, but the larger question is, would any self-respecting C++ programmer want to? Look at your average Ruby program--80% to 90% (the number may vary, but most of the Rubyists I talk to agree it's somewhere in this range) of the program isn't really using the meta-object capabilities of the language, and is just treating it as a "simpler/easier/scarier/unchecked" object language. Most of the weird-*ss Rubyisms don't show up in your average Ruby program, but are buried away in some library someplace, and away from the view of the average Ruby programmer.

Keep the simple things simple, and make the hard things possible. That should be the overriding goal of any language, library, or platform.

Erik Meijer put this idea forward first, and I like it a lot: Why can't we operate on a basic principle of "static when we can (or should), dynamic otherwise"? (Reverse that if it makes you feel better: "dynamic when we can, static otherwise", because the difference is really only one of gradation. It's also an interesting point for discussion, just how much of each is necessary/desirable.) Doing this means we get the best of both worlds, and we can stop this Galactic Civil War before anybody's planet gets blown up.

'Cuz that would suck.

