Commentary Responses: 1/15/2008 Edition

A couple of people have left comments that definitely deserve response, so here we go:

Glenn Vanderberg comments in response to the Larraysaywhut? post:

Interesting post, Ted ... and for the most part I agree with your comments.  But I have to ask about this one:

Actually, there are languages that do it even worse than COBOL. I remember one Pascal variant that required your keywords to be capitalized so that they would stand out. No, no, no, no, no! You don't want your functors to stand out. It's shouting the wrong words: IF! foo THEN! bar ELSE! baz END! END! END! END!

[Oh, now, that's just silly.]

Seriously?  You don't think Larry has a point there?  That's one of the primary things I always hated about Wirth's languages, for exactly the reason cited here.  Most real-world Pascal implementations relaxed that rule to recognize upper- and lowercase keywords, but he didn't learn, making the same horrible mistake in Modula-2 and Oberon.

Capitalized words draw your attention, and make it hard to see the real code in between.

Rather than disagree with him, I agree with Larry: uppercased keywords, in a language, are just SOOOO last-century. But so are line-numbering, declaration-before-use, and hailing recursion as a feature. It just seems silly to put this out there as a point of language design, when I can't imagine anyone, with the possible exception of the old COBOL curmudgeon in the corner ("In MY day, we wrote code without a shift key, and we LIKED it! Uphill, both ways, I tell you!"), thinking that uppercased keywords are a good idea.

As for Mr. Wirth, well, dude had some good ideas, but even Einstein had his wacky moments. Repeat after me, everybody: "Just because some guy is brilliant and turns out to be generally right doesn't mean we take everything he says as gospel". It's true for Einstein, it's true for Wirth, and it's true even for Dave Thomas (whom I am privileged to call friend, love deeply, and occasionally think is off his rocker... but I digress).

Actually, Glenn, I think case-sensitivity as a whole is silly. Let's face it, all ye who think that the C-family of languages has this one right: when's the last time you thought it was perfectly acceptable to write code like "int Int = 30;" ? Frankly, if anybody chose to overload based on case, I'd force them to maintain that same code for the next five years as punishment.

(I thought about ripping their still-beating hearts out of their chests instead, but honestly, having to live with the mess they create seems worse, and more fitting to boot.)

What's ironic, though, is that to be perfectly frank, I do exactly this with my SQL code, and it DOESN'T! SEEM! TO! SHOUT! to me AT! ALL! For some reason, this

SELECT name, age, favorite_language FROM programmers WHERE age > 25 AND favorite_language != 'COBOL';

just seems to flow pretty easily off the tongue. Err... eyeball. Whatever.

Meanwhile, 'Of Fibers and Continuations' drew some ire from Mark Murphy:

Frankly, this desire to accommodate the nifty feature of the moment smacks a great deal of Visual Basic, and while VB certainly has its strengths, coherent language design and consistent linguistic facilities are not among them. It's played havoc with people who tried to maintain code in VB, and it's played hell with the people who try to maintain the VB language. One might try to argue that the Ruby maintainers are just Way Smarter than the Visual Basic maintainers, but I think that sells the VB team pretty short, having met some of them.

Conversely, I think you're selling the Ruby guys a bit short. And this is coming from a guy who's old enough to have written code in Visual Basic for DOS several years into his programming experience.

Wow. Next thing you know, Bruce Tate will be in here, talking about the "chuck the baby out the window" game he wrote for QuickBASIC. (True story.) And, FWIW, I too know the love of BASIC, although in this case I did QuickBasic (DOS) for a while, before it became known as QBasic, and Applesoft BASIC even before that. (Anybody else remember lo-res vs. hi-res graphics debates?) Ah, the sweet, sweet memories of PEEK and POKE and.... *shudder* Never mind.

[insert obligatory "get off my lawn!" reference here]

Get off my lawn, ya hooligan!

The death-knell for VB is widely considered to be the move from VB6 to VB.NET. In doing that, they changed significant quantities of the VB syntax. That's why there was so much hue and cry to keep maintaining VB6, because folk didn't want to take the time to port their zillions of lines of VB6 code.

Actually, much of that hue and cry was from a corner of the VB world that really just didn't want to learn something new. It turned out that most of the VB hue'ers and cry'ers were those who'd been hue'ing and cry'ing with every successive release of VB, and in the words of one very popular VB speaker and programmer, "If they don't want to come along, well, frankly, I think we're better off without 'em anyway."

Truthfully? VB seems to have moved along just fine since. And, interestingly enough, since its transition to the CLR, VB has had a much stronger "core vision" for the language than it did for many years. I don't know if this is because the CLR helps them keep that vision clear, or if trying to keep up with C# is good intra-corporate competition, or what, but I haven't heard anywhere near the kind of grousing about new linguistic changes in the two successive revisions of VB since VB.NET's release (VS 2005 and VS 2008) that I did prior to its move to the CLR.

The changes Ruby made in 1.9 had very little syntax impact (colons in case statements, and not much else, IIRC). Fibers, in particular, are just objects, supplied as part of the stock Ruby class library. I'm not aware of new syntax required to use fibers.

Grousing about a language adding to its standard class library seems a little weak. When Microsoft added new APIs to .NET when they released 3.0, I suspect you didn't bat an eye.

Oh, heavens, no. Quite the contrary--when .NET 3.0 shipped with WCF, Workflow and WPF in it, I was actually a little concerned, because the CLR's basic footprint is just ballooning like mad. How long before the CLR installation rivals that of the OS itself? Besides, this monolithic approach has its limitations, as the Java folks have discovered to their regret, and it's not too long before people start noticing the five or six different versions of the CLR all living on their machine simultaneously....

Let's be honest here--an API release is different from changing the execution model of the virtual machine, and that's partly what fibers do.

But of even more interest to this particular discussion, I wasn't really grousing about the syntax, or the addition of fibers, so much as I was pointing out that this is something that other platforms (notably Win32) have had before, and that it ended up being a "ho-hum, another subject I can safely ignore" topic for the world's programmers. That, and the interesting--and heretofore unrecognized, to me--link between fibers and coroutines and continuations.
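To make that fiber/coroutine link concrete, here's a minimal sketch using the stock Ruby 1.9 Fiber API (Fiber.new, Fiber.yield, resume--all plain objects and method calls, no new syntax): the block suspends itself and is re-entered by the caller, with its local state preserved across each handoff.

```ruby
# A fiber used as a coroutine/generator: it suspends itself with
# Fiber.yield and is re-entered by resume, keeping local state (n)
# alive between calls -- the classic coroutine handoff.
counter = Fiber.new do
  n = 0
  loop do
    Fiber.yield n  # suspend, handing n back to whoever called resume
    n += 1         # execution picks up here on the next resume
  end
end

values = 3.times.map { counter.resume }
puts values.inspect  # prints: [0, 1, 2]
```

The same object, resumed later from anywhere in the program, picks up exactly where it left off--which is the cooperative-scheduling behavior Win32 fibers exposed, and a restricted form of the continuation idea.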

In particular, grousing about how Language X adds something to its class library that duplicates a portion of something "baked into" Language Y seems really weak. Does this mean that once something is invented in a language, no other language is supposed to implement it in any way, shape, or form?

Heavens, no! Just like if you want to use objects, you're more than welcome to do so in C, or Pascal, or even assembly!

What if fibers weren't part of the Ruby 1.9 distribution, but rather were done by a third party and released as a popular gem? (I'm not sure if this would have been possible, as there may have been changes needed to the MRI to support fibers, but let's pretend for a moment.) Does this mean that nobody writing class libraries for any programming language is allowed to implement features that are "baked into" some other programming language?

Um... no: witness LINQ, stealing... *ahem* leveraging... a great deal of the concepts that are behind functional languages. Or the Win16 API (or the classic Mac OS API, or the Xt API, or ...), using object concepts from within the C language.

If so, C# should have never been created.

Huh?

Look, I have nothing against Ruby swiping ideas from another language. But let's not pretend that Ruby was built, from the ground up, as a functional language. The concepts that Ruby is putting forth in its 1.9 release are "bolted on", and will show the same leaks in the abstraction model as any other linguistic approach "bolted on" after the fact. This is a large part of the beef with generics in Java, with objects in C, with O/R-Ms, and so on. Languages choose, very precisely, which abstractions they want to make as first-class citizens, and usually when they try to add more of those concepts in after the fact, backwards compatibility and the choices they made earlier trip them up and create a suboptimal scenario. (Witness the various attempts to twist Java into a metaprogramming language: generics, AOP, and so on.)

Besides, if you're going to explore those features, why not go straight to the source? Since when has it become fashionable to discourage people from learning a new concept in the very environment where it is highlighted? Ruby is a phenomenal dynamic language (as are Lisp and Smalltalk, among others), and anybody who wants to grok dynamic languages should learn Ruby (and/or Lisp, and/or Smalltalk). Ditto for functional languages (Haskell and ML/OCaml being the two primary candidates in that camp).

Don't get me wrong -- I agree that there are way better languages for FP than Ruby, even with fibers. That's part of the reason why so many people are tracking JRuby and IronRuby, as having Ruby implementations on common VMs/LRs gives developers greater flexibility for mixing-and-matching languages to fit specific needs (JRuby/Scala/Groovy/Java on JVM, IronEverything/LotsOf# on CLR/DLR).

Which is the same thing I just said. Cool. :-)

I just think you could have spun this more positively and made the same points. The Rails team is having their hats handed to them over the past week or two; casting fibers as a "whither Ruby?" piece just feels like piling on.

Well, frankly, I don't track what's going on in the Rails space at all [and, to be honest, if one more programmer out there invents one more web framework that rhymes with "ails" in any way, so help me God I will SCREAM], so I can honestly say that I wasn't trying to "pile on". What I do find frustrating, however, is the general belief that Ruby is somehow God's Original Scripting Language, and that the Ruby community is constantly innovating while the rest of the programming world is staring on in drooling slack-jawed envy. Most of what Ruby does today is Old Hat to Smalltalkers, and I fully expect that PowerShellers will come along and find most of what the Ruby guys are doing to be interesting experiments in just how powerful the PSH environment really is.

Of deeper concern is the blending of "shell language" and "programming language" that Ruby seems to encourage; the only other language that I think really crosses that line is Perl, and honestly, that's not necessarily good company to be in on this score. When a language tries to hold fealty to too many masters, it loses coherence. Time will tell how well Ruby can chart that narrow course; to my mind, this is what ultimately doomed (and continues to dog) Perl 6.