 Thursday, February 21, 2013
Java was not the first

Charlie Kindel blogs that he thinks James Gosling (and the rest of Sun) screwed us all with Java and its "Write Once, Run Anywhere" mantra. It's catchy, but it's wrong.

Like a lot of Charlie's blogs, he nails parts of this one squarely on the head:

WORA was, is, and always will be, a fallacy. ... It is the “Write once…” part that’s the most dangerous. We all wish the world was rainbows and unicorns, and “Write once…” implies that there is a world where you can actually write an app once and it will run on all devices. But this is precisely the fantasy that the platform vendors will never allow to become reality. ...
And, given his current focus on building a mobile startup, he of course takes this lesson directly into the "native mobile app vs HTML 5 app" discussion that I've been a part of on way too many speaker panels and conference BOFs and keynotes and such:
HTML5 is awesome in many ways. If applied judiciously, it can be a great technology and tool. As a tool, it can absolutely be used to reduce the amount of platform specific code you have to write. But it is not a starting place. Starting with HTML5 is the most customer unfriendly thing a developer can do. ... Like many ‘solutions’ in our industry the “Hey, write it once in HTML5 and it will run anywhere” story didn’t actually start with the end-user customer. It started with idealistic thoughts about technology. It was then turned into snake oil for developers. Not only is the “build a mobile app that hosts a web view that contains HTML5” approach bass-ackwards, it is a recipe for execution disaster. Yes, there are examples of teams that have built great apps using this technique, but if you actually look at what they did, they focused on their experience first and then made the technology work. What happens when the shop starts with “we gotta use HTML5 running in a UIWebView” is initial euphoria over productivity, followed by incredible pain doing the final 20%.
And he's flat-out right about this: HTML 5, as an application development technology, takes you about 60-80% of the way home, depending on what you want your application to do.

In fact, about the only part of Charlie's blog post that I disagree with is the part where he blames Gosling and Java:

I blame James Gosling. He foisted Java on us and as a result Sun coined the term Write Once Run Anywhere. ... Developers really want to believe it is possible to “Write once…”. They also really want to believe that more threads will help. But we all know they just make the problems worse. Just as we’ve all grown to accept that starting with “make it multi-threaded” is evil, we need to accept “Write once…” is evil.
It didn't start with Java--it started well before that, with a set of cross-platform C++ toolkits that made the same kind of promise: write your application in platform-standard C++ to our API, and we'll have the libraries on all the major platforms (back in those days, it was Windows, Mac OS, Solaris OpenView, OSF/Motif, and a few others) and it will just work. Even Microsoft got into this game briefly (I worked at Intuit, and helped a consultant who was struggling to port QuickBooks, I think it was, over to the Mac using Microsoft's short-lived "MFC For Mac OS" release). And even before that, we had the discussions of "Standard C" and the #ifdef tricks we used to play to get one source file to compile on all the different platforms that C runs on.
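
For anyone who missed that era, the trick looked something like this--a minimal sketch in C, with the platform labels chosen purely for illustration (real codebases tested a whole zoo of compiler- and vendor-specific symbols):

    #include <stdio.h>

    /* The classic #ifdef dance: one source file, a different binary per
       platform. These particular macros are common predefined symbols,
       but every compiler and vendor had its own variants. */
    #if defined(_WIN32)
    #  define PLATFORM_NAME "Windows"
    #elif defined(__APPLE__)
    #  define PLATFORM_NAME "Mac OS"
    #elif defined(__sun)
    #  define PLATFORM_NAME "Solaris"
    #elif defined(__unix__)
    #  define PLATFORM_NAME "some other Unix"
    #else
    #  define PLATFORM_NAME "unknown"
    #endif

    int main(void)
    {
        /* Same source everywhere; what it does depends on where it was built. */
        printf("Compiled for: %s\n", PLATFORM_NAME);
        return 0;
    }

Multiply that by every system call, path separator, and byte-order difference, and "one source file" quickly became a thicket of conditionals.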

And that, folks, is the heart of the matter: long before Gosling took his fledgling, failed set-top-box project (then named Oak) and looked around for a space to apply it to next, developers... no, let's get that right, "developers and their managers who hate the idea of violating DRY by having the code in umpteen different codebases" have been looking for ways to have a single source base that runs across all the platforms. We've tried it with portable languages (see C, C++, Java, for starters), portable libraries (in the C++ space see Zinc, zApp, XVT, Tools.h++), portable containers (see EJB, the web browser), portable platforms (see PhoneGap/Cordova, Titanium, etc.), and now portable cross-compilers (see MonoTouch/MonoDroid, for recent examples), and I'm sure there will be other efforts along these lines for years and decades to come. It's a noble goal, but the major players in whatever space we're targeting--whether that be operating systems, browsers, mobile platforms, console game devices, or whatever comes next two decades from now--will not allow their systems to be commoditized that easily. Because at the heart of it, that's exactly what these "cross-platform" tools and languages and libraries are trying to do: reduce the underlying "thing" to a commodity that lacks interest or impact.

Interestingly enough, as a side note, one thing I'm starting to notice is that the more pervasive mobile devices become and the more mobile applications we see reaching those devices, the less "device-standard" those interfaces try to look, even as they try to achieve cross-platform similarities. Consider, for a moment, the Fly Delta app on iPhone: it uses few of the standard iOS UI metaphors beyond the basic ones, largely because Delta has defined its own look-and-feel across all the platforms it supports (iOS and Android, at least so far). Ditto for the CNN and USA Today apps, as well as the ESPN app, and of course just about every game ever written for any of those platforms. So even as Charlie argues:

The problem is each major platform has its own UI model, its own model for how a web view is hosted, its own HTML rendering engine, and its own JavaScript engine. These inter-platform differences mean that not only is the platform-specific code unique, but the interactions between that code and the code running within the web view becomes device specific. And to make matters worse, intra-platform fragmentation, particularly on the platform with the largest number of users, Android, is so bad that this “Write Once…” approach provides no help.
We are starting to see mobile app developers actually striving to define their own UI model entirely, with only a passing nod to the standards of the device on which they're running. Which then makes me wonder if we're going to start to see new portable toolkits that define their own unique UI model on each of these platforms, or that somehow allow developers to define their own--a UI model toolkit, so to speak. Which would be an interesting development, but one that will eventually run into many of the same problems as the others did.


.NET | Android | Azure | C# | C++ | Development Processes | F# | Flash | Industry | iPhone | Java/J2EE | Languages | LLVM | Mac OS | Objective-C | Parrot | Review | Ruby | Windows

Thursday, February 21, 2013 4:08:04 PM (Pacific Standard Time, UTC-08:00)
Comments [6]
Friday, February 22, 2013 6:56:38 AM (Pacific Standard Time, UTC-08:00)
What about the UCSD p-System of the late 1970s? Maybe that is closer to being the mythical first WORA system?
K Montgomery
Sunday, February 24, 2013 1:25:04 AM (Pacific Standard Time, UTC-08:00)
MonoDroid/MonoTouch don't quite fit your narrative, as you need to code up the UI specifically for each platform. I've got personal experience with sharing non-UI code between a Silverlight and MonoDroid app. It delivers nicely.
Monday, February 25, 2013 12:13:52 PM (Pacific Standard Time, UTC-08:00)
I agree with K. Montgomery about UCSD Pascal. You could run such programs on a mainframe, CP/M, MS-DOS, and UNIX back when it was available, and the program was portable in binary form to all those platforms. It was text consoles back then, so it wasn't that difficult to make it run everywhere (except for handling cursor addressing and such, but that is another story).

I created such a language myself back in the late 1980s, using UCSD Pascal as my model. I had to have it run on two UNIX platforms as well as two proprietary platforms at a company that had been created by the merger of three different companies over a period of a few years.

We needed to certify that the programs ran correctly on each platform, and previously that required doing so separately on all four platforms. Once the language was created, though, and we proved that simply by moving the binary from one to the next it ran the same, we only had to certify on one at a time. That reduced the programming team size needed (we could actually now write more programs with the same size team), and it also reduced the people and systems needed for certification. Certification could take 4-6 weeks for each system, so time was important.

I designed it so the binary compiled to a small CISC instruction set, because one platform needed the entire program to fit in less than 8K of memory. It was simple data entry screens with some database storage and access. The interpreter handled all of that, so the programmers only had to worry about data validation and storage of the data contents based on the screen fields. It even allowed for debugging, and the interpreter could be turned into a decompiler to recover the source code of a program.
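
The core of such a system is just a compact dispatch loop over the bytecode. A minimal sketch in C, with the instruction set invented purely for illustration (the real encoding isn't described here):

    #include <stdio.h>
    #include <stdint.h>

    /* A tiny stack-machine interpreter in the spirit described above.
       The opcodes are invented for illustration; the real system used a
       compact CISC-like encoding so whole programs fit in under 8K. */
    enum { OP_PUSH, OP_ADD, OP_PRINT, OP_HALT };

    static void run(const uint8_t *pc)
    {
        int32_t stack[64];
        int sp = 0; /* stack pointer */

        for (;;) {
            switch (*pc++) {
            case OP_PUSH:  stack[sp++] = *pc++;              break; /* push a literal byte */
            case OP_ADD:   sp--; stack[sp - 1] += stack[sp]; break; /* add top two values */
            case OP_PRINT: printf("%d\n", stack[--sp]);      break; /* pop and print */
            case OP_HALT:  return;
            }
        }
    }

    int main(void)
    {
        /* The same byte stream runs unchanged anywhere the interpreter
           itself has been ported: computes 2 + 3 and prints 5. */
        const uint8_t program[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD, OP_PRINT, OP_HALT };
        run(program);
        return 0;
    }

Port the interpreter once per platform, and every compiled program moves between platforms as an unmodified binary--which is exactly the certification win described above.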

Write Once, Run Anywhere is achievable - provided you correctly define the environment it will be running in. I only had to deal with simple text terminals. When you try to deal with GUI environments that differ in so many features, plus the vastly different screen types you may encounter, it can be a lot tougher to define something robust and flexible enough to run correctly everywhere. Going from a small cell phone screen to a 4Kx2K screen (or even larger) often requires a complete rethink of the entire application interface.

Mike Riley
Monday, February 25, 2013 12:37:06 PM (Pacific Standard Time, UTC-08:00)
You missed Ada. Ada well predates Java and is always WORA unless you use "with Unchecked_Conversion". Of course, you have to compile for each target platform (no JIT!), but you only have to write your source once! All the versions I've worked with had limited display capabilities, though it has been quite a few years. BTW, I'm not suggesting that Ada is a good choice for writing rich UI applications.
Phil Lenoir
Tuesday, February 26, 2013 10:03:31 AM (Pacific Standard Time, UTC-08:00)
Sorry. It goes all the way back to COBOL and the original CODASYL effort to come up with a language portable among different platforms. That's late-1950s and early-60s work.

Bringing FORTRAN to ANSI was part of a similar effort.

At the time, these languages had to be loose enough to allow for implementation differences (size of integers, character sets, different floating-point representations, etc.). The same looseness beset the ALGOL 60 specification.

The goal was portability, but that did not go so far as to proclaim write-once, except when COBOL was being hyped overmuch. Also, portability was never meant to mean compile once and run everywhere. Of course, a common intermediate language was the holy grail of the UNCOL project, an unsuccessful late-'50s, early-'60s effort. It is a testament to Moore's law and the convergence of architectures that software VMs are now so effective in that respect.
Wednesday, February 27, 2013 6:25:40 PM (Pacific Standard Time, UTC-08:00)
Developers are lazy, so they want to work less. But almost no one is happy when they see the same thing again and again. So my guess is that Java can save us some time (by reducing the differences), but someday even Java will be replaced by some other language, and the old bytecodes will not help much at that time.
Thai Ha