It’s that time of the year again, when I make predictions for the upcoming year.
As has become my tradition now for nigh-on a decade, I will first go back over last year’s
predictions, to see how well I called it (and keep me honest), then wax prophetic on what I
think the new year has to offer us.
More than a decade ago, I published
Effective Enterprise Java,
and in the opening chapter I talked about the Ten Fallacies of Enterprise Computing, essentially an
extension/add-on to Peter Deutsch’s Fallacies of Distributed Computing. But in the ten-plus years
since, I’ve had time to think about it, and now I’m convinced that Enterprise Fallacies are a different
list. Now, with the rise of cloud computing stepping in to complement, supplement or replace entirely
the on-premises enterprise data center, it seemed reasonable to get back to it.
At first, it was called “DLL Hell”. Then “JAR Hell”. “Assembly Hell”. Now, it’s fallen
under the label of “NPM-Gate”, but it always comes back to the same basic thing:
software developers need to think about their software build and runtime dependencies
as a form of Supply Chain Management. Failure to do so—on both the part of the
supplier and the consumer—leads to the breakdown of civilization and everything
we hold dear.
tl;dr Peter Verhas asks a seemingly innocent question during a technical interview, and gets an answer that is not wrong,
but doesn’t really fit. He then claims that “Sometimes I also meet candidates who not only simply do not know the answer
but give the wrong answer. To know something wrong is worse than not knowing. Out of these very few even insists and tries
to explain how I should have interpreted their answer. That is already a personality problem and definitely a no-go in an
interview.” I claim that Peter is not only wrong, but that he is doing his company a complete disservice with this
kind of interview; I personally would never want to work for a company that takes this attitude.
tl;dr I’ve been asked a number of times over the years how, exactly, I approach learning new stuff, whether that
be a new programming language, a new platform, whatever. This is obviously a highly personal (meaning specific to the
individual offering the answer) subject, so my approach may or may not work for you; regardless, I’d suggest to anyone
that they give it a shot and if it works, coolness.
tl;dr A recent post on medium.com addresses the topic of technical debt; I had an intuitive
disagreement with the thrust of the post, and wrote this as a way of clarifying my own thoughts
on the matter. It raises some interesting questions about what technical debt actually
is—and if we can’t define it, how can we possibly understand how to avoid it or remove it,
as opposed to our current practice of using it as a “get-out-of-this-codebase-by-blowing-it-all-up” card.
The confirmations are starting to flow in, and I’m getting quite the nice lineup of shows to speak at for the new calendar year; the complete list is a bit too long to include here (and it’ll change as the year progresses, to be sure), but so far I’ve got a nice mix of different kinds of shows: Voxxed Days: These are smaller, newer events in cities that are new to the Devoxx conference circuit.
It’s really starting to appear like the “technical monoculture” that so pervaded the 90’s and 00’s
is finally starting to die the long-deserved ugly death it was supposed to. And I couldn’t be happier.
As has become my tradition now for nigh-on a decade, I will first go back over last year’s
predictions, to see how well I called it (and keep me honest), then wax prophetic on what I
think the new year has to offer us.
This post is inspired by this post, describing why Perl "didn't win". I think that's being generous: Perl screwed up in a big way, and they did so in classic open-source (and closed-source) fashion, by focusing too much on the tech, and not enough on the value. Before we begin, though, let me make my biases clear: I am not a Perl fan. The irony of being the #2 hit for "Perl lover" on Google (today, #4, I just checked) is so loud as to be deafening.
Too often, geeks are called upon to leverage their technical expertise (which, to most non-technical peoples' perspective, is an all-encompassing uni-field, meaning if you are a DBA, you can fix a printer, and if you are an IT admin, you know how to create a cool HTML game) on behalf of their friends and family, often without much in the way of gratitude. But sometimes, you just gotta get your inner charitable self on, and what's a geek to do then?
Apparently I have become something of a resource on programming interviews: I've had three people tell me they read the last two blog posts, one because his company is hiring and he wants his people to be doing interviews right, and two more expressing shock that I still get interviewed--which I don't really think is all that fair, more on that in a moment--and relief that it's not just them getting grilled on areas that they don't believe to be relevant to the job--and more on that in a moment, too.
It's official: I hate them. Don't get me wrong, I understand their use and the reasons why potential employers give them out. There are enough programmers in the world who aren't really skilled enough for the job (whatever that job may be) that it becomes necessary to offer some kind of litmus test that a potential job-seeker must pass. I get that. And it's not like all the programming tests in the world are created equal: some are pretty useful ways to demonstrate basic programming facilities, a la the FizzBuzz problem.
With my most recent blog post, some of you were a little less than impressed with the idea of using types. One reader, in particular, suggested that: Your encapsulating type aliases don't... encapsulate :| Actually, it kinda does. But not in the way you described. using X = qualified.type; merely introduces an alias, and will consequently (a) not prevent assignment of a FirstName to a LastName (b) not even be detectable as such from CLI metadata (i.e.
Recently, having been teaching C# for a bit at Bellevue College, I’ve been thinking more and more about the way in which we approach building object-oriented programs, and particularly the debates around types and type systems. I think, not surprisingly, that the way in which the vast majority of the O-O developers in the world approach types and when/how they use them is flat wrong—both in terms of the times when they create classes when they shouldn’t (or shouldn’t have to, anyway, though obviously this is partly a measure of their language), and the times when they should create classes and don’t.
As is pretty typical for that site, Lambda the Ultimate has a great discussion on some insights that the creators of Mozart and Oz have come to, regarding the design of programming languages; I repeat the post here for convenience: Now that we are close to releasing Mozart 2 (a complete redesign of the Mozart system), I have been thinking about how best to summarize the lessons we learned about programming paradigms in CTM.
Charlie Kindel blogs that he thinks James Gosling (and the rest of Sun) screwed us all with Java and it's "Write Once, Run Anywhere" mantra. It's catchy, but it's wrong. Like a lot of Charlie's blogs, he nails parts of this one squarely on the head: WORA was, is, and always will be, a fallacy. ... It is the “Write once…“ part that’s the most dangerous. We all wish the world was rainbows and unicorns, and “Write once…” implies that there is a world where you can actually write an app once and it will run on all devices.
TL;DR Live craftsmanship, don't preach it. The creation of a label serves no purpose other than to disambiguate and distinguish. If we want to hold people accountable to some sort of "professionalism", then we have to define what that means. I found Uncle Bob's treatment of my blog heavy-handed and arrogant. I don't particularly want to debate this anymore; this is my last take on the subject. I will freely admit, I didn't want to do this.
TL;DR: To all those who dissented, you're right, but you're wrong. Craftsmanship is a noble meme, when it's something that somebody holds as a personal goal, but it's often coming across as a way to beat up and denigrate on others who don't choose to invest significant time and energy into programming. The Zen Masters didn't walk around the countryside, proclaiming "I am a Zen Master!" Wow. Apparently I touched a nerve.
I don't know Heather Arthur from Eve. Never met her, never read an article by her, seen a video she's in or shot, or seen her code. Matter of fact, I don't even know that she is a "she"--I'm just guessing from the name. But apparently she got quite an ugly reaction from a few folks when she open-sourced some code: So I went to see what people were saying about this project.
Once again, it's time for my annual prognostication and review of last year's efforts. For those of you who've been long-time readers, you know what this means, but for those two or three of you who haven't seen this before, let's set the rules: if I got a prediction right from last year, you take a drink, and if I didn't, you take a drink. (Best. Drinking game. EVAR!) Let's begin....
There's an interesting legal interpretation coming out of the Electronic Frontier Foundation (EFF) around the Megaupload case, and the EFF has said this: "The government maintains that Mr. Goodwin lost his property rights in his data by storing it on a cloud computing service. Specifically, the government argues that both the contract between Megaupload and Mr. Goodwin (a standard cloud computing contract) and the contract between Megaupload and the server host, Carpathia (also a standard agreement), "likely limit any property interest he may have" in his data.
Two things conspire to bring you this blog post. Of Contracts and Contracts First, a few months ago, I was asked to participate in an architectural review for a project being done for one of the states here in the US. It was a project dealing with some sensitive information (Child Welfare Services), and I was required to sign a document basically promising not to do anything bad with the data.
This CNET report tells us what we’ve probably known for a few years now: in the hacker/securist cyberwar, the hackers are winning. Or at the very least, making it pretty apparent that the cybersecurity companies aren’t making much headway. Notable quotes from the article: Art Coviello, executive chairman of RSA, at least had the presence of mind to be humble, acknowledging in his keynote that current "security models" are inadequate. Yet he couldn't help but lapse into rah-rah boosterism by the end of his speech.
Eric Evans, a number of years ago, wrote a book on “Domain Driven Design”. Around the same time, Martin Fowler coined the “Rich Domain Model” pattern. Ever since then, people have been going bat-shit nutso over building these large domain object models, then twisting and contorting them in all these various ways to make them work across different contexts—across tiers, for example, and into databases, and so on. It created a cottage industry of infrastructure tools, toolkits, libraries and frameworks, all designed somehow to make your objects less twisted and more usable and less tightly-coupled to infrastructure (I’ll pause for a moment to let you think about the absurdity of that—infrastructure designed to reduce coupling to other infrastructure—before we go on), and so on.
As discriminatory as this is going to sound, this one is for the old-timers. If you started programming after the turn of the millennium, I don’t know if you’re going to be able to follow the trend of this post—not out of any serious deficiency on your part, hardly that. But I think this is something only the old-timers are going to identify with. (And thus, do I alienate probably 80% of my readership, but so be it.) Is it me, or is programming just less interesting today than it was two decades ago?
Well, friends, another year has come and gone, and it's time for me to put my crystal ball into place and see what the upcoming year has for us. But, of course, in the long-standing tradition of these predictions, I also need to put my spectacles on (I did turn 40 last year, after all) and have a look at how well I did in this same activity twelve months ago.
Recently I got an email from Bohdan Zograf, who offered: Hi! I'm willing to translate publication located at http://blogs.tedneward.com/2006/06/26/The+Vietnam+Of+Computer+Science.aspx to the Belorussian language (my mother tongue). What I'm asking for is your written permission, so you don't mind after I'll post the translation to my blog. I agreed, and next thing I know, I get the next email that it’s done. If your mother tongue is Belorussian, then I invite you to read the article in its translated form at http://www.moneyaisle.com/worldwide/the-vietnam-of-computer-science-be.
Long-time readers of this blog know what’s coming next: it’s time for Ted to prognosticate on what the coming year of tech will bring us. But I believe strongly in accountability, even in my offered-up-for-free predictions, so one of the traditions of this space is to go back and revisit my predictions from this time last year. So, without further ado, let’s look back at Ted’s 2010 predictions, and see how things played out; 2010 predictions are prefixed with “THEN”, and my thoughts on my predictions are prefixed with “NOW”: For 2010, I predicted....
Hey, anybody who’s got significant VMWare mojo, help out a bro? I’ve got a Win7 VM (one of many) that appears to be exhibiting weird disk behavior—the vmdk, a growable single-file VMDK, is almost precisely twice the used space. It’s a 120GB growable disk, and the Win7 guest reports about 35GB used, but the VMDK takes about 70GB on host disk. CHKDSK inside Windows says everything’s good, and the VMWare “Disk Cleanup” doesn’t change anything, either.
By now, the Twitter messages have spread, and the word is out: at Uberconf this year, I did a session ("Pragmatic Architecture"), which I've done at other venues before, but this time we made it into a 180-minute workshop instead of a 90-minute session, and the workshop included breaking the room up into small (10-ish, which was still a teensy bit too big) groups and giving each one an "architectural kata" to work on.
Code Katas are small, relatively simple exercises designed to give you a problem to try and solve. I like to use them as a way to get my feet wet and help write something more interesting than "Hello World" but less complicated than "The Internet’s Next Killer App". This one is from the UVa online programming contest judge system, which I discovered after picking up the book Programming Challenges, which is highly recommended as a source of code katas, by the way.
Code Katas are small, relatively simple exercises designed to give you a problem to try and solve. I like to use them as a way to get my feet wet and help write something more interesting than "Hello World" but less complicated than "The Internet's Next Killer App". Rick Minerich mentioned this one on his blog already, but here is the original "problem"/challenge as it was presented to me and which I in turn shot to him over a Twitter DM: I have a list, say something like [4, 4, 4, 4, 2, 2, 2, 3, 3, 2, 2, 2, 2, 1, 1, 1, 5, 5], which consists of varying repetitions of integers.
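The full challenge is cut off above, but the classic reading of a list like that one is a run-length encoding kata: collapse each run of consecutive repeats into a (value, count) pair. A minimal sketch in Java (the RunLengths and runLengths names are mine, not from the original challenge):

```java
import java.util.ArrayList;
import java.util.List;

public class RunLengths {
    // Collapse consecutive repeats into {value, count} pairs.
    static List<int[]> runLengths(int[] xs) {
        List<int[]> runs = new ArrayList<>();
        for (int x : xs) {
            int[] last = runs.isEmpty() ? null : runs.get(runs.size() - 1);
            if (last != null && last[0] == x) {
                last[1]++;                      // extend the current run
            } else {
                runs.add(new int[] { x, 1 });   // start a new run
            }
        }
        return runs;
    }

    public static void main(String[] args) {
        int[] data = { 4, 4, 4, 4, 2, 2, 2, 3, 3, 2, 2, 2, 2, 1, 1, 1, 5, 5 };
        for (int[] run : runLengths(data)) {
            System.out.println(run[0] + " appears " + run[1] + " time(s) in a row");
        }
    }
}
```

Whatever direction the original challenge actually took (the pairs themselves, just the counts, or decompression), the single grouping pass above is the core of it.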
A couple of days ago, a buddy of mine, Scott Hanselman, wrote a nice little intro to the "dynamic" type in C# 4.0. In particular, I like (though don't necessarily 100% agree with) his one-sentence summation of dynamic as "There's no way for you or I to know the type of this now, compiler, so let's hope that the runtime figures it out." It's an interesting characterization, but my disagreement with his characterization is not the point here, at least not of this particular blog entry.
Cruising the Web late last night, I ran across "10 things you can do to advance your career as a developer", summarized below:

1. Build a PC
2. Participate in an online forum and help others
3. Man the help desk
4. Perform field service
5. Perform DBA functions
6. Perform all phases of the project lifecycle
7. Recognize and learn the latest technologies
8. Be an independent contractor
9. Lead a project, supervise, or manage
10. Seek additional education

I agreed with some of them, I disagreed with others, and in general felt like they were a little too high-level to be of real use.
Here we go again—another year, another set of predictions revisited and offered up for the next 12 months. And maybe, if I'm feeling really ambitious, I'll take that shot I thought about last year and try predicting for the decade. Without further ado, I'll go back and revisit, unedited, my predictions for 2009 ("THEN"), and pontificate on those subjects for 2010 before adding any new material/topics. Just for convenience, here's a link back to last year's predictions.
Paul asked me to review this, his first book, and my comment to him was that he had a pretty high bar to match; being of the same "series" as Release It!, Mike Nygard's take on building software ready for production (and, in my repeatedly stated opinion, the most important-to-read book of the decade), Debug It! had some pretty impressive shoes to fill. Paul's comment was pretty predictable: "Thanks for keeping the pressure to a minimum." My copy arrived in the mail while I was at the NFJS show in Denver this past weekend, and with a certain amount of dread and excitement, I opened the envelope and sat down to read for a few minutes.
Phil Haack wrote a thoughtful, insightful and absolutely correct response to my earlier blog post. But he's still missing the point. The short version: Phil's right when he says, "Agile is less about managing the complexity of an application itself and more about managing the complexity of building an application." Agile is by far the best approach to take when building complex software. But that's not where I'm going with this.
The above quote was tossed off by Billy Hollis at the patterns&practices Summit this week in Redmond. I passed the quote out to the Twitter masses, along with my +1, and predictably, the comments started coming in shortly thereafter. Rather than limit the thoughts to the 120 or so characters that Twitter limits us to, I thought this subject deserved some greater expansion. But before I do, let me try (badly) to paraphrase the lightning talk that Billy gave here, which sets context for the discussion: Keeping track of all the stuff Microsoft is releasing is hard work: LINQ, EF, Silverlight, ASP.NET MVC, Enterprise Library, Azure, Prism, Sparkle, MEF, WCF, WF, WPF, InfoCard, CardSpace, the list goes on and on, and frankly, nobody (and I mean nobody) can track it all.
Recently I've had the pleasure to make the acquaintance of Walter Bright, one of the heavyweights of compiler construction, and the creator of the D language (among other things), and he's been great in giving me some hand-holding on some compiler-related topics and ideas. Thus, it seems appropriate to point out that Walter's willing to give lots of other people the same kind of attention and focus, in exchange for your presence in gorgeous Astoria, OR.
Well, OK, the title is trolling ever so slightly, but there is an interesting trend at work, and I'm genuinely concerned about its ultimate expression if the trend continues to its logical conclusion. Have a look and tell me if you agree or disagree.
The Simple-Talk newsletter is a monthly e-zine that the folks over at Red Gate Software (makers of some pretty cool toys, including their ANTS Profiler, and recent inheritors of the Reflector utility legacy) produce, usually to good effect. But this month carried with it an interesting editorial piece, which I reproduce in its entirety here: When the market is slack, nothing succeeds better at tightening it up than promoting serial group-panic within the community.
This crossed my Inbox: I read your article entitled: The Polyglot Programmer. How about the thought that rather than becoming a polyglot-software engineer; pick a polyglot-language. For example, C# is borrowing techniques from functional and dynamic languages. Let the compiler designer worry about mixing features and software engineers worry about keep up with the mixture. Is this a good approach? [From Phil, at http://greensoftwareengineer.spaces.live.com/] Phil, it’s an interesting thought you’ve raised—which is the better/easier approach to take, that of incorporating the language features we want into a single language, rather than needing to learn all those different languages (and their own unique syntaxes) in order to take advantage of those features we want?
Last year I had the opportunity to return to the land of my roots, Poland, and speak at Java Developer Days (JDD). Just today, the organizers from JDD sent me a link with a nice little photo montage from the conference. (I did notice a few photos from the after-party were selectively left out of the montage, however, which is probably a good thing because that was the first time I'd ever met a Polish Mad Dog, and boy did they all go down easy...) If you're anywhere in the area around Krakow in March, you definitely should swing by for their follow-up conference, 4Developers--it sounds like it's going to be another fun event, and this time it's going to reach out to more than just the Java folks, but also the .NET crowd (and a few others), as well.
From Scott Hanselman's blog: Are you in King County/Seattle/Redmond/Bellevue Washington and surrounding areas? Are you a huge nerd? Perhaps a geek? No? Maybe a dork, dweeb or wonk. Maybe you're in town for an SDR (Software Design Review) visiting BillG. Quite possibly you're just a normal person. Regardless, why not join us for some Mall Food at the Crossroads Bellevue Mall Food Court on Monday, January 19th around 6:30pm? ... NOTE: RSVP by leaving a comment here and show up on January 19th at 6:30pm!
Chris Sells, an acquaintance (and perhaps friend, when he's not picking on me for my Java leanings) of mine from my DevelopMentor days, has a habit of putting on a "DevCon" whenever a technology seems to have reached a certain maturity level. He did it with XML a few years ago, and ATL before that, both of which were pretty amazing events, filled with the sharpest guys in the subject, gathered into a single room to share ideas and shoot each others' pet theories full of holes.
It's once again that time of year, and in keeping with my tradition, I'll revisit the 2008 predictions to see how close I came before I start waxing prophetic on the coming year. (I'm thinking that maybe the next year--2010's edition--I should actually take a shot at predicting the next decade, but I'm not sure if I'd remember to go back and revisit it in 2020 to see how I did.)
It amazes me how insular and inward-facing the software industry is. And how the "agile" movement is reaping the benefits of a very simple characteristic. For example, consider Jeff Palermo's essay on "The Myth of Self-Organizing Teams". Now, nothing against Jeff, or his post, per se, but it amazes me how our industry believes that they are somehow inventing new concepts, such as, in this case the "self-organizing team". Team dynamics have been a subject of study for decades, and anyone with a background in psychology, business, or sales has probably already been through much of the material on it.
The full list is here. It's a pretty prestigious group--and I'm totally floored that I'm there next to some pretty big names. In homage to Ms. Sally Fields, of so many years ago... "You like me, you really like me". Having somebody come up to me at a conference and tell me how much they like my blog is second on my list of "fun things to happen to me at a conference", right behind having somebody come up to me at a conference and tell me how much they like my blog, except for that one entry, where I said something totally ridiculous (and here's why) ....
As Joel points out, we've made a draft of the SSCLI 2.0 Internals book available for download (via his blog). Rather than tell you all about the book, which Joel summarizes quite well, instead I thought I'd tell you about the process by which the book came to be. Editor's note: if you have no interest in the process by which a book can get done, skip the rest of this blog entry.
For those of you who were at the Cincinnati NFJS show, please continue on to the next blog entry in your reader--you've already heard this. For those of you who weren't, then allow me to make the announcement: Hi. My name's Ted Neward, and I am now a ThoughtWorker. After four months of discussions, interviews, more discussions and more interviews, I can finally say that ThoughtWorks and I have come to a meeting of the minds, and starting 3 September I will be a Principal Consultant at ThoughtWorks.
This comment deserves response: First of all, if you're quoting my post, blocking out my name, and attacking me behind my back by calling me "our intrepid troll", you could have shown the decency of linking back to my original post. Here it is, for those interested in the real discussion: http://www.agilesoftwaredevelopment.com/blog/jurgenappelo/professionalism-knowledge-first Well, frankly, I didn't get your post from your blog, I got it from an email 'zine (as indicated by the comment "This crossed my Inbox..."), and I didn't really think that anybody would have any difficulty tracking down where it came from, at least in terms of the email blast that put it into my Inbox.
Recently this little gem crossed my Inbox.... Professionalism = Knowledge First, Experience Last By J----- A----- Do you trust a doctor with diagnosing your mental problems if the doctor tells you he's got 20 years of experience? Do you still trust that doctor when he picks up his tools, and asks you to prepare for a lobotomy? Would you still be impressed if the doctor had 20 years of experience in carrying out lobotomies?
If you've peeked at my blog site in the last twenty minutes or so, you've probably noticed some churn in the template in the upper-left corner; by now, it's been finalized, and it reads "JOB REFERRALS". WTHeck? Has Ted finally sold out? Sort of, not really. At least, I don't think so. Here's the deal: the company behind those ads, Entice Labs, contacted me to see if I was interested in hosting some job ads on my blog, given that I seem to generate a moderate amount of traffic.
The Pragmatic Programmer says, "Learn a new language every year". This is great advice, not just because it puts new tools into your mental toolbox that you can pull out on various occasions to get a job done, but also because it opens your mind to new ideas and new concepts that will filter their way into your code even without explicit language support. For example, suppose you've looked at (J/Iron)Ruby or Groovy, and come to like the "internal iterator" approach as a way of simplifying moving across a collection of objects in a uniform way; for political and cultural reasons, though, you can't write code in anything but Java.
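For the stuck-with-Java scenario described above, the internal-iterator idea ports over as a callback: the collection owns the loop, and the caller supplies only the per-element behavior. A small sketch in pre-lambda Java, since that was the constraint (the Block and EachList names are mine, purely for illustration):

```java
import java.util.ArrayList;
import java.util.List;

// The per-element callback, playing the role of a Ruby/Groovy block.
interface Block<T> {
    void call(T item);
}

// A collection exposing an internal iterator: callers never write the loop.
class EachList<T> {
    private final List<T> items = new ArrayList<T>();

    void add(T item) {
        items.add(item);
    }

    // The iteration logic lives here, not at the call site.
    void each(Block<T> block) {
        for (T item : items) {
            block.call(item);
        }
    }
}

public class InternalIteratorDemo {
    public static void main(String[] args) {
        EachList<String> names = new EachList<String>();
        names.add("alpha");
        names.add("beta");
        names.each(new Block<String>() {
            public void call(String s) {
                System.out.println(s.toUpperCase());
            }
        });
    }
}
```

The anonymous-class syntax is noisier than a Ruby block, but the shape is the same: the "how to traverse" decision is made once, inside the collection, rather than at every call site.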
As Amanda notes, I’m riding with 46 other folks (and lots of beer) on a bus from Michigan to devLink in Tennessee, as part of sponsoring the show. (I think she got my language preferences just a teensy bit mixed up, though.) Which brings up a related point, actually: Amanda (of “the great F# T-shirt” fame from TechEd this year) and I are teaming up to do F# In A Nutshell for O’Reilly.
Steve Yegge posted the transcript from a talk on dynamic languages that he gave at Stanford. Cedric Beust posted a response to Steve's talk, espousing statically-typed languages. Numerous comments and flamewars erupted, not to mention a Star Wars analogy (which always makes things more fun). This is my feeble attempt to play galactic peacemaker. Or at least galactic color commentary and play-by-play. I have no doubts about its efficacy, and that it will only fan the flames, for that's how these things work.
Recently, a former student asked me, I was in a .NET web services training class that you gave probably 4 or so years ago on-site at a [company name] office in [city], north of Atlanta. At that time I asked you for a list of the technical blogs that you read, and I am curious which blogs you are reading now. I am now with a small company where I have to be a jack of all trades, in the last year I have worked in C++ and Perl backend type projects and web frontend projects with Java, C#, and RoR, so I find your perspective interesting since you also work with various technologies and aren't a zealot for a specific one.
Not too long ago, Don wrote: The three most “personal” choices a developer makes are language, tool, and OS. No. That may be true for somebody who works for a large commercial or open source vendor, whose team is building something that fits into one of those three categories and wants to see that language/tool/OS succeed. That is not where most of us live. If you do, certainly, you are welcome to your opinion, but please accept with good grace that your agenda is not the same as my own.
A couple of folks have taken me to task over some of the things I said... or didn't say... in my last blog piece. So, in no particular order, let's discuss. A few commented on how I left out commentary on language X, Y or Z. That wasn't an accidental slip or surge of forgetfulness, but I didn't want to rattle off a laundry list of every language I've run across or am exploring, since that list would be much, much longer and arguably of little to no additional benefit.
Recently, it has become the fad to weigh in on the Groovy vs JRuby debate, usually along the lines of "Which is X?", where X is one of "better", "faster", "more powerful", "more acceptable", "easier", and so on. (Everybody seems to have their own adjective/adverb to slide in there, so I won't even begin to try to list them all.) Rick Hightower, in a blog post from January, weighs in on this and comes down harshly on both Scala and JRuby.
From Wikipedia (itself a source of conceptual folk etymology, but that's another rant): A commonly held misunderstanding of the origin of a particular word, a false etymology "The popular perversion of the form of words in order to render it apparently significant"; "the process by which a word or phrase, usually one of seemingly opaque formation, is arbitrarily reshaped so as to yield a form which is considered to be more transparent" What do I mean by "technical folk etymology"?
(Editor's note: This post is likely to open a huge can of whoop-*ss on this blog, so unless you want to get caught up in the huge bar fight that's about to break out, you're advised to take your whiskey or beer and head outside for a smoke until the cops come.) As a fellow Scala writer, I've been following Daniel Spiewak's blog with no small amount of interest, as he discovers little tidbits inside the Scala language (like the Option type).
Over on Channel 9, the video interview recorded with me during Lang.NET has gone live. Have a look, tell me what you think.
Kohsuke Kawaguchi has posted a blog entry describing how to watch the assembly code get generated by the JVM during execution, using a non-product (debug or fastdebug) build of Hotspot and the -XX:+PrintOptoAssembly flag, a trick he says he learned while at TheServerSide Java Symposium a few weeks ago in Vegas. He goes on to do some analysis of the generated assembly instructions, offering up some interesting insights into the JVM's inner workings.
Recently I received a press announcement from Waggener-Edstrom, Microsoft's PR company, about their latest move in the interoperability space; I reproduce it here in its entirety for your perusal: Hi Ted, Microsoft is announcing another action to promote greater interoperability, opportunity and choice across the IT industry of developers, partners, customers and competitors. Today Microsoft is posting additional documentation of the XAML (eXtensible Application Markup Language) formats for advanced user experiences, enabling third parties to access and implement the XAML formats in their own client, server and tool products. This documentation is publicly available, for no charge, at http://go.microsoft.com/fwlink/?LinkId=113699 . It will assist developers building non-Microsoft clients and servers to read and write XAML to process advanced user experiences – with lots of animation, rich 2D and 3D graphic and video.
Apparently, I'm drawing enough of an audience through this blog that various folks have started to send me press releases and notifications and requests for... well, I dunno exactly, but I'm assuming some blogging love of some kind. I'm always a little leery about that particular subject, because it always has this dangerous potential to turn the blog into a less-credible marketing device, but people at conferences have suggested that they really are interested in what I think about various products and tools, so perhaps it's time to amend my stance on this.
A couple of people have asked me over the last few weeks, so it's probably worth saying out loud: No, I don't work for a large company, so yes, I'm available for consulting and research projects. If you've got one of those burning questions like, "How would our company/project/department/whatever make use of JRuby-and-Rails, and what would the impact to the rest of the system be", or "Could using F# help us write applications faster", or "How would we best integrate Groovy into our application", or "How does the new Adobe Flex/AIR move help us build richer client apps", or "How do we improve the performance of our Java/.NET app", or other questions along those lines, drop me a line and let's talk.
While perusing the E Tutorial, I noticed something that was simple and powerful all at the same time: URLs as first-class concepts in the language. Or, if you will, URLs as a factory for creating objects. Check out this snippet of E:

? pragma.syntax("0.8")
? def poem := <http://www.erights.org/elang/intro/jabberwocky.txt>
# value: <http://www.erights.org/elang/intro/jabberwocky.txt>
? <file:c:/jabbertest>.mkdirs(null);
? <file:c:/jabbertest/jabberwocky.txt>.setText(poem.getText())

Notice how the "poem" variable is initialized to what looks like an HTTP URL?
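For contrast, here's a rough Java analogy (names mine, and only an analogy): in Java a URL is just an ordinary object you construct through library calls, not a literal the language's syntax knows about, which is exactly the difference E is highlighting.

```java
import java.net.URI;

// In Java, a URL/URI is a plain library object, not a first-class literal.
public class UrlAsObject {
    public static void main(String[] args) {
        URI poem = URI.create("http://www.erights.org/elang/intro/jabberwocky.txt");
        // No I/O happens here -- the string is parsed, not fetched.
        System.out.println(poem.getScheme()); // http
        System.out.println(poem.getHost());   // www.erights.org
        // Actually reading the text would mean poem.toURL().openStream() --
        // the rough equivalent of E's poem.getText(), but spelled as
        // ordinary library plumbing rather than language syntax.
    }
}
```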
Since we're examining various aspects of the canonical O-O languages (the three principal ones being C++, Java and C#/VB.NET), let's review another recent post, this time on the use of "new" in said languages. All of us have probably written code like this: Foo f = new Foo(); And what could be simpler? As long as the logic in the constructor is simple (or better yet, the constructor is empty), it would seem that the simplest code is the best, so just use the constructor. Certainly the MSDN documentation is rife with code that uses public constructors. You can probably find plenty of public constructors used right here on my blog. Why invest the effort in writing (and using) a factory class that will probably never do anything useful, other than call a public constructor?
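To make the trade-off concrete, here's a minimal sketch of the factory alternative (the Foo class and its create method are placeholder names of my own, not from the post under discussion):

```java
// A static factory method as an alternative to a public constructor.
public class Foo {
    private final String label;

    private Foo(String label) {
        this.label = label;
    }

    // Today this just delegates to the constructor...
    public static Foo create(String label) {
        return new Foo(label);
    }

    // ...but tomorrow it could cache instances, return a subclass, or draw
    // from a pool -- none of which a public constructor can ever do, since
    // "new Foo(...)" is contractually bound to hand back a fresh Foo.

    public String label() {
        return label;
    }
}
```

Callers write Foo.create("bar") instead of new Foo("bar"), and the class keeps the freedom to change what "creation" means later.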
Gilad makes the case that static, that staple of C++, C#/VB.NET, and Java, does not belong: Most imperative languages have some notion of static variable. This is unfortunate, since static variables have many disadvantages. I have argued against static state for quite a few years (at least since the dawn of the millennium), and in Newspeak, I’m finally able to eradicate it entirely. I think Gilad conflates a few things, but he's also got some good points.
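A small sketch of the kind of static state Gilad objects to, alongside the instance-based alternative (class names are mine, purely illustrative):

```java
// One mutable cell shared by every caller, every thread, every test --
// the "static state" being argued against.
class StaticCounter {
    static int count = 0;

    static int next() {
        return ++count;
    }
}

// The same behavior, but each Counter owns its own state, so callers can
// create, pass around, and discard counters independently.
class Counter {
    private int count = 0;

    int next() {
        return ++count;
    }
}
```

Two Counter instances never interfere with each other; every user of StaticCounter shares one hidden global, which is exactly what makes static state hard to test and reason about.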
During the Lang.NET Symposium, a couple of things "clicked" all simultaneously, giving me one of those "Oh, I get it now" moments that just doesn't want to leave you alone. During the Intentional Software presentation, as the demo wound onwards, I (and the rest of the small group gathered there) found ourselves looking at the same source code, but presented in a variety of new ways, some of which appealed to me as the programmer, others of which appealed to the mathematicians in the room, others of which appealed to the non-programmers in the room.
OK, after a week of getting the Internet equivalent of Bad Mojo being sent my way by every Perl developer on the planet, I have to admit something that may strike readers as inconsistent and incongruous. I want Parrot to work. I don't really care about Perl 6, per se. As I've said before, the language has a lot of linguistic inconsistencies and too many violations of the Principle of Least Surprise to curry much favor with me.