It seems that striking up a conversation about architecture is the topic of the month. Another email, this time from an attendee at a Denver Java Users Group meeting I spoke at a few weeks ago:
I just finished another interview to act as a consultant on an enterprise Java project at a large (Fortune 500) Denver company. It went pretty well from a technical perspective, but I couldn't convince the interviewer of a few points. I wanted your perspective so that I can share it at the Denver Open Source Users Group this Thursday with our members...
- Interviewer said that neither he nor anyone on his team attends any events in person (DJUG, Boulder Java Users Group, DOSUG, NFJS, ApacheCon).
- Interviewer said the budget wasn't there for these events.
- I pointed out that ones like DJUG were free. He said "OH" like he didn't even know. (warning bells ringing in my head)
- I noted the value that (as you also stated) an attendee gets from not just the presentation, but the interaction with the speaker AND audience. He was not impressed.
- I asked if he or anyone on his team had ever contributed to anything open source. He said that neither he nor any of his current employees do, but a guy who left last year did. (warning bells ringing again)
- He said that they are primarily a consumer of open source, not a producer and that if they really wanted to change code, they would grow the tool in-house anyway.
- They are still on JUnit 3.8 because they haven't had time to upgrade (and this was a greenfield project as of six months ago).
- They are sticking with Ant because Maven looks daunting.
I made some attempts to probe why he thought this way, but of course, didn't make much headway. I think I'll stay with my current contracts for a while longer. This didn't appear to be "the place" I would want to work.
Probably a smart idea; as a general rule, if you strongly disagree with the software development practices in place at a particular company (or any part of the company's culture, for that matter), it's often not worth it to jump in--you just have to jump back out within six months or so. Granted, there are those who can stomach that kind of mismatch between internal values and external culture, but generally only by "checking out" of the environment around them and pursuing other things on the side. (As a side note, it seems to me that this "checking out" at work is probably the number one reason and source for developers working on open-source projects: if you're mentally and emotionally engaged at work, your desire to write code when you get home seems significantly diminished. It's almost like a drug addict's wet dream: getting high at work, so you don't have to bother with it at home.)
Anyway, the answers I'd love to hear your quick opinion on are:
1) Is this the norm for Fortune 500-sized companies? (I've only got a dataset of 10, and I'm sure you have more.)
There's a conflation of topics here, so let's take them in order.
- Do many/most F500 companies act as open-source consumers? Absolutely.
- Do many/most F500 companies act as open-source producers? Not even close, but this makes sense. Most F500 companies are not in the business of producing software; in fact, a significant majority of them don't make money on IT in the slightest, so it's not to their competitive advantage to offer up contributions to open source. (That's going to piss off the open-source zealots to no end, I'm sure, but in the end, which would you rather a drug manufacturer focus on: finding a cure for cancer, or patching security bugs in Tomcat? The benefit of an economy of specialization is specialization. From each according to their abilities, and all....)
- Do many/most developers at F500 companies not go to events like local JUGs or conferences like NFJS? Unfortunately, far too many developers don't treat their careers as a work-in-progress, but believe that once they've landed the job, they're on a track to moderate financial success for life. In fact, lots of developers fit the demographic of "male, 18-30, and single", which is the absolute worst demographic for future planning. Lots of these guys think that "Hey, I'm smart, the money will always flow in, right? Java/.NET/C++/COBOL will always be the tool of choice, right? What, me worry?" Unfortunately, it's my experience that it doesn't get better with age. Developers in the age range of 31 and up have seen one (or two) generations of languages/platforms go by, and have had to re-tool themselves, and are still pissed about it. The ones who realize that no matter how much they learn, there's still a lot more to take on, those are the ones going to JUGs and NFJS and TechEd and JavaOne and whatever else comes their way. Those are the same ones who see training classes as opportunities for advancement, not opportunities to play 8 hours of uninterrupted Solitaire. And, unfortunately, those developers are the exception, not the rule, it seems. Which means, if you take the time to invest in yourself, you will never be in the bottom of the candidate pool for your next job. Period.
- Do many/most F500 projects stay with outdated things like JUnit 3.8? Honestly, this is the least of their sins. Over half of the code being written out there is being done without benefit of unit tests--frankly, the fact that they're using JUnit at all is a point in their favor, not against them. But I see your concern: the "we're frozen in time" syndrome can let a team's technology choices atrophy over long periods of time, and yes, that is a concern. But not in this case, no. If they're still using JDK 1.4 because they're still on WebSphere and have no plans to move forward until IBM makes them do it (because of end-of-support concerns--are you listening out there, VB6 world?), then yes, it's a concern. But only if they refuse to do the risk analysis of upgrading; if they do that analysis (and the architect or project manager should be able to tell you, if not show you, that analysis), and still decide to stay with what they've got for reasons they can explicate, then that's just being smart.
- Do many/most F500 projects stick with Ant because Maven looks daunting? Dear Lord, man, I don't use Maven, for the same reason! :-) Be careful not to project your own value judgements ("Maven is better than Ant!" "Maven is just a pile of crap on top of the goodness that is Ant!" and so on) on the company's decisions around build process. I've seen a few teams that used Maven and loved it, and I've seen a few teams that used Maven and hated it. I've seen some teams use Ant (or MSBuild) in ways that were simple, clean, and elegant, and I've seen Ant used in ways that would keep you up at night, shivering. (I was an accomplice to one of those latter Ant-trocides. Therapist says with time, I may be able to sleep through the night.) So long as a company can offer an articulate reason for their technology decisions, you can disagree with them if you wish, but you have to honor them. If you have serious concerns still, subvert them from within. Once you've lived their build process for a while, you'll know where the pain points are and can raise suggestions, perhaps suggesting Maven as a solution, perhaps not, to help address those pain points. But never forget, as the newly-installed outsider, your credibility will be at its lowest on the very first day you start there.
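Since the JUnit 3.8 question came up above: for anyone who hasn't seen it, the 3.8-to-4 migration is mostly mechanical, which is part of why "we haven't had time" rings a little hollow. A minimal sketch, using a hypothetical Calculator class (not from any real project); the JUnit variants are shown as comments so the fragment stands alone without the JUnit jars on the classpath:

```java
// Hypothetical class under test (illustrative only).
class Calculator {
    int add(int a, int b) { return a + b; }
}

// JUnit 3.8 style: tests subclass junit.framework.TestCase,
// and method names must begin with "test":
//
//   public class CalculatorTest extends junit.framework.TestCase {
//       public void testAdd() {
//           assertEquals(5, new Calculator().add(2, 3));
//       }
//   }

// JUnit 4 style: plain classes, @Test annotations, no naming convention:
//
//   import org.junit.Test;
//   import static org.junit.Assert.assertEquals;
//
//   public class CalculatorTest {
//       @Test public void addsTwoInts() {
//           assertEquals(5, new Calculator().add(2, 3));
//       }
//   }
```

The substance of the tests doesn't change; JUnit 4 just drops the TestCase inheritance and the test* naming convention in favor of annotations, which is why the upgrade is usually cheap.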
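And on the Ant-versus-Maven point: the difference in feel is easy to show. Ant has you script each build step imperatively; Maven declares project coordinates and lets convention supply the build lifecycle. Both fragments below are illustrative sketches with hypothetical project names, not from any real build:

```xml
<!-- Ant (build.xml): you spell out each step yourself -->
<project name="app" default="compile">
  <target name="compile">
    <mkdir dir="build/classes"/>
    <javac srcdir="src" destdir="build/classes"/>
  </target>
</project>

<!-- Maven (pom.xml): declare coordinates, then "mvn compile" follows
     convention, assuming sources live under src/main/java -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>app</artifactId>
  <version>1.0</version>
</project>
```

Which of these counts as "daunting" depends entirely on whether you want the lifecycle spelled out in front of you or supplied by convention--which is exactly why it's a value judgement, not a fact.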
2) Would you also hear warning bells going off in your head after hearing this guy reply as he did above?
I hear warning bells all the time, but that's because I don't expect any place I go into to be perfect on all three axes of the triangle (People, Process, Technology). I'm looking for where the red flags are, because nobody's perfect. Having said that, it's critical to know which of those three axes you need to be on the "high" end of the scale in order to preserve your own sanity as a developer, and which ones you can live without (or work to change).
3) Is it worth being an evangelist to folks like this, convincing them that events AND contributing to open source are personal-improvement exercises that lead to better code and architecture at your main job?
Depends on a couple of factors. One, how much do you want to be an influencer? Some people derive great joy from changing the way a company does business; others see that as an impediment to their larger goal of "building cool stuff". And two, how much do you want to emotionally invest in this group? If you're just there for the short term, certainly tell them and quote the statistics, but don't lose sleep over it. If you're really invested in this company (because of its location, the work it's doing, you own the firm, whatever), then obviously put your heart out there on the line.
4) How far back on that continuum that we talked about at DJUG do the big companies have to be to start missing out on the early-adopter advantage? (1, 3 or 5 years back on mature technologies?)
That's a hard one to answer. "That continuum" is the technology-adoption continuum I mentioned in the JUG talk: every company is pegged somewhere on the technology-adoption continuum, despite our tendencies to classify companies as either "(b)leading-edge/early-adopters" or "legacy players". Frankly, I think the hard part of that question is the phrase "mature technologies"--what's "mature" these days? What does the definition of "mature" mean for a technology? It's a hard thing to nail down, and until we do, I can't really answer the question. In the spirit of the question, though, I'd say that any firm that isn't at the very least watching forums like TheServerSide or InfoQ, and/or building a corporate bookshelf on a yearly/quarterly/monthly basis (and Safari, the joint Pearson/O'Reilly online book project, is a HUGE win here for firms seeking to do this), is missing out on golden opportunities to keep their ear to the ground. I'm not saying they need to adopt every new technology right away, but every firm that writes software should have prototyped a Ruby/Rails app at this point, in order to get an idea of what Ruby and Rails bring to the playground, and whether or not they can play nicely with the rest of the firm's Java/.NET/C++/whatever apps.
As a footnote to the earlier point about a new hire's credibility being at its lowest on day one: ironically, the reverse is true of consultants. Their credibility is at its highest on day one, and generally drops off from there, unless the consultant can prove his worth and earn more credibility. 'Tis horribly unfair, and exactly the reverse of what it should be, but I don't make these rules, I just observe them.
By the way, just a reminder, if you've got questions on architecture or other software-development-related topics that you'd love to see answered here, feel free to drop me a line: "ted@" this domain name. Or, blog them, and shoot me an email with the blog link.