...instead of spending your time playing, why not learn to be multi-lingual in a multi-cultural world?
Hello and welcome to Overload #81.
In a change to our scheduled programming, I've taken over editing this issue while Alan's been away. Thanks to the sterling efforts and helpful advice of everyone involved, it has all been remarkably stress-free for me (well, apart from the moment I realised I'd have to write an editorial - I thought I'd stopped doing essays years ago!).
But it's all been worth it, and we have another great issue for you, with highlights including the next instalment in the thrilling soap opera that is 'Parameterise from Above', and Allan Kelly showing us some card tricks.
Raising standards
In Overload #80, the editorial discussed some odd goings-on in the standardisation organisations. Since then there have been more reports of irregularities in the OOXML process [ Groklaw ], with several national groups raising objections both to the standard itself and to the way the votes have been conducted.
I won't comment on what's going on - that's a long conversation over a few beers I suspect - but one thing struck me as very interesting: the Internet Society Netherlands suggested [ ISN ]:
ISOC.nl recommends that the ISO procedures - and more specific the Fast Track procedure - be adapted significantly to better deal with controversial standards like DIS 29500/Office Open XML in order for ISO to maintain relevant. This includes demanding two interoperable and independent full implementations prior to accepting a submission for a Fast Track procedure.
This reminded me of the heroic failure that is 'export' in C++: an interesting idea that was pure theory, without practice, when it was standardised. In the years since, I know of only a single implementation [ EDG ], which appeared after a huge effort, and the reported benefits were much smaller than originally hoped for. Other vendors have reportedly decided never even to attempt it, reasoning that the costs and benefits just don't stack up. If an implementation had been attempted before it was standardised, I think it would quickly have become clear that it wasn't going to be worth the effort, and it would have been abandoned.
Fortunately this lesson appears to have been heeded: the more interesting parts of the next C++ standard (usually dubbed C++0x) are being tried out in real compilers and libraries, to prove their worth and smooth out the implementations, before they get into the standard. After all, much of standardisation is meant to be about formalising existing practice, so that alternative vendors can produce new implementations and existing code will work with any of them. In this sense, it is the interface between the compiler vendor and the user that gives guarantees about what should happen (and makes clear what is undefined or implementation-defined, so you know when you're straying into grey areas). Looked at like this, it makes sense that something like export didn't work - we all know how bad an interface can be when it has been written in isolation, without ever being used.
One of the new features that does look very well thought through makes an appearance in the final part of Richard Harris's auto_value series. I feel confident that future articles will cover more of the new language features as time goes on.
Multi-lingual
How many languages do you know? I don't mean the usual embarrassing haul of English plus a smattering of Restaurant French half-remembered from school, but how many programming languages? And how many types of language? After all, if you know C#, VB.net isn't a big leap, but a dynamic language such as Python does things quite differently, and a functional language such as Lisp requires a completely new way of thinking.
So what if you follow the advice 'Learn at least one new language every year' [ Pragmatic ]? If you know several types of language, you have a much richer set of tools and ways of thinking about a problem. When something is hard in one language but natural in another, you're more likely to choose the right tool for the job, or to gain new insights by bringing the ideas of one language into the domain of another.
A good example of such cross-fertilisation is lambda functions - originally found in functional languages, partially implemented in boost.lambda [ Boost ], and now proposed for C++0x [ WG21 ]. Another is the whole idea of functional programming itself, whose parallels with some C++ features Stuart Golodetz investigates.
Nuts and bolts
One of the main pieces of IT news in recent weeks (no, NOT the iPhone) has to be the EC's anti-competition ruling against Microsoft, with its record fine of €497M. While that's small change compared to their income and can easily be absorbed, the wider effects will only become clear over time (assuming an appeal isn't launched). The rulings on the bundling of applications such as Media Player are fascinating stuff (especially as I'm someone who recently bought a new PC, was frustrated to find I couldn't play a DVD, and gave up and downloaded VLC instead), but I was more interested in the rulings on publishing server interoperability protocols. In themselves they're not very interesting, but the principle is: how far should a company go in publishing the APIs that allow a competitor's products to talk to its own?
I don't think there's an easy business answer to that one - it depends on many things, such as the technology, the market size and the company's dominance. For example, a small company that has invented a new technology may wish to open up its protocols and let other people make compatible products, in the hope that this will drive adoption faster than it could alone, at the cost of a smaller share of a much bigger market. Different situations will produce different answers.
But why not publish all protocols and APIs? What would happen? Treat it as a thought experiment for a moment and you'll spot many of the arguments on both sides. Revealing too much may lead to a company's hard work and clever ideas being copied; it gets no benefit, so it won't bother again - the hope of reward drives a lot of innovation. Conversely, the ability to 'mix and match' parts creates a market for anyone who can make a better or cheaper widget, as long as it works with the rest of the system; open protocols thus encourage a richer ecosystem of suppliers, which also drives innovation. Obviously there's a balance somewhere - just where it lies in this particular case is what the ruling was trying to establish. Standards bodies have a similar aim: to define the points of interoperability so that you have a choice of what goes on either side, just like the size standards for nuts and bolts [ ISO ].
"Learn as if you were to live forever"
I've been doing a lot of interviewing recently as our team expands, and have got to wondering what exactly to look for in a software developer. We tend to follow Joel Spolsky's advice in The Guerrilla Guide to Interviewing [ Spolsky ], where he recommends trying to find 'Smart people who get things done'. Now that's fine as far as it goes, but I'd go one further: 'Smart people who get things done and want to improve'.
I think this is vital because this industry doesn't stand still - there are always new technologies coming out, new competitors with rival products, and more and more things becoming possible as power and storage improve. So you want people who can adapt and do new things. Personally, I don't care too much if a candidate doesn't know the ins and outs of a particular area, unless they're being hired as an expert in that niche; but I think it's vital that they can learn what they need very fast, rather than staying stuck in a rut trying to solve every problem with the same tools and ideas.
It reminds me of an anecdote I heard years ago, about an attempt to find out what made someone a guru. The researchers interviewed all the people regarded as the Ones Who Know, to discover their super-hero ability - the thing that made them so respected and productive - with the idea that if they trained others in this core skill set, they'd have lots of experts really quickly. When the results came in, there was only one common skill: the gurus used the help system to find the answers they didn't know, and they used it a lot. The only thing that made them better than everyone else was the ability, and the desire, to know more.
So what sort of things would show that someone wants to learn and is one of the better programmers? Things like going on training courses, being mentored, attending conferences, reading programming mailing lists and magazines, and meeting like-minded people to exchange ideas... in fact, a lot like what the ACCU provides.
References
[Boost] http://www.boost.org/libs/lambda/
[EDG] Edison Design Group's C++ front end http://www.edg.com/ used by compiler vendors such as Comeau http://www.comeaucomputing.com/
[Groklaw] http://www.groklaw.net/article.php?story=20070902123701843 and http://www.groklaw.net/article.php?story=2007081708383138
[ISN] http://isoc.nl/michiel/nodecisiononOOXML.htm
[ISO] http://www.engineersedge.com/iso_hardware_menu.shtml
[Pragmatic] http://www.pragmaticprogrammer.com/loty/
[Spolsky] http://www.joelonsoftware.com/articles/GuerrillaInterviewing3.html
[WG21] http://www.open-std.org/JTC1/SC22/WG21/docs/papers/2006/n1958.pdf