Editorial: Keeping Up Standards

By Alan Griffiths

Overload, 14(71):4-6, February 2006

“Nobody made a greater mistake than he who did nothing because he could do only a little.” - Edmund Burke.

There are many conventions that are required to make modern life possible. They take many forms – from social rules, through voluntary agreements and contractual arrangements, to legislation. There are also those who seek to circumvent, subvert or exploit these conventions. This implies that there are benefits, real or imagined, from ignoring the conventions – so what value do they provide?

Let’s take one example that we are all familiar with: connecting to mains power. This involves a large number of arbitrary choices: voltage, frequency, the shape and materials of the plugs and sockets. The benefits of being able to take any appliance and plug it into any socket are obvious. Less obvious are both the mechanisms and activities that support this – the standards process – and the market it creates in compatible components. The former is the cost that suppliers pay to participate in the market, and competition in the market drives improvements in quality and price. Clearly suppliers would like to avoid both the costs of validating conformance to standards and the need for competition, but society (in the form of legislation) has ensured that they cannot access the market without this.

As software developers we all come across standards as part of our work – almost all of us will be working with ASCII or one of the Unicode supersets of it. (Yes, there is also EBCDIC.) Maybe some of you can remember the time before character sets were standardised, when computer manufacturers just did their own thing. The rest will have to imagine the additional effort this once required when a program needed to accept data from a new source. Standards didn’t solve the whole problem – for example, ASCII allowed an alternative interpretation of code 0x23 as a “national currency symbol”, a usage that has largely fallen into disuse since the advent of “extended ASCII”, which contains enough “national currency symbols” to handle European countries as well as the USA.

Another implication of differing character sets is that characters representable in one are occasionally not representable in another (for example, the “{” and “}” curly brace characters beloved of the C family of languages are not present in EBCDIC). This means that workarounds are required – for text intended for human consumption this may be as simple as mapping these characters to some arbitrary token (such as a “?”), but for some uses (like the source code of computer programs) more automation is required – hence the C and C++ support for trigraphs and digraphs.

Despite these problems, having a common standard (or two) for interpreting and transmitting text solves a lot of problems and has helped make communication between computers commonplace. Nowadays computers can be plugged into the “world wide web” with almost the same ease that appliances can be plugged into the mains. There is less regulation and certification than with electrical power (and probably more standards), but by and large it does work.

IT standards come in a wide variety of forms. They vary in source: there are bodies that exist to ratify standards like IEEE, ISO and ECMA; there are industry consortia set up to standardise a specific area like Posix, W3C and Oasis; and there are standards published and maintained by a specific vendor like IBM, Microsoft or Sun. They vary in terms: some are financed by charging for access to the standard and some don’t charge; some require (and charge for) validation and some don’t – there are even some with legislative backing!

My last editorial related to the C++ language standard and the unfortunate treatment it has received at the hands of a major vendor of C++. What I wasn’t expecting at the time was that it would sound so familiar to a number of correspondents. In fact, as I gathered together the stories that I’m about to relate, I began to feel that Bonaparte’s maxim “never ascribe to malice that which can adequately be explained by incompetence” was being stretched beyond credibility. I now think that Microsoft has either undervalued consensus in the standardisation process, or overestimated the extent to which its needs are a guide to the needs of the wider community. But please don’t take my word for it: decide that for yourself.

Exhibit 1

Many of you will have heard of “C++/CLI” – this is an ECMA standard that originates with Microsoft and has recently been submitted to ISO for a “Fast Track Ballot”. The BSI working group has responded to this submission, and the following quotes come from that response:

At the time this project was launched in 2003, participants described it as an attempt to develop a "binding" of C++ to CLI, and a minimal (if still substantial) set of extensions to support that environment. C++/CLI is intended to be upwardly compatible with Standard C++, and Ecma TG5 have gone to praiseworthy efforts to guarantee that standard-conforming C++ code will compile and run correctly in this environment.

Nevertheless, we believe C++/CLI has effectively evolved into a language which is almost, but not quite, entirely unlike C++ as we know it. Significant differences can be found in syntax, semantics, idioms, and underlying object model. It is as if an architect said, “we're going to bind a new loft conversion on to your house, but first please replace your foundations, and make all your doors open the other way.” Continuing to identify both languages by the same name (even though one includes an all-too-often-dropped qualifier) will cause widespread confusion and damage to the industry and the standard language.

Standard C++ is maintained by WG21, the largest and most active working group in SC22. WG21 meetings, twice a year lasting a week at a time, draw regular attendance by delegates from a number of national bodies and nearly all the important vendors of C++ compilers and libraries, plus a number of people who use the language in their work. By contrast, this ECMA draft was developed by a small handful of people – awesomely competent ones, undoubtedly, but who do not represent the interests of the broad market of vendors and users. With ISO/IEC 14882 maintained by JTC 1 SC 22 WG 21 and C++/CLI maintained by ECMA, the differences between Standard C++ and the C++/CLI variant will inevitably grow wider over time. The document proposes no mechanism for resolving future differences as these two versions of C++ evolve.

For JTC1 to sanction two standards called C++ for what are really two different languages would cause permanent confusion among employers and working programmers. There is clear evidence that this confusion already exists now...

Documentation for Microsoft's Visual C++ product contains many code examples identified as “C++” – NOT “C++/CLI” or even “C++.Net” – which will fail to compile in a Standard C++ environment.

C++ already has a reputation as a complicated language which is difficult to learn and use correctly. C++/CLI incorporates lip-service support for Standard C++ but joins to it in shotgun marriage a complete second language, using new keywords, new syntax, variable semantics for current syntax, and a substantially different object model, plus a complicated set of rules for determining which language is in effect for any single line of source code.

If this incompatible language becomes an ISO/IEC standard under the name submitted, it will be publicly perceived that C++ has suddenly become about 50% more complex. The hugely increased intellectual effort would almost certainly result in many programmers abandoning the use of C++ completely.

A parallel to this situation can be found in the history of C++ itself. As related by Bjarne Stroustrup in The Design and Evolution of C++, the language in its early days was known as “C with Classes”, but he was asked to call it something else: “The reason for the naming was that people had taken to calling C with Classes ‘new C' and then C. This abbreviation led to C being called ‘plain C’, ‘straight C’ and ‘old C’. The last name, in particular, was considered insulting, so common courtesy and a desire to avoid confusion led me to look for a new name.”

(Full response at: http://www.octopull.demon.co.uk/editorial/N8037_Objection.pdf.)

In short, ISO – the body that standardised C++ – is being asked to ratify a “fork” of C++, and the BSI panel is raising some concerns about this. Forking isn’t of itself evil: there are often good reasons to fork pieces of work – and on occasion good things come of it. Some of you will remember the egcs fork of gcc – this was a major reworking of the C++ compiler that was infeasible within the normal gcc schedule. This work has since become the basis of current gcc versions. Another long-lived and useful fork exists between emacs and xemacs – these projects co-exist happily and frequently exchange ideas.

However, in these felicitous cases of forking those concerned were careful to make it clear what they were doing and why. In the case of C++/CLI it is clear that having a language with the power of C++ on the .NET platform is a good thing for those interested in the platform, and it is also clear that the CLR provides many features that cannot be accessed without language support (yes, efforts were made to find a library-based “mapping”).

In short, while there may be no explicit rules or laws being broken by Microsoft’s promotion of C++/CLI as “Pure C++” (http://msdn.microsoft.com/msdnmag/issues/05/12/PureC/default.aspx), it is undoubtedly antisocial.

Exhibit 2

In an earlier editorial I alluded briefly to the notice given by the Commonwealth of Massachusetts’ administration that, from 2007, they intended to require suppliers of office applications to support the OpenDocument standard. For those that don’t know, OpenDocument is an XML-based standard for office applications and has been developed by Oasis (a consortium including all the major players), which has submitted it to ISO for ratification.

Although Microsoft joined Oasis it is apparent that they didn’t participate in developing the standard that was produced. I don’t know the reasons for this but, inter alia, it is clear that they disagreed with the decision to take an existing, open, XML-based format as a starting point: the one used by both StarOffice and OpenOffice. (Although the resulting OpenDocument format differs substantially from the original OpenOffice format, statements from Microsoft continue to refer to it as “OpenOffice format”.)

Anyway, after discovering that their own XML-based formats didn’t meet the eventual criteria that Massachusetts developed during an extended consultation period (and that these criteria were unlikely to change), Microsoft decided to standardise its own XML-based formats through ECMA as “Office Open format”. (The terms of reference for the ECMA working group are interesting – by my reading it seems that the group doing the standardisation don’t have the authority to make changes to address problems discovered in the format!)

The intent is for ECMA to submit “Office Open format” to ISO for “Fast Track submission”.

I don’t have the expertise to make a technical assessment of alternative standard formats for office documents – especially when one has yet to be published. But when a major customer (Massachusetts) and a range of suppliers, including both IBM and Sun, agree on a standard I think it should be given serious consideration. It is easy to see that OpenDocument is already being adopted and supported by application suppliers: http://en.wikipedia.org/wiki/List_of_applications_supporting_OpenDocument.

By playing games with names, Microsoft trivialise the discussion to the point where I doubt that there is any merit to their claim that OpenDocument is seriously flawed or that Office Open format (when it arrives) will be better.

Exhibit 3

In my last editorial, talking about Microsoft’s non-standard “Safe Standard C++ Library”, I wrote: “The Microsoft representatives have indicated that the parts of this work applicable to the C standard have already been adopted by the ISO C working group as the basis for a ‘Technical Report’.” Since then I’ve had my attention drawn to the following comments by Chris Hills (who was convener of BSI’s C panel around that time):

Microsoft are pushing for a new “secure C library” (See http://std.dkuug.dk/jtc1/sc22/wg14/www/docs/n1007.pdf and http://std.dkuug.dk/jtc1/sc22/wg14/www/docs/n1031.pdf ) for all the library functions, apparently all 2000 of them. I did not think there were 2000 functions in the ISO-C library but MS have included all the MS C/C++ libraries as well in this proposal, which is of no use to the vast majority in the embedded world.

The problem for me is that the resultant libraries would be full of MS specific extensions. The thrust of the proposal is that the original libraries are full of holes and leaks that permit buffer overruns, and that they lack error reporting and parameter validation. Security is the important thing here, they stress. One of my BSI panel said that voting against security is like “voting against Motherhood and Apple Pie”. However, there is quite some unease on the UK panel re this proposal.

The other complaint MS have in their proposal is that the library was designed when computers were “much simpler and more constrained”. This is a common comment from PC programmers, who will tell you 16-bit systems died out a while ago and there has not been an 8-bit system since the Sinclair or BBC home computers. (http://www.phaedsys.org/papersese0403.html)

This doesn’t sound quite like the support for the TR implied by Microsoft’s account – and I don’t know what actually happened at the WG14 meeting when this was discussed (maybe I will by next time). However, these ISO groups are manned by individuals and organisations that volunteer their time: if someone volunteers to write a ‘Technical Report’ then even the most negative response is likely to be something like “it doesn’t interest me” – so Microsoft may have talked only to those interested in their idea, not those that thought it misguided. This could have led to an incorrect impression regarding the level of support for their proposal. (Or Chris may have got it wrong – I gather he was unable to attend the ISO meeting where this proposal was discussed.)

We should see later this year if WG14 accepts this ‘Technical Report’ but, even if that happens, there are some members of the working group that do not anticipate this report becoming a significant input to a future C standard.

Exhibit 4

As I reported last time, Microsoft has unilaterally decided that various usages sanctioned by the standard should be reported as ‘deprecated’ by their C++ implementation and replaced with other, non-standard (and non-portable) usages of their own. Beyond saying that the discussions between Microsoft and other WG21 members are ongoing, I won’t go into more detail at present.

Exhibit n

The above exhibits are not an exhaustive list: I’ve heard disgruntled remarks about Microsoft’s implementation of Kerberos authentication, their implementation of HTTP, their approach to standardising .NET and their work with the SQL Access Group. Such remarks may or may not be justified – I don’t know enough about any of these subjects to make informed comment.


Conforming to standards and regulations can be irritating and inconvenient and some, like speed limits, are widely violated – to the extent that ‘being caught’ is frequently considered the nuisance, not the miscreant. (Other standards, such as prohibitions against kidnapping, are held in higher esteem.)

Part of what governs our attitude to standards is the difference between what we get out of them and the inconvenience they impose on us. Getting to bed half an hour earlier after a long journey often seems attractive compared to the marginal increase in risk that speeding introduces. (On the other hand, the temptations of kidnapping are, to me at least, more elusive.)

In many regards Microsoft’s determination to ensure that standard C++ code works in C++/CLI is a demonstration of respect for the C++ standard. On the other hand, by their actions they appear to hold the process of standardisation, or the needs of other participants, in low regard. On the gripping hand, these standardisation processes are not meeting the needs of an important supplier – is it the process or the supplier that is out of step?

Alan Griffiths
