Software is all about describing a solution to a computer. Ric Parkin imagines what his ideal dialect would sound like.
Hello there, and welcome to Overload 115. Firstly, to avoid confusion: Frances has been off enjoying a well-earned holiday and kindly asked me to step in and guest edit this issue. Thankfully the well-oiled ACCU publication machine is working as well as ever, and by taking advantage of the captive speakers at the conference in Bristol I’ve managed to find plenty of articles for this issue – and promises for the future.
It was the first conference that I’ve attended for a few years – my excuse was that having to write editorials meant I’d failed to come up with any decent talk ideas, but this year my company kindly paid for me to attend. And it was as good as ever with a packed programme where, as usual, there would be at least two or three talks at a time I’d like to attend. It was also a great opportunity to meet old friends and put faces to people I only knew as an email address.
C++: the past, the present, and the future
While there is always a wide range of subjects across the talks, each conference tends to have an overall theme – sometimes intentionally, sometimes it emerges naturally depending on what’s new and interesting to people. This year’s was definitely about the evolution of C++. Many talks looked at how the new features in C++11 work in practice, and how different features interact – always the bit where the unexpected appears for both good and bad.
A good example of these serendipitous synergies comes from the previous standard, when the interaction of template specialisation, non-type template parameters, and some limited compile-time evaluation resulted in the discovery of Template Meta-Programming (which some consider a mixed blessing!). For all its warts (and the error messages – sorry, essays – that can result are most definitely ‘interesting’ [Curse]), TMP has been hugely influential, which has attracted attempts to improve it – for library writers, and for the users trying to use the resulting libraries (especially when working out why their attempt has failed).
It is notable that a lot of the recent standardisation effort has gone into improving many TMP techniques: helping better compiler support, standardising utilities such as the `<type_traits>` facilities, and aiming to provide some sort of Concepts support. Sadly the original Concepts were dropped from C++11 when it was realised that they were becoming too big and unwieldy and risked jeopardising the next standard. More happily, the next standard – dubbed C++14 [C++14] – as well as mainly being some tidying up, a few new bits and pieces here and there, and a relaxing of some of the restrictions in C++11, also includes ongoing work towards one major new feature – Concepts-Lite – that should deliver many of the hoped-for benefits of the original Concepts without becoming overly complex. There is even a compiler that has already implemented it, to give some idea of how it works in practice.
Beyond that is even more work on C++17. You may have noticed that after the marathon to get C++11 out, the committee decided that a shorter three-year rolling release cycle was needed, which enables small tweaks to make it into the ‘official’ standard rather than an interim Technical Report, while allowing ongoing work on large features to aim for a release further ahead. One of these ideas is to extend Concepts-Lite even further, to make the more complex cases easier to write and use. Some of the possible ideas were shown at the conference, and valuable feedback was gleaned, which resulted in some rapid updates to the syntax ideas.
A fantasy programming language
This is all tremendously exciting, although I do fear for my bookshelves, which will have to bear the resulting tomes. But it has made me wonder what sort of features would be in my ‘Ideal Language’ – one which didn’t have to deal with backwards compatibility with previous versions (although it can be persuasively argued that backward compatibility with C was a major reason for the take-up of C++, despite it leading to some design decisions that you might not have chosen afresh). So this is my wish-list – it’s not particularly comprehensive, or even thought through very much, so forgive the inevitable clashes and contradictions between ideas. After all, that’s where the interesting things happen.
The first thing to keep in mind is the difference between the language, the standard library, other available libraries, the compiler, and other tools. Many people voice the opinion that language X is better than Y, but often they mean it has more libraries out of the box, or a nicer IDE, or lots of useful tools. These are indeed important, but each is different.
The core language should be as lean as possible, but allow a rich set of tools to be built upon it, including the standard libraries and others. This implies several simple but orthogonal features, with tools for abstraction and encapsulation to build new types and libraries.
The standard library should provide a decent set of tools and types that are likely to be common to most programs on most platforms. A key idea would be to provide an extendable framework that allows different extensions to be combined in unforeseen ways. The design of the original STL is instructive here – by having iterators as the ‘glue’ between algorithms and containers, you could write new components that provided iterators – number generators, for example – and existing and future algorithms would work with them.
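That decoupling can be seen in a few lines of today’s C++ – a minimal sketch (the function names are mine, purely illustrative): a number generator feeds a container through an inserter iterator, and an unrelated algorithm then consumes it through ordinary iterators, with none of the three components knowing about the others.

```cpp
#include <algorithm>
#include <iterator>
#include <numeric>
#include <vector>

// Iterators as 'glue': generate_n knows nothing about vectors, back_inserter
// knows nothing about generation, and accumulate knows about neither.
std::vector<int> first_squares(int n) {
    std::vector<int> v;
    int i = 0;
    std::generate_n(std::back_inserter(v), n,
                    [&i] { ++i; return i * i; });  // a simple number generator
    return v;
}

int sum_of_first_squares(int n) {
    std::vector<int> v = first_squares(n);
    return std::accumulate(v.begin(), v.end(), 0);
}
```

Any future algorithm written against the iterator interface would work with `first_squares` unchanged – that is the extendable-framework property the STL got right.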
We would also like lots of libraries to be available from third parties – these would cover bits that we have inevitably missed, or more niche uses. These will depend a lot on how easy such libraries are to write and combine, and so depend on the whole language package.
The compiler and tools are an odd one, but very important. We’d want our language to be easy to write tools for, and easy to extend. C++ has traditionally been very hard to parse and slow to compile, and so has become somewhat neglected compared to other languages. More recently, tools such as Clang [Clang] have attempted to rectify this, allowing better helpers to be written. Making our new language simple to parse would help it gain a rich set of tools to fill any gaps, and short compile times – avoiding all the textual header includes – are also very desirable.
Many languages have an underlying philosophy to guide the choices to be made. Here C++ has some good ones, e.g. trust the programmer; don’t pay for what you don’t use; support multiple programming paradigms. I’d also add some more: keep individual language features simple, but able to be combined with others; and avoid surprises.
So what did C++ get wrong? Well, for understandable historical reasons, values are mutable by default, and similarly methods are non-const. We’ve learnt by experience and from functional languages that mutability is problematic – it can lead to programs that are harder to reason about, and causes problems in multi-threaded code. So things should be immutable by default, and you should have to ask for a modifiable value. This could even extend to free functions – if they promised to be ‘const’ by default, i.e. not to modify global data or call non-const functions, then you don’t get side effects: less for the programmer to worry about, and better optimisation opportunities for the compiler.
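Today’s C++ gives a small taste of this, albeit with the default reversed – a sketch, not a full answer: `const` is the opt-in for immutability, and a C++11 `constexpr` function is compiler-checked to be free of side effects, which is exactly what lets calls be folded at compile time and reasoned about locally.

```cpp
// A C++11 constexpr function cannot modify global state or call non-const
// machinery, so it is the side-effect-free 'const function' the wish-list
// asks for -- just opt-in rather than the default.
constexpr int triangle(int n) { return n * (n + 1) / 2; }

int demo() {
    const int x = triangle(10);  // 'const': immutability, again as opt-in
    static_assert(triangle(4) == 10, "pure, so evaluable at compile time");
    return x;
}
```

The wish is simply that the defaults flip: `demo`’s local would be immutable unless marked otherwise, and `triangle`’s purity would be assumed unless declared away.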
So what did C++ get right? I’m a big fan of value-based programming, especially if the language gives you tools for making user-defined types that look and feel ‘natural’ to use, via converting constructors, operator overloading, conversions, and overloaded functions. So these sorts of things should be in, and perhaps extended further. At the same time I’m very much not a fan of using built-in types for anything other than implementing user types with more specific semantics. So in addition I’d like it to be very easy to define ‘strong typedefs’ – new types that act just like an existing one, but are distinct types rather than aliases. For example, instead of forename and surname both being plain strings, you could just declare each as its own string-like type, and you’d have two different types that can’t be mixed up.
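In today’s C++ a strong typedef has to be built by hand – a minimal sketch (the `Strong` template and tag names are mine, not a standard facility): a phantom tag parameter makes each instantiation a distinct type, so the compiler rejects passing one where the other is expected, even though both wrap `std::string`.

```cpp
#include <string>
#include <type_traits>
#include <utility>

// A hand-rolled strong typedef: Tag carries no data, it exists only to make
// each instantiation a different type in the eyes of the type system.
template <typename T, typename Tag>
struct Strong {
    explicit Strong(T v) : value(std::move(v)) {}
    T value;
};

using Forename = Strong<std::string, struct ForenameTag>;
using Surname  = Strong<std::string, struct SurnameTag>;

static_assert(!std::is_same<Forename, Surname>::value,
              "same representation, but the compiler keeps them apart");
```

The ideal language would make this one declaration instead of a template and a pair of tags – but the error-catching payoff is the same.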
You may have noticed that this depends on a strict type system, which is another good thing, as it helps catch errors early. But many other languages are not strictly typed and have found the flexibility of dynamic type systems rather useful in certain areas. So I’d like this ability to be directly supported, but it should be optional to avoid unnecessary overhead – perhaps a keyword to tell the compiler to add the machinery for dynamic method dispatch and for dynamically adding new methods. This last item is already hinting at the need for something along the lines of closures or lambdas, and that in turn hints at some sort of garbage collection, although again it should be optional on a per-type basis. Reflection can be very powerful too, but given some of the stories about its overhead in the program footprint, this one definitely has to be optional!
Garbage collection and C++ can’t be mentioned without touching on what I think is C++’s great strength – deterministic lifetime and destructors. While GC is great for reclaiming abandoned memory, it fails badly at cleaning up resources that are more limited or have to be released in a timely manner – file handles, mutexes and sockets are just some examples. That GC doesn’t deal with these very well can be seen in the workarounds, such as the clunky Execute Around idiom, `try`/`finally` blocks, or old-fashioned `dispose()` methods. But value-based objects on the stack and destructors shine in these situations, and it surprises me that such solutions are not used as much in other languages (even GC ones – why doesn’t declaring an object in a block automatically generate an acquire call on entry and a release call on block exit?)
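The point is easiest to see in code – a sketch, with a counter standing in for a real file handle or mutex so the acquire/release can be observed: the destructor runs at the closing brace, not whenever a collector happens to get around to it.

```cpp
// RAII in miniature: acquisition in the constructor, release in the
// destructor, at a point fixed by the program text rather than by a GC.
struct Resource {
    static int open_count;              // observation only; a real resource
    Resource()  { ++open_count; }       // would open a file, lock a mutex...
    ~Resource() { --open_count; }       // ...and release it here, deterministically
    Resource(const Resource&) = delete;
    Resource& operator=(const Resource&) = delete;
};
int Resource::open_count = 0;

int open_after_block() {
    {
        Resource r;                     // acquired exactly here...
    }                                   // ...released exactly here, even on exceptions
    return Resource::open_count;
}
```

A finaliser gives no such guarantee: the resource would stay held until some unspecified future collection, which is precisely the timeliness problem above.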
C++ also has the very nice container/iterator/algorithm framework from the STL, and something similar would be excellent. This would require some sort of generics support, and I think C++’s template mechanism is an excellent start (and is better than the alternative interface-based solutions), although improved concept and requirements checking would be needed. Not a trivial task! Strings are not so good, though. A better solution would involve immutable strings, slices, mutable string builders, and ropes (bundles of strings that act like a single string – useful for concatenation without reallocation).
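A toy rope shows the idea – very much a sketch, not a production design: concatenation just records the pieces in O(1), and the single allocation happens only when the flattened string is finally needed.

```cpp
#include <cstddef>
#include <string>
#include <utility>
#include <vector>

// A minimal rope: a bundle of strings acting like one string.
class Rope {
public:
    Rope& operator+=(std::string piece) {
        pieces_.push_back(std::move(piece));  // no copying of earlier text
        return *this;
    }
    std::string str() const {                 // flatten once, at the end
        std::size_t total = 0;
        for (const auto& p : pieces_) total += p.size();
        std::string out;
        out.reserve(total);                   // exactly one allocation
        for (const auto& p : pieces_) out += p;
        return out;
    }
private:
    std::vector<std::string> pieces_;
};
```

A real rope would also share pieces between ropes and support cheap slicing – but even this much avoids the repeated reallocation that naive string concatenation suffers.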
Parallelism is very important now and for the foreseeable future, so good support is vital. A good memory model is the bedrock upon which low-level threading primitives can be based. But even these are too low-level to work with directly, so some higher-level parallel structures are needed – perhaps Erlang-style message passing, coroutines, Actors, or some such. This latter area is less settled, so I suspect a library-based solution may be preferable to a core language feature, although many other languages take a different approach.
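One such higher-level structure can already be built as a library on today’s primitives – a sketch (the `Channel` class is mine, not a standard type): threads communicate only by sending messages through a locked queue, Erlang-style, instead of sharing mutable state.

```cpp
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>
#include <utility>

// Message passing layered on mutex + condition_variable: the low-level
// primitives disappear behind send/receive.
template <typename T>
class Channel {
public:
    void send(T msg) {
        { std::lock_guard<std::mutex> lock(m_); q_.push(std::move(msg)); }
        cv_.notify_one();
    }
    T receive() {                              // blocks until a message arrives
        std::unique_lock<std::mutex> lock(m_);
        cv_.wait(lock, [this] { return !q_.empty(); });
        T msg = std::move(q_.front());
        q_.pop();
        return msg;
    }
private:
    std::mutex m_;
    std::condition_variable cv_;
    std::queue<T> q_;
};

int ping_pong() {
    Channel<int> in, out;
    std::thread worker([&] { out.send(in.receive() * 2); });  // a one-message 'actor'
    in.send(21);
    int reply = out.receive();
    worker.join();
    return reply;
}
```

That it fits in a page of library code, with no new syntax, is the argument for keeping this out of the core language.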
Other ideas come from functional languages, such as list comprehensions or pattern matching on data. Getting these to mesh with a procedural style can be a challenge, but languages such as D show that it can be done. Adding map/reduce, and tuples with operations such as tie and zip, can be very flexible too.
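These operations already have spellings in today’s C++, if clumsier ones – a quick sketch: `std::transform` is map, `std::accumulate` is reduce, and `std::tie` unpacks a tuple into locals.

```cpp
#include <algorithm>
#include <numeric>
#include <tuple>
#include <vector>

// map then reduce: square each element, then sum the results.
int sum_of_squares(const std::vector<int>& xs) {
    std::vector<int> squares(xs.size());
    std::transform(xs.begin(), xs.end(), squares.begin(),
                   [](int x) { return x * x; });               // map
    return std::accumulate(squares.begin(), squares.end(), 0); // reduce
}

// A tuple-returning function, ready for unpacking with std::tie.
std::tuple<int, int> min_max(const std::vector<int>& xs) {
    auto mm = std::minmax_element(xs.begin(), xs.end());
    return std::make_tuple(*mm.first, *mm.second);
}
```

A language with comprehensions would collapse `sum_of_squares` to a single expression – which is exactly the ergonomic gap being wished away.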
And finally, what about errors? There are only a few choices, and none is totally appealing. I tend to favour exceptions, but would like better support for writing exception-safe code – deterministic destructors help here, but so would `noexcept` qualifiers, and perhaps ones for the Basic and Strong guarantees, or the compiler generating code to do the right thing, e.g. generated swap functions.
These are just a few ideas off the top of my head, and the result is probably already too complex. Which ones would you keep, and what have I missed?
[C++14] New draft at http://isocpp.org/files/papers/N3690.pdf