I have spent much of the last three years working on the Government Office for Science Foresight project on The future of cities. The focus was on a time horizon of fifty years into the future. It is clearly impossible to use urban models to forecast such long-term futures, but it is possible, in principle, to explore a variety of future scenarios systematically. A key element of such scenarios is transport, and we have to assume that what is on offer – in terms of modes of travel – will be very different from today – not least to meet sustainability criteria. The present dominance of car travel in many cities is likely to disappear. How, then, can we characterise possible future transport modes?
This takes me back to ideas that emerged in papers published 50 years ago (or in one case, almost that). In 1966 Dick Quandt and William Baumol, distinguished Princeton economists, published a paper in the Journal of Regional Science on ‘abstract transport modes’. Their argument was precisely that in the future, technological change would produce new modes: how could they be modelled? Their answer was that models should be calibrated not with modal parameters, but with parameters relating to the characteristics of modes. The calibrated results could then be used to model the take-up of new modes with new characteristics. By coincidence, Kelvin Lancaster, the Columbia University economist, published a paper, also in 1966, in The Journal of Political Economy on ‘A new approach to consumer theory’, in which utility functions were defined in terms of the characteristics of goods rather than the goods themselves. He elaborated this in 1971 in his book ‘Consumer demand: a new approach’. In 1967, my ‘entropy’ paper was published in the journal Transportation Research, and one concept used in it was ‘generalised cost’. This assumed that the cost of travelling by a mode was not just a money cost, but the weighted sum of different elements of (dis)utility: different kinds of time, comfort and so on, as well as money costs. The weights could be estimated as part of model calibration. David Boyce and Huw Williams, in their magisterial history of transport modelling, ‘Forecasting urban travel’, wrote, quoting my 1967 paper: “impedance … may be measured as actual distance, as travel time, as cost, or more effectively as some weighted combination of such factors sometimes referred to as generalised cost … In later publications, ‘impedance’ fell out of use in favour of ‘generalised cost’”. (They kindly attributed the introduction of ‘generalised cost’ to me.)
This all starts to come together. The Quandt and Baumol ‘abstract mode’ idea has always been in my mind, and I was attracted to Kelvin Lancaster’s argument for the same reasons – though that doesn’t seem to have taken off in a big way in economics. (I still have his 1971 book, purchased from Austicks in Leeds for £4.25.) I never quite connected ‘generalised cost’ to ‘abstract modes’. However, I certainly do now. When we have to look ahead to long-term future scenarios, it is potentially valuable to envisage new transport modes in generalised cost terms. By comparing one new mode with another, we can attempt – approximately, because we are projecting currently calibrated weights fifty years forward – to estimate the take-up of modes by comparing generalised costs. I have not yet seen any systematic attempt to explore scenarios in this way, and I think there is some straightforward work to be done – do-able in an undergraduate or master’s thesis!
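To make the idea concrete, here is a minimal sketch of how such a comparison might be set up. Every number in it – the mode characteristics, the weights and the dispersion parameter – is invented for illustration, standing in for values that would come from calibration; the ‘demand_shuttle’ is a purely hypothetical future mode described only by its characteristics, in the abstract-mode spirit. Take-up is compared with a logit share formula of the kind that emerges from entropy-maximising models.

```python
import math

def generalised_cost(money, in_vehicle_min, wait_min, discomfort,
                     value_of_time=0.15, wait_weight=0.30, comfort_weight=0.50):
    """Generalised cost as a weighted sum of (dis)utility elements, in money
    units. The default weights are illustrative assumptions, not calibrated."""
    return (money
            + value_of_time * in_vehicle_min
            + wait_weight * wait_min
            + comfort_weight * discomfort)

def mode_shares(costs, beta=0.4):
    """Logit (entropy-maximising) shares from a dict of generalised costs."""
    weights = {m: math.exp(-beta * c) for m, c in costs.items()}
    total = sum(weights.values())
    return {m: w / total for m, w in weights.items()}

# Modes described only by characteristics, including a hypothetical new one:
costs = {
    "car":            generalised_cost(money=4.0, in_vehicle_min=25, wait_min=0, discomfort=1),
    "bus":            generalised_cost(money=2.0, in_vehicle_min=40, wait_min=8, discomfort=3),
    "demand_shuttle": generalised_cost(money=2.5, in_vehicle_min=30, wait_min=4, discomfort=2),
}
shares = mode_shares(costs)
for m in sorted(shares, key=shares.get, reverse=True):
    print(f"{m}: generalised cost {costs[m]:.2f}, share {shares[m]:.1%}")
```

A scenario exercise would then amount to varying the characteristics (and perhaps the weights) to represent a fifty-year-ahead mode mix, and reading off the implied take-up.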
We can also look at the broader questions of scenario development. Suppose, for example, we want to explore the consequences of high-density development around public transport hubs. These kinds of policies can be represented in our comprehensive models by constraints – and I argue that the idea of representing policies – or more broadly ‘knowledge’ – as constraints within models is another powerful tool. This also has its origins in a fifty-year-old paper – Jack Lowry’s ‘Model of metropolis’. In broad terms, this represents the fixing through plans of a model’s exogenous variables – but the idea of ‘constraints’ implies that there are circumstances where we might want to fix what we usually take as endogenous variables.
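As a toy illustration of the constraint idea – not Lowry’s model itself, whose structure is much richer – the sketch below allocates a fixed total of households to zones in proportion to an attractiveness measure, but caps each zone at a planned capacity, reallocating any overflow. The zone names and all numbers are invented. The capacity cap is the ‘policy as constraint’: an otherwise endogenous allocation is fixed where the plan says so.

```python
def allocate(total, attractiveness, capacity, max_passes=50):
    """Allocate `total` units to zones in proportion to attractiveness,
    subject to per-zone capacity constraints; overflow from full zones is
    reallocated among the remaining open zones on each pass."""
    alloc = {z: 0.0 for z in attractiveness}
    open_zones = set(attractiveness)
    remaining = total
    for _ in range(max_passes):
        if remaining <= 1e-9 or not open_zones:
            break
        total_w = sum(attractiveness[z] for z in open_zones)
        overflow = 0.0
        for z in list(open_zones):
            share = remaining * attractiveness[z] / total_w
            take = min(share, capacity[z] - alloc[z])  # the constraint bites here
            alloc[z] += take
            overflow += share - take
            if alloc[z] >= capacity[z] - 1e-9:
                open_zones.discard(z)   # zone is full; drop it from allocation
        remaining = overflow
    return alloc

# A highly attractive hub zone, capped by a planning policy:
attractiveness = {"hub": 10.0, "suburb_a": 3.0, "suburb_b": 2.0}
capacity       = {"hub": 400.0, "suburb_a": 800.0, "suburb_b": 800.0}
result = allocate(1000.0, attractiveness, capacity)
```

Unconstrained, the hub would take two-thirds of the total; with the cap it fills to capacity and the remainder is shared between the suburbs in proportion to their attractiveness – a miniature of how a plan’s constraints reshape a model’s endogenous outcomes.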
So we have the machinery for testing and evaluating long-term scenarios – albeit building on fifty-year-old ideas. It needs a combination of imagination – thinking about what the future might look like – and analytical capability – ‘big modelling’. It’s all still to play for, but there are some interesting papers waiting to be written!