Slide 1


Mutual Distributed Ledgers (aka Blockchains) have generated a great deal of excitement over the last couple of years, with the growing realisation that they have applications beyond cryptocurrencies.  From secure systems to manage the ‘internet of things’, to trading platforms and government systems, Mutual Distributed Ledgers (MDLs) have potential to be a transformative technology.  However, in the rush to embrace the future, it is important to ensure that this technology does not erode the accountability of governments and businesses who are employing it.

The Cardano Foundation recently sponsored Long Finance research designed to examine the challenge that Mutual Distributed Ledgers (MDLs) face with respect to governance. As part of this process, the research examined whether different types of MDL require different governance approaches, and what tools are required to deliver effective governance outcomes.


The Headlines

  • The effective governance of MDLs relies on people rather than software;
  • Whilst MDLs are sometimes called “trustless systems” due to the way transactions are managed, trust is actually an essential component;
  • MDLs must be assuredly secure, reliable and predictable if they are to meet their potential; only effective governance can provide such assurance to users.


Fundamental Questions

Governance is the mechanism that enables organisations to be accountable to their stakeholders whilst delivering their long term objectives.

With respect to MDLs, the issue of governance raises a number of key questions:

  • How do you go about creating and enforcing the rules by which the MDL is run?
  • What happens when there are disputes between users?
  • Who is allowed to change the software the ledger runs on, and who should have access to the data it contains?
  • How do you go about managing risk and performance?


Types of MDL

There are two types of ledger:

  • un-permissioned ledgers, where users are anonymous and there is no need to register with a central authority; and
  • permissioned ledgers, which require the identity of users to be whitelisted or blacklisted through some type of Know Your Customer (KYC) procedure.

 These two types of ledger lend themselves to four different use classes, each of which requires different governance structures. Table 1 illustrates the four different use classes and the types of governance structures they require:

Table 1 MDL Use Classes And Their Corresponding Governance Structures

The relationship with users is affected by the governance structures chosen for the MDL.  For appointed boards and oligarchies, consultation with the users of the MDL is particularly important, as these structures are more distant from users (see Figure 1).

 Figure 1 User Proximity to Governance Structures

Key Challenges In The Governance Of MDLs


The report identifies a number of key challenges that MDLs must address, regardless of the governance structure chosen.  One of the most important issues is that of trust.

Whilst MDLs are sometimes referred to as ‘trustless systems’ due to the way that transactions take place, trust is an essential component:

  • trust is required in the code that runs the MDL;
  • trust is required in the persistence of the data: that it will not be changed through forks or rollbacks;
  • trust is required in your fellow users to implement appropriate systems for security and privacy;
  • in the case of cryptocurrencies, trust is required that other users will continue to believe in the future persistence of community valuation of a ‘virtual element’.

Theft, fraud, coding errors, regulatory compliance, the way disputes are resolved and reputational issues can all impact on users’ trust in an MDL. Effective governance can address these issues and enhance trust.


 Ethical principles and social norms are important issues to consider in the governance of MDLs:

  • if an MDL has a reputation as a haven of vice and criminality, law abiding organisations and individuals will be reluctant to use it, and regulators in multiple jurisdictions will be likely to sanction its use;
  • if attempts to defraud users or hack the network go unpunished, trust in an MDL will decline.

Managing the behaviour of users is relatively straightforward in permissioned MDLs as the users are known and identified. However, in unpermissioned MDLs, users are anonymous and this is more difficult.

Regulatory Compliance

Regulatory compliance is another issue that must be considered, and the issue of privacy is a good way to demonstrate this. The way privacy is handled varies considerably across jurisdictions. The “right to be forgotten” and the General Data Protection Regulations have significant implications, given the permanent and persistent nature of MDLs.

There are technical solutions available for managing regulatory compliance; however, as MDLs operate across regulatory regimes, it is essential that these solutions are adopted by all users. Ensuring that all users adopt and implement them will require effective governance.


Effective Governance Mechanisms

Public MDLs

The anonymity of users complicates both dispute resolution and the management of user behaviour. Questions of legitimacy arise when it comes to code changes, and without governance structures, strategic planning and risk management are difficult.

The report draws a parallel with the provision of free e-mail services, such as Gmail.  Anyone can sign up for a free Gmail account; however, to do so, you must accept the terms of use and policies. This allows Google to suspend or revoke accounts if the terms of use are breached, for example by distributing copyrighted material, pornography or spam.

For a public MDL, terms of use, along with the formalisation of governance structures (including accountability, dispute resolution and the basis of software changes) can be enshrined in a constitution.

Based on a constitution, two options present themselves for governance structures:

  • an open process, such as that used by the Internet Society, which may be in line with the libertarian philosophy of some cryptocurrencies;
  • or a more structured approach, such as a foundation.

State Sponsored MDLs

With respect to state sponsored MDLs, ensuring integration of the MDL into existing governance structures is essential.  A key challenge is ensuring that those responsible for oversight have both the technical knowledge necessary for running the MDL and an understanding of its strategic implications.

Private and Consortium MDLs

The key challenges faced by these types of MDL include:

  • enhancing trust through transparent decision making;
  • effective security, risk and performance management;
  • legal compliance; and
  • dispute resolution.

Consortium MDLs also face the additional challenges of:

  • effectively managing the expectations and needs of the organisations who are part of the consortium;
  • ensuring that the governance structure is independent and not unduly influenced by individual organisations or factional groups within the consortium.

As private and consortium MDLs are permissioned and the users are known to the managing body, the development of service level agreements (SLAs) is the key to effective governance.

Effective SLAs must:

  • define the nature of the services that are being delivered;
  • bind users to expected behaviours and standards especially with respect to security;
  • and establish independent mechanisms for dispute resolution.

Whilst the governing boards of private MDLs will be mapped on to the organisations which own them, consortia have a number of options as to how the MDL can be governed.

One example presents itself in the form of SWIFT, the Society for Worldwide Interbank Financial Telecommunications, a messaging network that financial institutions use to securely transmit information and instructions. SWIFT was established as a member owned cooperative and has been highly successful since it was established in the 1970s.

However, in establishing a new structure to govern an MDL network care must be taken not to establish a body that evolves into the type of third party organisation that MDLs are designed to replace.


Tools For Governance

The tools for effective governance of MDLs are not that different from those used for the governance of any organisation:

  • strategic plans are needed to set priorities;
  • performance management frameworks are required to ensure delivery of objectives;
  • auditing and reporting arrangements are needed to ensure accountability; and
  • risk management plans are required to deal with adverse events.

Most of these will come from the standard governance handbook; however, auditing MDLs may present some challenges.  Whilst researching this report, no accountancy firms were found who had conducted an audit of an MDL. However, whilst the accountants who were consulted did not foresee significant issues, a number of them did focus on the need to confirm that the assets recorded on the blockchain actually existed in the real world.



Ultimately, effective governance in MDL systems relies on people rather than software, and rests on three pillars:


  1. Architecture: The role of the governance structure, its composition, remit, powers, responsibilities, and its relationship with users, is a critical component.

  2. Accountability: Effective governance of MDLs enhances trust. Trust is enhanced when a governance structure is accountable to its stakeholders, transparent in its decision-making, and subject to periodic audit and third party review.

  3. Action: The governance structure must develop strategic and risk management plans, which are delivered through effective performance management frameworks. Trust can be further enhanced through the use of the voluntary standards market to independently verify performance metrics and the systems established to compile them.

 A full copy of the research report 'Responsibility Without Power' can be downloaded at


Published: Wednesday, 06 September 2017 12:00

Why do we use a financial system based on money whose value cannot be defined by any one or more definable real things? Money is used to price the real resources that market forces are supposed to allocate efficiently.

But how can we explain a belief in efficient markets to a super intelligent alien? I put this question to a super intelligent, high profile economist who has his hands on the levers of the financial system. I suggested two answers: humans are either insane, or it’s a religion. The answer I received was “a bit of both”!

This answer is not inconsistent with the views of another high profile economist, Lord King[1]. When Mervyn King was Governor of the Bank of England he stated to an audience in New York in 2010 that: “Of all the many ways of organising banking, the worst is the one we have today”. In his 2016 book Lord King stated: “Another crisis is certain”[2]. According to King, the inspiration for his book was a comment he received in Beijing in 2011, when he was the guest of a senior Chinese central banker. He was informed: “I don’t think you’ve quite got the hang of money and banking yet”[3]. It is a view apparently shared by the Queen of England, who, when visiting the London School of Economics in October 2008, inquired as to why “no one saw the credit crunch coming”[4].

The greatest threat to any investment is uncertainty. A currency with volatile and unpredictable value inhibits investment. Uncertainty is exacerbated in international investment under floating currencies. Foreign exchange fluctuations arise from multiple complex, unpredictable variables, including sentiments and perceptions of political and social events around the world.

In 1990, when the Euro was being proposed, the cover story of The Economist was “It’s time to tether currencies”[5]. The Economist went on to say: “Economic historians will look back on the 1980s as the decade in which the experiment with floating currencies failed”. The article explained how economic theories that The Economist had supported did not fit the empirical evidence on how a floating currency should “act as a balancing mechanism”.

Nearly two decades later, the global financial crisis of 2008 and subsequent uncertainties about the maintenance of the Euro again provided evidence that a financial system with floating currencies did not “act as a balancing mechanism”.

To provide a guide as to the relative value of international currencies The Economist established its own standard reference unit of value[6]:

"The Big Mac index was invented in 1986 by The Economist as a light-hearted guide to whether currencies are at their “correct” level. It is based on the theory of purchasing-power parity (PPP), the notion that in the long run exchange rates should move towards the rate that would equalise the prices of an identical basket of goods and services (in this case, a burger) in any two countries. For example, the average price of a Big Mac in America in July 2017 was $5.30; in China it was only $2.92 at market exchange rates. So the "raw" Big Mac index says that the yuan was undervalued by 45% at that time. 

Burgernomics was never intended as a precise gauge of currency misalignment, merely a tool to make exchange-rate theory more digestible. Yet the Big Mac index has become a global standard, included in several economic textbooks and the subject of at least 20 academic studies. For those who take their fast food more seriously, we have also calculated a gourmet version of the index."
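The raw index calculation quoted above reduces to a one-line ratio. A minimal Python sketch using the July 2017 figures (the helper name is illustrative, not The Economist's own code):

```python
def big_mac_valuation(price_local_usd, price_us_usd):
    """Raw Big Mac index: implied over-/under-valuation of a currency,
    comparing the local burger price (at market exchange rates) with the US price."""
    return price_local_usd / price_us_usd - 1

# July 2017 figures quoted above: US average $5.30, China $2.92
valuation = big_mac_valuation(2.92, 5.30)
print(f"Yuan vs dollar: {valuation:.0%}")  # about -45%, i.e. undervalued by 45%
```

The GDP-adjusted “gourmet” version mentioned in the quote refines this raw figure for income differences between countries; that adjustment is omitted here.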

However, when The Economist analysed price distortions created by fiat money in the Soviet economy in 1991, it used energy measured in kWh[7], a suggestion made by this author in 1977 and developed in 1983[8]. A reference unit of value, like those that exist for weights and measures, would likewise facilitate trade. As described in my other writings, it would create a basis for establishing a much lower cost, efficient, stable, resilient and sustainable financial system[9].

Today, minor official currencies may be tethered to the value of a major currency, but no major currency, such as the Euro, the US dollar or the British pound, has its value tethered to any one or more real goods or services. As a result, the values of all official currencies have become interdependent, not anchored to reality. This interdependence means a crisis in one currency can spread to others. Economic value has become a social construct, not fit for the purpose of minimising uncertainty for investors or of providing a compelling logical basis that real resources are being allocated efficiently.

As noted by Lord Stern, climate change represents “the biggest market failure the World has ever seen”[10].  Implicit in this statement is that markets are required to distribute resources not just on an efficient basis but on one that is sustainable. It can also be implied that trade in non-renewable natural resources should cease. In other words, efficiency becomes a second order objective to sustainability. It also means that advanced financial decision-making tools like “Present Value” or “Discounted Cash Flow” analysis should be overridden by criteria of sustainability. It makes no sense to achieve efficient decision-making if we, or our descendants, cannot survive to enjoy the benefits of efficiency.

The criteria for selecting a basis for defining a stable unit of value therefore require an anchor that is sustainable over the long term without creating harms or risks, and whose use is essential for sustaining modern societies. While hamburgers represent a basket of commodities, they do not meet the test of being essential, as there are many alternatives. Any basket of commodities creates governance problems as to who decides, when, and how its composition is changed according to seasons, tastes, fashions, local needs and changes in production technology.

A more compelling alternative is to use a sustainable service of nature that can be used to generate electricity that is essential in all modern societies. Electricity can be used to create clean air, clean water, food, clothing, shelter, and has become essential for communications and transport. Kilowatt hours (kWhs) of electricity generated from benign sources of renewable energy provides a way to construct a non-volatile reasonably stable index that provides feedback from local environments on their capacity to support humanity on the planet. Money whose value is indexed to this criterion will be referred to as Sustainable Energy Dollars (SEDs=$Z).

The word “indexed” is crucial, as it means $Z are not convertible into any fixed number of kWhs, though of course they can be used to purchase a negotiated number. An index based on, say, five-year weighted averages of various parameters is essential to avoid both daily changes in real production and consumption and medium term speculation on changes that could lead to instabilities.

Generation of kWhs from benign renewable energy sources is possible in every bio-region of the planet. However, the resources required to generate benign kWhs in each bio-region could vary considerably. This means the efficiency of sustaining humanity in each bio-region could also vary considerably. So while $Z could become a global unit of account their value could change in each bio-region according to how efficiently “nature can yield her resources more abundantly”[11] in each bio-region. The purchasing power of money needs to become greatest in those bio-regions that can generate $Z most efficiently. In this way market forces are created for the global population to occupy those bio-regions that can best sustain humanity indefinitely.

The fact that the consumption of energy may only represent a minor fraction of total expenditures in a local economy need not matter. The consumption of gold was irrelevant to it being accepted as reference unit of value. Gold has few uses in a modern economy and most uses are not essential. Electricity is essential. A comparative analysis of using gold and kWhs as reference unit of value is presented in my 1983 Monograph[12]. My 2016 journal article identifies twenty-five reasons why $Z can be designed to be better fit for purpose as a medium of exchange than official currencies[13].

Stability is inherent in the resources required to convert renewable energy into electricity, as their useful operating life exceeds twenty years; with hydroelectric generators, the period over which “nature can yield her resources more abundantly” is much longer. The weighted average installed production capacity of kWhs over, say, five years would change very slowly. Even with major breakthroughs in technology, the legacy of existing capacity would allow future changes to be well anticipated. Decline in the installed capacity of carbon burning generators could also be expected to change at an anticipated rate.

In an age of the Internet of Things, it is now practical to meter even highly decentralised retail capacity for generating kWhs from benign renewable sources, and also to meter their ultimate consumption. The word “ultimate” is used so that consumption in storage facilities is not counted. Likewise, the efficiency of consuming kWhs produced by carbon burning generators, in relation to their installed capacity, can also be determined. All parameters would be weighted five-year averages of kWhs, producing unitless ratios. In this way an index can be created that increases the relative purchasing power of the local $Z as the ratio of renewable to non-renewable capacity increases, and as the consumption of renewable energy increases as a percentage of the installed capacity of its generators.
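The index construction described here can be illustrated in code. In the Python sketch below, the equal weighting, the parameter names and the decision to combine the two ratios by multiplication are all illustrative assumptions, not anything specified in the text:

```python
def weighted_avg(series, weights):
    """Weighted five-year average of annual kWh figures (equal length lists)."""
    assert len(series) == len(weights)
    return sum(s * w for s, w in zip(series, weights)) / sum(weights)

def sustainability_index(renewable_cap, nonrenewable_cap, renewable_use,
                         weights=(1, 1, 1, 1, 1)):
    """Unitless index combining (a) the share of installed capacity that is
    renewable and (b) the utilisation of that renewable capacity, each taken
    as a five-year weighted average of metered kWh. Higher values would imply
    greater purchasing power for the local $Z under the scheme described."""
    r_cap = weighted_avg(renewable_cap, weights)
    n_cap = weighted_avg(nonrenewable_cap, weights)
    r_use = weighted_avg(renewable_use, weights)
    capacity_share = r_cap / (r_cap + n_cap)  # renewable share of total capacity
    utilisation = r_use / r_cap               # ultimate consumption vs capacity
    return capacity_share * utilisation

# Example: a bio-region adding renewable capacity year on year (figures in GWh)
idx = sustainability_index(
    renewable_cap=[100, 120, 140, 160, 180],
    nonrenewable_cap=[300, 290, 280, 270, 260],
    renewable_use=[70, 85, 100, 115, 130],
)
```

Whether the two ratios should be multiplied, averaged or weighted differently is exactly the kind of design decision the article later assigns to a standards body such as the IASB.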

As the number of householders with their own sources for generating benign renewable kWhs increases, tens of millions of data inputs of productive capacity and consumption could become involved. The process of collection would be automatic through the Internet, not subject to discretionary adjustments as occurs when interest rates are determined by the London Interbank Offered Rate (LIBOR), or as occurs with foreign exchange rates. Both are determined by a small group of banks with serious conflicts of interest and profit incentives involved.

Perhaps the most appropriate body to determine a sustainable value money index in each bio-region of the world would be the non-profit International Accounting Standards Board (IASB). Their standards are required in over 125 jurisdictions, with many others permitting their use. The mission of the IASB is “to develop standards that bring transparency, accountability and efficiency to financial markets around the world”[14]. They also state that: “Our work serves the public interest by fostering trust, growth and long-term financial stability in the global economy”. Members of the IASB have informed the author that the reason they have not established a standard for economic value is that it was “too difficult”. But many of their standards become problematic when organisations are operating in different currency areas.

Hopefully, the approach suggested in this article can provide a basis for the IASB to undertake the task. It is urgently required to avoid another financial crisis, as anticipated by Lord King and others, such as the Secretary General of the Basel Committee on Banking Supervision. The latter stated that “it will be impossible to avoid a repeat of the failures that caused a near collapse of the financial system in 2008”[15].

With an agreed standard unit of value accepted in each bio-region or nation, its financial system would become independent of others suffering a crisis. In a region in which a crisis arose, a basis would be established for anyone to enter into monetary contracts without needing money or a bank. Life would continue, perhaps on a sustainable, decentralised basis, without the need for carbon taxing or trading, or even central banks, as argued in my other writings[16].

Decentralised banking would replace central banking. Central banking is but a narrow form of central planning, requiring one set of policy settings to fit everyone. We know central planning does not work. Lord King questioned the future of central banking before he became Governor of the Bank of England. In 1999 he noted: “There is no reason, in principle, why final settlements could not be carried out by the private sector without the need for clearing through the central bank”[17]. This observation supported his question: “Will future historians look back on central banks as a phenomenon largely of the twentieth century?”[18]


[1] King, M. 2010, Banking: From Bagehot to Basel, and Back Again, The Second Bagehot Lecture, Buttonwood Gathering, New York City, October 25, p. 16, <>.

[2] King, M. 2016, The End of Alchemy: Money, Banking, and the Future of the Global Economy, Little Brown: London.

[3] King, M. 2016, ‘Lord Mervyn King: Why throwing money at a financial panic will lead to a new crisis’, The Telegraph, 27 February, <>.

[4] Pierce, A. 2008, ‘The Queen asks why no one saw the credit crunch coming’, The Telegraph, 5 November, <>.

[5] The Economist, 1990, ‘Time to tether currencies?’, January 6, pp. 9-10.

[6] The Economist, 2017, ‘The Big Mac Index’, July 13, <>.

[7] The Economist, 1991, ‘When the Price is Wrong’, February 2.

[8] Turnbull, S. 1977, 'Let the Market Correct Itself', The Australian, Op. Ed. p.8, May 25.

Turnbull, S. 1983, ‘Selecting a local currency’, Options, June, The Australian Adam Smith Club, Sydney.

[9] Turnbull, S. 2017, ‘Sustainable Value Money: Why it’s needed, how to get it?’ (Forthcoming) in: Boubaker, S. and Nguyen, D. (eds.), Ethics, ESG and Sustainable Prosperity, World Scientific Publishing: Singapore, <>. Turnbull, S. 2017,’Renewable Energy Stabilising Money and Society’, (Forthcoming) in: Droege, P. (ed.), Urban Energy Transition – Handbook for cities and regions, Elsevier Science Publishers: Oxford.

[10] Stern, N. 2006, The Economics of Climate Change: The Stern Review, Cabinet Office, London: HM Treasury, London, <>.

[11] A phrase used to define the word “capital” on page 11 of Moulton, H. G. 1935, The formation of capital, Brookings Institutions, Washington D.C.

[12] Op cit. n. 8, Turnbull 1983.

[13] Turnbull, S. 2016, ‘Terminating currency options for distressed economies’, Athens Journal of Social Science, 3(3): 195-214, July <>.

[14] <>.

[15] Drummond, M. 2011, ‘Banks told to prepare for more shocks’, Australian Financial Review, <>.

[16] Turnbull, S. 2010, ‘Money, Renewable Energy and Climate Change’, Financiële Studievereniging Rotterdam, (FSR Forum), 12:2, pp. 14-17, 19-22, 24, 25, 28-29, February, Erasmus University, Rotterdam, <>.

Turnbull, S. 2011, ‘Options for Reforming the Financial System’, The IUP Journal of Governance and Public Policy, 6(3): 7-34, September, <>.

Turnbull, S. 2017, ‘Sustainable Value Money: Why it’s needed, how to get it?’ (Forthcoming) in: Boubaker, S. and Nguyen, D. (eds.), Ethics, ESG and Sustainable Prosperity, World Scientific Publishing, Singapore, <>.

[17] King, M. (1999) ‘Challenges for Monetary Policy: New and Old’, presented to a Symposium on "New Challenges for Monetary Policy" sponsored by the Federal Reserve Bank of Kansas City, Jackson Hole, Wyoming, p. 48, August 27, <>.

[18] Ibid. p. 47.

Published: Thursday, 24 August 2017 12:49

“Productivity isn’t everything, but in the long run it is almost everything. A country’s ability to improve its standard of living over time depends almost entirely on its ability to raise its output per worker.”
Paul Krugman, The Age of Diminished Expectations, MIT Press (1994).

Calculating Productivity

Productivity, at its most basic, is merely the ratio of outputs to inputs.  Once the units of output and input are established, it is a simple mathematical ratio.  In a corporate setting, if you believe the accounting numbers, productivity might be rather close to operating profit.  However, the inputs are tough to establish, and for simplicity many measures of productivity therefore focus on ‘labour productivity’, i.e. the ratio of outputs to labour inputs.  Labour input measures can in turn take at least two basic routes: hours or wages.
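The two labour-input routes just mentioned can be made concrete in a few lines of Python (the firms and figures are invented for illustration):

```python
def productivity_per_hour(output, hours_worked):
    """Labour productivity as output per hour worked."""
    return output / hours_worked

def productivity_per_wage(output, total_wages):
    """Alternative labour-input route: output per unit of wages paid.
    Ranks firms differently whenever wage rates differ between them."""
    return output / total_wages

# Two firms with identical output: B uses fewer hours but pays higher wages,
# so the two measures disagree about which firm is "more productive".
a_hours = productivity_per_hour(1000, 200)   # 5.0 units per hour
b_hours = productivity_per_hour(1000, 160)   # 6.25 units per hour
a_wages = productivity_per_wage(1000, 4000)  # 0.25 units per unit of wages
b_wages = productivity_per_wage(1000, 4800)  # lower than firm A on this measure
```

The divergence between the two rankings is precisely why the choice of labour input measure matters.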

Even simple corporate comparisons are fraught with difficulty [see W Bruce Chew's “No-Nonsense Guide to Measuring Productivity”]: moving beyond direct labour to wages, considering unpaid overtime, handling foreign exchange movements, the circularity of cost attributions often being allocated by labour hours, and confusing efficiency (“same with less”) with productivity (“more with same”), which leads to volume reduction being equated with increased productivity.  And this is well before arguments about financing structures and rent extraction, or existential arguments about outputs (e.g. “number of patients treated” versus “quality of care” or “quality-adjusted life years”) or inputs (e.g. government R&D subsidies).

National productivity calculations and comparisons are largely extrapolated from firm-level methods.   The Office for National Statistics (ONS) headline labour productivity measure is output per worker for the whole economy.  Other published productivity measures are output per filled job and output per hour worked which are available for the whole economy and selected industry sections and sub-sections.

“Both unit labour costs and unit wage costs are available for the whole economy and measure the labour or wage cost of producing one unit of output.  Although not a direct measure of productivity, an inverse relationship between these measures and productivity tends to be observed: the higher the productivity of a worker, the lower the cost of labour per unit of output, and vice versa.”
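The inverse relationship the ONS describes falls directly out of the definition, as a small sketch shows (figures invented):

```python
def unit_labour_cost(labour_cost, output):
    """Labour cost of producing one unit of output."""
    return labour_cost / output

# Same wage bill, rising productivity: when output per worker doubles,
# the labour cost per unit of output halves.
low_productivity = unit_labour_cost(labour_cost=50_000, output=1_000)   # 50.0
high_productivity = unit_labour_cost(labour_cost=50_000, output=2_000)  # 25.0
```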

Problems With The Numbers

ONS International Comparisons of Productivity (ICP) "are based on two productivity measures for all G7 countries (Canada, France, Germany, Italy, Japan, the UK and the US). These two productivity measures are gross domestic product (GDP) per worker and GDP per hour worked. These are calculated using both current and constant purchasing power parities (PPPs)."

Note that both GDP and PPP methodologies are highly criticised, often highly volatile, and often revised: consider Nigeria’s 89% GDP jump in 2014, or Ireland’s problems with aircraft leasing statistics leading to a 2015 revision of GDP growth from 7.8% to 26.3%.  The problems with GDP calculations are legion. As Diane Coyle said in her 2014 article "A Measure For Error": "So complicated is the task that the official handbook explaining the construction of GDP and related national accounts figures runs to 722 pages (up from 50 pages in 1953).  And the margin of error on the UK’s annual change in GDP has been two percentage points over the past 15 years, which is the same order of magnitude as the headline growth figure itself.”

One of the great areas of discussion surrounds quality and technological advance.  For example, search engines have become enormously valuable tools for all businesses since the late 1990s.  And search engines have increased markedly in power and usefulness, but how is that captured?  Further, the effects of the search business model on other sectors are unknown.  Advertising pays for the bulk of the service provision, while the effects are felt on directories, travel agents, or consultants (there was a mini-industry for consultancies marketing their ‘rolodexes’ in the 1990s).  Is advertising less productive (more expensive/person in traditional advertising) or more productive (billions of ads served), while directory enquiries is more productive (charge for residual enquiries much higher) or less productive (fewer, more complex enquiries per person), due to search engines?

Another telling point is that just because something is measurable because it is paid for does not mean the overall system is working.  This is related to Bastiat’s broken window fallacy: we can increase GDP by encouraging the breaking of windows and then repairing them.  We can install heating systems in tropical countries, or air conditioning systems in temperate countries that don’t need them.  We can have a lot of bureaucratic ‘make work’ by government.  We can employ a lot of people on artificially low wages and improve our productivity.  Productivity and efficiency measures do not naturally focus on the right outcomes, which are typically more quality-of-life issues.

Government comprises a significant amount of the economy and causes difficulties as well in the calculations.  The Atkinson Review: “Measurement of Government Output and Productivity for the National Accounts” (2005) commissioned by the ONS focused on the methodological problems in government output measures.  The New Zealand government reviewed many similar issues and have a short critique worth reading.  Amongst other methodological problems, they note the tendency for government services to assume inputs are outputs and thus productivity is stable over time, cost of production is often used as a false surrogate for value, output quality can improve markedly while productivity measures are constant, and capital and depreciation numbers may not represent inputs.

“The ONS Productivity Handbook: A Statistical Overview And Guide” (2007), ONS, 191 pages, covers almost all of the theoretical ground and the complexities.  Even here, though, there is insufficient time and space to cover the role of national debt, the role of taxation, the position of pensions, and other long-term issues.  Some traditional areas of market failure suggest some quick lines of critique:

  • externalities – is productivity flattered by running down the environment?
  • agency problems – though agency strength can be a competitive advantage, equally agency issues feature in calculations of pension reasonableness, health care quality, care for the aged, and other awkward topics that can lead to biasing the numbers;
  • information asymmetries – is productivity flattered by living off historic educational quality, or infrastructure?
  • inappropriate or missing competition – for example, is productivity flattered by exclusive natural resources?

Internationally, the Organisation for Economic Co-operation and Development (OECD) works hard to create valid comparisons.  Its Global Forum on Productivity has a wealth of information.  But the fact remains that productivity measures are not very reliable.

Comparative UK Productivity

Still, productivity in the UK is puzzlingly low when compared with its peers.  Traditional theory indicates that as employment becomes expensive, economic agents are encouraged to find ways to increase productivity.  In the UK, despite unemployment being low, productivity remains comparatively low as well:

 Productivity Graph

The UK is interesting in that the statistics imply there may be no universal policy for all sectors, as some sectors soar in productivity while others plummet:

Andy Haldane and the “Productivity Commission”, chaired by Sir Charlie Mayfield, emphasise “a long tail of companies across all sectors in the UK whose productivity performance is falling short.  The Commission are developing, among other things, a tool which would enable firms to benchmark themselves relative to others in their sector along several key business dimensions.  This could then serve as a prompt for action, enabling firms to boost their productivity performance through targeted action.  I think this micro-level assessment of productivity is a useful way to formulate plans which support productivity, and narrow productivity differences, regionally and sectorally.  For example, recent work by the OECD has looked at the changing distribution of productivity across firms over time.  It suggests a widening, or bifurcation, of this distribution, with a small set of frontier firms whose productivity growth continues apace but a long tail of laggard firms whose productivity has effectively stagnated.”  So we have national, sectoral, and long-tail productivity problems.

Anything Special About Professional & Business Services (PBS)?

PBS firms are characterised by low levels of physical assets and high levels of rent extraction by principals, the people who run the firm.  The traditional profitability calculation looks at profit per principal as a function of rate (the billing rate), utilisation (the percentage of available time worked at that rate), margin (the overheads and the tooth-to-tail ratio of fee-earners to support staff), and leverage (the number of fee-earners per principal).  People who have worked in PBS are well aware that the published billing rate is frequently not the rate actually billed, that utilisation algorithms vary and rarely take account of unpaid overtime, and that fee-earners and support staff are not always clearly delineated.
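The traditional calculation can be sketched numerically.  The figures below are purely illustrative assumptions, not data from any firm:

```python
# A minimal sketch of the profit-per-principal identity described above.
# All figures are illustrative assumptions, not data from any real firm.

def profit_per_principal(rate, utilisation, hours, margin, leverage):
    """rate: billing rate per hour; utilisation: share of available time
    billed; hours: available hours per fee-earner per year; margin: profit
    share of fees after overheads; leverage: fee-earners per principal."""
    fees_per_fee_earner = rate * utilisation * hours
    return fees_per_fee_earner * margin * leverage

# e.g. a £200/hour rate, 60% utilisation over 1,600 available hours,
# a 30% margin after overheads, and 5 fee-earners per principal
# gives roughly £288,000 of profit per principal.
print(profit_per_principal(200, 0.60, 1600, 0.30, 5))
```

The caveats above bite directly on these inputs: if the published rate is discounted in practice, or utilisation ignores unpaid overtime, each error flows multiplicatively into the headline figure.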

Because three of these factors are strongly influenced by chargeable time, PBS is characterised by a focus on billing for time wherever possible.  Timesheets, time-based comparisons, and daily rates all proliferate.  If you can sell on a time basis, then efficiency can become a contradictory, countervailing factor, a point that has not gone unnoticed by clients.  However, one might expect competition (local, national, and international) to make a difference.

Ian Stewart, Deloitte’s Chief Economist, feeds in, “As with all aspects of the productivity story there are plenty of explanations - a slower pace of deregulation and globalisation, professional firms hoarding staff as demand fell away in the recession and aggressive cost control on the part of clients… It could be that productivity is not being measured properly and that the true rate of productivity growth is higher. In professional services the official measure compares growth in revenues with growth in wages. This roughly accords with how we measure margins which have, of course, been under pressure for the last decade. In this respect the official productivity numbers and the margin data tell a similar story. The official measure is based on existing, market based data sources which ought to be of decent quality. In the shadowy world of measuring productivity this is pretty good going.”

We are back to observing that “if you believe the accounting numbers, productivity might be close to the same thing as operating profit”.  If you believe that there is a ‘natural level’ of profit sufficient to make people aspire to be principals, then productivity might closely approximate a fairly stable operating profit percentage.  This might imply that PBS is characterised by firms with little incentive to invest to increase efficiency and facing little change until punctuated by a period of technologically-induced turmoil, potentially with large amounts of culling.

Another area where PBS can sometimes differ is in having mandated services, e.g. audit.  In such an environment, i.e. an oligopoly and a coerced client, there is little motivation to reduce time or fees, thus little motivation to improve productivity.  Other areas of PBS have similar mandated services, e.g. architecture, legal services, though they might argue there is more competition. 

Any Problems With The Policy Responses?

The traditional policy responses to low productivity are:

  • increase skill levels and balance skills with needs – through vocational training and education, in order to get higher-value-added per employee;
  • improve infrastructure – to increase overall economic efficiency;
  • encourage innovation – in order to discover new ways of doing things, typically via increased competition and research.

These policy responses equally make good economic sense at any time.  A longer list comes from “The Future of Productivity”, OECD (2015), which summarises well the potential generic policy responses to improve productivity.  These include:

  • Improvements in public funding and the organisation of basic research, which provide the right incentives for researchers, are crucial for pushing out the global frontier and to compensate for inherent underinvestment in basic research.
  • Rising international connectedness and the key role of multi-national enterprises in driving frontier R&D imply a greater need for global mechanisms to co-ordinate investment in basic research and related policies, such as R&D tax incentives, corporate taxation and IPR regimes.
  • Productivity growth via the diffusion of innovations at the global frontier to national frontier firms is facilitated by trade openness, participation in global value chains (GVCs) and the international mobility of skilled workers. Rising GVC participation magnifies the benefits from lifting barriers to international trade and from easing services regulation.
  • Well-functioning product, labour and risk capital markets as well as policies that do not trap resources in inefficient firms – including efficient judicial systems and bankruptcy laws that do not excessively penalize failure – help firms at the national frontier to achieve a sufficient scale, enter global markets and benefit from innovations at the global frontier.
  • A competitive and open business environment that favours the adoption of superior managerial practices and does not give incentives for maintaining inefficient business structures (e.g. via inheritance tax exemptions that may prolong the existence of poorly managed family-owned firms) facilitates within-firm productivity improvements. Stronger competition also enables the diffusion of existing technologies to laggards, which underpins their catch-up to the national frontier.
  • Innovation policies, including R&D fiscal incentives, collaboration between firms and universities and IPR protection, should be designed to ensure that they do not excessively favour applied vs basic research and incumbents vs young firms.
  • Framework policies that reduce barriers to firm entry and exit and improve the efficiency of matching in labour markets can improve productivity performance by reducing skill mismatch.
  • Reforms to policies that restrict worker mobility and amplify skill mismatch – e.g. high transaction costs on buying property and stringent planning regulations – and funding for lifelong learning will become increasingly necessary, to combat slowing growth and rising inequality.

The universal application of the above at any time raises the question of whether productivity measurement at a national level makes any difference.

What Might We Do About Productivity Measures?

Accountability requires measurement, but evaluation also requires judgement, and measurement can displace judgement.  Professor Marilyn Strathern’s statement of Goodhart’s Law is "When a measure becomes a target, it ceases to be a good measure."  As a former central banker, Charles Goodhart was referring to the fact that measures such as money supply tend to fail to measure well when too much weight is placed upon them.  The original form of Goodhart’s Law was, "As soon as the government attempts to regulate any particular set of financial assets, these become unreliable as indicators of economic trends."  As he noted, "financial institutions can... easily devise new types of financial assets."  Goodhart’s original expression evolved to his preferred formulation: "Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes."  There is an old Groucho Marx joke that Woody Allen recycled to explain the inevitability of amorous relationships:

~Doctor, my brother thinks he’s a chicken.  Can you help?

~Why don’t you stop him?

~We need the eggs!

In some ways, productivity measures are equally absurd, and equally inevitable.  We’re going to create them and use them regardless, so what might we do to improve them, or to develop alternatives?  The French government tackled this head-on with an international "Commission on the Measurement of Economic Performance and Social Progress", appointed by President Sarkozy, which issued its report in 2010.  Many distinguished economists concluded, basically, that measurement should move "from production to well-being".  In November 2010 in the UK, then Prime Minister David Cameron established the Measuring National Well-being programme run by the Office for National Statistics (ONS).  Interestingly, from 2012 to date this has shown that "Reported personal well-being has improved every year since financial year ending 2012 when data were first collected, suggesting that an increasing number of people in the UK are feeling positive about their lives".

On alternative measures, there is an argument that productivity at a national level should focus on outcomes, and that these outcomes are not necessarily quantifiable in market terms.  For example, perhaps what we should really be exploring is how many people produce what quality of life.  This could lead to looking at output (quality of life) to inputs (working population).  Some might argue this could encourage near-term unemployment to produce a better ratio, though over the long-term this might be a better approach.  Without doubt such measures are harder to make than just relying on economic statistics, but they might be much more meaningful and useful in determining the way ahead.
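As a purely hypothetical sketch, such an outcome-based ratio could divide a composite quality-of-life index, aggregated over the whole population, by the working population.  The index, its scale, and the figures below are all illustrative assumptions:

```python
# Hypothetical outcome-based 'productivity' ratio: quality of life
# produced per member of the working population. The well-being index
# (on a 0-10 scale) and the population figures are illustrative assumptions.

def wellbeing_productivity(wellbeing_index, population, working_population):
    total_wellbeing = wellbeing_index * population
    return total_wellbeing / working_population

# e.g. an average well-being index of 7.5 across 66m people,
# with 32m of them in work
print(wellbeing_productivity(7.5, 66_000_000, 32_000_000))
```

As noted above, such a ratio could perversely improve if the working population shrank, so it would need interpreting alongside employment measures rather than in isolation.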

Another point worth making is that we tend to undervalue lower anxiety, lower volatility, and perhaps security and safety.  This is explored in some depth in “Risk, Equality And Opportunity - The Roles For Government Finance”, but the basic point is to recognise the role of government and society in being a "guarantor" for individuals and families.

Some thoughts for discussion:

  1. We must first recognise how imperfect productivity measures are, and urge caution against over-zealous application, particularly in PBS, where some of the arguments may be circular;
  2. We could encourage more examination and improvement of productivity measurement methodologies.  One suggestion might be a comparative advantage trade analysis using Monte Carlo simulation to explore the ‘gap’;
  3. We could encourage the exploration of alternative measures focused on outcomes.  This might mean exploring happiness economics and human development indices in preference to GDP-based measures.
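The Monte Carlo suggestion in point 2 might begin as simply as the following toy simulation: if output and hours are each observed with modest multiplicative noise, how wide is the spread of a measured productivity gap between two economies?  All parameters here are illustrative assumptions, not estimates of real measurement error.

```python
import random

# Toy Monte Carlo: a 'true' 15% productivity gap between economies A and B,
# with output and hours each observed with 5% multiplicative noise.
# All parameters are illustrative assumptions.

def simulated_gaps(true_a=1.00, true_b=1.15, noise=0.05,
                   trials=10_000, seed=42):
    rng = random.Random(seed)
    gaps = []
    for _ in range(trials):
        # measured productivity = (output * error) / (hours * error)
        measured_a = true_a * rng.gauss(1, noise) / rng.gauss(1, noise)
        measured_b = true_b * rng.gauss(1, noise) / rng.gauss(1, noise)
        gaps.append(measured_b / measured_a - 1)
    return gaps

gaps = sorted(simulated_gaps())
# The 5th-95th percentile band shows how far a measured gap can stray
# from the 'true' 15% through measurement noise alone.
print(gaps[len(gaps) // 20], gaps[-len(gaps) // 20])
```

Even this crude sketch illustrates the point of the section: with plausible noise in both numerator and denominator, the measured gap can stray a long way from the underlying one.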


Published: Thursday, 20 July 2017 17:01

Global trade is now growing more slowly than economic growth, reversing a long-term trend. The UK Trade Secretary, Liam Fox, recently argued that the UK post Brexit can champion Free Trade and fight the rising tide of protectionism. In a recent speech, he described a “post geography world”. Michael Mainelli, in his post, has looked at some of the paradoxes around the productivity puzzle.

I would argue that we are faced with a similar challenge in measuring Trade flows. What we measure may not be capturing all that is happening. At the same time, companies’ response to climate change, reducing their carbon footprints over time and developments in Information Technology already have and look likely to have a significant impact in shaping the future of trade.

I would like to explore here a few of the trends that need to be understood to test if the old relationship between trade flows and economic growth can or will continue into the foreseeable future.

First is the impact of digitisation. A few years ago, music and film were traded on physical discs. When these crossed national boundaries, tax could be collected and trade measured relatively easily. Now much music, TV and film is sold on a download or streaming basis across national boundaries. The new business models include subscription or advertising-backed access. Ten years ago, if I imported, say, 10 CDs and 2 DVDs a month, it would be easy to count the trade. Now, what is it that can be counted? Recorded music, TV and film were sold as products, but are now increasingly sold as services. Whereas 10 years ago I would have known where my “content” was coming from, today I really have no idea. Personally, I am probably consuming more music and video than I did 10 years ago, but is that captured in the statistics?

Although not exclusively confined to the digital world, developments in the treatment of intellectual property are also impacting on the world of commerce. The IPR involved in my disc transferred with the transfer of the physical good. Today, a multinational may have a subsidiary, registered in Luxembourg for instance, which holds the IPR for Europe and licenses it to different countries. The flow of income from IPR can more readily be separated from the physical good sold. Starbucks, for example, is one organisation that has taken this approach.

Greetings cards are another example. The Jacqui Lawson ecard website provides a wide variety of customisable cards for a wide variety of occasions. The business model is a subscription service. Sending an ecard to my daughter in Shanghai is more reliable than the physical mail service and quicker. I probably send more cards than I did 10 years ago, but buy far fewer stamps. Similarly, I took over 3000 digital photos last year, but only printed a dozen or so.

The complex supply chains of many industries stretch across many countries. After the floods in Thailand in 2011, the computer and auto supply chains were disrupted globally for some months.

One organisation I was working with at the time was considering how to reduce its carbon footprint over time. One of their scenarios suggested that their supply chains needed to be more resilient to climate change and used the Thai floods to rethink their approach to offshoring, bringing more production both home and to neighbouring economies.

As I understand it, there would appear to be no change to economic activity, but this would be measured as less trade.

Developments in 3D printing also create the potential to change the flow of trade in the manufacturing sector. Let me describe one possible cameo.

In 2030, you go to a car showroom where you are able to customise a car and then experience your customised car in virtual reality to ensure that you are happy with what you are about to order. Car components are then sourced and, instead of being made in a large factory, are delivered to a local plant. Batteries and raw materials are delivered, but the chassis is 3D printed and the car assembled close to the point of purchase. Instead of trade in finished products, what could happen is an increase in logistics investment in components and distributed production. Again, the design and software to drive the robotics could be centralised while supporting local manufacture.

Friends closer to the sector than I am believe that this will be technically and economically feasible in the early years of the 2020s. This customisation and tailoring of products may well spread to many sectors. In health, the 3D printing of kneecaps, hip replacements, teeth and even skin is at varying stages of development. One example I saw as a proof of concept was bespoke tableware, matching colour and design to a buyer’s desires.

Trade in services may also be impacted by these trends. For example, I have spoken at a number of Conferences in different countries, from my home or office, via video link. On one occasion, I spoke in two different continents on the same day 6 time zones apart. Of course, this means that I didn’t use either a hotel or an airline on that occasion. I will confess that if the Conference is in a nice location, I still prefer to travel, but the flexibility is very useful at times.

At one Conference, a doctor spoke of how he had developed a range of internet enabled medical instruments and was using them for remote consultations. He described a lecture tour of Japan where he was still able to run his California Clinic.

With these and many other possibilities already impacting on commerce and likely to grow over the coming decades, it would be very surprising if these changes did not impact on trade measurement.

I am not denying that there is a growing conflict between free trade and protectionism globally. However, given the changes in business models, the accounting for IPR, digitisation, and the climate change amelioration now under way, I would caution that the assumption that all of the relative decline in global trade is down to protectionism is worth questioning.


Published: Tuesday, 01 August 2017 01:00

 Portrait of Sir George Paish

Everyone knows about the £40m of gold recovered in the 1980s from the wreck of HMS Edinburgh, which was sunk in the Arctic Sea.  Not so well known is that there is at least a hundred times that much gold waiting to be recovered from British merchant ships that were ferrying gold about during the two world wars.

The problem is that, unlike for Royal Navy ships, information on the sinking of merchant ships and their cargoes has always been scrappy.  But now the Daily Mail has published an exclusive article (March 18) telling how the City of London marine lawyers Campbell Johnston Clark have painstakingly pieced together a database showing the location of hundreds of war-time wrecks carrying gold, as well as how much and even where on board it was stored.

For treasure seekers this information is comparable to that about sunken Spanish galleons now stored in the great library in Seville, information which makes that library the first port of call for all treasure seekers.  Now a £15m venture to recover gold from a cluster of three ships sunk west of Ireland will start this summer thanks to the new City database.  What is most intriguing about the gold sunk by U-boats is how they were able to target which ships were carrying gold.

An example of this is the “City of Benares”, which in September 1940 left Liverpool carrying hundreds of children being evacuated to America in a convoy of 18 ships.  They all perished.  But why had it been singled out by a U-boat?  It later emerged that it was carrying gold to pay for US munitions.  A similar incident a few days later with another ship suggests the Germans had prior information.  This almost certainly was true during the Great War, and the idea for the modern-day version of a treasure database only began when a partner, while going through some marine wills at the Public Records Office, discovered a letter that had been misfiled.

This letter, dated 1915, was to the Chancellor Reginald McKenna and had been sent by Sir George Paish, who at the time was the government’s senior economic adviser at the Treasury.  It pointed out that the Bank of England was in the habit of advising the market of any sales of gold on the very day that the gold was shipped out of the country.  From this information it was relatively easy for the Germans to deduce from published liner sailing information which ships had sailed that day (it was already known that only certain liners were licensed to carry gold).

Little was ever published about the gold losses.  This was partly because while private gold was insured, Bank of England gold was self-insured and anyway the government did not want the public to know about the loss of so much public money.

What the Daily Mail article did not tell about this 1915 letter is how Sir George Paish had been tipped off about the Bank’s idiocy by his great friend Sir Edgar Speyer, a leading City banker and, like Paish, a personal friend of Lloyd George.  Sir Edgar in pre-Great War days was chairman of UERL (Underground Electric Railways London) from 1906 to 1916 and he put together what was to become London Transport (tube and buses).  He was also a music lover who financed the start of the Proms.  His “reward” was to be most disgracefully pilloried because of his German Jewish origins and he was eventually stripped of his Privy Council membership.

German bankers were so angry at this shameful treatment of one of their kind that they shunned the City of London in the 1920s and it was not until the 1970s, when we joined the Common Market, that Deutsche Bank opened a City branch.

We had the privilege of knowing Sir George Paish (1867-1957) in his later years, when he was still a great enthusiast for Free Trade.  The son of a coachman, he had started work aged 14 at “The Statist” magazine compiling statistics.  This was at a time when there was little understanding of the subject (Florence Nightingale had only recently invented the “pie” chart), nor was it widely understood that statistics could be useful in managing businesses.  As a result, young George rapidly became seen as an “expert”, and his development of railway freight-miles and passenger-miles statistics resulted in the railway companies in both the UK and the USA much improving their efficiency.  Hence the friendship with Sir Edgar of UERL.

In the Edwardian era Paish became seen as one of the leading economists of the age, and he was knighted in 1911.  As it happens, that was the year the Kaiser had wanted to go to war with France, but he put it off when 200 German businessmen implored him to give them a couple of years to sort out their international investments.  Overseas investment at that time was regarded with suspicion by the British public, seen by many as a form of tax-dodging.  But Paish at “The Statist”, along with his friend Francis Hirst at “The Economist”, convinced Lloyd George otherwise.

Paish’s reputation at “The Statist” was such that on one of his frequent trips to America he was called on to advise President Wilson.  He could also be credited with stopping the London Discount Market from declaring insolvency when there was a “run”.  With war looming, Lloyd George persuaded him it was his duty to become a civil servant (an idea he hated) and join the Treasury.  There, in preparation for his retirement in 1915, he took on as his assistant, and prospective successor, a young man from Cambridge, from a very different background to his own: Paish was a coachman’s son who had educated himself at night school.  His assistant’s name was Keynes.

John Heffernan B.Com, Barrister, is the Hon Secretary of the Free Trade League.  A former financial journalist (City Editor, United Newspapers and Yorkshire Post) and proprietor of the City Press (1965-75), John is publisher of The Bulletin, available by annual subscription at £50 p.a.




Published: Monday, 12 June 2017 09:58