Do Australians care about science?

Well, Canadians do (apparently); and they care more than just about anyone else. See the release posted today at http://cnw.ca/He1J3, reporting on a study commissioned by the Council of Canadian Academies.

The Council describes itself as “an independent, not-for-profit organization that began operation in 2005. [It] supports evidence-based, expert assessments to inform public policy development in Canada. Assessments are conducted by independent, multidisciplinary panels of experts from across Canada and abroad. [Its] blue-ribbon panels serve free of charge and many are Fellows of the Council’s Member Academies: the Royal Society of Canada; the Canadian Academy of Engineering; and the Canadian Academy of Health Sciences. The Council’s vision is to be a trusted voice for science in the public interest.”

The manner of summarising bears some examination, but that aside, the message it conveys is worthy of reflection, given that its Australian counterpart has this month embarked on a new exercise to measure the dollar contribution of the core sciences to the Australian economy.

The Australian Chief Scientist has asked the Australian Academy of Science to devise an Australian version of a recently released report on the contribution of the Mathematical Sciences to the UK economy (conducted for the Royal Society by the accounting firm Deloitte). The Academy has in turn commissioned the Centre for International Economics to undertake this assignment for the core Australian science disciplines – interpreted as mathematics, physics, chemistry and earth sciences.

It is the inverse of the question posed in Canada: not whether people rank science as important in their world view, and whether they are equipped to understand the scientific ramifications in public policy, but rather how much of our present prosperity we owe to ‘new’ science, and hence the value of science as an activity or infrastructure in the policy field.

In other words, it suggests why the public should not be unconcerned about science done in its name, nor support activities that impinge on social cohesion and the ordinary enjoyment of life; and why disinvestment in science as a nationally recognised activity – in secondary and tertiary education, in public and private research – may be more than a little dangerous.

The most direct way to make this argument is to map current economic activity against its reliance on a science base. The absence of that base can then be seen to come at a measurable cost, one that can be weighed against other calls on government.

It could well be that Australians have unknowingly been enjoying the benefits of past investment in science, but have been excluded from, or left behind in, the scientific literacy needed to secure that investment, leaving us exposed and languishing – as the Canadian report, by extension, indicates.

And what about statistics?

Should not our national agencies be monitoring not only the financial and material investment in science – the knowledge economy – but the human investment as well? The value – moral as much as ethical or monetary – of knowing about the world and our place in it. How can people and their representatives be aware of the role of science in their lives, and of the role of scientific evidence in decisions, without an objective official statistical frame?

The Office of the Chief Scientist may stand in for a “Council of Academies” that can supply briefs to the political process on the implications of foundational research (and on why this is critical to the future, industrial and social). Certainly the present initiative could be said to accord with the role that the CCA has mapped for itself in Canada. Neither, however, will be effective without expert guidance in the planning, collection, extraction, accumulation, interpretation, presentation and use of official statistics.

But official statistics is here to be interpreted more broadly than usual: agency collections are clearly hostage to a conservative interpretation, as set out in legislation and circumscribed by a hierarchy of users and a diminishing budget.

What is needed is official statistics embraced as an area of expertise, of objective and constructive advice, working with public-interest organisations in parallel with the portfolio responsibilities of government, but not limited to priorities set by government; instead addressing the domain of public policy, in the sense used by the CCA.

Furthermore, representatives of the discipline of official statistics can act as (and be seen as) a ‘disinterested party’ alongside the core professions, the institutions of organised science, and the enthusiastic advocacy of citizens for policy enlightened by good research, whether local or global, and not coloured by the wishful thinking or distorted lenses applied to partial data that are typical under the constraints of public advocacy.

Your ideas?

Stephen Horn       


The omics of Official Statistics

Professor Terry Speed’s AMSI-SSAI Lecture today at the Knibbs theatre provokes the following reflection.

Nuisances crowd out the signal – this is as true in genomics (or any of the bioinformatic omics spawned from it – proteomics, metabolomics, transcriptomics) as it is in modern official statistics, handmaiden to policy and socio-econometric modelling.

Nuisance, however, deserves attention. In an ideal world, all data provided in statistical returns is simultaneously correct and perfectly recorded and transmitted. Furthermore, the design of this ideal collection is itself perfect: the data collected is sufficient to answer the questions posed by users in their collectivity, without altering the inclination of respondents to cooperate or altering their behaviour in so doing. That is, the measurement process is dimensionless.

No one pretends that these conditions hold, or even approximately hold.

Instead, the data resulting from the collection effort is conditioned by a quality framework that allows nuisance to recede into the background. Official releases thus come with two crutches: formal rules of population inference – what can be inferred, its accuracy (centring on a true value) and its precision (the width of the interval around an estimate containing the true value with a certain confidence); and adherence to the nuisance-containing practices embodied in the collection operation.
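To fix ideas, the first crutch can be written in standard textbook notation (a generic formulation, not any particular agency’s): for an estimate \(\hat{\theta}\) of a true population value \(\theta\),

\[
\text{accuracy:}\quad \mathrm{E}[\hat{\theta}] - \theta \approx 0,
\qquad
\text{precision:}\quad \Pr\!\left(\theta \in \hat{\theta} \pm z_{1-\alpha/2}\,\widehat{\mathrm{se}}(\hat{\theta})\right) = 1-\alpha,
\]

with the expectation and probability taken over repeated application of the collection design.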

These practices comprise the design. And this explains why official statistics is stubbornly design-based, even as statistics proper has struck out into the protean world of model building and model-based inference.
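As a minimal sketch of what ‘design-based’ means in practice (the frame, size measure and inclusion probabilities below are invented purely for illustration, and no agency’s actual procedure is implied), the Horvitz-Thompson estimator weights each observed value by the inverse of its design probability of selection:

```python
# Minimal design-based estimation sketch: a Horvitz-Thompson total under
# Poisson sampling. The population, size measure and inclusion probabilities
# are invented purely for illustration; no agency's real design is implied.
import numpy as np

rng = np.random.default_rng(1)

N = 10_000
population = rng.gamma(shape=2.0, scale=30.0, size=N)    # e.g. unit "turnover"

# Unequal-probability design: larger units are more likely to be selected,
# with an expected sample size of about 500.
size_measure = population + 1.0
pi = 500 * size_measure / size_measure.sum()              # inclusion probabilities
sampled = rng.random(N) < pi                              # Poisson sampling

# Horvitz-Thompson estimator of the population total: weight each sampled
# value by the inverse of its inclusion probability.
ht_total = np.sum(population[sampled] / pi[sampled])

print(f"true total   {population.sum():,.0f}")
print(f"HT estimate  {ht_total:,.0f}")
```

The inference here leans on the randomisation built into the design itself, not on a model for the values; that is the sense in which the practices of the collection, rather than a statistical model, carry the assurance.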

Both model-based and design-based approaches have been compromised by nuisance effects, despite loud and repeated appeals to ‘scientific method’ or ‘quality assurance’ respectively. In the one case, data richness (and sample size) and spurious replicability have obscured the real limitations of data acquisition; in the other, the drag induced by quality assurance has demanded a stability in underlying processes that has patently been compromised in an external context of open data borders.

Can the negative control methods so elegantly applied in bioinformatics save official statistics too? Or rather, if we take nuisance more seriously, may we be inspired to find a more solid platform for the presentation of statistics used in public discourse?
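By way of a toy sketch only (in the spirit of negative-control, remove-unwanted-variation adjustments, and not a reproduction of Professor Speed’s published methods), one can estimate nuisance factors from series assumed to carry no signal of interest and regress them out of the rest:

```python
# Toy negative-control adjustment, in the spirit of remove-unwanted-variation
# (RUV-type) ideas. The data, the choice of controls and the number of factors
# k are all invented for illustration; this is not a published algorithm.
import numpy as np

rng = np.random.default_rng(0)
n_units, n_series = 50, 20

# Simulated "signal" loads on the first 15 series only; the last 5 series
# are our assumed negative controls (nuisance only, no signal).
signal_loadings = np.zeros((1, n_series))
signal_loadings[0, :15] = rng.normal(size=15)
signal = rng.normal(size=(n_units, 1)) @ signal_loadings
nuisance = rng.normal(size=(n_units, 2)) @ rng.normal(size=(2, n_series))
Y = signal + nuisance + 0.1 * rng.normal(size=(n_units, n_series))

# Estimate the unwanted factors from the negative-control series via SVD ...
controls = Y[:, 15:]
k = 2
U, _, _ = np.linalg.svd(controls, full_matrices=False)
W = U[:, :k]                      # estimated nuisance factors

# ... then regress those factors out of every series.
Y_adjusted = Y - W @ np.linalg.lstsq(W, Y, rcond=None)[0]

# Residual nuisance before and after adjustment (smaller is better).
print(round(float(np.std(Y - signal)), 2), round(float(np.std(Y_adjusted - signal)), 2))
```

The adjustment stands or falls on the assumption that the controls really are signal-free; that is the analogue, in this toy, of the metastatistical judgement an official agency has to make.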

The issue can be restated slightly differently: how to extract a consistent, reliable and useful signal, bearing on social governance, from a multiplicity of data frames, where the criterion for signal quality (analogous to the deeper scientific truths underpinning bioinformatics, or the statistical investigation of physical or chemical phenomena) is encoded in the legislative ethos of government itself.

This not only allows nuisance but assumes it: the act of reducing an uncontrolled flow to a signal under metastatistical protocols (such as pre-existing or circumstantially imposed indicator series, or standards) is the badge of official statistics, best expressed by appeal to design. Certainly it is possible to improve on theory, most transparently by reviewing how deviations from design (for instance in dealing with overlapping, discordant collections) build a core assurance mechanism.

It happens that the methods put forward by Professor Speed in bioinformatics, and the discordancy-accepting extensions that can be built from the geometric approach to sampling theory in Paul Knottnerus’s text, play similar roles in their respective contexts. In both cases a fresh appraisal of the context in which statistics is applied has led to results with immediate application as well as great generality.

References
Knottnerus, P. (2003), Sample Survey Theory: Some Pythagorean Perspectives, Springer.

Measuring the worth of Mathematics

There is a sense in which every self-directed, extended human activity has an economic value: if nothing else, the opportunity cost of being occupied with one (more or less productive) thing as opposed to other (less or more productive) things. If the time thus spent is directed to some external project, it figures in the ultimate balance of value that stems from the project, however accounted (cost-plus; capital gain; assurance; demand shift; monetarised policy objective; speculative gain…).

Mathematical training equips one for a class of problems and projects that require abstract thinking (or thinking in the abstract): bridging the conceptual gaps in tackling a new domain, or revisiting a well-trodden domain where new parameters or boundaries apply.

Advancing the corpus of mathematical knowledge is (or should be) the standard against which all subsequent application is judged. This is how the subject is taught: abstractions beget abstractions. It is also the hardest work for which to claim monetary value. A lifetime in mathematics does not leave such visible monuments; indeed some of the best mathematicians have led short and ignominious lives, yet their work is as central to the conception of the discipline as any public achievement by a Pasteur in biology, a Fermi in physics or a Davy in chemistry – all of whom can claim to have added, and to continue adding, to economic achievement.

It is necessary to show that the derivative mathematics most of us acquire in our school years is qualitatively different from mathematics as practised in and of itself. It might equally be said that conversance and fluency in the theory of statistics has placed the products of statistical reasoning in the hands of other scientists, indeed of most people working with real, unruly yet tameable data sets. Why then still invest in the discipline?

There is a large element of speculation in any investment in core disciplines, as distinct from support for the governance mechanisms at the core of enterprises, public or private. The existing knowledge base is for many purposes sufficient; its mastery is implied in standard disciplinary training. Managing uncertainty, when expressed at an executive level, reduces to a question of personality: only rarely is it seen as scientific. That scientific authority is contested makes its dismissal easier, and makes the case for investing in the hard disciplines of science tenuous.

Yet it is the creative output of these disciplines – the part most speculative – that yields dividends, that renews the worth of the discipline for the public, and from whence comes its most direct source of authority – external as well as internal.

But is it really such a high-stakes game? A state rests not on force of arms but on its cultural strengths – the well-being of its people; its interpretation of its history and the reconciliation of past and present; its resilience to the uncertainties of nature; and its respect for the process of questioning old knowledge and acquiring new – not as elided into net current productive value, but in another economy: what we need to know collectively about the world in which we are immersed if we are to be truly human.

There is a misconception about science that sees it as universal, as trafficable, as imperial; draftable into one or another enterprise of the state or its proxies in the market. This appears to be a truism, since only such bodies can afford to build the scientific edifice and align forces towards some goal (the eradication of malaria, sending a man to the moon) – as if science could be engineered.

Of course it can, and there are obvious natural alliances when science is providing the knowledge in knowledge-based industry. Unfortunately the power of engineering – encapsulated in the idea of high technology – is too easily mistaken for the standard of worth of the disciplines that have fed it. The culture that allows those disciplines to thrive hinges on a respect for knowledge in the large, as well as for those elements of knowledge that contribute to economic progress.

Economies become vulnerable when they rest only on the marketable, on what works. Things work, or make a profit, or generate jobs and wealth, only up to the limits of our ability to meet the unforeseen. The unforeseen is what is totally external (or seemingly so) – a GFC, a meteor, a war, an eruption; equally, what has not yet been fully observed (unforeseen effects of a treatment), or properly internalised (adverse effects of fertilizer treatment), or what has been manipulated to give a profit at the expense of competing values (sand mining; drilling the reef; mining Antarctica; cross-contour ploughing). In other words, what has been operationalised on market knowledge, not on a forensic analysis of performance or public answerability for the use of privatised knowledge.

The impacts of economic activity should be as accountable as the productive capacity generated, and it is as much an engineering question as a scientific one how to design a process that is tuned to its environment.

This leads back to the core disciplines, founded as they are on human experience and aspirations. By bringing together the transformational goal of the activity (‘adding value’) and its transactional implications, it may be possible to humanise progress to the extent of reducing costs and distributing benefits. That we think about ourselves in this fashion is a constant; the way we do so is as process-tied as progress in the disciplines concerned: advancing by long periods of quiescent mastery and short bursts of creative change.

How then do we measure the health of a discipline like mathematics? One way is by the strength of renewal: the quality of teaching, the export of success, the attraction of collaborators, peer recognition (important in a competitive market for talent). Another is by the breadth and sophistication of application – the passage from discovery to applied problem, and its reverse: public awareness of the role of the discipline, the value of its skills in innovation teams, in quality assurance for industrial processes, in the construction of algorithms and software, and in the spawning of satellite disciplines – analytics, computer programming, genomics, biometry, actuarial science, evaluation, operations research. In each case the core is not questioned, but the application builds the apparatus for understanding the foundational knowledge in the context of solving a problem or feeding a process.

These two pillars separately define the social and economic worth of the discipline – what the discipline stands for – and prevent it from spiralling into debased obscurity, or pseudo-knowledge. They are the foundation for intervention and authority; they will draw the next generation of trained scientists and consumers of science (the public, in government, among the entrepreneurial class). Both celebratory and performative, they are inextricably linked.

A crude model for the economic value deriving from the state’s investment in the core disciplines involves accounting for influence: on students (through direct teaching, textbooks, examination, inspiration, extension); on colleagues (administrative support, collaboration, superstructure); on industrial partners (consultancies, algorithms, software); on the public at large (the cultural element – adding to national coherence and respect for institutions, attracting collaborative agreements, diplomacy); and on government (advice, policy contributions).

Not all of this can be measured from output through to outcome without the use of models or speculation (or both). Yet all of it provides indicators of health, and can be used to detect deficiencies, costs (opportunity costs), inefficiencies and flow-on effects. This overall health, combined with standardised output measures, will identify the value of the discipline and the sources and fluctuations of that value over time.
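As a purely illustrative toy (every channel, indicator and dollar weight below is invented; nothing is calibrated against real data), the crude accounting model sketched above might be laid out like this:

```python
# A toy, purely illustrative sketch of the "crude model" above: an
# influence-accounting table with invented channels, indicators and weights.
# Nothing here is calibrated; it only shows the shape of the accounting.

channels = {
    # channel: (indicator of health, assumed dollar value per unit, annual units)
    "students":            ("graduates taught",         5_000,  2_000),
    "colleagues":          ("collaborations",          20_000,    150),
    "industrial partners": ("consultancies/software",  80_000,     60),
    "public at large":     ("outreach events",          2_000,    300),
    "government":          ("policy briefs",           50_000,     25),
}

total = 0
for channel, (indicator, value_per_unit, units) in channels.items():
    contribution = value_per_unit * units
    total += contribution
    print(f"{channel:20s} {indicator:25s} ${contribution:>12,}")

print(f"{'total (illustrative)':46s} ${total:>12,}")
```

The point of such a table is not the numbers but the discipline it imposes: each channel needs both an indicator of health and an explicit (and contestable) bridge from indicator to value.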

 

Prepared ahead of a two-day meeting of the Academy of Science, in the context of a consultancy on the economic gain from national core science investment.

Useful further reading

Stephan, P. E. (1996), ‘The economics of science’, Journal of Economic Literature, 34(3), 1199–1235. http://www.jstor.org/discover/10.2307/2729500

Dasgupta, Partha and David, Paul A. (1994), ‘Toward a new economics of science’, Research Policy, 23, 487–521.

Nature Outlook, ‘Assessing science: lessons from Australia and New Zealand’, 24 July 2014, Vol. 511, Issue 7510.