Sixto Rios, Doctor Honoris Causa, University of Oviedo in 2000


Sixto Rios was awarded an honorary doctorate by the University of Oviedo on 14 January 2000. We give below a version of the speech he made in reply to this honour:

ADDRESS BY THE DOCTOR:

SIXTO RÍOS GARCÍA

1. Introduction:

Magnificent and Excellent Rector, Dignified Authorities, dear colleagues and students, ladies and gentlemen.

Let my first words of deep gratitude be to the Excellent Rector and Faculty of this admirable University of Oviedo, and to the Department of Statistics and Mathematical Didactics, whose director, my dear and illustrious student Pedro Gil, presented my nomination for Doctor Honoris Causa, an honourable and precious distinction.

This occasion will allow me to resume and reinforce my old and precious scientific collaboration with this brilliant group of researchers from the University of Oviedo, developers of Decision Theory.

In his work Parallel Lives, Plutarch (50-120) says of the mathematical works of Archimedes (287-212 BC):
To these he had by no means devoted himself as work worthy of his serious effort, but most of them were mere accessories of a geometry practised for amusement, since in bygone days King Hiero II had eagerly desired and at last persuaded him to turn his art somewhat from abstract notions to material things, and by applying his philosophy somehow to the needs which make themselves felt, to render it more evident to the common people.
In his reply, Archimedes, one of the most important mathematicians of all time, told his relative and friend Hiero
... that with a given force any given weight can be moved, and emboldened, as we are told, with the strength of his demonstration, he declared that if we would give him another world, and he could go to it, he could move the Earth.
It is not easy to know whether, in his response, the mathematician was using his natural irony to answer the tyrant Hiero with a subtle and paradoxical suggestion, such as a political lesson that might free him from an unstable situation by exchanging it for another, also problematic, but at least more desirable.

Nearly 2000 years later, in 1992, the International Mathematical Union (IMU) declared the year 2000 the "World Year of Mathematics", with the objectives of identifying and making concrete the mathematical challenges of the 21st century, and of proclaiming, as one of the fundamental keys to social development, the driving force of the mathematics needed in our information society.

The International Mathematical Union intends with this declaration to promote the knowledge and use of mathematics, a fundamental pillar of our culture, not only because it is the natural language of science but also because it is necessary for understanding the complex world in which we live. Subsequently, in 1997, UNESCO agreed to sponsor the idea of considering the year 2000 as the World Year of Mathematics.

Spain has created a Spanish Committee for the World Year of Mathematics (CEAMM 2000), made up of personalities representing various societies and sectors of mathematics, to meet the objectives set by the IMU and UNESCO, focusing on two objectives of special importance:
A) to bring Mathematics closer to society and
B) to promote an adequate mathematical education for the whole population.
It is often said that this lengthy process of growth and penetration of mathematics into culture was visibly reinforced with the appearance of computers in the 1950s, when the mathematician, until then a rare plant in society, became the governing mind of the processes of modelling and simulation, which allow one to turn an opaque problem of reality into a clear mathematical problem whose solution yields knowledge about the original problem.

With this in mind, we consider it appropriate, in this year 2000 which UNESCO has chosen to emphasize the influence of mathematics on current cultural, scientific and social development, to reiterate some statements about the basic processes of modelling and simulation as an obligatory phase, absolutely necessary in the current teaching of mathematics at all levels of the educational system.

It is now considered that the traditional way of teaching mathematics as a finished construction, which starts from axioms and proves lemmas, theorems and corollaries in an endlessly boring succession (for 95% of students), ending with exercises that oscillate between the trivial and the ingenious but are always as far from real applications as the theory so starkly expounded, is totally inadequate and distorting for those who aspire to know the current realities of our activity, such as anyone who tries to apply mathematics, whether called physicist, economist, biologist or applied mathematician. The problems of reality are never presented like the end-of-chapter exercises of a traditional mathematics book, which begin with phrases such as: "Prove that in every isosceles triangle ...", "Prove that the sum ...". They are, on the contrary, of the type: "How will a population of 100,000 trout evolve when placed in a nursery with a certain amount of food ...?", "How can we find the relationship between tobacco consumption and ...?"

If economists, engineers, biologists, mathematicians, ... are to be prepared for the modelling of these complex problems, in which an irrelevant mass of information obscures the central objective and in which the skill consists precisely in highlighting that objective and selecting the necessary information, their training must clearly differ from the traditional one of mathematics classes directed at understanding abstract concepts, proving theorems and solving equations, very convenient for the formation of pure mathematicians (who will always be a very low proportion of the student population), but insufficient and inadequate for those who are going to apply mathematics, and for the mathematical formation of the average person. It is painful to recall the case of the twelve-year-old boy, a superficial connoisseur of a few definitions of set theory, who, when presented with a real problem that his mother must solve daily, asks whether he should add or multiply, precisely because he has not been taught how to model the real problem. It is generally accepted today that if, from elementary education onwards, mathematical knowledge were aimed at solving problems in close connection with daily life, the human sciences, physics, ..., and at educating the student in the modelling of such problems, people would have a conception of the need, interest and power of mathematics far superior to the one they have today; and, for example, managers of companies and businesses who had received such training would more easily understand what kind of help mathematics can offer in solving their problems, making possible important advances in the relations between university, industry and society, which would be of great value for our progress.
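
To make the contrast concrete, here is a minimal sketch of how the trout problem quoted above might be modelled; the logistic form, the carrying capacity (standing in for the limited food supply) and all numerical values are our own hypothetical assumptions, not part of the speech.

```python
# Toy model of the trout problem: discrete logistic growth, with the limited
# food supply represented by a carrying capacity k.  All numbers are invented.

def simulate_trout(p0=100_000, r=0.4, k=250_000, years=10):
    """Discrete logistic growth: p[t+1] = p[t] + r * p[t] * (1 - p[t] / k)."""
    population = [float(p0)]
    for _ in range(years):
        p = population[-1]
        population.append(p + r * p * (1 - p / k))
    return population

if __name__ == "__main__":
    for year, p in enumerate(simulate_trout()):
        print(f"year {year:2d}: {p:10.0f} trout")
```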

We can say that the human being has an instinct to theorise or model. Our brains allow us to collect and order impressions with which we construct our inner world, with space for philosophical theories, scientific ones and their applications, religious and ethical feelings, ... Bertrand Russell said that "nothing is more practical than a good theory", and hence this transition from reality to the model, and from the model back to the real world, must be converted into a conscious but adequately designed daily activity, from infancy, if we want to contribute to a natural and progressive improvement of the environment, of general culture and of the mathematical knowledge of the 21st-century man in the street.

Faithful to these ideas we shall develop in our investiture speech the theme: Modelling, simulation and decision.

2. Modelling and simulation:

We have reached the threshold of the twenty-first century, in which scientific problems are almost always multidisciplinary and great flexibility is needed to understand scientists from other areas, to make inroads into their fields and to collaborate effectively. For my part I want to confess that, after working for fifteen years (from 1936 to 1950) on functions of a complex variable and the theory of integration of real functions, I jumped, out of professional need, to the calculus of probabilities. I then came into contact with statistics, which was beginning to constitute a unified discipline with the Calculus of Probabilities, thanks especially to the works of Harald Cramér. Only a few years later I became acquainted with the important multidisciplinary applications made by teams of eminent scientists on the military problems of World War II, which soon led, under the name of Operations Research, to the treatment of decision processes in organised systems, etc., that is to say, in large systems involving men, machines, energy, information, uncertainties, ...

Since then, and for many years, my work and that of my students and collaborators has sought to introduce into university teaching, and into the real activities of entrepreneurs and decision makers in different fields, the methodology of Operations Research and decision making, which especially involves system modelling applied to the improvement of production structures.

This, briefly told, is the baggage I accumulated along the way I followed, and which brings me today to the fundamental concepts of Modelling and Simulation and their fruitful and reciprocal application to Decision-Making.

In lower-level beings there are also exchanges of information with their external environment, and this is the basis for controlling their behaviour; but here we are going to refer to the highly specialised models using symbols (language, mathematics, logic, probability, ...) that are continually used by man in the study of his higher structures, natural phenomena, and so on. We can say that modelling, aided by computer simulation, is today the basis of all serious scientific work, in all fields of science and technology, and even in the humanities.

As proof of the international recognition of the importance of modelling and its application to the study of important problems of today's society, we can mention the creation, in 1972, of the International Institute for Applied Systems Analysis (I.I.A.S.A.), sponsored and supported by the USA, the USSR, Japan and 12 other European countries (among which Spain is not found, although a Spanish mathematician received the Peccei Prize in 1991). At the I.I.A.S.A., models are being researched, implemented and applied with increasing success in
(1) macroeconomic development,
(2) demand and supply of energy,
(3) regional production in agriculture,
(4) health planning,
(5) urban dynamics, employment, migration, etc.,
(6) technological innovations,
(7) environmental quality control, ...
Also note that in 1989 the Departments of Defense and Energy (USA) specified that the technology of Simulation and Modelling is one of the 22 considered fundamental in the USA (Council on Competitiveness (1989), Vol. 1, No. 6).

And in 1995 the first International Conference on Knowledge Discovery and Data Mining (KDD) took place, which in 1997 began publishing its journal Data Mining and Knowledge Discovery.

Since taking up the Chair of Statistics and Operations Research in 1950, I have understood how important it is for effective teaching to practise simultaneously, and with similar weight, the three activities of teaching, research and consultancy.

The consultations enlivened our teaching with topics that were then topical and awoke the need for research more serious than the usual exercises of traditional classes, thus giving these three activities a deep foundation for students and postgraduates, and enabling these groups to form a school of international level working in Statistics and Operations Research, founded in 1950, which continues today under the names TEST and TOP (even though the Institute that started it was suppressed); having survived for half a century, it guarantees the success achieved by Spaniards in this scientific area.

3. Systems and Models:

A system, or set of elements that act with a global objective, can be studied by direct experimentation with prototypes, but owing to the difficulty and high cost of this approach, the current tendency is the construction of logical-mathematical models in the form of systems of relations or equations that manage to imitate some of the main aspects of the behaviour of the real system. If the model developed in this way is implemented, or even built directly, as a computer program, it is called a simulation model. Such a program allows one to experiment with the system and draw valid conclusions for different problems.

The idea of simulation, under the name of artificial experimentation, has been known and practised for many years by mathematical statisticians. It can be said that it appears already at the birth of the Calculus of Probabilities, when the Chevalier de Méré presented to Pascal frequencies of events in games of dice whose probabilities de Méré had calculated badly; Pascal modelled them correctly, identifying probability with the frequency in a long series of trials, and this led to the first conscious and successful simulation of a random game and, in short, to the creation of the Calculus of Probabilities.
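
As a minimal illustration of this "artificial experimentation" (our own sketch, not part of the speech), one can simulate the two wagers traditionally associated with de Méré, at least one six in four throws of a die, and at least one double six in twenty-four throws of two dice, and compare the long-run frequencies with the exact probabilities given by the model.

```python
# Monte Carlo simulation of the two dice wagers associated with de Méré.
import random

def freq_one_six(trials=100_000):
    """Frequency of at least one six in 4 throws of a die."""
    wins = sum(any(random.randint(1, 6) == 6 for _ in range(4))
               for _ in range(trials))
    return wins / trials

def freq_double_six(trials=100_000):
    """Frequency of at least one double six in 24 throws of two dice."""
    wins = sum(any(random.randint(1, 6) == 6 and random.randint(1, 6) == 6
                   for _ in range(24))
               for _ in range(trials))
    return wins / trials

if __name__ == "__main__":
    print("one six in 4 throws:    ", freq_one_six(),
          " exact:", 1 - (5 / 6) ** 4)       # about 0.5177
    print("double six in 24 throws:", freq_double_six(),
          " exact:", 1 - (35 / 36) ** 24)    # about 0.4914
```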

This is really the origin of the so-called Monte Carlo method introduced by Stanislaw Ulam and John von Neumann, under the stimulus of Enrico Fermi to solve problems arising from the study of atomic energy.

When it is a question of solving neutron transport problems relative to regular regions, the methods of classical integral equations are useful; but for complicated regions such a solution proved difficult, and von Neumann reinvented simulation.

But many games had to be played and the simulations solved quickly, which contributed to the creation of the computer; although an anecdote tells us that in the first test, with a problem proposed by Fermi, von Neumann, with paper, pencil and grey matter, finished before the computer. This brings us to another anecdote from Princeton in the 1950s in which, classifying the mathematical geniuses, they put von Neumann ahead of Gödel and Einstein, placing him "halfway between men and gods."

Once it was established that the Monte Carlo method gives probabilistic approximations requiring a number of observations independent of the dimension of the sets involved, half a century of work has broadened the germinal ideas, leading to the new discipline of random computing, with great prospects in applications. This has contributed to Monte Carlo simulation being considered more than a simple complement to modelling, because of its basic advantages in understanding, implementation, execution, memory requirements, etc.
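
The dimension-independence mentioned above can be seen in a small sketch (ours, not the speech's): the standard error of a Monte Carlo estimate behaves like the standard deviation divided by the square root of the number of samples, whatever the dimension of the integration region. The integrand below is an arbitrary illustrative choice.

```python
# Monte Carlo estimate of the integral of exp(-|x|^2) over the unit hypercube,
# with a standard-error estimate depending on the sample size n, not on dim.
import math
import random

def mc_integral(dim, n=100_000):
    total = total_sq = 0.0
    for _ in range(n):
        x = [random.random() for _ in range(dim)]
        f = math.exp(-sum(xi * xi for xi in x))
        total += f
        total_sq += f * f
    mean = total / n
    std_err = math.sqrt(max(total_sq / n - mean * mean, 0.0) / n)
    return mean, std_err

if __name__ == "__main__":
    for dim in (1, 5, 20):
        est, err = mc_integral(dim)
        print(f"dim {dim:2d}: integral estimate {est:.4f} (std. error {err:.4f})")
```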

4. Techniques and Methodologies:

The modelling task must begin once the researcher has studied the real system sufficiently and understands what the model should serve for. We would almost say that he has the model in his head. This is when the data-analysis stage of the system starts. It comprises identifying the minimum number of variables that allow one to describe the real system adequately, their relations and interactive behaviour, distinguishing the deterministic ones from those that require a treatment of uncertainty, chaos, etc., in order to move to the synthesis stage, whose purpose is to situate the results of the previous stage in an organized logical structure, which is what is called the formalized model of the system. An important issue related to this stage is the validation of the model, which refers to its comparison with the original system, to check whether it is a sufficiently approximate representation of that system.

It is then important to choose the computer and the languages with which to arrive at a simulation program, and to perform the verification step, which aims to detect any mistakes made in moving from the logical structure of the model to the computer program, algorithmic errors, etc.

Finally, the results from the verified model are compared with those of the real system and, if the degree of fit is unacceptable, a new series of steps must be carried out to modify the first model in the light of the new information acquired. After these stages, the model is implemented for explanatory, predictive, control, etc. purposes. The bases of these processes are found in the methods of refutation or falsification of the philosophers Popper, Kuhn, Feyerabend, ..., which are related to the statistical methods of hypothesis testing (Fisher, Neyman, Pearson, ...).
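
As a minimal sketch of this comparison stage (our own illustration with invented data, not part of the speech), the output of a hypothetical simulation model can be confronted with observations of the real system using a two-sample Kolmogorov-Smirnov test; a small p-value signals that the model should be revised.

```python
# Validation sketch: compare simulated output with (invented) observed data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical observations of the real system (e.g. service times, minutes).
observed = rng.gamma(shape=2.0, scale=3.0, size=200)

# Output of the candidate simulation model (here, an exponential model).
simulated = rng.exponential(scale=6.0, size=2000)

result = stats.ks_2samp(observed, simulated)
print(f"KS statistic = {result.statistic:.3f}, p-value = {result.pvalue:.3f}")
if result.pvalue < 0.05:
    print("Significant discrepancy: the model should be modified.")
else:
    print("No significant discrepancy detected at the 5% level.")
```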

But on this path of mathematisation of plausible knowledge which, as Pólya says, embraces the inductive evidence of the physician, the lawyer, the economist, the historian, ..., the historical point of departure is Laplace's rediscovery of Bayes' theorem, which has allowed the construction of inductive logic and of decision theory, in competition with each other on more or less convergent paths, always hoping for a general consensus. But it should not be forgotten that sometimes important problems resist existing methodologies and it is necessary to contribute a great deal of creativity (which cannot be modelled) to arrive at the final success of the model, as we will see next with the creation of the theory of patterns to solve the problem of sensory perception.

5. Pattern Theory:

A series of classifications according to the characteristics of the system (experimental, observable, designable), the basic science involved (hard sciences such as Physics, Chemistry, ..., or soft sciences: Sociology, History, Geography, etc.), the logical type (causal or descriptive), the interpretive type (deterministic, uncertain, chaotic, ...) and the purpose (explanatory, predictive, scenario, optimization) gives rise to a huge variety of possibilities for attacking these problems, and to a methodological doctrine as difficult as living science itself, with whose construction and architecture the modelling process is identified.

There are four fundamental aspects that appear in the modelling of a complex system, and on which it depends whether the model can be properly validated and be useful as a representation of the system: a) the structure of the system, b) the patterns that are integrated in the system, c) uncertainty and d) evaluation of results.

Related to these initial concepts of modelling are others that have become well known over the years, such as knowledge discovery, data mining, artificial intelligence, exploratory data analysis, decision analysis, ..., which constitute, alongside the corresponding computer techniques of simulation, original contributions that could also be considered chapters of Statistics as we understand it, "the oldest of the place", cultivated by classics such as Kepler and Gauss and by others such as Grenander, who introduced the theory of patterns in the 1970s (whose first glimpses are found in the classic book by Hermann Weyl, 'Symmetry' (1952)).

As a new field of experimental mathematics applied to the modelling of problems such as computer-assisted vision, language recognition, neural networks, artificial intelligence, ..., pattern theory could be defined in a comprehensive way, following Mumford, as the analysis of the patterns generated by the world in any form, with all their natural complexity and ambiguity, with the goal of reconstructing the processes, objects and events that occurred in the world and of forecasting these patterns in subsequent situations.

Multiple examples could serve as stimulus and illustration: medical expert systems, which refer to the patterns that occur in symptoms, clinical history and the results of other tests, from which the doctor tries to recognize known processes and diseases in order to diagnose a patient; vision, associated with the analysis of the patterns of electromagnetic signals of certain wavelengths arriving at a point in space from different directions; etc. If one considers, for example, the graph of air pressure over time while a subject speaks, we want to characterize such graphs by their shape, distinguishing the wave types that occur, relating them to the different configurations of the vocal tract of the speaker, identifying them as parts of phonemes, and also trying to describe the qualities of the speaker's voice, etc.

One soon notices the importance of considering the processes of analysis and synthesis starting from the basic elements, which Grenander calls generators (g), regarded as abstract images of the patterns. But the notion of regularity in his theory is quite relaxed, according to the needs of the biological applications, in which it is also necessary to include the connection with the observer through visual and sensory perception ...

More specifically, the theory of patterns constructs sets G of objects g in G between which certain relations are defined. The objects, or generators, are organized in a hierarchy of levels, while the operations are related to a universal algebra defined on the set G, and also to universal partial algebras in which some operations are only partially defined. The elements are subject to certain constraints, so that to complete the formalism we also consider connection relationships between elements g in G using Bayesian, Markovian, ..., graphs.

In addition to Algebra and Analysis, Probability plays a fundamental role in pattern theory: specifically, a probability measure over the space of regular configurations serves to capture the relative frequency of some types and what is called biological variability (e.g. of hands, stomachs, mitochondria, human faces in two and three dimensions, ...).
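
A toy sketch (ours, and far simpler than Grenander's theory) may help fix the ingredients named above: a set G of generators, connection constraints that define which configurations are regular, and a probability measure on the regular configurations, here given by a first-order Markov chain over the generators. All the symbols and probabilities are invented for illustration.

```python
# Toy pattern-theory ingredients: generators, connection constraints and a
# probability measure (a Markov chain) on the regular configurations.
import random

GENERATORS = ["a", "b", "c"]

# Connection constraints: which generator may follow which.
ALLOWED = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a"}}

# Transition probabilities supported only on the allowed connections.
TRANSITIONS = {
    "a": {"b": 0.7, "c": 0.3},
    "b": {"a": 0.5, "c": 0.5},
    "c": {"a": 1.0},
}

def sample_configuration(length=8):
    """Sample a regular configuration (a chain of connected generators)."""
    config = [random.choice(GENERATORS)]
    for _ in range(length - 1):
        probs = TRANSITIONS[config[-1]]
        config.append(random.choices(list(probs), weights=list(probs.values()))[0])
    return config

def is_regular(config):
    """Check that every consecutive pair respects the connection constraints."""
    return all(b in ALLOWED[a] for a, b in zip(config, config[1:]))

if __name__ == "__main__":
    c = sample_configuration()
    print("configuration:", "-".join(c), "| regular:", is_regular(c))
```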

6. The Problem of Perception:

In a generic way it can be said that the perceptual problem presented to an observer is characterized by a) the inferences that the observer makes about the form of the outside world from his observations, and b) the information that such images give about those properties, which is determined by the a priori structure of the world.

The basic idea of Bayesian modelling of the problem of visual perception is to characterize, as a probability distribution, the information about a scene that may be in different states, given the data corresponding to the different images. The form of the posterior distribution is produced partly by the process of image formation and partly by the statistical structure of the world. Bayes' theorem then provides the fundamental tool for calculating the posterior distribution from these two factors, the reconstruction of the state of the world finally being taken as the value that maximizes this posterior distribution. This is essentially the basic scheme of the approach to the important problem of perception, whose solution so much interests humanity. There is another approach to these problems, closely related to the previous one, due to Rissanen (1993), who uses information theory, which, according to its author, leads automatically, without a priori knowledge of the world, to the same variables on which the Bayesian theory is based. It is designated as "reconstruction of the world with minimal description length."
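
A minimal sketch of the Bayesian scheme described above may make it concrete; the states, prior and likelihood below are all invented for illustration and are not taken from the speech.

```python
# Bayesian reconstruction sketch: posterior(state) is proportional to
# likelihood(image | state) * prior(state); the reconstruction is the mode.

# Prior: the statistical structure of the world (hypothetical values).
prior = {"convex surface": 0.6, "concave surface": 0.3, "flat surface": 0.1}

# Likelihood: probability of the observed image data given each state
# (the image-formation model, also hypothetical).
likelihood = {"convex surface": 0.2, "concave surface": 0.7, "flat surface": 0.4}

# Posterior by Bayes' theorem.
unnormalised = {s: likelihood[s] * prior[s] for s in prior}
evidence = sum(unnormalised.values())
posterior = {s: p / evidence for s, p in unnormalised.items()}

# Reconstruction of the state of the world: the value maximizing the posterior.
reconstruction = max(posterior, key=posterior.get)
print("posterior:", {s: round(p, 3) for s, p in posterior.items()})
print("reconstruction:", reconstruction)
```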

The author points out that, in contrast to the Bayesian approach, his does not require a priori knowledge of the physics, chemistry, biology, sociology, ... of the world system that is used to establish the model.

And this justifies our understanding of Mumford's opinion: "In short, I believe that pattern theory contains the germ of a universal theory of thought, one which stands in opposition to the accepted analysis of thought in terms of logic. The extraordinary similarity of the structure of all parts of the human cortex to each other and of human cortex with the cortex of the most primitive mammals suggests that a relatively simple universal principle governs its operation, even in complex processes like language."

This final example should be underlined to emphasize that the new approach to perception as Bayesian inference, the historical origin of the great theory of decision, has had a very important influence on the cultivators of this difficult subject; by capturing the common essence of the different known approaches to it, it allows the expected solution to be approached within a scientific consensus that makes it more plausible.

As a brief historical reference to the problem of visual perception we can say that, at the end of the eighties, the computational approach to visual perception proposed by Marr (Vision, 1982) was considered well established.

But despite the advances in robotics and prosthetics in the 1980s and 1990s in the field of perception, there remained a huge gap between the new artificial systems and biological systems. The inductive intelligence and knowledge found in biological systems, which allow them to operate in a wide range of possible situations, had not been achieved.

But in the course of writing our speech, today 4-12-99, we read in a newspaper, with natural excitement, that the American composer, singer and pianist Stevie Wonder, 49, who lost his sight a few hours after birth, is going to try to recover it thanks to a revolutionary chip developed by the Wilmer Eye Institute of Johns Hopkins University (USA).

Wonder had announced that he would submit to the operation at a Detroit church, perhaps as a message of recognition of the possible communion of scientific spirit and religious sentiment. At least this was confirmed by the applause with which the 400 faithful who attended the event, people of the most varied ways of thinking and working, underscored the end of the happy intervention.

This recent news is worth recording so that we may look with optimism on the culmination, in our time, of this succession of valid inductive relations:

Real events → Modelling → Science → Technology → Spiritual and material human well-being → Social well-being

Hopefully, all modern cultures believe in this scheme.

Thank you very much.

Last Updated November 2017