Wednesday, November 25, 2009

METEOROLOGY


Pictured: The top of the MetService building in Kelburn on a typically fine and calm day in Wellington!





We probably all take for granted the weather forecasts in our newspapers and on radio and TV, but there is an awful lot of mathematics behind them (or some might say a lot of awful mathematics!). Jim McGregor, the meteorologist at Victoria University, mentioned to me early in the year that he gets students coming along to him saying they’ve long been interested in the weather and interested in studying it. The first question he asks is “How much calculus have you done?” If the answer is “Not much”, he tells them “Sorry, but in that case, it won’t be possible”. And an understanding of some Physics is essential too. Along with derivatives and integrals, logarithmic and trigonometric relationships will also be commonly encountered in any book on meteorology.

The fact that differentiation is involved should come as no surprise to senior secondary students, as we are dealing with things (e.g. temperature) that change with time and position, and with the motion of the air, so velocities and accelerations are involved. To compound matters, the motions are in three dimensions (unlike 1-dimensional kinematics), and they are observed from a rotating reference frame (the Earth), which gives rise to an effective force producing curved paths (the Coriolis effect).

The convention is to use x for the eastward direction, y for the northward and z for the vertical direction, and use u, v and w respectively for the corresponding velocities in those directions.
The equation of motion of air (essentially Acceleration = Force/ mass) in the Eastward direction is
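The equation appeared here as an image in the original post, which has not survived. A standard textbook form of the eastward momentum equation, consistent with the description below (curvature terms involving a on the left; pressure gradient, Coriolis and friction terms on the right), is:

```latex
\frac{du}{dt} + \frac{uw}{a} - \frac{uv\tan\phi}{a}
  = -\frac{1}{\rho}\frac{\partial p}{\partial x}
    + 2\Omega v\sin\phi - 2\Omega w\cos\phi + F_x
```

Here ρ is the air density and F_x the friction (viscous) force per unit mass.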
The first thing to notice, apart from the complexity, is that two of the terms look like derivatives, which is just what they are. The first term on the right is the pressure gradient force, but it looks at just how pressure changes in the x direction, hence a 'partial derivative', written with a curly ∂ rather than a straight d. The Coriolis effect gives rise to the two terms on the right involving the angular velocity of the Earth's rotation, Ω (= 2π radians per day), and the latitude, φ. The two terms on the left involving a (the mean radius of the Earth) are due to the Earth's curvature.
Actually, the equation might have looked even more frightening if the first term on the left side (an acceleration term) and the last term on the right (the friction, or viscous, force) had been written out with their individual components:
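Again, the expanded form was an image in the original post; written out in the standard way, the total (material) derivative and the viscous term (with ν the kinematic viscosity) become:

```latex
\frac{du}{dt} = \frac{\partial u}{\partial t}
  + u\frac{\partial u}{\partial x}
  + v\frac{\partial u}{\partial y}
  + w\frac{\partial u}{\partial z},
\qquad
F_x = \nu\left(\frac{\partial^2 u}{\partial x^2}
  + \frac{\partial^2 u}{\partial y^2}
  + \frac{\partial^2 u}{\partial z^2}\right)
```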
And we can see there that second derivatives are involved in the viscous force, although as it happens this force is very small compared with the other forces.

Now, this complex differential equation for the eastward component of momentum is just one of six linked equations that form a mathematical model of our large-scale weather system. Advances in computing power over recent years have made it possible to solve these equations numerically, but it requires the assimilation of a large set of data on the atmosphere at different locations, from satellites, planes, balloons, ships and land weather stations.

Global forecasts are produced every 12 hours using supercomputers at three locations, two in England and one in Washington. With modern computing power, it is now possible to forecast up to 7 days ahead. These forecasts are sent around the world, and as soon as they receive them, our local forecasters study them carefully, marking important features on them. But even with the advanced computing power, the global forecasts are limited in showing local weather: the numerical solution of the global model works with a 60 km grid size, which cannot take local topography into account. Modellers at MetService produce a finer-resolution model, with a 12 km grid size, using values from the global model as input at the boundary of the region. And they may soon move to an 8 km grid.


To illustrate, two maps are shown: the Tasman map and, below it, the NZ region map, which is based on the 12 km grid resolution in the modelling. Note the wind barbs giving greater detail over the country; each stroke on the tail of a wind barb represents 10 knots. The finer-scale model does not try to forecast as far out as the Tasman one, which gives a 7-day rain and wind forecast.


These maps can be seen on the MetService website (http://www.metservice.co.nz/) by clicking on Maps & Rain Radar, then selecting the rain forecast maps for 3 and 7 days.


(to be continued, but after having had to delete a previous draft after a glitch on uploading photos and redo this, I'll quit while I'm ahead!)

Saturday, October 31, 2009

Integration and Insulin


Chris Hann is excited to have used mathematics to play a part in saving lives. At Christchurch Hospital's Intensive Care Unit, a simple practical device developed at Canterbury University's Department of Engineering is being used to help decide how much insulin and food to give patients to stabilise them.


As a result of the high levels of stress, and the action of the hormones produced, critically ill patients often experience hyperglycemia (high blood sugar) and increased insulin resistance. This can lead to further complications, such as organ failure and severe infection. So it is important to control the glucose levels by insulin infusion and a suitable food intake regime.


Various models for glucose-insulin kinetics have been investigated over the years. A model sufficient for this situation can be given by three first order differential equations. G(t) is the concentration of glucose above equilibrium (GE) levels, I(t) is the concentration of insulin (above basal levels) in the blood plasma, and Q(t) is the concentration of interstitial insulin, in the fluid that surrounds the cells - it is this that enables the cells to take up glucose.



The insulin infusion rate u(t) is known.


Notice that the equation for the rate of change of glucose concentration has three terms, the first two giving the decrease in glucose due to insulin (basal plasma insulin and interstitial) and the third giving the increase due to feeding, P(t). The key parameters in the process are the insulin sensitivity, SI, and the glucose fractional clearance, pG, which are patient-specific and vary with time. By estimating the other, less important parameters from typical population values, reformulating the model in terms of integrals and then applying numerical integration to the measured data (blood glucose levels are taken from patients regularly), a set of equations can be solved for the key parameters over different time periods.
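The integral idea can be sketched in a few lines of code. To be clear, this is not the hospital's actual model or software: it is a toy version with assumed dynamics dG/dt = -pG·G - SI·G·Q + P(t) and made-up numbers. Integrating both sides over two separate time intervals turns the unknown parameters into the solution of a 2 × 2 linear system, with the integrals computed numerically from the sampled data:

```python
import numpy as np

def trapezoid(y, x):
    """Numerical integral of sampled data by the trapezium rule."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def fit_pg_si(t, G, Q, P):
    """Recover (pG, SI) from sampled data for the toy model
    dG/dt = -pG*G - SI*G*Q + P(t).
    Integrating over [ta, tb] gives
    G(tb) - G(ta) = -pG * int(G) - SI * int(G*Q) + int(P),
    so two intervals give two linear equations in (pG, SI)."""
    n = len(t) // 2
    A, b = [], []
    for lo, hi in [(0, n), (n, len(t) - 1)]:
        s = slice(lo, hi + 1)
        A.append([-trapezoid(G[s], t[s]), -trapezoid(G[s] * Q[s], t[s])])
        b.append(G[hi] - G[lo] - trapezoid(P[s], t[s]))
    return np.linalg.solve(np.array(A), np.array(b))

# Synthetic 'patient': simulate with known pG = 0.02, SI = 0.5, then refit.
dt, steps = 0.001, 4000
t = np.arange(steps + 1) * dt
Q = 0.1 + 0.05 * np.sin(2 * t)      # interstitial insulin (assumed known)
P = np.full_like(t, 0.5)            # feeding input
G = np.empty_like(t)
G[0] = 10.0
for k in range(steps):              # Euler integration of the toy model
    dG = -0.02 * G[k] - 0.5 * G[k] * Q[k] + P[k]
    G[k + 1] = G[k] + dt * dG

pG_est, SI_est = fit_pg_si(t, G, Q, P)
```

With densely sampled synthetic data the estimates land very close to the true values; the appeal of the integral formulation is that it avoids differentiating noisy measurements and stays computationally cheap.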


Having worked out a method that is computationally efficient, the model can now be tested with virtual patients, enabling a large number of simulations to be carried out, and refinement of the model. Chris Hann explains that there are more complex models of the glucose-insulin system that could be used, but nothing would be gained from using them in this application, where we are looking at glucose levels on an hourly (or greater) time scale. A detailed and physiologically correct insulin model is only useful for much smaller scales. "We always look to create the simplest possible model for the given application."


Extensive testing showed close agreement between the model predictions and clinical data. The final outcome of the work was the development of the 'Insulin Wheel' and 'Feed Wheel', which give protocols for administering insulin and food on the basis of the measured blood glucose levels and the change from the previous hour. Their use has led to a significant decline in the mortality rate in the Intensive Care Unit.

Work is now underway on patient-specific modelling of the cardiovascular system in critical care.

My thanks to Dr Chris Hann from Canterbury University. Meeting mathematicians who are enthusiastic about the work they are doing is one of the great pleasures of my fellowship year.

Friday, October 30, 2009

Breast Cancer Detection

The Mechanical Engineering Department at Canterbury University has been developing a new method for detecting breast cancer, which has presented interesting and challenging maths problems. The method has been named DIET - Digital Image-based ElastoTomography.

The rationale for the project is this: Incidence of breast cancer among women in NZ is high, and early detection significantly increases the chances of successful treatment. But the current standard method of breast cancer detection, using a mammography machine, is not only uncomfortable for women (I have sisters who confirmed that for me!) but involves large and expensive equipment. Screening methods which are less invasive (don't involve radiation) and equipment which is more portable might increase the uptake of screening.

The concept: Cancerous tissue is stiffer than normal tissue. By subjecting the breast to a rapid fine vibration, it should be possible to detect any region of stiffer tissue from the way the surface moves in response to the vibration.

Five cameras are positioned around the breast to take synchronised images while it is vibrating. (The frequency of vibrations is 20 - 50 Hz and amplitude 1 mm.) Now there is the challenging geometry problem of reconstructing a 3-D image of the breast from the digital images. Once the 3-D motion has been reconstructed comes the problem of using the properties of elastic materials to locate the position and size of any regions of greater stiffness.


Richard Brown, now in the Canterbury University Mathematics and Statistics Dept, was working on this projective geometry problem for his PhD in Engineering. Richard is pictured with Thomas Lotz, who is currently on the team working on the DIET project. Thomas (left) is holding a test silicone 'breast' while Richard (right) is holding a calibration cube, which is used to locate the exact position of the cameras before the test is carried out.



My thanks to Richard and Thomas for introducing me to this work when I was visiting Canterbury University in September.

Sunday, October 25, 2009

Optimisation in Radiotherapy


I first discovered this surprising application of linear programming while looking through annual conference papers of the NZ Operations Research Society (ORSNZ) and have since had the opportunity of talking about the problem with Matthias Ehrgott from Auckland University's Dept of Engineering Science. It is a good example of the way new problems are being presented to mathematicians to work on in this increasingly technological world.


In early external beam radiotherapy treatments, a whole area of the body would be irradiated equally by the beam. Obviously, the maximum radiation dose that could be given to a tumour site was restricted by the tolerance and sensitivity of the nearby healthy tissues. An improvement was to create a filter in front of the beam for each patient so as to shape the beam to the tumour. The development of a device called a multileaf collimator (MLC), with moveable 5 mm thick blades (leaves) of lead/tungsten that can block off sections of the beam, together with improvements in imaging techniques, has led to 3-dimensional conformal radiotherapy, where shaped radiation beams are aimed from several angles of exposure to intersect at the tumour, providing a much larger absorbed dose there than in the surrounding healthy tissue.

Intensity-Modulated Radiation Therapy (IMRT) is a refinement of conformal radiotherapy in which the intensity of each beam is controlled, or modulated, within the given area using the MLC. Use of IMRT is growing for more complicated body sites, such as the neck and prostate. Auckland has had this high-tech equipment for two or three years.


(Pictured right: multi-leaf collimator)



IMRT in brief:
•Beams (photon/electron) produced by the linear accelerator are focused on the tumour from different directions (3 to 9)
•The intensity across each beam can be modulated by the multi-leaf collimator (effectively, each beam is divided into a large number of beamlets, or 'bixels')
•Aim:
Focus radiation so that enough dose is delivered to tumour (unlike normal cells, cancerous cells with damaged DNA can’t reproduce) while limiting dose to critical organs and healthy tissue.


The 3 optimisation problems:


1. The geometry problem: What angles (beam directions) should be used?


2. Finding optimal beam intensities for each angle


3. Optimising the delivery schedule


The beam intensity problem can be formulated as a very large linear programme:

Multi-Objective function:
One objective each for the tumour, the critical organ(s) and normal tissue (minimise underdosing of the tumour while minimising overdosing of healthy organs and other tissues)
Variables:
Let xi be the intensity at bixel i
(the MLC can have 40 leaves and 40 stops, so up to 40 × 40 = 1600 variables for each beam direction)
Constraints:
The region of the body is divided into 3-D volume elements (voxels), and there is one constraint for each voxel, given by the dose levels of the oncologist's prescription
(order of 100,000 constraints)
Note: The optimisation computes a set of possible treatment plans for which less overdosing of healthy organs implies more underdosing of the tumour and vice versa. It assists the planner to select one such plan that is best for the patient.
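To make the formulation concrete, here is a toy instance of my own (made-up numbers: three bixels, two tumour voxels, one critical-organ voxel and one normal-tissue voxel), solved with SciPy's linprog. Underdose and overdose slack variables turn the conflicting goals into a single weighted objective; the real problem has the same shape but with thousands of bixels and on the order of 100,000 voxel constraints.

```python
from scipy.optimize import linprog

# Variables, in order: x1, x2, x3 (bixel intensities),
#                      u1, u2 (tumour underdose slacks),
#                      v1, v2 (organ / normal-tissue overdose slacks).
c = [0, 0, 0, 1, 1, 1, 1]       # minimise total underdose plus overdose

A_ub = [
    # tumour voxels: dose + underdose slack >= 10  ->  -dose - u <= -10
    [-1.0, -1.0,  0.0, -1,  0,  0,  0],
    [ 0.0, -1.0, -1.0,  0, -1,  0,  0],
    # critical organ: dose - overdose slack <= 2
    [ 0.1,  0.5,  0.1,  0,  0, -1,  0],
    # normal tissue: dose - overdose slack <= 6
    [ 0.3,  0.3,  0.3,  0,  0,  0, -1],
]
b_ub = [-10, -10, 2, 6]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 7, method="highs")
x = res.x[:3]                   # optimal bixel intensities
```

For this instance the solver can satisfy the prescription exactly (heavy intensity on the outer bixels, little through the organ); tightening the organ bound below 2 would force a genuine trade-off between underdosing the tumour and overdosing the organ, which is exactly the set of compromise plans the planner chooses among.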

Once the intensities are determined, there is still the problem of how they can be efficiently delivered using the collimator settings, so that the number of shapes and the total radiation time are minimised. Effectively, this boils down to the problem of decomposing a 40 × 40 matrix into a sum of matrices whose non-zero elements are identical. No efficient algorithm is known for minimising the number of matrices in the decomposition, so here we have a new problem for computer folks.
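A one-row sketch of the leaf-sequencing idea (my own toy code, not a clinical algorithm): each 'segment' opens a single leaf pair over one interval and delivers one unit of intensity, and a greedy sweep peels unit layers off the intensity profile until nothing is left.

```python
def decompose_row(profile):
    """Greedily decompose one row of the intensity matrix into unit-weight
    leaf apertures: repeatedly take the first maximal run of positive
    values as an open interval and subtract one unit from it."""
    p = list(profile)
    segments = []
    while any(p):
        i = next(k for k, v in enumerate(p) if v > 0)   # left leaf position
        j = i
        while j + 1 < len(p) and p[j + 1] > 0:          # right leaf position
            j += 1
        segments.append((i, j))
        for k in range(i, j + 1):
            p[k] -= 1
    return segments

segs = decompose_row([0, 2, 3, 1, 0, 2])
# Reconstruction check: summing the apertures gives back the profile.
rebuilt = [sum(1 for (i, j) in segs if i <= k <= j) for k in range(6)]
```

For a single row this greedy happens to use the minimum number of unit segments (the sum of the positive jumps in the profile); it is coupling the 40 rows together, and allowing segment weights other than one, that makes minimising the number of collimator shapes genuinely hard.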


The highly complex IMRT treatment has thrown up challenging problems for mathematicians working in OR. Matthias Ehrgott has been working on the multi-objective linear programming problem mentioned above. His time and assistance in giving me an insight into this work has been greatly appreciated.









Sunday, August 30, 2009

Modelling Epidemics

Two viruses have been particularly in the news this year, the Influenza A(H1N1)v virus ("swine flu") and measles. The rapid increase in the number of cases was evident in news reports, and is typical of epidemics. For instance, 7 cases of measles were reported in NZ in June, jumping to 93 notified cases in July as the disease spread from Dunedin and Christchurch up to Auckland. Although measles is a common childhood infection that many in NZ would probably not think of as serious, it is highly contagious and can be fatal: seven people died in the 1991 epidemic here. Globally, mortality from measles exceeds that from the flu, although it is lower than that from malaria and HIV.


Mathematics and Measles - A Case Study in the Value of a Mathematical Model

Up until about a decade ago, New Zealand faced regular measles epidemics, but a mathematical model successfully predicted the outbreak in 1997, and was instrumental in the decision to carry out an extensive vaccination campaign that year. Subsequent work on the mathematical model led to the recommendation that changes be made to the immunisation schedule in order to prevent further epidemics. First a little background:

Measles first came to NZ in the 1800s, via ships from England. The first instance seems to have been in the mid-1830s, when a ship brought the disease from Sydney, and South Island Maori were severely affected. Before a vaccine was available, almost all children in NZ caught measles.
Immunisation was introduced in NZ in 1969, but there was still a pattern of 2- to 3-yearly epidemics, so in 1978 the Department of Health instituted a 5-year measles epidemic elimination programme. The next epidemics after 1980 were in 1985 and 1991, so the programme had only partial success. In November 1990, the measles vaccine was replaced by the measles-mumps-rubella vaccine (MMR), generally administered to babies at about 15 months old. In 1992, a second dose of MMR, scheduled at age 11 years, was added in response to the 1991 epidemic.

Which brings us to 1996, when Mick Roberts (pictured above), then working for AgResearch, produced a mathematical model of the dynamics of measles in NZ for the Ministry of Health.

SIR Model

The model that Mick Roberts used was an extension of a standard deterministic model introduced by Kermack and McKendrick in 1927, which uses a set of linked differential equations. A population, taken to be constant for the duration of the epidemic, is divided into 3 groups:
  1. Susceptibles (S) - those who are uninfected but at risk of getting the disease

  2. Infectious (I) - those who are infected and pass on the disease to susceptibles

  3. Removed/recovered (R) - those who have had the disease but are now immune (or dead or isolated or otherwise removed from the population)
As time passes, susceptible people become infected and infected recover and become immune.
The differential equations describe how the numbers of susceptible, infectious and removed change during the epidemic.
These equations state that the rate at which susceptibles become infected is proportional to the rate of interaction between the groups (i.e. proportional to the product of the numbers in each group - the Law of Mass Action), while the infectious recover at a rate proportional to their number (i.e. at an exponential rate).
The progress of the epidemic depends on the two parameters (the infection and recovery rates) and the initial number of susceptibles, which combine to give the basic reproduction ratio (or epidemic threshold) R0: the number of secondary infections caused by a single primary case in a fully susceptible population. For measles, R0 ranges from 12 to 18. If it were less than 1, an epidemic could not occur.
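A minimal numerical version of the basic SIR model (with illustrative parameter values of my own choosing, not those of the NZ measles model) shows this threshold behaviour directly:

```python
def simulate_sir(beta, gamma, s0, i0, days, dt=0.01):
    """Euler integration of the Kermack-McKendrick SIR equations:
       dS/dt = -beta*S*I          (Law of Mass Action)
       dI/dt =  beta*S*I - gamma*I
       dR/dt =  gamma*I           (exponential recovery)"""
    s, i, r = float(s0), float(i0), 0.0
    for _ in range(int(days / dt)):
        new_inf = beta * s * i * dt
        new_rec = gamma * i * dt
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
    return s, i, r

# Population of 1000; infectious period ~10 days (gamma = 0.1 per day).
# beta chosen so R0 = beta*S0/gamma = 3: a large epidemic runs its course.
s_end, i_end, r_end = simulate_sir(beta=3e-4, gamma=0.1, s0=999, i0=1, days=365)

# With beta cut six-fold, R0 = 0.5 < 1 and the outbreak fizzles out.
s2, i2, r2 = simulate_sir(beta=5e-5, gamma=0.1, s0=999, i0=1, days=365)
```

With R0 = 3 well over 90% of the population is eventually infected, while with R0 = 0.5 only a handful of secondary cases ever occur; pushing the effective reproduction ratio below 1 is exactly what vaccination strategies aim to do.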
The assumptions of the above simple SIR model are:
•Fixed population size (no births, deaths)
•Incubation period instantaneous
•Duration of infectivity = length of disease
•Completely homogeneous population (no age, spatial or social structure)

Mick Roberts' SIR Model for Measles in NZ (1996)
Assumptions:

•Those aged less than 6 months or more than 25 years take no part in the epidemic.
•4 active age classes coinciding with potential ages at vaccination.
•Different contact rates within and between the age classes
•Disease transmission is seasonal - high between 28 February and 1 December, low in summer.
•Annual birth rate constant at 57435 births/year.
•Model to incorporate vaccination strategies and historical vaccination rates (around 80%) from 1969-96 and assume 1996 strategy continues until 2000.
•Primary vaccine failure rate taken as 10%
There were differential equations for the change in susceptibles and infectives for each of the four age groups (6 months to 15 months, 15 months to 4 years, 4 to 11 years and 11 to 25 years).
The model was solved numerically for the period 1962 to 2000, using the combination of parameters that gave the best match to the historical timing of epidemics over 1970-92. It predicted there would be an epidemic the following year. The model was consistent with a basic reproduction ratio of 12.8 (2.85 with vaccination).

The 1997 Epidemic

The epidemic began a few months earlier than predicted, and was contained by a mass-vaccination effort. Just under 2000 reported cases resulted.

The mathematical model predicted another epidemic in 2003 - 2004. Was there a better immunisation strategy that might prevent further epidemics?
An extended model to test strategies:

•Further subdivision of age classes into eight classes
•Model run with 4 different immunisation schedules (each with 2 vaccinations)
•4 different coverage rates (from 80% to 95%)
•‘catch-up’ opportunities included (at entry to preschool and primary school)
•Model solved numerically over 20 years for selected strategies
The result of this modelling was a set of recommendations to the Ministry of Health, including moving the second vaccination at 11 years forward to 3 years or 6 years of age. In addition, coverage at 15 months needs to be at least 90% if further outbreaks are to be prevented.
In 2003, the immunisation schedule was amended to have the second vaccination at 4 years. The coverage is still too low, perhaps not helped by a (now discredited) report some years back linking the MMR vaccine to autism. Hence, we seem to have another outbreak in 2009.
Acknowledgements:

Roberts, M. G. & Tobias, M. I. Predicting and preventing measles epidemics in New Zealand: application of a mathematical model. Epidemiology and Infection 124, 279-287.

With thanks to Professor Mick Roberts, Massey University, Auckland

To see the paper mentioned above and other recent research of Mick Roberts:
http://tur-www1.massey.ac.nz/~mgrobert/Research.html

Saturday, May 23, 2009

Operations Research

I have been spending some of the first half of my fellowship year gaining some background into the techniques and applications of operations research.



What is Operations Research?

Operations research involves the modelling of complex real-world systems in order to aid decision making. The goal is often to find optimal, or at least improved strategies.

For example, these are some of the sorts of questions that involve OR techniques:
  • How many checkouts (or call-centre operators) should we have open at various times of the day?
  • Where is the best location for emergency services?
  • How should we price our services with consideration to the competition?
  • How much of each product should we produce/stock/invest in?
  • How should our airline/rail/ferry services be scheduled?

Techniques of OR include:
  • Linear programming
  • Integer (and mixed) programming
  • Non-linear programming (non-linear objective function, may use calculus)
  • Queueing theory
  • Game theory
  • Simulation
  • Network models
  • Forecasting
  • Dynamic programming (for making a sequence of interrelated decisions)
  • Stochastic programming (building in uncertainty to the problem)
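As a taste of the queueing theory entry in the list: the classic tool for the 'how many call-centre operators?' question is the Erlang C formula, which gives the probability that an arriving caller has to wait when c operators carry an offered load of a erlangs (arrival rate divided by service rate). A small sketch, with the standard formula and illustrative numbers of my own:

```python
def erlang_c(c, a):
    """Probability that an arriving customer must queue, for c servers
    carrying an offered load of a erlangs."""
    b = 1.0                          # Erlang B, built up by recursion
    for k in range(1, c + 1):
        b = a * b / (k + a * b)
    return c * b / (c - a * (1.0 - b))

def servers_needed(a, max_wait_prob):
    """Smallest number of servers keeping P(wait) below the target."""
    c = int(a) + 1                   # need c > a for a stable queue
    while erlang_c(c, a) > max_wait_prob:
        c += 1
    return c

# 100 calls/hour with a 3-minute average handling time -> a = 5 erlangs.
staff = servers_needed(5.0, 0.20)    # operators needed so P(wait) <= 20%
```

Note that more operators are needed than the 5 'erlangs of work' arriving on average; the excess is the price of randomness in the arrivals, and quantifying that sort of trade-off is exactly what OR models do.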

To my knowledge, courses in OR are running at Auckland, Canterbury, Waikato and Victoria universities, although you might find aspects in Management Science courses at other universities.


A very good explanation of OR, together with its history, case studies and activities suitable for students can be found on an American website 'High School Operations Research'

http://www.hsor.org/what_is_or.cfm



OR in the NZ Secondary Curriculum

The key parts of secondary mathematics that relate to OR are:

Year 12

Probability simulations (Achievement Standard 2.6)

Networks (Unit standard 5249)

Year 13

Linear programming (Statistics and Modelling A.S. 3.4)

Probability and Probability distributions (A.S. 3.3 & 3.6)

Time series (A.S. 3.1)



The place of networks in the curriculum is interesting. In spite of being listed in level 8 of the curriculum for many years, it has not been part of any Level 3 NCEA standards, nor of the previous University Bursary Calculus or Statistics prescriptions. There is a Level 2 NCEA unit standard on networks, although this would normally be given to the less academic mathematics students. Here it is again in the new curriculum statement:

Level 7 - "Choose appropriate networks to find optimal solutions."

Level 8 - "Develop network diagrams to find optimal solutions, including critical paths."

OR in New Zealand
There are a number of significant applications of operations research happening here, such as:
  • Hydro-electric power generation (e.g. operations of Clutha hydro river chain, Mighty River Power (use of linear programming))
  • Air NZ scheduling
  • NZ electricity market (game theory models)
  • Milk collection (vehicle routing/ networks problem)
  • Emergency services (software developed in Auckland to minimise response time is being used overseas)

A Wellington OR Company

When NZ Post was looking at restructuring, Orbit Systems in Wellington developed a mixed integer programming model for mail collection, sorting, transport and delivery. They also did a ferry operations simulation for the Cook Strait fast ferry. But Craig MacLeod, Managing Director of Orbit Systems, points out that OR seems to be valued more highly overseas than in NZ, and much of their work is for overseas companies. For example, they have developed software for optimising warehouse usage and truck-loading operations, which has saved US companies millions of dollars.

Craig had some interesting things to say about the skills required by people working in this area. "You need a group of people to bring different skills to a modelling problem", so the ability to work in a team is critical. And good computing (programming) skills are essential.

ORSNZ

A useful place to find examples of local research in OR is through the conference papers of the Operations Research Society of New Zealand. Follow the Annual Conference link to the conference proceedings for each of the conferences. ORSNZ also publishes a newsletter about three times a year.
www.orsnz.org.nz

Friday, February 27, 2009

BioEd 2009


So what is a mathematics teacher doing at a Biology education conference? Like Sting's 'Englishman in New York', I was somewhat of an odd man out.


BioEd 2009, held in Christchurch in the middle of February, was one of a worldwide series of events to mark the 200th anniversary of the birth of Charles Darwin (and the 150th anniversary of the publication of 'The Origin of Species'). The major sponsors of BioEd were the Allan Wilson Centre and IUBS (the International Union of Biological Sciences). I was interested to note listed among the sponsors (along with the Royal Society and the MacDiarmid Institute among others) the Society for Mathematical Biology, an international society which exists to promote and foster interactions between the mathematical and biological sciences communities. And it was the fact that there were to be sessions dealing with the role of mathematics in biology (and biology education) that gave me the incentive to attend the conference.

A very interesting talk on ‘The Evolution of Medieval Manuscripts’ was given by Christopher Howe from Cambridge University, UK. He described how techniques that have been developed to reconstruct the evolutionary relationships from DNA sequence data have been applied to medieval manuscripts to get relationships between different versions of a text and reconstruct their copying history. The added complication of a scribe using more than one copy in copying a text has parallels to lateral gene transfer. The conclusions reached from the computer programs seem to agree with those from conventional analysis.

There were three speakers from USA who are involved in the use of mathematics and computer science in biology education, Holly Gaff, John Jungck and Tony Weisstein.

Holly Gaff, in her talk ‘Teaching the Biology and Ecology of Infectious Diseases through Mathematics’ pointed out the value of mathematical modelling to assess the potential spread of disease (without running actual trials!) and trying to predict the best practices for prevention and control. She illustrated the modelling of vector-borne diseases (e.g. via the mosquito) and the problem of genetic modification of pathogens.
Holly is involved in a project developing a high school (secondary) maths-biology curriculum. http://dimacs.rutgers.edu/BMC/


John Jungck’s talk in the ‘Towards Biology 2020’ theme of BioEd concerned the importance for students to be involved in conducting investigations and the increasing need for quantitative analysis – he listed calculus, discrete mathematics and statistics as the key areas of maths. Important applications mentioned included those from medicine, agriculture and environmentalism.

In 1997, John published a paper entitled ‘Ten Equations that Changed Biology: Mathematics in Problem-Solving Biology Curricula’ in which he argues the importance of mathematics in undergraduate biology education and draws attention to a variety of mathematical models that have been intrinsic to many of the significant discoveries in biology in the 20th century. These encompass evolution, genetics, developmental biology, biochemistry, cellular and molecular biophysics, and population biology.

Tony Weisstein introduced Excel-based simulations of population genetics which he (along with John Jungck and others) has been involved in developing to support a multi-disciplinary approach to learning (mathematics, biology and computer science).
Their downloadable ESTEEM modules can be found at
http://bioquest.org/esteem

Two of the presenters were from Canterbury University’s Biomathematics Research Centre:


Charles Semple presented some of the mathematical difficulties involved in trying to reconstruct evolutionary trees and the search for methods that can reduce computer time. He gave an introduction to some notable problems in the area: the unrooted supertree problem, the maximum parsimony problem and the reconciling gene trees problem.


Mike Steel outlined the importance of phylogenetics (e.g. understanding evolutionary processes, biodiversity conservation and epidemiology) and the need for mathematics.
He suggested the type of mathematical skills that are likely to be most useful for future students wishing to work in this area, namely:
· Discrete maths (graph theory, algorithms, etc.),
· Probability (discrete Markov chains, branching processes, etc.)
· Algebra and calculus (linear algebra, discrete Fourier analysis, modelling with DE’s).
At the end of his talk, Mike made reference to an essay written by Joel E. Cohen, Professor of Population at Rockefeller and Columbia Universities, New York, discussing ways each field can benefit from the other. It makes for very interesting reading:

Mathematics Is Biology's Next Microscope, Only Better; Biology Is Mathematics' Next Physics, Only Better

Susan Worner, from the Bio-Protection Research Centre at Lincoln University, gave a presentation on how evolutionary computing is helping in the battle to protect New Zealand from alien invasive species, such as the painted apple moth. Over 3000 insect pest invaders could potentially establish themselves in NZ. Evolutionary computation used in phylogenetic studies can help identify which species are most threatening, where in the world they might come from and where they could establish a viable population. Computers can then be used to run various models of spread and control that can help scientists and authorities be better prepared.


There were many other interesting presentations at the conference and the website is still up if you'd like to check out the list of speakers:



I have become aware that there are a number of mathematicians working on biological problems in New Zealand, notably at the Allan Wilson Centre for Molecular Ecology and Evolution in Palmerston North, at Canterbury University, Auckland University and Massey University in Auckland, and I hope to have the opportunity to see what some of these mathematicians are doing later in the year.