Friday, 10 September 2010

getstats zone at RSS 2010


The getstats team are looking forward to RSS 2010 in Brighton next week. All delegates are encouraged to visit the getstats zone, where you can find information about the campaign and the ways you can get involved, and where the team will be on hand to answer any questions. We will also be filming some vox pops to gather everyone’s views on the importance of statistics and the campaign. Please drop by the zone to find out more! We look forward to seeing you there.

Thursday, 9 September 2010

Robustness in Engineering

In engineering, reliability problems come about for essentially two reasons: (1) mistakes, and (2) lack of robustness. Genichi Taguchi did much to bring to our attention the idea of robustness (making designs insensitive to variation, or “noises”), although others had been there too, notably RSS Fellow and Greenfield medallist Jim Morrison as far back as 1957. Taguchi had some important things to say about strategies for improving robustness, one being that engineers should first look to desensitize their designs to variation by experimenting with design parameters related to geometry, material properties and the like, rather than choosing the more obvious path of trying to reduce or eliminate the noises themselves. I will explain some of Taguchi’s ideas, and hope to demonstrate that he did not deserve some of the attacks made on him by the statistical profession at the time, in stark contrast to the way our profession seems to have embraced the Six Sigma movement with nothing like the same scrutiny afforded to Taguchi’s work.
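To give a flavour of the parameter-design idea ahead of the talk, here is a minimal sketch in Python. The response model, noise distribution and numbers are illustrative assumptions of this post, not Taguchi's or the speaker's: the key feature is that the output's sensitivity to the noise z depends on the design parameter x, so robustness can be bought by choosing x well without touching z.

```python
import numpy as np

# Hypothetical response: y depends on a design parameter x that the
# engineer controls and a noise variable z that they do not. The
# sensitivity of y to z itself depends on x, so the design can be
# made robust through choice of x alone (Taguchi's parameter design).
def response(x, z):
    sensitivity = 0.2 + 0.4 * abs(x - 3.0)       # smallest at x = 3
    return 10.0 + 0.5 * x + sensitivity * z      # the mean also shifts with x

rng = np.random.default_rng(0)
design_levels = np.linspace(0.0, 4.0, 9)         # "inner array" of settings
noise_draws = rng.normal(0.0, 1.0, size=500)     # "outer array" of noise

# Step 1 of the two-step strategy: find the setting that minimises the
# variation transmitted from z (here, simply the standard deviation of y).
for x in design_levels:
    y = response(x, noise_draws)
    print(f"x = {x:3.1f}   mean = {y.mean():5.2f}   sd = {y.std(ddof=1):4.2f}")
# Step 2 would use a separate "adjustment factor" to put the mean on target.
```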

Tim Davis

Wednesday, 8 September 2010

The role of likelihood in statistical science

There have been many interesting developments in theoretical statistics since you were in graduate school, but do these have any relevance for the practice of statistics? What is needed to translate nice new mathematics into "on-the-ground" improvements? In my talk I survey some of the advances in likelihood-based inference, and try to identify the most promising links to better analysis of data.
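As a taster of what "likelihood-based inference" means in practice, here is a small sketch (our own example, not from the talk) of a likelihood-ratio confidence interval for an exponential rate, using simulated data:

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

# Likelihood-ratio (Wilks) interval for the rate of an exponential model.
rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=40)    # simulated data, true rate = 0.5
n, s = len(x), x.sum()

def loglik(lam):
    return n * np.log(lam) - lam * s

lam_hat = n / s                            # maximum likelihood estimate
cut = chi2.ppf(0.95, df=1) / 2.0           # Wilks' threshold, about 1.92

# Endpoints solve loglik(lam_hat) - loglik(lam) = cut on each side of the MLE.
g = lambda lam: loglik(lam_hat) - loglik(lam) - cut
lo = brentq(g, 1e-6, lam_hat)
hi = brentq(g, lam_hat, 10 * lam_hat)
print(f"MLE = {lam_hat:.3f}, 95% LR interval = ({lo:.3f}, {hi:.3f})")
```

Unlike the textbook Wald interval, this one respects the shape of the likelihood and is asymmetric about the MLE.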

Nancy Reid
University of Toronto

Tuesday, 7 September 2010

Predicting Credit Default Rates

Predicting spatial processes often involves using many, many parameters. That approach requires Bayesian methods -- or something with the same effect -- to shrink the predictions back to something more reasonable. I'm going to use something simpler: regression. No, not a ridge estimator either. Rather, by constructing a particular explanatory variable, I can achieve much the same effect at the cost of just a few parameter estimates. My talk will cover this trick, as well as showing a variety of maps of the evolution of default rates in the US. I hope this is enough to lure you back inside from the beach next week!
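To fix ideas, here is one way such a "constructed variable" could work; the speaker's actual trick may well differ. Rather than fitting one parameter per location (which would call for Bayesian shrinkage), next period's default rate is regressed on the average current rate of each location's neighbours, so the spatial structure costs only two parameters. The line of locations, the neighbour rule and the simulated data are all assumptions of this sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 30                                               # locations on a line
smooth = np.cumsum(rng.normal(0.0, 0.1, n))          # slowly varying spatial signal
rate_now = 0.05 + smooth + rng.normal(0.0, 0.05, n)  # current default rates
rate_next = 0.05 + smooth + rng.normal(0.0, 0.05, n) # next period's rates

# Constructed covariate: mean current rate of the two adjacent locations.
neigh = np.convolve(rate_now, [0.5, 0.0, 0.5], mode="same")
neigh[0], neigh[-1] = rate_now[1], rate_now[-2]      # single-neighbour edges

X = np.column_stack([np.ones(n), neigh])             # just two parameters
beta, *_ = np.linalg.lstsq(X, rate_next, rcond=None)
resid = rate_next - X @ beta
print(f"intercept = {beta[0]:.3f}, neighbour coefficient = {beta[1]:.3f}")
print(f"residual sd = {resid.std(ddof=2):.3f}")
```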

Sunday, 5 September 2010

Statistical Engineering & Reliability

High profile cases like BP, Toyota, and Firestone bring into sharp relief the subject of engineering for reliability. As statisticians, we seem to have got everybody, from ourselves, to scientists & engineers, to senior management and regulatory authorities, comfortable with the idea of expressing reliability as a probability. Indeed, in media interviews, the BP CEO quoted a failure probability of “about 10⁻⁵” for the oil rig that exploded, causing the spill. When NASA management quoted a similar probability for the reliability of the Space Shuttle, Richard Feynman asked in his report into the 1986 Challenger disaster, “What is the cause of management's fantastic faith in the machinery?” Probability measures for reliability may be appropriate for some fields of engineering, but I will introduce an information-based definition that is better suited to many engineering situations (including automotive) where the probability simply cannot be measured. I will argue that the focus should be on evaluating the efficacy of countermeasures for identified potential failure modes, and that the statistical methods required to evaluate this efficacy are very different from those required when attempting to measure reliability through a probability.
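A quick back-of-the-envelope sketch suggests why probabilities like 10⁻⁵ "simply can't be measured". Under the classical zero-failure (success-run) demonstration test, showing at confidence C that the failure probability is at most p requires n failure-free tests with (1 − p)ⁿ ≤ 1 − C, i.e. n = ln(1 − C)/ln(1 − p). The figures below are our illustrative arithmetic, not from the talk:

```python
import math

# Zero-failure demonstration test: number of failure-free tests needed
# to demonstrate failure probability <= p_fail at the given confidence.
def tests_needed(p_fail, confidence):
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_fail))

for p in (1e-2, 1e-3, 1e-5):
    n = tests_needed(p, 0.90)
    print(f"demonstrate p_fail <= {p:g} at 90% confidence: {n:,} failure-free tests")
```

At 10⁻⁵ this comes to roughly 230,000 failure-free tests of the full system, which is plainly infeasible for an oil rig or a space shuttle.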


Tim Davis

Wednesday, 1 September 2010

Special offers for conference delegates


We have teamed up with Visit Brighton to arrange some special offers for all delegates to take advantage of while in Brighton. Why not check out a local restaurant or attraction during your stay?


Monday, 30 August 2010

Some more on Statistical Engineering

My previous comments on problem solving led me to think about how I might illustrate the use of statistical methods in directly solving engineering problems. I have been involved in many interesting and challenging problems in my 30 years in the automotive industry. The recent media coverage of both the Toyota problem with sticking accelerator pedals and the BP oil spill in the Gulf of Mexico caused me to think back to my involvement in a similarly high profile case: the Firestone tyre crisis of 2000/1, which resulted in around 300 fatalities and a $3Bn recall of 20 million tyres. There are many similarities in all three of these cases (not least the role of the media and government agencies), but in the case of Firestone, I will show how a range of statistical methods, from simple EDA tools like box plots to more sophisticated methods such as competing risk proportional hazards regression, was used to get to the root cause of the problem, get ahead of the game, and decide what actions to take before the regulatory authorities told us what to do.
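For readers unfamiliar with the modelling side, here is a hedged sketch of a cause-specific proportional hazards analysis on simulated tyre data (the real Firestone data and model are not reproduced here). Competing risks are handled in the simplest cause-specific way, censoring a tyre that fails from the other cause; the plant covariate, rates and observation window are all assumptions of this illustration:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 2000
plant = rng.integers(0, 2, n)                        # 0 = plant A, 1 = plant B
t_sep = rng.exponential(1.0 / (0.02 * np.exp(1.2 * plant)))  # tread separation
t_other = rng.exponential(1.0 / 0.03, n)             # competing failure mode
t_end = np.full(n, 60.0)                             # end of observation (months)

time = np.minimum.reduce([t_sep, t_other, t_end])
event = (t_sep <= np.minimum(t_other, t_end)).astype(int)  # competing cause censored

df = pd.DataFrame({"time": time, "event": event, "plant": plant})
# Simple EDA first: df.boxplot(column="time", by="plant") would already
# flag the plant effect before any modelling.
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()   # hazard ratio for "plant" should land near exp(1.2)
```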