I have enjoyed looking at Chris Chatfield's book "Problem Solving: A Statistician's Guide." A few points from his summary section "How to be an effective statistician": "Look at the data... errors are inevitable when processing a large-scale data set, and steps must be taken to deal with them." [On the other hand, "large scale" often means "collected automatically", and the less human intervention, the less opportunity to screw it up.]
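To make those "steps" concrete, here is a minimal sketch of what an initial screen for errors in a large, automatically collected data set might look like. This is my illustration, not Chatfield's; the file name and the "age" column are hypothetical, and only the kinds of checks matter.

```python
# A minimal sketch of "looking at the data" before any analysis.
# The file name and column names are hypothetical.
import pandas as pd

df = pd.read_csv("measurements.csv")

# Basic shape and types: do they match what the data were supposed to be?
print(df.shape)
print(df.dtypes)

# Missing values per column, and duplicate records.
print(df.isna().sum())
print("duplicate rows:", df.duplicated().sum())

# Summary statistics and simple range checks flag impossible values
# (e.g., negative ages) before any modelling is attempted.
print(df.describe())
bad_age = df[(df["age"] < 0) | (df["age"] > 120)]
print("suspect age records:", len(bad_age))
```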
"Rather than asking 'What technique shall I use here?' it may be better to ask 'How can I summarize these data and understand them?'" "If an effect is 'clear', then the exact choice of analysis procedure may not be crucial." [Should we just be looking at testing for effects? Is there not an effect in the real world?] "A simple approach is often to be preferred to a complicated approach, as the former is easier for the 'client' to understand and is less likely to lead to serious blunders." [If something is not understandable, it's not verifiable. (But if it's not understandable, what good is it anyway?]
Articles by Chatfield referenced in the book:
Teaching a Course in Applied Statistics
The Initial Examination of Data
Model Uncertainty, Data Mining and Statistical Inference
Avoiding Statistical Pitfalls
Also referenced:
Teaching and Examining Applied Statistics by A.G. Hawkes