I have just returned from my first American Psychological Association meeting that I thought was worth attending. I noted in a previous post that APA meetings now feature a wide array of division activities, and that those are worth going to, regardless of the main convention. This time, however, there were a handful of talks (out of many) which were valuable enough to make me think the main convention was starting to turn things around. The highlight was a session organized by Joseph Simmons, and featuring talks by Leslie John and Uri Simonsohn.
You might remember Simmons and Simonsohn as two of the three authors of last year's killer article False-Positive Psychology. Simonsohn is also the "data detective" responsible for a few recent resignations. His method in the latter cases involved looking for anomalous published results (e.g., effect sizes well above other effects in the same literature), then looking a layer deeper into the data than most fraudsters would worry about (e.g., it is easy to make a correlation matrix look right, but who bothers to fake a realistic covariance matrix if they are not reporting it?). (For more details, see Neuroskeptic's good discussion here.) All three talks in the session were solid, and the broader project forming here is really important for our field. The session finale was Simonsohn's explanation of his new fraud-detection tool.
Recall that in the "false-positive" paper, no one was actually accused of acting maliciously. Instead, the authors showed that a series of fairly standard moves by psychologists made it much easier than it should be to find an effect where there was none. These common techniques included adding and subtracting covariates, running tests with and without outliers, and testing "just one more" participant over and over again. None of these procedures is too bad when used in moderation, and they might well be essential during the exploratory phase of a research program. Used too often, or too carelessly, and without follow-up replication, however, they provide ample opportunity for false positives (i.e., Type I error).
I won't get too specific, as he has the paper under submission, but I am pretty sure I can get the point across in a way that is not "scooping" anything. The idea is to look, either within a lab or across a wider literature, at the distribution of published probability values. If a person, or lab, or literature is routinely using techniques that generate false positives, you would expect a distribution biased towards the barely-good-enough-to-publish side of things (i.e., lots of p values near .05). This is because there is no reason to keep massaging the data (innocently or not) after you have passed the publishability threshold. Simonsohn had an Excel spreadsheet where you could enter the t or F values from studies, along with their degrees of freedom, determine the distribution, and calculate a probability that people were (innocently or not) using techniques that bias towards false positives. There were also checks of whether the published values fit honest reporting under a true null hypothesis, and under a false one. Like Simonsohn's data-detective work, this involves dropping to a level of statistical analysis below where most people usually look: the probability of probability distributions! Still resisting the urge to scoop his paper, I can only say that it was brilliant. Simple, straightforward, obvious once you hear it, and potentially devastating.
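To make the intuition concrete, here is a minimal sketch of the general idea in Python; it is not Simonsohn's actual tool, and the function names, bin scheme, and example t values are my own illustrative inventions. It converts published t statistics (with their degrees of freedom) into two-tailed p values, keeps the significant ones, and bins them across the 0–.05 range. A genuine effect tends to pile p values into the smallest bins, while heavy clustering just under .05 is the suspicious pattern described above.

```python
# Illustrative sketch (not Simonsohn's tool): inspect where significant
# p values fall within the 0-.05 range.
from scipy import stats

def two_tailed_p(t, df):
    """Two-tailed p value for a t statistic with df degrees of freedom."""
    return 2 * stats.t.sf(abs(t), df)

def p_curve(results, alpha=0.05, bins=5):
    """Count significant p values in equal-width slices of (0, alpha]."""
    ps = [two_tailed_p(t, df) for t, df in results]
    sig = [p for p in ps if p < alpha]
    width = alpha / bins
    counts = [0] * bins
    for p in sig:
        counts[min(int(p // width), bins - 1)] += 1
    return counts

# Hypothetical literatures: strong effects give mostly tiny p values,
# while "just barely significant" results cluster right under .05.
strong = [(4.2, 30), (5.1, 45), (3.9, 28), (4.8, 60)]
marginal = [(2.05, 30), (2.03, 40), (2.06, 50), (2.01, 60)]

print(p_curve(strong))    # significant p values land in the smallest bins
print(p_curve(marginal))  # significant p values crowd the bin nearest .05
```

A real analysis would of course need many more studies and a formal test of the resulting distribution's shape; the point here is only that the raw material (test statistics and degrees of freedom) is already sitting in the published literature.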
This has the potential to disrupt several prominent labs in psychology, and possibly several whole literatures. Simonsohn seemed less interested (though not uninterested) in the meta-analytic implications. Attending this session and being able to talk with Simmons and Simonsohn afterwards was almost enough on its own to make attending the entire APA conference worth it.
What I have missed at APA in the past was the feeling that I was hearing novel, brilliant ideas, and getting the opportunity to meet brilliant people I didn't already know. This APA, it happened a few times. I am really close to looking forward to the next conference.