Observing that women are overrepresented in gender studies is your best example of "misogyny"? Really?
Munsch's correction in ASR

okay you shamed me into reading the correction. it is worse than I had imagined.
first off, this is using the NLSY, an old, well-documented, and widely used dataset. this makes me think that someone else found her error, contrary to the narrative in the correction.
second, she miscoded the DEPENDENT VARIABLE (infidelity), not just some explanatory variable or control. worse, the correction lowers the DV mean from 10 percentage points to 6 percentage points, a 40% reduction (!!!).
third, this happened for "246 cases" which I can only assume is a sneaky way of saying "246 persons" and not just 246 observations. there are 2,713 respondents, so this is nearly TEN PERCENT OF THE SAMPLE.
but wait, it's worse. she also messed up FIVE OTHER VARIABLES.
This is weird: "In redoing my analysis, however, a reviewer pointed out that coefficients in binary regression models are confounded with residual variation." Does this mean that a reviewer found her error post-publication and then supervised her correction? It sounds like the original published models are wrong, then. Figure 1 has no confidence bands and isn't very convincing. Not to mention there is no real causal identification here, just correlations.
In conclusion, she had 5 results; one of them is gone and the others are statistically weaker than before. And some of you say the paper shouldn't be retracted?

As someone said upthread, the issue was discovered by a fellow Stanford grad who often works with the same data on similar topics.
I'd choose a correction over retraction any day, but the correction probably went through a review process and the text outkast quoted came from one of those reviewers rather than the ND AP.

^If you had read the correction (and you obviously haven't), you would have seen that she didn't initially test for correlation among the residuals (which were correlated), thus making the model invalid. Someone else found the confounding effect.
And yes, maybe it takes another stats class to get this right. No shame here... And not to blame Munsch: it is not her fault, it is the poor curricula in sociology programs, and also the fact that students there rarely go the extra mile to take advanced stats classes in other departments.

i will say that Stata's use of . (positive infinity) for missing values is a stupid, lazy trick. they should treat them as actually missing so that they are excluded from any computations.
(which is why I believe this was an inadvertent as opposed to willful error)
still, researchers should look at their data and not just their regression output.
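to make the point concrete, here's a rough Python sketch of the semantics (inf is just my stand-in for illustration; Stata's . is a reserved value that sorts above every real number, not a literal infinity):

```python
STATA_MISSING = float("inf")  # stand-in: Stata's "." compares greater than any number

ages = [25, 41, STATA_MISSING, 33, STATA_MISSING]

# A comparison like "age > 30" silently evaluates to True for missing values,
# so any recode built on it sweeps the missing cases in:
over_30 = [a > 30 for a in ages]
print(over_30)  # [False, True, True, True, True] -- the two missings count as "over 30"
```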

>i will say that Stata's use of . (positive infinity) for missing values is a stupid, lazy trick. they should treat them as actually missing so that they are excluded from any computations.
But Stata uses complete case analysis, right? For example, regression will drop an observation if it has at least one missing value for any variable that's in the model.
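Roughly, yes: listwise deletion. A toy sketch of what I mean (Python, with NaN standing in for a genuinely-flagged missing value; the variable names are made up):

```python
import math

# Toy dataset: each row is (infidelity, hours_worked); NaN marks a missing value.
rows = [(1.0, 40.0), (0.0, float("nan")), (float("nan"), 35.0), (1.0, 50.0)]

# Complete-case analysis: drop any row with at least one missing value
# on any model variable before estimating anything.
complete = [r for r in rows if not any(math.isnan(v) for v in r)]
print(len(complete))  # 2 of the 4 rows survive
```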

but she did not enter the raw variable into the model; she entered the DV, which was incorrectly coded as 1 or 0 even when the underlying data were missing. so observations didn't get dropped. had she entered the observations with the original, missing data, then obs would have been dropped and she would (likely) have noticed.
btw, I need to retract my statement about Stata being lazy with missing values. they make a good point that a statement like "year > 1900" cannot be evaluated if there are missing data. but maybe they should throw an error like "sorry, can't do comparisons on missing data"
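to spell out the mechanism with toy numbers (not the NLSY; inf is my stand-in for Stata's ., which compares greater than any number): once the unbounded recode forces the DV to 0/1, complete-case deletion has nothing left to catch.

```python
MISSING = float("inf")  # stand-in for Stata's "."
raw = [0.0, 1.0, MISSING, 0.0, MISSING]

# The unbounded recode ("replace dv = 1 if raw >= 1") hits the missings too,
# because MISSING >= 1 is True. The result looks like a clean 0/1 variable.
dv = [1.0 if v >= 1.0 else 0.0 for v in raw]

# Complete-case deletion can only drop what is still flagged missing:
kept_raw = [v for v in raw if v != MISSING]  # 3 observations survive
kept_dv = [v for v in dv if v != MISSING]    # all 5 slip through
print(len(kept_raw), len(kept_dv))
```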

>But Stata uses complete case analysis, right? For example, regression will drop an observation if it has at least one missing value for any variable that's in the model.
The issue occurred in data management, before the analysis. The "replace" command allows you to recode a range of values without specifying an upper bound, which leads to missing cases receiving the recoded value (and thus being included in subsequent analyses). The issue doesn't arise with the "recode" command, because there you have to specify an upper bound when recoding a range of values.
I sympathize with her (at least for this mistake). I wasn't aware that Stata treated missing values this way. I never made this mistake, but only because I always use the "recode" command. Still, if I had made it, I would like to think that I'd have spotted it when checking the univariate distributions and descriptives.
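A rough sketch of the difference between the two commands, with Python stand-ins (the helper names are mine, not Stata's, and inf stands in for Stata's ., which compares greater than any number):

```python
MISSING = float("inf")  # stand-in for Stata's "."

def replace_if_at_least(values, lo, new):
    """Like 'replace x = new if x >= lo': no upper bound, so missings qualify too."""
    return [new if v >= lo else v for v in values]

def recode_range(values, lo, hi, new):
    """Like 'recode x (lo/hi = new)': the explicit upper bound excludes missings."""
    return [new if lo <= v <= hi else v for v in values]

x = [2, 9, MISSING, 11]
print(replace_if_at_least(x, 8, 1))  # [2, 1, 1, 1] -- missing silently recoded to 1
print(recode_range(x, 8, 12, 1))     # [2, 1, inf, 1] -- missing left alone
```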

Munsch was probably having her prior beliefs about hetero relationships confirmed, and the results were statistically significant and marketable to ASR, so she wasn't inclined to question them or even look at the Stata data editor.
I agree that good research practice involves examining your data carefully. Munsch apparently didn't do that here.