Cats are better than dogs — and vice versa — as statistics, studies, magazine rankings and polls clearly indicate.
It is a byproduct of the information age that virtually any topic is subject to this sort of scrutiny.
In many cases, these sorts of examinations are useful. Statistics and studies, particularly, can help us measure the scope of a problem, discover its roots and steer us toward a solution.
But in many other cases, the information we obtain from such sources should be viewed with caution, if not outright skepticism.
In some cases, the information is deliberately skewed by those who produce it as a means of advocating a certain position, but there are other cases where studies, stats, etc., can be misleading quite unintentionally.
We don’t have to strain our memory to find examples of this, of course.
Just this week, we’ve seen evidence of this on subjects of local interest:
A University of Pennsylvania study on racial bias in school discipline in the South singled out the Mississippi School for Mathematics and Science in Columbus as one of the most egregious offenders, stating that while 30 percent of MSMS students are black, 100 percent of students suspended by the school were black.
The Mississippi Department of Education released its final results on the state’s third-grade reading assessment, called the third-grade gate. Those students who failed the test were not allowed to move on to the fourth grade. The final tally shows that 10 percent of the state’s third-graders failed the test.
Mississippi University for Women has been recognized by U.S. News & World Report as one of the top 20 public Southern regional universities. According to rankings released Wednesday, The W was listed 18th among public Southern regional institutions.
These three examples illustrate the dangers of taking studies, stats and rankings as the final word on a subject.
In the case of MSMS, the Penn study was particularly disturbing. Upon closer inspection, the study had some obvious flaws, beginning with a sample size so small that it renders the statistics a wild distortion. Not only did the study review suspensions from a single year, it also failed to consider that MSMS is a small school made up of a high-achieving student body. Suspensions are rare to begin with.
But the biggest flaw in the study was some obvious misinformation. In the year the data was based on, MSMS had two suspensions. However, the races of the students were erroneously reported. One student was black, the other white. So, in reality, suspensions of blacks and whites at MSMS were equal, percentage-wise.
But even if both of the suspended students at MSMS had turned out to be black, what real meaning can be found in it? The sample size is so small as to be statistically irrelevant.
MSMS officials had every right to be offended. The original report on the study suggested an institutional racism at MSMS that is not supported by any meaningful evidence. That’s a serious charge that can have a terrible effect on a school’s reputation. To its credit, Penn is working with MSMS to correct the study and walk back its implication that MSMS treats its minority students unfairly.
In the second case, the third-grade reading assessment, there is no issue of accuracy to undermine the data. Even so, there is ample reason to argue that the numbers are not a clear assessment of our state’s third-graders. For example, among the 10 percent who failed the test, many, perhaps most, were students with learning disabilities or students for whom English is a second language. Under the terms of the state law, those students are exempt from being held back.
What is also important to note is that determining who passed or failed was an arbitrary decision. The MDE set the pass/fail score after the students took the test. That is tantamount to moving the goal posts in a football game. Because of that, the results of the test have little real meaning. Have our third-graders improved their reading ability? The answer is likely yes, not because the tests say so, but because much emphasis was put on reading during the school year.
Finally, while we don’t dispute that MUW is a fine school and worthy of recognition by the magazine, we find it difficult to rank schools. Is The W 18th best simply because the magazine says so? Who is to say it is not No. 1 or 10 or 20?
In its release, the magazine made one point that we all should consider anytime we see statistics, studies, rankings or polls.
It read, in part: U.S. News & World Report acknowledges that it cannot measure the various factors that make up the college experience. “But for families concerned with finding the best academic value for their money, the U.S. News Best Colleges ranking provide an excellent starting point for the search.”
That’s a point we should keep in mind. Stats and studies, rankings and polls, should not be viewed as the “final word” on a subject, but, rather, an invitation for closer scrutiny.
The Dispatch Editorial Board is made up of publisher Peter Imes, columnist Slim Smith, managing editor Zack Plair and senior newsroom staff.