It’s a tumultuous period for big-time college athletics. The conferences that have long defined Division 1 sports are under siege, some being reconstituted by the month. The highbrow SEC, Pac-12, Big Ten and ACC have been busy annexing schools outside their traditional geographies, fueled by the bonanza of cable TV payouts. And the Big East, just a few years ago the most powerful basketball conference, is now all but gone, functioning as little more than a spare parts store for the big dogs.

Even with all the developments, it was noteworthy last month when the University of Maryland, a charter member of the prestigious ACC, bolted for the Big Ten. It’s one thing for schools from lesser conferences and the fading Big East to make a move up, but Maryland from the storied ACC? The news especially hit home with me, a resident of Big Ten headquarters Chicago, with a son at Big Ten charter UW Madison, and having grown up in Maryland, with friends and relatives from UM.

On the day the move was announced, I received a text from a Big Ten friend. “Maryland to the Big 10” was the first line. “They’re gonna get killed” was the second. I had heard the rumor so wasn’t surprised by the announcement. I was, though, taken aback by the “get killed” assessment. “Killed in what?” was my response, to which my friend replied “Sports!”

We then traded barbs on Maryland and the Big Ten, my friend opining that Maryland was weak in football, me countering that Maryland certainly compares favorably with half of Big Ten teams – and just how good has the Big Ten been in football over the last five years anyway? I then noted Maryland’s successes in men’s and women’s basketball, men’s and women’s soccer, men’s and women’s lacrosse, etc. My friend scoffed, reiterating “They’re gonna get killed.”

A bit miffed, I resolved to bring hard performance data to the question of how Maryland and the ACC stack up athletically with the Big Ten. To that end, a friend told me about the Director’s Cup, an annual program/competition sponsored by the National Association of Collegiate Directors of Athletics, “that honors institutions maintaining a broad-based program, achieving success in many sports, both men's and women's”. A little investigation revealed that the Director’s Cup assesses school athletic performance annually – just what the statistical doctor ordered.

For D1 schools like those in the ACC and Big Ten, the annual DC rankings are determined from accumulated points measuring performance in 20 sports, 10 men’s and 10 women’s, conducted over the Fall, Winter and Spring seasons. All sports are equal-weighted, so the contribution of women’s field hockey is the same as men’s football. A school’s overall annual score is the sum of individual scores across each sport and season. At the end of the Spring season, schools are ranked by the aggregated scores.
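The aggregation described above can be sketched in a few lines of Python. This is a hypothetical illustration of the equal-weighting logic only; the sport names and point values below are invented, and the real NACDA scoring tables are more involved.

```python
# Sketch of the Director's Cup aggregation: equal-weighted points per sport,
# summed per school, then ranked. All numbers here are made up for illustration.

def annual_score(sport_points):
    """Sum a school's points across its scored sports (all equal-weighted)."""
    return sum(sport_points.values())

def rank_schools(scores):
    """Rank schools by aggregate score, highest first (rank 1 is best)."""
    ordered = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return {school: i + 1 for i, (school, _) in enumerate(ordered)}

scores = {
    "Stanford": annual_score({"football": 60, "field hockey": 90, "swimming": 100}),
    "Maryland": annual_score({"football": 50, "field hockey": 100, "swimming": 40}),
}
print(rank_schools(scores))
```

Note that women’s field hockey enters the sum exactly as men’s football does, which is the equal-weighting point made above.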

The NACDA website publishes Cup performance data dating from 1993-94 to 2011-12 for D1 universities. For the analysis presented below, I scraped information from the twelve school years 2000-01 to 2011-12. Once the data were in text files, I munged them with a series of Python scripts, ultimately producing a single csv file with attributes year, school, score and rank. I then added conference affiliation for the schools of interest and loaded the data into both R and Tableau. A quick first Tableau exploration revealed that Stanford finished on top every year. Yikes!
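The munging step can be sketched roughly as follows. The whitespace-delimited line layout and file handling here are assumptions for illustration, not the actual NACDA report format, but the shape of the output — one csv with year, school, score and rank — matches what the analysis used.

```python
# Minimal sketch of the munge: per-year text lines -> one csv of
# (year, school, score, rank). The input layout is an assumption.
import csv
import io

def parse_year_file(year, lines):
    """Yield (year, school, score, rank) from lines like '1 Stanford 1497.50'."""
    for line in lines:
        body, score = line.rsplit(None, 1)     # score is the last field
        rank, school = body.split(None, 1)     # rank is the first field
        yield (year, school.strip(), float(score), int(rank))

def write_csv(rows, out):
    """Write the combined rows with a header matching the analysis attributes."""
    w = csv.writer(out)
    w.writerow(["year", "school", "score", "rank"])
    w.writerows(rows)

sample = ["1 Stanford 1497.50", "2 Ohio St. 1222.25"]
buf = io.StringIO()
write_csv(parse_year_file("2011-12", sample), buf)
print(buf.getvalue())
```

Splitting from both ends (rank first, score last) keeps multi-word school names like “Ohio St.” intact without needing a fixed-width parse.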

There were a few problems with the data. First, for each year, entries were posted only for schools that actually earned Cup points; scoreless institutions were not included in the list. Fortunately, all ACC and Big Ten schools earned points each year. Second, for some reason, 2004-2005 and 2005-2006 presented data for only the top 100 scorers, omitting many schools with positive numbers. Again, I had the good fortune that all Big Ten and ACC schools were represented those years. Third, over time, many school names in the reports changed from, say, Duke to Duke (NC.), and had to be standardized. Typical data science stuff. Finally, I used 2012-2013 conference affiliations for the analysis, rather than acknowledging conference as a dimension that can change from year to year. All of Nebraska’s history, for example, is attributed to the Big Ten, even though the Cornhuskers have only been a conference member since 2011.
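The name-standardization fix can be sketched like so. The alias table and the trailing-parenthetical pattern are illustrative assumptions; the real cleanup would enumerate whatever variants actually appear in the reports.

```python
# Sketch of standardizing school names that drifted over the years,
# e.g. "Duke (NC.)" back to "Duke". The alias table is illustrative.
import re

ALIASES = {
    "Duke (NC.)": "Duke",
}

def standardize(name):
    """Map a reported school name to one canonical form."""
    name = ALIASES.get(name.strip(), name.strip())
    # Fallback: drop a trailing parenthetical state tag, e.g. "(Md.)".
    return re.sub(r"\s*\([A-Za-z.]+\)$", "", name)

print(standardize("Duke (NC.)"))      # Duke
print(standardize("Maryland (Md.)"))  # Maryland
```

With names canonicalized, the per-year rows group cleanly by school before adding the conference-affiliation column.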

Tableau is great for depicting the performance of the ACC against the Big Ten, and for showing how Maryland compares to its future Big Ten peers. Figure 1 is a small-multiples graph contrasting the Big Ten and the ACC on average DC rank by school over the twelve years (smaller is better). From this view and with this data, the Big Ten appears to be the better-performing conference – especially at the bottom.

If Figure 1 presents the forest, Figure 2 displays the trees, interleaving Big Ten and ACC by average performance rank. And, lo and behold, Maryland ends up smack-dab in the middle of the Big Ten – six schools above and six below – with these data. Maryland also bests my Big Ten friend’s school. Perhaps he’s guilty of a little availability or proximity bias?

To those who chirp that Maryland’s not a worthy academic addition to the Big Ten, consider the most recent U.S. News rankings, which would again have Maryland squarely in the center of the Big Ten – below six schools and above six others.

Welcome to the Big Ten, University of Maryland. Happy Holidays to All!