Today's post is about the company Academic Analytics, which describes its mission on its LinkedIn profile as follows: "Academic Analytics partners with research institutions to provide data that offer comprehensive information on faculty research activity as well as strategic solutions drawn from analyses of national disciplines and custom comparisons at the institutional, department, program, and individual faculty levels. The data have tremendous potential, supporting strategic investment decisions, benchmarking research activity, reviewing academic units, and developing the university research enterprise."
This is apparently not the message the company used to have, according to an October 2016 article in The Chronicle of Higher Education: "Just a few years ago, Academic Analytics, an upstart company providing data on faculty productivity, talked of helping cash-strapped universities save as much as $2 billion by identifying their lowest-performing professors." The article explains that Academic Analytics toned down its message after Georgetown University cancelled its subscription in 2016. Georgetown's provost Robert Groves "explained a decision to drop the university’s subscription by questioning whether Academic Analytics’ data are comprehensive, accurate, or consistently valuable." Readers can learn the details of the study Georgetown performed in the summer of 2016 to investigate the accuracy of the Academic Analytics information here, although Provost Groves took pains to state, in a follow-up post dated July 2017 available there, that: "Academic Analytics does a good job of collecting and making available the data that it purports to collect, and [the] data largely are accurate. For that reason Academic Analytics can be a useful tool for universities, and it has the potential to become even more useful as it expands the nature and types of data it collects."
The American Association of University Professors posted a "Statement on Academic Analytics and Research Metrics" on its website back in March 2016. It mentioned: "[Academic Analytics] now claims 385 institutional customers in the U.S. and abroad, representing about 270,000 faculty members in 9,000 Ph.D. programs and 10,000 departments. Some of the firm's metrics are without any qualitative dimension: per capita production of books and articles, for example. Other figures, such as per capita citations and "awards" per faculty member, may be said to introduce some qualitative dimensions, but seem to produce puzzling results." It described concerns at Rutgers (a contract worth $492,500), including some about data accuracy stemming from faculty members' inability to check their data for correctness, since the portal is only accessible to administrators. (It seems that universities can now choose to have their faculty members see and perhaps edit their data.)
The statement also explains, "The faculty of the university's School of Arts and Sciences [at Rutgers] voted overwhelmingly to forbid use of the firm's data in tenure and promotion decisions or "in decisions affecting the composition of the faculty, graduate and undergraduate curricula, and grant‐writing." The resolution also called on the university to distribute personal data collected by the firm to each faculty member of the school."
It goes on to say: "Most faculty members have some sort of direct experience of metrics used to assess performance... There is, however, good reason to doubt the utility of such metrics in tenure and promotion decisions and/or in judgments affecting hiring, compensation or working conditions. A 2015 study by the Higher Education Funding Council for England, where use of research metrics is now required at public institutions... found that indicators can be misused or "gamed," [...and] concluded that "carefully selected indicators can complement decision‐making," but expert judgment and qualitative measures that respect research diversity remain essential elements of a representative process of peer review, which should remain "the primary basis for evaluating research outputs, proposals and individuals.""
I have found more information on Northeastern University's data access website, which provides a User Guide to Faculty Insight, a key module among the offerings of Academic Analytics that "enables faculty and academic staff to find targeted funding opportunities and experts at their institution". The Faculty Insight portal is open to all Northeastern faculty. Of greater concern is the Benchmarking Suite, which the Northeastern website says is used to perform departmental program reviews, identify vulnerabilities, strengths, and weaknesses for strategic planning, and analyze faculty career profiles and trajectories by discipline.
(Screenshot taken of this website on 10/10/20)
While data has its place in higher ed, it is particularly worrisome that the evaluation of key components of the academic enterprise, from departmental programs to individual faculty members, could be farmed out to a non-academic, for-profit entity. This approach also completely fails to recognize each university's brand, its departments' brands (especially in doctoral programs), its signature strengths, and all the qualitative factors that explain why certain students decide to enroll in one university over another.
Georgetown and Rutgers are not the only universities where concerns have been raised about Academic Analytics. Another famous, more recent example is UT Austin. In 2018, UT Austin faculty joined a campaign against Academic Analytics, as described in another Chronicle article, which refers to Academic Analytics as a "faculty-productivity company". (Readers will notice that Academic Analytics' self-description on LinkedIn does not use the word "productivity".) The opening paragraph states: "The University of Texas at Austin this week became one of the most prestigious research institutions to join a faculty rebellion against Academic Analytics, a data company that promises to identify low-performing professors." UT Austin's Faculty Council (its equivalent of a Faculty Senate) passed a resolution about Academic Analytics in January 2018 recommending that UT Austin "not adopt Academic Analytics, LLC as a faculty management tool" (the term "faculty management tool" already sounds quite ominous). Further, it urged the administration, should it adopt Academic Analytics anyway, not to use its data to allocate resources among faculty, or in decisions affecting the composition of the faculty, tenure and promotion, raises, or undergraduate and graduate curricula. It also urged the administration to make all personal data available to faculty members for their review, correction, and ratification. The follow-up to the administration and its response are here.
As a data scientist, I am always interested in ways to use data to make better decisions, especially in unusual applications such as faculty research. More broadly, data can help foster conversations about the right metrics for evaluating one's field and about opportunities to leverage one's strengths. There are, however, clear pitfalls in letting a for-profit company wield such outsized influence over so much of academia, with so many universities relying on the same Academic-Analytics-infused thinking to improve their academic standing. Universities already compete aggressively for the most meritorious students, spending extensive amounts of aid to enroll ever-better students (as measured by ACT and SAT scores); one result has been a recent national refocus toward need-based financial aid and an update by US News to its ranking criteria to put less emphasis on "student excellence" (this WaPo article is a great analysis of some of the recent changes and what remains wrong with the US News rankings). It often seems that U.S. universities do everything they can to shine on whatever indicators US News uses in its rankings, and, although data is valuable, it would not reflect well on higher education if universities now fell in line behind Academic Analytics to evaluate their own faculty and their research productivity.
Here is, however, one possibly useful way to leverage Academic Analytics. Deans and Provosts know their peers and aspirational peers at the school/college or university level, but it is very hard for them to know how individual departments in disciplines other than their own measure up, except by looking at the department-level (graduate) US News rankings. A department head may have in mind five peers and five aspirational peers, but how can a Dean from another discipline, let alone a Provost from a totally different field, augment that thinking or come up with, as part of a brainstorming session with the department, other peer/aspirational-peer departments that would be more aligned with the strategy the Dean or Provost has for the school or university? Under those circumstances it could be valuable to have access to a database like Academic Analytics. But it is not clear that it would then be worth its hefty price tag to the many administrators who use it.