Until university leaders can supply politicians with better approaches to accountability, they’re just going to count. In Norway right now, we’re in the midst of counting season. And because of the sharp folks at the Current Research Information System in Norway (CRIStin), we have detailed numbers. (Disclosure: I’m the head of the board at CRIStin.)
Along with counting comes comparison. So, here’s a little teaser from the high north and, sure, I’d like to know how it stacks up against your university.
In 2013, the 18,709 academic staff members at Norwegian colleges and universities were (co)authors on a total of 17,493 research articles, for an average of just less than one (co)authorship per academic employee.
The biggest and oldest university in Norway (Oslo) has 3,344 academic positions and generated 5,102 (co)authorships, for an average of about 1.5 publications per academic employee. The smallest and youngest university (Nordland) has 308 academic staff and 169 (co)authorships, for an average of about 0.5 publications per academic employee.
Academics in Oslo, from this perspective, publish three times as much as academics in Nordland.
In the Norwegian budgetary system, points are assigned through an elaborate algorithm building in part on the (alleged) quality of journals. Universities, through this system, get paid for their publications.
According to the point system, the University of Oslo has about 30 times as many points as the University of Nordland. But Oslo also has about 10 times as many employees.
Oslo, in other words, comes out as about three times more productive than Nordland either way: whether we count (co)authorships or points in the Norwegian system.
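The back-of-the-envelope arithmetic above can be sketched directly from the figures quoted in this post. The point totals are not given in absolute terms, only as ratios, so the second comparison just divides the stated 30:1 point ratio by the roughly 10:1 staff ratio:

```python
def per_capita(output, staff):
    """Output (authorships or points) per academic employee."""
    return output / staff

# Figures quoted above for 2013.
oslo_staff, oslo_authorships = 3344, 5102
nordland_staff, nordland_authorships = 308, 169

# Ratio of per-capita (co)authorships, Oslo vs. Nordland.
authorship_ratio = (per_capita(oslo_authorships, oslo_staff)
                    / per_capita(nordland_authorships, nordland_staff))

# For points, only ratios are stated: ~30x the points on ~10x the staff.
points_ratio = 30 / 10

print(round(authorship_ratio, 1))  # roughly 2.8, i.e. "about three times"
print(points_ratio)                # 3.0
```

Both measures land near the same factor of three, which is exactly the coincidence the next section picks apart.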
The cost of co-authorship
These two ways of counting shouldn’t necessarily line up so nicely, and the fact that they do hides some important differences. Nearly 40% of the publications at the University of Oslo are at the highest level in the point system while only about 20% of those from the University of Nordland are there. From this perspective, we would expect the points per employee in Oslo to distinguish the institutions even more than the (co)authorships do.
On the other hand, about 40% of Oslo’s articles are written with international co-authors while only 30% of Nordland’s are. Our Ministry — which claims to want to encourage international cooperation — actually punishes universities for international co-authorship by reducing the number of points the Norwegian institution is credited with.
(As I noted here, eliminating the reduction that follows from international co-authorship is a no-brainer for improving the system.)
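To see why international co-authorship can depress an institution's score, here is a deliberately simplified sketch of fractional crediting. The function and numbers are hypothetical illustrations, not the Ministry's actual algorithm; the only assumption taken from the text is that an institution's credit shrinks as the share of authors outside it grows:

```python
def institution_points(level_points, local_authors, total_authors):
    """Simplified fractional crediting (hypothetical): the institution
    receives the share of a publication's points that corresponds to
    its share of the author list."""
    return level_points * (local_authors / total_authors)

# Same journal level, same two local researchers, in both cases:
solo_paper = institution_points(3.0, local_authors=2, total_authors=2)
intl_paper = institution_points(3.0, local_authors=2, total_authors=4)

print(solo_paper)  # 3.0 — all authors local, full credit
print(intl_paper)  # 1.5 — two foreign co-authors halve the credit
```

Under any scheme of this general shape, adding international co-authors mechanically reduces the Norwegian institution's points, which is the disincentive the Ministry's stated goals contradict.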
Open access policy goals
The Norwegian point system is not being used creatively to pursue policy goals; unless, that is, the government’s only policy goal is getting researchers to publish more, which at the very least lacks nuance.
One of the most exciting opportunities would be to use the point system to push researchers towards open access publication. This should be a policy issue of concern to the Norwegian government, not least of all because of the sanctions recently reported in Nature.
The London-based Wellcome Trust says that it has withheld grant payments on 63 occasions in the past year because papers resulting from the funding were not open access. And the NIH, in Bethesda, Maryland, says that it has delayed some continuing grant awards since July 2013 because of non-compliance with open-access policies.
European-level research funders are sure to join this trend; they, too, have OA regulations in place and will likely act to make sure they are followed. Norway could achieve a competitive advantage in the European context through higher compliance with open access publishing requirements.
And if we do that, we’ll be good at even more than counting.