What can machines discover from scholarly content?

Just when you thought that everything was known about the academic user journey, a workshop comes along (the WSDM Workshop on Scholarly Web Mining, SWM 2017, held in Cambridge on 10 February 2017) that presents a whole new set of tools and investigations to consider.

It was a rather frantic event, squeezing no fewer than 11 presentations into a half-day session, even though it took place in the sumptuous and rather grand surroundings of the Council Chamber in the Cambridge Guildhall. Trying to summarise all 11 presentations would be a challenge; were there any common areas of inquiry?

How TrendMD uses collaborative filtering to show relatedness

TrendMD is (as its website states) “a content recommendation engine for scholarly publishers, which powers personalized recommendations for thousands of sites”. An interesting blog post by Matt Cockerill of TrendMD (published February 2016) claims “TrendMD’s collaborative filtering engine improves clickthrough rates 272% compared to a standard ‘similar article’ algorithm in an A/B trial”. That sounds pretty impressive.
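TrendMD’s engine itself is proprietary, but the basic idea behind collaborative filtering on reader behaviour can be sketched in a few lines. Everything below (the sessions and article names) is invented for illustration: count how often two articles are viewed by the same reader, then recommend the articles most often co-viewed with the one currently being read.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical click sessions: each list holds the articles one reader viewed.
sessions = [
    ["A", "B", "C"],
    ["A", "B"],
    ["B", "C", "D"],
    ["A", "D"],
]

# Count how often each pair of articles appears in the same session.
co_clicks = defaultdict(int)
for session in sessions:
    for x, y in combinations(sorted(set(session)), 2):
        co_clicks[(x, y)] += 1

def recommend(article, k=3):
    """Rank other articles by how often they co-occur with `article`."""
    scores = {}
    for (x, y), n in co_clicks.items():
        if x == article:
            scores[y] = scores.get(y, 0) + n
        elif y == article:
            scores[x] = scores.get(x, 0) + n
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("A"))  # article B co-occurs with A most often, so it ranks first
```

A real engine would work at far larger scale and weight signals more carefully, but the contrast with a “similar article” algorithm is already visible: relatedness here comes from what readers actually do, not from textual similarity.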

The Journal Impact Factor and the Publishing Business

The Journal Impact Factor has been discussed, and criticized, for years. A recent Scholarly Kitchen article looks at another proposal for improving the impact factor (Optical Illusions, 21 July 2016). This is by no means the first suggested improvement to the impact factor metric – a search on Scholarly Kitchen itself reveals there are several posts on this topic each year.

Perhaps the biggest problem with the Journal Impact Factor is this. Most journals, from Nature to the smallest title, show a similar distribution when citations are counted per article: a few articles are cited heavily, followed by a very long tail of articles that receive few or even zero citations. We all know this, yet we persist in believing that a Journal Impact Factor is somehow representative of each article in that journal.
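A toy example makes the point (the citation counts below are invented for illustration): with a long tail, the mean citation count, which is what an impact-factor-style metric reports, sits far above what a typical article in the journal actually receives.

```python
from statistics import mean, median

# Hypothetical per-article citation counts for one journal: a couple of
# highly cited papers, then a long tail of rarely cited ones.
citations = [120, 45, 9, 4, 2, 1, 1, 0, 0, 0]

print(mean(citations))    # impact-factor-style average: 18.2
print(median(citations))  # the typical article: 1.5
```

The average of 18.2 describes almost none of the ten articles, which is exactly the complaint made against journal-level metrics.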

Did anyone read my article? Did it have any impact?

Elsevier Library Connect Research Impact Metrics Cards

Any author will ask questions such as the ones above, and academic authors are no exception. In one sense, we have better answers than were possible just 20 years ago. Back then, although thousands of copies of a print book might be sold each year, little evidence came back to the publisher that those books were actually read. In fact, one joke among publishers was that encyclopedias and bibles had one thing in common: they were more bought than read. A typical publisher would receive just a handful of comments from readers each year. As publishers, we knew the books had sold, but we didn’t know if they had ever been read. So if an author asked us whether anyone had read their book, we couldn’t say.

A Day in the Life of a (Serious) Researcher

How do researchers really look for and find content for their research? That’s a pretty fundamental question! So I turned with great anticipation to the research project “A Day in the Life of a (Serious) Researcher”, hoping to identify the part of researcher activity that relates to seeking and finding information. I found the study exciting, but questionable in some of its conclusions.

One way of dealing with the rising price of textbooks

Textbooks (Flickr, CC BY 2.0)

Textbook prices are increasing steadily, so what should we do about it?

An interesting article in the Financial Times (Monday 16 May 2016), “Rising Price of Textbooks reaches a tipping point”, reveals the problem and one tutor’s response. US Census Bureau statistics show that textbook prices increased by more than 800 per cent between 1978 and 2014, more than triple the rate of inflation.
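A quick check of what “more than triple the rate of inflation” means in numbers. The CPI figures below are approximate US CPI-U annual averages, used here only to sanity-check the comparison:

```python
# Approximate US CPI-U annual averages (assumed figures for illustration).
cpi_1978, cpi_2014 = 65.2, 236.7

# General price inflation over the same 1978-2014 window, in per cent.
inflation_increase = (cpi_2014 / cpi_1978 - 1) * 100  # roughly 260 per cent

textbook_increase = 800  # per cent, per the Census Bureau figure cited above

print(textbook_increase / inflation_increase)  # roughly 3, i.e. "triple"
```

So an 800 per cent rise against general inflation of roughly 260 per cent is indeed about triple, which matches the FT’s framing.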
