
For (not against) a better publishing model

I was unhappy to see this piece on the American Sociological Association (ASA) blog by Karen Edwards, the director of publications and membership.

The post is about Sci-Hub, the international knowledge-stealing ring that allows anyone to download virtually any paywalled academic paper for free. (I wrote about it, with description of how it’s used, here.) Without naming me or linking to the post, Edwards takes issue with pieces like mine. She writes:

ASA, other scholarly societies, and our publishing partners have been dismayed by some of the published comments about Sci-Hub that present its theft as a kind of “Robin Hood” fairy tale by characterizing the “victims” as greedy publishers feasting on the profits of expensive individual article downloads by needy researchers.

My first objection is, “ASA … have been dismayed.” There have been many debates about who speaks for ASA, especially when the association took positions on legal issues (their amicus briefs are here). And I’m sure the ASA executives send out letters all the time saying, ASA thinks this or that. But when it’s about policy issues like this post (and when I don’t agree), then I think it’s wrong without some actual process involving the membership. The more extreme case, on this same issue, was when the executive officer, Sally Hillsman, sent this letter to the White House Office of Science and Technology Policy objecting to the federal government’s move toward open access — which most of us only found out about because Fabio Rojas posted it on OrgTheory.

My second objection is to the position taken. In Edwards’ view, the existence of Sci-Hub, “threatens the well-being of ASA and our sister associations as well as the peer assessment of scholarship in sociology and other academic disciplines.”

Because, in her opinion, without paywalls — and Sci-Hub presumably threatens to literally end paywalls — the system of peer-reviewed scholarly output would literally die. As I pointed out in my original piece, if your entire enterprise can be brought down by the insertion of 11 characters into a URL, your system may in fact not be sustainable. Rather than attack Sci-Hub and its users, “ASA” might ask why its vendor is so unable to prevent the complete demolition of its business model by a few keystrokes. But they don’t. Which leads me to the next point.

The Edwards post goes way beyond the untrue claim that there is no other way to support a peer review system, and argues that ASA needs all that paywall money to pay for all the other stuff it does. That is, not only do we need to sell papers to pay for our journal operations (and Sage profits), we also need paywalls because:

ASA is a nonprofit, so whatever revenue we receive from our journals, beyond what it costs us to do the editorial and publications work, goes directly into providing professional and educational services to our members and other scholars in our discipline (whether they are members or not). … The revenue allows ASA to provide sociologists in the field competitive research grants, pre-doctoral scholarships, specialized career development, and new digital teaching resources among many other services. It is what allows us to work effectively with other social science associations to sustain and, hopefully, grow the flow of federal research dollars to the social sciences through NSF, NIH, and many others and to defend against elimination and cuts to federal support (e.g., statistical systems and ongoing surveys) so scholars can conduct research and then publish outstanding scholarship.

In other words, as David Mamet’s character Mickey Bergman once put it, “Everybody needs money. That’s why they call it money.”

This means that finding the best model for getting sociological research to the most people with the least barriers is not as important as all the other stuff ASA does — even if the research is publicly funded. I don’t agree.

Better models

There are better ways. Contrary to popular misconceptions, we do not need to go to a system where individual researchers pay to publish their work, widening status inequalities among researchers. The basic design of the system to come is this: we cut out the for-profit publishers, and ask the universities and federal agencies that currently pay for research twice — once for the researchers, and once again for their published output — to agree to pay less in exchange for all of it to be open access. Instead, they pay into a central organization that administers publication funds to scholarly associations, which produce open-access research output. For a detailed proposal, read this white paper from K|N Consultants, “A Scalable and Sustainable Approach to Open Access Publishing and Archiving for Humanities and Social Sciences.” (Others are trying as well; check out the efforts of the American Anthropological Association.)

This should be easy — more access, accountability, and efficiency, for less — but it’s a difficult political problem, made all the more difficult by the dedicated efforts of those whose interests are threatened by the possibility of slicing out the profit (and other surplus) portions of the current paywall system. The math is there, but the will and the organizational efforts are lagging badly, especially in the leadership of ASA.


Filed under In the news

How to steal 50 million paywalled papers


From Flickr CC / https://flic.kr/p/9KU6T1

I’m not a criminal mastermind, so I could be wrong, but I can’t think of a way to steal more — defined by list price — per unit of effort than using sci-hub to access paywalled papers I don’t have legitimate subscription access to read.

I don’t know the scientific term for this, but there has to be some way to describe the brokenness of a system based on the ratio of effort expended to damage done. For example, there are systems where even large effort is unlikely to cause serious harm (the US nuclear weapons system), versus those where minor efforts succeed but cause acceptable levels of harm (retail shoplifting). And then there is intellectual property, where small investments can inflict billions of dollars worth of damage.

Syed Ali and I have a short piece with some links to the news on sci-hub at Contexts. Alexandra Elbakyan says it took her about three days to develop the system that now gives anyone in the world access to almost 50 million paywalled articles. Of course, a lot of people help a little, by providing her with access information from their subscribing universities, but it still seems like a very low ratio of criminal energy to face-value payoff.

Example

Now that sci-hub is in place, how hard is it for an untrained individual to steal a $40 article while risking almost nothing? As hard as it is to insert 11 characters into a paywall URL and wait a few seconds (plus your share of the one hour I expended on this post).

Here’s an example. In the journal Society, published by Springer, an article in the current issue is currently available for $39.95 to non-subscribers. But Society is a “hybrid open access” journal, which means authors or their institutions can pay to have their paper unlocked for the public (I don’t know how much it costs to unlock the article, but let’s just assume it’s a rollicking awesome deal for Springer).

For this example I use one of the unlocked articles, so you can try it without stealing anything, if that feels more ethical to you; it works exactly the same way for the locked ones.

The article is “Saving the World, or Saving One Life at a Time? Lessons my Career with Médecins Sans Frontières/Doctors Without Borders/MSF has Taught Me,” by Sophie Delaunay. This is the launch page for the article:

http://link.springer.com/article/10.1007/s12115-015-9965-4

From there, you can download the PDF (it would say “Buy Now” if the article weren’t unlocked) at this link:

http://link.springer.com/content/pdf/10.1007/s12115-015-9965-4.pdf

Or you can steal it for free by inserting

.sci-hub.io

into the URL, after the corporate domain thing, like this:

http://link.springer.com.sci-hub.io/content/pdf/10.1007/s12115-015-9965-4.pdf

Don’t ask me how it really works, but basically it checks if the article has been requested before — in which case it’s cached somewhere — and if it hasn’t been requested before it uses fake login information to go get it, and then it stores the copy somewhere for faster retrieval for the next person. That’s why your stolen PDF may have a little tag at the bottom that says something like “Downloaded from journal.publisher.com at Unfamous University on Recent Date.” If the article comes up instantly, you didn’t really steal it, you’re just looking at a stolen copy; if you have to watch the little thing spin first then it’s being stolen for you. With this incredibly smart design the system grows by itself, according to demand from the criminal reading public.
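The 11-character trick can be sketched in a few lines of Python. This is a toy illustration of the URL rewriting described above, nothing more — the mirror domain has changed many times since this was written, and the function names are my own:

```python
from urllib.parse import urlparse, urlunparse

def mirror_url(url, mirror="sci-hub.io"):
    """Append a mirror domain to the publisher's hostname, so that
    link.springer.com becomes link.springer.com.sci-hub.io.
    (Illustrative only; actual mirror domains change frequently.)"""
    parts = urlparse(url)
    new_netloc = parts.netloc + "." + mirror  # insert ".sci-hub.io" after the domain
    return urlunparse(parts._replace(netloc=new_netloc))

print(mirror_url(
    "http://link.springer.com/content/pdf/10.1007/s12115-015-9965-4.pdf"))
# http://link.springer.com.sci-hub.io/content/pdf/10.1007/s12115-015-9965-4.pdf
```

Note that `.sci-hub.io` is exactly 11 characters — the entire "exploit" is a string concatenation.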

What’s the punishment?

I have no idea what risk Alexandra Elbakyan or her compatriots face for their work. I don’t imagine the penalty for any given user is greater than the penalty for shoplifting a $39.95 bottle of Awesome Wasteproduct. And for me sharing this, I would expect the worst thing that would happen would be a stern letter on legal letterhead. But maybe I’m naive.

Anyway, the point is, it says something about the soundness of the academic publishing edifice that doing this much damage to it is this easy.

What are the ethics?

I am aware that some reasonable people think sci-hub is very wrong, while others think the current system is very wrong. I know that many people’s current paychecks depend on this system continuing to malfunction as it does, while others never earn the higher incomes they otherwise could because they can’t get paywalled articles. I understand corporate journals add some value through their investments. And I know that the current system denies many people access to a lot of information, with social costs that are unquantifiable. And there is some inherent value to not breaking the law just in general, while there is also value to breaking bad laws symbolically. How you balance all those factors is up to you.

Some people think it’s even wrong to discuss this. What does that tell you?


Filed under In the news

Basic self promotion


If you don’t care enough to promote your research, how can you expect others to?

These are some basic thoughts for academics promoting their research. You don’t have to be a full-time self-promoter to improve your reach and impact, but the options are daunting and I often hear people say they don’t have time to do things like run a Twitter account or write blogs. Even a relatively small effort, if well directed, can help a lot. Don’t let the perfect be the enemy of the good. It’s fine to do some things pretty well even if you can’t do everything to your ideal standard.

It’s all about making your research better — better quality, better impact. You want more people to read and appreciate your work, not just because you want fame and fortune, but because that’s what the work is for. I welcome your comments and suggestions below. 

Present yourself

Make a decent personal website and keep it up to date with information about your research, including links to accessible copies of your publications (see below). It doesn’t have to be fancy (I have a vested interest in keeping standards low in that department). I’m often surprised at how many people are sitting behind years-old websites.

Very often people who come across your research somewhere else will want to know more about you before they share, report on, or even cite it. Your website gives your work more credibility. Has this person published other work in this area? Taught related courses? Gotten grants? These are things people look for. It’s not vain or obnoxious to present this information, it’s your job. I recommend a good quality photo (others disagree).

Make your work available

Let people read the actual research. Publishing in open-access journals is ideal, because it’s the right thing to do and more people can read it. (My recent article in Sociological Science was downloaded several hundred times within 10 days, which is much more than I would expect from a paywalled journal.)

Whether or not you do that, share your working paper or preprint versions. This is best done in your university repository (ask your library) or a public disciplinary archive. (For prominent examples, check out the University of California’s eScholarship or Harvard’s DASH; I use the working paper site of the Maryland Population Research Center, run by the University of Maryland.) If you put papers only on your own university website, they will show up in web searches (including Google Scholar), but they won’t be properly tagged and indexed for things like citation or grant analysis, or archived — so host them in a repository and just put links on your website. And don’t link only to the paywalled version; that’s the click of death for someone just browsing around.

Don’t be intimidated by copyright. You can almost always put up a preprint without violating any agreement (ideally you wouldn’t publish anywhere that makes you take it down afterwards), and even if you have to take it down eventually you get months or years to share it first. No one will sue you or fire you — the worst outcome is being asked to take it down, which is very rare. Don’t prioritize protecting the journal’s proprietary right to promotion over serving the public (and your career) by getting the research out there, as soon as it’s ready. To see the policies of different journals regarding self-archiving, check out the simple database at SHERPA/RoMEO.

I oppose private sites like Academia.edu and ResearchGate. These are just private companies doing what your university and its library are already doing for the public. Your paper will not be discovered more if it is on one of these sites. It will show up in a Google search if you put it on your website or, better, in a public repository.

I’m not an open access purist, at least for sociology. If you got public money to develop a cure for cancer, that’s different. For us, not everything has to be open access (books, for example), but the more it is the better, especially original research. Anyway, it would be great if sociology got more into open science (for example, with the Open Science Framework). People for whom code is big already use sites like GitHub for sharing, which is beyond me; in your neck of the woods that can be great for getting your work out, too.

Share your work

In the old days we used to order paper reprints of papers we published and literally mail them to the famous and important people we hoped would read and cite them. Nowadays you can email them a PDF. Sending a short note that says, “I thought you might be interested in this paper I wrote” is normal, reasonable, and may be considered flattering. (As long as you don’t follow up with repeated emails asking if they’ve read it yet.)

Social media

I recommend at least basic social media, Twitter and Facebook. This does not require a massive time commitment — you can always ignore them. Setting up a public profile on Twitter or a page on Facebook gives people who do use them all the time a way to link to you and share your profile. If someone wants to show their friends one of my papers on Twitter, this doesn’t require any effort on my part. They tweet, “Look at this awesome new paper @familyunequal wrote!” When people click on the link they go to my profile, which tells them who I am and links to my website. I do not have to spend time on Twitter for this to work. (I chose @familyunequal because familyinequality was too long and I didn’t want to use my name because I was determined not to use Twitter for personal stuff. I think something closest to your name is ideal, but don’t not do this because you can’t think of the perfect handle.)

Of course, an active social media presence does help draw people into your work. But even low-level attention will help: posting or tweeting links to new papers, conference presentations, other writing, etc. No need to get into snarky chitchat and following hundreds of people if you don’t want to.

To see how others are using Twitter, you can visit the list I maintain, which has more than 600 sociologists. This is useful for comparing profile and feed styles.

Other writing

People who write popular books go on book tours to promote them. People who write minor articles in sociology might send out some tweets, or share them with their friends on Facebook. In between are lots of other places you can write something to help people find and learn about your work. I recommend blogging, but that can be done different ways.

As with publications themselves, there are public and private options, and I’m not a purist. (Some of my blog posts at the Atlantic, for which I used to get paid a little, were literally sponsored by Exxon, which I didn’t notice at first because I only looked at the site with my ad-blocker on.) But again public usually works better in addition to feeling better.

There are some good organizations now that help people get their work out. In my area, for example, the Council on Contemporary Families is great (I’m on their board), producing research briefs related to new publications, and helping to bring them to the attention of journalists and editors. Others work with the Scholars Strategy Network, which helps people place op-eds and other public writing. The great non-profit site The Society Pages includes lots of avenues for writing about your research. In addition, there are blogs run by sections of the American Sociological Association (like Work in Progress, from the Organizations, Occupations, and Work section) or other professional associations, and various group blogs.

And there is Contexts (of which I’m co-editor), the general interest magazine of ASA, where we would love to hear proposals for how you can bring your research out into the open (for the magazine or our blog).


Filed under Me @ work

Journal self-citation practices revealed

I have written a few times about problems with peer review and publishing.* My own experience subsequently led me to the problem of coercive self-citation, defined in one study as “a request from an editor to add more citations from the editor’s journal for reasons that were not based on content.” I asked readers to send me documentation of their experiences so we could air them out. This is the result.

Introduction

First let me mention a new editorial in the journal Research Policy about the practices editors use to inflate their Journal Impact Factor, a measure of citations that many people use to compare journal quality or prestige. One of those practices is coercive self-citation. The author of that editorial, Ben Martin, cites approvingly a statement signed by a group of management and organizational studies editors:

I will refrain from encouraging authors to cite my journal, or those of my colleagues, unless the papers suggested are pertinent to specific issues raised within the context of the review. In other words, it should never be a requirement to cite papers from a particular journal unless the work is directly relevant and germane to the scientific conversation of the paper itself. I acknowledge that any blanket request to cite a particular journal, as well as the suggestion of citations without a clear explanation of how the additions address a specific gap in the paper, is coercive and unethical.

So that’s the gist of the issue. However, it’s not that easy to define coercive self-citation. In fact, we’re not doing a very good job of policing journal ethics in general, basically relying on weak enforcement of informal community standards. I’m not an expert on norms, but it seems to me that when you have strong material interests — big corporations using journals to print money at will, people desperate for academic promotions and job security, etc. — and little public scrutiny, it’s hard to regulate unethical behavior informally through norms.

The clearest cases involve asking for self-citations (a) before final acceptance, for citations (b) within the last two years and (c) without substantive reason. But there is a lot short of that to object to as well. Martin suggests that, to answer whether a practice is ethical, we need to ask: “Would I, as editor, feel embarrassed if my activities came to light and would I therefore object if I was publicly named?” (Or, as my friend Matt Huffman used to say when the used-textbook buyers came around offering us cash for books we hadn’t paid for: how would it look in grainy hidden-camera footage?) I think that journal practices, which are generally very opaque, should be exposed to public view so that unethical or questionable practices can be held up to community standards.
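The arithmetic behind the incentive is simple: the two-year Impact Factor is citations received this year to a journal's items from the previous two years, divided by the number of items published in those two years. So for a small journal, a handful of coerced self-citations moves the score noticeably. A toy calculation with made-up numbers:

```python
def impact_factor(citations_to_prev_two_years, items_prev_two_years):
    """Two-year Journal Impact Factor: citations received this year to
    articles from the previous two years, divided by the number of
    articles published in those two years."""
    return citations_to_prev_two_years / items_prev_two_years

# Hypothetical small journal: 80 articles over two years, 120 citations.
base = impact_factor(120, 80)       # 1.5
# Ten coerced self-citations slipped in during revisions:
inflated = impact_factor(130, 80)   # 1.625
print(base, inflated)
```

Ten extra citations raise this hypothetical journal's score by more than 8 percent — cheap relative to the effort of publishing better papers.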

Reports and responses

I received reports from about a dozen journals, but a few could not be verified or were too vague. These 10 were included under very broad criteria — I know that not everyone will agree that these practices are unethical, and I’m unsure where to draw the line myself. In each case below I asked the current editor if they would care to respond to the complaint, doing my best to give the editor enough information without exposing the identity of the informant.

Here in no particular order are the excerpts of correspondence from editors, with responses from the editors to me, if any. Some details, including dates, may have been changed to protect informants. I am grateful to the informants who wrote, and I urge anyone who knows, or thinks they know, who the informants are not to punish them for speaking up.

Journal of Social and Personal Relationships (2014-2015 period)

Congratulations on your manuscript “X” having been accepted for publication in Journal of Social and Personal Relationships. … your manuscript is now “in press” … The purpose of this message is to inform you of the production process and to clarify your role in the process …

IMPORTANT NOTICE:

As you update your manuscript:

1. CITATIONS – Remember to look for relevant and recent JSPR articles to cite. As you are probably aware, the ‘quality’ of a journal is increasingly defined by the “impact factor” reported in the Journal Citation Reports (from the Web of Science). The impact factor represents a ratio of the number of times that JSPR articles are cited divided by the number of JSPR articles published. Therefore, the 20XX ratings will focus (in part) on the number of times that JSPR articles published in 20XX and 20XX are cited during the 20XX publication year. So citing recent JSPR articles from 20XX and 20XX will improve our ranking on this particular ‘measure’ of quality (and, consequently, influence how others view the journal). Of course only cite those articles relevant to the point. You can find tables of contents for the past two years at…

Response from editor Geoff MacDonald:

Thanks for your email, and for bringing that to my attention. I agree that encouraging self-citation is inappropriate and I have just taken steps to make sure it won’t happen at JSPR again.

Sex Roles (2011-2013 period)

In addition to my own report, already posted, I received an identical report from another informant. The editor, Irene Frieze, wrote: “If possible, either in this section or later in the Introduction, note how your work builds on other studies published in our journal.”

Response from incoming editor Janice D. Yoder:

As outgoing editor of Psychology of Women Quarterly and as incoming editor of Sex Roles, I have not, and would not, as policy require that authors cite papers published in the journal to which they are submitting.

I have recommended, and likely will continue to recommend, papers to authors that I think may be relevant to their work, but without any requirement to cite those papers. I try to be clear that it is in this spirit of building on existing scholarship that I make these recommendations and to make the decision of whether or not to cite them up to the author. As an editor who has decision-making power, I know that my recommendations can be interpreted as requirements (or a wise path to follow for authors eager to publish) but I can say that I have not further pressured an author whose revision fails to cite a paper I recommended.

I also have referred to authors’ reference lists as a further indication that a paper’s content is not appropriate for the journal I edit. Although never the sole indicator and never based only on citations to the specific journal I edit, if a paper is framed without any reference to the existing literature across journals in the field then it is a sign to me that the authors should seek a different venue.

I value the concerns that have been raised here, and I certainly would be open to ideas to better guide my own practices.

European Sociological Review (2013)

In a decision letter notifying the author of a minor revise-and-resubmit, the editor wrote that the author had left out of the references some recent, unspecified, publications in ESR and elsewhere (also unspecified) and suggested the author update the references.

Response from editor Melinda Mills:

I welcome the debate about academic publishing in general, scrutiny of impact factors and specifically of editorial practices.  Given the importance of publishing in our profession, I find it surprising how little is actually known about the ‘black box’ processes within academic journals and I applaud the push for more transparency and scrutiny in general about the review and publication process.  Norms and practices in academic journals appear to be rapidly changing at the moment, with journals at the forefront of innovation taking radically different positions on editorial practices. The European Sociological Review (ESR) engages in rigorous peer review and most authors agree that it strengthens their work. But there are also new emerging models such as Sociological Science that give greater discretion to editors and focus on rapid publication. I agree with Cohen that this debate is necessary and would be beneficial to the field as a whole.

It is not a secret that the review and revision process can be a long (and winding) road, both at ESR and most sociology journals. If we go through the average timeline, it generally takes around 90 days for the first decision, followed by authors often taking up to six months to resubmit the revision. This is then often followed by a second (and sometimes third) round of reviews and revision, which in the end leaves us at ten to twelve months from original submission to acceptance. My own experience as an academic publishing on other journals is that it can regularly exceed one year. During the year under peer review and revisions, relevant articles have often been published.  Surprisingly, few authors actually update their references or take into account new literature that was published after the initial submission. Perhaps this is understandable, since authors have no incentive to implement any changes that are not directly requested by reviewers.

When there has been a particularly protracted peer review process, I sometimes remind authors to update their literature review and take into account more recent publications, not only in ESR but also elsewhere.  I believe that this benefits both authors, by giving them greater flexibility in revising their manuscripts, and readers, by providing them with more up-to-date articles.  To be clear, it is certainly not the policy of the journal to coerce authors to self-cite ESR or any other outlets.  It is vital to note that we have never rejected an article where the authors have not taken the advice or opportunity to update their references and this is not a formal policy of ESR or its Editors.  If authors feel that nothing has happened in their field of research in the last year that is their own prerogative.  As authors will note, with a good justification they can – and often do – refuse to make certain substantive revisions, which is a core fundament of academic freedom.

Perhaps a more crucial part of this debate is the use and prominence of journal impact factors themselves both within our discipline and how we compare to other disciplines. In many countries there is a move to use these metrics to distribute financing to Universities, increasing the stakes of these metrics. It is important to have some sort of metric gauge of the quality and impact of our publications and discipline. But we also know that different bibliometric tools have the tendency to produce different answers and that sociology fares relatively worse in comparison to other disciplines. Conversely, leaving evaluation of research largely weighted by peer review can produce even more skewed interpretations if the peer evaluators do not represent an international view of the discipline. Metrics and internationally recognized peer reviewers would seem the most sensible mix.

Work and Occupations (2010-2011 period)

“I would like to accept your paper for publication on the condition that you address successfully reviewer X’s comments and the following:

2. The bibliography needs to be updated somewhat … . Consider citing, however critically, the following Work and Occupations articles on the italicized themes:

[concept: four W&O papers, three from the previous two years]

[concept: two W&O papers from the previous two years]

The current editor, Dan Cornfield, thanked me and chose not to respond for publication.

Sociological Forum (2014-2015 period)

I am pleased to inform you that your article … is going to press. …

In recent years, we published an article that is relevant to this essay and I would like to cite it here. I have worked it in as follows: [excerpt]

Most authors find this a helpful step as it links their work into an ongoing discourse, and thus, raises the visibility of their article.

Response from editor Karen Cerulo:

I have been editing Sociological Forum since 2007. I have processed close to 2500 submissions and have published close to 400 articles. During that time, I have never insisted that an author cite articles from our journal. However, during the production process–when an article has been accepted and I am preparing the manuscript for the publisher–I do sometimes point out to authors Sociological Forum pieces directly relevant to their article. I send authors the full citation along with a suggestion as to where the citation be discussed or noted. I also suggest changes to key words and article abstracts. My editorial board is fully aware of this strategy. We have discussed it at many of our editorial board meetings and I have received full support for this approach. I can say, unequivocally, that I do not insist that citations be added. And since the manuscripts are already accepted, there is no coercion involved. I think it is important that you note that on any blog post related to Sociological Forum.

I cannot tell you how often an author sends me a cover letter with their submission telling me that Sociological Forum is the perfect journal for their research because of related ongoing dialogues in our pages. Yet, in many of these cases, the authors fail to reference the relevant dialogues via citations. Perhaps editors are most familiar with the debates and streams of thought currently unfolding in a journal. Thus, I believe it is my job as editor and my duty to both authors and the journal to suggest that authors consider making appropriate connections.

Unnamed journal (2014)

An article was desk-rejected — that is, rejected without being sent out for peer review — with only this explanation: “In light of the appropriateness of your manuscript for our journal, your manuscript has been denied publication in X.” When the author asked for more information, a journal staff member responded with possible reasons, including that the paper did not include any references to the articles in that journal. In my view the article was clearly within the subject area of the journal. I didn’t name the journal here because this wasn’t an official editor’s decision letter and the correspondence only suggested that might be the reason for the rejection.

Sociological Quarterly (2014-2015 period)

In a revise and resubmit decision letter:

Finally, as a favor to us, please take a few moments to review back issues of TSQ to make sure that you have cited any relevant previously published work from our journal. Since our ISI Impact Factor is determined by citations, we would like to make sure papers under consideration by the journal are referring to scholarship we have previously supported.

The current editors, Lisa Waldner and Betty Dobratz, have not yet responded.

Canadian Review of Sociology (2014-2015 period)

In a letter communicating acceptance conditional on minor changes, the editor asked the author to consider citing “additional Canadian Review of Sociology articles” to “help with the journal’s visibility.”

Response from current editor Rima Wilkes:

In the case you cite, the author got a fair review and received editorial comments at the final stages of correction. The request to add a few citations to the journal was not “coercive” because in no instance was it a condition of the paper either being reviewed or published.

Many authors are aware of, and make some attempt to cite the journal to which they are submitting prior to submission and specifically target those journals and to contribute to academic debate in them.

Major publications in the discipline, such as ASR, or academia more generally, such as Science, almost never publish articles that have no reference to debates in them.

Bigger journals are in the fortunate position of having authors submit articles that engage with debates in their own journal. Interestingly, the auto-citation patterns in those journals are seen as “natural” rather than “coerced”. Smaller journals are more likely to get submissions with no citations to that journal and this is the case for a large share of the articles that we receive.

Journals exist within a larger institutional structure that has certain demands. Perhaps the author who complained to you might want to reflect on what it says about their article and its potential future if they and other authors like them do not engage with their own work.

Social Science Research (2015)

At the end of a revise-and-resubmit memo, under “Comment from the Editor,” the author was asked to include “relevant citations from Social Science Research,” with none specified.

The current editor, Stephanie Moller, has not yet responded.

City & Community (2013)

In an acceptance letter, the author was asked to approve several changes made to the manuscript. One of the changes, made to make the paper more conversant with the “relevant literature,” added a sentence with several references, one or more of which were to City & Community papers not previously included.

One of the current co-editors, Sudhir Venkatesh, declined to comment because the correspondence occurred before the current editorial team’s tenure began.

Discussion

The Journal Impact Factor (JIF) is an especially dysfunctional part of our status-obsessed scholarly communication system. Self-citation is only one issue, but it’s a substantial one. I looked at 116 journals classified as sociology in 2014 by Web of Science (which produces the JIF), excluding some misplaced and non-English journals. WoS helpfully also offers a list excluding self-citations, but normal JIF rankings do not make this exclusion. (I put the list here.) On average, removing self-citations reduces the JIF by 14%. But there is a lot of variation. One would expect specialty journals to have high self-citation counts because the work they publish is closely related. Thus Armed Forces and Society has a 31% self-citation rate, and Work & Occupations 25%. But others, like Gender & Society (13%) and Journal of Marriage and Family (15%), are not high. On the other hand, you would expect high-visibility journals to have high self-citation rates, if they publish better, more important work; but on this list the correlation between JIF and self-citation rate is -.25. Here is that relationship for the top 50 journals by JIF, with the top four by self-citation labeled (the three top-JIF journals at bottom-right are American Journal of Sociology, Annual Review of Sociology, and American Sociological Review).

[Scatterplot: 2014 Journal Impact Factor by self-citation rate, top 50 sociology journals]

The top four self-citers are low-JIF journals. Two of them are mentioned above, but I have no idea what role self-citation encouragement plays in that. There are other weird distortions in JIFs that may or may not be intentional. Consider the June 2015 issue of Sociological Forum, which includes a special section, “Commemorating the Fiftieth Anniversary of the Civil Rights Laws.” That issue, just a few months old, as of yesterday includes the 9 most-cited articles that the journal published in the last two years. In fact, these 9 pieces have all been cited 9 times, all by each other — and each article currently has the designation of “Highly Cited Paper” from Web of Science (with a little trophy icon). The December 2014 issue of the same journal also gave itself an immediate 24 self-citations for a special “forum” feature. I am not suggesting the journal runs these forum discussion features to pump up its JIF, and I have nothing bad to say about their content — what’s wrong with a symposium-style feature in which the authors respond to each other’s work? But these cases illustrate what’s wrong with using citation counts to rank journals. As Martin’s piece explains, the JIF is highly susceptible to manipulation beyond self-citation promotion, for example by tinkering with the pre-publication queue of online articles, publishing editorial review essays, and of course outright fraud.
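The self-citation adjustment is simple arithmetic. Here is a minimal sketch, with made-up numbers (the real inputs come from Web of Science counts), of how excluding self-citations changes a journal’s JIF:

```python
# Sketch: how excluding self-citations changes an impact factor.
# All numbers below are invented for illustration.

def impact_factor(cites_to_recent, items_published):
    """JIF = citations this year to items from the prior two years,
    divided by the number of citable items in those two years."""
    return cites_to_recent / items_published

total_cites = 150   # all citations to the journal's prior-two-year items
self_cites = 40     # citations coming from the journal itself
items = 100         # citable items published in those two years

jif_all = impact_factor(total_cites, items)
jif_excl = impact_factor(total_cites - self_cites, items)
reduction = 1 - jif_excl / jif_all

print(f"JIF with self-citations:    {jif_all:.2f}")
print(f"JIF without self-citations: {jif_excl:.2f}")
print(f"Reduction: {reduction:.0%}")
```

With these hypothetical counts, a journal generating about a quarter of its own citations sees its JIF drop by about a quarter when they are excluded — which is why the ranked and self-citation-excluded lists can diverge so much.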

Anyway, my opinion is that journal editors should never add or request additional citations without clearly stated substantive reasons related to the content of the research and unrelated to the journal in which they are published. I realize that reasonable people disagree about this — and I encourage readers to respond in the comments below. I also hope that any editor would be willing to publicly stand by their practices, and I urge editors and journal management to let authors and readers see what they’re doing as much as possible.

However, I also think our whole journal system is pretty irreparably broken, so I put limited stock in the idea of improving its operation. My preference is to (1) fire the commercial publishers, (2) make research publication open-access with a very low bar for publication, and (3) create an organized system of post-publication review to evaluate research quality, with (4) republishing or labeling by professional associations to promote what’s most important.

* Some relevant posts cover long review delays for little benefit; the problem of very similar publications; the harm to science done by arbitrary print-page limits; gender segregation in journal hierarchies; and how easy it is to fake data.

11 Comments

Filed under Uncategorized

Sociology: “I love you.” Economics: “I know.”

Sour grapes, by Sy Clark. https://flic.kr/p/yFT3a


A sociologist who knows how to use Python or something could do this right, but here’s a pilot study (N = 4) on the oft-repeated claim that economists don’t cite sociology while sociologists cite economics.

I previously wrote about the many sociologists citing economist Gary Becker (thousands), compared with, for example, the 0 economists citing the most prominent article on the gender division of housework by a sociologist (Julie Brines). Here’s a little more.

It’s hard to frame the general question in terms of numerators and denominators — which articles should cite which, and what is the universe? To simplify it I took four highly-cited papers that all address the gender gap in earnings: one economics and one sociology paper from the early 1990s, and one of each from the early 2000s. These are all among the most-cited papers with “gender” and “earnings OR wages” in the title from journals listed as sociology or economics by Web of Science.

From the early 1990s:

  • O’Neill, J., and S. Polachek. 1993. “Why the Gender-gap in Wages Narrowed in the 1980s.” Journal of Labor Economics 11 (1): 205–28. doi:10.1086/298323. Total cites: 168.
  • Petersen, T., and L.A. Morgan. 1995. “Separate and Unequal: Occupation Establishment Sex Segregation and the Gender Wage Gap.” American Journal of Sociology 101 (2): 329–65. doi:10.1086/230727. Total cites: 196.

From the early 2000s:

  • O’Neill, J. 2003. “The Gender Gap in Wages, circa 2000.” American Economic Review 93 (2): 309–14. doi:10.1257/000282803321947254. Total cites: 52.
  • Tomaskovic-Devey, D., and S. Skaggs. 2002. “Sex Segregation, Labor Process Organization, and Gender Earnings Inequality.” American Journal of Sociology 108 (1): 102–28. Total cites: 81.

A smart way to do it would be to look at the degrees or appointments of the citing authors, but that’s a lot more work than just looking at the journal titles. So I just counted journals as sociology or economics according to my own knowledge or the titles.* I excluded interdisciplinary journals unless I know they are strongly associated with sociology, and I excluded management and labor relations journals. In both of these types of cases you could look at the people writing the articles for more fidelity. In the meantime, you may choose to take my word for it that excluding these journals didn’t change the basic outcome much. For example, although there are some economists writing in the excluded management and labor relations journals (like Industrial Labor Relations), there are a lot of sociologists writing in the interdisciplinary journals (like Demography and Social Science Quarterly), and also in the ILR journals.

Results

Citations to the economics articles from sociology journals:

  • O’Neill and Polachek (1993): 37 / 168 = 22%
  • O’Neill (2003): 4 / 52 = 8%

Citations to the sociology articles from economics journals:

  • Petersen and Morgan (1995): 6 / 196 = 3%
  • Tomaskovic-Devey and Skaggs (2002): 0 / 81 = 0%

So, there are 41 sociology papers citing the economics papers, and 6 economics papers citing the sociology papers.
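The tally above is just shares of each article’s citations coming from the other discipline’s journals. A minimal sketch, using the counts reported in this post:

```python
# Cross-disciplinary citation shares for the four focal articles.
# "cross" = citations from journals in the other discipline;
# "total" = all citations (counts as reported in the post).

articles = {
    "O'Neill & Polachek 1993 (econ)":       {"cross": 37, "total": 168},
    "O'Neill 2003 (econ)":                  {"cross": 4,  "total": 52},
    "Petersen & Morgan 1995 (soc)":         {"cross": 6,  "total": 196},
    "Tomaskovic-Devey & Skaggs 2002 (soc)": {"cross": 0,  "total": 81},
}

shares = {name: c["cross"] / c["total"] for name, c in articles.items()}

for name, share in shares.items():
    c = articles[name]
    print(f"{name}: {c['cross']}/{c['total']} = {share:.0%}")
```

The asymmetry is stark even in this tiny sample: the sociology-to-economics shares (22% and 8%) dwarf the economics-to-sociology shares (3% and 0%).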

Worth noting also that the sociology journals citing these economics papers are the most prominent and visible in the discipline: American Sociological Review, American Journal of Sociology, Annual Review of Sociology, Social Forces, Sociology of Education, and others. On the other hand, there are no citations to the sociology articles in top economics journals, with the exception of an article in Journal of Economic Perspectives that cited Petersen and Morgan — but it was written by sociologists Barbara Reskin and Denise Bielby. Another, in Feminist Economics, was written by sociologist Harriet Presser. (I included these in the count of economics journals citing the sociology papers.)

These four articles are core work in the study of labor market gender inequality, they all use similar data, and they are all highly cited. Some of the sociology cites of these economics articles are critical, surely, but there’s (almost) no such thing as bad publicity in this business. Also, the pattern does not reflect a simple theoretical difference, with sociologists focused more on occupational segregation (although that is part of the story), as the economics articles use occupational segregation as one of the explanatory factors in the gender gap story (though they interpret it differently).

Anyways.

Previous sour-grapes stuff about economics and sociology:

Note:

* The Web of Science categories are much too imprecise: for example, Work & Occupations — almost entirely a sociology journal — is classified as both sociology and economics.

6 Comments

Filed under Research reports

Quick correction on that 90-percent-of-faculty-are-White thing

The other day I saw a number of anti-racist people tweeting that “nearly 90% of full-time professors are White.” As I have previously complained when 90% of the full professors at my then-school (UNC) were White, I was interested to follow up. Unfortunately, that popular tweet turns out to be a stretched description of a simple error.

The facts are in this Education Department report from May, which was reported at the time by The Ed Advocate, and suddenly started going around the other day for unknown reasons. The “nearly 90%” is the Ed Advocate’s description of 84%, which is the percentage White among full-time full professors, which the original report in one place accidentally describes as just full-time professors. Among all full-time instructional faculty, in fact, 79% are White. So the headline, “Study: Nearly 90 Percent of Full-time Professors Are White,” was a conflation of two errors. It presumably became popular because it put a number to a real problem lots of people are aware of and looking for ways to highlight.

Here is the original chart:

[Chart: race/ethnicity of full-time faculty, from the Education Department report]

The problem of White over-representation among college faculty is not that apparent in this national 79% statistic. Consider, for example, that among all full-time, full-year workers age 40 and older (my made-up benchmark), 71% are non-Hispanic White. Among those with a Masters degree or higher, 77% are White. So faculty, nationally and at all levels, don’t look that different from the pool from which they’re drawn.

The 84% full professor statistic reflects the greater White representation as you move up the academic hierarchy. And that’s not just a question of waiting for younger cohorts with more non-White faculty to age into the professoriate. Because the pipeline isn’t working that well, especially for Black faculty. Which brings me back to my old UNC complaint, which focused mostly on Black under-representation. In 2010 I noted that the North Carolina population was 22% Black, while the UNC faculty was 4.7% Black. But full professors at UNC were just 2.4% Black, while the assistant professors were 7.5% Black. Is that the pipeline working? Well, only 4.5% of the recent faculty hires were Black.

I went back to check on things. As of the 2014 report (they’re all here), the update is that UNC has stopped reporting the numbers by rank, so now all they say is that 5.2% of all faculty are Black, and they don’t report the makeup of recent hires. So take from that what you will.

And what about further up the pipeline? I previously shared numbers showing a drop in Black representation among entering freshmen at the University of Michigan, from 10% to 5% over the 2000s. The trend at UNC is in the same direction:

[Chart: Black share of entering freshmen at UNC]

Of course we always need to be cautious about numbers that support what we already know or believe. Some people will respond to this by saying, “but the point remains.” Right, but if the number is irrelevant to the point, there’s no need to use the number. Plenty of people can say, “In all my undergraduate years, I never had a Black professor,” or some other highly relevant observation.*

On the other hand, others of us need to disabuse ourselves of the notion that progress on under-representation is just happening out there because everyone thinks it should and it’s just a matter of time. That common assumption allows defensive administrators to write things like this caption (from UNC’s 2011-2012 report):

[Chart with caption from UNC’s 2011-2012 report]

This is misleading: There was a big increase in Hispanic students (North Carolina has a growing Hispanic population) and Asian students, and marked drops in Black and American Indian students. But “overall, steady increase” is an easy narrative to sell.

If they scaled that chart from 0 to 12 and dropped Whites, “overall, steady increase” would look like this:

[Chart: the same enrollment data scaled 0 to 12, Whites omitted]
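The effect of rescaling can be sketched in plain text with made-up numbers (the real series is in UNC’s reports): the same decline that looks flat against a 0-100 axis is unmistakable against a 0-12 axis.

```python
# Text-mode sketch of why axis scale matters. The series is invented;
# the point is that an identical decline reads as flat on a 0-100 axis.

def bar(value, scale, width=40):
    """Render a value as a text bar within a fixed-width plot area."""
    return "#" * round(value / scale * width)

series = [10.0, 8.2, 7.0, 6.0, 5.2]  # hypothetical Black share of freshmen, %

for scale in (100, 12):
    print(f"Axis scaled 0 to {scale}:")
    for v in series:
        print(f"  {v:4.1f} |{bar(v, scale)}")
    print()
```

On the 0-100 axis the bars shrink from 4 characters to 2; on the 0-12 axis, from 33 to 17 — the same data, two very different stories.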

* I think I had three great Black professors at Michigan: Walter Allen, Robin D. G. Kelley, and Cecilia Green, each of whom changed my life forever. Sorry if I’m forgetting someone.

Related posts:

3 Comments

Filed under In the news

Shine a light on journal self-citation inflation

Photo by pnc.


Note: Welcome Inside Higher Ed readers. I’d be happy to hear accounts from disciplines other than sociology. Email me at pnc@umd.edu.

In my post on peer review the other day, I mentioned that a journal editor made this request — before she agreed to send the paper out for review:

“If possible, either in this section or later in the Introduction, note how your work builds on other studies published in our journal.”

A large survey on “coercive citation” practices, published in Science in 2012 (paywalled; bootlegged PDF) found that 20% of researchers had, in the previous five years, “received a request from an editor to add more citations from the editor’s journal for reasons that were not based on content.” The survey, which was sent to email lists for academic associations, including the American Sociological Association, found sociologists and psychologists were less likely to report having experienced this practice than were economists and those in business-related disciplines.

The journal I named, Sex Roles, is high on the list of those most frequently mentioned — cited by four respondents, more than any journal outside of business, marketing, or economics. But there are a lot of other journals you know on the list.

Although I made the assumption that the Sex Roles editor was trying to increase the impact factor — the citation rate — for her journal, one could defend this practice as being motivated by other interests (I’ll leave that to you). It also seems likely that some requests are open to interpretation — for example, mixing in citations from different journals, or offering specific reasons for including particular citations.

Tell me about it

To look into this a little more, I’m asking you to send me requests for journal self-citation that you have received. I’ll keep them confidential, but if I get enough to make it interesting, I will post: (1) journal name, (2) the type of request, (3) the date (month and year), and (4) the stage in the publication process. Feel free to include extenuating details or other information you would like to share, and let me know if you want it disclosed. I assume most of you are sociologists, but I’ll include items from any discipline.

To be included on the list, I’ll need to see copies of the letter or email you received. I will not disclose your identity or information about you, or the specific article under review. I won’t use quotes that might identify the author or article under review.

I will also send the list to the current editors of journals named and give them an opportunity to respond.

My contact information is here.

Maybe there’s not enough here to go on, but if there is, I think shining a light on it would be a good thing, and might deter the practice in the future.

3 Comments

Filed under Me @ work