Publications

Top 10 Tips for Getting Published in Human Factors

By Nancy J. Cooke, Editor, Human Factors

As I come to the end of my fifth and final year as editor of Human Factors, I would like to offer some guidance about getting articles published in our journal. (Many of these principles can also be applied when submitting work to other scholarly publications.)

In what follows, I offer suggestions for circumventing the 10 most common roadblocks to publication in Human Factors.

1. Make explicit and compelling the contribution of your work to the theory and/or practice of human factors/ergonomics.

Space in quality journals is very limited, and foremost among the criteria for selection is incremental contribution. If your work doesn't generalize or advance the body of knowledge in the field by more than a trivial amount, it isn't likely to be accepted, no matter how well conceived or executed it is, or how well you've presented it. Lack of contribution to the field is the most common basis on which submissions to Human Factors (and most other journals) are rejected. Ask yourself and others whom you trust - colleagues, faculty, family - to assess how important your message is, and proceed only if you are convinced that it is substantial.

Lack of substance, however, is but one facet of this roadblock. Surprisingly often the substance is there, but it isn't clearly articulated in the presentation. Probably a third of all submissions to Human Factors suffer from failure to make explicit the nature of the incremental contribution.

Make doubly sure your contribution to the literature is clearly articulated in the abstract (a Human Factors requirement), the introduction, and the conclusion. In this case, redundancy is a good thing.

2. Connect your contribution to the HF/E literature and to the readers.

Once you've determined that your contribution is sufficient to merit publication, decide whether it fits well within the topical domain of the field. Two guides for making this decision are available in each issue of the journal: a listing of core topic areas and the Information for Authors. If, after considering these instructions and the topic list, you are still unsure about the relevance of your work for this outlet, scan through past issues of the journal or consult with professionals in the field.

Deciding what does and doesn't fit isn't always an easy call because the field itself covers a far wider topic domain than do most disciplinary journals, and that domain is expanding. One popular misconception - that Human Factors publishes only laboratory-based empirical studies - is patently false. During my tenure, Human Factors has published field studies, research employing cognitive task analysis, integrative reviews, theoretical works, and methodological pieces. Further, in the service of HFES's strategic outreach objectives, we have attempted to push the bounds of traditional HF/E areas and open up the journal to topics that may seem "on the fringe." Naturally, there are limits; some topics are clearly outside the field's extensive domain, but again, there is no clear boundary.

3. Present your contribution in the best possible form at micro and macro levels.

This is probably the most obvious advice for any author, but substandard exposition is far more common than it should be. At the extreme, many manuscripts that might otherwise be given serious consideration are eliminated on this basis alone. It is the author's responsibility to communicate the message clearly and effectively, not the reviewer's, editor's, or reader's to figure it out.

Poor composition can undermine communication in many ways at both the macro and micro levels, far too many to list in this article. But some occur frequently enough to merit particular attention here.

Communicate your contribution clearly and explicitly, organizing the presentation logically, as you would in telling a story. Put yourself in the reader's position, asking at each point what you need to know in order for the next paragraph or section to make sense. Unlike some stories, however, journal articles are not supposed to be suspenseful. The reader should be able to predict from the introduction what lies ahead in the methods section. Start by explaining the motivation for the work. As Associate Editor Deborah Boehm-Davis suggests (personal communication),

Research is typically done for one of three reasons: (1) nothing has been done in this area, (2) work has been done in this area and there is a hole that needs to be filled, or (3) research/theory is conflicting, and you are trying to resolve the conflict. If (1), you expect a general exploratory study leading to identification of important variables. If (2), you expect that the authors will manipulate variables that explore the gap or hole in previous work. If (3), you expect a design that pits the conflicting variables against one another. This also means that the discussion follows with (1) description of what was identified and where to go from here, (2) what knowledge about the "gap" adds to what was previously known, or (3) which theory/approach seems more promising. This "flow" doesn't happen often enough.

Avoid the redundancy and bloat of poorly organized manuscripts. When the initial review of disorganized papers requires that they be shortened, authors are obliged to pay closer attention to organization, and the result is substantial improvement in clarity and overall quality. Why not invest that attention at the outset so that your product enters the review process in the best possible shape? As the saying goes, "You have only one chance to make a good first impression."

Pay attention to grammar, spelling, term definition, and sentence clarity. Again, this contributes to that important first impression.

Be sure to provide sufficient methodological detail. Reviewers often cite this as a problem. It's tricky: You should provide enough detail to enable someone to replicate the study, but at the same time, you shouldn't overdo it in anticipation of every possible question or criticism someone might raise, and you should avoid redundancy. It's important to define your variables operationally and relate them to whatever motivated the work.

Keep your audience – the readers – in mind. HF/E is a wide domain, so Human Factors readers constitute a very diverse population. Don't assume that they have all read the same literature, taken the same classes, conducted the same kind of research, or earned their degrees in the same field as you have. For maximum impact, make your work accessible to as many readers as possible.

There is no better way to perfect your writing skills than to go through your own Write-Review-Revise (WRR) process using peers, colleagues, or students as reviewers. In my opinion, several rounds of WRR before submission can reduce the number of more lengthy WRR rounds after submission.

4. Adhere to length restrictions associated with your submission type.

Authors seem to find it difficult to adhere to length restrictions, but required reduction in length almost always improves the clarity and quality of the end product. Guidance in this regard is readily available for the Human Factors article categories: Research Reports, Regular Articles, and Review Articles, and in some cases, Special Section articles (see the Information for Authors). First determine which of those article categories best applies to your work, then follow the respective length guidelines.

5. Cite the relevant literature.

Similar to my earlier points about methodological detail, what - and how much - literature to cite is tricky. Criticisms of both excessive and deficient referencing are common in the review process. Although you can never be certain how much literature to cite, you should give the matter careful thought, making sure you include and accurately report all the most relevant work (particularly the most recent) and omit what you regard as tangential. The reviewers may disagree with your decision, but this is not likely to result in failure unless your omissions have seriously compromised the rationale for the work, your methodology, or your conclusions. Provided your work constitutes a clear incremental contribution, this feedback will be constructive and helpful in strengthening your ultimate product.

Authors tend to assume a defensive posture in citing the literature. If you decide an article or chapter is worth citing, be sure to make its relevance to your "story" explicit and represent its substance accurately. In an effort to be all-inclusive, some authors give short shrift to what the cited author actually says, which reviewers might perceive as padding the reference list. That raises doubts about the author's grasp of the literature and the credibility of other facets of his or her report.

One final point about citation relates to my earlier comment about expanding topic coverage. If your work is somewhat beyond the mainstream of topics published in Human Factors, the literature review format provides an excellent opportunity for you to build your case for relevance. Similarly, if you are inexperienced in the HF/E domain, a good place to start is by perusing the reference lists of review articles and regular articles published in areas related to your work.

6. Be sure that your methods address the question you are trying to answer.

Reviewers can always raise methodological issues of one sort or another, because no experimental design or methodology is perfect. But you can avoid some of the more serious criticisms simply by ensuring that your methodology addresses the question that motivated the work and that your description makes this connection clear. There will always be some limitations and caveats, but they should be clearly spelled out.

As noted earlier, when reporting your research, describe your variables and measures in enough detail to enable replication. A helpful technique for checking all this is simply to run your methods section by a few colleagues or professionals whose candor you trust before submitting the manuscript.

7. Report statistical results completely and factually.

Reviewers generally provide detailed comments on methods and results and often suggest alternative ways to analyze data. In contrast to some methodological issues, these can typically be addressed easily in a revision. Some common pitfalls in this area include failure to report the results completely (e.g., leaving out degrees of freedom) or "spinning" the results to better support a hypothesis.

One common example is reporting a result as marginally significant when it fails to meet the significance criterion established in advance. That criterion (the alpha level) is usually set by convention but can be chosen by the researcher (at .10, for instance). Either way, it is a binary decision rule: your result is either significant or it isn't, and in this context there are no marginally significant results. This does not, of course, preclude reporting exact p values, and traditional hypothesis testing is not the only means of designing, analyzing, or interpreting research.
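As an illustration only (the article itself includes no code, and the data and names below are hypothetical), here is a minimal Python sketch of complete reporting for a simple t test: the test statistic, degrees of freedom, and the exact p value, with the significance decision left to the criterion chosen in advance.

    from scipy import stats
    import numpy as np

    # Hypothetical data: response times under two conditions (invented for illustration).
    rng = np.random.default_rng(1)
    group_a = rng.normal(loc=5.0, scale=1.0, size=30)
    group_b = rng.normal(loc=5.4, scale=1.0, size=30)

    # Independent-samples t test; report t, df, and the exact p value
    # rather than describing a nonsignificant result as "marginal."
    result = stats.ttest_ind(group_a, group_b)
    df = len(group_a) + len(group_b) - 2
    alpha = .05  # criterion set before the analysis
    decision = "significant" if result.pvalue < alpha else "not significant"
    print(f"t({df}) = {result.statistic:.2f}, p = {result.pvalue:.3f}, {decision} at alpha = {alpha}")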

8. Apply HF/E principles to your graphical presentations.

A terrific article in Human Factors by Gillan, Wickens, Hollands, and Carswell (1998) provides guidelines for data presentation that directly address the ironic situation that authors often fail to consider the user in their graphical presentations. Too often we see graphs that are unreadable, axes that are unlabeled, and representations that are inappropriate for the data type (e.g., line graphs for categorical data). These types of mistakes are generally easily fixed but embarrassing for those who profess to be HF/E professionals.
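As a brief, hypothetical illustration (the condition names and numbers are invented, not drawn from Gillan et al.), the following Python/matplotlib sketch shows a bar chart for categorical data, with both axes labeled, in place of an inappropriate line graph.

    import matplotlib.pyplot as plt

    # Hypothetical categorical data: mean errors per session by interface condition.
    conditions = ["Baseline", "Redesign A", "Redesign B"]
    mean_errors = [12.4, 8.1, 9.3]

    fig, ax = plt.subplots(figsize=(4, 3))
    ax.bar(conditions, mean_errors)           # bars, not a line, for discrete categories
    ax.set_xlabel("Interface condition")      # label the category axis
    ax.set_ylabel("Mean errors per session")  # label the measure being plotted
    fig.tight_layout()
    fig.savefig("errors_by_condition.png", dpi=300)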

9. Do not overstate conclusions.

It is one thing to be explicit about the contribution of your work and quite another to overreach your data and proclaim that you have the final solution for global warming or world hunger. Results need to be presented and discussed honestly. This means steering clear of discussions of marginal results or the "trends that could have been" in an effort to confirm predictions. Rather, view your findings objectively, asking whether they advance our understanding of the topic at hand and, if so, how. Significant confirmation of well-conceived hypotheses obviously makes a contribution, but so too can unexpected failure - again, provided the work was well conceived and executed. Often such unexpected findings suggest new avenues to explore, and these should be exploited rather than marginalized. Every study has its limitations, and these should also be acknowledged in the discussion, along with suggestions for overcoming them in future work.

10. Be responsive to reviews.

In my five years as editor, I have not seen a submission published without at least some revision. Most require more than one round of revision. Invariably, the end result is substantially stronger. The review process is not an adversarial one intended to criticize and minimize your contribution but, rather, is a constructive process aimed at helping those with an important message to present that message in its best possible form. Both you and the journal are well served when the inevitable requests for revision are viewed in that light.

This does not mean that you must blindly comply with each and every comment or suggestion of all the reviewers (some of which might conflict). Being responsive means recognizing and addressing all the reviewers' concerns, whether through revision or explanation for not revising. Address each reviewer comment in a cover letter accompanying your revision, identifying explicitly (with reference to page and line numbers) where the changes appear. But do not make the mistake of addressing comments in a cover letter, only to ignore them in the revised document. The reviewers are representative of your readers, and if they have questions, it is likely that readers will as well.

Steps to Success

In my opinion, if authors were to adhere to these 10 practices, they could greatly increase the number of submissions that make it into print. It would also free reviewers to focus on other, more subtle but perhaps deeper theoretical or methodological issues that might elevate the contribution and impact ultimately made by the submission, thereby advancing the theory and practice of human factors/ergonomics.

I thank the associate editors who have contributed to this Top 10 list (Deborah Boehm-Davis, Patricia DeLucia, and William Horrey), as well as William Howell, who provided very helpful input on my first draft. Last but not least, a special thank-you to all the reviewers, associate editors, and HFES staff with whom I have worked over the last five years. Along with the authors, each of you has significantly contributed to the high quality of published articles that we find today in Human Factors.

Reference

Gillan, D. J., Wickens, C. D., Hollands, J. G., & Carswell, C. M. (1998). Guidelines for presenting quantitative data in HFES publications. Human Factors, 40, 28–41.

This article was originally published in the October 2009 HFES Bulletin (Volume 52, Number 10).