How Good Are The Best Papers of JASIS?


Terrence A. Brooks

School of Library and Information Science

University of Washington

tabrooks@u.washington.edu


Abstract

A citation analysis examined the twenty-eight best papers published in JASIS (Journal of the American Society for Information Science) from 1969 to 1996.  Best papers tend to be single-authored works twice as long as the average paper published in JASIS.  They are cited and self-cited much more often than the average paper.  The greatest source of references to the best papers is JASIS itself.  The top five best papers focus largely on information retrieval and online searching.


            In the year celebrating fifty years of publication of JASIS (Journal of the American Society for Information Science), one may wonder how good the papers selected each year as the “best” paper published in JASIS really are.  The ASIS 1999 Handbook and Directory states that "The Best JASIS Paper Award recognizes the outstanding paper published in the Journal of the American Society for Information Science (JASIS), the fully refereed official scholarly publication of ASIS” (p. 25).  This short report suggests that the best papers are good indeed.

The following study used citation analysis to assess the quality of the twenty-eight best papers published in JASIS during the years 1969 to 1996.  The "average" paper published in JASIS was established by a paired contrast group of twenty-eight other papers.  The paired contrast group was created by selecting papers that appear immediately following a best paper.  For twenty best papers this choice was possible, while eight of the contrast group appeared immediately preceding a best paper.  Such a paired contrast sample controls for possible confounding factors such as years since publication, characteristics of a particular issue, and other external factors that may have mediated the receipt of citations, such as editorial policy and methodological changes in the Social Science Citation Index.
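The paired-contrast selection described above can be sketched in a few lines. This is an illustrative reconstruction, not the study's actual code; the paper titles and the `pick_comparison` helper are invented for the example.

```python
# Illustrative sketch of the paired-contrast selection: the comparison
# paper is the one immediately following the best paper in its issue,
# or, when the best paper is last in the issue, the one preceding it.

def pick_comparison(issue_papers, best_index):
    """Return the paired comparison paper for the best paper at best_index."""
    if best_index + 1 < len(issue_papers):
        return issue_papers[best_index + 1]  # paper immediately following
    return issue_papers[best_index - 1]      # fall back to the preceding paper

print(pick_comparison(["paper A", "best paper", "paper B"], 1))  # -> paper B
print(pick_comparison(["paper A", "best paper"], 1))             # -> paper A
```

Because each comparison paper comes from the same issue as its best paper, both members of a pair share publication year and issue-level characteristics, which is what controls the confounds listed above.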

            Table 1 presents the number of authors of the best papers and the comparison sample.  Best papers tend to be single-author statements: twenty-three of the twenty-eight have a single author, and only two have as many as four authors.  Best papers are also significantly longer than the average JASIS paper.  The mean length of the best papers is 16.21 pages, compared with a comparison-sample mean of 8.93 pages (F(1,54) = 12.09, p < .001).  This suggests that JASIS best papers tend to be lengthy, individual statements.

           

Citation Analysis

            The best papers of JASIS are cited at a significantly higher rate than the average paper.  Best papers received a mean of 30.46 citations, compared with a mean of 8.25 citations for a comparison paper (F(1,54) = 11.61, p < .001).  If we equate academic worth with numbers of citations, then this is clear evidence that the JASIS best papers are significantly better than the average paper published in JASIS.  The greater citedness of the best papers extends over time as well.  In chronological partitions of 0 to 3 years, 4 to 6 years, 7 to 9 years, and 10 or more years after publication, the best papers received significantly more citations in each partition.  JASIS itself provides the greatest number of references to the best papers, followed by IPM (Information Processing and Management).  Finally, best papers tended to have more self-citations than the comparison sample (a mean of 2.86 versus 0.75; F(1,54) = 8.09, p < .006).  The heavy self-citedness reinforces the notion that best papers tend to be single-authored statements in which the author establishes a new paradigm or argues a certain point of view.
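The chronological partitioning above is a simple bucketing of each citation by years elapsed since publication. The sketch below illustrates the scheme with invented citation years; it is not the study's data.

```python
# Bucketing citations into the chronological partitions used in the text
# (0-3, 4-6, 7-9, and 10+ years after publication). Citation years here
# are invented for illustration.
from collections import Counter

def partition_citations(pub_year, citation_years):
    """Count citations falling into each years-since-publication partition."""
    buckets = Counter()
    for year in citation_years:
        age = year - pub_year
        if age <= 3:
            buckets["0-3"] += 1
        elif age <= 6:
            buckets["4-6"] += 1
        elif age <= 9:
            buckets["7-9"] += 1
        else:
            buckets["10+"] += 1
    return dict(buckets)

print(partition_citations(1988, [1989, 1990, 1993, 1996, 1999]))
# -> {'0-3': 2, '4-6': 1, '7-9': 1, '10+': 1}
```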

 

The Crème de la Crème

            Table 2 presents the five best papers ranked by absolute number of citations received.  The clear leader is the paper by Saracevic, Trivison, Chamis, and Kantor, which received about 25% more citations than the second-ranked paper by Bates.  The papers by Marcus and Fidel tied for fourth place.

White and McCain (1998) visualized the discipline of library and information science with a factor analysis of 120 authors.  Their “Experimental Retrieval” group included Saracevic, Kantor, and W. S. Cooper; Marcus, Bates, and Fidel fell into their “Online Retrieval” group.  The top-ranked best papers of JASIS have focused on the design, evaluation, and use of document retrieval systems.  Clearly, online searching has been a central concern of JASIS.

The preceding analysis of the absolute number of citations received is obviously correlated with the number of years since publication; that is, a paper published ten years ago has had much more time to collect references than a paper published last year.  A second analysis controlled for years since publication by ranking the best papers by citations received per year.  Table 3 presents the five best papers ranked by mean number of citations received per year.  The paper by Saracevic, Trivison, Chamis, and Kantor remains in first place.  The pre-eminence of this paper in Tables 2 and 3 implies that it is not only the most heavily cited, but also the paper accumulating citations at the highest rate.  It can be argued, therefore, that it is the single best paper of the last quarter century.  Harter’s paper jumps into second rank as a very recent and heavily cited paper.  White and McCain place Harter in the “Experimental Retrieval” group, while Marchionini was not in their sample of 120.  The topics of the papers in Table 3 remain focused on online retrieval and evaluation.
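The normalization behind Table 3 divides a paper's citation total by the years elapsed since its publication. The sketch below uses invented citation totals and a hypothetical 1999 census year, purely to show the computation.

```python
# Sketch of the citations-per-year ranking used for Table 3. Citation
# totals are invented placeholders, not the study's data.

def citations_per_year(total_citations, pub_year, census_year=1999):
    """Mean citations received per year elapsed since publication."""
    return total_citations / (census_year - pub_year)

papers = {
    "Saracevic et al. (1988)": (120, 1988),  # (hypothetical citations, year)
    "Harter (1996)": (30, 1996),
}
ranked = sorted(papers,
                key=lambda p: citations_per_year(*papers[p]),
                reverse=True)
print(ranked)  # Saracevic et al. first: 120/11 > 30/3
```

Under this normalization a recent paper like Harter's can rank highly despite a modest absolute total, which is why Table 3 reorders Table 2.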

 

Conclusion

            The best papers of JASIS appear to be long, single-authored statements about online searching and retrieval.  They are heavily cited and self-cited by the JASIS community.  The preceding citation analysis establishes that the best papers are clearly better than average, and indicates that the single best paper is “A Study of Information Seeking and Retrieving” by Tefko Saracevic, Paul Kantor, Alice Y. Chamis, and Donna Trivison, Journal of the American Society for Information Science, 39, 1988, pp. 161-216.

            JASIS, of course, publishes papers on many different subjects germane to information science that have not been chosen as best papers.  This suggests a cultural factor at work that is beyond the scope of a citation analysis.  Such a cultural factor is suggested by White and McCain:

The secondary loadings in the user theory speciality exemplify the factor-analytic technique’s sensitivity to nuance.  It will be seen that authors who write about literatures – the citationists, bibliometricians, and scientific communication people – never load above 0.30 on this factor, apparently because citers do not perceive their work as having the right psychological content.  On the other hand, quite a few retrievalists load above 0.30, and this suggests the nature of the cognition involved.  It has to do with problem-solving at the interface where literatures are winnowed down for users with: Question formulation, search strategies, information-seeking styles, relevance judgments, and the like….Saracevic and Belkin load in the 0.50s; Borgman, Fidel, and Bates in the 0.40s, Harter in the 0.30s… (White & McCain, 1998, pp. 336-337).

 


Acknowledgement

I thank Cynde Moya for compiling the citation analysis.

 


 

References

            American Society for Information Science.  (1999).  Handbook and Directory.  Silver Spring, MD: ASIS.

            White, H. D. & McCain, K. W.  (1998).  Visualizing a discipline: An author co-citation analysis of information science, 1972-1995.  Journal of the American Society for Information Science, 49, 327-355.

 


 

 

Table 1: Number of Authors

                        Authors
                One    Two    Three    Four
Best Paper       23      2        1       2
Comparison       12      9        7       0

 


 

 

Table 2: Top Ranked Best Papers by Number of Citations

Rank   Title
1      Saracevic, T., Trivison, D., Chamis, A., & Kantor, P. "A study of information seeking and retrieving, Parts I-III." (May 1988)
2      Bates, M. "Information search tactics" (July 1979)
3      Cooper, W. S. "On selecting a measure of retrieval effectiveness" (March-April 1973)
4      Marcus, R. S. "An experimental comparison of the effectiveness of computers and humans as search intermediaries." (November 1983)
4      Fidel, R. "Online searching styles: A case-based model of searching behavior" (July 1984)

 

 


 

Table 3: Top Ranked Best Papers by Citations per Year

Rank   Title
1      Saracevic, T., Trivison, D., Chamis, A., & Kantor, P. "A study of information seeking and retrieving, Parts I-III." (May 1988)
2      Harter, S. P. "Variations in relevance assessments and the measurement of retrieval effectiveness." (January 1996)
3      Bates, M. "Information search tactics" (July 1979)
4      Fidel, R. "Searchers' selection of search keys, I-III." (August 1991)
5      Marchionini, G. "Information-seeking strategies of novices using a full-text encyclopedia." (January 1989)