Papers of interest to SIG-Metrics Readers
Eugene Garfield
eugene.garfield at THOMSONREUTERS.COM
Mon Feb 13 13:35:24 EST 2012
--------------------------------------------------------------------------
TITLE: Trends of e-learning research from 2000 to 2008: Use of
text mining and bibliometrics (Article, English)
AUTHOR: Hung, JL
SOURCE: BRITISH JOURNAL OF EDUCATIONAL TECHNOLOGY 43 (1). JAN
2012. p.5-16 WILEY-BLACKWELL, MALDEN
SEARCH TERM(S): BIBLIOMETR* item_title
ABSTRACT: This study investigated the longitudinal trends of e-learning
research using text mining techniques. Six hundred and eighty-nine (689)
refereed journal articles and proceedings were retrieved from the Science
Citation Index/Social Science Citation Index database for the period from
2000 to 2008. All e-learning publications were grouped into two domains
with four groups/15 clusters based on abstract analysis. Three additional
variables (subject areas, prolific countries, and prolific journals) were
applied to data analysis and interpretation. The conclusions include that
e-learning research is at the early-majority stage and that its foci have
shifted from the effectiveness of e-learning to teaching and learning
practices. Educational studies and projects, and e-learning applications
in medical education and training, are growing fields with the highest
potential for future research. Approaches to e-learning differ between
leading countries and early-adopter countries, and government policies
play an important role in shaping the results.
AUTHOR ADDRESS: JL Hung, Boise State Univ, Dept Educ Technol, 1910 Univ Dr,
Boise, ID 83725 USA
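
The study's own pipeline is not reproduced here, but as a rough
illustration of grouping publication abstracts by text similarity, the
Python sketch below clusters a few invented abstracts with TF-IDF and
k-means (scikit-learn is assumed to be available; the toy texts and the
choice of algorithm are assumptions, not the authors' method):

# Illustrative sketch only: clustering publication abstracts by text
# similarity, loosely in the spirit of the study's abstract analysis.
# The toy abstracts and the TF-IDF + k-means choice are assumptions.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

abstracts = [
    "Learners' satisfaction with an online course management system",
    "Effectiveness of web-based instruction in medical training",
    "A framework for evaluating e-learning platforms in higher education",
    "Simulation-based training for clinical skills in nursing education",
]

# Represent each abstract as a TF-IDF vector over its terms.
vectors = TfidfVectorizer(stop_words="english").fit_transform(abstracts)

# Group the abstracts into clusters (15 in the study; 2 here because
# the toy corpus is tiny).
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for text, label in zip(abstracts, labels):
    print(label, text[:60])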
--------------------------------------------------------------------------
TITLE: GREAT EXPECTATRICS: GREAT PAPERS, GREAT JOURNALS, GREAT
ECONOMETRICS (Article, English)
AUTHOR: Chang, CL; McAleer, M; Oxley, L
SOURCE: ECONOMETRIC REVIEWS 30 (6). 2011. p.583-619 TAYLOR &
FRANCIS INC, PHILADELPHIA
SEARCH TERM(S): HIRSCH JE P NATL ACAD SCI USA 102:16569 2005;
CITED ARTICLE item_title,keyword,keyword_plus;
JOURNALS item_title
KEYWORDS: Article influence; Cited article influence; C3PO;
Eigenfactor; IFI; Immediacy; Impact factors; h-Index; PI-
BETA; STAR; Research assessment measures; Zinfluence
KEYWORDS+: EIGENFACTOR(TM) METRICS; IMPACT FACTOR
ABSTRACT: The article discusses alternative Research Assessment Measures
(RAM), with an emphasis on the Thomson Reuters ISI Web of Science
database (hereafter ISI). Some analysis and comparisons are also made
with data from the SciVerse Scopus database. The various RAM that are
calculated annually or updated daily are defined and analyzed, including
the classic 2-year impact factor (2YIF), 2YIF without journal
self-citations (2YIF*), 5-year impact factor (5YIF), Immediacy (or
zero-year impact factor, 0YIF), Impact Factor Inflation (IFI),
Self-citation Threshold Approval Rating (STAR), Eigenfactor score,
Article Influence, C3PO (Citation Performance Per Paper Online), h-index,
Zinfluence, and PI-BETA (Papers Ignored By Even The Authors). The RAM are
analyzed for 10 leading econometrics journals and 4 leading statistics
journals. The application to econometrics can be used as a template for
other areas in economics, for other scientific disciplines, and as a
benchmark for newer journals in a range of disciplines. In addition to
evaluating high-quality research in leading econometrics journals, the
paper compares econometrics and statistics using the alternative RAM and
highlights their similarities and differences. It finds that several RAM
capture similar performance characteristics for the leading econometrics
and statistics journals, while the new PI-BETA criterion is not highly
correlated with any of the other RAM and hence conveys additional
information. The paper also highlights major research areas in leading
econometrics journals, discusses some likely future uses of RAM, and
shows that the harmonic mean of 13 RAM provides more robust journal
rankings than relying solely on 2YIF.
AUTHOR ADDRESS: M McAleer, Erasmus Univ, Erasmus Sch Econ, Inst Econometr,
NL-3602 PA Rotterdam, Netherlands
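
Two of the measures defined above, the 2-year impact factor and the
h-index, have simple definitions; the sketch below computes them from
invented citation counts (2YIF* follows the same pattern with journal
self-citations removed from the numerator):

# Minimal sketch of two of the research assessment measures discussed
# above, computed from invented numbers: the 2-year impact factor (2YIF)
# and the h-index.

def two_year_impact_factor(citations_to_prev_two_years, items_prev_two_years):
    """Citations received this year to items published in the previous
    two years, divided by the number of citable items in those years."""
    return citations_to_prev_two_years / items_prev_two_years

def h_index(citation_counts):
    """Largest h such that h papers each have at least h citations."""
    h = 0
    for i, c in enumerate(sorted(citation_counts, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# Hypothetical journal: 300 citations in 2011 to its 2009-2010 papers,
# of which 60 are journal self-citations, and 150 citable items.
print(two_year_impact_factor(300, 150))        # 2YIF  = 2.0
print(two_year_impact_factor(300 - 60, 150))   # 2YIF* = 1.6
print(h_index([25, 18, 12, 9, 6, 6, 3, 1]))    # h-index = 6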
-------------------------------------------------------------------------
TITLE: Data Mining Technique for Expertise Search in a Special
Interest Group Knowledge Portal (Article, English)
AUTHOR: Ahmad, WMZW; Sulaiman, S; Yusof, UK
SOURCE: 2011 3RD CONFERENCE ON DATA MINING AND OPTIMIZATION
(DMO). 2011. p.20-25 IEEE, NEW YORK
SEARCH TERM(S): HIRSCH JE P NATL ACAD SCI USA 102:16569 2005
KEYWORDS: Data mining; information retrieval; knowledge discovery;
Web mining; expertise search
KEYWORDS+: INDEX; WEB
ABSTRACT: The Internet contributes to the development of electronic
community (e-community) portals. Such portals have become an
indispensable platform for members, especially Special Interest Groups
(SIGs), to share knowledge and expertise in their respective fields.
Finding expertise through an e-community portal helps interested people
and researchers identify other experts working in the same area.
However, searching for such expertise in a portal is a cumbersome task.
Expertise data mining could ease the search for experts, and performing
effective data mining helps analyze and measure expertise levels
accurately in a SIG portal. This paper proposes a method called
Expertise Data Mining (EDM) that comprises several techniques for
expertise search in a SIG portal and is expected to improve the
identification of experts among the members of a SIG e-community.
AUTHOR ADDRESS: WMZW Ahmad, Univ Sains Malaysia, Sch Comp Sci, George Town
11800, Malaysia
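
The abstract does not spell out the techniques inside EDM, so the sketch
below is purely illustrative: it scores portal members per topic from
invented contribution records, with assumed weights per contribution
type, just to show what an expertise-search ranking over a SIG portal
might look like:

# Purely illustrative: the paper's EDM method is not specified in the
# abstract, so this sketch just ranks portal members per topic from
# invented contribution records, weighting contribution types differently.
from collections import defaultdict

# (member, topic, contribution_type) records; all data is hypothetical.
contributions = [
    ("alice", "data mining", "paper"),
    ("alice", "data mining", "answer"),
    ("bob",   "data mining", "post"),
    ("bob",   "web mining",  "answer"),
    ("carol", "web mining",  "paper"),
]

weights = {"paper": 3.0, "answer": 2.0, "post": 1.0}  # assumed weights

scores = defaultdict(float)
for member, topic, kind in contributions:
    scores[(member, topic)] += weights[kind]

def top_experts(topic, k=3):
    """Rank members by accumulated score for the given topic."""
    ranked = [(m, s) for (m, t), s in scores.items() if t == topic]
    return sorted(ranked, key=lambda pair: pair[1], reverse=True)[:k]

print(top_experts("data mining"))  # [('alice', 5.0), ('bob', 1.0)]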
--------------------------------------------------------------------------
TITLE: An Efficient Algorithm for Ranking Research Papers Based
on Citation Network (Article, English)
AUTHOR: Singh, AP; Shubhankar, K; Pudi, V
SOURCE: 2011 3RD CONFERENCE ON DATA MINING AND OPTIMIZATION
(DMO). 2011. p.88-95 IEEE, NEW YORK
SEARCH TERM(S): GARFIELD E rauth; CITATION item_title;
CITATION* item_title
KEYWORDS: Ranking; Citation Network; Authoritative Score
ABSTRACT: In this paper we propose an efficient method to rank research
papers from various fields of research published in various conferences
over the years. The ranking method is based on the citation network: the
importance of a research paper is captured well by peer votes, which in
this case are the citations it receives from other research papers.
Using a modified version of the PageRank algorithm, we rank the research
papers, assigning each an authoritative score. Using these paper scores,
we formulate scores for conferences and authors and rank them as well.
We introduce a new metric into the algorithm that takes the time factor
into account, reducing the bias against recent papers, which have had
less time to be read and cited than older papers. Often a researcher is
more interested in finding the top conferences in a particular year than
in the overall conference ranking, so, using the year of publication of
the papers, we also calculate a year-wise score for each conference
through a slight modification of the algorithm.
AUTHOR ADDRESS: AP Singh, IIIT Hyderabad, Ctr Data Engn, Hyderabad, Andhra
Pradesh, India
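
The paper's exact time-aware modification is defined in the article
itself; as a generic illustration of the idea, the sketch below runs
PageRank on a tiny citation graph with a recency-weighted teleportation
vector so that newer papers are not penalized as heavily. The specific
weighting and the toy data are assumptions, not the authors' metric:

# Sketch: PageRank over a citation graph where a paper passes its score
# to the papers it cites, with a recency-weighted teleport distribution.

def time_aware_pagerank(cites, years, current_year=2011,
                        d=0.85, decay=0.1, iters=100):
    """cites[p] lists the papers that p cites; years[p] is p's year."""
    papers = list(cites)
    # Newer papers get a larger share of the teleport probability.
    prior = {p: 1.0 / (1.0 + decay * (current_year - years[p])) for p in papers}
    total = sum(prior.values())
    prior = {p: w / total for p, w in prior.items()}

    score = dict(prior)
    for _ in range(iters):
        new = {p: (1.0 - d) * prior[p] for p in papers}
        for p in papers:
            refs = cites[p]
            if refs:                       # spread p's score over cited papers
                share = d * score[p] / len(refs)
                for q in refs:
                    new[q] += share
            else:                          # dangling node: follow the prior
                for q in papers:
                    new[q] += d * score[p] * prior[q]
        score = new
    return score

cites = {"A": ["C"], "B": ["A", "C"], "C": [], "D": ["A", "B", "C"]}
years = {"A": 2005, "B": 2008, "C": 2003, "D": 2010}
for paper, s in sorted(time_aware_pagerank(cites, years).items(),
                       key=lambda kv: kv[1], reverse=True):
    print(paper, round(s, 3))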
-------------------------------------------------------------------------
TITLE: Neotropical Ichthyology: trajectory and bibliometric
index (2003-2010) (Article, English)
AUTHOR: Stumpf, IRC; Vanz, SAD; Gastaud, N; Vargas, R;
Bentancourt, SMP
SOURCE: NEOTROPICAL ICHTHYOLOGY 9 (4). DEC 26 2011. p.921-926
SOC BRASILEIRA ICTIOLOGIA, SAO PAULO
SEARCH TERM(S): GARFIELD E rauth; BIBLIOMETR* item_title;
GARFIELD E JAMA-J AM MED ASSOC 295:90 2006
KEYWORDS: Bibliometrics; Citation analysis; Co-authorship; Impact
factor; Scientific journal
KEYWORDS+: SCIENTIFIC CO-AUTHORSHIP; COLLABORATION
ABSTRACT: The journal Neotropical Ichthyology was created in 2003 and
soon became one of the main publications in its field, as reflected in
the number of articles submitted every year and in its indexing by both
SciELO and ISI. To understand the reasons for this trajectory, the
journal's history was recovered and bibliometric indices on authorship,
citations, and impact factor were mapped for the period between 2003 and
2010. A descriptive study of the journal as an information source and a
bibliometric study of the 388 articles published by the journal and the
642 articles that cite it were carried out. Bibliometric analyses showed
that 75.8% of the articles had been written by Brazilian authors and
91.3% had been published in collaboration. The journal was cited by 171
different publications from 28 countries, including renowned journals in
the field. Self-citations accounted for 26.8% of the journal's citations.
The analyses show that strict evaluation control and editing of the
articles have contributed to its success and internationalization.
AUTHOR ADDRESS: IRC Stumpf, Univ Fed Rio Grande do Sul, Fac Bibliotecon &
Comunicacao, Rua Ramiro Barcelos 2705,Sala 214, BR-90035007
Porto Alegre, RS, Brazil
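
The indicators quoted above (collaboration rate, self-citation rate) are
simple ratios; the sketch below shows the arithmetic on invented sample
records, not on the study's actual data:

# Small sketch of the indicator arithmetic behind figures like those
# quoted above; the sample records are invented, the ratios are standard.

articles = [  # hypothetical published articles: number of authors each
    {"authors": 1}, {"authors": 3}, {"authors": 2}, {"authors": 4},
]
citing = [  # hypothetical citing records: journal that cites the title
    {"journal": "Neotropical Ichthyology"},
    {"journal": "Journal of Fish Biology"},
    {"journal": "Neotropical Ichthyology"},
    {"journal": "Copeia"},
]

collaboration_rate = sum(a["authors"] > 1 for a in articles) / len(articles)
self_citation_rate = sum(c["journal"] == "Neotropical Ichthyology"
                         for c in citing) / len(citing)

print(f"collaboration: {collaboration_rate:.1%}")   # 75.0%
print(f"self-citation: {self_citation_rate:.1%}")   # 50.0%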
--------------------------------------------------------------------------
TITLE: Nine criteria for a measure of scientific output
(Article, English)
AUTHOR: Kreiman, G; Maunsell, JHR
SOURCE: FRONTIERS IN COMPUTATIONAL NEUROSCIENCE 5. NOV 10 2011.
FRONTIERS RES FOUND, LAUSANNE
SEARCH TERM(S): GARFIELD E rauth;
HIRSCH JE P NATL ACAD SCI USA 102:16569 2005;
GARFIELD E JAMA-J AM MED ASSOC 295:90 2006
KEYWORDS: impact factors; peer review; productivity; scientific
output; citation; bibliometric analysis; quality versus
quantity; impact
KEYWORDS+: JOURNALS IMPACT FACTOR; PAPER
ABSTRACT: Scientific research produces new knowledge, technologies, and
clinical treatments that can lead to enormous returns. Often, the path
from basic research to new paradigms and direct impact on society takes
time. Precise quantification of scientific output in the short term is
not an easy task but is critical for evaluating scientists,
laboratories, departments, and institutions. While there have been
attempts to quantify scientific output, we argue that current methods
are not ideal and suffer from solvable difficulties. Here we propose
criteria that a metric should have to be considered a good index of
scientific output. Specifically, we argue that such an index should be
quantitative, based on robust data, rapidly updated and retrospective,
presented with confidence intervals, normalized by number of
contributors, career stage, and discipline, impractical to manipulate,
and focused on quality over quantity. Such an index should be validated
through empirical testing. The purpose of quantitatively evaluating
scientific output is not to replace careful, rigorous review by experts
but rather to complement those efforts. Because it has the potential to
greatly influence the efficiency of scientific research, we have a duty
to reflect upon and implement novel and rigorous ways of evaluating
scientific output. The criteria proposed here provide initial steps
toward the systematic development and validation of a metric to evaluate
scientific output.
AUTHOR ADDRESS: G Kreiman, Harvard Univ, Childrens Hosp, Sch Med, Dept
Ophthalmol, 1 Blackfan Circle,Karp 11217, Boston, MA 02115
USA
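
As an illustration of two of the proposed criteria, normalization by the
number of contributors and reporting confidence intervals, the sketch
below computes an author's mean fractional citations per paper with a
bootstrap 95% interval; the data and the specific normalization are
invented for the example:

# Illustrative sketch: credit each paper's citations equally among its
# contributors, then report the author-level mean with a bootstrap CI.
import random
from statistics import mean

random.seed(0)

# Hypothetical papers by one author: (citations, number of authors).
papers = [(40, 4), (12, 2), (7, 1), (30, 5), (3, 3), (18, 2)]

per_author = [c / n for c, n in papers]
point_estimate = mean(per_author)

# Bootstrap 95% confidence interval for the mean fractional citations.
boot = sorted(
    mean(random.choices(per_author, k=len(per_author)))
    for _ in range(2000)
)
ci_low, ci_high = boot[int(0.025 * len(boot))], boot[int(0.975 * len(boot))]

print(f"fractional citations per paper: {point_estimate:.1f} "
      f"(95% CI {ci_low:.1f} to {ci_high:.1f})")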