Monday, 22 August 2016

Digital Object Identifier System and DOI Names (DOIs) Guide - ANDS


Digital Object Identifier System and DOI Names (DOIs) Guide


Who should read this?

This guide is intended for researchers and eResearch infrastructure support
providers. It explains the Digital Object Identifier system and the
advantages of using a DOI Name to cite and link to research data. This
guide should be read in conjunction with the ANDS Guides on Persistent identifiers and Data citation.

What is the DOI System?

The Digital Object Identifier system is used for identifying intellectual
property in the digital environment. It is used principally by
publishers, and is an implementation of the Handle System for persistent
identifiers. The International DOI Federation (IDF) appoints
Registration Agencies who allocate DOI prefixes, register DOI Names, and
provide the necessary infrastructure to allow registrants to declare
and maintain metadata.
Major applications of the DOI system currently include:
  • persistent
    citations in scholarly materials (journal articles, books, etc.)
    through CrossRef, a consortium of around 3,000 publishers;
  • scientific
    data sets, through DataCite, a consortium of leading research
    libraries, technical information providers, and scientific data centres;
  • European Union official publications, through the EU publications office.
To promote the citation and reuse of Australian research data, ANDS
provides a DOI Service for research datasets as a free service to
Australian institutions.

DOIs (Digital Object Identifiers)

A DOI Name (DOI) is a specific type of Handle and can be assigned to any
object that is a form of intellectual property. 'DOI' should be
interpreted as 'digital identifier of an object' rather than 'identifier
of a digital object'.
A DOI consists of a unique, case-insensitive, alphanumeric character
sequence that is divided into two parts, a prefix and a suffix,
separated by a forward slash. The prefix is assigned by a DOI
Registration Agency and always starts with '10.' This distinguishes it
as a DOI as opposed to other types of Handle. The suffix is assigned by
the publication agent, the agency supplying the information about the
object, and must be unique within a prefix.
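As a rough illustration of the prefix/suffix split described above, a DOI name can be pulled apart with a few lines of string handling. The pattern and helper names below are a simplification for illustration, not the full DOI syntax specification:

```python
import re

# Simplified check: a DOI name is '10.' + registrant code, a slash,
# then a suffix. This is NOT a complete validator for the DOI syntax.
DOI_PATTERN = re.compile(r"^10\.\d+/\S+$")

def split_doi(doi):
    """Split a DOI name into its prefix and suffix."""
    if not DOI_PATTERN.match(doi):
        raise ValueError("not a DOI name: %r" % doi)
    prefix, suffix = doi.split("/", 1)
    return prefix, suffix

# DOI names are case-insensitive, so normalise before comparing.
def same_doi(a, b):
    return a.lower() == b.lower()

prefix, suffix = split_doi("10.4225/13/50BBFD7E6727A")
# prefix is "10.4225" (directory code '10' + registrant code '4225'),
# suffix is "13/50BBFD7E6727A"
```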

Example of a DOI within a data citation:
Ivan (2012): Monthly drought data for Australia 1890-2008 using the
Hutchinson Drought Index. The Australian National University, Australian
Data Archive. DOI: 10.4225/13/50BBFD7E6727A
10.4225/13/50BBFD7E6727A is a complete DOI Name. The prefix 10.4225 consists of the directory
code '10' (always 10 for a DOI Name) and the registrant's code '4225'
which is allocated by the German National Library of Science and
Technology for scientific datasets in its role as a registration agency.
Citations for this DOI should be in the form
DOI: 10.4225/13/50BBFD7E6727A
but the hypertext link should be http://dx.doi.org/10.4225/13/50BBFD7E6727A
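The difference between the citation form and the hyperlink form amounts to prepending the DOI proxy. A minimal sketch, assuming the dx.doi.org resolver that was current when this guide was written (doi.org also resolves DOIs):

```python
# Turn the citation form of a DOI into a resolvable hyperlink by
# prepending the Handle System's DOI proxy. The default proxy host is
# an assumption (dx.doi.org was standard at the time of writing).
def doi_to_url(doi, proxy="http://dx.doi.org"):
    return "%s/%s" % (proxy.rstrip("/"), doi)

doi_to_url("10.4225/13/50BBFD7E6727A")
# -> "http://dx.doi.org/10.4225/13/50BBFD7E6727A"
```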

What is the difference between a DOI and other Persistent Identifiers?

A DOI is a Persistent Identifier (PID), but also provides extra benefits.
A DOI can be used to uniquely identify either digital or non-digital
objects, whether or not they have any internet presence.
A DOI persistently identifies the object itself, through listing it in a
DOI Registry, while a PID persistently identifies only an object's
location. DOIs are supported by the International DOI Federation (IDF)
and Registration Agencies infrastructure, which provides ongoing DOI
services and allows for a high level of confidence in the quality and
accuracy of DOIs.
An object may have multiple DOIs and multiple PIDs assigned to it as it
moves through the publishing process. If an object has an internet
location, it will have either a URL or other persistent identifier (such
as Handle, PURL or ARK) in addition to a DOI. Each DOI and PID will
confer a different benefit on the dataset.

What are the advantages of DOIs for datasets?

The assignment of DOIs through the international DOI infrastructure has
associated costs. Accordingly, DOIs are unlikely to be issued on an ad
hoc or unmanaged basis, but will be assigned by authorised agencies or
institutions to datasets that are well described and managed archivally
for long-term access.
The assignment of a DOI therefore indicates that a dataset will be well
managed and accessible for long-term use. It also brands published data
as a first-class research output in the publishing world, since datasets
will be assigned DOIs regularly as is done for existing scholarly
publications. Using DOIs in this way will establish easier access to research data on the
Internet, increase the acceptance of research data as a legitimately
citable contribution to the scientific record, and support data
archiving that will permit results to be verified and re-purposed for
future study.

What is ANDS doing?

ANDS is a member of the DataCite consortium, a group of leading research
libraries and technical information providers that aims to make it
easier for research datasets to be handled as independent, citable,
unique scientific objects. ANDS runs a DOI Local Handle Server, minting
and managing DOIs on behalf of DataCite.
ANDS has its own DOI prefix, and research institutions, consortia and
agencies are able to obtain DOIs for scholarly outputs such as:
  • datasets and collections
  • associated workflows
  • software
  • models
  • grey literature
The ANDS Cite My Data DOI minting service is available as a
machine-to-machine or manual service. It is free to use for publicly funded
Australian research organisations and government agencies.
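A machine-to-machine minting interface of this kind can be pictured as a simple HTTP service in the style of the DataCite Metadata Store (MDS) API. The sketch below only builds a request rather than sending one; the host name and field layout are assumptions for illustration, not the documented Cite My Data endpoint:

```python
# Sketch of an MDS-style DOI mint request (PUT /doi/{doi} with a
# "doi=...\nurl=..." body). The host below is a placeholder, NOT the
# real ANDS or DataCite endpoint.
def build_mint_request(doi, landing_url, host="https://mds.example.org"):
    """Build (but do not send) an MDS-style mint/update request."""
    return {
        "method": "PUT",
        "url": "%s/doi/%s" % (host, doi),
        "headers": {"Content-Type": "text/plain;charset=UTF-8"},
        # The body pairs the DOI name with the dataset's landing page.
        "body": "doi=%s\nurl=%s" % (doi, landing_url),
    }

req = build_mint_request("10.4225/13/50BBFD7E6727A",
                         "https://example.org/datasets/drought-index")
```

A real client would send this with authenticated HTTP and then register descriptive metadata in a second call.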

ANDS has also been working with Thomson Reuters and data providers to track
and record dataset use through DOIs, and making that information
available through the Data Citation Index.


How Wageningen University and Research Center managed to influence researchers publishing behaviour towards more quality, impact and visibility | EuroCRIS



Title: How Wageningen University and Research Center managed to influence researchers publishing behaviour towards more quality, impact and visibility
Authors: Fondermann, Philipp 
Van der Togt, Peter 
Keywords: research information management
current research information systems
research impact
research quality
Wageningen University and Research Center (WUR)
Issue Date: 9-Jun-2016
Publisher: euroCRIS
Source: "Communicating and Measuring Research Responsibly: Profiling, Metrics, Impact, Interoperability": Proceedings of the 13th International Conference on Current Research Information Systems, Procedia Computer Science (2016, In Press)
Series/Report no.: CRIS2016: 13th International Conference on Current Research Information Systems (St Andrews, June 9-11, 2016)
Conference: CRIS2016 – St Andrews 
Abstract: Wageningen University and Research Center (WUR) is one of the most
prestigious research institutions in the world in the life sciences and has
improved significantly in several rankings in recent years. One of the
`drivers` of this success story is a comprehensive quality management
exercise based on research information from an integrated CRIS system
that managed to influence researchers' publishing behaviour towards more
quality, impact and visibility.
Description: Delivered
at the CRIS2016 Conference in St Andrews; published in Procedia
Computer Science xx (Jul 2016).-- Contains conference paper (8 pages)
and presentation (18 slides).
Note from the authors.-- "The results
presented in this paper build on the work of Wouter Gerritsma, a former
employee of Wageningen UR, who passed away on the 21st of June 2016.
Wouter made a great contribution to the field of research performance in
Wageningen and beyond".
Appears in Collections: Conference

Files in This Item:
File | Description | Size | Format
CRIS 2016 slides Version 1_5.pptx | PPT presentation | 2.98 MB | Microsoft Powerpoint XML
Paper Fondermann+van der Togt2.pdf | post-print version | 212.41 kB | Adobe PDF


Sunday, 21 August 2016

The Impact of Print Media and Wikipedia on Citation Rates of Academic Articles ~ libfocus - Irish library blog


The Impact of Print Media and Wikipedia on Citation Rates of Academic Articles

Guest post by Daniel Price. Daniel lives
in Israel, has an MA in Library and Information Science from Bar Ilan
University, and works as a librarian at Shalem College in Jerusalem.

There is a clear desirability to publish a paper that has a strong
scholarly impact, both for personal satisfaction in knowing that one’s
research has been viewed and built upon, and for professional reasons,
since the number of citations a paper receives can correlate with promotion
and tenure, the ubiquitous “publish or perish” (Miller, Taylor and Bedeian,
2011), which has now become an international phenomenon (De Meis et al.,
2003; De Rond and Miller, 2005; Min, Abdullah and
Mohamed, 2013; Osuna, Cruz-Castro and Sanz-Menéndez, 2010; Qiu, 2010;
Rotich and Muskali, 2013), with increased salary and external funding
(Browman and Stergiou, 2008; Diamond, 1986; Gomez-Mejia and Balkin,
1992; Monastersky, 2005; Schoonbaert and Roelants, 1996), and even the
chance of winning professional prizes such as a Nobel Prize (Pendlebury).

Understandably, then, many studies have been carried out to discover the
characteristics of highly cited papers (Aksnes, 2003) and the factors
that influence citation counts. It is widely accepted that it is not
just the quality of the science that affects the citation rate, but also
bibliometric parameters of a paper, such as its length (Abt, 1998; Ball,
2008; Falagas et al., 2013; Hamrick, Fricker and Brown, 2010), number of
references (Corbyn, 2010; Kostoff, 2007; Vieira and Gomes, 2010;
Webster et al., 2009), number of authors (Aksnes, 2003; Borsuk et al.,
2009; Gazni and Didegah, 2011; Wuchty et al., 2007), length of titles
(Habibzadeh and Yadollahie, 2010; Jacques and Sebire, 2010), and colons in
titles (Jamali and Nikzad, 2011; van Wesel, Wyatt and ten Haaf, 2014;
Rostami, Mohammadpoorasl and Hajizadeh, 2014).

A variety of external considerations are also known to influence the
citation rate of academic papers. Intuitively, a paper that has been
publicised in the popular print media will be cited more, as its
publicity makes researchers more aware of it; however, it can be argued
that quality newspapers only cover valuable articles that would garner a
significant number of citations in any case. The first assumption
was shown to be true in a 1991 study that compared how many
more citations articles published in the New England Journal of Medicine
received if they were covered in the New York Times, contrasting a 12-week
period in 1978, when copies of the paper were printed but not distributed
due to a strike, with the following year, 1979. The results
showed that articles covered by the Times received 72.8% more citations
during the first year after their publication, but only those discussed
when the paper was actually distributed. Articles covered by the Times
during the strike period received no more citations than articles not
referenced by the Times, thus proving that exposure in the Times is a
cause of citation (“the publicity hypothesis”) and not a forecast of
future trends (the “earmark hypothesis”) (Phillips, 1991).

Phillips’ finding that articles covered in the New York Times receive
more citations was confirmed in another study conducted 11 years later,
which also found, however, that exposure in less “elite” daily newspapers
(but not in evening broadcasts of mainstream US television networks)
during a twelve-month period from mid-1997 to mid-1998 also correlated
with higher citation rates across a wider range of scientific papers, thus
showing that scientific communication is not carried out only through
elite channels. Importantly, though, the author notes that his study does
not prove the “publicity hypothesis”, as the articles that were
publicised could have been intrinsically more important and were only
cited for that reason, although it does cast doubt on the “earmark
hypothesis”, since many articles that were not mentioned were cited
(Kiernan, 2003).

In the present day much scholarly communication takes place on Web 2.0
tools, and in the emerging field of “altmetrics” (Konkiel, 2013; Priem,
2014; Thelwall, 2013) studies focus on parameters including whether a paper
has been cited and discussed on academic blogs (Shema, Bar-Ilan and
Thelwall, 2014), tweeted (Eysenbach, 2011), or uploaded to a social
media platform such as Mendeley (Li and Thelwall, 2012).

Research has also investigated whether articles are cited on the decidedly
non-elitist Wikipedia. A study conducted at the beginning of 2010 found
that 0.54% of approximately nineteen million Wikipedia pages cited a
PubMed journal article, which corresponds to about 0.08% of all PubMed
articles. The researchers showed that journal articles that were cited
in Wikipedia were cited more often and had higher F1000 scores than a random
subset of non-cited articles, a phenomenon they explained by their
hypothesis that Wikipedia users would only cite important
articles that present novel and ground-breaking research (Evans and
Krauthammer, 2011).
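As a quick sanity check on those percentages, the quoted figures imply on the order of a hundred thousand citing pages (both inputs are the rounded numbers from the study, not exact counts):

```python
# Back-of-the-envelope arithmetic for the Evans and Krauthammer figures.
wikipedia_pages = 19_000_000   # approx. pages examined in early 2010
citing_fraction = 0.0054       # 0.54% cite at least one PubMed article

citing_pages = wikipedia_pages * citing_fraction
# roughly 100,000 Wikipedia pages citing a PubMed article
```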

A larger study carried out two and a half years later came to the same
conclusion: academic papers in the field of computer science that
are cited on Wikipedia are more likely to be cited elsewhere, because the
Wikipedia entries are written by talented authors who are careful to
cite reputable authors and trending research topics (Shuai, Jiang, Liu
and Bollen, 2013).

These conclusions support the “earmark hypothesis” that Phillips
rejected and Kiernan doubted. Wikipedians are credited with identifying
high impact journal articles soon after they are published and
recommending them to other users.

To preserve a careful dialectic between the publicity and earmark
hypotheses, though, the possibility should be
entertained that the large number of Wikipedia users may include
researchers who, flooded with thousands of
articles, are motivated to read and quote certain articles because they
saw them cited on Wikipedia. Future research could investigate the
information behavior of a large number of researchers, specifically
their use of Wikipedia.


Abt, H. A. (1998). Why some papers have long citation lifetimes. Nature, 395, 756-757.

Aksnes, D. W. (2003). Characteristics of highly cited papers. Research Evaluation, 12(3), 159-170.

Ebrahim, N., Salehi, H., Embi, M. A., Habibi Tanha, F., Gholizadeh, H.,
Seyed Mohammad, M., & Ordi, A. (2013). Effective strategies for
increasing citation frequency. International Education Studies, 6(11),

Ball, P. (2008). A longer paper gathers more citations. Nature, 455(7211), 274-275.

Borsuk, R. M., Budden, A. E., Leimu, R., Aarssen, L. W., & Lortie,
C. J. (2009). The influence of author gender, national language and
number of authors on citation rate in ecology. Open Ecology Journal, 2,

Browman, H. I., & Stergiou, K. I. (2008). Factors and indices are
one thing, deciding who is scholarly, why they are scholarly, and the
relative value of their scholarship is something else entirely. Ethics
in Science and Environmental Politics, 8(1), 1-3.

Corbyn, Z. (2010). An easy way to boost a paper's citations. Nature. Available at

Evans, P., & Krauthammer, M. (2011). Exploring the use of social
media to measure journal article impact. In AMIA Annual Symposium
Proceedings (Vol. 2011, p. 374). American Medical Informatics

Eysenbach, G. (2011). Can tweets predict citations? Metrics of social
impact based on twitter and correlation with traditional metrics of
scientific impact. Journal of Medical Internet Research, 13(4).

Falagas, M. E., Zarkali, A., Karageorgopoulos, D. E., Bardakas, V.,
& Mavros, M. N. (2013). The impact of article length on the number
of future citations: a bibliometric analysis of general medicine
journals. PloS one, 8(2), e49476.

Gazni, A., & Didegah, F. (2011). Investigating different types of
research collaboration and citation impact: a case study of Harvard
University’s publications. Scientometrics, 87(2), 251-265.

Gomez-Mejia, L. R., & Balkin, D. B. (1992). Determinants of faculty
pay: an agency theory perspective. Academy of Management Journal, 35(5),

Habibzadeh, F., & Yadollahie, M. (2010). Are shorter article titles
more attractive for citations? Crosssectional study of 22 scientific
journals. Croatian medical journal, 51(2), 165-170.

Hamrick, T. A., Fricker, R. D., & Brown, G. G. (2010). Assessing
what distinguishes highly cited from less-cited papers published in
interfaces. Interfaces, 40(6), 454-464.

Jacques, T. S., & Sebire, N. J. (2010). The impact of article titles
on citation hits: an analysis of general and specialist medical
journals. JRSM short reports, 1(1).

Jamali, H. R., & Nikzad, M. (2011). Article title type and its
relation with the number of downloads and citations. Scientometrics,
88(2), 653-661.

Kiernan, V. (2003). Diffusion of news about research. Science Communication,25(1), 3-13.

Konkiel, S. (2013). Altmetrics: A 21st‐century solution to determining research quality. Online Searcher, 37(4), 10‐15.

Kostoff, R. N. (2007). The difference between highly and poorly cited
medical articles in the journal Lancet. Scientometrics, 72(3), 513-520.

Li, X., & Thelwall, M. (2012). F1000, Mendeley and traditional
bibliometric indicators. In Proceedings of the 17th International
Conference on Science and Technology Indicators (Vol. 2, pp. 451-551).

Monastersky, R. (2005). The number that’s devouring science. The chronicle of higher education, 52(8), A12.

Osuna, C., Cruz-Castro, L., & Sanz-Menéndez, L. (2011). Overturning
some assumptions about the effects of evaluation systems on publication
performance. Scientometrics, 86(3), 575-592.

Phillips, D. P., Kanter, E. J., Bednarczyk, B., & Tastad, P. L.
(1991). Importance of the lay press in the transmission of medical
knowledge to the scientific community. The New England Journal of
Medicine, 325(16), 1180-1183.

Price, D. (2014). A bibliographic study of articles published in twelve
humanities journals. Available at

Priem, J. (2014). Altmetrics. In B. Cronin and C. R. Sugimoto (Eds.)
Beyond bibliometrics: harnessing multidimensional indicators of
scholarly impact (pp. 263-287).

Rostami, F., Mohammadpoorasl, A., & Hajizadeh, M. (2014). The effect
of characteristics of title on citation rates of articles.
Scientometrics, 98(3), 2007-2010.

Schloegl, C., & Gorraiz, J. (2011). Global usage versus global
citation metrics: the case of pharmacology journals. Journal of the
American Society for Information Science and Technology, 62(1), 161-170.

Schoonbaert, D., & Roelants, G. (1996). Citation analysis for
measuring the value of scientific publications: quality assessment tool
or comedy of errors? Tropical Medicine & International Health, 1(6),

Shema, H., Bar‐Ilan, J., & Thelwall, M. (2014). Do blog citations
correlate with a higher number of future citations? Research blogs as a
potential source for alternative metrics. Journal of the Association for
Information Science and Technology.

Shuai, X., Jiang, Z., Liu, X., & Bollen, J. (2013). A comparative
study of academic and Wikipedia ranking. In Proceedings of the 13th
ACM/IEEE-CS joint conference on Digital libraries (pp. 25-28).

Thelwall, M., Haustein, S., Larivière, V., & Sugimoto, C. R. (2013).
Do Altmetrics Work? Twitter and Ten Other Social Web Services. PloS
one, 8(5), e64841.

van Wesel, M., Wyatt, S., & ten Haaf, J. (2014). What a difference a
colon makes: how superficial factors influence subsequent citation.
Scientometrics,98(3), 1601-1615.

Vieira, E.S., & Gomes, J.A.N.F. (2010). Citation to scientific
articles: Its distribution and dependence on the article features.
Journal of Informetrics, 4 (1), 1-13.

Webster, G. D., Jonason, P. K., & Schember, T. O. (2009). Hot topics
and popular papers in evolutionary psychology: analyses of title words
and citation counts in evolution and human behavior, 1979–2008.
Evolutionary Psychology, 7(3), 348-362.

Wuchty, S., Jones, B. F., & Uzzi, B. (2007). The increasing
dominance of teams in production of knowledge. Science, 316(5827),


Is it worth the effort contributing in Wikipedia? - Quora


Is it worth the effort contributing in Wikipedia?

  • They don't allow original research to be included
  • Including references and citations requires a lot of research. It is
    almost similar to publishing in a research journal.
Dave Waghorn
Dave Waghorn, Wikipedia editor for over 10 years and administrator for over 9.
It's absolutely worth the effort. Conducting the research into topics
you don't already know about can be fascinating; verifying and
challenging the things you do know can be both surprising and rewarding.
Shaping that information into well-worded prose is satisfying and then
seeing your contribution immediately published on one of the world's
biggest and most popular websites is very exciting. Inviting others to
then peer review your work and receiving feedback on it can be very
satisfying, as can working together with other contributors to improve
their own work. Knowing that your contribution may help others to
increase their own knowledge about the subject you've written about -
whether that's for work, just out of interest, or whatever other reason -
is very satisfying.
Contributing to Wikipedia -
at least, contributing something meaningful - is not effortless. But if
your attitude is that anything that takes a bit of effort isn't worth
doing, I'm afraid you're not going to go far in life. Is it worth the
effort? Yes, definitely. Come and join in.
Written May 11


Guidelines for Participating in Wikipedia from NIH | National Institutes of Health (NIH)


Guidelines for Participating in Wikipedia from NIH

Wikipedia is an encyclopedia and the fifth most popular property on the web.
At present [February 3, 2015] there are more than 75,000 active
contributors across the globe with hundreds of thousands of volunteers
at computers working on more than 13,000,000 articles in more than 260
languages. Recently, the system has become much more sophisticated and
"vandals" are handled quickly. But the majority of science articles and
good health information are rated by Wikipedia as "incomplete." There is
a real opportunity to strengthen this public resource. A biochemistry
professor who was part of the NIH Wikipedia Academy workshop in July
2009, noted about the content: "[it is] meant to derive its authority
from journal references, scientific literature…." We hope these
guidelines will help you to become part of a unique opportunity in
keeping with the NIH’s history of making credible, vetted, authoritative
information available to the public. The time spent can be minimal, but
the impact could be great. Information you have already developed that
might benefit scientists or the public worldwide could be put up in a
few minutes — or one item in a journal club could be an entry (suggested
by some NIH scientists) as a shared experience, or if one has
permission and is so inclined, an authored article in an area needing
new information would require more effort. Many have heard that schools
will not permit citation of Wikipedia; this is completely appropriate, and
acknowledged by Wikipedia. Wikipedia, as with any encyclopedia, would not be
a credible citation; however, the growing list of peer-reviewed,
published sources used in its citations makes it an invaluable tool for
getting started or locating original sources.

  1. NIH scientists and health and science writers can
    contribute to Wikipedia within their own fields. Contributions may be
    in the form of a range of activities from authoring articles in areas of
    individual or laboratory expertise, editing for accuracy or improving
    current entries, or providing—and this is an important contribution —
  2. Remember that Wikipedia makes every effort to be a neutral,
    fact-based resource. The community is particularly sensitive to
    self-aggrandizement, hype, or policy items. The main page for NIH
    programs (IC or Office) should go through the appropriate communication
    officer in your IC or Office.
  3. Time spent on Wikipedia entries must be approved by the immediate
    supervisor and the time permitted for this outreach activity
    predetermined before the work is begun. It is recommended that the staff
    person indicates the areas he/she will contribute to in the time
  4. In your NIH capacity: NIH staff scientists and science writers and
    health professionals may only contribute to Wikipedia entries in their
    own scientific and health areas of expertise from the NIH.
  5. In your personal capacity: To contribute to articles in additional
    areas of interest, NIH employees should use personal resources,
    including non-Government home computers and personal internet accounts.
  6. NIH staff may only share information that is in the public domain
    and contribute factual information, not opinion. Nor should staff enter
    into discussions of policy; that is not appropriate to Wikipedia or to NIH.
  7. To ensure quality and avoid instances of plagiarism, Wikipedia
    requires contributors to cite literature or link to existing materials
    and to provide proper attribution to authors and sources, even if the
    work is in the public domain. There is constant vigilance by Wikipedians
    in favor of fact over opinion; inaccurate information is corrected
    through a series of permissions.
  8. The Wikimedia Foundation has established a switchboard, staffed by
    volunteers, for NIH editors and contributors. Contributors should use
    the NIH Switchboard for Wikipedia to ensure that your contributions are
    welcomed. We will be posting information about contacting the
    switchboard online.
  9. Seize opportunities to link to full articles through NIH public access holdings.
  10. These guidelines are designed for communications from NIH.
    Individuals may, of course, contribute privately to Wikipedia on their
    own time and own equipment in areas of personal knowledge.
  11. We are using these guidelines as a working document for the NIH
    community. If you have comments or questions, please contact us.
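Guideline 7's requirement to cite literature and attribute sources maps onto Wikipedia's footnote markup: a <ref> element wrapping a citation template such as {{cite journal}}. A sketch of the markup, using a real journal article as the example; the exact field layout shown is illustrative:

```
<ref>{{cite journal
 | last1 = Phillips | first1 = D. P.
 | last2 = Kanter | first2 = E. J.
 | title = Importance of the lay press in the transmission of medical knowledge to the scientific community
 | journal = The New England Journal of Medicine
 | volume = 325 | issue = 16 | pages = 1180-1183
 | year = 1991 }}</ref>
```

Wikipedia renders this as a numbered footnote linked to a formatted reference list entry, which is how contributed facts are tied back to the published literature.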
This page last reviewed on September 11, 2015
