Sunday, 31 May 2015

SSRN Top Downloads For PRN: Philosophy of Art (Topic)

RECENT TOP PAPERS for all papers first announced in the last 60 days

1 Apr 2015 through 31 May 2015



Paper Title



The Scientific Articles on Art Criticism

Mina Hedayat,
Pegah Jahangiri,
Aida Torkamani,
Mahsa Mashayekhi,
Nader Ale Ebrahim

University of Malaya (UM); University of Applied Science and Technology; University of Malaya (UM) - Department of Engineering Design and Manufacture, Faculty of Engineering; University of Malaya (UM) - Research Support Unit, Centre of Research Services, Institute of Research Management and Monitoring (IPPP)

Date posted to database: 22 May 2015
Last Revised: 22 May 2015

SSRN Top Downloads

Friday, 29 May 2015

REGISTER NOW | One Day Workshop on Research Tools: Supporting Research and Publication

speaker ›› Dr. Nader Ale Ebrahim

Visiting Research Fellow, Research Support Unit, IPPP,

University of Malaya
medium ›› English
venue ›› MPWS Training Centre
time ›› 8:30am – 5:30pm
fees ›› Early Bird Rate (from 30 April 2015 until 30 May 2015) | Normal Rate (until closing date), including GST
GST No: 001902862336
“Research Tools” can be defined as vehicles that broadly facilitate research and related
activities. Scientific tools enable researchers to collect, organize,
analyze, visualize and publicize research outputs. Dr. Nader has
collected over 700 tools that enable students to follow the correct
path in research and to ultimately produce high-quality research
outputs with more accuracy and efficiency. They are assembled as an
interactive Web-based mind map, titled “Research Tools”, which is updated periodically.

“Research Tools” consists of a hierarchical set of nodes. It has four main nodes: (1) Searching the literature, (2) Writing a paper, (3) Targeting suitable journals, and (4) Enhancing visibility and impact of the research,
and six auxiliary nodes. Several free tools can be found in the child
nodes; some paid tools are also included. In this workshop, example tools
from the four main nodes will be described. The e-skills
learned from the workshop are useful across various research
disciplines and research institutions.

Problem statement

Research can be time-consuming and sometimes tedious. The
following statements capture some of the main concerns researchers
face in the research process:

• “I have just joined as a new postgraduate student and I am not sure how to do a literature search”
• “I have been in research for some time now, but I spend a lot of time getting the articles I want”
• “I am sure I have downloaded the article but I am not able to find it”
• “I wanted to write a new paper, how can I manage the references in the shortest possible time?”
• “I have many
references, some of my old papers, and some of my current research.
Sometimes, they are so many that I can’t recall where I have kept them
in my folders!”
• ……..
• “I have written an article and I am not able to find a proper Journal”
• “I want to increase the citations of my papers; how can I do that?”
Can research become easier, more fun and more
result-oriented? The answer to this question is YES. We need an
effective search strategy that can save hours of wasted research time
and provide a clear direction for your research.


The workshop seeks to serve the following objectives:

To discover how to use the tools available through the Net more efficiently.
To help students reduce search time by expanding researchers' knowledge of how to use the tools available through the Net more effectively.
To evaluate the types of literature that researchers will encounter.
To convert the information from the search into a written document.
To help researchers learn how to identify and evaluate the right journal for submission.
Who should attend?

Researchers at any experience and skill level
Research supervisors and research managers
Anyone interested in carrying out the research process

The workshop will bring the following benefits for the attendees:

» Expanding and developing a knowledge base for research related tasks and activities
» Effective presentation and communication of research to the scientific world
» Attracting a higher level of comments and citations from other researchers
» Gaining more academic visibility
» Finding and formulating a suitable research topic
» Developing high quality academic proposals, theses and articles.
Tentative program

Time Activity
8:30am Registration
9:00am Introducing the “Research Tools” Box
9:15am Developing a search strategy; finding keywords and proper articles
9:30am Evaluating paper/journal quality
10:30am Tea break 1
11:00am Keeping up-to-date (alert systems)
12:30pm Lunch break
2:00pm Paraphrasing and editing tools
2:30pm Desktop search and indexing tools; writing an academic paragraph
2:45pm Avoiding plagiarism
3:30pm Tea break 2
4:00pm Reference management tools; getting published and targeting a suitable journal
5:00pm Q&A
Nader Ale Ebrahim
is currently working as a research fellow with the Research Support
Unit, Centre of Research Services, Institute of Research Management
and Monitoring (IPPP), University of Malaya. Nader holds a PhD
degree in Technology Management from Faculty of Engineering, University of Malaya.
He has over 19 years of experience in the field of technology
management and new product development in different companies. His
current research interest focuses on E-skills, Research Tools, Bibliometrics and managing virtual NPD teams in SMEs’ R&D centers.

Nader developed a new method using the “Research Tools” Box,
which helps students reduce search time by expanding
researchers' knowledge of how to effectively use the tools that are
available on the Internet. “Research Tools” consists of a hierarchical set of nodes. It has four main nodes: (1)
Searching the literature, (2) Writing a paper, (3) Targeting
suitable journals, and (4) Enhancing visibility and impact.

He was the winner of Refer-a-Colleague Competition and
has received prizes from renowned establishments such as Thomson
Reuters. Nader is well-known as the founder of “Research Tools” Box and the developer of “Publication Marketing Tools”.
Registration fees cover refreshments and lunch. All fees must be fully paid before commencement of the course; otherwise,
participants will not be allowed to enter the lecture hall.
Intending participants should make their reservation/booking, with
payment, as soon as possible. We accept payment via Local Order (LO);
however, LO payments are charged at the normal rate, and the LO must be sent prior
to the event.

If a place is reserved and the intended participant fails to
attend the course on the date of the event, the fee is to be settled
in full. Fees paid are non-refundable. However, substitution of
participant(s) will be permitted at the discretion of the organizer. In
view of the limited places available, intending participants are
advised to send their registration with the payment made as early as
possible to avoid disappointment.

The organizer reserves the right to alter or change the
programme due to unforeseen circumstances. Every effort will be made to
inform the registered participants of any changes. The organizer will
not be responsible for the arrangement of participant's transportation
and accommodation should there be any changes to the date of the
workshop. The participant is responsible for checking the MPWS official
website from time to time for any changes, or for contacting the organizer.

Completed registration form accompanied by evidence of payment slip should reach the organizer not later than 3 working days before the commencement of the course.

If you require further details or clarifications, kindly contact the organizer

Malaysian Postgraduate Workshop Series (MPWS):

Email: (Training) | Email: (General)

Telephone: 03 8912 7212 | Fax: 03 8928 9212 (8.30 a.m. - 5.30 p.m. | Monday - Friday)

MPWS Training Centre, 63-1, 63-2, Jalan Kajang Impian 1/11,Taman Kajang Impian,

Seksyen 7, 43650 Bandar Baru Bangi, Selangor


REGISTER NOW | One Day Workshop on Research Tools: Supporting Research and Publication

Elsevier’s new sharing policy is really a reversal of the rights of authors.




Virginia Barbour
takes publishing giant Elsevier to task for its latest round of
restrictions on the sharing of academic research. Its new
policy states that, if no article processing charge is paid, an author's
accepted version of an article cannot be made publicly available via
their institution's repository until after an embargo period, which can
range from six months to four years.

Delegates at The Higher Education Technology Agenda (THETA) conference on the Gold Coast a few weeks ago heard from futurist Bryan Alexander
about four possible scenarios for the future of knowledge. Three of
them sounded engaging: there was one where “open information
architecture has triumphed”; another where automation is the primary
driving force; and a third which is a renaissance of “digitally enabled
However, one was chilling. This was where the drive for “open” has failed, and content is locked up in walled gardens. This future is closer than many of us might care to think. Today the Confederation of Open Access Repositories – an international association of open access (OA) repositories – sounded an alarm
that policies are being enacted which, if unchallenged, will ensure
that the foundations for these walls are cemented into place.

Image credit: James Nash CC BY-SA
Green and gold

There are currently two main approaches to open access publishing: Green and Gold. Green OA
is where the final published version of an article is only available
from a journal publisher’s site, after paying a subscription or after an
embargo period. However, the authors’ accepted version (after
peer-review but before copyediting), or an earlier version, can be made
immediately available via a repository – usually at an author’s institution or via a subject repository such as arXiv. Green
OA is the primary way that OA is supported in Australia. This is unlike
the UK, for example, which has chosen to support OA via Gold Open Access.

Gold OA is where the journal publisher typically charges the author
(or their institution) an “article processing charge” (APC), and it then
makes the article freely available to read and reuse via the journal’s
website. This is the model used by all journals published by the Public Library of Science. Gold
OA content is both free to read and, because of the license, usually
available for wide reuse. Sometimes there is a compromise, known as
Hybrid OA, where some articles in a journal are Gold OA but the
publisher also charges a subscription for the non-OA content.

Reversal of rights

This issue raised by COAR has been brought to the fore by a new policy
announced by the giant publisher Elsevier relating to embargo periods
for articles that can be shared via a “Green OA” policy. Elsevier’s new
policy is a substantial tightening of its rules around Green OA. It
states that, if no APC is paid, the author’s accepted version of the
article cannot be made publicly available via their institution’s
repository until after an embargo period, which ranges from six months
to four years.

In addition, the license required is the most restrictive possible,
in that it prohibits commercial reuse, or use of excerpts of the work.
For example, an author’s colleague would not be able to use a figure
from a manuscript in teaching without specific permission. The fully
typeset version of the article is available only from the publisher’s
site after paying a subscription. Despite Elsevier heralding
the policy as “unleashing the power of academic sharing”, it is really a
reversal of the rights of authors with their own manuscripts.

Previously, Elsevier and other publishers had allowed authors to
place these accepted versions into repositories with no restrictions on
sharing. It’s also worth noting that Elsevier derives immediate income
from subscriptions to the final published articles, although there is no evidence
that deposition of the accepted version into repositories decreases
that income. Then, in 2012, Elsevier announced that if an institution
had a policy on open access, then authors could not share their articles
unless the institution had entered into a specific arrangement with
Elsevier. This was a policy that was considered so manifestly absurd,
not to mention confusing, that it was widely ignored.

Undermining the walls

This is, at its heart, another skirmish in the long running saga of
who owns what, and who has rights in scholarly publishing. And, for the
publishers, how they derive income from it. The issue of income has been
brought into sharp focus recently by information released by the two
biggest funders of OA in the UK, the Wellcome Trust and the UK Research Councils.

Analysis of what organisations in the UK are paying for OA found that in 2013, 20 UK institutions spent £3,312,679 on APCs
for hybrid articles, which was on top of the £29,392,142 they had to
pay for subscription access to the same journals. In addition, the vast
bulk of APCs – £1,861,757 in the case of Wellcome – are not going to
newer publishers who are trying to make a sustainable business out of OA
publishing, but to traditional publishers such as Elsevier and Wiley.
The majority of money paid to them, including the highest APCs, was going to support hybrid OA.

This debate affects everyone, not just publishers, funders and
librarians. Academic research is one of the most valuable global public
goods that exists, and its value only multiplies with access and reuse,
most of which can’t be predicted or planned. Building walled gardens or
segmented siloes of content only restricts that public good. Nowhere is
this shown more starkly than the announcement that, following the Nepal
Earthquake, the US National Library of Medicine was activating its Emergency Access Initiative,
to provide “temporary free access to full text articles from major
biomedicine titles” but there are restrictions on use of content, and it
only runs until June 13, 2015.

Is that the future we really want? Where access to information, based
largely on research that was publicly funded, has to be doled out as
charity? If not, then we should take heed of COAR’s statement.

This article was originally published on The Conversation under the title “Publisher pushback puts open access in peril“. Read the original article.

Note: This article gives the views of the author, and not the
position of the Impact of Social Science blog, nor of the London School
of Economics. Please review our Comments Policy if you have any concerns on posting a comment below.

About the Author

Dr Virgina Barbour is the Executive Officer of the Australian Open Access Support Group.
She is based in Brisbane, Australia. She has a long history of working
in open access publishing, having joined PLOS in 2004 as one of the
three founding editors of PLOS Medicine, finally becoming Medicine and Biology Editorial Director of PLOS in 2014. Her training in publishing was at The Lancet where she worked before joining PLOS.

Impact of Social Sciences – Elsevier’s new sharing policy is really a reversal of the rights of authors.

The scientific articles on art criticism


Volume 11, Issue 13, 16 May 2015, Pages 130-138

The scientific articles on art criticism  (Article)

Cultural Center, University of Malaya, Kuala Lumpur, Malaysia

University of Applied Science and Technology, Art and Culture Branch 23, Malaysia

Academy of Malay study, University of Malaya, Kuala Lumpur, Malaysia

Research Support Unit, Centre of Research Services, Institute of
Research Management and Monitoring (IPPP), University of Malaya,


Research has contributed
extensively to improvements in the area of art criticism. These
improvements are reflected in scientific articles. This article
investigates 214 articles on art criticism to explore their main
characteristics. The articles were published in the Web of Science database
of the Institute for Scientific Information (ISI) between
1980 and 20 December 2013; the document types included in the study
were articles and reviews. The three top-cited articles in art
criticism (each cited more than 10 times) were published in 1993 and 1999.
The 214 articles had a mean citation rate of 0.87 (SD 2.38). Among the
various fields, art (58.87%), arts humanities other topics (28.03%),
both art and arts humanities other topics (5.14%), both art and
education and educational research (2.33%), both art and history
(1.40%), art, arts humanities other topics and literature (1.40%), both
art and cultural studies (0.93%), both art and philosophy (0.93%), both
art and literature (0.46%), and both arts humanities other topics and
cultural studies (0.46%) were the most popular fields of research. The
results showed that research done in the United States and written in
English had the highest citations. © 2015, Asian
Social Science. All Rights Reserved.

Author keywords

Art criticism; Article; Association; Citation-classics; Journal

ISSN: 19112017
Source Type: Journal
Original language: English

DOI: 10.5539/ass.v11n13p130
Document Type: Article
Publisher: Canadian Center of Science and Education

Scopus - Document details

Thursday, 28 May 2015

Publicar (casi exclusivamente) en revistas de impacto



Lorenzo García Aretio


Editorial, RIED, 18(2)

Around the maxim "publish or
perish", the editor of RIED writes a piece on the
excessive value given to scientific publication in high-impact
journals. He asks: where is the impact? Is there a social impact in
scientific publications? He then addresses the problem of the
qualification of the peer reviewers of scientific journals,
and also the serious errors these referees sometimes make.

Keywords

Journals; impact; reviewers; altmetrics


Publicar (casi exclusivamente) en revistas de impacto | García Aretio | RIED. Revista Iberoamericana de Educación a Distancia

Sunday, 24 May 2015

How do scientists share on academic social networks like ResearchGate?



A data analysis of
user habits shows open sharing is mostly limited to publications; very
few scientists are liberal with knowhow. A key aim of sciencebite is to create an open platform for scientific expertise online. At the moment, sciencebite
is much smaller than the established academic social networks, so I wondered how scientists behave
on these networks: how they are promoting their identities, publications
and expertise, and how they are connecting with each other.
Without wishing to comment on the
competitive positions of the three largest academic networks, in this
part of the world – Berlin – where we are based, ResearchGate is the one
that we hear about the most, and with a claimed user count of > 6m
academics from whitelisted institutions, it has certainly achieved an
amazing penetration of the world’s academics. Speaking personally as
someone with a background in academic science, I can confirm that every
scientist that I know now seems to have a ResearchGate profile, although
many do not actively engage with it.
So how are scientists behaving online,
taking ResearchGate as the source of data? I wrote a script to browse
ResearchGate profiles at random, and ran it twice to look at the change
over a year: first in November 2013 (sampling 3028 profiles), and
again in February 2015 (sampling 3407 profiles). The sample
represents some 0.1% of ResearchGate profiles, and leads to some pretty
interesting conclusions about how scientists behave online.

Scientists are increasingly sharing their professional identities online

ResearchGate has 6m users as of January
2015, and is currently growing at around 10k users per day. A growing
minority of users share their professional identity on the site (Figure
1): 36% shared a profile picture in 2013, 43% in 2015. Fewer have
updated their profiles with their current positions, but this too is
growing: 7% in 2013 and 24% in 2015. It’s interesting that fewer have
filled in a current position than uploaded a profile picture, which
perhaps reflects that many users register with Facebook, or have left
academia since joining ResearchGate. As of February 2015, 18% have both
uploaded a profile picture and filled in their current position. If we
consider this as a definition of an active user, we can estimate
ResearchGate’s active user base as 1.1m.
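The arithmetic behind that estimate can be sketched as follows (the 18% fraction and the totals come from the article itself; the raw profile count is a hypothetical figure chosen to be consistent with the reported percentage):

```python
# Back-of-the-envelope estimate of ResearchGate's "active" user base,
# where "active" = profile has both a picture and a current position.
SAMPLE_SIZE_2015 = 3407          # profiles sampled in February 2015
TOTAL_USERS = 6_000_000          # claimed user count, January 2015

both_picture_and_position = 613  # hypothetical raw count (~18% of the sample)

active_fraction = both_picture_and_position / SAMPLE_SIZE_2015
estimated_active_users = active_fraction * TOTAL_USERS

print(f"active fraction: {active_fraction:.0%}")                 # 18%
print(f"estimated active users: {estimated_active_users:,.0f}")  # ~1.1 million
```

Scaling a sample fraction to the whole population like this assumes the randomly browsed profiles are representative, which is the same assumption the article makes.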
Figure 1: Identity sharing on ResearchGate

A growing number of scientists are sharing full texts of their publications

We see an impressive engagement with
sharing full texts of publications online (Figure 2). Some 41% of
ResearchGate users had uploaded at least one full text by Nov 2013,
growing to 80% by Feb 2015. Considering the rapid growth of ResearchGate
signups in this time, it reflects an impressive rate of sharing among
new users.
Figure 2: Publication uploading on ResearchGate has achieved an amazing penetration of the academic community. Now 80% of RG users have uploaded at least one full text.
It’s remarkable that so many more users
have shared their publications (~3m) than filled in their profile
information (~1m). However, given the importance of publications in the
academic career path, it is perhaps less surprising: academics find it
more important to share publications than to display their current
position. This would surely be the opposite of user behavior on
mainstream professional networks like LinkedIn.
In addition to the sampling of users, I
sampled 1000 publications randomly on ResearchGate, looking at the
relationship between publication date and sharing. In the past
three years, a great number of new publications have been listed on
ResearchGate, reflecting its deep penetration of the scientific
community. We can also see that new publications are increasingly shared
on ResearchGate – almost half of new publications in the past three
years are uploaded (Figure 3).
Figure 3: The growth in shared full texts on ResearchGate reflects the amazing penetration of the academic community in recent years. Now almost 50% of new publications are shared openly as full text.

Almost nobody is using ResearchGate’s Open Reviews

ResearchGate launched a new feature in
March 2014: Open Reviews, for scientists to openly peer-review each
others’ papers after publication, for quality, originality,
reproducibility, etc. Unfortunately almost nobody is using it. Only 4
users in my sample of 3407 have given an Open Review, and none came back
to give a second.
The reason for this low adoption is
probably that scientists get little career advancement from an Open
Review. If they give a negative review, they may make enemies, and they
gain none of the influence over a journal that they do in the
traditional peer review system. Moreover, if they give a positive
review, it does not help their publication and citation record, which
are their main criteria to advance their careers.

Sharing of knowhow and advice is still lacking

Far fewer users engage with the Q&A
forum on ResearchGate, in comparison with their sharing of publications
(Figure 4). In November 2013, only 6.3% had ever asked, commented on, or
answered a question. Even of those who had engaged with Q&A, very
few were repeat users: 2.3% had ever asked a question, only 0.56% had
come back to ask a second question, and only one user in the entire sample
could really be classified as an active user, having asked more than 10 questions.
The situation had not improved much by
February 2015: 6.6% had ever asked, commented on, or answered a question;
2.8% had ever asked a question; and 1.1% had ever come back for a second.
No users in the sample had asked more than 10 questions.
Why do so few scientists request and
share practical knowhow and advice online? Perhaps it is the secretive
culture of research, and the importance of publications versus other
types of dissemination. Perhaps it is also because of the highly
specialized nature of knowhow in science – valuable knowhow is often
much more specialized than in other professions such as software
development, where programmers enjoy a culture of sharing all kinds of
practical information through open source software, blogs and Q&A sites.
Figure 4: Knowhow sharing on ResearchGate is still minimal. Only a very small proportion of users use the Q&A, and this proportion has changed little between 2013 and 2015.

Conclusion: scientists are changing some of their habits, but only as far as the academic career path allows them

Great numbers of scientists are now
sharing online, and in the past couple of years it has become common to
share publications openly on academic social networks like
ResearchGate. However, not many scientists have changed their behavior
beyond this – other types of sharing are still lacking. In particular,
sharing of expertise and knowhow is still minimal.
I believe this represents a great
challenge and opportunity for scientists who are interested in working
differently. The trend of online sharing among scientists still
overwhelmingly reflects the traditional academic career path, because it
is so strongly anchored to journal publications and citations. For all the
frustrations of collaborating and making progress in science, there are
perhaps still a great number of undiscovered solutions that would allow
us to work together better.

How do scientists share on academic social networks like ResearchGate?

Using Content Marketing Metrics for Academic Impact | Substantia Mea


Using Content Marketing Metrics for Academic Impact

Academic contributions start from concepts and ideas. When their
content is relevant and of a high quality they can be published in
renowned, peer-reviewed journals. Researchers are increasingly using
online full text databases from institutional repositories or online
open access journals to disseminate their findings. The web has surely
helped to enhance fruitful collaborative relationships among academia.
The internet has brought increased engagement among peers, over email or
video. In addition, they may share their knowledge with colleagues as
they present their papers in seminars and conferences. After
publication, their contributions may be cited by other scholars.

The researchers’ visibility does not rely only on the number of
publications. Both academic researchers and their institutions are
continuously being rated and classified. Their citations may result from
highly reputable journals or well-linked homepages providing scientific
content. Publications are usually ranked through metrics that will
assess individual researchers and their organisational performance.
Bibliometrics and citations may be considered as part of the academic
reward system. Highly cited authors are usually endorsed by their peers
for their significant contribution of knowledge to society. As a matter
of fact, citations are at the core of scientometric methods as they have
been used to measure the visibility and impact of scholarly work (Moed,
2006; Borgman, 2000). This contribution explores extant literature that
explain how the visibility of individual researchers’content may be
related to their academic clout. Therefore, it examines the
communication structures and processes of scholarly communications
(Kousha and Thelwall, 2007; Borgmann and Furner 2002). It presents
relevant theoretical underpinnings on bibliometric studies and considers
different methods that can analyse the individual researchers’ or their
academic publications’ impact (Wilson, 1999; Tague-Sutcliffe, 1992).

Citation Analysis

The symbolic role of citation in representing the content of a document
is an extensive dimension of information retrieval. Citation analysis
expands the scope of information seeking by retrieving publications that
have been cited in previous works. This methodology offers enormous
possibilities for tracing trends and developments in different research
areas. Citation analysis has become the de-facto standard in the
evaluation of research. In fact, previous publications can be simply
evaluated on the number of citations and the relatively good
availability of citation data for such purposes (Knoth and Herrmannova,
2014). However, citations are merely one of the attributes of
publications. By themselves, they do not provide adequate and sufficient
evidence of impact, quality and research contribution. This may be due
to a wide range of characteristics they exhibit; including the semantics
of the citation (Knoth and Herrmannova, 2014), the motives for citing
(Nicolaisen, 2007), the variations in sentiment (Athar, 2014), the
context of the citation (He, Pei, Kifer, Mitra and Giles, 2010), the
popularity of topics, the size of research communities (Brumback, 2009;
Seglen, 1997), the time delay for citations to show up (Priem and
Hemminger, 2010), the skewness of their distribution (Seglen, 1992), the
difference in the types of research papers (Seglen, 1997) and finally
the ability to game / manipulate citations (Arnold and Fowler, 2010).

Impact Factors (IFs)

Scholarly impact is a measure of the frequency with which an “average article”
in a journal has been cited over a defined time period (Glanzel and
Moed, 2002). Journal Citation Reports are published every June
by Thomson Reuters' Institute for Scientific Information (ISI). These
reports also feature data for ranking the Immediacy Index of articles,
which measures the number of times an article appeared in academic
citations (Harter, 1996). Publishers of core scientific journals
consider IF indicators in their evaluations of prospective
contributions. Despite severe limitations in its
methodology, the IF is still the most common instrument for ranking
international journals in any given field of study. Yet, impact factors
have often been subject to ongoing criticism by researchers for their
methodological and procedural imperfections. Commentators often debate
about how IFs should be used. Whilst a higher impact factor may indicate
journals that are considered to be more prestigious, it does not
necessarily reflect the quality or impact of an individual article or
researcher. This may be attributable to the large number of journals,
the volume of research contributions, and also the rapidly changing
nature of certain research fields and the increasing representation of
researchers. Hence, other metrics have been developed to provide
alternative measures to impact factors.
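For concreteness, the standard two-year impact factor reduces to a simple ratio (a minimal sketch of the well-known definition; the journal figures below are invented for illustration):

```python
def impact_factor(citations_in_year: int, citable_items_prev_two_years: int) -> float:
    """Two-year journal impact factor for year Y: citations received in Y
    to items the journal published in Y-1 and Y-2, divided by the number
    of citable items it published in Y-1 and Y-2."""
    return citations_in_year / citable_items_prev_two_years

# Hypothetical journal: 450 citations in 2014 to the 300 articles
# it published in 2012-2013.
print(impact_factor(450, 300))  # 1.5
```

Note that this is a property of the journal as a whole, which is precisely why, as the text argues, it says little about the quality of any individual article.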


The h-index attempts to calculate the citation impact of the academic
publications of researchers. Therefore, this index measures a scholar's
productivity by taking into account their most cited papers and the
number of citations that they received in other publications. This index
can also be applied to measure the impact and productivity of a
scholarly journal, as well as a group of scientists, such as a
department or university or country (Jones, Huggett and Kamalski, 2011).
The (Hirsch) h-index was originally developed in 2005 to estimate the
importance, significance and the broad impact of a researcher's
cumulative research contributions. Initially, the h-index
was designed to overcome the limitations of other measures of quality
and productivity of researchers. It consists of a single number that
reports on an author’s academic contributions that have at least the
equivalent number of citations. For instance, an h-index of 3 would
indicate that the author has published at least three papers that have
been cited three times or more. Therefore, the most productive
researchers are likely to obtain a high h-index. Moreover, the best
papers in terms of quality will tend to be the most cited.
Interestingly, this is driving more researchers to publish in open
access journals.
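The definition above translates directly into a few lines of code: sort an author's citation counts and find the largest h such that at least h papers have h or more citations each. A minimal sketch (the function name and sample citation counts are illustrative):

```python
def h_index(citations):
    """Largest h such that the author has at least h papers
    with at least h citations each."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank  # paper at this rank still clears the threshold
        else:
            break
    return h

# An author whose papers were cited [10, 5, 3, 1] times has an
# h-index of 3: three papers with at least 3 citations each.
print(h_index([10, 5, 3, 1]))  # → 3
```

This matches the worked example in the text: an h-index of 3 means at least three papers cited three times or more.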


The science of webometrics (also cybermetrics) is still in an
experimental phase. Björneborn and Ingwersen (2004) indicated that
webometrics involves an assessment of different types of hyperlinks.
They argued that relevant links may help to improve the impact of
academic publications. Webometrics thus refers to the quantitative
analysis of activity on the world wide web, such as downloads (Davidson,
Newton, Ferguson, Daly, Elliott, Homer, Duffield and Jackson, 2014).
Webometrics recognises that the internet is a repository for a massive
number of documents that disseminates knowledge to wide audiences.
Webometric rankings measure the volume, visibility, and impact of web
pages. Webometrics emphasises scientific output,
including peer-reviewed papers, conference presentations, preprints,
monographs, theses, and reports. However, these kinds of electronic
metrics also analyse other academic material (including courseware,
seminar documentation, digital libraries, databases, multimedia,
personal pages and blogs, among others). Moreover, webometrics considers
online information about the educational institution, its departments,
research groups, supporting services, and the number of students
attending courses.

Web 2.0 and Social Media

Internet users are increasingly creating and publishing their
content online. Never before has it been so easy for academics to engage
with their peers on both current affairs and scientific findings. The
influence of social media has changed the academic publishing scenario.
As a matter of fact, recently there has been an increased recognition
for measures of scholarly impact to be drawn from Web 2.0 data (Priem
and Hemminger, 2010).

The web has not only revolutionised how data is gathered, stored and
shared but also provided a mechanism of measuring access to information.
Moreover, academics are also using personal web sites and blogs to
enhance the visibility of their publications. This medium improves their
content marketing in addition to traditional bibliometrics. Social
media networks provide blogging platforms that allow users to
communicate with anyone with online access. For instance, Twitter is
rapidly becoming used for work related purposes, particularly scholarly
communication, as a method of sharing and disseminating information
which is central to the work of an academic (Java, Song, Finin and
Tseng, 2007). Recently, there has been rapid growth in the uptake of
Twitter by academics to network, share ideas and common interests, and
promote their scientific findings (Davidson et al., 2014).

Conclusions and Implications

There are various sources of bibliometric data, each possessing its
own strengths and limitations. Evidently, there is no single
bibliometric measure that is perfect. Multiple approaches to evaluation
are highly recommended. Moreover, bibliometric approaches should not be
the only measures upon which academic and scholarly performance ought to
be evaluated. Sometimes, it may appear that bibliometrics can reduce
the publications’ impact to a quantitative, numerical score. Many
commentators have argued that when viewed in isolation these metrics may
not necessarily be representative of a researcher’s performance or
capacity. In taking this view, one would consider bibliometric measures
as only one aspect of performance upon which research can be judged.
Nonetheless, this chapter indicated that bibliometrics still have their
high utility in academia. It is very likely that metrics will
continue to be used because they represent a relatively simple and
accurate data source. For the time being, bibliometrics are an essential
aspect of measuring academic clout and organisational performance. A
number of systematic ways of assessment have been identified in this
regard; including citation analysis, impact factor, h-index and
webometrics, among others. Notwithstanding, changes in academic
behaviours and the use of content marketing on the internet have
challenged traditional metrics. Evidently, the measurement of impact
beyond citation metrics is an increasing focus among researchers, with
social media networks representing the most contemporary way of
establishing performance and impact. In conclusion, this contribution
suggests that these bibliometrics, as well as recognition by peers, can
help to boost the productivity and research quality of researchers,
research groups and universities.


Arnold, D. N., & Fowler, K. K. (2011). Nefarious numbers. Notices of the AMS, 58(3), 434-437.

Athar, A. (2014). Sentiment analysis of scientific citations.
University of Cambridge, Computer Laboratory, Technical Report,

Borgman, C. L. (2000). Digital libraries and the continuum of scholarly communication. Journal of Documentation, 56(4), 412-430.

Borgman, C. L., & Furner, J. (2002). Scholarly communication and bibliometrics.

Bornmann, L., & Daniel, H. D. (2005). Does the h-index for
ranking of scientists really work?. Scientometrics, 65(3), 391-392.

Bornmann, L., & Daniel, H. D. (2007). What do we know about the h
index?. Journal of the American Society for Information Science and
Technology, 58(9), 1381-1385.

Björneborn, L., & Ingwersen, P. (2004). Toward a basic framework
for webometrics. Journal of the American Society for Information Science
and Technology, 55(14), 1216-1227.

Glänzel, W., & Moed, H. F. (2002). Journal impact measures in bibliometric research. Scientometrics, 53(2), 171-193.

Harter, S. (1996). Historical roots of contemporary issues involving self-concept.

He, Q., Pei, J., Kifer, D., Mitra, P., & Giles, L. (2010, April).
Context-aware citation recommendation. In Proceedings of the 19th
international conference on World wide web (pp. 421-430). ACM.

Java, A., Song, X., Finin, T., & Tseng, B. (2007, August). Why we
twitter: understanding microblogging usage and communities. In
Proceedings of the 9th WebKDD and 1st SNA-KDD 2007 workshop on Web
mining and social network analysis (pp. 56-65). ACM.

Knoth, P., & Herrmannova, D. (2014). Towards Semantometrics: A
New Semantic Similarity Based Measure for Assessing a Research
Publication’s Contribution. D-Lib Magazine, 20(11), 8.

Kousha, K., & Thelwall, M. (2007). Google Scholar citations and
Google Web/URL citations: A multi‐discipline exploratory analysis.
Journal of the American Society for Information Science and Technology,
58(7), 1055-1065.

Moed, H. F. (2006). Citation analysis in research evaluation (Vol. 9). Springer Science & Business Media.

Nicolaisen, J. (2007). Citation analysis. Annual review of information science and technology, 41(1), 609-641.

Priem, J., & Hemminger, B. H. (2010). Scientometrics 2.0: New
metrics of scholarly impact on the social Web. First Monday, 15(7).

Seglen, P. O. (1992). The skewness of science. Journal of the American Society for Information Science, 43(9), 628-638.

Seglen, P. O. (1997). Why the impact factor of journals should not be used for evaluating research. BMJ, 314(7079), 497.

Tague-Sutcliffe, J. (1992). An introduction to informetrics. Information processing & management, 28(1), 1-3.

Wilson, C. S. (1999). Informetrics. Annual Review of Information Science and Technology (ARIST), 34, 107-247.

Using Content Marketing Metrics for Academic Impact | Substantia Mea

Saturday, 23 May 2015

Promoting Open Access research with the Internet and social media | Open Science


Promoting Open Access research with the Internet and social media


July 7, 2014
This post is the fifth part of my series ‘How to promote an Open Access book?’. Having realized that it may apply to both books and papers, I decided to change the title a little bit.

It is impossible to imagine promotion without the Internet and
especially without social media. Obviously, this also includes promoting
research. However, it is important not to overestimate the significance
of Internet promotion – it can be worthless without the use of other
methods. Therefore, before you start to think about Twitter, Facebook or other platforms, make sure that you have thought through all the steps I
have described in my previous posts on book promotion. This applies
especially to networking, discussed in my first entry, which is
unavoidable when talking about efficient strategies in social media.
Social media may support your actions conducted in the ‘real
world’ (like attending conferences, participating in workshops,
discussions etc.) but it will not replace them.

A second important truth about social media is that no third party
service offers you full control. One day a post you submitted on a
social media platform or even on your own account might be blocked or
deleted. Services themselves might also be discontinued (who remembers …?).

Create a personal website

It is clever to establish your personal website and to treat it as a
primary place for on-line publication (this is easy thanks to free,
modern content management systems such as WordPress).
Create the website in the domain of your research institution or
university. If you do not have such an opportunity, buy your own domain
(it is not very expensive). All your publications, databases and
research outputs should be first of all submitted there (I hope you have
chosen a publisher who allows it) and then promoted via social media.
It is wise to create a separate page named ‘publications’ on your
website. Remember that the file or article metadata (title, author
etc.) should be visible on the page that is linked to the document. Post
your academic CV onto your website (you can create a separate page for it).


Ok, you are done with the website. Now it is a good time to get your Open Researcher and Contributor ID
(ORCID). ORCID is a persistent and unique number, which distinguishes
you from other researchers, even those with similar names. It is used by
authors and publishers (including De Gruyter) to attribute works to
respective contributors. You can use ORCID to link all your online
publications – your website, blog entries, as well as social media
profiles – to one account, with complete information about your
scholarly works.

Use your website as a blog

Finally, you can use your website as a blog to share information
about your new professional concepts and all promotional events,
discussions and conferences that you are taking part in. Write about
your involvement in the research community, let your blog entries become
a part of your every day social networking. Every time you write about
something, try to inform the people who are involved in this particular
event or discussion, but do not be annoying (social media and especially
Twitter is a perfect tool for letting people know that you have
mentioned them – more below).

Choose Social Media Platforms

Nowadays there are several social media platforms, including some
dedicated only to scientists. In my opinion Twitter, Google Plus,
Facebook, LinkedIn and a few others are the ones that you should
consider when thinking about promoting your work. You probably do not
have enough time to contribute everywhere. Cheer up, nobody has.
Twitter, G+ and Facebook are quite dynamic and are good for continuous
communication with people that follow you. You should choose at least
one of these services to promote your blog posts and to discuss recent
events and issues connected to your work. I think it is wise to choose
the one that you are already familiar with, and that is used by your
co-workers and friends. If you have no personal preferences I recommend
Twitter – it is the most popular among scientists in Western Europe and
USA. Do not forget to link your social media profiles to your website
and your ORCID.

On the other hand, LinkedIn and similar services allow you to submit a
lot of information regarding your career as a researcher and to post
your works. I would recommend doing that, even if it duplicates the
information on your personal website. These services have their own
internal search and recommendation tools and are quite popular, so your
profile here might be more discoverable for some people than your
website. If you decide to create your profile on one of these social
networks do not hesitate and publish as much professional information as
you can and add publication lists (with full text PDFs – if your
agreement with a publisher allows this), your ORCID and your personal website address.

At the very end, I would suggest publishing the final version of your book or paper, including your data, notes, etc. on Figshare.
Figshare is attracting growing popularity due to two important facts:
You can upload data in almost any format (images, audio, charts,
tables, as well as full texts, etc.), and each separate piece of content
uploaded there receives its own DOI, which makes it more discoverable.

Is it a lot of work? Hmm, maybe even too much. Above all, remember to
take care of the quality of your research, then discuss it with
colleagues, find a good publisher, promote your work at conferences,
meetings, seminars, etc., and then if you still have some time left,
think about Internet promotion.

This entry was posted on July 7, 2014 by Witold Kieńć.


It’s ok to be lazy with the Google Scholar Button | Open Science


It’s ok to be lazy with the Google Scholar Button


May 22, 2015
Google has come up with a new serviceable solution for
researchers. The Google Scholar Button has become a hit in the past few
weeks. It also almost made me cry with joy. What is so useful about it?

The Google Scholar Button extension is currently available for both Chrome and Firefox browsers,
which means for the majority of Internet users. The add-on has already
been installed by more than 374 thousand Chrome users and 27 thousand
Firefox users (there are plenty of scholars around the world, right?)
What is more, the extension is highlighted among “the hottest” on the
official Firefox add-ons website. The button grabbed almost everyone’s
attention as a tool for speeding up the search of full texts academic
articles in the Google Scholar database, which provides links to
millions of free papers. However, in my opinion the really nice thing
about the Google Scholar Button is that it makes the next step, managing
references, super easy.

I will confide a little secret to you. I hate the moment when I need
to add a reference to my text, but I do not remember all the
bibliographical data of the work I want to cite. Usually, I only
remember the author’s name, or in the worst case, what the text was
about and some essential keywords. When I was an undergraduate student
I did not have my own computer (yes, it was in the 21st century), and I
had to go to a library to borrow several books or journals and scan
them manually to find the texts I needed and to create proper
references. It got a little
bit better when I started using a PC with Internet connection, and a
lot better when I learned about BibTex (read more here about how to manage a bibliography with BibTex). But the Google Scholar Button, which appeared last month, almost made me cry with tears of joy.

Citing is a piece of cake now!

Let me show you how it works with an example. I write a sentence down
in my notebook that I want to cite, but, oy gevalt! I forgot to write
down the source of the sentence. Then I start my browser, click on the
Google Scholar Button on my toolbar, and type the phrase I want to cite
into the small, elegant window. Google Scholar shows me the search
results in the same window; then I click cite and get a ready BibTeX
entry, which I can add to my bibliography file. This way, I have a
properly formatted citation in my text just a few clicks away from the
point when I could hardly remember the name of the author that I wanted
to cite.
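The “ready BibTex entry” mentioned above looks something like the following. This is a hypothetical example, built from a paper in this chapter’s reference list; the exact fields and citation key depend on the record Google Scholar holds:

```
@article{seglen1997impact,
  title   = {Why the impact factor of journals should not be used
             for evaluating research},
  author  = {Seglen, Per O.},
  journal = {BMJ},
  volume  = {314},
  number  = {7079},
  pages   = {497},
  year    = {1997}
}
```

Pasting such an entry into a .bib file lets BibTeX format the reference automatically in whatever citation style the document uses.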

When you find an academic text mentioned on a website (the title of an
article, or some phrases cited from it) and you want to find the full
text of the source article, you can just highlight the text and click
on the button. It will provide you with a link to the full text (if it
is available in the Google Scholar database) and let you use the “cite”
option in the very same, elegant window.

[Print screen: the Google Scholar Button]

Not the first add-on of this kind

There are other add-ons, which offered similar possibilities before
the launch of the Button (i.e. search in Google Scholar). However, the
GS Button has a really great interface, which makes searching and citing
even quicker than in the case of former extensions of this kind. One
exception is Lazy Scholar,
a Chrome extension created by an individual developer, Colby Vorland,
well before the launch of the Button, which offers some very similar
features. In my experience, however, it searches less effectively for
content: often I find no results when searching for a text with Lazy
Scholar, but I do when using the GS Button (I really do not know the
reason for that). Still, Lazy Scholar may be a good alternative for
researchers working in other fields.

Is it wise to use the Google Scholar database in research work?

Since there has been a lot of criticism of Google Scholar, and its
limitations as a source for academic work are well known, is it a good
idea to use this database at all? Is the Google Scholar Button a nice
addition to a useless service?

Google Scholar indexes everything that is cited in articles or books
that it treats as “scholarly”, and scholars cite not only other
scholars but also other kinds of literature (such as press articles),
which they might use as research material. Website owners may
also have their websites crawled as trusted sources of academic content,
which results in their content being added to Google Scholar. We do not
know anything about the criteria Google Scholar uses for this
procedure. We only know that journalism, opinions, blog posts,
pseudo-science and very low quality academic articles are indexed there
(have a look here for more information). Jeffrey Beall, a well known opponent of open access, recently called Google Scholar “the world’s largest index of junk science”. This immediately resulted in him being accused of hypocrisy, since he promotes his very own Google Scholar profile on his website, as I understand it, to publicise his research works.

And this is all true about Google Scholar. This database simply
indexes a lot of works (including Beall’s). What is wonderful about
it is that it allows us to quickly search, and immediately access,
bibliographical data and, very often, the full text of an article,
which can otherwise be very hard to find. Almost every good academic
article is indexed in Google Scholar, which makes the database an
irreplaceable tool for every scholar. Not every article indexed there is
scientifically sound, and some of them are science fiction. But for me
this is only a reason not to use Google Scholar to count citations, and
to not use it mechanically and uncritically.

To sum it up, I am not surprised that the discussed extension is
installed in around 400 thousand browsers. Google Scholar and its Button
will make the lives of a lot of researchers much easier.

This entry was posted on May 22, 2015 by Witold Kieńć.
