Monday, September 19, 2022

Why don't some professors post PDFs by marginalized scholars? Altmetrics.


Gerardo Muñoz’s graduate school professor did not post a PDF of the assigned author’s paper on Canvas, the class learning management system. The instructor explained that they were reading works by marginalized scholars and that these authors would benefit if students downloaded their work so that the resulting metrics could provide an indication of scholarship impact. Muñoz thought that was noteworthy, so he tweeted about it, and that tweet has since racked up nearly 50,000 likes and thousands of retweets.

Many of the replies to Muñoz’s tweet voiced support, including “BEAUTIFUL,” “Data is capital,” “Normalize THIS” and “Damn.” Others expressed concern about the professor’s policy, including some who wrote, “an amazing way to have your students not read said scholarship of marginalized thinkers,” or “Oh how I hate living in an algorithm-driven society.”

At the heart of Muñoz’s Twitter storm are altmetrics, or alternative metrics, a relatively new measure of research impact that tracks how scholarly works are discussed, shared, read and reused online. This may include numbers of page views and downloads, shares and likes on social media, and mentions in online news, forums and policy documents.

Scholars have long relied on a variety of metrics to understand the relative value of their research. A journal’s impact factor, for example, is often thought to confer prestige on the journal’s authors. Likewise, an author’s h-index (the largest number h such that the author has h papers cited at least h times each) combines productivity with citation impact. Authors also count citations as a stand-alone measure.
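To make the h-index concrete, here is a minimal illustrative sketch in Python; it is not drawn from the article, and the sample citation counts are hypothetical.

    def h_index(citations):
        """Return the h-index: the largest h such that the author has
        h papers with at least h citations each."""
        h = 0
        for rank, count in enumerate(sorted(citations, reverse=True), start=1):
            if count >= rank:
                h = rank  # this paper still has at least `rank` citations
            else:
                break
        return h

    # Hypothetical example: five papers cited 10, 8, 5, 4 and 3 times.
    # Four papers have at least 4 citations each, but there are not five
    # papers with at least 5, so the h-index is 4.
    print(h_index([10, 8, 5, 4, 3]))  # prints 4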

Proponents argue that altmetrics may supplement traditional scholarly metrics in important ways, especially given those metrics’ limitations. Citations, for example, often take years to accumulate. Journal impact factors do not provide granular information about individual articles and, according to some critics, rest on flawed statistical reasoning.

In contrast, altmetrics accrue in near-real time and provide insight into how research influences societal conversations, thought and behavior. Such feedback may be especially meaningful to underrepresented and early-career researchers. But altmetrics are not a panacea, as they are susceptible to manipulation and do not always measure quality.

“Downloads matter,” said Tia Madkins, assistant professor in the University of Texas at Austin’s department of curriculum and instruction. “We talk about supporting scholars of color and what it means for people to know that their work is valued … People don’t always cite your work, so some universities and colleges have become more expansive in their views of how your work is being used.”

Altmetrics Help Tell a Story

Since scholarship now exists in media beyond print materials, some researchers now list altmetrics on their CVs and grant applications, especially when their colleges or funders indicate that they value that information.

“This ties into a bigger conversation about needing to modernize promotion and tenure criteria, because right now—how do I put this?—they’re based on certain metrics that maybe aren’t actually as informative as they were sold or are being used as proxies for something that they actually don’t represent,” Megan O’Donnell, a librarian at Iowa State University, said.

To be sure, altmetrics and traditional metrics are sometimes correlated. For example, a 2018 study identified a strong positive correlation between communication about scientific research on social media that happened within weeks of publication and citation rates that accumulated over months and years. The researchers concluded that increasing the profile of one’s work online early may predict an increase in future citations. In another study from 2018, researchers found that early altmetrics that counted online mentions of research products, such as mentions in mainstream news, can predict future citation counts.

Tools for Gathering Altmetrics

Altmetrics have emerged alongside the rise of open-access research. Many university librarians now point researchers to the Metrics Toolkit, a free resource developed “to help scholars and evaluators understand and use citations, web metrics, and altmetrics in the evaluation of research.” The tool kit’s editorial board is made up of experts versed in research impact measures who serve for one year at a time.

The PLoS Impact Explorer, for example, provides real-time metrics on “which articles are seeing the most buzz from social media sites, newspapers and in online reference managers.”

The free Altmetric Bookmarklet tool allows researchers to get metrics for an article they’ve published. Researchers may also embed code on webpages to get an “attention score” reported in the center of a colorful “doughnut.”
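Those badges and attention scores can also be retrieved programmatically. The short Python sketch below is illustrative only: it assumes Altmetric’s free public details endpoint (api.altmetric.com/v1/doi/<doi>) and a "score" field in the JSON response, and the DOI shown is a hypothetical placeholder; check Altmetric’s current documentation before relying on either assumption.

    import requests

    def attention_score(doi):
        """Fetch the Altmetric attention score for a DOI from the public
        details endpoint; return None if no record exists (HTTP 404)."""
        resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
        if resp.status_code == 404:
            return None
        resp.raise_for_status()
        return resp.json().get("score")

    # Hypothetical usage with a placeholder DOI:
    # print(attention_score("10.xxxx/example-doi"))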

Altmetrics’ Limitations

Altmetrics share some limitations with traditional scholarly metrics. For example, a work may have a high number of engagements for negative reasons, such as clicks or downloads that follow news of fraudulent research. Also, both alternative and traditional metrics may reflect the implicit bias of a community.

But some limitations are unique to altmetrics. The broad term has no single definition, which means that different scholars’ altmetrics can rarely be compared. Also, altmetrics can sometimes paint an incomplete picture of a scholar’s work. For example, when a PDF of a paper is shared directly, by email or on a course learning management platform, those page views and downloads are not counted. Muñoz’s graduate school professor was seeking to avoid this problem when she asked the students to search for and download the article on their own, skills that can also strengthen students’ ability to use research tools. Solving one problem, however, can sometimes introduce another.

“For authors, providing a link is better [than providing a PDF] because then every single downloaded visit and read is tracked,” O’Donnell said. “For students, a direct access to the PDF ensures they’re reading the right material.” A professor might provide a link to the article, assuming the library has a subscription to the journal. If the library does not have a subscription, instructors may have few options other than posting PDFs, assuming they obtained permission.

That said, altmetrics overcome some limitations of traditional metrics. They at least indicate how many readers viewed an article, even if the picture is sometimes incomplete. The journal impact factor, for example, provides no such information.

Still, no metric—traditional or alternative—is perfect.

“Almost everything can be gamed,” O’Donnell said. “Automatic bots can go around and download everything,” inflating a count. “There are also ways to game traditional metrics.” Citation cartels—groups of colleagues who work together to cite each other’s works—have helped researchers game traditional metrics, O’Donnell said. At least altmetrics that look too good to be true can often be traced, she reasoned. For example, a scholar who claims to have tens of thousands of engagements on a tweet should be able to produce the tweet with ease. Also, downloading bots can be exposed.

Though altmetrics may help tell a story about the societal impact of a scholar or their work, few claim they should be used in isolation.

“A more rigorous peer-reviewed citation is still the most trustworthy measurement of the impact and quality of the work,” said Hui Zhang, a digital services librarian at the University of Oregon. But Zhang is glad to see that altmetrics “are finally getting some attention,” especially as they provide more opportunities to make the work of underrepresented and early-career researchers more visible.

“We’re in an age where people aren’t using our work in the same ways that were popular even 10 years ago,” said Madkins of UT Austin. She is pleased that her institution asks faculty members to include measures of scholarship used by practitioners or other stakeholders in their tenure and promotion portfolios. “Knowing that people outside of academia are using my work is really valuable, because that’s who matters.”



from Inside Higher Ed | News https://ift.tt/yjehwsf
