Browsing by Author "Guiliano, Jen"
Item: Disrupting Hierarchies of Evaluation: The Case of Reviews in Digital Humanities (Knowledge Futures, 2022-11-15)
Authors: Risam, Roopika; Guiliano, Jen
Department: History, School of Liberal Arts

This essay discusses how the editors of the journal Reviews in Digital Humanities have developed a people-first approach to peer review: community-centered peer review policies, workflows, and practices intended to address the gap in evaluation of digital scholarship. This work offers a model for disrupting hierarchies of evaluation that position senior, tenured professors as the appropriate gatekeepers of "quality" for digital scholarship and instead reframes the notion of "scholarly community" to recognize that expertise lies beyond the professoriate, particularly when evaluating public-facing scholarship. The essay further offers an example of how to create a community-driven peer review culture that brings in graduate students, librarians, archivists, public humanities workers, curators, and more to assess scholarship. In doing so, it articulates a vision for disrupting conventional notions of "expertise" and, in turn, hierarchies of evaluation for scholarship within the academy.

What does it mean to develop and implement a people-first peer review system? This question lies at the heart of our work founding and running Reviews in Digital Humanities, an open-access journal published on PubPub that is dedicated to peer reviewing digital scholarly outputs (e.g., digital archives, exhibits, data sets, games) based on humanities research. Reviews responds to a gap in evaluation at the intersection of technology and the humanities, offering researchers who produce scholarship in genres other than traditional monographs, journal articles, and book chapters the opportunity to seek the imprimatur of peer review and external vetting of their work. From our commitment to creating a humane system of peer review that supports scholars as people, to the design of our peer review workflow, to the selection of the reviewers who participate, Reviews disrupts hierarchies of evaluation in the academy and aims to consistently remind our scholarly community that we are all people first.

The journal emerged from conversations between us, based on our experiences running peer review mechanisms for digital humanities conferences together. Through this work, we recognized a lack of consensus over how to peer review digital scholarly outputs. Even though colleagues in digital humanities create digital scholarship, there appeared to be no shared sense of how to evaluate the digital scholarship created by others. Although professional organizations like the Modern Language Association (MLA) and the American Historical Association (AHA) have invested time in developing guidelines, these have yet to be operationalized in evaluation. In addition to the challenges of conference abstract reviewing, there has also been a lack of outlets for peer review of digital scholarly projects themselves. We further observed that those most negatively affected by this lack of consensus were scholars in areas such as African diaspora studies, Latinx studies, Native and Indigenous studies, Asian American studies, and other areas that have been systematically marginalized in the academy.
As many in these fields are also scholars of color and/or Indigenous scholars, the peer review problems for digital scholarship compound harm in multiple ways: scholars in these areas already bear the burden of demonstrating the legitimacy of their research, a burden further compounded by the lack of an evaluation structure for the digital scholarship they create. This, in turn, affects how their work is (or isn't) valued in hiring, reappointment, tenure, and promotion. Recognizing that the many facets of these scholars' identities as people have a direct impact on their professional lives, we identified the lack of peer review as a clear deterrent to building up digital scholarship in these underrepresented fields in digital humanities.

Item: Final Report of IUPUI Public Access to Research Data Working Group (2022-04)
Authors: Baich, Tina; Ben Miled, Zina; Berbari, Nick; Chu, Gabe; Coates, Heather; Erkins, Esther; Friesen, Amanda; Guiliano, Jen; Han, Jiali; Organ, Jason; Yoon, Ayoung

In light of the movement toward greater access to and transparency in research, the Association of American Universities (AAU) and the Association of Public and Land-grant Universities (APLU) convened gatherings in October 2018 and February 2020 to provide a venue for learning, sharing, and planning (campus roadmaps) to support research universities in creating and implementing strategies and systems that provide public access to research data. At the request of Vice Chancellor of Research Janice Blum, Heather Coates and Tina Baich attended the February 2020 gathering. The primary goals of the 2020 convening were to identify best practices where they exist, to develop a Guide to Accelerating Public Access to Research Data at Academic Institutions (now available here), and to develop a strategic plan for AAU and APLU to drive future actions. As a result, Coates and Baich proposed that the Vice Chancellor for Research convene a working group to further this work on the IUPUI campus. Vice Chancellor Blum charged the Public Access to Research Data Working Group (PARDWG) with investigating the current landscape of data sharing at IUPUI and creating a plan to increase awareness among campus stakeholders and provide education around public access to research data. Working Group members were invited to ensure broad representation of disciplines, acknowledging that data sharing happens differently in different disciplines.