Volume XLVIII, Number 2

Fall 2017

From the Editor's Desk

Timothy L. Linker
High Point University

The Journal of Research Administration (Journal) is the premier scholarly resource addressing excellence in research management, administration, and development of the profession. The Journal serves as a pathfinder for how to effectively grow and manage a research enterprise. As our profession has evolved and spread throughout the globe over recent decades, it has taken on a span of tasks that requires distinctive skill sets. This issue of the Journal reflects the breadth of duties that have now become the norm. Wells and co-authors, in their paper entitled “Allocation of R&D Equipment Expenditure Based on Organisation Discipline Profiles,” offer a new approach for determining the distribution of research and development equipment funding across large organizations. The authors compare equipment benchmarking data from the National Science Foundation (NSF) to comparable indicators from the Commonwealth Scientific and Industrial Research Organisation (CSIRO), Australia’s national science agency. Dr. Dwayne Lehman, in his article “Organizational Cultural Theory and Research Administration Knowledge Management,” applies cultural theory as a lens to interrogate the research administration community of practice. Lehman identifies four factors essential to promoting a learning culture in our community of practice and in higher education. Liberale and Kovach offer a case study in applying Lean Six Sigma methodology to the work of an Institutional Review Board (IRB) in an effort to reduce the time of review and decision-making in their article “Reducing the Time for IRB Reviews: A Case Study.”

Wiebe and Maticka-Tyndale, in their article entitled “More and Better Grant Proposals? The Evaluation of a Grant-Writing Group at a Mid-Sized Canadian University,” offer a case study on the effectiveness of implementing Dr. Robert Porter’s approach to stimulating proposal development. Using Dr. Porter’s approach, the authors saw significant increases in both proposal submission and funding rates. Hottenstein, in her article “Protecting the Teaching and Learning Environment: A Hybrid Model for Human Subject Research Public Policy Implementation,” examines the intersection of robust human subject compliance and enabling undergraduate research through Ripley’s Model of the Policy Process. Her findings indicate that a hybrid model offers a reliable compliance mechanism that encourages undergraduate research. In the article “Using Competencies to Transform Clinical Research Job Classifications,” Brouwer and co-authors outline a process and tools developed at Duke University for classifying and remapping clinical trials staff.

As always, I want to thank the Journal’s Deputy Director, Dr. Nathan Vanderford, and the editorial board for their outstanding efforts. Your Journal team works hard to bring you the best research in our field. If you will be at the SRA International annual meeting in Vancouver, British Columbia in October 2017, please consider attending the no-cost, Journal-provided learning lab, Stepping Stones to Becoming a Peer-Reviewed Journal Author. This three-hour lab, held on the morning of Sunday, October 15, 2017, will provide participants with an overview of the Journal’s peer-review process and of how to prepare a manuscript. Additionally, the Journal will offer a no-cost webinar on scholarly writing on March 9, 2017. Please send an email to the address below if you would like more information. Finally, if you are a non-SRAI member and wish to have the Journal delivered to you via email, please send a message with your name and institution to journal@srainternational.org.

Abstracts

Using Competencies to Transform Clinical Research Job Classifications

Authored by:
  • Rebecca Namenek Brouwer
    Duke University
  • Christine Deeter
    Duke University
  • Deborah Hannah
    Duke University
  • Terry Ainsworth
    Duke University
  • Catherine Mullen
    Duke University
  • Betsy Hames
    Duke University
  • Heather Gaudaur
    Duke University
  • Tara McKellar
    Duke University
  • Denise C. Snyder
    Duke University

The field of clinical research has changed considerably in the past 20 years. As the work in this realm has come to embody far more than the pursuit of improved patient care, staff supporting the research are asked to take on additional responsibilities, learn new processes, and be continuously educated on modernized policies and procedures. To address the increased responsibilities and complexities of the work, Duke University School of Medicine leadership agreed that an overhaul of job descriptions for clinical research professionals was needed. A working group was created, assembling administrative leaders, human resources professionals, and clinical research subject matter experts. The Clinical Research Professionals Working Group (CRPWG) aimed to reduce the number of job classifications at Duke from approximately 80 to 12 and to utilize a competency-based approach to professionalize the clinical research working environment. The Joint Task Force for Clinical Trials Competency (JTFCTC) developed draft competencies that served as the foundation for a tool used to define job descriptions and map incumbent employees into the new jobs. Almost 600 employees were mapped using the competency-based tool. This paper describes the processes used to develop the competency-based tool and map incumbents, and provides the results and lessons learned from the mapping. A strong workforce of clinical research professionals will enable higher quality research and ultimately lead to better patient care and health outcomes.

Protecting the Teaching and Learning Environment: A Hybrid Model for Human Subject Research Public Policy Implementation

Authored by:
  • Kristi N. Hottenstein, Ph.D.
    University of Michigan-Flint

Regulations for research involving human subjects have long been a critical issue in higher education. Federal public policy for research involving human subjects impacts institutions of higher education by requiring all federally funded research to be approved by an Institutional Review Board (IRB). Undergraduate research is no exception. Given the literature on the benefits of undergraduate research to students, faculty, and institutions, how human subject research public policy is implemented at the undergraduate level represents a significant gap in the literature, because the manner of implementation directly affects undergraduate research. This qualitative, single-case study examined the human subject research policies and practices of a selective, Midwestern, Council on Undergraduate Research institution. The purpose of the study was to determine how this institution implemented human subject research public policy to benefit its students. This institution used a hybrid approach to public policy implementation that met federal requirements while capitalizing on the role local actors can play in the implementation process. This model resulted in a student-friendly implementation emphasizing various learning outcomes and student mentoring. Although there is considerable research and public discussion on the negative aspects of IRBs, if approached in a manner that embraces student learning, the IRB experience can be an extremely beneficial aspect of the institution’s learning environment.

Reducing the Time for IRB Reviews: A Case Study

Authored by:
  • Andrea Pescina Liberale
    University of Houston
  • Jamison V. Kovach
    University of Houston

Research activities often involve enrolling human subjects as volunteers to participate in research studies. Federal regulations mandate that research institutions are responsible for protecting the ethical rights and welfare of human subjects from research risks. This is usually accomplished by requiring approval of research protocols by an institutional review board (IRB) through a review process that is often complicated and time-consuming. The aim of this research was to reduce the time to obtain IRB approval/denial decisions for research protocols. Through a case study, this research addressed this issue within a leading public research university using the Lean Six Sigma methodology, a structured, problem-solving approach for improving process performance. Analyzing the IRB review process and implementing solutions to address the root cause(s) of lengthy processing times helped to streamline this process, which enhanced investigators’ ability to conduct their research in a timely manner, while also ensuring compliance with federal regulations for human subject research.

Organizational Cultural Theory and Research Administration Knowledge Management

Authored by:
  • Dwayne W. Lehman
    Carnegie Mellon University

The administration and management of sponsored projects spans many levels within an institution of higher education. Research administration professionals require an operational understanding of a complex and intertwined set of disciplines that include project management, finance, legal, ethics, communication, and business acumen. The explicit knowledge needed for research administration is visible in work processes, policies, procedures, and organized knowledge repositories. The implicit, or tacit, knowledge required for the profession is much more difficult to externalize, codify, store, and share. The management of this knowledge is greatly affected by the culture of the organization where the person works and by the research administration community of practice. By applying organizational culture theory to the research administration profession and exploring shared artifacts, espoused beliefs and values, and basic underlying assumptions, barriers to and opportunities for knowledge management initiatives are revealed. Creating and sustaining a knowledge-sharing community involves establishing knowledge leaders in organizations who exhibit the ideals, beliefs, and principles of the profession, allocating opportunities for research administration professionals to communicate and share, utilizing dynamic information systems, and establishing metrics for knowledge management initiatives.

More and Better Grant Proposals? The Evaluation of a Grant-Writing Group at a Mid-Sized Canadian University

Authored by:
  • Natasha G. Wiebe
    University of Windsor
  • Eleanor Maticka-Tyndale
    University of Windsor

Obtaining external funding has become increasingly difficult for Canadian researchers in the social sciences and humanities. Our literature review suggests that grant-writing groups and workshops make an important contribution to increasing both applications for external funding and success in funding competitions. This article describes an 8-month grant-writing group for 14 social scientists at a mid-sized Canadian university. The goal was to increase applications and successes in funding competitions. The group integrated several strategies recommended by Porter (2011b) to encourage more and better grant proposals: offering “homegrown” workshops that were ongoing rather than occasional, sharing successful proposals, coaching and editing, bringing together emerging researchers with established ones, and placing participants in reviewers’ shoes. These strategies were combined in a series of monthly sessions that required participants to write each section of a grant proposal and share it with others for feedback. Participants perceived this approach to work well; it appeared to provide useful feedback and examples, and to develop a sense of accountability and community. The number of applications submitted for funding increased 80% from the funding cycle just prior to the group (2013-2014) to the funding cycle during or immediately after the group (2015-2016). The rate of success in obtaining funds from internal and external grant submissions increased from 33% to 50% over this same time period. The greatest increases in submissions and success were experienced by emerging and alternative academic researchers. From their program evaluation, the authors conclude that grant-writing groups are a useful way to build researcher confidence and commitment to submitting proposals to funding competitions and to contribute to success, especially for researchers with limited experience in such competitions.

Allocation of R&D Equipment Expenditure Based on Organisation Discipline Profiles

Authored by:
  • Xanthe E. Wells
    CSIRO
  • Nigel Foster
    CSIRO
  • Adam Finch
    CSIRO
  • Ian Elsum
    Australian National University

Sufficient and state-of-the-art research equipment is one component required to maintain the research competitiveness of an R&D organisation. This paper describes an approach to inform more optimal allocation of equipment expenditure levels in a large and diverse R&D organisation, such as CSIRO. CSIRO, Australia’s national science agency, comprises individual research units and conducts R&D across many disciplines. CSIRO’s research equipment expenditure allocations have to some extent been based on both previous years’ expenditures and current operating performance. In an effort to refine this process, a method was developed to consider the differences in expenditure profiles across research areas and to calculate a benchmark (or expected level) for research units within CSIRO. The approach also allowed CSIRO to compare its actual equipment expenditure levels to benchmark (or expected) levels derived from expenditure data from US academic institutions. This comparison found that CSIRO’s overall level of expenditure was below the benchmark levels, and it assisted in guiding the allocation of available funds more fairly across research units with different equipment needs.

Several datasets were used for this analysis. R&D equipment expenditure patterns across disciplines are available for US academic institutions, and the differences in levels between disciplines were calculated. For example, in the Biological Sciences equipment expenditure is 3% of total R&D expenditure, whereas in Physics it is 3-fold greater. Using research publication subject classifications, discipline profiles were constructed for CSIRO as a whole and for each of its units. Publication subject categories were also mapped to the research fields used by the US source. These datasets were combined to determine an overall benchmark value for CSIRO and for each unit. The value varied by a factor of 2.2 across individual CSIRO units. Actual equipment expenditure for CSIRO was determined using internal finance records. When this was compared to the benchmark levels, some units fell below the calculated benchmark values while a few were close to or above them.

The results of this study were considered by CSIRO managers when deciding equipment expenditure allocations, and the implications of the findings for the organisation are discussed. Furthermore, very few studies on research equipment expenditure were found to be readily available, and it is hoped that this study will encourage further discussion and research on this topic.