Using Generative AI (Artificial Intelligence) tools in research

The potential impact of generative AI (GenAI) on research is undeniable. Despite the possibilities for greater efficiency, researchers should carefully consider a variety of factors before using these tools in their work. GenAI tools such as ChatGPT, Stable Diffusion, and Bard can be applied across many phases of research: idea development, proposal writing, manuscript preparation, simulated data generation and modeling, conference preparation, and more. But should researchers use them? What are the guidelines?

 

Best practices

Use of GenAI tools carries risks in key areas including information security, plagiarism, fabrication, data privacy, and bias. To minimize these risks, Duke researchers should adhere to the following best practices.

  • Do not enter identifiable information into GenAI tools. Unless permitted by an executed research contract with approved information security controls, information with identifiers should not be uploaded into AI tools. This includes information collected in a research project, information from the medical record, student information, employee information, and more.
  • Don’t presume accuracy. Output from GenAI tools may be biased or inaccurate, may contain copyrighted material, or may be entirely fabricated (i.e., hallucinated). It is the responsibility of researchers and scholars at Duke to thoroughly review any AI-generated material before using it.
  • Let others know if you use AI tools. Duke researchers employing GenAI tools should always provide attribution and cite the tool appropriately, for example: "Text generated by [the AI tool]. Date content was generated. URL for the tool." Publishers may have specific guidelines on how AI can and cannot be used, and on whether AI can be credited as an author; review author guidelines carefully.

 

Duke AI Suite

Duke University provides a suite of AI platforms designed to make AI more accessible, secure, and practical for students, staff, and researchers. The suite includes ChatGPT, DukeGPT, MyGPT Builder, and the AI Gateway for advanced AI exploration.

Use of DukeGPT

DukeGPT provides the Duke community with a secure, university-managed platform to explore and compare advanced AI models. By combining on-premises open-source options with cloud-based foundation models, it offers strong privacy and data protection, along with tailored resources and tools for learning, research, and productivity. DukeGPT is not to be used with PHI. Access DukeGPT here.

Use of Microsoft Copilot 

Copilot is built on large language models (LLMs) such as GPT-4: advanced tools designed to predict and generate text based on your internal data (emails, reports, presentations, articles, etc.). Whether using the Copilot app or accessing it through a web browser, keep Duke data secure by signing in with your NetID. Duke's OIT has additional guidance and information on using Copilot for your work.

Discover the resources below and visit often for updates

Resources and topics covered
AI and Academic Research Town Hall (April 25, 2023) 
  • AI in Health-related Research (predictive and generative)
  • Image-based research
  • Regulatory considerations
  • Trustworthiness and AI  
AI and Teaching at Duke  
  • Guidance for using generative AI in teaching
  • Guide to writing AI policies
  • AI Detection software  
AI at Duke
  • Advancing AI research
  • Addressing ethical challenges posed by AI
  • Shaping the future of AI in teaching and learning
AI Health Events 
  • AI Health Seminar Series
  • Workshops and Studios 
ABCDS 
  • Algorithm-Based Clinical Decision Support for Duke Health 
Generative AI interest group in Teams 
  • MS Teams group for people interested in generative AI in healthcare.  
ICMJE Guidelines 
  • Author guidelines on use of AI-assisted technology  
NIH: Artificial Intelligence in Research: Policy Considerations and Guidance
  • Research Participant Protections
  • Data Management and Sharing
  • Health Information Privacy
  • Licensing, Intellectual Property, & Technology Transfer
  • Peer Review
  • Biosecurity and Biosafety

NSF notice and NIH notice
  • Notices to the research community on use of generative AI technologies in the merit/grant review process
Guidance from Duke Health Technology System
  • Appropriate use of LLMs in Duke clinical activity

Do you know of additional resources of use to the Duke research community? Contact researchinitiatives@duke.edu with ideas.