Outlook: Newsletter of the Society of Behavioral Medicine

Summer 2023

Leveraging Artificial Intelligence (AI) to Enhance your Productivity: Current Considerations with ChatGPT

Rebecca E. Lee, PhD, FSBM; Angela J. Fong, PhD; Rodney P. Joseph, PhD; Erin L. Van Blarigan, ScD; & the Physical Activity SIG

Artificial intelligence (AI), and more specifically, “chatbots”, are popping up all over physical activity science. A recent systematic review concluded that AI interventions are a promising strategy for delivering intervention content, engaging participants in conversations, and persuading participants to do more physical activity1. One chatbot, ChatGPT, has been gaining momentum. Of course, there is no substitute for the experienced human brain, and AI is only as good as the information fed into it, but chatbots might be a helpful tool for moving some of the day-to-day work of behavioral science along more efficiently. Below are some examples:

  • Many research labs are already successfully using ChatGPT to help draft intervention and social media content, such as text messages, newsletters, and blog posts. As with all AI-generated content, these outputs need to be double-checked by a human prior to posting. Nevertheless, this is a great time saver for busy researchers who often have limited time to continuously generate content for a lay audience (a brief illustrative sketch of how such drafting might be scripted appears after this list).
  • If an intervention concept needs to be fleshed out or checked for completeness, try asking a chatbot. Biswas recommends using ChatGPT as an information source in public health2. For example, ask the chatbot to list worksite-based strategies to promote physical activity. Although its response might be too generic or unbalanced, it can help researchers refine questions and generate ideas, providing opportunities for exploration and expansion.
  • If ChatGPT provides information that differs from what you are presenting in your work, it is a good opportunity to double-check and verify. For example, a query about population levels of physical inactivity in ChatGPT may return a number that doesn’t match your sources. If this occurs, it may be that more recent data have become available and that ChatGPT, by virtue of its ability to catalog vast amounts of information, has more up-to-date data than you do. It is also possible the chatbot is simply wrong! Make sure to follow up on any facts to find a valid reference.
  • Many papers and grant proposals are often written about the same study using the same methods, which can make it tricky to avoid plagiarizing oneself. Chatbots can help with this too. You can feed your original paragraph into the chatbot, and it will help with a rewrite. Of course, review the AI’s interpretation carefully, and double-check any references. Ultimately, chatbots like ChatGPT currently lack the nuance and critical thinking that are necessary for presenting a clear and interesting portrayal of science3.
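
For labs that want to go beyond the web interface, the drafting task in the first bullet can also be scripted. The sketch below is purely illustrative and is not part of the workflow described above: it assumes the OpenAI Python library (version 1.x client calls), an assumed model name, and made-up prompt wording, all of which you would adapt to your own setup. Any output would still need the human review emphasized throughout this article.

    # Hypothetical sketch: batch-drafting short physical activity text messages.
    # Model name and prompts are illustrative assumptions, not recommendations.
    from openai import OpenAI

    client = OpenAI()  # expects an OPENAI_API_KEY environment variable

    topics = ["walking breaks at work", "a weekend family activity", "stretching between meetings"]

    for topic in topics:
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",  # assumed model; substitute whatever your lab uses
            messages=[
                {"role": "system",
                 "content": ("You write friendly, 160-character text messages that "
                             "encourage physical activity for a lay audience.")},
                {"role": "user", "content": f"Draft one text message about: {topic}"},
            ],
        )
        draft = response.choices[0].message.content
        print(f"{topic}: {draft}")  # every draft still needs human review before use

Running a loop like this simply produces candidate drafts in bulk; the same review and fact-checking steps described in the bullets above still apply to each one.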

In sum, chatbots in general, and ChatGPT specifically, have the capacity to greatly enhance productivity. Like all technology, they need human oversight and expert interpretation. Caution is warranted: some authors have noted that ChatGPT citations are not always correct and, in some cases, may even be completely made up!4 Similarly, academic journals may have guidelines related to the use of AI in all aspects of writing, so be sure to double-check that your use of AI is aligned with these guidelines.5 A critical reminder: never share confidential information with a chatbot, because anything entered may be retained by the provider and cannot be assumed to remain private. For example, it is not appropriate to use AI for grant reviews.6 AI is the stuff of science fiction rapidly becoming science fact. (Editors’ Note: For SBM’s journals, the use of AI (for example, to help generate content, write code, or analyze data) must be disclosed both in cover letters to editors and in the Methods or Acknowledgements section of manuscripts. Neither symbolic figures such as Camille Noûs nor natural language processing tools driven by AI such as ChatGPT qualify as authors, and the publisher will screen for them in author lists.)

References

  1. Oh YJ, Zhang J, Fang ML, Fukuoka Y. A systematic review of artificial intelligence chatbots for promoting physical activity, healthy diet, and weight loss. Int J Behav Nutr Phys Act. 2021 Dec 11;18(1):160. doi: 10.1186/s12966-021-01224-6. PMID: 34895247; PMCID: PMC8665320.
  2. Biswas SS. Role of Chat GPT in Public Health. Ann Biomed Eng. 2023 May;51(5):868-869. doi: 10.1007/s10439-023-03172-7. Epub 2023 Mar 15. PMID: 36920578.
  3. Anderson N, Belavy DL, Perle SM, Hendricks S, Hespanhol L, Verhagen E, Memon AR. AI did not write this manuscript, or did it? Can we trick the AI text detector into generated texts? The potential future of ChatGPT and AI in Sports & Exercise Medicine manuscript generation. BMJ Open Sport Exerc Med. 2023 Feb 16;9(1):e001568. doi: 10.1136/bmjsem-2023-001568. PMID: 36816423; PMCID: PMC9936276.
  4. Anderson N, Belavy DL, Perle SM, Hendricks S, Hespanhol L, Verhagen E, Memon AR. AI did not write this manuscript, or did it? Can we trick the AI text detector into generated texts? The potential future of ChatGPT and AI in Sports & Exercise Medicine manuscript generation. BMJ Open Sport Exerc Med. 2023 Feb 16;9(1):e001568. doi: 10.1136/bmjsem-2023-001568. PMID: 36816423; PMCID: PMC9936276.
  5. Hosseini M, Rasmussen LM, Resnik DB. Using AI to write scholarly publications. Account Res. 2023 Jan 25:1-9. doi: 10.1080/08989621.2023.2168535.
  6. Lauer M, Constant S, Wernimont A. Using AI in peer review is a breach of confidentiality. CSR Review Matters Blog [Internet]. Bethesda, MD: U.S. Department of Health & Human Services, Center for Scientific Review; 2023 Jun 23 [cited 2023 Jun 27]. Available from: https://www.csr.nih.gov/reviewmatters/2023/06/23/using-ai-in-peer-review-is-a-breach-of-confidentiality/