Open Access | Letter to the Editor

    AI in biomedical science: innovations, challenges, and ethical perspectives

    Aynur Aliyeva 1,2*

    Explor Digit Health Technol. 2025;3:101144 DOI: https://doi.org/10.37349/edht.2025.101144

    Received: November 25, 2024 Accepted: April 01, 2025 Published: April 08, 2025

    Academic Editor: J. G. Manjunatha, Mangalore University, India

    Abstract

Artificial intelligence (AI) increasingly influences biomedical scientific writing and clinical practice. The recent article by Fornalik et al. (Explor Digit Health Technol. 2024;2:235–48. doi: 10.37349/edht.2024.00024) explores AI’s capabilities, challenges, and ethical considerations in scientific communication, particularly highlighting tools like ChatGPT and Penelope.ai. This commentary reflects on and expands the key themes presented by Fornalik et al., emphasizing AI’s role in auditory healthcare, particularly in otolaryngology and auditory rehabilitation. The discussion is based on a critical review and synthesis of recent literature on AI applications in scientific writing and auditory healthcare. Key technologies such as generative AI platforms, machine learning algorithms, and mobile-based auditory training systems are highlighted. AI has shown promising results in enhancing manuscript preparation, literature synthesis, and peer review workflows. In clinical practice, adaptive AI models have improved cochlear implant programming, yielding gains of up to 30% in speech perception accuracy. AI-driven mobile apps and telehealth platforms have also reduced listening effort and improved communication confidence and access to care in remote settings. However, limitations include data privacy concerns, lack of population diversity in datasets, and the need for clinician oversight. AI presents transformative opportunities across biomedical science and healthcare. To ensure its responsible use, interdisciplinary collaboration among clinicians, researchers, ethicists, and technologists is essential. Such collaboration can help develop ethical frameworks that foster innovation while safeguarding patient well-being and scientific integrity.

    Keywords

    Artificial intelligence, biomedical science, auditory rehabilitation, neurotechnology, ChatGPT, healthcare ethics

    Introduction

I am writing to express my appreciation for the recently published article “Rise of the machines: trends and challenges of implementing AI in biomedical scientific writing” by Fornalik et al. [1]. The manuscript explores the advancements, challenges, and ethical considerations of artificial intelligence (AI) in biomedical writing. Its discussion of AI’s capabilities, such as text generation, literature synthesis, and enhanced peer-review processes, is timely and thought-provoking.

    Expanding on key themes

This discussion resonates with key aspects of AI implementation that have emerged in recent literature. For instance, the authors emphasize the integration of ChatGPT and its potential role in transforming scientific communication, and they identify critical challenges such as plagiarism detection and the risk of biased outputs, which echo broader concerns within the medical community [2]. The authors also highlight the growing prevalence of ChatGPT as a generative AI tool, underscoring its ability to generate, paraphrase, and refine text effectively. In addition, Fornalik et al. [1] discuss tools such as Penelope.ai, which streamline manuscript review and check compliance with journal standards, and they elaborate on AI’s limitations in maintaining the scientific rigor and ethics required for publishing [3].

    Clinical integration in otolaryngology and real-world impact

Beyond the manuscript itself, these findings align with other contributions on AI in otolaryngology, in which the ethical challenges of integrating tools such as ChatGPT into medical research and patient care have been analyzed. These tools show potential in patient education and manuscript preparation [1, 3–6].

The study by Fornalik et al. [1] highlights the transformative potential of AI in biomedical scientific writing and its implications across various facets of healthcare and research. This perspective aligns with recent advancements in integrating AI into otolaryngology and auditory rehabilitation, where AI has demonstrated its capacity to enhance clinical and technological practices [1–3].

    One significant intersection of AI and biomedical innovation lies in integrating neurotechnology and medical devices—particularly in auditory rehabilitation. Recent advancements have seen the application of AI algorithms in optimizing cochlear implant programming through adaptive learning models and neural network-based sound processing. These approaches enable real-time customization of auditory input based on patient-specific hearing profiles, leading to measurable improvements in speech perception and auditory comprehension. For instance, studies utilizing deep learning techniques have demonstrated up to a 25–30% improvement in word recognition scores among post-implantation users, especially in noisy environments [5, 7, 8].
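
To make the idea of adaptive, patient-specific programming more concrete, the sketch below shows, in highly simplified form, what such a closed loop could look like in principle: per-electrode stimulation levels are perturbed and a change is kept only when a feedback score improves. This is a minimal, hypothetical Python example; the score function, electrode count, and update rule are illustrative assumptions and do not represent the methods of the cited studies or of any implant manufacturer.

```python
# Hypothetical sketch of a closed-loop cochlear implant (CI) mapping routine.
# The "word recognition score" below is a stand-in for a real behavioural or
# model-based feedback measure; all names and values are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def word_recognition_score(levels: np.ndarray) -> float:
    """Toy feedback signal: higher is better, peaking at an unknown
    patient-specific profile (here a fixed synthetic target)."""
    target = np.linspace(0.4, 0.8, levels.size)          # stand-in "ideal" map
    return float(1.0 - np.mean((levels - target) ** 2))  # score roughly in [0, 1]

def adapt_map(levels: np.ndarray, n_iter: int = 200, step: float = 0.02) -> np.ndarray:
    """Simple hill climbing: perturb one electrode at a time and keep the
    change only if the feedback score improves."""
    best = word_recognition_score(levels)
    for _ in range(n_iter):
        i = rng.integers(levels.size)                     # pick one electrode
        trial = levels.copy()
        trial[i] = np.clip(trial[i] + rng.choice([-step, step]), 0.0, 1.0)
        score = word_recognition_score(trial)
        if score > best:                                  # keep only improvements
            levels, best = trial, score
    return levels

initial_map = np.full(22, 0.5)        # 22-electrode array, neutral starting levels
tuned_map = adapt_map(initial_map)
print(f"score before: {word_recognition_score(initial_map):.3f}, "
      f"after: {word_recognition_score(tuned_map):.3f}")
```

In practice, reported systems replace the toy score with objective or patient-reported measures and use far richer models (e.g., deep networks for sound processing), but the closed-loop structure, measure, adjust, re-measure, is the common thread.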

Moreover, AI-powered mobile applications for auditory training now incorporate personalized feedback loops, voice recognition, and gamified exercises to engage users and enhance neuroplasticity. These tools not only offer accessible rehabilitation for remote or underserved populations but have also been associated with statistically significant reductions in patient-reported listening effort and gains in communication confidence. Similarly, telehealth platforms and AI-integrated social media applications provide continuous monitoring and adjustment, further reinforcing their practical utility. This convergence of AI and auditory healthcare directly complements the themes addressed by Fornalik et al. [1], particularly regarding AI’s role in streamlining clinical workflows, enhancing accessibility, and reinforcing patient-centered innovation in biomedical science [1, 9, 10].
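
As a simple illustration of the personalized feedback loops mentioned above, the sketch below adapts the difficulty of a speech-in-noise exercise with a 2-down/1-up staircase: trials become harder after two consecutive correct responses and easier after an error. The class, parameter values, and step sizes are hypothetical and are not drawn from any specific app cited here.

```python
# Minimal sketch of an adaptive auditory-training loop: the signal-to-noise
# ratio (SNR) of the next exercise is adjusted from the user's recent answers.
from dataclasses import dataclass

@dataclass
class StaircaseTrainer:
    snr_db: float = 10.0        # current difficulty (lower SNR = harder)
    step_db: float = 2.0        # size of each adjustment
    correct_streak: int = 0     # consecutive correct answers

    def update(self, answered_correctly: bool) -> float:
        """Return the SNR to use for the next trial (2-down / 1-up rule)."""
        if answered_correctly:
            self.correct_streak += 1
            if self.correct_streak >= 2:      # two correct in a row -> harder
                self.snr_db -= self.step_db
                self.correct_streak = 0
        else:                                 # any error -> easier
            self.snr_db += self.step_db
            self.correct_streak = 0
        self.snr_db = max(-10.0, min(20.0, self.snr_db))  # keep within safe bounds
        return self.snr_db

trainer = StaircaseTrainer()
for outcome in [True, True, True, False, True, True]:     # simulated responses
    print(f"next trial at {trainer.update(outcome):+.1f} dB SNR")
```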

    Limitations

Despite its promise, the implementation of AI in auditory healthcare faces limitations such as data privacy concerns, limited generalizability across diverse patient populations, and the need for clinician oversight to validate algorithmic decisions. Current models often rely on datasets that may not capture the complexity of real-world auditory disorders. Future advancements will likely focus on integrating multimodal AI systems with brain-computer interfaces and wearable devices to enable fully personalized, adaptive auditory rehabilitation solutions [11–13].

    Conclusion and forward perspective

    Furthermore, the application of generative AI tools in postoperative care highlights the value of AI in delivering accurate, timely, and comprehensible information to patients. This capability is particularly impactful in resource-limited settings, where traditional healthcare access may be constrained. The emphasis on transparency and ethical issues in AI usage, as discussed by Fornalik et al. [1], aligns with these real-world applications, reinforcing the need for robust guidelines to ensure AI’s reliability and integrity [1, 5].

    These studies underscore AI’s role as a catalyst for innovation in clinical practice and scientific communication. The themes explored by Fornalik et al. [1] resonate strongly with advancements in auditory healthcare, reinforcing the importance of collaboration and ethical oversight to fully realize the potential of AI-driven technologies in medicine and research.

    Interdisciplinary collaboration among clinicians, researchers, ethicists, and technologists is essential to ensure ethical implementation and maximize AI’s benefits. Such collaboration will foster the development of responsible frameworks that support innovation while safeguarding patient welfare and scientific integrity [14]. Table 1 shows AI’s diverse and evolving applications in auditory healthcare and scientific communication, emphasizing its transformative impact on clinical decision-making, remote rehabilitation, manuscript development, and ethical innovation.

Table 1. Overview of artificial intelligence applications in auditory healthcare and scientific communication

Scientific communication
AI applications:
- Manuscript drafting and editing
- Peer review optimization
- Plagiarism detection
Details/technologies involved:
- ChatGPT, Claude 3, and other generative large language models (LLMs) for generating and refining scientific text
- Penelope.ai and Scholarcy for formatting, compliance, and readability checks
- AI-powered plagiarism detection tools (e.g., iThenticate, Turnitin AI) to ensure originality

Clinical practice (otolaryngology)
AI applications:
- AI-assisted cochlear implant (CI) mapping
- Diagnostic support for audiological disorders
- Surgical planning and risk assessment
Details/technologies involved:
- Deep neural networks (DNNs) and convolutional neural networks (CNNs) for speech sound classification
- AI-driven optimization of CI parameters via real-time auditory feedback
- Machine learning in imaging analysis for middle ear pathology and tumor detection

Mobile and telehealth applications
AI applications:
- AI-based auditory rehabilitation
- Remote monitoring and therapy
- Virtual audiometry platforms
Details/technologies involved:
- AI-enabled mobile apps (e.g., HearCoach, Amptify) with adaptive training modules
- Natural language processing (NLP) for speech feedback and assessment
- Gamification strategies to promote adherence and cortical plasticity
- Voice biomarker analysis for early detection of hearing decline

Ethical and future perspectives
AI applications:
- Development of ethical frameworks
- AI transparency and explainability
- Cross-disciplinary innovation
Details/technologies involved:
- Algorithmic audit systems to ensure bias minimization and fairness
- Involvement of ethics boards and institutional review in AI deployment
- Integration of wearable devices and brain-computer interfaces for closed-loop hearing systems
- Responsible AI (RAI) frameworks for regulatory and clinical compliance

    Thank you for bringing this essential discussion to light. I hope my reflections and additional insights contribute to the ongoing dialogue on the responsible and innovative use of AI in scientific communication.

    Abbreviations

    AI:

    artificial intelligence

    Declarations

    Acknowledgments

    ChatGPT and Grammarly were used to correct grammar and style in the preparation of this manuscript.

    Author contributions

    AA: Conceptualization, Investigation, Writing—original draft, Writing—review & editing.

    Conflicts of interest

    The author declares no conflicts of interest.

    Ethical approval

    Not applicable.

    Consent to participate

    Not applicable.

    Consent to publication

    Not applicable.

    Availability of data and materials

    Not applicable.

    Funding

    Not applicable.

    Copyright

    © The Author(s) 2025.

    Publisher’s note

    Open Exploration maintains a neutral stance on jurisdictional claims in published institutional affiliations and maps. All opinions expressed in this article are the personal views of the author(s) and do not represent the stance of the editorial team or the publisher.

    References

1. Fornalik M, Makuch M, Lemanska A, Moska S, Wiczewska M, Anderko I, et al. Rise of the machines: trends and challenges of implementing AI in biomedical scientific writing. Explor Digit Health Technol. 2024;2:235–48. [DOI]
2. Aliyeva A, Sari E. Be or Not to Be With ChatGPT? Cureus. 2023;15:e48366. [DOI] [PubMed] [PMC]
3. Aliyeva A. “Bot or Not”: Turing Problem in Otolaryngology. Cureus. 2023;15:e48170. [DOI] [PubMed] [PMC]
4. Diniz-Freitas M, López-Pintor RM, Santos-Silva AR, Warnakulasuriya S, Diz-Dios P. Assessing the accuracy and readability of ChatGPT-4 and Gemini in answering oral cancer queries—an exploratory study. Explor Digit Health Technol. 2024;2:334–45. [DOI]
5. Aliyeva A, Alaskarov E, Sari E. Postoperative Management of Tympanoplasty with ChatGPT-4.0. J Int Adv Otol. 2025;21:16. [DOI]
6. Ding K, Forbes S, Ma F, Luo G, Zhou J, Qi Y. AI bias in lung cancer radiotherapy. Explor Digit Health Technol. 2024;2:302–12. [DOI]
7. Han JS, Lim JH, Kim Y, Aliyeva A, Seo J, Lee J, et al. Hearing Rehabilitation With a Chat-Based Mobile Auditory Training Program in Experienced Hearing Aid Users: Prospective Randomized Controlled Study. JMIR Mhealth Uhealth. 2024;12:e50292. [DOI] [PubMed] [PMC]
8. Borjigin A, Kokkinakis K, Bharadwaj HM, Stohl JS. Deep learning restores speech intelligibility in multi-talker interference for cochlear implant users. Sci Rep. 2024;14:13241. [DOI] [PubMed] [PMC]
9. Rajamäki J. Digital Twin Technology training and research in health higher education: a review. Explor Digit Health Technol. 2024;2:188–201. [DOI]
10. Deo N, Nawaz FA, du Toit C, Tran T, Mamillapalli C, Mathur P, et al. HUMANE: Harmonious Understanding of Machine Learning Analytics Network—global consensus for research on artificial intelligence in medicine. Explor Digit Health Technol. 2024;2:157–66. [DOI]
11. Ueda D, Kakinuma T, Fujita S, Kamagata K, Fushimi Y, Ito R, et al. Fairness of artificial intelligence in healthcare: review and recommendations. Jpn J Radiol. 2024;42:3–15. [DOI] [PubMed] [PMC]
12. Williamson SM, Prybutok V. Balancing Privacy and Progress: A Review of Privacy Challenges, Systemic Oversight, and Patient Perceptions in AI-Driven Healthcare. Appl Sci. 2024;14:675. [DOI]
13. Lee EE, Torous J, De Choudhury M, Depp CA, Graham SA, Kim HC, et al. Artificial Intelligence for Mental Health Care: Clinical Applications, Barriers, Facilitators, and Artificial Wisdom. Biol Psychiatry Cogn Neurosci Neuroimaging. 2021;6:856–64. [DOI] [PubMed] [PMC]
14. Harishbhai Tilala M, Kumar Chenchala P, Choppadandi A, Kaur J, Naguri S, Saoji R, et al. Ethical Considerations in the Use of Artificial Intelligence and Machine Learning in Health Care: A Comprehensive Review. Cureus. 2024;16:e62443. [DOI] [PubMed] [PMC]