Collaborative Research in Trustworthy AI for Medicine
Penn Engineering and Perelman School of Medicine
University of Pennsylvania

Call for Proposals due January 31, 2024


One of the most compelling applications of AI-driven systems is in the broad areas of health and medical care: not only can AI-directed systems offer a level of continuous monitoring that exceeds human capabilities, but these systems can also incorporate each patient’s unique history, state, and preferences. The decreasing cost of sensors, computing elements, communication, and data storage, coupled with significant advances in machine learning algorithms, is a catalyst for this ongoing revolution in medicine. However, trust in AI systems remains limited both in the community of medical professionals and in the patient population. As such, making AI in medicine trustworthy is arguably the central challenge to enabling wider adoption.


Existing research, including a number of current projects at Penn, broadly falls into two categories. Researchers in computer science and engineering, usually funded by NSF, focus on advancing core AI technologies, but their limited domain knowledge can lead to solutions that are not clinically meaningful or useful. Researchers in medicine, on the other hand, usually funded by NIH, focus on innovative applications of AI to clinical research, but may run into fundamental obstacles or develop point solutions that do not generalize to other problems. We believe that truly transformative research in AI-driven medicine requires both a focus on clinically relevant problems and foundational advances leading to generalizable solutions. The research strengths of both Penn Engineering (SEAS) and the Perelman School of Medicine (PSOM), together with their geographical proximity, give Penn a unique opportunity to overcome the hurdles to transformative collaborative research across disciplines and work cultures. This funding program aims to jump-start impactful collaborative research at Penn to advance trustworthy AI for medicine, and is sponsored by the Penn Engineering Center ASSET (AI-Enabled Systems: Safe, Explainable, and Trustworthy) and the Penn Institute for Biomedical Informatics (IBI). This is the second call in this initiative; the list of projects funded from the first call, issued in Fall 2022, can be found here.

Proposal Requirements:

Proposals are welcome from all members of Penn’s faculty. Each proposal should be structured as follows:


  1. Information about investigators (up to 1 page): A proposal must have at least two PIs, one from SEAS and one from PSOM, and should ideally propose a new collaboration. For each PI, include contact information and a brief biosketch. No faculty member may be listed as a PI on more than two proposals.
  2. Proposed research (up to 2 pages): The proposal should describe a clinically relevant problem, the shortcomings of existing approaches, and how AI can potentially provide a transformative solution. It should also describe what foundational advance in trustworthy AI will be pursued and how that advance relates to the original clinical problem.
  3. Budget justification (up to 1 page): Each selected proposal will be awarded $100,000 for a duration of one year. The proposal should describe how the funds will be used. Note that funds cannot be used for faculty salary.

Timeline and Review Process:

  • Deadline for Proposal

    The deadline for proposal submission is 5:00 PM on January 31, 2024. Proposals should be submitted by email to Maggie Weglos (

  • Review Committee

    The review committee consists of Professors Rajeev Alur (SEAS), Christos Davatzikos (PSOM), John Holmes (PSOM), Insup Lee (SEAS), David Meaney (SEAS), and Qi Long (PSOM). Feel free to contact any member of the review committee with questions.

  • Decision Date

    We expect to select up to five proposals, and decisions will be announced by February 29, 2024.

  • Funds

    The funds will be available starting March 1, 2024.

  • Final Progress Report

    Each funded project will be required to submit a progress report in Spring 2025, and the investigators will be asked to present their results at a day-long on-campus symposium (see Fall 2023 Symposium).