Every Learner Everywhere
Online Learning Consortium

10 Best Practices for Generative AI Faculty Development: Insights from the Field

Generative AI is reshaping how faculty design assignments, give feedback, and think about academic integrity, often faster than institutional guidance can keep up. For centers for teaching and learning, faculty developers, and academic leaders, the challenge is no longer whether to address generative AI, but how to do so in ways that are pedagogically grounded, ethically responsible, and sustainable over time.

The Faculty Development and GenAI Playbook: Evidence-Based Best Practices, published by Every Learner Everywhere and the Online Learning Consortium, was developed to meet that need. Drawing on a national survey of CTL leaders and instructional support staff, along with interviews and institutional examples, the playbook synthesizes what colleges and universities are actually doing to support faculty as they navigate AI’s implications for teaching and learning. Rather than prescribing a single model, it documents common patterns, design choices, and tradeoffs that have emerged across different institutional contexts.

Beyond the best practices summarized below, it also includes sections on how faculty development efforts have evolved since the release of widely available generative AI tools, discussion of persistent barriers such as time constraints and policy ambiguity, and examples of how institutions are structuring programs, fellowships, and communities of practice. Readers will also find descriptions of practical tools, formats, and engagement strategies that CTLs are using to meet faculty where they are, from low-stakes experimentation to more advanced assessment redesign.

What follows in this article is an excerpt from the summary of best practices in The Faculty Development and GenAI Playbook. Together, these practices offer a snapshot of how institutions are approaching faculty development for generative AI today, and how they are balancing innovation with care, autonomy, and academic values. Readers interested in the full set of findings, context, and examples can download the complete playbook from Every Learner Everywhere.

Related reading — A Four-Stage Model for Faculty Development in GenAI

1. Prioritize ethical, responsible, and transparent use

Why: Ethical clarity builds trust, prevents misuse, and supports institutional integrity.

What to do:

  • Center training on ethical, equitable, and responsible AI use. In our survey, 100% of respondents rated ethical concerns as at least “important,” with 59.5% rating them as “critically important.”
  • Address academic integrity, bias, environmental impact, and institutional policies. These were frequently raised concerns in both survey comments and interviews, including challenges related to FERPA, HIPAA, and environmental cost.
  • Promote transparent AI use in syllabi and assignments (e.g., traffic light model).
  • Encourage reflective practice and open dialogue about AI’s societal role: “We encourage critical discussions and transparency in how AI is used in coursework,” shared one CTL leader.

2. Scaffolded, tiered, and flexible learning

Why: Faculty vary in readiness, time, and learning preferences.

What to do:

  • Offer tiered learning: intro (AI basics), intermediate (teaching with AI), advanced (assessment redesign with AI). Many CTLs use this structure already, with 40.48% revising existing programming and another 23.81% creating new offerings.
  • Use varied formats: workshops (92.86% offered), self-paced modules (28.57%), toolkits (78.57%), and 1:1 consultations (83.33%).
  • Provide “AI Playgrounds” and low-stakes environments for hands-on exploration.
  • Embed AI into existing professional learning (PL) offerings (e.g., book chats, mini-conferences) and offer badges or certifications.

3. Contextual, discipline-specific relevance

Why: Faculty engage more when training aligns with their teaching goals and field.

What to do:

  • Develop case studies and examples tailored to disciplines (e.g., nursing, engineering, humanities). Faculty in applied fields were the most engaged, with notable success stories including nursing simulations, photography inspiration, and engineering assignment scaffolding.
  • Partner with departments to co-create tools and adapt examples.
  • Align AI training with pedagogical goals and instructional needs like simulation or feedback: “We start by looking at their workflows, then show how GenAI can help,” one CTL reported.

4. Faculty autonomy and reflective practice

Why: Respecting faculty choice fosters trust and thoughtful adoption.

What to do:

  • Encourage informed decision making and pedagogical alignment. Interviewees emphasized avoiding mandates and instead promoting autonomy: “We allow space for all opinions. It’s not our job to push people to use GenAI tools.”
  • Support autonomy in tool adoption and classroom integration.
  • Facilitate reflective discussions on AI’s role in teaching and learning.

5. Normalize AI through practical applications

Why: Everyday use builds confidence and reduces resistance.

What to do:

  • Demonstrate real-world use cases (e.g., syllabus creation, assignment design). “AI for your workflow” was a common theme in interviews and yielded positive engagement.
  • Highlight time-saving benefits and instructional enhancements. In survey responses, efficiency in course design was rated “very important” or higher by 54.8% of CTLs.
  • Use live demos and peer modeling by faculty or instructional designers.

6. Peer leadership and community building

Why: Peer influence and shared learning foster sustained engagement.

What to do:

  • Identify and support GenAI faculty fellows or champions in each college. Institutions that engaged early adopters reported the most sustainable uptake.
  • Involve them in mentoring, co-leading groups, and showcasing innovations: “Our faculty fellow co-leads book groups and supports workshops—that’s been key to momentum,” one CTL shared.
  • Run faculty learning communities (FLCs) and interdepartmental cohorts. A total of 47.62% of CTLs have created peer networks as part of their GenAI strategy.
  • Create shared repositories of prompts, activities, and case studies.

7. Address barriers and emotional complexity

Why: Faculty face time constraints, ethical concerns, and uncertainty.

What to do:

  • Offer introductory sessions and ready-to-use resources. Survey data show that 84.6% of respondents cite limited knowledge and 82.1% cite lack of time as barriers to faculty engagement with AI.
  • Clarify institutional policies to reduce confusion. A total of 43.6% of respondents noted unclear policies as a major challenge.
  • Facilitate safe spaces for discussing ambivalence, burnout, and philosophical divides: “We host listening sessions to surface fears and concerns before jumping into the tools,” said one CTL leader.

8. Institutional support and sustainability

Why: Faculty need time, resources, and recognition to innovate responsibly.

What to do:

  • Advocate for stipends, release time, and recognition for AI-related work. Some institutions funded GenAI pilot projects or offered book stipends.
  • Fund pilot projects and course redesigns with small grants. A total of 74.36% of respondents named institutional support as one of the three most critical factors for long-term success.
  • Establish a central AI leadership role or committee on campus.
  • Collaborate across departments and with external partners.

9. Evaluate, iterate, and stay agile

Why: AI evolves rapidly and impacts teaching in dynamic ways.

What to do:

  • Use surveys (80.95% of CTLs) and participation data (76.19%) to assess impact. Consider taking the feedback loop one step further and gathering student feedback from courses where GenAI is newly implemented.
  • Track outcomes related to teaching quality and student learning.
  • Update materials regularly and plan PL on short cycles (e.g., quarterly).
  • Promote AI literacy as a core competency for faculty and students.

10. Foster a culture of curiosity and critical engagement

Why: A curious, critical mindset supports thoughtful innovation.

What to do:

  • Encourage open dialogue about AI’s benefits and risks. Looking ahead, 25.64% of CTLs anticipate AI becoming essential to their work, while 17.95% expressed uncertainty and concern.
  • Support critical thinking and exploration of discipline-specific tools.
  • Celebrate experimentation, creativity, and inquiry.

Download The Faculty Development and GenAI Playbook: Evidence-Based Best Practices