Carnegie Mellon University

Eberly Center

Teaching Excellence & Educational Innovation

GAITAR Fellows | Project Descriptions

Scott Andrew

Adjunct Faculty
Art
College of Fine Arts

60-424 AI Animation (Spring 2024)

Generative AI Tool(s) Used

Runway, Deforum Stable Diffusion, ChatGPT, ElevenLabs, Midjourney, Dall-E

Research Questions
  1. How does student use of generative AI tools to make animations impact their:
    1. technical and aesthetic control over their art?
    2. self-efficacy as animators and generative AI users?
Teaching Intervention with Generative AI

Scott’s students used generative AI tools as collaborators to create animations, especially during creative editing and stylization decisions. Applications during and between class sessions included generating storyboards, scripts, animated sequences, synthesized voice narration and voice acting, and sound designs, resulting in both narrative and experimental works of animation. The suite of generative AI tools included Runway, Deforum Stable Diffusion, ChatGPT, ElevenLabs, Midjourney, Dall-E and more.

Study design

Students used a suite of generative AI tools across all animation assignments. The first assignment required students to recreate an animation from a previous course, and Scott will compare student deliverables created with (treatment) and without (control) the assistance of generative AI. He will also measure changes in students’ performance and attitudes regarding animation across the individual assignments in the course.

Data Sources
  1. Students’ deliverables from animation assignments, scored via a rubric with criteria for technical and artistic control 
  2. Pre/post surveys of students’ self-efficacy regarding skills using generative AI and course learning objectives

Brandon Bodily

Assistant Teaching Professor
College of Engineering

49-101 Introduction to Engineering Design, Innovation, and Entrepreneurship (Fall 2024)

Research Question(s): 
  1. Does type of feedback interaction (peer vs generative AI) impact the quality of revisions of interview protocols?
  2. Does type of feedback interaction impact the development of self-efficacy for interviewing skills?
  3. Does type of feedback interaction impact the quality of conducting interviews?
  4. What are student attitudes about receiving feedback when role playing with a peer versus generative AI?
Teaching Intervention with Generative AI:

Bodily provided students with suggestions and tips for how to engage with the generative AI tool (Copilot). Students then interacted with the tool to conduct a practice interview and elicit feedback on their interview protocols. Students next updated their interview protocols and engaged in real-life interviews as part of the coursework.

Study Design:

Bodily delivered the same classroom instruction on interview protocol development to all students. All students then crafted an initial draft of an interview protocol. Bodily randomly assigned each student to engage with generative AI, as described above, or to leverage peers to receive feedback on their protocols. Then, during the same class meeting, students practiced their interviewing skills by role playing an interview using their revised protocol either with a peer or with the generative AI tool, depending on the study condition that they were in. All students could revise their protocol after receiving feedback and roleplaying, before conducting the actual interview. 

Data Sources:
  1. Rubric scores for students’ performance on both draft and final versions of an interview protocol
  2. Surveys of students’ self-efficacy regarding their development of an interview protocol and interviewing skills
  3. Student reflections on the feedback session, revision process, and interview

Emily DeJeu

Assistant Teaching Professor
Tepper School of Business

70-340 Business Communications (Spring 24)

Generative AI Tool(s) Used

ChatGPT, Copilot

Research Questions
  1. How, and to what extent, do students feel they do or do not benefit from learning about ethical and effective generative AI use cases for the kinds of professional communication tasks they are likely to face in their future jobs?
  2. To what extent does inviting students to use a generative AI tool in specific ways help or hinder their writing-related skill-building? 
  3. How are students using generative AI tools when it comes to planning, drafting, and revising course assignments?
Teaching Intervention with Generative AI

DeJeu introduced four mini-lectures in one of the two sections of her course that showcase generative AI use cases in professional communication contexts. Specifically, these lessons provided instruction and modeling on using ChatGPT or Copilot to revise a document, create model documents, identify "lexical bundles" (i.e., phrases and sentences that are used often in particular genres of writing), and generate ideas. Mini-lectures each occurred in tandem with one of the four major writing assignments that included reflection questions and documentation regarding the writing process. 

Study design

DeJeu taught two sections, one of which received in-class scaffolding for ethical and effective generative AI tool use while the other did not. In both sections, students had the option to use generative AI tools and document their use on each of the four standard writing assignments. DeJeu will compare students’ perceptions, documented practices with generative AI tools, and writing deliverables across the two sections. In addition, students in a third section taught by a colleague were asked not to use generative AI tools on one assignment, which will allow her to compare students’ writing with and without the reported use of generative AI across sections.

Data Sources
  1. Pre/post surveys of students’ experiences with and perceptions of generative AI tool use
  2. Students’ reflections on and documentation of generative AI tool use
  3. Transcripts of students' interactions with generative AI tools
  4. Students’ writing assignments, scored with a rubric for various writing skills

Sébastien Dubreil

Teaching Professor 
Modern Languages
Dietrich College of Humanities and Social Sciences

82-304 French and Francophone Sociolinguistics Oral Language and Storytelling (Spring 24)

Generative AI Tool(s) Used

ChatGPT

Research Questions
  1. Do various use cases of generative AI yield different linguistic accuracy and complexity in French students’ writing?
  2. What are French students’ perceptions of using generative AI to complete writing assignments?
Teaching Intervention with Generative AI

Dubreil introduced generative AI (ChatGPT) as a support for students writing in a foreign language. In one condition, he instructed students to create their initial draft while using AI as a language assistant to suggest vocabulary or specific language features (e.g., a rhyme, an alliteration), check the accuracy of sentences, or edit. In the other condition, he instructed students to use AI as a creative assistant, prompting the AI to create their initial draft. They adjusted their prompting to create three different drafts that the students then refined into a single, final deliverable.

Study design

All students in the course prepared a writing assignment in both AI conditions. Dubreil randomly assigned the order in which students experienced conditions, which counterbalanced the type of AI usage across different writing genres.

Data Sources
  1. Students' two writing assignments scored with a rubric for linguistic accuracy in vocabulary, grammar, and syntax as well as genre conventions, emotional impact, and originality
  2. Students’ reflections on their writing process and the quality of their written assignments
  3. Pre/post surveys about students’ familiarity, competency, and confidence working with genAI

Catherine Evans

Graduate Student Instructor
English
Dietrich College of Humanities and Social Sciences

76-106 Writing about Literature, Art and Culture (Fall 2024)

Research Question(s): 

To what extent does introducing critical AI studies during the writing process change:

  1. how first-year writing students conceptualize the relationship of LLMs to cultural production?
  2. the development of students' attitudes toward art, culture, and the humanities?
  3. the way students think about authenticity, voice, diversity, and creativity?
Teaching Intervention with Generative AI:

Evans implemented a week-long unit on critical AI studies. Students engaged not only with emerging work in critical AI studies but also with the Carnegie Mellon University Archives and Special Collections to understand Carnegie Mellon students’ historical role in producing campus culture. Students then used generative AI to produce images based on text from the archival collections and compared those images to the actual historical images. Students also incorporated theory from cultural studies, with the option to focus on critical AI studies, in their final paper and had the option to use generative AI in the brainstorming stages of their writing process.

Study Design:

Evans taught two sections of the course, one in Mini 1 and one in Mini 2. In the first section, she implemented the teaching intervention as described above. Students in the second section instead spent extra time covering course content not related to AI. Evans will compare the same data sources across the two sections.

Data Sources:
  1. Surveys of students’ attitudes and perceptions toward writing, generative AI, cultural production, and the humanities.
  2. Student reflection assignments on engaging with the archives, generative AI, and their understandings of Carnegie Mellon Tartan identity
  3. Students’ rubric scores on the course’s writing assignments.

Rebekah Fitzsimmons

Assistant Teaching Professor
Heinz College of Information Systems and Public Policy

90-717 Writing for Public Policy (Fall 2024)

Research Question(s): 

  1. How does student self-efficacy regarding writing and generative AI use change after instruction on and practice with genAI during class discussions? 
  2. How does student use of generative AI while completing formal writing assignments impact students’ writing performance?
Teaching Intervention with Generative AI:

Fitzsimmons provided the same classroom instruction on and practice activities with generative AI to students in all three sections of her course. Classroom discussions specifically targeted prompt engineering, the evaluation of generative AI outputs, and the ethics of using generative AI in various, realistic professional contexts. 

Study Design:

Fitzsimmons permitted students to use generative AI to support the completion of graded assignments in one of the three course sections. The other two sections were not allowed to use AI on formal writing assignments. On the same assignments, Fitzsimmons will compare students’ performance and changes in self-efficacy across the sections in which generative AI use was and was not permitted.

Data Sources:
  1. Surveys of students’ self-efficacy regarding writing and generative AI use at the beginning, middle, and end of the course.
  2. Rubric scores for students’ performance on the course’s second major writing assignment (due after completion of generative AI instruction).

Gabriela Gongora-Svartzman

Assistant Teaching Professor
Heinz College of Information Systems and Public Policy

94-819 Data Analytics with Tableau (Fall 2024)

Research Question(s): 
  1. To what extent does the early introduction and scaffolded use of generative AI tools for learning Tableau impact student knowledge of concepts?
  2. How does the use of generative AI tools for data analysis impact students’ confidence in their data literacy?
Teaching Intervention with Generative AI:

Gongora-Svartzman introduced students to Explain Data, a generative AI tool designed to assist in the data exploration phase of the data analysis process. Gongora-Svartzman demonstrated how this Tableau generative AI tool can provide an efficient way to view the landscape of potential data analysis pathways in a given project. In one mini, students were introduced to Explain Data early in the course, and in another mini, students were exposed to the tool later in the course. 

Study Design: 

Gongora-Svartzman had three sections of the course, one in Mini 4 of Spring 2024 and two in Mini 1 of Fall 2024. Explain Data was briefly introduced late in the course in the Spring 2024 section, which will serve as the control condition, whereas it was introduced earlier and in more scaffolded form in the Fall 2024 sections. Gongora-Svartzman will compare data sources across sections.

Data Sources:
  1. Student deliverables (in-class exercises, final group projects, and case study challenges) from course assignments requiring data analysis
  2. Surveys of students’ self-efficacy regarding their data literacy


Larry Heimann

Teaching Professor
Information Systems
Heinz College of Information Systems and Public Policy

Houda Bouamor

Associate Teaching Professor
Information Systems
Carnegie Mellon University in Qatar

Shihong Huang

Teaching Professor
Information Systems
Heinz College of Information Systems and Public Policy

 67-272 Application Design and Development (Spring 24)

Generative AI Tool(s) Used

ChatGPT, Copilot

Research Question

Does generative AI tool use affect equity in student outcomes, giving less-experienced students a better chance to be successful in technical courses?

Teaching Intervention with Generative AI

Heimann, Bouamor, and Huang introduced generative AI tools (Copilot, ChatGPT) in their course and encouraged students to leverage these tools for solving computer lab assignments and the main course project during the semester. Instructors demonstrated effective generative AI tool use during class to help scaffold students’ learning. They required students to document the frequency of generative AI usage while completing course assignments.

Study design

To gauge students’ level of programming and programming-related experience, Heimann, Bouamor, and Huang surveyed their Spring 2024 students as well as students from the past two iterations of the course (Spring 2022 and Spring 2023) when there was no formal policy for generative AI use and such tools were not as omnipresent in the academic landscape. Then, they encouraged Spring 2024 students to use generative AI tools while completing course assignments. To determine the extent to which generative AI tool use impacts less experienced students, these instructors will compare student work between the past and present cohorts using prior level of experience as a hypothesized moderator.
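
A rough illustration of that planned comparison, as a sketch only and not the instructors’ actual analysis: a moderated regression can test whether prior programming experience changes the effect of cohort (pre- vs. post-genAI availability) on assignment scores. The dataframe, column names, and values below are hypothetical placeholders.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: rubric scores for two cohorts (before and after the genAI policy)
# and a survey-based measure of prior programming experience (in years).
df = pd.DataFrame({
    "score":      [78, 85, 91, 72, 88, 95, 81, 84, 92, 79, 86, 94],
    "cohort":     ["2023"] * 6 + ["2024"] * 6,
    "experience": [1, 3, 5, 1, 3, 5, 1, 3, 5, 1, 3, 5],
})

# The cohort-by-experience interaction term carries the moderation hypothesis:
# a negative interaction would suggest genAI availability helps less-experienced students more.
model = smf.ols("score ~ C(cohort) * experience", data=df).fit()
print(model.summary())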

Data Sources
  1. Surveys of students’ background programming experience
  2. Students’ documentation of generative AI tool use frequency during coursework
  3. Students’ deliverables from coding exercises, exams, and a course project

Alan Thomas Kohler

Senior Lecturer
English
Dietrich College of Humanities and Social Sciences

76-270 Writing for the Professions (Spring 24, Fall 24)

Generative AI Tool(s) Used

Copilot

Research Question 

To what extent can the use of generative AI tools improve the student peer review process for students in an intermediate level undergraduate writing course?

Teaching Intervention with Generative AI

Kohler’s students completed a peer-review feedback process for each of five writing projects in his course. For two of the projects in Spring 2024, students completed this process using a generative AI tool (Copilot), rather than another student, as the source of feedback. Students submitted their writing along with the rubric and an instructor-engineered prompt to receive feedback from the AI tool on their writing sample. Additionally, students submitted an instructor-engineered prompt to the AI tool to generate a writing sample. Then, students practiced providing feedback on that sample. Kohler introduced Copilot during class and provided all pre-engineered AI prompts. For each project, students documented the feedback they received and gave, as well as their perceptions of the usefulness of each experience for learning.

Study design

Students used traditional peer review for the first three projects (control) in Spring 2024, but substituted generative AI for peer reviewers (treatment) during the fourth and fifth projects. In Fall 2024, this design will be counterbalanced, with the first three projects using the generative AI-based peer review (treatment) and the fourth and fifth projects using traditional peer review (control). Kohler will compare student perceptions of the feedback process and the quality of writing deliverables across conditions.

Data Sources
  1. Pre/post surveys of students’ perceptions of the usefulness of the peer review process
  2. Students’ reflections on the feedback process for each project
  3. Transcripts of feedback given and received for traditional and AI-based peer review
  4. Students’ deliverables for all writing projects, scored with rubrics measuring writing skills

Derek Leben

Associate Teaching Professor
Tepper School of Business

70-332 Business, Society, and Ethics (Fall 2024)

Research Question(s): 
  1. What is the impact of debating with generative AI (as compared to debating with a peer) on students’ development of analytical reasoning skills? 
  2. How does student self-efficacy regarding their analytical reasoning and debate skills change throughout the course and does this vary across experimental conditions? 
Teaching Intervention with Generative AI:

Leben provided suggestions and tips for how to engage a generative AI tool in a debate about arguments written by students. Next, Leben had students prompt the generative AI tool to: a) give them objections to their argumentative paper from both the same and different normative frameworks, and b) engage in debate with the students about their arguments.

Study Design:

Leben taught three course sections. Leben provided the same classroom instruction on leveraging normative frameworks to design policies across all sections of his course. Students in all sections drafted an argumentative paper for a policy supported with a normative framework. Then, in each section, Leben randomly assigned students to one of two study conditions. In one condition, Leben implemented the generative AI intervention described above. In the second condition, Leben had students work with peers to elicit objections and engage in debate. The cycle of drafting a paper, receiving feedback, and revising was repeated for two paper assignments, with students remaining in the same treatment conditions. Leben will compare data sources across the two conditions (generative AI vs. peer feedback and debate).

Data Sources:
  1. Rubric scores for students’ performance on both draft and final versions of two major writing assignments (i.e., argumentative papers).
  2. Surveys of students’ self-efficacy regarding their analytical reasoning and debate skills.

Marti Louw

Director, Learning Media Design Center
Human-Computer Interaction Institute
School of Computer Science

05-291/05-691 Learning Media Design (Fall 2024)

Research Question(s): 
  1. To what extent does the quality of student-designed interview protocols differ when feedback on first drafts comes from an expert as compared to generative AI? 
  2. To what extent does students’ self-efficacy of their interviewing skills change across the semester when receiving generative AI feedback?
  3. What are students’ attitudes about receiving generative AI feedback on an interview protocol?
Teaching Intervention with Generative AI:

Louw’s students first used generative AI as a coaching tool to receive feedback on written drafts of their interview protocols. Next, students simulated the interview by roleplaying with the generative AI using spoken inputs to the tool. Both experiences provided opportunities for students to reflect and iterate on their work. For both the written and spoken generative AI assignments, Louw provided specific instruction on prompt engineering strategies during class sessions. 

Study Design:

Louw required every student to use generative AI for feedback on interview protocol drafts and for simulated interview practice in Fall 2024. On the same assignments, she will compare student performance in Fall 2024 to that of students from Fall 2023, when students did not use generative AI. Student surveys regarding self-efficacy and other attitudes will be deployed at the beginning and end of Fall 2024.

Data Sources:
  1. Rubric scores for each student’s draft and revised interview protocols across the two conditions 
  2. Students’ written reflections following interview simulations
  3. Pre/post surveys of students’ self-efficacy with and attitudes towards generative AI in design research

Steven Moore

Graduate Student Instructor
Human Computer Interaction Institute
School of Computer Science

05-840 Tools for Online Learning (Spring 2024)

Generative AI Tool(s) Used

ChatGPT

Research Questions

How does student use of generative AI while creating micro lessons affect: 
  1. the quality of their lesson designs?
  2. their learning of fundamental teaching and learning principles? 
  3. their self-efficacy as educators and generative AI users?
Teaching Intervention with Generative AI

Moore’s students engage with four interactive, online learning modules on fundamental teaching and learning principles. Each module contains two micro lesson design activities, in which he challenges students to apply the learning principles to their practice. For particular micro lesson activities, he instructs students to use generative AI (ChatGPT) as a collaborator in their design process.

Study design

Moore implemented two conditions, generative AI used (treatment) or not (control) by students, in the single section of his course. For the first micro lesson assignment in each of the four online learning modules, he randomly assigned half of the students to the treatment condition and half to the control condition. For the second micro lesson assignment in each module, students switched to the other condition. Data sources will be compared for each student between conditions, across modules and micro lesson assignments.

Data Sources
  1. Students’ deliverables from micro lesson assignments, scored via a rubric with criteria for cohesion, correct application of learning principles, diversity of names and pronouns referenced, and diversity of lesson topics
  2. Concept- and application-based multiple choice questions embedded in online learning modules and aligned with course learning objectives 
  3. Transcripts of students' interactions with generative AI
  4. Pre/post surveys of students’ self-efficacy regarding skills using generative AI and course learning objectives

Carrington Motley

Assistant Professor
Tepper School of Business

70-415 Introduction to Entrepreneurship (Spring 2024)

Generative AI Tool(s) Used

Copilot

Research Questions
  1. Do students generate more or fewer distinct ideas when using AI while brainstorming?
  2. How does the nature of students’ ideas change when using AI while brainstorming?
  3. How does brainstorming with AI impact student self-efficacy regarding AI use and course learning objectives? 
Teaching Intervention with Generative AI

Motley implemented scaffolded brainstorming sessions during class to support individual students’ ideation for entrepreneurship projects. Students then leveraged a generative AI tool (Copilot) to support both the generation and evaluation of ideas for new business ventures. Individual students created “pitch decks” (slides) to present their ideas to their peers and recruit collaborators to design a business implementation plan. Teams of students then collaboratively designed implementation plans for the chosen entrepreneurship projects.

Study design

All students in two concurrent course sections received training on brainstorming techniques. Motley randomly assigned each section to a condition: students either used (treatment) or did not use (control) generative AI tools in brainstorming exercises during class. The treatment section received training on generative AI use focused on prompt engineering. Data sources will be compared between course sections, statistically controlling for variation in students between conditions.

Data Sources
  1. Artifacts of brainstorming sessions, including Google Docs (control and treatment) and transcripts from generative AI use (treatment)
  2. Students’ pitch decks (slides from student presentations), scored using a rubric with criteria for uniqueness of the problem being solved, the solution, and the customer segment targeted 
  3. Pre/post surveys of students’ self-efficacy regarding skills using generative AI tools and course learning objectives 

Fethiye Ozis

Assistant Teaching Professor
Civil and Environmental Engineering
College of Engineering

12-333 Experimental and Sensing Systems Design and Computation for Infrastructure Systems (Spring 2024)

Generative AI Tool(s) Used

PerplexityAI

Research Questions
  1. Does utilization of AI tools impact students’ skills for data processing, cleaning, and visualization of large data sets?
  2. What are the attitudes, perceptions, and experiences of students regarding AI-powered tools for data processing and visualization?
Teaching Intervention with Generative AI

Ozis introduced generative AI (PerplexityAI) as a possible support tool during students’ multi-week, big-data group project. Students had the option to use AI during their data cleaning and visualization tasks. They were not restricted in how they used the tool but were given some possible uses, such as a coach to provide advice, a tool to detect outliers in the dataset, or a way to generate Python code for data visualization plots.
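
As a sketch of the kind of code students might request from the tool for that last use case, and not code from the course itself, the snippet below cleans a dataset, flags simple outliers, and plots the result; the file name and column names are hypothetical placeholders.

import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical sensor dataset with a timestamp column and one measurement column.
df = pd.read_csv("sensor_data.csv", parse_dates=["timestamp"])

# Drop rows with missing readings, then flag values more than 3 standard deviations from the mean.
df = df.dropna(subset=["reading"])
mean, std = df["reading"].mean(), df["reading"].std()
df["is_outlier"] = (df["reading"] - mean).abs() > 3 * std

# Plot the cleaned series and mark flagged outliers in red.
ax = df.plot(x="timestamp", y="reading", figsize=(10, 4), legend=False)
outliers = df[df["is_outlier"]]
ax.scatter(outliers["timestamp"], outliers["reading"], color="red")
ax.set_ylabel("reading")
plt.tight_layout()
plt.show()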

Study design

Students could choose to opt into using AI during their data project, creating a self-selected group of AI users to compare to a group of non-AI users within the course. Additional comparisons will be made to a previous iteration of the course that did not use AI.

Data Sources
  1. Students’ data visualizations, cleaned datasets, and documentation of process scored with a rubric for ability to clean, analyze, and visualize large data sets
  2. Students’ reflections on how effective, challenging, and rewarding their data cleaning process was and whether or not they used AI in their process

Omid Saadati

Adjunct Professor
Integrated Innovation Institute

49-750 Integrated Thinking for Innovation (Fall 2024)

Research Question(s): 
  1. How does using generative AI affect students’ industry knowledge as communicated through a team Miro board and a Q&A session with the instructor?
  2. How does the timing of generative AI use impact the quality of future assignments? 
  3. How many students choose to use genAI on future assignments when the use is optional?
Teaching Intervention with Generative AI:

Outside of class time, teams of students leveraged generative AI as a “subject matter expert” to gain insight into their assignment subject, such as analyzing or mapping their assigned industry. For instance, students might use the LLM to help them identify various parts of the commerce value chain (e.g., retail, payments, e-commerce, m-commerce) for their assigned industry. Saadati provided prompting tips as well as prompt templates for sample questions.

Study Design:

Saadati randomly assigned each team of students to use an LLM (i.e., Microsoft Copilot) on either the second or third course assignment. Consequently, students served as their own controls by completing one of two comparable assignments without generative AI, counterbalancing the order of conditions across student teams. Saadati will grade both assignments with the same rubric. Additionally, on at least one subsequent assignment, Saadati will offer all students the choice of using generative AI or not, and will track which students report using generative AI.

Data Sources:
  1. Rubric scores from two team-based Miro Board assignments
  2. Students’ written reflections following the final course assignment, including whether or not they used generative AI and why

Raja Sooriamurthi

Teaching Professor
Information Systems
Heinz College of Information Systems and Public Policy

Xiaoying Tu

Assistant Teaching Professor
Information Systems
Heinz College of Information Systems and Public Policy

67-262 Database Design and Development (Fall 2024)

Research Question(s): 

  1. Does the source of feedback (instructor vs generative AI) affect student attitudes?
  2. Does the source of feedback affect performance on a future assignment?
  3. How does student self-efficacy for working in SQL change across the course? How does the source of feedback (instructor vs generative AI) impact students’ self-efficacy?
Teaching Intervention with Generative AI:

Students uploaded their coding assignment deliverables to a customized, generative AI chatbot called the Intelligent Assessor (designed by Sooriamurthi and Tu). Instructors trained the chatbot on the assignment rubric, their paper detailing the three-step heuristic process of formulating any SQL query, SQL style guidelines, and documentation of mistakes made by previous students. The trained chatbot asked each student questions about specific answers the student submitted as part of their assignment, probing them to describe their thinking and decision process. For each student, the chatbot created unique follow-up questions encouraging the student to essentially “think out loud”.
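
A minimal sketch of the general pattern, not the actual Intelligent Assessor implementation (which is not described here): an LLM is given the rubric, style guidelines, and a student’s submission as context and asked to pose one “think out loud” follow-up question at a time. The model name, file names, and prompt wording are assumptions for illustration only.

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

rubric = open("sql_assignment_rubric.txt").read()      # hypothetical file
style_guide = open("sql_style_guidelines.txt").read()  # hypothetical file
student_sql = open("student_submission.sql").read()    # hypothetical file

system_prompt = (
    "You are an assessor for a database design course. Using the rubric and "
    "style guidelines below, ask the student one probing follow-up question "
    "at a time about a specific query in their submission, encouraging them "
    "to explain their thinking and decision process. Do not give answers.\n\n"
    f"RUBRIC:\n{rubric}\n\nSTYLE GUIDELINES:\n{style_guide}"
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": f"My submitted queries:\n{student_sql}"},
    ],
)
print(response.choices[0].message.content)  # first follow-up question for the student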

Study Design:

After completing an SQL assignment, Sooriamurthi and Tu randomly assigned students to debrief and receive feedback on their assignment from either an instructor or the customized generative AI chatbot, the Intelligent Assessor. Students then completed another SQL assignment and debriefed under the other condition (a counterbalanced order). Following each debriefing session, students reflected on the experience and the value of the feedback received.

Data Sources:
  1. Rubric scores from students’ SQL assignment deliverables
  2. Surveys of students’ self-efficacy for working with SQL, administered at the beginning of the course and after both debrief sessions
  3. Surveys of students’ attitudes about the value of feedback received and comfort with the feedback interaction, administered after each debrief session

Jordan Usdan

Adjunct Faculty
Heinz College of Information Systems and Public Policy

94-816 Generative AI: Applications, Implications, and Governance (Spring 2024)

Generative AI Tool(s) Used

ChatGPT

Research Questions
  1. How effective is AI at enhancing student thinking, research, and writing skills?
  2. How does AI impact research and writing efficiency?
  3. Are there different impacts of AI across writers with different English language proficiencies or other characteristics? 
Teaching Intervention with Generative AI

Usdan provided students with an overview of prompt engineering and ways to use generative AI (e.g., ChatGPT) as a tool for summarization, information synthesis, research, explanation, idea generation, and more. Classroom demonstrations showed potential student applications of generative AI as a prose assistant, editor, thought partner, and critic. Across three policy memo assignments, he instructed students to not use, to use extensively, and finally to choose whether or not to use a genAI tool while preparing their written submission.

Study design

All students in the course prepared a writing assignment in each of three conditions: first without genAI, then with the use of genAI, and finally with their choice of using genAI or not. While the order of these conditions did not vary, Usdan counterbalanced (randomized) the specific policy scenario assigned across these conditions.

Data Sources
  1. Students’ three writing assignments, scored with rubrics measuring thinking (e.g., logical policy, argument to support policy) and writing skills (e.g., quality of summary, concision, organization)
  2. Students’ self-report of writing efficiency, i.e., students tracked the time they spent actively engaged in completing each writing assignment
  3. Pre/post survey about students’ writing confidence and perspectives on AI as an educational tool
  4. Post survey about students’ perceived improvement in their writing and attribution of improvement to repeated writing practice versus use of AI


Liz Walker

Graduate Student Instructor
English
Dietrich College of Humanities and Social Sciences

Bonnie Youngs

Teaching Professor
Department of Languages, Cultures, and Applied Linguistics
Dietrich College of Humanities and Social Sciences

66-139 DC Grand Challenge Seminar: Reducing Conflict Around Identity and Positionality (Spring 2024)

Generative AI Tool(s) Used

PerplexityAI

Research Questions
  1. To what extent does student use of generative AI impact their:
    1. ability to critically read and analyze academic papers?
    2. self-efficacy as critical readers and generative AI users?
Teaching Intervention with Generative AI

Walker and Youngs provided classroom training on how to read academic papers as well as how to engineer generative AI prompts and evaluate generative AI output. Students then used generative AI (Perplexity AI) as a reading support tool prior to class discussions by uploading assigned readings and individually engaging with the AI as a dialogue partner, asking questions to clarify paper content and potential interpretations of the text.

Study design

Walker and Youngs required every student to use the AI tool for each assigned reading in Spring 2024. They will compare student responses to reading questions with responses from a previous semester, when students did not use generative AI, restricting the comparison to questions used in both semesters. Student self-efficacy will be measured at the beginning and end of Spring 2024.

Data Sources
  1. Students’ responses to assigned reading questions, scored with rubrics for academic reading skills (e.g., reading comprehension, metacognition, critical analysis of text)
  2. Transcripts of students' interactions with generative AI, analyzed for ability to ask productive questions
  3. Pre/post surveys of students’ self-efficacy regarding skills using generative AI and course learning objectives

Rafal Wlodarski

Assistant Teaching Professor
Electrical and Computer Engineering
College of Engineering

18-656 Functional Programming in Practice (Fall 2024)

Research Question(s): 
  1. To what extent does generative AI, serving as a personal tutor, enhance equity in the learning outcomes for students on engineering projects?
  2. To what extent does using generative AI as a feedback generator improve the quality of student design deliverables in terms of completeness and correctness?
  3. To what extent do students' attitudes change across the semester (specifically, self-efficacy and trust in generative AI)?
Teaching Intervention with Generative AI:

Wlodarski introduced students to a customized generative AI tool that serves as a thought partner, providing feedback and explanations of concepts related to domain knowledge (cryptocurrency trading) and the Domain-Driven Design framework. The tool is intended to help scale the course and guide students on the team project component.

Study Design:

Wlodarski taught the course during the Spring 2024 and Fall 2024 semesters, to approximately 40 students each. During Spring 2024, students did not have access to the customized chatbot. In Fall 2024, students had optional access to a customized LLM, specifically trained for the course’s context, during the learning process and while completing the course project. Wlodarski will compare the data sources below across semesters. 

Data Sources:
  1. Student performance on graded assessments. 
  2. Pre-post survey on self-efficacy and trust in generative AI

Jungwan Yoon

Senior Lecturer
Dietrich College of Humanities and Social Sciences

76-100 Reading and Writing in an Academic Context (Fall 2024)

Research Question(s): 
  1. What is the impact of the use of generative AI for text analysis on students' knowledge of genre-specific discourse and linguistic features? 
  2. What is the impact of the use of generative AI for text analysis on students’ feelings toward writing? 
  3. What is the impact of the use of generative AI for text analysis on students' self-efficacy for producing genre-appropriate text?
Teaching Intervention with Generative AI:

Yoon provided students with instructions on how to use generative AI (ChatGPT) as a pedagogical tool to help support their identification and understanding of linguistic features, focusing on genre awareness. Students practiced using this tool for model text analysis during class for certain assignments, prompting the tool to analyze model text looking for specific rhetorical features. Students were then asked to critically evaluate the output to help reinforce their understanding.

Study Design:

For certain units in the course, students practiced their text analysis skills using generative AI, and for others, generative AI was not used. Later in the semester, students were asked to independently complete transfer tasks that corresponded to the learning units for which they either did or did not use generative AI as a practice tool. Yoon will compare students’ performance on these tasks, reported feelings toward writing, and changes in self-efficacy to assess the impact of the generative AI tools.

Data Sources:
  1. Rubric scores of transfer tasks and student-generated concept maps, compared before and after each unit.
  2. Students’ self-reported feelings toward writing.
  3. Surveys of students’ self-efficacy for producing genre-appropriate text at the beginning and end of the course.

Bo Zhan

Lecturer
Dietrich College of Humanities and Social Sciences

82-171 Elementary Japanese I (Fall 2024)

Research Question(s): 
  1. To what extent does genAI impact novice students’ speaking performance? In other words, do students who practiced with genAI improve at a different rate as compared to their classmates who practiced with their peers?
  2. To what extent does genAI impact students’ confidence and their motivation in speaking Japanese?
Teaching Intervention with Generative AI:

During a class session, Zhan supplied students with general prompts and guidelines for using the generative AI tool (ChatGPT) as a speaking partner. Prompts included asking ChatGPT for feedback on grammar, response structure, vocabulary, and pronunciation while it simulated a speaking partner. Additionally, during that class session, students used ChatGPT to practice speaking Japanese. Students were then encouraged to practice speaking with the generative AI tool outside of class and completed a homework assignment in which they recorded themselves speaking with the tool.

Study Design:

Zhan provided the same classroom instruction on Elementary Japanese I (e.g. vocabulary, grammar, etc.) across both sections of the course. Students in both sections of the course completed a baseline speaking assessment with Zhan to act as a pre-measure of student performance. Students then practiced speaking Japanese with either the generative AI tool, as described above, or a peer during a class session. Next, students in both groups completed the same assignment in which they recorded their conversation either with the generative AI tool or a partner, depending on which condition they were in. 

Data Sources:
  1. Rubric scores from recordings of student speaking assignments.
  2. Surveys of students’ self-efficacy regarding their Japanese speaking skills.

Peter Zhang

Assistant Professor
College of Engineering

19-867 Decision Analytics for Business and Policy (Fall 2024)

Research Question(s): 
  1. How does the way in which a generative AI tool is integrated into the course impact students’ ability to engage in critical thinking over technical troubleshooting, particularly in formulating decision questions and translating stakeholder requirements into analytical models?
  2. How does the way in which a generative AI tool is integrated impact equity by enabling students with varying levels of technical preparation to participate equally in the critical thinking process?
Teaching Intervention with Generative AI:

Zhang delivered scaffolded instruction on how to leverage the generative AI tool to solve decision analytic scenarios. During this instruction, students learned about prompt engineering, fine-tuning existing AI models, and how to use a group of generative AI agents to perform a specific data analysis task. 
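
As a sketch of what “a group of generative AI agents” can look like in code, offered as an assumption about the general pattern rather than Zhang’s actual setup, the snippet below chains an analyst agent that drafts a modeling plan with a reviewer agent that critiques it; the model name, prompts, and task description are hypothetical.

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def agent(role_instructions: str, task: str) -> str:
    """Send one task to an 'agent' defined only by its system instructions."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": role_instructions},
            {"role": "user", "content": task},
        ],
    )
    return response.choices[0].message.content

task = "Recommend a staffing plan for a delivery fleet given uncertain daily demand."

# Agent 1 drafts an analysis plan; Agent 2 reviews it against decision-analytic criteria.
plan = agent("You are a decision analyst. Propose a step-by-step modeling plan.", task)
review = agent("You are a skeptical reviewer. Identify weaknesses in the plan below.", plan)

print(plan)
print(review)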

Study Design:

Zhang taught two course sections. Zhang provided the same classroom instruction to all students on modeling frameworks and technical topics, such as contextual optimization and optimization under uncertainty. Zhang assigned two study conditions to the sections. In one condition, Zhang implemented the generative AI intervention described above. In the second section, Zhang provided information on general generative AI use without specific guidance on how to apply the tools to decision analysis problems. All students then completed a data optimization course group project. Zhang will compare data sources across course sections in which students were provided with a general introduction to generative AI vs. a more applied and structured approach to using generative AI to solve analysis problems.

Data Sources:
  1. Rubric scores for students’ performance on a data optimization course project, including an assessment of critical thinking.
  2. Students’ self-reported time on task.
  3. Surveys of students’ prior experience with the technical concepts and self-efficacy regarding their data analytic skills and their ability to use generative AI to complete analytic tasks.
  4. Rubric scores for students’ performance on an in-class quiz.

Additional GAITAR Project Descriptions Coming Soon…

Daragh Byrne

Architecture
College of Fine Arts

Laura DeLuca

Graduate Student
Dietrich College of Humanities and Social Sciences

Christopher McComb

Associate Professor
College of Engineering

Nimer Murshid

Assistant Teaching Professor
Mellon College of Science
Carnegie Mellon University in Qatar