AI Trust & Usage Research

Exploring young adults' trust in AI for financial decision-making.

Problem

Young adults lack trust and confidence in using AI for high-stakes financial decisions.

Solution

Qualitative research with 15 young adults uncovering the emotional landscape of AI-assisted decision-making.

Role

Researcher

Team

Alana S., Lauren P., Neil N., FCAT (Fidelity Center for Applied Technology)

Tools

Miro, Tetra/Respondent, Canva

Timeline

October 2025 – December 2025

Overview

As part of our DesignTK 520/521 courses (Design Innovation Studio), we partnered with the Fidelity Center for Applied Technology to explore AI trust and usage. AI adoption is evolving rapidly, yet trust levels vary significantly. We identified a specific gap: fewer than 50% of low-income, young (18–29), or Hispanic adults currently invest in stocks. We hypothesized that the primary barriers to budgeting and investing are knowledge and confidence rather than a lack of funds.

Initially, we aimed to understand how AI could bridge literacy gaps. However, we reframed our goal to explore the broader emotional landscape—how young adults feel about relying on AI for high-stakes decisions across school, work, and finances.

Desirability, feasibility, and viability assumptions chart

Initial assumption about investing barriers

Research Methodology

Systems map showing research connections between financial literacy, AI trust, and user barriers

Screening Questions

  • Age 18-29
  • U.S. Based
  • Familiarity with AI tools
  • General Attitude Toward AI
  • Comfort with AI influence

Participants

  • 15 Total
  • Some interest in managing finances and using AI/fintech

Methodology

  • Tetra/Respondent Online Interviews
  • In-Person Interviews
  • Conducted 11/17-11/28 (Thanksgiving Break)

We conducted 15 interviews, a mix of in-person sessions and online sessions via Tetra/Respondent, with U.S.-based adults aged 18–29 who already had some familiarity with AI tools. We used Miro to ideate and cluster research findings into actionable insights for our financial partner.

Key Findings

  • While 93.3% of participants use ChatGPT, there is significant drop-off for other AI tools
  • Users view AI as a tool for learning and informing a decision, but insist on making final investment choices themselves
  • Trust is conditional—for high-stakes financial tasks, users require 100% accuracy and the ability to verify sources

AI tool usage among participants:

  • ChatGPT: 93.3%
  • Gemini: 40%
  • Perplexity: 30.8%
  • CoPilot: 13.3%
  • Claude: 13.3%
  • Other: 33.3%

Insights

AI can advise, not decide

  • “Make the investment decision yourself, but use everything that ChatGPT can offer you to learn and inform that decision” - Tetra Interviewee #2
  • “I can always ask [AI for advice], but I just would not rely on it completely” - Interviewee “T”

Strong desire for accuracy

  • “I generally trust [AI], but if it’s a higher stakes thing then I definitely want to check the sources” - Tetra Interviewee #3
  • “AI hasn’t always been 100% accurate and it’s always worth looking at another source...I trust it but it’s not going to be the only thing I look at” - Tetra Interviewee #5

Outcome

We proposed a three-tiered approach:

Phase 1

Human-First Branding

Position AI as a support system for human advisors, emphasizing empathy and reliability.

Phase 2

Backend Deployment

Use AI as a productivity booster for internal teams rather than as a direct-to-customer interface.

Phase 3

Clarify & Learn Feature

An AI feature that explains financial terms with full transparency, including citations and verified-by-expert badges.

Next Steps

This project is ongoing! We are continuing our work with FCAT to refine our recommendations and validate our findings through additional research. More information will be added as the project evolves.
