Generative Research | Deep Dive

Understanding User Trust in Learning Analytics: A Mixed-Methods Study on Privacy, Perception, and Adoption

Overview

As EdTech systems increasingly rely on learning analytics, they must handle sensitive student data responsibly. While data privacy laws exist, they often lack the specificity needed to guide product design or user experience decisions. This project set out to understand how students and instructors perceive data privacy in educational technologies—and how those perceptions influence trust, adoption, and engagement. We also aimed to generate actionable insights for designing privacy-aware, user-aligned learning analytics tools.

To explore these questions, I conducted interviews with students and instructors to uncover expectations around transparency, consent, and control. I then developed a scenario-based survey to test those emerging themes at scale with a broader population of instructors.

The research revealed key mismatches between user mental models and system behavior, highlighting areas where current tools fall short of user expectations. These findings informed a set of design recommendations for creating trustworthy, transparent, and user-centered data-driven systems in education—and beyond.

The Problem

EdTech tools increasingly rely on sensitive student data to drive personalized learning experiences, yet users—both students and instructors—often have limited understanding of what data is collected, how it is used, and whether they can control that use. These uncertainties can lead to mismatched expectations around privacy and consent, resulting in user distrust, reduced engagement, or even outright rejection of data-driven systems.

Our research set out to explore a key question:
What do users expect from data-driven educational technologies—and when do they feel those expectations are violated?


Answering this was essential for informing the design of transparent, trustworthy, and privacy-aware learning analytics tools.

My Role & Team

I led the end-to-end research process, including research design, interview protocol development, thematic analysis, and the creation of a scenario-based survey. My work spanned both the qualitative and quantitative phases, ensuring continuity in insight generation and methodological rigor.

I collaborated closely with academic researchers, instructors, and educational stakeholders to ensure the research addressed real user concerns and institutional priorities. I also synthesized insights into product-relevant frameworks, including a “privacy calculus” model that helped explain user trust dynamics in data-driven systems.

The findings were shared across both academic and applied settings, demonstrating their value for design strategy, product development, and policy conversations.

Qualitative Phase: Interview Study

Goals

  • Explore how users conceptualize data privacy in the context of educational technology (learning analytics).

  • Uncover latent concerns around consent, transparency, and ethical boundaries that may not surface in traditional usability testing.

Methods

  • Conducted 1:1 semi-structured interviews with students and instructors across disciplines.

  • Used scenario-based prompts to spark reflection on data use, control, and comfort levels.

  • Applied thematic analysis to identify user expectations, mental models, and triggers of trust or discomfort.

Key Findings

  • Both students and instructors expected more transparency and control than current systems typically offer.

  • Expectations around granularity and consent differed—students sought fine-grained control, while instructors prioritized contextual understanding.

  • Trust was not tied to data collection alone, but to how clearly the benefit was communicated—highlighting the importance of purpose-driven, transparent design.

Quantitative Phase: Survey Study

Goals

  • Validate key themes uncovered during the qualitative phase.

  • Investigate how data sensitivity and perceived system usefulness interact to shape user acceptance of learning analytics tools.

Methods

  • Designed and deployed a scenario-based survey to over 150 instructors across disciplines.

  • Scenarios varied in data type (e.g., grades vs. social behavior), intrusiveness, and intended use (e.g., learning support vs. monitoring).

  • Applied statistical analysis (correlation, regression) to identify patterns and quantify tradeoffs in privacy perception; a minimal sketch of this step follows the list.
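
To make the analysis step concrete, here is a minimal sketch of the kind of correlation and regression pass described above. It assumes Python with pandas and statsmodels; the file name and the column names (acceptance, perceived_benefit, data_sensitivity) are illustrative stand-ins, not the actual study variables.

    # Illustrative analysis sketch; file and column names are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    # One row per instructor-scenario response, rated on Likert scales.
    df = pd.read_csv("survey_responses.csv")

    # Correlations give a first look at how the constructs move together.
    print(df[["acceptance", "perceived_benefit", "data_sensitivity"]].corr())

    # OLS regression quantifies the tradeoff: how much perceived benefit
    # offsets data sensitivity in predicting scenario acceptance.
    model = smf.ols(
        "acceptance ~ perceived_benefit + data_sensitivity",
        data=df,
    ).fit()
    print(model.summary())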

Key Findings

  • A clear privacy calculus emerged: instructors were more willing to accept data use when the benefit was tangible, relevant, and well-communicated.

  • Instructors showed heightened concern for systems using social, behavioral, or emotional data—especially when lacking contextual transparency.

  • There was strong support for opt-in data sharing, layered consent, and transparent system feedback—reinforcing the importance of user agency in data-driven tools.

Synthesis & Implications

By integrating findings from the qualitative and quantitative phases, we found that user trust in data-driven systems is highly contextual. Trust depends not only on what data is collected, but also on how, why, and when its use is communicated. Designers and product teams must balance perceived benefit, data sensitivity, and communication strategy to earn adoption and sustain user engagement.

One key takeaway was the concept of “consent as UX”—reframing consent not as a one-time checkbox, but as a continuous, understandable, and user-centered experience throughout the product journey.
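
As a hedged illustration of that idea, the sketch below models consent as purpose-scoped, revocable, and checked at the moment of use, with a plain-language explanation surfaced just in time. All names (Purpose, ConsentLedger, learning_support) are hypothetical, not drawn from any actual system.

    # Illustrative "consent as UX" sketch; all names are hypothetical.
    from dataclasses import dataclass, field

    @dataclass(frozen=True)
    class Purpose:
        key: str          # e.g. "learning_support"
        explanation: str  # plain-language reason shown to the user

    @dataclass
    class ConsentLedger:
        granted: set = field(default_factory=set)

        def grant(self, purpose: Purpose) -> None:
            self.granted.add(purpose.key)

        def revoke(self, purpose: Purpose) -> None:
            self.granted.discard(purpose.key)  # consent stays reversible

        def may_use(self, purpose: Purpose) -> bool:
            # Checked every time data is used, not once at signup.
            return purpose.key in self.granted

    LEARNING_SUPPORT = Purpose(
        key="learning_support",
        explanation="We use your quiz results to suggest practice material.",
    )

    ledger = ConsentLedger()
    if not ledger.may_use(LEARNING_SUPPORT):
        # Just-in-time prompt: explain the purpose at the point of use,
        # then record the decision (stand-in for an explicit opt-in flow).
        print(LEARNING_SUPPORT.explanation)
        ledger.grant(LEARNING_SUPPORT)

The point is the shape, not the code: consent lives next to the action it authorizes, can be withdrawn at any time, and always travels with its explanation.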

We distilled our findings into a set of actionable design implications:

  • Progressive disclosure of data use to reduce cognitive overload and improve transparency

  • Just-in-time explanations contextualized to user actions and system feedback

  • Role-specific feedback controls, recognizing differing expectations between user types (e.g., instructors vs. students)

These recommendations are broadly applicable to any product that relies on sensitive user data—extending beyond EdTech into domains like healthcare, HR, and enterprise software.

What I Learned

This project reinforced the value of mixed-methods research in uncovering not just what users think, but why—and how those beliefs scale across different contexts and populations. I learned that designing for trust in data-driven systems requires more than compliance; it demands ethical framing, plain-language communication, and a deep understanding of user expectations and emotional responses.

I also gained insight into how stakeholders often hold conflicting mental models of risk, shaped by their roles, responsibilities, and values. Bridging those gaps takes more than data—it requires research that connects insight to action, helping teams align around shared goals while honoring user concerns.

Outcomes & Impact

  • Shared research findings with learning platform designers, institutional stakeholders, and policy influencers, driving conversations around responsible data use in education.

  • Work was recognized at leading learning and HCI conferences, contributing to broader discussions on ethics, trust, and user-centered design in data-driven systems.

  • Directly informed local EdTech procurement and analytics policy discussions, highlighting the role of UX research in shaping real-world decisions.

  • Developed a scalable, mixed-methods research framework with relevance far beyond EdTech—applicable to any domain that handles sensitive personal data, including workplace platforms, healthcare tools, and compliance-focused systems.