
The Trust Practice: What Building Credibility Requires

Published on April 16, 2026 · Institutions

The Trust Question | Part 02 of 02

This is the second post in The Trust Question. The first mapped how institutions are approaching AI and traced how debates that look like technology questions are often trust questions underneath. This post asks what trust actually requires, as a practice, and why the answer depends on who you ask.

Research by Yolanda Wiggins, Ph.D., sociologist, former SJSU faculty, and 2025 ASA Public Engagement and Policy Fellow.

People talk about trust in education as if it’s one thing. Build it. Restore it. Protect it. That framing is understandable, but the research suggests it’s incomplete.

After hundreds of qualitative interviews with educators and administrators across K–12 and higher education, a different pattern emerged. Trust in education is often misread. What educators mean by trust depends heavily on who they are and what they’re responsible for. When institutions or technologies ignore that distinction, trust doesn’t just weaken; it erodes. It fractures along the seams where the accountability structures of different stakeholders diverge.

Understanding this is among the most underexamined dimensions of AI governance in education. And it changes what a good partnership actually requires.

In K–12, trust is about stewardship

In K–12 settings, trust is tightly bound to protection. Administrators and educators discussed student safety, parental expectations, and duty of care. When they evaluated new systems, the underlying question was simpler, and harder, than any product evaluation rubric:

Will this keep students safe, and will it protect the institution when things go wrong?

Trust in this context is collective, institutional, and cautious by necessity. When a system signals clarity, guardrails, and shared responsibility, it earns trust. When it introduces ambiguity around student data, oversight, or accountability, trust erodes quickly, regardless of intent or design quality.

In higher education, trust is about autonomy and credibility

Higher education tells a different story. Trust here is deeply personal and professional. Faculty and administrators talked about academic integrity, authorship, intellectual ownership, and professional judgment. Concern extended beyond whether a system was safe to whether it respected expertise.

The question became: Does this tool support my role as a scholar and educator, or does it undermine it?

Trust in this context is tied to autonomy and to the legitimacy of learning itself. The same system that feels reassuring in a K–12 environment can feel threatening in higher education because the risks those educators carry are different. Same tool. Different stakes.

For leaders, this creates a specific kind of challenge: the relevant question is who needs to trust a system, and under what conditions.

Why this matters for how AI lands on campus

Many AI frameworks emphasize transparency, explainability, and user control. These are important foundations. What our research makes clear is that principles alone don’t create trust. Trust forms when systems align with the real responsibilities educators are navigating.

When tools don’t reflect those realities, even well-designed features can land poorly. Hesitation shows up. Governance gets more restrictive. Adoption stalls. Trust is being evaluated through a lens the system wasn’t designed to see.

The same AI behavior can build trust in one educational context and erode it in another. That changes how decisions land, how partnerships form, and how long they hold.

Context changes how trust lands.
See where you stand with the AI Policy Compass.

What educators are actually asking for

What educators asked for, across every research conversation, was clarity. Not reassurance. Clarity.

They want to know: What is the system actually doing? Who is accountable when something goes wrong? How does this affect my professional judgment, my students, and my authorship? Do I still get to decide?

Trust grows when the answers to those questions are clear and reflect the actual conditions of the role. When they don’t, trust breaks down at exactly the moment it’s needed most.

The ask was for a particular kind of partnership: partners who understand that every institutional decision about AI carries meaning. Governance signals what an institution values. Messaging signals who it trusts. Even silence signals something.

Leaders are trying to lead responsibly in public while working things out in private. What they need alongside them is someone who can hold that complexity without flattening it.

What building trust actually requires

The institutions navigating this moment well are the ones willing to sit with the complexity long enough to understand what they are actually deciding.

Trust is built through that process, over time. It shows up in consistency, in how institutions respond when something doesn’t go as planned, and in whether the people inside them feel heard.

For platforms working across K–12 and higher education, this has a direct implication: trust cannot be designed once and shipped. It has to be context-aware, role-sensitive, and honest about risk and responsibility. Design choices, governance models, and messaging that resonate in one educational setting may create friction in another. Treating trust as universal often means missing the very things that matter most to the people using the system.

The future of education will be shaped by whether the tools operating within it respect the people accountable for their use. That is a design requirement. And it’s the orientation we bring to every partnership we’re part of.

Trust takes practice. So does good governance.
The AI Policy Compass is where to start.
