No longer just an ethical consideration, data responsibility is a business imperative

By Joon Soo Lim, Syracuse University; Chunsik Lee and Junga Kim, University of North Florida; and Donghee Shin, Texas Tech University

Title Card: Earning User Trust in Generative AI

The generative AI (GenAI) boom, driven by innovators like OpenAI, Perplexity, and Anthropic, is transforming how we communicate, think, write, search, shop, work, and create. Even the answers from Amazon’s Alexa or Apple’s Siri rely on GenAI. But it’s not just tech giants leveraging this technology; companies in other industries are using AI to provide more personalized and targeted experiences.

Take Zara, for example. Through its “Mirror Stores” system, Zara empowers store managers with real-time sales data and consumer insights, moving away from the traditional centralized decision-making model. Store managers can compare performance benchmarks with similar stores, analyze local consumer preferences, and dynamically adapt displays and inventory.

Behind the scenes, all of this runs on one crucial ingredient: our data. Every click, scroll, or purchase feeds these systems, making our data the fuel that keeps GenAI running.

The growing concern about data use

As GenAI becomes more embedded in our daily lives, many of us are starting to ask bigger questions: How is our data being used? Is it safe? Is it fair? A high-profile incident involving Google’s Gemini chatbot illustrates these concerns. In February 2024, Gemini mistakenly generated images of people of color in Nazi-era German uniforms, a historical rarity, after a user prompted it to create “1943 German solidiers” (intentionally misspelled).

This sparked backlash over the biases embedded in AI models, which stem from the datasets they’re trained on.

Such incidents illustrate the urgent need for corporate data responsibility (CDR) — a new chapter in corporate social responsibility (CSR). As Kai-Fu Lee and Chen Qiufan aptly noted in their book AI 2041: Ten Visions for Our Future, “Responsible AI could be a part of the future ESG.”

We believe CDR isn’t just an ethical consideration; it’s becoming a business imperative. Companies that prioritize transparency, fairness, and security in their data practices are not only addressing ethical concerns but also earning the trust and loyalty of their consumers.

Building trust through responsible communication

Our research set out to explore how companies using GenAI can bridge the trust gap with users by addressing concerns around fairness, privacy, and security. GenAI systems may seem like magic, producing instant answers, recommendations, or creative content, but this magic is fueled by user data. When companies fail to communicate how they handle and protect that data, it leaves users asking: Can we really trust them?

Our study shows that public relations plays a vital role in responsible CDR communication. It’s not enough for companies to quietly implement strong data practices behind the scenes. To build trust, companies should:

  • Actively engage stakeholders in the process.

  • Communicate clearly and transparently about their data practices.

  • Involve users in improving those practices (e.g., gathering feedback or explaining safeguards).

  • Demonstrate accountability.

Why engagement matters

Trust isn’t just about having strong data practices; it’s about showing users how you follow through on them. A stakeholder engagement approach makes invisible systems tangible.

By actively explaining how data is kept secure, how fairness is maintained, and how biases are addressed, companies can reassure users that their information is being handled responsibly. This engagement builds confidence, turning abstract promises into visible commitments.

For more information about this study, email Lim at jlim01@syr.edu. This project was supported by a 2023 Page/Johnson Legacy Scholar Grant from the Arthur W. Page Center.
