The Power of Local Knowledge: Why Product Development Should Start with Listening to Your Community

The algorithms that power artificial intelligence (AI) tools are built on a foundation of human judgment calls, but humans sometimes make mistakes. How can organizations confidently take advantage of the benefits associated with modern technology while avoiding negative impacts on surrounding communities?

“Human experts and tech experts need to work together,” says Desmond Upton Patton, the University of Pennsylvania’s Brian and Randi Schwartz University Professor (the highest professorship at Penn) and Penn Integrates Knowledge University Professor, who holds joint appointments in the School of Social Policy & Practice, the Annenberg School for Communication, and the Department of Psychiatry in the Perelman School of Medicine.

“Existing data science techniques cannot accurately understand key cultural nuances in language amongst predominantly communities of color. Our methodologies, which center and privilege culture, context and inclusion in machine learning and computer vision analysis, create non-biased and culturally nuanced algorithms to give tech companies a holistic perspective on various business and social issues. The companies that adopt these proactive measures are then able to ensure they are not unintentionally propagating bias.”

At the SAFElab, where Patton is the founding director, social workers and local residents join Patton and his team to add context to social media messages, helping programmers build algorithms that interpret those messages correctly. Community-based partnerships between social workers and technology developers reframe innovation to incorporate a fuller spectrum of humanity, creating a more useful, equitable and joyful environment. By nurturing these relationships, organizations see a return on their investments of time and money as their solutions are adopted because they better reflect customer needs. Tapping a diverse group and incorporating its expertise into the data used by technical systems yields crucial knowledge and insights, ensuring the resulting output will be used and embraced.

Desmond Upton Patton - Pioneering Social Scientist and Social Worker Whose Frameworks for Designing Ethical AI Ensure Well-Being for Diverse Communities and Organizational Cultures

A pioneer in fusing social work, communications and data science, and the most cited and recognized scholar studying how groups constructed online can influence behavior offline, Patton helps organizations create processes that connect employees with customers, enabling their products to affect people more broadly. Through keynote presentations and interactive workshops, and as an advisor to the AI companies Kai.ai and Lifebrand, Patton helps organizations develop a better approach to diversity and inclusion, one built on fairer practices that address the challenge of prejudice rather than contribute to it.

How to Expand Your Product’s Audience

Historically, breakthrough technologies like AI or augmented and virtual reality have been exclusively wielded by data scientists or software engineers, but Patton says it’s time for that to change. He encourages computer and data scientists to move beyond a reflection-based mindset toward reflexivity to drive true inclusion. Pointing out that many developers create products based on non-representative market research that excludes viewpoints from people of color, people with disabilities, LGBTQIA+ people and other populations, he recommends teams adopt reflexive thinking strategies that ask critical questions about the context of data. Patton offers practical recommendations, such as adding steps like naming, active listening and processing to developer training and then reaching out directly to underrepresented groups to incorporate their points of view, ensuring their voices make an impact.

“Being able to talk to people like anthropologists, political scientists, community members and computer scientists matters for getting outside of restricting narratives,” stresses Patton, a former Fellow at the Harvard Kennedy School’s Carr Center for Human Rights Policy. “You cannot, and should not, be an ethical engineer if you have not gone through a process in which you have to deal with your impact on the things you’re developing. If we can make that a requirement, then I think that we will slowly get to a space where people can at least be active in these conversations, willing to be checked, and to listen.”

Organizational Culture Tools for Responsible Innovation

Many companies make great efforts toward resolving cultural problems only to be left confused when their hard work doesn’t net results. According to Patton, this is often because they aren’t solving the right problem. By applying Patton’s qualitative analysis approach, which he’s already brought to companies including TikTok, Spotify and Microsoft, organizations can identify the different emotions people are experiencing and the events that have triggered them. This provides a contextual understanding of people’s experiences, allowing the organization to create listening processes for understanding its employees, which can be turned into educational modules for a lasting and scalable effect on culture.

As AI tools continue to advance, Patton’s methods for responsible innovation, the kind that helps rather than harms communities, are becoming increasingly important. As the world grows in diversity, he helps leaders design products with diverse audiences in mind, understand how to use AI in ethical ways, and evaluate existing algorithms for biases and potential risks.

“Social work allows us to have a framework for how we can ask questions to begin processes for building ethical technical systems,” Patton says. “We need hyper-inclusive involvement of all community members — disrupting who gets to be at the table, who’s being educated and how they’re being educated, if we’re actually going to fight bias.”


When an organization incorporates voices from local communities in its development process, the results are crucial insights that ensure its products will be used and embraced. Stern Strategy Group connects you with renowned thought leaders whose insights, strategies and management frameworks help organizations fuel growth and disruptive innovation to better compete in a constantly changing world. Let us arrange for these esteemed experts to advise your organization via virtual and in-person consulting sessions, workshops and keynotes.
