
Why Everyone Is Talking About ChatGPT – And How CMU Students Are Contributing to the Discussion Around Ethics in Tech

By Jennifer Monahan

As technology continues to advance at a rapid pace, it is increasingly important for the public to understand how it impacts our lives and society. That's where the concept of “public interest technology” comes in - using technology to serve the greater good and address societal challenges.

The remarkable thing about the otherwise unremarkable paragraph above is that it was written by OpenAI’s new ChatGPT tool. The prompt was to “write a pithy lead for an article about public interest technology.” ChatGPT’s response is not only a perfectly passable lead paragraph, but also an example of why ChatGPT makes some people profoundly nervous.

Recent media stories have offered doomsday predictions about how ChatGPT signals the end of high school English class, the end of the college essay, and by extension, the end of human creativity as we know it. Maybe that’s a bit dramatic, but you get the point.

ChatGPT is the latest example of the thorny ethical questions raised by emerging technology. That space where technology and public interest policy intersect is where Carnegie Mellon University shines. In his Policy Innovation Lab: Public Interest Technology at Heinz College of Information Systems and Public Policy, Professor Chris Goranson challenges students to address real-world issues.

Most recently, a team of three master’s students in Goranson’s class considered the problem of ethics in technology. They uncovered a striking insight.

Edvin Handoko (MISM ‘23), Oliver Marguleas (MSPPM-DC ‘24), and Curt Williams (MSPPM-DA ‘23) initially set out to develop a community that would inform a technologist’s code of ethics – a set of guidelines that would help technologists create society’s digital future, without compromising privacy and human rights. As the team began their research, which included interviewing practitioners in a variety of technology roles, they realized their project objective required some rethinking.

The real need, the students discovered, was to determine how to embed ethical thinking into the technology development process – essentially, their project morphed into figuring out how to change the culture around technology development.

“What they uncovered is that the problem runs deeper,” explained Goranson. “You have to figure out ultimately how to motivate participants to see ethics as part of their essential training in technology.”

The interviews with practicing technologists in a variety of roles were compelling.

“What we heard from experts is that current incentives – like profit incentives – force people to focus on other things,” explained Williams. While technologists are not averse to ethical considerations, technology’s effect on people is not necessarily front-of-mind when they’re developing technology. With this insight, the team’s focus shifted to how best to change culture to incentivize ethics.

The Role of Tech in Everyday Life

Technology is like salt, explained Handoko. It’s not the main course, but it flavors and improves many of the foods we eat. Because it touches everything from education and healthcare to entertainment and transportation, technology will inevitably impact our lives.

“If we don’t design it correctly, implement it correctly and with an eye toward the human users who will ultimately experience the technology, we risk unintended consequences,” Handoko said. “It’s not just about creating a new code of ethics, but how we can impart these ethics into the design process, how we can create a new way of thinking among technologists, with more human-centered solutions.”

The students created a playbook detailing how to push the industry toward this new way of thinking. Their framework involves three components: train, reinforce, and compel. Culture change starts with teaching technologists consistently to consider the ethical implications of their actions – from the beginning of the design process. Those ethics need to be reinforced by government and industry, and to be tangible in every step of innovation. Finally, some level of government oversight – similar to other existing consumer protection laws – is necessary to ensure accountability.

The team shared their recommendations with New America, a think-and-action tank focused on public interest tech.

Alberto Rodríguez Álvarez, senior program manager for public interest technology at New America, said the students’ work is significant because the project attempts to understand why a code of ethics is important and how it would impact stakeholders.

“The students approached these questions on a bigger scale,” Rodríguez Álvarez said. “More importantly, they proposed a way of thinking about how to embed an ethical framework in the day to day work of technologists, which is an extremely valuable addition to the conversation we have been having about the need or impact of a code of ethics.”

“We cannot think that the future of public policy or the future of social enterprise doesn’t include technology. We know that it will,” Rodríguez Álvarez explained. Merging public policy and technology requires considering the morality of how technology affects our lives. “It’s culture change,” he said. “The students nailed it.”

Public Interest Technology at CMU

As lead instructor of CMU’s Policy Innovation Lab and with a range of experience in both city and federal government, Goranson is well acquainted with the intricacies of public interest technology and policy. He framed the students’ impact within that context.

“They spent a lot of time digging deeper and figured out the real problem that they’re trying to solve, which is that you can create a code of ethics, but that doesn’t mean that suddenly all technology is ethical,” Goranson explained. “There’s a lot more to be done to make sure that people are practicing ethical technology. So how do you do that?”

The team created a framework with steps to accomplish that lofty task. They focused on training and accountability, identifying ways to integrate ethics into the existing tech landscape. They hope that their playbook provides the initial steps to begin to move the needle.

“You have to figure out ultimately how to motivate technologists to see ethics as part of their essential training,” Goranson said. “It’s a completely different way of thinking about technology.” That lens is starting to take hold in universities across the country – thanks in part to New America’s Public Interest Technology – University Network (PIT-UN) and to faculty like Goranson who are committed to teaching the next generation of technologists to consider the common good.

As a charter member of PIT-UN, CMU has a unique role to play at the intersection of technology and public policy.

“One thing that I love about Heinz College is that Heinz brings folks who are interested in public policy, but also programming, together, and that's really unusual,” Goranson said. “It forces technologists to be able to speak policy. And it forces public policy practitioners to have to speak technology.” The unconventional blend of disciplines offers a bigger perspective to the rest of the world in terms of how policy and technology translate into other industries, sectors, and governments – and how those bridges can be built elsewhere.

“The value that CMU provides is just helping others really understand how these worlds are connected,” Goranson explained.

As technology continues to dazzle – and sometimes alarm – us with AI-generated poetry and artwork, self-driving cars, facial recognition software, and predictive modeling, it’s nice to know that the tech and policy leaders of tomorrow are learning to innovate ethically today…and guiding others to do the same.


Author’s Note: In case you’re interested in ChatGPT’s version of a closing paragraph, here’s what the AI tool produced in response to the request: “Excellent closing paragraph for an article about a technology code of ethics.”

"In conclusion, a technology code of ethics is a crucial guide for professionals working in the field. It helps to ensure that technology is used responsibly, ethically, and in the best interests of society. By following a clear set of principles, we can work to create a brighter, more equitable future for all. Whether you are a developer, designer, engineer, or simply a tech enthusiast, it is important to remember that with great power comes great responsibility. Let's use technology to make the world a better place, not just for ourselves, but for generations to come."

Though the final result reads like the conclusion to a high school essay, it’s a decent effort. I’ve read – and written – worse.