Man vs Machine

July 29, 2022

“A lot of people are being handed technology that they’ve never worked with before, and now their entire lives are depending on it,” said Shea Sullivan, associate director of the Institute for Digital Humanity. Alongside Covid-19 came a massive switch to meetings via Zoom and a growing reliance on social media as a primary interface with the world. Dependence on digital applications to interact with others brings with it an expanded digital footprint. Many people do not know that their data is being collected, where it goes, or what is done with it. Many do not know that an algorithm may decide what content they are shown online, and most of all, they do not know how that algorithm will influence their thoughts and opinions.

The Institute for Digital Humanity (IDH) is a student-run think tank based at Minneapolis’ North Central University (NCU). Its members advocate for civil rights and grapple with the ethical issues that come with technology’s advancement. They work to prevent the harm technology can bring to communities, and they seek to drive dialogue and bridge the cultural divides fostered by algorithms. Each member is drawn to the IDH by a passion for the issues it confronts and the work it does.

Classes at NCU with Dr. Aaron McKain, the IDH’s interim director, were an entry point for many now-members. McKain’s teaching style, characterized by Socratic discussion, encourages students to formulate and share their own perspectives on a variety of topics. For students in McKain’s media and communication courses, it is not uncommon to engage with topics like algorithmic discrimination, data privacy, and the effects of social media. Micah Headley, a junior at NCU and the IDH’s research project manager, says he “got really into it because it was the way that I thought.” For each of the IDH’s members, something clicks and pulls them in. For the IDH’s associate director, Shea Sullivan, the topic that clicked was algorithmic discrimination.

Algorithmic Discrimination

Algorithmic discrimination occurs when the use of an algorithm has uniquely negative consequences for certain demographics. As Sullivan points out, an algorithm can draw incorrect conclusions and filter out job applicants based on data calculations made behind the scenes. “The way these algorithms process information is not always correct and doesn’t always accurately reflect who we actually are.” As algorithms are adopted in more aspects of life, such as banking and the justice system, the effects of their inaccuracies grow. Algorithmic discrimination and inaccuracy are core issues that the IDH works to address.
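A minimal sketch can show how this kind of behind-the-scenes scoring filters applicants unevenly. Everything here is hypothetical and invented for illustration — the applicants, the zip-code proxy, and the scoring rule — and it does not depict any real hiring system.

```python
# Toy illustration: a screening rule trained on skewed historical data
# can "prefer" a proxy feature (here, zip code) and override experience.

applicants = [
    {"name": "A", "years_exp": 5, "zip": "55404"},
    {"name": "B", "years_exp": 5, "zip": "55347"},
    {"name": "C", "years_exp": 2, "zip": "55404"},
    {"name": "D", "years_exp": 7, "zip": "55347"},
]

# Hypothetical pattern learned from past hires: they clustered in
# certain zip codes, so those zips get a scoring bonus.
preferred_zips = {"55347"}

def screen(applicant):
    # Experience counts, but the zip-code bonus can decide the outcome.
    score = applicant["years_exp"] + (3 if applicant["zip"] in preferred_zips else 0)
    return score >= 6

# A and B have identical experience, yet only B clears the cutoff.
passed = [a["name"] for a in applicants if screen(a)]
print(passed)
```

The point is not the arithmetic but the structure: the applicant never sees the proxy feature or the cutoff, so the filtering Sullivan describes happens invisibly.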

One key way that the IDH addresses algorithmic discrimination is through a curriculum they created which is available for educators across the nation. Developed in partnership with the Anti-Defamation League (ADL), the curriculum is designed to introduce students to the impact of algorithms and encourage discussion on the topic. Students are pushed to consider how the implementation of algorithms affects civil rights and what can be done about them.

Another avenue to address this issue is the IDH’s collaboration with the team behind Coded Bias, a documentary broadcast on PBS and streaming on Netflix that centers on algorithmic bias and its impacts. The IDH has engaged in long-form discussions with the film’s director about the ways algorithms are implemented today and solutions to their unfair implementation. This rapport between the Coded Bias team and the IDH has not only raised awareness but has also had real-world effects.

As The Verge, a tech news site, reports, Minneapolis’ city council voted unanimously to ban the use of facial recognition software by city agencies, including the police. Facial recognition, as shown in Coded Bias, has trouble accurately identifying people of certain ages and ethnicities. For this reason, as well as privacy concerns, the Minnesota branch of the American Civil Liberties Union (ACLU MN), another IDH partner, pushed for the ban. In the view of the ACLU MN, the change has the potential to prevent false arrests caused by misidentification.



Algorithms can also affect lives through the creation of informational filter bubbles. A user watches videos they enjoy, and the site’s algorithm then presents that user with more and more content affirming what they already watch. This means, especially in the political sphere and on other polarizing topics, that two users can unwittingly be presented with distinct sets of facts, opinions, and ultimately realities.
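The feedback loop described above can be sketched as a toy simulation. This is purely illustrative and assumes nothing about any real platform’s recommender; the topic names and the weighting rule are invented for the example.

```python
import random

random.seed(0)  # fixed seed so the toy run is repeatable

# Toy filter-bubble model: the recommender weights each topic by how
# often the user has already watched it, so every watch makes similar
# content more likely to be recommended again (a rich-get-richer loop).

topics = ["politics_left", "politics_right", "cooking"]

def recommend(watch_counts):
    # Probability of showing a topic is proportional to (1 + past watches).
    weights = [1 + watch_counts[t] for t in topics]
    return random.choices(topics, weights=weights, k=1)[0]

def simulate(first_click, steps=200):
    counts = {t: 0 for t in topics}
    counts[first_click] += 1            # the user's one initial choice
    for _ in range(steps):
        counts[recommend(counts)] += 1  # user watches what is recommended
    return counts

# Two users who differ by a single first click can drift into
# very different content diets over time.
user_a = simulate("politics_left")
user_b = simulate("politics_right")
print(user_a)
print(user_b)
```

The mechanism, not the numbers, is the point: a tiny initial difference is amplified because each recommendation feeds back into the next one’s probabilities.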

Hannah Grubbs, an NCU senior and member of the IDH, spearheads the effort to #BursttheBubble. This campaign strives to raise awareness of the growing cultural divides driven by these filter bubbles. Grubbs explains that on common issues, “we could be in the same room and both think completely different things about something that happened, and both of our perspectives could be completely wrong.”

Recent points of cultural division go beyond mere differences of opinion to differences of fact. Divisions over what precipitated the events of January 6th, how those events ought to be classified, and what they mean are fed by differing views on the 2020 election, its integrity, and more. Perspectives on these issues are, in many cases, politically partisan, and once a site like Facebook labels a user a Democrat or a Republican, that user will be fed certain types of advertisements and content regardless of the label’s accuracy.

To combat this, the IDH is exploring a variety of avenues. Its curriculum includes a section devoted to filter bubbles. Through this curriculum, students learn how social media and other sites create filter bubbles, discuss the personal and societal impact, and consider how filter bubbles can let disinformation proliferate.

Concerning disinformation, the IDH seeks out accurate news and a variety of perspectives on the 2020 election, medical communication, public health, and more. The IDH’s journalism network writes on these topics and encourages dialogue to heal the rift that filter bubbles have formed. The IDH and the Northerner interviewed Sen. Scott Jensen, M.D., after he was deplatformed for his views on Covid-19, to discuss how the banning had taken place. Also of note are the election parties the IDH throws: in 2020, it live-streamed the election over Zoom to encourage dialogue grounded in shared facts.

Ultimately, as associate director Sullivan pointed out, while “we might not agree on the issue at hand, we can start to agree to different terms or understandings of the situation, whatever that looks like. We can have a good discourse without having it be ‘you’re on this side, I’m on this side.’” The IDH, in an effort to #BursttheBubble, collaborates across faith, cultural, and partisan lines to drive dialogue and bridge the gaps formed in the wake of technological advancement.

Looking to the Future

The IDH always has irons in the fire. Driven to foster dialogue and connection in a post-digital world, its members are continually looking for new opportunities to teach digital ethics. The research division continues to explore a variety of topics, contributing to future curricula, and remains on the lookout for instances of algorithmic bias and its impact on communities.

An upcoming art gallery put together by the IDH will center on gun violence, giving artists of all types a place to express their experiences with it. The gallery is a collaboration with the Twin Cities organization Guns Down Love Up, which focuses on healing and conflict resolution as ways to prevent gun violence. The IDH will continue to partner with a variety of organizations: the Coded Bias team is one ongoing collaborator, and other partnerships are constantly being sought out.

At its core, the Institute for Digital Humanity is a student-run think tank filled with passionate people who care about their work. Its advocacy centers on ethical and civil rights issues like algorithmic discrimination, while it also seeks to drive dialogue and educate people on how technology affects its users.

“You’re not going to get rid of technology,” Headley says. As technology continues to advance and algorithms grow more complex, so too do the issues they raise. But as each new advancement in the digital world both helps and hinders mankind, the IDH will be there, working to preserve and promote human connection amidst the ones and zeros.

The Institute for Digital Humanity has been working since its founding in 2019 to ensure a free and fair internet for all users. Bipartisan and student-run, it is driving discussions about digital ethics. Find out more about what it has accomplished on its website.

The Verge’s article

The IDH discussion with Coded Bias

The IDH’s interview with Sen. Scott Jensen, M.D.
