In many nations around the world, we have seen the growth of technology bring with it increased attacks on human rights and human dignity. Technologies like facial recognition and cell phone tracking are being used by many authoritarian governments to unjustly monitor and control their citizens.
Jason Thacker with the Ethics and Religious Liberty Commission has recently written an article entitled “What is digital authoritarianism?” in which he addresses the use of technology to suppress human rights. Thacker joins Traci DeVette Griggs on this week’s episode of the Family Policy Matters radio show and podcast to discuss this crucial topic.
“It’s not that authoritarianism is new,” says Thacker. “It is that these authoritarian governments are abusing technologies that are incredibly powerful to demean and to diminish the human rights and human dignity of their fellow citizens.”
While in the United States we do not live under an authoritarian government, there is still much debate over how certain technologies impact human rights and dignity in our country—especially in light of the recent election and censorship via social media. As Christians, we are called to “seek justice, love mercy, and walk humbly with God” (Micah 6:8). This mission certainly extends to the digital realm.
“As Christians, we need to make sure that we’re recognizing the value, dignity, and worth of every single human being, digitally or in person,” says Thacker. “To make sure that we’re upholding and proclaiming the truth, the truth of the gospel, that we can only be saved through a relationship with Jesus Christ. And that changes us. It changes the way that we interact, not only in person, but also digitally in this new kind of digital-first world—that we’re proclaiming Christ in everything we do.”
Tune in to Family Policy Matters this week to hear Jason Thacker explain how we can prevent digital authoritarianism and promote positive practices with our technology.
TRACI GRIGGS: Thanks for joining us this week for Family Policy Matters. If you’re like me, you love and hate technology! In the same way that it can make our lives easier, it can make us more isolated from others, and it can be, and increasingly is, used as a tool for great evil. We’re seeing this malicious intent played out especially in other countries, as the growing sophistication of technology brings with it increased attacks on human dignity, human rights, and freedoms. There’s a name for that: “digital authoritarianism,” or the use of technology to suppress human rights. Jason Thacker recently wrote an article entitled “What is digital authoritarianism?”, and he’s written a book called The Age of AI: Artificial Intelligence and the Future of Humanity. He’s here with us today to discuss this very important topic.
Jason Thacker is Creative Director at the Ethics and Religious Liberty Commission—the ERLC—of the Southern Baptist Convention, where he also serves as Chair of Research in Technology Ethics.
Jason Thacker, welcome to Family Policy Matters.
JASON THACKER: Thank you for having me, Traci.
TRACI GRIGGS: First of all, I think it’s important for us to establish that the Internet can look quite different in other countries, right?
JASON THACKER: Yeah, that’s exactly right. Often when we think of the Internet, we think of something that everyone has access to, that we have full access to information, that whatever we search for we’ll find, but that’s just simply not the truth. Certain countries throughout the world restrict it, especially China. China has the infamous “Great Firewall,” through which the government filters Internet access by only allowing what the Chinese Communist Party considers acceptable content for their people to see. So, if they search for things like “Tiananmen Square” or “democracy,” all of those types of things are filtered out of their Internet, and that’s controlled by the Communist Party. They don’t have access to certain sites. They don’t have access to certain types of information or even certain types of applications, because the Communist Party wants to have this tight, authoritarian control over their people, what they say, where they can go, what they can do. And this leads to really concerning issues surrounding human rights and freedoms, especially in places that listeners might be familiar with, like Xinjiang, where you have the Uighur Muslims in detainment camps, because China wants to have this heavy-handed control over their people and control every aspect of their lives.
TRACI GRIGGS: So of course, authoritarianism is not new, but why do we say that the way that it has been digitized is new?
JASON THACKER: Technology is a very powerful tool, and it can be used for good or for evil. In the hands of those seeking to maintain power and control over other people, over other image bearers, the truth is that these technologies can be incredibly powerful. Think of facial recognition technology: here in the United States we’re having a lot of debates over its proper use in policing, surveillance, and government. In China, the government uses facial recognition to round up dissidents, to round up those who don’t agree with the Communist Party, or those who are profiled because of their religious beliefs, like the Uighur Muslims, to be tracked and detained. And this is not just happening to those of the Muslim faith. This is also happening to the Christian Church in China, where the state tries to exert a high level of control and authority in order to maintain its power as a government. And so, it’s not that authoritarianism is new; it is that these authoritarian governments are abusing technologies that are incredibly powerful to demean and to diminish the human rights and human dignity of their fellow citizens.
TRACI GRIGGS: How pervasive is this around the world? Do you have other examples?
JASON THACKER: Yeah. I mean, outside of China, certain countries like Russia have a really tight control over their Internet infrastructure, what their people can say and what their people can do. You saw it even earlier this year in the country of Belarus, where President Alexander Lukashenko basically shut down the Internet for his entire country. He pulled the plug on it, which might sound like, “Wow, is that even possible?” Well, in the United States it’s incredibly difficult to actually shut down the Internet, but these countries with authoritarian-type regimes built their Internet systems in a way that the government can essentially just pull the plug and shut it all down. Lukashenko did this specifically to crack down on the unrest, because the election was not free. It was not fair; it was not a democracy. He rigged the election in order to maintain power in that country. You saw this in Iran last year as well. This increasing role of digital technologies in the hands of authoritarian governments is very concerning on the international front, and it’s something that not only Americans but really the wider world needs to be aware of: the way these tools are being used and abused to violate human rights.
TRACI GRIGGS: So short of shutting down the Internet, talk a little bit about the effect that this has on people when the government controls everything that they see and hear. Talk about what that daily life is like and how that influences what people think.
JASON THACKER: I mean, think of when you wake up in the morning and check your email or check social media. We have a free and kind of open Internet where we have dissenting opinions. You see that with a lot of the divisions that we even have in our nation today. But in other countries you don’t see as much of that. You see propaganda talking about the state, how well the state is doing, how perfect everything is and how everything’s going really well. They don’t have access to information or to dissenting views. You see it even when they leave their houses, where there might be facial recognition cameras throughout their cities tracking not only their faces, but where they go and what they do and what they look like, in order to maintain this kind of heavy hand of control, this iron fist over these people.
And so, you see that not only through facial recognition technology and access to information, but even in the ability to gather, specifically for faith services or faith gatherings. Even the Christian Church in China has had to go underground because of the types of surveillance technologies that have been used. But then you kind of zoom out, and you see this play out in elections. You see this play out in international affairs and world affairs, where these authoritarian regimes are bent on retaining power. And you see this specifically in China. In the United States we have a separation between private entities and government entities; there might be partnerships and things like that, but they’re very separate. In China, that’s simply not the case. The government has a heavy hand of control even over private businesses and the information they gather, being able not just to request that information, but to simply take it and use it for nefarious purposes.
This was a lot of the controversy surrounding the use of TikTok here in the United States, where the United States government was concerned and raising the alarm about the way that China was, in many ways, in cahoots with TikTok’s parent company, able to harvest that data and use it for nefarious purposes. And so, these authoritarian governments use these technologies in a very broad kind of way, but in many ways it’s just to control every single aspect of someone’s life. Because the ultimate goal is not human freedom and liberty; the ultimate goal is to retain power, authority, and tight control over fellow image bearers.
TRACI GRIGGS: Speaking of access to information, let’s talk about that in regard to the pandemic. Some people got very upset when social media platforms began censoring discussions that the platforms considered to be bad science or conspiracy theories. They were alarmed that these companies were setting themselves up as filters of truth for the entire nation. Is this something you feel like we need to be concerned about?
JASON THACKER: Yes. And in many ways, we have a lot of really healthy debate going on in America right now about the role of social media companies and these Internet platforms, and the role of the digital public square. And so, there are lots of questions that need to be answered. There is considerable debate about the role of these companies and the ways they enact certain policies for their platforms in a free society. They are private companies, and so they can, in many ways, control the types of things that happen on their platforms. But as they’ve grown in size and in their ability to connect many millions of people, there are really legitimate questions about what these policies should look like and what types of information should and shouldn’t be allowed. A lot of this centers on what’s called Section 230, which listeners may be aware of, a part of the 1996 Communications Decency Act that gives platforms certain types of immunity and shields them from liability for information that’s posted on their platforms.
But that was meant to encourage them to keep their platforms safe: to keep content like child abuse and pornography away from minors and to take those kinds of illegal acts off of their platforms. There are considerable debates that need to happen surrounding the role of social media in our society, and it’s definitely something we should be a part of in having those conversations. I wouldn’t say that this is digital authoritarianism per se, because the government isn’t mandating it, but we do have legitimate questions that need to be answered about the proper relationship between private companies, governments, and individuals.
TRACI GRIGGS: Right. So besides participating in some of these conversations on a public policy level, what can we do personally, do you think, to break or limit the effect of technology in our own lives?
JASON THACKER: Yeah, I think first and foremost it’s recognizing how we’re using technology and how technology is being used. One of the best ways to combat a lot of this is just through education, realizing the power and the ambiguity of technology in our lives. As we recognize those things, we can be thoughtful about how we approach it. So even in our own families or our own personal use of social media, maybe it’s slowing down a little bit. Instead of just sharing information we see because it confirms a position or a belief we already had, maybe we slow down and read the entire story before re-tweeting or sharing it, to make sure that we understand the full context of what’s being said and how it’s being said, so that we’re being people who stand up for truth.
Or maybe we should be the ones who are slowing down and not treating the person on social media as simply an avatar, but recognizing that they’re a flesh-and-blood human being, a fellow image bearer just like us, with dignity, value, and worth. In the digital age, it’s so easy to treat the person on the other side of an online interaction as simply an avatar, as simply a combatant, someone we can argue with and say whatever we want to, because it’s just digital, so it doesn’t really matter. But in reality, as Christians, not only are our relationships with our families and those in our communities important, but our relationships online are also important. And we need to make sure that we’re recognizing the value, dignity, and worth of every single human being, digitally or in person. To make sure that we’re upholding and proclaiming the truth, the truth of the gospel, that we can only be saved through a relationship with Jesus Christ. And that changes us. It changes the way that we interact, not only in person, but also digitally in this new kind of digital-first world, so that we’re proclaiming Christ in everything we do.
TRACI GRIGGS: Wow. Yep, great point. And of course, there are organizations such as yours, the ERLC, that do a great job of staying informed on this and keeping people informed on this. Tell us how people can connect with you.
JASON THACKER: For a lot of our technology work, the easiest way is to go to my website, jasonthacker.com. There we host a weekly podcast called WeeklyTech, and we also have a weekly newsletter where we try to keep people up-to-date on technology issues. Technology is such a wide-ranging issue that it can be kind of overwhelming for folks amidst all of the other pressures we have in our daily lives, so this is a resource that we at the ERLC wanted to put together: a newsletter and a podcast. But you can also go to ERLC.com to stay up-to-date on our videos, podcasts, and articles on a lot of these difficult questions surrounding technology.
TRACI GRIGGS: All right. So that was jasonthacker.com and ERLC.com.
Jason Thacker with the Ethics and Religious Liberty Commission, thank you so much for being with us today on Family Policy Matters.
JASON THACKER: Thank you so much for having me, Traci.
– END –