Introduction
Welcome to Congress in Conversation, a special series presented by the Big Thinking Podcast in partnership with The Conversation Canada where we convene researchers presenting at Congress 2024 to share their research and experiences within the context of shared responsibility to our society, systems, and planet.
For this first episode, our host Nehal El-Hadi, journalist, editor, and producer at The Conversation Canada is joined by Christopher Dietzel, a postdoctoral fellow at Concordia University.
About the guest
Christopher Dietzel is a postdoctoral fellow whose work explores the intersections of gender, sexuality, health, safety, and technology. Dr. Dietzel works with the iMPACTS Project, the Digital Intimacy, Gender, and Sexualities (DIGS) Lab, and the Sexual Health and Gender (SHaG) Lab, and he is a co-investigator on Digitally Informed Youth (DIY) Digital Safety. Dr. Dietzel's recent projects focus on LGBTQ+ people’s experiences with dating apps and social media, and he investigates the barriers, harms, and violence that people face when using these digital platforms.
Dr. Christopher Dietzel's research at Congress:
Dr. Dietzel’s research paper is titled “Beyond the swipe: Interrogating dating app approaches for sustaining user safety”
Synopsis
Dating apps like Tinder, Bumble, and Grindr facilitate dates, hook-ups, and friendship, but people can experience a range of emotional, physical, sexual, financial, social, and cultural harms online and in person. This study considers to what extent apps provide sufficient policies, features, and supports that address the range of harms that diverse users can experience. To do this, we analyzed the content moderation policies, terms and conditions, safety features, and other in-app mechanisms from 30 dating apps popular in Canada. Findings shed light on policies, features, and other aspects of the app experience that enhance and/or hinder diverse users’ safety and reveal gaps in safety measures. The findings will be used to create an interactive Digital Dating Safety Map, a tool that people can use to compare apps’ safety mechanisms and risks, providing individuals with greater literacy and agency in making technology-related dating decisions.
[00:00:07] Nehal El-Hadi: Welcome to Congress in Conversation, a special series presented by the Big Thinking Podcast and The Conversation Canada, where we convene researchers presenting at Congress 2024 to share their research and experiences within the context of our shared responsibility to our society, systems, and planet.
[00:00:26] My name is Nehal El-Hadi, journalist, editor, and producer at The Conversation Canada, and I will be your host for this special feature of Congress in Conversation. Today I am joined by Christopher Dietzel, a postdoctoral fellow at Concordia University whose recent projects focus on people’s experiences with dating apps.
[00:00:56] How did, how did we get here? How did you get here?
[00:01:00] Christopher Dietzel: The story actually begins when I was doing my PhD. I was working as a research assistant on a grant that was looking at sexual violence. And I was having a casual conversation with a few colleagues - a few other research assistants at the time - about their experiences with dating apps, actually.
[00:01:19] And so one of the, one of the research assistants in particular was saying that she was going on a date that evening, but she was really frustrated because even though she was really interested in this guy, he had sent her some unsolicited dick pics.
[00:01:32] Although that particular experience might not seem out of the norm for many people, um, particularly since I was having this conversation with some colleagues on a grant around sexual violence, I became very interested in how that kind of mundane everyday experience of receiving unsolicited dick pics might map onto the experiences of queer people.
[00:01:51] So, uh, when thinking instead about, like, a heterosexual man and a heterosexual woman, how might queer men, for example, gay men, experience the sending and receipt of unsolicited dick pics? Like, for example, quite simply, would they characterize that as sexual violence? And so, that question actually launched my PhD, which is what I ended up studying.
[00:02:11] I looked at gay and queer men's experiences of sexual violence related to their use of dating apps. And then that, of course, grew into a larger field of study where I now look at LGBTQ+ people's experiences of safety and health related to their use of technology, which includes dating apps, but also social media and other platforms. So I'm really, really interested in how these kind of everyday experiences of people's use of technology can have an impact on their understandings and experiences of safety.
[00:02:40] Nehal El-Hadi: So what did your PhD research find?
[00:02:42] Christopher Dietzel: Yeah, so what I found with my PhD research and studies that I've done since then is that racialized queer people in particular can experience more - not necessarily more - but they can experience sexual violence and issues of consent differently compared to non-racialized or white people.
[00:02:57] And so for instance racialized folks can be exploited, they can be fetishized, they can be taken advantage of, and they can experience sexual racism as consent negotiations and processes are complicated, and that can, of course, have implications on their safety and health.
[00:03:15] Nehal El-Hadi: And so, how are the rules of consent considered differently in, like when you're using a dating app? And then by extension, what does safety mean offline and online?
[00:03:26] Christopher Dietzel: So safety is a really interesting piece because, once again, identity is a big factor in that. We tend to think that digital spaces are neutral and that's not the case. Not only because of the people that we interact with but the technology itself.
[00:03:39] What we don't always realize is that there are algorithms, there is artificial intelligence, there are mechanisms behind the screen that can have a really big impact on how people feel or are safe online. And of course, that then has implications for their offline experiences.
[00:03:55] So to give you an example: trans people who use dating apps might engage with identity verification, which takes a photo and checks it against other photos that you've uploaded or shared to the site. For somebody who is trans, whose gender might change, or who might have a less binary presentation or expression of their gender, that can complicate how they are verified in the app.
[00:04:19] Somebody who is not out, or somebody who is uncomfortable presenting publicly in one of these online spaces, might also not feel safe or comfortable verifying their identity. And so these systems, which could be monitored by AI or algorithms, can create a space where, if somebody is not identity verified, other users might perceive that as them not being genuine, or see it as a red flag, even though, for the person who's deciding not to engage with identity verification, opting out can actually be safer.
[00:04:56] Nehal El-Hadi: So how do you find apps are kind of dealing with this?
[00:05:02] Christopher Dietzel: Yeah, that's a great question, and it sets the stage for the research that we're doing and presenting at the Congress of the Humanities and Social Sciences.
[00:05:10] So I'm working on a team of researchers. I'm doing a postdoc at Concordia with Dr. Stefanie Duguay, and the two of us are part of a team of researchers on a SSHRC Insight Grant that's led by Dr. Diana Parry from the University of Waterloo.
[00:05:26] And our project examines dating app users' experiences of safety and how safety is constructed by dating apps. And so, going back to your question, we are in one of the first stages of this research project, but that is a question that Stefanie and I are trying to answer in our research at Concordia.
[00:05:45] We're specifically interested in how dating apps construct these notions of safety and what mechanisms are available for users to be and feel safe online and offline. So in order to answer that question, we are doing an analysis of policies: content moderation policies as well as privacy policies.
[00:06:08] We're looking at community guidelines, and we're looking at different safety tips and other safety-related information that is available through the apps. We're also doing a walkthrough of the apps themselves, from the minute that you download the app to when you delete it, including all of the interactions that you might have throughout that experience.
[00:06:30] We're interested in understanding the features that are available to the everyday user in order to help keep them safe. So we're looking at blocking and reporting, we're looking at to what extent you can include pronouns on your profile, we're looking at what photos you share, we're looking at how things are processed, we're looking at how you might message or stay anonymous, we're looking at all of these different features that are embedded within the apps and analyzing to see how they may be more or less safe for diverse users.
[00:07:04] Nehal El-Hadi: May I ask if you use the apps yourself?
[00:07:06] Christopher Dietzel: Sure, and yes, I do.
[00:07:09] Nehal El-Hadi: Okay, so how has doing this research changed the way that you use the apps?
[00:07:15] Christopher Dietzel: So first and foremost, it's definitely had an impact on how I interact with other people, not only in terms of my online interactions, but offline, especially when I introduce myself as a researcher who studies safety and consent and sexual violence and sexual racism and all these things. Some people are really intrigued by that and other people are actually turned off.
[00:07:36] So, some folks interpret that as me analyzing everything that they do and that they say. And like they perceive that as me imposing a value set on our interactions or potential interactions. I've also many many times had people ask me quite simply if - as I'm interacting with them - if they are a subject in a research study. And I'm like no I'm just here for personal reasons, you know, I just want to use the apps like everybody else. So yeah, it's definitely had an impact.
[00:08:06] Nehal El-Hadi: That's, that's really interesting. But let's, let's go back to the apps a little bit. Which ones are you specifically looking at?
[00:08:13] Christopher Dietzel: We actually have 30 apps that we're looking at. So we've identified the 30 most popular apps in Canada. And so there's quite a diversity across these 30 that we're looking at.
[00:08:23] We have some of the more mainstream or popular ones like Bumble, Tinder, and Grindr, as well as some smaller apps that are specifically oriented towards certain populations.
[00:08:34] So we have Christian Mingle, we have Salams, we have SilverSingles. We have a variety in order to make sure that we've captured a range of experiences, including a range of experiences from people with diverse or marginalized identities.
[00:08:51] Nehal El-Hadi: Okay. And then what stage are you at in the research?
[00:08:55] Christopher Dietzel: So we are still in our data analysis stage. And so I am going through and reading the policies and procedures, all of the documents from these 30 different dating apps. So looking at their content moderation policies, looking at their safety tips, their guidelines, etc.
[00:09:11] It's a lot of reading. And you know, when you click that little terms and conditions button that says, “Yes, I accept” without reading, well, I'm actually reading that. So, I am going through and reading pages and pages and pages of terms and conditions from 30 different dating apps. So that's taking some time.
[00:09:28] And then we have a research assistant that we're working with as well, and she's using the walkthrough method, which was actually developed by Stefanie Duguay and a couple of other colleagues, to do a walkthrough of the apps themselves.
[00:09:41] So, as I mentioned before, downloading the app, inputting your information, going through and checking what different features are available, how to use them, and then deleting the app at the end.
[00:09:51] Nehal El-Hadi: Like right now, you're just presenting, this is the information that we've gathered, we're still going through it. What are you, what have you learned so far?
[00:09:58] Christopher Dietzel: Yeah, so that's actually one thing that we're really happy about with Congress is because it's a great opportunity to connect with colleagues and to talk about research at whatever, whatever stage of the process you're in.
[00:10:08] So we're excited to share some of the initial, I don't even want to call them findings, because we're still going through the analysis. But we're really interested in sharing some of these emerging trends, some interesting codes that we've come across thus far in the data, and discussing that with colleagues at Congress. And so we're presenting our methods, we're talking about how these apps were selected, and then we are sharing some of these “preliminary findings” with people.
[00:10:38] Nehal El-Hadi: So then I have a question, going back to what we talked about a little bit, consent and safety online and offline: I kind of wonder what levels of responsibility lie with which groups of stakeholders or actors in this situation.
[00:10:52] Christopher Dietzel: So once again, you're asking a fantastic question, and I think that really gets to the heart of the issue with dating apps nowadays, and not just dating apps but tech companies in general: to what extent are the companies themselves responsible for individual users' experiences of safety, and for violations of safety?
[00:11:14] And to what extent are individuals responsible for their own experience, and how other people impact their experience. And there's not really an easy answer to that. I think I, and I think my colleagues would argue that since these are companies who are providing a service, they have a responsibility to their users to keep them safe.
[00:11:32] However, what we've seen time and time again is that that level of safety can often be a one-size-fits-all approach, where they don't actually consider how different populations, perhaps related to identity, to background, to a variety of different aspects of their experience, can face more, or higher rates of, threats to their safety than others.
[00:11:58] Of course, that's not to say that this is a blanket approach across all of the apps. And that's what we're kind of finding already: some are more attuned to the heightened risk among certain populations compared to others. But what we have to keep in mind as well is that these are companies, right?
[00:12:15] So they are here with the primary purpose of making money, of offering a service that can keep people involved, engaged in their product. And since many of these apps are free, what that means is even if people are not paying for the product itself - the app - they are generating data, which the companies can use to monetize.
[00:12:35] And so, I say all of that to keep in mind that even though we would like to see dating apps, and technology companies in general, do more to keep their users safe, we have to keep in mind what their position is in this regard: they are going to offer safety to a certain extent, but they are not neutral actors in this process, because of the financial motivations behind them.
[00:13:01] Nehal El-Hadi: I mean, I'm not, I'm not that familiar with the dating apps, but it seems like there's a lot more safety the more you pay into it.
[00:13:08] Christopher Dietzel: And that's a great point, and a fascinating part as well that we're already starting to see. And we kind of knew this before we even began this research, but yes, there are many apps that have not only a free version but a paid version.
[00:13:21] And then there are safety features that you can only unlock when you pay for the app. What that means is that it creates barriers to how people can be safe in their online experiences: in order to access these features, you of course have to pay for them. So it creates class levels of safety, if you will, in terms of the extent to which people can be or feel safe in their online experiences.
[00:13:51] Nehal El-Hadi: That's a great answer. I'm also curious about something you'd mentioned: even if you're not paying for the app, you're generating data, which is generating income. I'm wondering how dating apps have leaned into gamification to collect more data and keep people more involved, and whether they do that at the expense of safety.
[00:14:11] Christopher Dietzel: You are right. And that's actually, so Stefanie, myself, and one of our colleagues, David Myles, the three of us were really interested in how, during the COVID-19 pandemic, dating apps repurposed themselves in order to keep people engaged, especially during the early lockdown periods when folks could not meet in person.
[00:14:31] You know, a big premise of the dating apps is that you connect online and eventually meet in person, but during COVID, that premise was completely challenged. And so the three of us conducted a couple of studies to understand how dating apps were remarketing themselves during that time in history.
[00:14:48] What we saw, at least in part, was that they additionally gamified the user experience in order to keep people engaged, involved. So one thing, for example, that they did is they unlocked paid features and they made some of them free in order to entice people to use their products.
[00:15:06] So for instance, Tinder made its Passport feature, which allows you to, quote unquote, travel around the world and meet people in different locations, free in the early stages of the pandemic.
[00:15:17] And Tinder - just to note - in and of itself, how it's constructed as an app, is actually set up like a deck of cards. And so, as you're swiping left or right, it's intended to mimic how you might play with playing cards, which just shows how, how gamified that is, just in its basic construction as an app.
[00:15:35] But of course, other apps use different features. I'm trying to think of what the features are called, but essentially there are things like streaming services on certain apps, there are tokens, there are rewards, and of course, as we know, when we use any type of social media platform, or digital platforms in general, there are notifications, which are meant to get your attention and release endorphins.
[00:15:58] So all these types of things are, perhaps consciously or unconsciously, encouraging people to use the product more. And if I recall correctly, I believe there was a lawsuit in the United States where some individuals were suing one of the dating app companies because the apps are constructed in a way to keep people on the site rather than actually facilitating matches.
[00:16:24] I'm forgetting some of the specifics of that, so that might be something to check, but essentially it goes back to what you were saying before about how these apps are designed in a way to keep users on their site.
[00:16:38] And even though they might present themselves as like, “Let's find matches, let's find you the love of your life, let's find you an intimate connection.” They can be designed in ways that actually keep people on the site.
[00:16:50] Nehal El-Hadi: I just found it.
[00:16:51] Christopher Dietzel: Oh, did you? Good!
[00:16:53] Nehal El-Hadi: Six people in the U.S. - yeah.
[00:16:53] Christopher Dietzel: Good.
[00:16:54] Nehal El-Hadi: I didn't, I hadn't seen this. [Six people in the US] filed a lawsuit against Match Group, which owns Tinder, Hinge, and Match. So that is really fascinating. Where do you think dating apps are going to go from here? Because now there's a whole bunch of talk in the zeitgeist about how dating apps are over, people would rather meet in real life, and it's like they're passé, they're done.
[00:17:14] And I think I'm asking that question because the world has now opened up, we're back to kind of what life was like before the pandemic. And so there's not necessarily as much of a perceived need for them.
[00:17:27] Except I also think - not just I think, this is from what I've read - that the dating landscape has been irrevocably changed by dating apps. So my question is, we're talking about how being on the apps has changed ideas... You know what, let's go back to your research. How have dating apps changed ideas of safety and consent offline?
[00:17:49] Christopher Dietzel: So, I agree with you. Quite simply, once we've, you know, opened Pandora's box, or let the worms out of the can, or whatever the case may be... I think that yes, there might be some people who use dating apps and some who don't, but it's become just another option for people to connect with one another, and it's going to be hard to remove that option from society.
[00:18:18] That said, and going back to your question, I think people are becoming increasingly aware of how their data is used. They're becoming increasingly aware of the risks and harms that can come through their use of dating apps, and of what those harms could be online and/or in person.
[00:18:40] So I think in that regard, even if people are using dating apps, there's a bit more of public awareness around concerns and risks associated with the use of these technologies. And because the public is becoming more aware of these potential harms, I think that puts pressure on the companies themselves to respond to these harms.
[00:19:02] And so for instance, using COVID as an example, the app companies didn't have a choice, but to present options for people, present information for people about how to interact in safe and healthy ways. Because they literally had no choice.
[00:19:18] We've seen this as well in response to concerns about sexual violence in particular. Many of the apps now have information about consent on their websites. Many of them actually provide links to resources if people experience an act of sexual violence. Of course, this is not to say this is the case for all dating apps, but we are seeing that the public knowledge and people's experiences with harms have forced the apps to respond to these concerns.
[00:19:49] Nehal El-Hadi: Thank you for that. How long have you been doing this research now? Like looking at dating apps?
[00:19:54] Christopher Dietzel: Since 2017.
[00:19:58] Nehal El-Hadi: Okay. So that's definitely way, way, way before AI came in.
[00:20:01] Christopher Dietzel: Yes, yes.
[00:20:03] Nehal El-Hadi: So what have you, what have you noticed there or what have you been thinking about in terms of AI and dating apps? Kind of, generally and then specifically as it relates to safety and consent and the work that you're doing.
[00:20:16] Christopher Dietzel: So this question about AI is quite an interesting one, and I just want to start by prefacing it: we have to be clear about what type of AI we're referring to. And I say that very intentionally, because some aspects of AI have already been well integrated into people's dating app experience.
[00:20:33] For instance, as I was talking about before with identity verification, for example, these are not human moderators who are checking every individual's photo that is uploaded to a dating app.
[00:20:44] We have systems in the apps themselves that are managed by the companies. We can talk about this in terms of algorithms, we can talk about this in terms of AI, but nonetheless these are automated technological systems that will assess an individual's photo, again using the example of identity verification, and based on their codes, they will either flag or accept the photo. And then they have a system to say, yes, this person's identity is verified, or no, this person is not.
[00:21:18] And so, in that sense, AI has already been integrated into the back end of people's experiences. And that's just one example. There's other examples, of course, when we think about blocking, when we think about reporting, when we think about flagging problematic content, um, and we're even seeing more and more that AI has been integrated into dating apps, um, in order to catch what people say or catch photos that people send and prompt the user “Is this something that you would like to see?” “Is this something that is harassing?” “Is this something that's problematic?”
[00:21:48] Then if yes, would you like to report it to us? So, in that sense, there's already been this casual integration of AI into dating apps. However, nowadays users are being prompted to have more front-end interaction with AI, for instance with ChatGPT: it's not what is happening behind the scenes, it's what the user can do individually to prompt AI to generate a response.
[00:22:19] What we're seeing is that that kind of user-generated AI could very well be coming into people's dating app experience. So for example, the CEO and founder of Bumble very recently, at a news conference, was talking about the future integration of an AI dating concierge.
[00:22:40] So what somebody could do is have this AI chatbot: they could talk to their chatbot in Bumble, for example, and tell it what they wanted out of their dating app experience, what their hopes were, their dreams, their fears. So they could divulge all of this very personal information to their AI chatbot and release it into the dating app, so that the bots would then go out and do the dates for them.
[00:23:05] And then the bot would interact with other users and come back to the user to present them with suggestions of who this person then should actually meet up with in person for a date.
[00:23:17] Nehal El-Hadi: I was going to say, that sounds like an incredible premise for a movie, where all that happens is that all these AI chatbots end up dating each other.
[00:23:25] Christopher Dietzel: Exactly, exactly. And that's even what the CEO herself said is that, you know, imagine a not-so-distant future where AI is dating other AI.
[00:23:34] Nehal El-Hadi: And then people have to meet in real life again.
[00:23:36] Christopher Dietzel: And who knows? You know, this has a lot of implications in terms of how we understand intimacy, how we outsource our own personal experience. What does it then mean to drive our own experience, or to engage with the messiness of dating and relationships? This prompts a lot of questions, many more questions than answers.
[00:23:54] Bumble is not the only app that's considering a feature of this sort. Grindr - a dating app that's very popular with gay and queer men as well as trans folks - its CEO has actually similarly talked about integrating an AI chatbot into that dating app.
[00:24:11] That doesn't seem like it's going to roll out anytime soon, but nonetheless, it's interesting to note that it's not one but perhaps two companies, and if these are the two who have talked about this publicly, there are probably even more who are experimenting with this or considering AI in the background.
[00:24:26] But all that to say, as AI has certainly become very popular in public discourse nowadays, I think we are going to see it integrated further into people's dating app experiences, and how, or to what extent, remains to be seen.
[00:24:45] Nehal El-Hadi: What do you hope your research does both within and outside of the Academy?
[00:24:51] Christopher Dietzel: Yeah, so this project that I'm conducting with Stefanie Duguay and others looks at dating apps' constructions of safety, and one thing that we're aiming to produce through this project is a Digital Dating Safety Map.
[00:25:06] What we're intending to create is an online interactive website where people can go and receive information about these 30 different dating apps that we're currently analyzing.
[00:25:17] So we want to have an engaging, interactive site where people can click on an app, compare two or more apps, and understand what safety features, mechanisms, policies, information, and resources are available through these apps, so that they can have an informed awareness of how these apps construct safety, and then be able to determine for themselves, as individuals, which apps might be and feel safer as they're trying to connect with people online and in person.
[00:25:52] Nehal El-Hadi: Thank you for listening to Congress in Conversation and to my guest, Christopher Dietzel.
[00:26:01] The Big Thinking Podcast would like to thank our friends and partners at the Social Sciences and Humanities Research Council, whose support helps make this podcast possible, to CitedMedia for their support in producing the podcast, and to The Conversation Canada for their partnership.
[00:26:16] Let us know what you thought of this episode and share your feedback with us on social media. Follow us for more episodes wherever you listen to your podcasts, and stay tuned for new episodes coming soon!