After The Pillar’s exposé on Msgr. Burrill’s use of Grindr, many people have claimed the story was a breach of privacy. As someone writing a doctoral thesis on privacy in Catholic theology, I have thought about this quite a bit since then. I think there are some genuine privacy issues in the whole situation, but many of the concerns being raised are misguided.
Sorry this is coming out a bit late: so many thoughts were running through my head that it was hard to systematize them. I am trying to write something readable by the average reader that also contains sufficient sources for more academically minded readers, which is a real challenge. I hope that, despite its length, most can read it easily. (I have extensive academic footnotes, but they are shortened for space and readability. Please point me to additional key sources if you know of them.)
I will divide this into several sections. I will conclude with the specific case of the hookup app data used by The Pillar, but several points need to be laid out first to home in on that case. I begin with privacy in general, then move to digital privacy, and finally to app privacy before getting to the specific case of investigative journalism.
The Right to Privacy
There seems to be a universal natural sense of privacy: every culture has some form of it. Exactly which things are private, and how privacy is protected, vary somewhat, but some privacy is always present. For example, every culture makes some effort to conceal the marital act. Sociologists refer to this as socially acceptable sexual activity, which adapts to various cultures’ social structures. Obviously one would hide illicit or socially unacceptable sexual activity, but even approved relations are always removed partially or fully from the sight and sound of others. Concealing female genitals from men who are strangers also seems to be universal. (Cf. Westin, Privacy and Freedom, chapter 1; Levine, “Privacy in the Tradition of the Western World”; Keizer, Privacy: big ideas/small books, 57; Moore, “Privacy: Its Meaning and Value”; Moore, Privacy Rights, 33; Solove, Understanding privacy, 66; Ben Mocha, “Why do human and non-human species conceal mating? The cooperation maintenance hypothesis”; etc.)
Privacy is always a secondary right, not an absolute right. I have a right to privacy that prevents the government from going through my closet or reading my diary. However, that right is not absolute: if there is reasonable suspicion that I have committed a serious crime, this secondary right can be overruled in light of a more important right, such as the right to life if I were reasonably suspected of murder. Property is likewise a secondary right: people have a right to their property, but other principles can override it. For example, the universal destination of goods and the duty of all to help society with the common good justify a reasonable tax rate. And if your land is the only obstacle to building a new highway, the government using eminent domain to buy it – even when you don’t want to sell – can be just.
Many define informational privacy as some degree of control over information about oneself, both in its collection and its use. Obviously, someone reading my journal or taking my ID against my wishes is a problem of collection. Use also matters: if I want to drive, I have to have my name, DOB, address, photo, SSN, etc. in a government database of driver’s licenses; but if the DMV then put that on a publicly accessible site, I think most of us would consider that use an invasion of privacy.
I think a good starting place is Alan Westin’s definition of privacy: “Privacy is the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others.” (Privacy and Freedom, 5). I would temper it with the idea that this claim is to a rational degree and not absolute as noted above. I would also add a final cause to a definition of privacy: it keeps human flourishing from being inhibited. Privacy does not itself directly help with human flourishing, but a lack of privacy often impedes human flourishing. If I am unable to have privacy in any conversation with a friend, that impedes the good of friendship. If I lack privacy, I lack alone time (solitude or anonymity) which is not an end in itself but is needed instrumentally for flourishing. The very fact that these are significantly impeded in prison is a large part of the punishment.
There is currently a big issue with privacy regarding electronics. Most breaches of privacy on apps rely on terms of service that nobody reads. This is an issue both in theory and in practice.
Although Westin’s privacy theories are generally helpful, he substantially overestimates how much knowledge the average consumer has. He assumes that the average person is a “privacy pragmatist,” but several researchers have since shown that people are far less informed than he assumes. Hoofnagle and Urban showed in a series of surveys from 2009-2012 that most people – and especially those classed as “privacy pragmatists” by Westin – didn’t know the basics of legal privacy protections or their absence. They note, “It may not be that people do not care. Instead, it is more likely that they do not even understand the exchange involved.” Yet our regulatory system assumes that people are well informed. Austin notes, in opposition to Westin, “I argue that the law should focus on securing meaningful privacy choices rather than on individual control over personal information.” She too observes that most laws and policies assume each person is a super-informed individual. (Hoofnagle & Urban, “Alan Westin’s Privacy Homo Economicus”; Hoofnagle & Urban, “The Privacy Pragmatic as Privacy Vulnerable”; Austin, “Re-reading Westin”; others have pointed this out, but these three specifically critique Westin.)
This theoretical problem is compounded by how things work in practice. Most programs, apps, and websites have complex terms of service written in barely legible legalese, and we are given only the option to “take it or leave it.” Multiple examples show how ridiculous this arrangement is.
- First, Carnegie Mellon researchers found that it would take the average person 76 eight-hour workdays to read all the terms of service they agree to annually. That is about a third of the hours an average full-time employee works in the USA in a year (1,801). And that was in 2012 – I can only imagine it is more now in 2021.
- Second, as Carl Tauer notes regarding genetic counseling, “Catholic teaching has never held that any decision is morally acceptable just because the choice is informed and autonomous” (“Personal Privacy and the Common Good: Genetic Testing Raises Ethical Considerations for both Patients and Clinicians”).
- Third, another study found that 98% of participants, when given terms of service to review, missed clauses that included giving up their firstborn child and sharing all data with their employer and the NSA. A similar prank got thousands to surrender their souls (despite a link that, if clicked, voided the clause and gave £5 of store credit).
- Fourth, the terms have no mandated, standardized, plain-English summary that people with a bit of privacy concern could check. (Theoretically, companies can add one, but I don’t trust a company’s summary, as they will make themselves look as good as possible.) Two European scholars noted in 2007: “Genuine informed consent should be distinguished from legally or institutionally effective consent. Informed consent in the latter sense is relative to prevailing rules, laws and regulations, and these are variable across time and place. Effective consent is no guarantee of genuine informed consent, however. A potential subject may give all the required signatures, be deemed competent by the appropriate parties, and be of legal age, without having adequately understood the necessary information.” (Kristinsson & Árnason, “Informed consent and human genetic database research,” in Häyry et al. [edd.], The Ethics and Governance of Human Genetic Databases: European perspectives)
- Fifth, the US has almost no regulations on data storage or use so long as the practice is disclosed somewhere in the terms of service. The EU and UK do have some regulations here, such as requiring companies to delete a user’s data when the user deletes their account. Many smaller apps follow EU and UK rules worldwide, since a single global policy is easier to maintain, but bigger apps find it worth the cost to run two systems and take advantage of laxer US laws.
- Sixth, even when the law is broken, the US doesn’t actually punish companies. Flo is a menstrual-cycle-tracking app. It told users it wasn’t selling data, then was caught selling that data to Facebook. You would think the fine might at least equal the money made from selling that data, right? Nope. Flo merely had to inform users going forward and make a “pinky promise” not to violate its policy again. Googling “Flo menstruation app,” any information on this breach of privacy is buried on page 2 or 3 of the results. This case is more serious than the recent hookup app data for three reasons: (1) it went against stated policy; (2) sharing your info (like location) with other users is an essential part of a hookup app, while sharing menstrual cycles with other users is not a primary purpose of a cycle-tracking app (most women, I would guess, either do not share that data or share it very privately with one or two people, say by email, rather than broadcasting it to everyone in their vicinity); and (3) selling the data to Facebook along with device IDs might as well be de-anonymizing all of it, since Facebook knows which device IDs connect to which Facebook accounts, while de-anonymizing the hookup app data was difficult and was done only for a single high-ranking official, not for every device ID.
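To make reason (3) concrete, here is a minimal sketch of why selling records keyed by a device advertising ID is effectively de-anonymizing for any buyer who already maps those IDs to accounts, as a large ad platform does. All IDs, names, and field labels here are invented for illustration; real data brokers and platforms work at vastly larger scale, but the underlying operation is just this one-pass join.

```python
# "Anonymous" records sold by an app, keyed by device advertising ID.
# (IDs and events are made up for this sketch.)
app_records = [
    ("idfa-1111", "cycle_logged"),
    ("idfa-2222", "cycle_logged"),
    ("idfa-3333", "cycle_logged"),
]

# The mapping a large platform already holds from its own SDKs and logins:
# device advertising ID -> account identity.
platform_accounts = {
    "idfa-1111": "alice@example.com",
    "idfa-3333": "carol@example.com",
}

# The de-anonymizing "join": every sold record whose device ID the
# platform recognizes becomes a record about a named person.
deanonymized = [
    (platform_accounts[device_id], event)
    for device_id, event in app_records
    if device_id in platform_accounts
]
print(deanonymized)
# [('alice@example.com', 'cycle_logged'), ('carol@example.com', 'cycle_logged')]
```

No cleverness is required of the buyer: the device ID is a stable shared key, so "anonymous plus device ID" sold to a party that holds the ID-to-account table is anonymous in name only.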
We definitely need better policies – either through law or citizen’s pro-privacy groups – for our digital privacy in general. The current trajectory is dangerous for our privacy.
Apps for Sharing Location
Apps whose central purpose is sharing certain information with others generally cannot ensure privacy for the data being shared. I’ll start with another app as an example, move to hookup apps, then to de-anonymization, and finally address the most ridiculous line of this whole controversy.
Let’s take Twitter as an example. I don’t think anything shared on a public account can be considered private in any way. Even if a public figure protects their tweets after a scandal, the fact that the thousands who follow them can see those tweets, and that they are a public figure, minimizes the privacy they can expect. I know any tweet I make could end up in a news story, and that is something I accept by being on Twitter, as the point of the app is to share tweets (being verified with 55,000 followers probably makes this truer for me, but it would apply to most public accounts, even small and unverified ones). I assume DMs are relatively private, but they can appear publicly if, say, a priest were using them to groom a minor (as that’s newsworthy) or if the person asked permission or indicated they would post them publicly (for example, if a reporter tells me in a DM that they are writing a story and asks for comment, I assume they can publish my reply).
Hookup apps, and other apps where broadcasting your location is a central feature, obviously cannot make your location completely private. Every user near you has some idea of where you are. Some may argue that the app only shows your distance to other users, but combining GPS spoofing with triangulation of those distances is not too hard for any user of the app: one researcher was able to track Tinder users’ locations to within 100 feet back in 2014 whenever the app was open (and I imagine one could do even better today). These apps also include profiles designed to present you attractively to other users, so users might be discovered if someone connects photos on the app to photos elsewhere, as happened with a legislator in North Dakota. This is where blackmail or similar abuse is most likely to arise: I can only imagine that if some unscrupulous person found a priest on one of these apps, they could demand money or favors.
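To illustrate how little "distance only" disclosure protects a user, here is a toy sketch of the triangulation step. The coordinates are made-up flat-plane units (a real attack would first project GPS coordinates onto a local tangent plane in meters); the math is the standard trick of subtracting circle equations to get a linear system.

```python
import math

def trilaterate(p1, d1, p2, d2, p3, d3):
    """Recover (x, y) from three known observer positions and the
    target's distance from each one."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the circle equations (x-xi)^2 + (y-yi)^2 = di^2
    # pairwise yields two linear equations A*x + B*y = C.
    A1, B1 = 2 * (x2 - x1), 2 * (y2 - y1)
    C1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    A2, B2 = 2 * (x3 - x2), 2 * (y3 - y2)
    C2 = d2**2 - d3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = A1 * B2 - A2 * B1
    return ((C1 * B2 - C2 * B1) / det, (A1 * C2 - A2 * C1) / det)

# A target secretly at (3, 4). An attacker spoofs three positions and
# reads the app's "distance away" figure at each one.
target = (3.0, 4.0)
observers = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [math.dist(obs, target) for obs in observers]

est = trilaterate(observers[0], dists[0],
                  observers[1], dists[1],
                  observers[2], dists[2])
print(est)  # recovers approximately (3.0, 4.0)
```

Real apps round or fuzz the reported distance, which is why the 2014 researcher got "within 100 feet" rather than an exact point, but averaging many readings shrinks that error quickly.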
De-anonymization or doxing can be an issue on many different apps. Going back to Twitter: some use it anonymously, but obviously, if they are a public figure, it’s newsworthy when their burner account is discovered. I have an “Autistic Priest” Twitter account that I used anonymously for about a year before I made my diagnosis public. Instead of creating a whole new account, I repurposed an old account I had created for automatically retweeting the Pope (@Pontifex2FB), which no longer worked for technical reasons (a Facebook page I made at the same time to automatically post Francis’s tweets to Facebook is still in operation). Twitter assigns each account a unique ID number that remains even if you change the name: if anyone had saved the ID number of @Pontifex2FB, which clearly said it was my account, they could see that it matched @AutisticPriest and de-anonymize me. If I were considered a public or newsworthy figure, there likely would have been a news story.
Moving on to apps in general and location apps in particular: many pieces have discussed the privacy and anonymity issues with such data. The New York Times ran an extensive report on cell phone location data for sale in 2019. This year, it used similar data to track those who entered the Capitol building on January 6. The popular press coverage is recent, but academic journals have been discussing this much longer. Here is a chronological list of some of the academic work on the topic over the past 50 years. I provide a number of sources to show a pattern in academic publishing: keeping up with all this is beyond the average person, but app makers should be aware of it. Not every item speaks directly to hookup apps, but all are related in some way.
- In a 1973 report for the RAND Corporation, computing pioneer Willis Ware noted, “The concentration of information within computer files at one location and the access to such files through remote access terminals tend to magnify the opportunities for misuse of personal information.” (“Records, Computers and the Rights of Citizens,” 2) He also recommended five principles for data privacy, which I think are good and which still aren’t in use in the USA.
- In 1991, Wendy Mackay noted that most people don’t change default settings on computer systems (Mackay, “Triggers and barriers to customizing software”). This matters for app privacy as corporations set defaults and the less privacy we have, the more money they can make on our information.
- In 1997, Judith Wagner DeCew noted four issues with big databases. Her main concerns were drug testing and credit bureau databases, but similar things would apply to app databases today: (1) you lose control of data once it is collected, (2) there are loopholes in legal protections of privacy, (3) data is accessible by so many individuals, and (4) data rarely disappears. (In Pursuit of Privacy: Law, Ethics, and the Rise of Technology)
- In 2001, David Lyon worried about how such “fragments of information gleaned from interactions and exchanges, comes back to haunt the person about whom the data purports to be.” (“Facing the future: Seeking ethics for everyday surveillance,” 178)
- Christopher Slobogin took issue with the overuse of CCTV by the government and how that could negatively affect anonymity (“Public Privacy: Camera Surveillance of Public Places And The Right to Anonymity”). Similar concerns would seem to apply to “anonymous” apps that broadcast your location.
- Luciano Floridi noted in 2005 that the current lack of “infospheric friction” meant that privacy against databases was almost non-existent for the average person and he proposed increasing that “infospheric friction” (“The Ontological Interpretation of Informational Privacy”).
- Salvör Nordal argued in 2007 that the correlation of so many huge databases made privacy illusory in modern society (“Privacy”, in Häyry et al. (edd.), The Ethics and Governance of Human Genetic Databases: European perspectives).
- In 2008, Wolfgang Sofsky began his book with an extended account of an ordinary citizen’s day, showing the vast number of privacy intrusions that occur, and noting that both corporations and the government see infringing on our privacy as in their interest: unless we resist strongly, they will do so (Privacy: A Manifesto).
- In 2013, researchers found that four location data points sufficed to identify a person 95% of the time. The same researchers noted this was still the case in a 2021 follow-up paper.
- In 2015, Michael Fuller wrote about privacy issues with Big Data. He notes, “In both the generation and storage of big data sets containing personal information, there are issues raised around the confidentiality of the information involved.” (“Big Data: New Science, New Challenges, New Diagnostic Opportunities”) He also notes that the values at work in Big Data are essentially financial – the value of the data to the company – producing an unequal exchange: most people don’t understand how this data can be used, but the corporations know exactly how. In 2019, he noted, regarding Facebook, “This leads to situations in which ‘people consent to the collection, use, and disclosure of their personal data when it is not in their self-interest to do so.’” (“Big data and the Facebook scandal: Issues and responses,” 16, citing Solove, “Privacy self-management and the consent dilemma”)
- In 2016, Marjolein Lanzing noted extensive privacy issues with fitness trackers (“The transparent self”): it would seem reasonable that any app regularly broadcasting your exact location would have similar issues.
This leads to the most ridiculous line I’ve seen in this whole story: Grindr’s statement to Vice: “The alleged activities listed in that unattributed blog post are infeasible from a technical standpoint and incredibly unlikely to occur. There is absolutely no evidence supporting the allegations of improper data collection or usage related to the Grindr app as purported.” My doctorate on privacy is more about larger principles, but I had already read many of the academic papers and books showing how simple such a process would be (see the references in this piece, such as the bullet points right above this), and a degree in computer engineering before seminary gives me a decent understanding of the technical details. I was completely unsurprised by the technical aspects of the story. (When I saw how surprised many were by those details, I had to step back and realize how much I had read that most others had not: I had fallen a little into the curse of knowledge.) I don’t expect most people to be up on these issues, but if your app is worth a quarter of a billion dollars and you care about privacy, someone on staff should be reading these papers and helping you adjust policies accordingly. Grindr also just paid an $11.7 million fine in Europe for practices very similar to what these reports describe, and reports on Grindr’s specific privacy problems were online before this.
Matthew Shadle, at Catholic Moral Theology, rightly notes that Burrill’s privacy settings could have been such that he had not actually opened the app at each place where his location was recorded: apps can ask for permission to use location data even when not in use. Personally, I enable this only for my weather app, so its notification stays correct, but someone less familiar with such settings might enable it more broadly. So, without more information than is currently public, we can ask whether the app was open each time. As Shadle notes, even if it was not, that does not exonerate Burrill: he had the app on his phone and visited a gay bathhouse in Vegas. Moreover, since The Pillar describes app signals on a “near-daily basis,” not a “daily basis,” it seems likely the data was collected only when the app was open: if the phone were tracked at all times, even with the app closed, there would be daily location data.
There is some violation of privacy in this data, which should be addressed along the lines of the digital privacy issues noted above. However, since sharing location is an essential function of the app, the privacy it can offer is limited. It is akin to investigative reporters entering members-only conventions to hear what people say behind closed doors when they think only a few dozen or few hundred supporters are present, or what they say in the hallways within earshot of many attendees. With a hookup app, you might think you are sharing your data only with the few dozen users near you, but that data can easily be picked up by others. Some have compared this to state-level spying, but given that the app shares your location whenever it is open, and given today’s normal levels of data collection and sale, that analogy seems highly inaccurate.
Finally, we get to questions about The Pillar’s reporting. I will deal with six topics: the data in general, specific data issues, arguments from journalism experts, some disingenuous arguments, another moral theologian, and a side issue.
Is using this data ethical? Investigative journalism often relies on data, informants, and methods that are far from morally spotless; without them, many major stories would never reach the light of day. As noted above, I think there are problems with how much data apps collect, how they use it, and the limited real choice we have in the matter, and I think US law should be strengthened accordingly. But investigative journalism almost always involves information that is not white as the driven snow. I see this as analogous to a reporter impersonating a supporter to attend the conference of a group they want to investigate. For example, I imagine the Atlantic reporter who videotaped Richard Spencer addressing his supporters did not enter with press credentials, as journalists often do at other conferences, but blended into the crowd with a hidden camera. If you are like me, such investigations do not seem squeaky clean, but they can ultimately be justified for the purpose of investigating matters of public interest. The object sought is exposing what figures say to supporters that may differ from their public statements, and although the means are imperfect, none of them is intrinsically evil and thus always forbidden. I see this data being used similarly: the way it was obtained has some issues, and there is evident cooperation in evil in obtaining it (paying a company with questionable privacy policies whose app is primarily used for hookups, which are immoral). Thus, I think such data can, in general, be used ethically.
There are a number of more specific questions about the data. First, journalists need to be aware of their source. Ed Condon said on The Pillar’s podcast that once he has verified information from a source, he doesn’t really consider the source’s motives. The focus should indeed be on the information, but a source may present X to push the narrative in a desired direction while hiding Y, which would undermine it, and you don’t want to become a mouthpiece for an anonymous source. Related to this, second, there is the question of purchasing data. The more extensive data set appears to have been expensive to acquire, which is all the more reason to look into the source’s motives. Third, there is the question of when to de-anonymize. I think The Pillar’s judgment – de-anonymizing Burrill but merely stating that hookup apps were used in 10 of 212 rectories in Newark – seems balanced. The priests in those 10 rectories are not public figures, and even naming the rectories would taint the other priests there who are faithful to clerical celibacy. (Naming the diocese could theoretically be seen as tainting all its priests, but given that Newark has 705 priests, we are dealing with about 1% of them.) If they keep to that standard, I can’t fault them.
Two journalism experts offer critiques worth noting. First, Catholic News Agency ran an interview with Dr. William J. Thorn, a professor emeritus of journalism. I think some of his points are valid, but I wonder about others.
- Thorn: “The investigative reporter moves to confront the subject and provides an opportunity to deny, admit wrongdoing or explanation… Simply drawing conclusions from an online source seriously challenges verifiability and risks libeling an innocent individual.” I definitely think this is true. On The Pillar’s podcast, they said they had arranged a meeting with Burrill and others to discuss the story; it was rescheduled and they submitted written questions; then, as they were driving there, he resigned and the meeting was called off. Thorn’s interview was published at 6 a.m. the morning after the podcast, so I assume it was conducted before he heard this and that his point was hypothetical. As far as official data goes, I think a data source like this is about as official as a receipt or similar record would be in other reporting.
- He notes that such information and de-anonymization by journalists can’t be used to blackmail the targets of the investigation. I agree: if The Pillar does that, I wouldn’t defend them for 10 seconds.
- Thorn notes, “The celebration raises questions about ignoble motives, e.g., revenge or personal animus connected to the investigation.” I concur that this is not something to celebrate. I personally feel hurt by this betrayal. I wish priests weren’t using hookup apps, but if they are found using them, they should be removed from positions of authority and – at least temporarily – from active ministry.
- Thorn argues, “In Msgr. Burrill’s case there is only circumstantial evidence of behavior based on GPS location with no eye witness or other factual evidence such as a credit card receipt… Grindr location data insinuate but do not demonstrate the alleged corruption, or perhaps a level of ignorance in the user about the actual privacy of the Grindr app.” The first sentence makes me wonder whether Thorn fully understands this technology: the location data is about as clear as any of the evidence he supposes, and more reliable than an eyewitness without photos. Regarding the second sentence: the issue is not that Burrill misunderstood the app’s privacy, it is that a priest was using an app whose purpose is to facilitate hookups.
Second, John Allen Jr., the editor of Crux, noted, “As an editor I wouldn’t have run the original Burrill story based on the information it contains.” I actually concur here. A screenshot of his Grindr profile would have provided another level of evidence that he used it, though I doubt that profile could be found now. (Ethically, I think a journalist could download the app and GPS-spoof being near his location a number of times to see if he was on it and grab his profile photo, but most direct communication with him through the app would cross a line similar to police entrapment.) Allen also noted, “he’s a public figure, but at a low level and therefore the bar should be higher to compromise his privacy, especially in a way certain to damage his career and soil his reputation.” Canon 220 of the code says, “No one is permitted to harm illegitimately the good reputation which a person possesses nor to injure the right of any person to protect his or her own privacy.” I do think reputation is an issue here, but The Pillar’s stated intention not to de-anonymize the bulk of the other data shows proper restraint. There is also the question of how big a public figure he is: unless one resigns, as Burrill did, his position almost automatically ends with a miter when the term is up.
Some arguments against this journalism seem disingenuous. First, people are claiming it is homophobic: this claim was made even after the second story stated, “Evidence that both homosexual and heterosexual hookup apps were used in parish rectories or other clerical residences.” Second, some claim it will cause blackmail. As Zac Davis said, “It is difficult to see a scenario in which The Pillar’s report will lead to more transparency and less secrecy. Instead, it is a blueprint for blackmail. And unfortunately, the threat of blackmail is a factor in the coverup of sexual abuse; those who fear their own reputations will be destroyed are far less likely to blow the whistle on someone who[se] offenses are criminal.” The reality is that this reporting only reveals an existing blackmail risk: the blueprint for blackmail is priests using hookup apps, and reporting merely exposes a situation in which someone could already be blackmailed. A priest who is unfaithful but not abusive is less likely to report an abusive priest. Third, some of the less nuanced privacy arguments also seem disingenuous, as they would imply many other conclusions I doubt their proponents would accept if pressed; many hold double standards or inconsistencies here.
As I was finishing this, I saw Matthew Shadle’s piece: I think he presents the best argument that The Pillar acted contrary to privacy, as he makes some important distinctions. Shadle thinks the app location data shows only that Burrill was in those places, not that he used the app there, so the accusation that he repeatedly used the app would be unfounded; I addressed this above. He also wonders whether, after Burrill’s resignation, The Pillar needed to publish anything. Given that the USCCB’s note mentioned forthcoming media reports, and that the bishops did not seem clearly moved to act without them, it would have been odd if no report appeared. I can see an argument for merely stating that they had probable evidence of priestly infidelity without going into details; on the other hand, their other stories on hookup app data were in the pipeline, so that source would have been suspected anyway even if unstated, and I see no obligation not to publish what that probable evidence was in the way they did.
Finally, regarding journalism, I think The Pillar made a slight error. Their original story would have worked better as two: a news story on Burrill, and an analysis piece with the Fr. Thomas Berg interview and other citations on the connection between hookup apps and abuse. Pushing it all into one story had two bad effects. First, although a few lines stated there was no indication of minors or abusive sexual activity, the extensive space given to that topic led some to read the story that way, which is unfair to Burrill and generated a number of unhelpful responses. Second, it makes it harder to respond to significant breaches of priestly celibacy that have no connection to abuse or minors, should such cases come to light in the future.
Overall, it seems like this was within the realm of what is ethical for journalists. I don’t think it was pure as the driven snow, but I don’t see a clear breach of ethical principles. The Church is better if we respond promptly to such sexual impropriety.
Privacy is a serious concern about what control we have over our data. In a digital environment, this is becoming increasingly difficult to maintain. We should put in better regulations and hold digital companies to a higher standard of privacy. On the other hand, if the purpose of an app is to broadcast certain details about ourselves, we should expect less privacy about those details than an app where sharing data is not the purpose of the app.
Investigative journalism inevitably has ethical questions. So far, from The Pillar’s reporting of this data trove, I see no clear breaches of ethics. If, on the other hand, they were to use said data to de-anonymize random pastors nobody has heard about or blackmail others, that would be a huge ethical concern and I would denounce anyone doing that.
- On top of this piece taking longer to write than normal, I got it back from a friend who was looking it over just as a conference started this past weekend.
- If you appreciated this, please support me on Patreon so I can write more analysis on the news from the perspective of Catholic theology.