Over the past months, we’ve been listening to partners, customers, employees, and community members regarding Pantheon’s position on content moderation and our responsibilities as an open web platform in a polarized world. While sometimes charged, these conversations have been valuable and constructive. Today, we’re announcing an important update to our Acceptable Use Policy (AUP). This letter explains how Pantheon is navigating these questions and what is changing in our policy.
How We Got Here
At DrupalCon ’23, Pantheon hosted a roundtable discussion sparked by our decision not to respond to calls on social media to drop service for a customer, Alliance Defending Freedom (ADF). Our priority during this discussion was to provide a space for constructive conversation and open dialogue — to be transparent about how and why the company made the decision, listen to feedback, answer questions, and more broadly discuss the challenges of governance on the open web.
The room in Pittsburgh was overflowing with passionate members of the open-source community, and David Strauss and I spent nearly an hour in dialogue. That discussion has been echoed in many individual and small group conversations before and since the roundtable. These conversations have been overwhelmingly respectful and productive and have informed our evolving perspective, which I’m happy to share today.
After months of work to craft the right approach, we have revised our Acceptable Use Policy (AUP), creating standards not just for what content or behavior we will permit but also, for the first time, how we will evaluate potential customers. Going forward, we will not do business with organizations whose primary purpose is organized hate, voter fraud/misinformation, or climate change denial.
This is not something we have undertaken lightly. Read on for a deeper explanation of how we arrived at this decision, how we intend to apply this policy, as well as a link to the policy document itself.
Of Customers and Content
We believe that, as a rule, Pantheon should not interfere with how our customers operate their websites. We agree in principle that the further “down the stack” you are, the closer you should stick to the rule of law as your operating model for acceptable use.
However, over the past months of listening, a common sentiment has been that people expect more from us. Our historical position is the industry standard, and being asked to clear a higher bar, while sometimes difficult to hear, is really quite an honor. That expectation comes from a place of passion and care that is hard-won for any company or brand.
But it’s not simple. While we’re not a network provider bound by an ethical obligation to treat all packets equally, we do serve a diverse array of news organizations, research institutions, and causes. It’s not our place to dictate editorial standards, define the limits of academic freedom, or determine what kind of political speech is acceptable for our customers. We’re also very wary of how well-intentioned content moderation policies have historically been weaponized by bad-faith actors to attack marginalized groups.
Indeed, our roles and responsibilities are very different from the likes of X/Twitter, Facebook, or even a small community forum. We are a website operations platform but not a speech platform. We do not confer brand legitimacy or provide an audience to our customers. Our customers’ websites and content belong to them, not us – a vital aspect of the open web’s value.
As such, we moderate customers’ website content only when legally required (e.g. copyright enforcement), or when we believe there’s a tangible threat to safety. Instances of targeted harassment, threats, and doxing have been rare, but when they have occurred, we’ve taken swift action. Otherwise, we have historically adopted a hands-off approach.
But this expectation that we do better is one we aspire to meet. To do so, we needed a separate framework from content moderation. We found useful prior art in the application of ESG (Environmental, Social, and Governance) as investment criteria, as well as in how organizations have worked to align themselves with the UN Sustainable Development Goals.
We also realize that who we do business with isn’t necessarily a high-stakes question of free speech or deplatforming. The open web is incredibly robust, and while Pantheon provides unique value to professional web teams leveraging open source, and particularly to customers with large site portfolios, we do not have the power to kick someone off the internet. Which is a good thing! Indeed, as I’ve been listening to customers and partners on this issue, almost nobody has said that they want any given group’s website taken offline; they just don’t want it running on our platform.
So, when it comes to who we work with as a customer, we are marking new boundaries with a Prohibited Customers policy. We are setting standards that will, for instance, help us avoid taking on Southern Poverty Law Center-designated hate groups as customers. Still, the policy is deliberately narrow, and I expect it will be disappointing to some.
Firstly, while we’re adding more depth and examples to the Prohibited Content policy, this part remains effectively unchanged. That’s a “church and state” line of separation that we believe should only be crossed in extreme circumstances.
Secondly, the Prohibited Customers policy is intentionally not designed to be a purity test; it considers the customer in totality, not individual pieces of content or even distinct websites within a primary domain. It also sets a high bar around the primary purpose or majority of activity to identify a Prohibited Customer.
So, for example, a university with a provocative student group or far-out professor would be considered in toto as an institution. A newspaper that publishes an objectionable column or misleading article, even in a charged national moment, would be evaluated on its total output, not its most extreme or inflammatory example. An advocacy organization with a multi-plank policy platform would not be turned away even if one of its policy positions could arguably cross the line.
However, organizations whose primary purpose or majority of activity falls into one of the categories above would be Prohibited Customers, and going forward we will decline to offer them the use of our platform.
Because of the self-service nature of Pantheon and the inevitability of human error, we must also consider how to handle violations of the Prohibited Customers policy. Our perspective is that when we take on a customer we shouldn’t, the fault is with us, not the counterparty. Thus, our response will be much more measured and deliberate than our response to violations of the Prohibited Content policy.
Unlike the swift action we must take for dangerous or illegal content, when we determine that a customer is out of compliance with the Prohibited Customers policy, we will provide ample notice and ensure an orderly transition. In some cases, we may simply not renew the business when the subscription term expires. At a minimum, we will provide a reasonable period for any impacted customer to find an alternate home on the open web.
For those curious about the details, you can read our new AUP here. Though no policy is perfect and all written clauses are subject to interpretation, we think this will be a welcome improvement, and that we can implement the new policy with a minimum of ambiguity for our partners, prospects, current customers, and employees.
Finally, I would like to personally thank everyone I’ve spoken with about this over the past six months, from all different perspectives. I don’t believe we could have reached this position without all of that input and critical thinking, and I am looking forward to more frank and open conversations as we proceed.