It continues to surprise me how many of my fellow community managers are either resistant to, or just plain uninterested in, topics surrounding personal data protection and privacy. We own the most visible and vulnerable surfaces for non-compliance when it comes to legislation like GDPR, California’s CCPA and CPRA, and similar laws enacted in Colorado and Virginia (with active legislation moving through state governments in New York, Massachusetts, North Carolina, Minnesota, Ohio, and Pennsylvania). Non-compliance with data privacy legislation, especially GDPR, is a very expensive prospect if the offenses are egregious and numerous. GDPR violations can cost a company up to 20 million euros or 4% of its annual global revenue, whichever is higher, per violation! (The biggest fine doled out so far has been Amazon’s whopping 746 million euros in 2021.) Yet many community manager groups and professionals are hesitant to talk about data privacy as a core aspect of their jobs.
When I was at SAP, I led the team responsible for implementing GDPR compliance for a community of just under 3 million users, many of whom are European Union citizens and thus fall under the protections of GDPR. As a German tech giant, SAP is held to the highest standards of ethical business practice by the European business and tech communities, as well as by the governing bodies of the country and the EU (not to mention its own internal compliance and ethics teams). What this means is that, under intense scrutiny, the team and I, in partnership with our legal and data protection and privacy offices, had to figure out how to navigate an intensely complex technical ecosystem and implement policies and platform features that were both compliant and user-friendly. For a team of non-engineers and non-lawyers, this was a daunting task, and it took a lot of research, learning on the job, and collaboration. But when all was said and done, I realized that I had developed a deeper understanding of what it truly means to build a user experience through Privacy by Design.
I took this experience with me as I ventured out into the broader world of developer community work, annoyingly becoming that person in every conversation who questioned how user data was handled, how it was anonymized and protected, and how the platform handled compliance with user requests. What I found, as I started thinking about getting back into content creation for the community manager audience, is that surprisingly few of us have had this experience. I’m not sure if it’s because most of my fellow community managers have worked solely in U.S.-based tech companies (I know this isn’t true…) or because it’s never been put on their shoulders to manage, but it concerns me as the community profession develops rapidly in the wake of the pandemic.
If you are using a vendor-built platform (think: Khoros, Salesforce, inSided, Vanilla, Discourse, etc.), many of the features and functionality needed for GDPR compliance are built into the platform. Yay you! But you still need to handle the process side of the equation. Who handles user requests? Where do they come in? How are they processed? What are your policies regarding data retention? Do you know how to manage requests for data that extends beyond your platform? What about your other data processors? Does your user data flow into a CRM? A digital marketing platform? An analytics dashboard? Is that data anonymized? How do you handle Right to be Forgotten requests in a public-facing community? What are the limits of your responsibilities? Is your privacy statement updated to cover community engagement? These are all questions that community managers should be equipped to address.
There are lots of resources out there to help you learn about these various legislative requirements, but the deeper question I have is: why do we think this is not our responsibility? I consider myself a caretaker of my members’ data. It’s why I generally don’t share it with marketing, sales, or any other function of the business that my users did not consent to (and that creates plenty of tension with those teams!). Have many community leaders shelved this issue in the “future problems” bin? Are organizations putting these compliance measures in the hands of “Trust and Safety” professionals instead? How does that factor into ownership of the community member experience? Where does the community leader fit into that model?
Would love to hear your thoughts in the comments.