This is a transcript of episode 93 of the Let’s Get Data-Driven podcast.
I’m Lanie Lamarre and I know this isn’t the case for everyone but I actually love change; it’s uncertainty that makes me anxious. Maybe this is why “responsible marketing” and privacy compliance in digital marketing doesn’t really intimidate me – I find it empowering to know how and why things are being done the way they are – but even I feel like it’s a little never-ending because I can’t tell you how often I’m faced with something that makes me say, “Oh! I never thought of that!”. Today, we’ll be talking about one of those instances I had never really thought of… til now!
I recently read this story that perked my ears up: there’s a class action lawsuit against Foot Locker over their website chat feature, which “alleges that because the company is recording chat conversations, archiving them, and sharing them with analytics partners to gather insights, the company is illegally wiretapping.” Because it isn’t obtaining prior consent from its visitors before collecting this information and this information is said to be sold to third parties – also without consent – the lawsuit alleges a violation of the California Invasion of Privacy Act. It appears that Crocs and Adidas have similar suits against them, so this got me thinking about how online marketers are using chatbots in their business and what’s happening with this data – what’s being collected, how it’s being stored, and where it’s being used.
This episode could be titled “welcome to my rabbit hole” because frankly, I knew very little about how chatbot tracking works. My personal experience was limited to the user’s side of the interaction, where I’m the one asking questions; I never put much thought into what happens to what we’re inputting into those chatbots and chat features. And if I didn’t know, well, I’m probably not the only one, so I did my research and I’m sharing my findings with you today.
Having said that, if you’re using chatbots or chat features and you have questions or concerns, this is not legal advice and I encourage you to seek out counsel that will meet your specific use case and requirements. My hope for you with today’s episode is to get you thinking about the information you’re collecting and the responsibility you have in doing so.
But let’s start at the beginning – always a good place to start – what the hey are chatbots anyways?
WHAT ARE CHATBOTS AND CHAT FEATURES
You know when you access software or an ecommerce site and, usually in the lower right corner of the screen, there’s a little chat box where you can input your questions? That’s a chat feature, and it’s designed to provide visitors with 24/7 customer service without requiring a human being to be on-call at all times. Usually you’ll be redirected to a knowledge base with answers to frequently asked questions, but you’ll also be advised that a real person will get back to you within some stated time frame. Chat features are designed to hook you up with the answers you’re looking for ASAP, but also to queue you up for the genuine human interaction you may need as soon as someone is available to help you.
Chat features typically live on your website, which is a little different from chatbots, which typically live on your social media platforms, specifically in your DMs. Rather than pointing you to resources and acting something like an answering machine the way chat features do, chatbots are designed to simulate human interaction and engagement by using automated, canned questions and multiple-choice answers related to the offers and content people typically want more information about.
Both of these are growing in popularity and rightfully so; when used properly, they provide a level of immediate gratification to your audience while also granting you the space to reach out and engage on your own schedule and availability.
Another advantage of chatbots and chat features is their ability to collect data on customer interactions and preferences, which can be used to personalize the experience you offer. However, any time you’re collecting data about people, you’re exposing yourself and those people to certain risks and of course, you have a responsibility with how you handle, store and use that information.
So let’s talk about those risks and responsibilities:
MITIGATING THE RISKS OF USING CHATBOTS AND CHAT FEATURES
The main risk with chatbots and chat features – or really any data you’re collecting and storing – is the potential for a data breach.
What constitutes a data breach? A data breach is unauthorized access to or disclosure of information, such as personal data, financial information, or even your intellectual property. Data breaches can happen in the form of cyber attacks where hackers access the information you have collected and stored, but they can also happen due to human error when an employee or contractor clicks a link they should not have, and they can occur in the form of data loss where an employee or contractor downloads this information for their own unapproved use.
Digital marketers should be concerned about data breaches because you’re collecting and storing all kinds of customer data, including personal information such as names, addresses, and credit card details. If this data is compromised, it can lead to significant harm to the affected individuals and damage to the company’s reputation, and that’s on top of the legal and financial consequences such as fines, lawsuits, and loss of revenue. And here’s the thing: people are increasingly aware of privacy risks and the importance of data protection, and your customers may choose to take their business elsewhere if they do not trust you to protect their personal information.
As such, you want to minimize the risk and vulnerability you are exposing yourself and your audience to with the way you’re collecting, storing and using their personal information. If you’re using chatbots and chat features, this is yet another point of data collection for you and here are a few things you want to consider with the way you’re using and configuring your settings:
- Let visitors know what data is being collected and how long you will be storing that data: The issue with the Foot Locker case I mentioned at the top of this episode is that they were collecting data without consent AND they were selling it to third parties. One of the tenets of data privacy is that data can only be used for the purposes for which it was collected; this means that if you used a chatbot to collect someone’s email address so that support could get back to them as soon as a human became available, you can’t add that email address to your list as a new subscriber unless you have explicitly told them you would be using their information in this way. You have a responsibility to be transparent about how you’re using – and intend to use – that information. (There’s a minimal sketch of consent-gated collection right after this list.)
- Limiting what data the platform collects: The more places you’re collecting and storing data, the more you open yourself up to vulnerabilities. Limit what information you store and collect in your chat software to only what is necessary. For instance, unless you’re calling them, you don’t need to collect their phone number. If you don’t need to know their location, then you don’t need to collect their IP address. Make sure that the settings you select for your data collection actually reflect the information you need for what you’re both using the software to accomplish – the same sketch after this list shows what a minimal-collection configuration could look like.
- Limiting access to your data sets: Chances are that only a select set of people will actually be using the data your chatbots and chat features collect. As such, you want to limit access to that data to the people who have a valid use for it. In turn, this narrows your exposure to data loss and cyber attacks to the few people who actually access this information. (See the role-check sketch after this list.)
- Aggregate the data you want to keep: If you intend on keeping the data and conversations collected within the platform for reasons such as audience research, or because you want to identify what struggles or problems your audience trends toward, these are great uses of the information you’ve collected first-hand and I encourage you to do so… but do so responsibly. It isn’t a bad idea to create a whole new database with fields for the feedback you’re looking to keep for your market research purposes, and to omit entirely the fields that would contain any Personally Identifiable Information (PII). The goal is to make your conversations virtually unidentifiable. Stripping out the parts where an individual’s identity would present itself is often called “de-identification,” and once identities are removed, you can safely aggregate what’s left – roll it up into counts and trends – because you don’t need to know who said what, or how to contact them, in order to conduct your market research. (A de-identification sketch follows this list.)
- Read the software’s Terms of Use and Privacy Policy: This seems like a Captain Obvious comment, but most of us just check the box saying we agree without getting clear about what we’re agreeing to. You want to make sure you’re aware of, and comfortable with, what the platform is doing with the data you’re using it to collect. Key things to look out for: whether they reserve the right to sell the data you’re collecting to third parties; whether their software is GDPR and CCPA compliant, and which settings are required for you to use it compliantly; what the platform’s safeguards are; and what its use, disclosure and retention policies are, meaning how long they’re keeping records of the data you collected, when they will share that information with others, and who those other entities are.
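To make the first two points on that list concrete, here’s a minimal sketch of consent-gated, minimal-data chat collection. To be clear, initChatWidget, its options, and the “chat-consent-button” element are all hypothetical – every chat platform exposes its own API and settings – but the pattern holds: don’t start collecting until the visitor opts in, and only switch on the fields you actually need.

```typescript
// Hypothetical chat widget setup – the function name and options are
// illustrative, not any specific vendor's API.
interface ChatWidgetOptions {
  collectEmail: boolean;     // needed so support can follow up
  collectPhone: boolean;     // off: we never call visitors
  collectIpAddress: boolean; // off: we don't need location data
  retentionDays: number;     // how long transcripts are stored
}

function initChatWidget(options: ChatWidgetOptions): void {
  // A real integration would load and configure the vendor's embed
  // script here; this stub just records the chosen settings.
  console.log("Chat widget initialized with:", options);
}

// Only start collecting once the visitor has explicitly agreed to the
// data practices described in your notice.
function onConsentGranted(): void {
  initChatWidget({
    collectEmail: true,      // support follow-up was disclosed
    collectPhone: false,     // not needed, so not collected
    collectIpAddress: false, // not needed, so not collected
    retentionDays: 30,       // disclosed retention window
  });
}

// "chat-consent-button" is a hypothetical element id on your site.
document
  .getElementById("chat-consent-button")
  ?.addEventListener("click", onConsentGranted);
```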
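For the point about limiting access, the gate usually looks like a role check that sits between your team and the transcripts. The roles and record shape below are assumptions made for the sake of the sketch; the takeaway is that transcript access is an explicit allow-list, not a default.

```typescript
// Hypothetical role-based gate over chat transcripts.
type Role = "support" | "marketing" | "finance" | "admin";

interface TeamMember {
  name: string;
  role: Role;
}

// An explicit allow-list: only roles with a valid business need can
// read raw transcripts; everyone else is denied by default.
const TRANSCRIPT_READERS = new Set<Role>(["support", "admin"]);

function getTranscripts(member: TeamMember, transcripts: string[]): string[] {
  if (!TRANSCRIPT_READERS.has(member.role)) {
    throw new Error(`${member.name} (${member.role}) has no transcript access`);
  }
  return transcripts;
}

// Usage: support gets the data, marketing does not.
const logs = ["Visitor asked about sizing", "Visitor asked about refunds"];
console.log(getTranscripts({ name: "Sam", role: "support" }, logs)); // ok
// getTranscripts({ name: "Pat", role: "marketing" }, logs);         // throws
```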
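And for the research-database idea, here’s a sketch of stripping PII out of chat records before they ever land in your analysis set, then aggregating what’s left. The record shape is invented for illustration – your chat platform’s export format will differ.

```typescript
// Hypothetical record shapes: PII fields simply never get copied
// into the research data set.
interface RawChatRecord {
  name: string;      // PII: dropped
  email: string;     // PII: dropped
  ipAddress: string; // PII: dropped
  topic: string;     // kept for research
  resolved: boolean; // kept for research
}

interface ResearchRecord {
  topic: string;
  resolved: boolean;
}

// Copy only the non-identifying fields.
function deidentify(record: RawChatRecord): ResearchRecord {
  const { topic, resolved } = record;
  return { topic, resolved };
}

// Aggregate the de-identified records: count how often each topic
// comes up, with no way to trace a count back to an individual.
function topicCounts(records: RawChatRecord[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const { topic } of records.map(deidentify)) {
    counts.set(topic, (counts.get(topic) ?? 0) + 1);
  }
  return counts;
}
```

Notice that the research set never contains a name, email address or IP address in the first place – if it’s never stored, it can’t be breached.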
RESPONSIBLE MARKETING
While the context we’re discussing here is chatbots and chat features, these guidelines for mitigating risk apply to just about any software you choose to use on your website and in your business. There are ways of collecting the information that helps you make better, more data-driven business decisions without putting personal data at such high risk of exposure.
There will always be risk, of course. The more factors and components you introduce, the more risk you choose to expose yourself to. This is why you want to take every opportunity available to you to mitigate risk and minimize your points of vulnerability.
Is it a pain to be this flipping vigilant about everything? Yes, of course it is. And that’s the point! Not to get all comic book hero on you but with great power comes great responsibility and information is power; the more information you’re collecting, the more responsibility you have to safeguard it.
It’s kind of like buying a dog: you have one dog you’re responsible for taking care of – feeding it and making sure it doesn’t escape through the fence – and then you get another dog, which also needs to be fed, which you also have to keep from escaping, and which you now also have to make sure doesn’t bother the first dog too much. You’re not giving yourself less work, or even the same amount of work, by adding another dog; it’s more to do, more risk you’re taking on, and more to account for.
And here’s the clincher: you ARE expected to account for your use of these platforms. A software platform may advertise itself as being privacy-compliant – and it probably is when used “straight out of the box,” because software companies WANT to remove any doubts you may have about using their products, and adhering to privacy legislation is a big one – but you’ll often find that these platforms allow you to change the settings, and you may not realize that in changing those settings, you’re also changing your compliance. Regardless of whether we’re talking about chatbots or some other software, always read through HOW your software says it’s privacy-compliant to make sure that you’re not unknowingly un-doing your compliance.
Because – and I say this a lot, but – you’re the boss, apple sauce!
Talk soon, baiiieeee!!