Did you know that your tech devices have the potential to break your clients’ confidentiality just by being in the counseling setting with you? Imagine that you have worked a full day seeing an array of clients for the various concerns they are facing. Then, at the end of the day, you snuggle up on the couch and scroll through your phone’s applications. You notice numerous ads and suggestions that relate to the topics clients have shared. For instance, imagine a client sharing about a traumatic event that happened in the Catskills, and now you have Airbnb suggestions for that area, along with resources for dealing with sexual abuse.

You may be wondering, “How did that happen? Was my phone listening to our session?” The answer might be yes.

In other cases, you might never become aware that your phone was listening, but it is important to know that it has that capability. The reason is the voice assistant technology on your devices. While turned on, these devices are constantly listening. For instance, an Apple iPhone listens for the phrase “Hey Siri”; anything said after that is treated as a command. The same is true of Amazon’s voice assistant Alexa and of Google Assistant. Each of these devices is waiting for its name to be called so that it can follow up with whatever assistance the person using it desires.

However, these devices sometimes mistake other words for their wake words and are activated unintentionally.

This past July, The Guardian newspaper shared shocking reports from an Apple contractor. This whistleblower reported that Apple contractors “regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of [Apple contractors’] job providing quality control.” These workers are tasked with listening in order to grade the responses of the company’s Siri voice assistant. For example, the workers grade whether Siri’s activation was accidental or deliberate and whether Siri’s response was appropriate.

But what does this mean for professional counselors? Just think invasion of privacy and breach of confidentiality concerns.

Voice assistant concerns in the counseling setting

This next section presents a hypothetical counseling office to illustrate some of the confidentiality concerns that voice assistants raise in the counseling experience. Consider whether you address these concerns in your informed consent with clients. Would these occurrences align with Health Insurance Portability and Accountability Act (HIPAA) regulations?

Waiting room: Counselors strive to create a warm and inviting setting that fosters a sense of comfort for clients, who are in a vulnerable situation. Perhaps some relaxing music is playing in the waiting room. Consider an Alexa device programmed to shuffle through various playlists of calming songs throughout the day.

As clients await their sessions or wrap up afterward, they may need to discuss billing with the front-desk assistant or call their insurance companies. Clients may even take a call during this time for other purposes. Alexa hears all of these conversations throughout the day. The entrance to this “safe place” therefore has the potential to become a place where personal information is leaked to Alexa and to anyone who monitors Alexa or has access to its recordings.

Additionally, clients may not even realize that while they are in your office discussing billing, diagnosis, and plans moving forward, their smartphone’s voice assistant can be eavesdropping as well. The same goes for all of the other smartphones located in the waiting room, including those being used by personnel working the front desk.

In session: When clients and counselors meet in an office, safety is a concern. Therefore, counselors may choose to keep their phones in their pockets or nearby in case they need to call for help. Some sites may even have a policy requesting that counselors have their cellphones on them at all times. However, these phones’ voice assistants now have access to the dialogue that occurs within the room, and so does whoever is monitoring those voice assistants. What was intended to be a safe place for clients to navigate and process concerns is now compromised.

Can you imagine if you, as the counselor, were facilitating a group and each client had a smartphone with a voice assistant? Consider also if you take notes on an iPad that has voice assistant technology. As counselors, we understand there are some limits to confidentiality. However, these voice assistant technologies have the capability to leak what clients and counselors once believed to be confidential information.

 

Disconnect: Don’t be held liable

A number of considerations regarding the confidentiality of sessions need to be taken into account by both the counselor and the client when voice assistant technologies are present. First and foremost, the issue must be addressed. Now that you are aware of the implications for your practice, you are ethically responsible for discussing these possibilities with your clients.

According to the 2014 ACA Code of Ethics, clients have the right to confidentiality and an explanation of its limits (Standard A.2.b.). Understanding these limits, clients have the right to make an informed decision regarding whether they would like to participate in counseling services with you (Standard A.2.a.).

Therefore, if you choose to utilize voice assistant technologies, you need to inform clients of the benefits and risks before they begin counseling services. This explanation should cover not only your own use of these technologies but also whether the counseling site allows its staff or clients to use them. If your site chooses not to utilize voice assistant technologies, you will need to explain your protocol for this matter. For instance, will all cellphones be turned off? How will this be regulated?

What if your site requires cellphones for safety concerns or if clients are not willing to turn their phones off? How can you still protect client confidentiality and be in alignment with HIPAA regulations? The simple answer is to turn off your voice assistant technologies. You might consider noting the confidentiality risks in your informed consent and then sharing some of the directions noted below for how to disable these technologies.

 

*****

 

For iPhones and iPads, to turn off Siri, complete the following directions:

1) Open Settings.

2) Tap Siri & Search.

3) Toggle off Listen for “Hey Siri.”

4) Toggle off Press Home (or Side Button) for Siri.

5) Toggle off Allow Siri When Locked.

 

*****

 

To turn off “Hey/OK Google,” complete the following directions:

1) Open your settings.

2) Under Google Assistant, tap Settings again.

3) Under Devices, tap Phone.

4) Turn OFF Access with Voice Match/Assistant.

 

*****

 

For Amazon Alexa, to keep your voice recordings from being used to improve Amazon’s services, complete the following directions in the Alexa app:

1) Open your settings.

2) Select Alexa Privacy.

3) Tap Manage How Your Data Improves Alexa.

4) Turn “Help Improve Amazon Services and Develop New Features” OFF by tapping the switch.

5) Confirm your decision.

 

*****

 

These steps can provide clients with a choice while also informing them of the risks of their choices. In group counseling, however, as a safeguard to clients’ confidentiality, I would recommend not allowing clients to keep their cellphones, iPads, or any other voice assistant technologies on.

Because these devices travel with us nearly everywhere we go, our conversations are monitored for product improvement, and in the process, our confidentiality is breached. Currently, with some simple options for turning off these technologies, clients can continue to maintain the level of confidentiality to which they originally thought they were agreeing.

As counselors, we take many safeguards to protect our clients’ confidentiality. I encourage you to toggle off your voice assistant technology options to keep your devices from being the reason you are held liable for breaking confidentiality. Moving forward, as technologies continue to transform, we as counselors need to be ready to address implications in the counseling setting.

*****

Nicole M. Arcuri Sanders is a licensed professional counselor, national certified counselor, approved clinical supervisor, and core faculty member in the School of Counseling and Human Services at Capella University. Contact her at Nicole.ArcuriSanders@capella.edu.

 

*****

Opinions expressed and statements made in articles appearing on CT Online should not be assumed to represent the opinions of the editors or policies of the American Counseling Association.
