
Alexa, Siri, Bixby, Google’s Assistant, and Cortana testifying in court

Vol. 74, No. 1 / Jan.-Feb. 2018

Robert D. Lang1 and Lenore E. Benessere2

Nearly 100 years ago, when Judge Cardozo famously commented, “law never is, but is always about to be,”3 he could not have anticipated the concept of “virtual assistants” like Amazon’s Alexa, Apple’s Siri, Google’s Assistant, Microsoft’s Cortana, or Samsung’s Bixby. Yet, his quote perfectly sums up the new frontier in law as more and more people integrate speech recognition technology into their everyday lives.

Although “speech recognition” may sound like a lofty term, it simply refers to what most of us do daily when we use our voices to ask our phones to dial our friends, our cars for directions, and our speakers to play our favorite songs. Speech recognition is “the ability to speak naturally and contextually with a computer system in order to execute commands or dictate language.”4 Technology rivals are hard at work creating irresistible versions of easy-to-use devices with which we can talk and have our questions answered.5 For the most part, this technology has become so good that a simple command, or “wake word” (“Alexa!”), allows us to ask our virtual assistants a host of questions, from today’s weather to the identity of the fifth President of the United States.6

Amazon, maker of the Echo (Alexa), a hands-free speaker you control with your voice, touts that the Alexa Voice Service, which is integrated into the Echo, is “always getting smarter.”7 When you interact with Alexa, she streams audio to the cloud. Amazon’s Terms of Use for the Echo duly notifies users that “Alexa processes and retains your Alexa Interactions, such as your voice inputs, music playlists, and your Alexa to-do and shopping lists, in the cloud to provide and improve our services.”8

For most people, their virtual assistants’ ability to always be listening for their “wake words” is helpful. When we are driving, this allows us to complete tasks hands-free, avoiding distractions, as well as moving violations. While we are making breakfast in the morning, contemplating getting to work or to court on time, we can ask Alexa how long the morning commute will take. Alexa also allows us to use voice commands to turn on the light while walking into a dark room, without having to search for the light switch.

Many large tech companies believe that voice commands and intelligent assistants will become the primary ways in which people interact with technology, possibly even more significant than touch screens and keyboards.9 Voice control has rapidly evolved from a quirky and interesting technology to a “must have” capability in new devices.10 Virtual assistants are being seamlessly adopted and appear to be here to stay.11 Microsoft reports that Cortana, launched in 2014, now has 145 million users and has handled 18 billion tasks.12 Apple claims it has reached two billion Siri interactions each week, with [an estimated] 41.4 million currently active users….13

This “space age” technology sounds great. However, if you believe that all artificial intelligence designed to serve us can do us no harm, just consider any number of science-fiction movies, which now seem more real than fiction, where humans are nearly done in by artificial intelligence machines, which were created with the intent of serving, not harming, us.14

Defense attorneys are among those lawyers who should consider how they can use virtual assistants’ recordings to shed light not only on how accidents occur, but also to challenge plaintiffs’ personal injury claims. The recent Arkansas prosecution of James Bates for the murder of his friend, Victor Collins, who was found dead, floating face-up in Mr. Bates’ hot tub, sparks debate regarding the first of these issues: Can Alexa actually record a murder or, in the personal injury context, an accident?15 In Bates, the prosecution asked Amazon to disclose recordings from Mr. Bates’ Amazon Echo.16 Amazon refused, citing privacy concerns.17 Ultimately the issue went unresolved, without any ruling on Amazon’s privacy position, when Mr. Bates voluntarily turned over the recordings.18

While the Bates case does not resolve the constitutional issue of whether Amazon may use the First Amendment’s protection of free speech to refuse to disclose the recordings gathered by our Amazon Echoes, it does highlight the fact that users have access to their recordings and, therefore, can willingly disclose them. Amazon’s Alexa App keeps a history of the voice commands that follow the wake word (“Alexa!”). Specifically, in response to a user’s question, “Can I review what I have asked Alexa?”, Amazon states “Yes, you can review voice interactions with Alexa by visiting History in Settings in the Alexa App. Your interactions are grouped by question or request. Tap an entry to see more detail, provide feedback, or listen to audio sent to the Cloud for that entry by tapping on the play icon.”19

We also know that Alexa can record events, such as a crime or an accident, because the Echo is equipped with seven microphones that use beam-forming technology and enhanced noise cancellation. The Echo Look model adds a camera, though it remains off until a user activates it, by asking Alexa or by using the Echo Look App, to take a photo or video or to use live preview. Alexa’s evidentiary value can also be found in the circumstantial evidence she can provide regarding a plaintiff’s day-to-day life, which can assist defense attorneys preparing for depositions and trial. Questions and commands from parties to their virtual assistants can provide valuable information regarding the places parties have visited since the alleged incident, their hobbies, and the activities in which they are involved.

For example, by knowing that plaintiffs have asked their virtual assistants about the commute times to a certain office building, or their requests for Alexa to hail an Uber for them, defense counsel can ask more targeted questions during depositions, including whether plaintiffs have worked since the accident or have traveled or gone on vacation. The value of these records can be immeasurable, given the wide array of commands to “virtual assistants,” including giving definitions of new terms and phrases (“Siri, what is meant by ‘Big Data’?”); playing music (“Alexa, play songs by the Judybats”); assisting in recreation (“Siri, where can I play court tennis in the United States?”); (“Alexa, how can I play golf at High Ridge Country Club?”); answering any number and variety of factual questions (“Siri, for which projects has Sciame Construction won awards?”); (“Alexa, which famous people are named ‘Oona’?”); (“Cortana, how do I apply for a Fulbright Scholarship in The Netherlands?”); (“Bixby, what were the ‘moral imperatives’ in ‘Real Genius’?”); (“Siri, who are the leading female poets in New York City?”); (“Google, which movie directors live in Brooklyn?”); (“Bixby, when did the Beach Boys record ‘I Get Around’?”); (“Siri, who is Phil Ochs?”); (“Cortana, how did Holly Golightly in ‘Breakfast at Tiffany’s’ support herself financially?”); going to events (“Bixby, where is the Songwriters Hall of Fame located?”); and securing prices for travel (“Alexa, ask Kayak how much it costs to fly from New York to Easter Island.”). Virtual assistants can also set up timers and alarms, thereby providing defense counsel with valuable information regarding when a person gets up in the morning and their appointments during the day. Alexa can even be used to begin a workout (“Alexa, ask Random Workout to pick a workout.”), which can be significant in those cases where plaintiffs claim to have sustained substantial physical limitations as a result of an accident.

It cannot be overstated how valuable this information can be in gaining insight into a plaintiff’s everyday activities, which are often an essential element of personal injury claims. Defense attorneys know that, when it suits their interests, plaintiffs often do not volunteer a wealth of information regarding their past day-to-day activities. Armed with a compendium of plaintiffs’ virtual assistants’ searches, however, defense counsel can refresh plaintiffs’ recollections regarding what they did on a certain day, even whether they tried to call 911 for help,20 thereby leading to more effective and meaningful questioning. This may be especially helpful if plaintiffs are trying to conceal their actual lifestyles. Like Facebook photos from a vacation, Alexa can be used to expose those plaintiffs who fail to testify truthfully and candidly regarding their injuries and their ability to carry on the activities of daily life.

The other side of the coin is that ethically challenged plaintiffs can conceivably use their virtual assistants strategically, for example, by asking Alexa for information that would tend to validate their false narratives. Someone who is essentially physically fine but nevertheless eyeing a potential personal injury suit as a result of an accident may be tempted to ask, “Alexa, add knee brace, cervical collar and Aleve to my shopping list” – personalized “fake news,” if you will. As it is, this past July, British security researcher Mark Barnes warned of a technique that can be used to install malware on an Amazon Echo that would silently stream audio from the hacked device to a faraway server, in essence tapping the Echo.21 Because AI assistants can provide a “real time” autobiography, malware of this kind could expose that autobiographical information to people who were never intended to have access to it. Forewarned is forearmed.

Amazon Echo is not the only piece of technology that can alter the way we practice by collecting valuable information. Other smart devices, including the pedometer feature on our iPhones and on Fitbits, can also provide valuable information regarding a person’s fitness level, including the number of steps a person takes and when they take them. This data, like other documentary evidence, is likely to be more accurate and informative than deposition testimony, which is taken only after a preparation session with an attorney and relies on a person’s memories of events that may have occurred years before the deposition. Opposing counsel devote considerable time and effort to obtaining definitive answers from plaintiffs, pinning down physical fitness regimens before and after an accident, with some plaintiffs testifying that they used to run a 5K every weekend and now run “less,” “not as much,” or “not at all.” Plaintiffs have been known to respond to these probing questions at depositions with limited or vague answers. Now, however, the raw data from these devices can provide information that defense counsel can analyze to accurately determine plaintiffs’ actual fitness levels before and after an accident. Defense experts can also use this definitive information from plaintiffs’ devices to construct a baseline from which they can assess a plaintiff’s physical changes, pre- and post-accident.
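
By way of illustration only, the short sketch below shows how simply such a pre- and post-accident baseline could be computed once raw step-count data is produced in discovery. It is a minimal sketch in Python, assuming a hypothetical CSV export with “date” and “steps” columns; the file name, column names, and accident date are illustrative assumptions, not features of any actual device or vendor export.

    # Hypothetical sketch: compare average daily step counts before and
    # after an accident date, using an assumed CSV export with
    # "date" (YYYY-MM-DD) and "steps" columns.
    import csv
    from datetime import date
    from statistics import mean

    ACCIDENT_DATE = date(2017, 6, 1)  # hypothetical accident date

    def load_daily_steps(path):
        """Read (date, steps) pairs from the assumed export file."""
        with open(path, newline="") as f:
            return [(date.fromisoformat(row["date"]), int(row["steps"]))
                    for row in csv.DictReader(f)]

    records = load_daily_steps("step_history.csv")  # hypothetical file name
    before = [steps for day, steps in records if day < ACCIDENT_DATE]
    after = [steps for day, steps in records if day >= ACCIDENT_DATE]

    print(f"Average daily steps before accident: {mean(before):,.0f}")
    print(f"Average daily steps after accident:  {mean(after):,.0f}")

Even so simple a comparison, run against data the plaintiff’s own device recorded contemporaneously, can frame deposition questions far more precisely than memory-based testimony ever could.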

Using the discovery process to obtain data from plaintiffs’ Alexa or Cortana will most certainly be met with opposition from plaintiffs’ counsel, on the grounds of privacy and prejudice. That opposition will continue until the law begins to develop parameters regarding this type of discovery. However, since those who turn on virtual assistants presumably know, or should know, how they work, they should not be heard later to complain when the devices perform as advertised. One self-help solution is to unplug AI assistants when we do not want Siri or Alexa to overhear and record what is being said in their presence, something most people have not been doing.22 Keeping virtual assistants unplugged unless or until needed, or taking the precaution of unplugging them before engaging in conversations intended to be confidential, may become common practice, if not also good common sense.23

The situation becomes more problematic when it involves guests in someone’s home who do not realize they are being recorded.24 Upon entering a person’s house or apartment, are we now expected to ask whether their virtual assistant is on, listening to and recording every word we say? Expectations of privacy are therefore changing. As attorneys bring these issues before the courts, judges will weigh defendants’ right to mount a proper defense against plaintiffs’ important right to privacy. In doing so, courts will determine whether plaintiffs have true expectations of privacy in the data and recordings of their smart devices when they have put their physical conditions at issue in personal injury litigation. Simply put, why should the information collected by virtual assistants be treated any differently in discovery than the information contained in personal diaries or cell phone data? To state the proposition is to reject it.

Attorneys who understand the potentially valuable information these devices can provide to their clients should begin to question adversaries about them during discovery and be ready to defend their own witnesses against cross-examination when they are called upon to testify. When appropriate, counsel should also seek rulings on disclosure of this information if opposing counsel object to providing it. Significantly, Amazon Echo users can delete their voice recordings, which are stored in the History section of the Alexa App. Some plaintiffs, with or without the advice of counsel, may therefore log on to Amazon.com/myx, find their Echo, and delete old voice recordings.25 Deletions can also be made on Google’s Assistant.26 Knowing this, it will be prudent for defense counsel to serve opposing parties, at the beginning of litigation, with a demand for the preservation of evidence, requesting that plaintiffs retain that information in the Cloud and not dispose of any recordings in the History section of the Alexa App, a fair quid pro quo for plaintiffs’ demands for the preservation of any CCTV footage believed to have captured an accident, often served at or prior to the commencement of a lawsuit. To obtain that information, authorizations directed to Amazon, Apple, Google, Microsoft and Samsung should also be requested, in order to access relevant data from Alexa, Siri, Google’s Assistant, Cortana and Bixby, respectively.

Moreover, the evidentiary value of this newly available information extends well beyond casualty litigation to any area of law where liability hinges on proof of what someone said or knew, and when they said or knew it, thereby encompassing all practice areas. For just one example, information from virtual assistants will be valuable to attorneys handling securities fraud and insider trading cases, as Alexa can be the “fly on the wall,” overhearing conversations regarding which stock to buy or sell and when. Attorneys litigating sexual harassment or Title VII discrimination cases that turn on the context of what was said behind business decisions will benefit heavily from this technology, which provides unvarnished insight into previously disputed “he said/she said” conversations. Trademark, copyright, and patent attorneys, piecing together the origination of ideas, may also find useful the data that virtual assistants can now make readily available.

Information from digital assistants is not limited to the United States and now has direct application to cases and litigants worldwide. Just this past October, Yandex, the largest search engine in Russia and often referred to as the Russian equivalent of Google, introduced Alice, its first conversational intelligent assistant.27 This new girl in town is touted as the most capable Russian-language assistant of its kind.28 Accordingly, the search for information acquired by virtual assistants will soon cross borders, reaching any country where AI assistants are located.

As Kyle Reese cautioned Sarah Connor in the first “Terminator” movie, “Listen and understand. That terminator is out there. It can’t be bargained with. It can’t be reasoned with. It doesn’t feel pity, or remorse, or fear.” So, too, digital personal assistants are out there, and they can and will record our every statement, which can later be used as evidence by lawyers who understand and make full use of this new technology.29 The next generation of artificial intelligence platforms may provide attorneys access to even more information that previously was assumed to be private and non-discoverable.

Whether in the board room, the living room, the proverbial “smoke-filled room” or the office, the defense of “plausible deniability,” used conveniently when confronted with previously hard-to-prove facts, will now be less successful in avoiding disclosure of what actually took place. If, during conversations intended to be secret, Siri, Alexa, Google’s Assistant or Cortana are present in the room, unobtrusively sitting on a table, bookshelf or mantel, silently listening to and recording all that is being said, it will be far harder for the participants in that meeting to later deny what was said, when, and by whom.

Remembering Judge Cardozo’s remark that “law never is, but is always about to be,” and Chief Justice John Roberts’ comment this past July that “advancing technology poses one of the biggest challenges for the Supreme Court,”30 forward-thinking attorneys should not shy away from putting these issues before the courts, as attorneys and judges (perhaps with the help of AI devices) together grapple with this new technology, directly applicable in today’s world, both real and virtual. We are now at the start of an era in which previously unavailable data can be accessed, become discoverable, and later be introduced into evidence. Attorneys who fail to recognize this will be left behind.

Reprinted with permission from New York State Bar Association Journal, November/December 2017, published by the New York State Bar Association, One Elk Street, Albany, NY 12207.

Endnotes

1 Robert D. (“Bob”) Lang (RDLang@damato-lynch.com) is a senior partner at the firm of D’Amato & Lynch, LLP in New York City, where he manages the Casualty Department.

2 Lenore E. Benessere (LBenessere@damato-lynch.com) is an associate of the firm. The authors would like to thank paralegal Megan Kessig for her help, and Alexa, Bixby, Siri, Google’s Assistant, and Cortana for their assistance.

3 Benjamin Cardozo, Lecture, The Nature of the Judicial Process, Yale Law School (1921).

4 Stacey Gray, Always On: Privacy Implications of Microphone-Enabled Devices, Future of Privacy Forum (April 4, 2017).

5 George Anders, “Alexa, Understand Me,” M.I.T. Tech. Rev. (Aug. 9, 2017).

6 Alexa tells us that the fifth president was James Monroe.

7 Amazon, https://www.amazon.com/Amazon-Echo-Bluetooth-Speaker-with-Alexa-Black/dp/B00X4WHP5E/ref=sr_1_1?s=amazon-devices&ie=UTF8&qid=1500487675&sr=1-1&keywords=echo.

8 Amazon, Terms of Use, https://www.amazon.com/gp/help/customer/display.html/ref=hp_left_v4_sib?ie=UTF8&nodeId=201809740.

9 Nick Wingfield, Amazon Wants to Wake You Up With Alexa, and That’s Just the Start, N.Y. Times (Sept. 29, 2017).

10 Peter Nowak, Why It’s No Longer Strange to Talk to Your Home Appliances, The Globe and Mail (Oct. 2, 2017).

11 Adrian Cutler, The Virtual Revolution of the Digital Assistant, IT ProPortal (Oct. 5, 2017).

12 Artificial Intelligence, The Invincible Revolution That Can Change Everything, Latin American Herald Tribune (Oct. 16, 2017).

13 Martin Courtney, Alexa, Cortana, Siri, et al: Do Our Digital Assistants Hear More Than We Want Them To?, E&T Magazine (Oct. 13, 2017).

14 Consider the computer HAL 9000 in 2001: A Space Odyssey; Skynet in The Terminator movies; the computer WOPR in WarGames; the NS-5 robots in I, Robot; Gatekeeper in The Net; ARIA in Eagle Eye; GLaDOS in Portal; the Terrans in the X game series; SHODAN in the System Shock series; SID 6.7 in Virtuosity; X.A.N.A. in Code Lyoko; AI Omega in Red vs. Blue; and the comely computer Ava (apparently no relation to Alexa) in Ex Machina.

15 Eliott C. McLaughlin and Keith Allen, Alexa, Can You Help With This Murder Case?, CNN (Dec. 28, 2016); Agatha French, Alexa May Be Listening, but Will She Tell on You?, L.A. Times (Jan. 5, 2017).

16 Hailey Sweetland Edwards, Alexa Takes the Stand: Listening Devices Raise Privacy Issues, Time (May 4, 2017); Jill Bleed, Alexa a Witness to Murder? Prosecutors Seek Amazon Echo Data, Yahoo.com (Dec. 27, 2016); Sarah Buhr, An Amazon Echo May Be the Key to Solving a Murder Case, TechCrunch (Dec. 27, 2016).

17 Gerald Sauer, A Murder Case Tests Alexa’s Devotion to Your Privacy, Wired (Feb. 28, 2017).

18 Shona Ghosh, Amazon Handed Over Alexa Recordings to Police in a Murder Case, Business Insider (Mar. 7, 2017); Chris Perez, Amazon Abandons Legal Fight Over “Alexa Data”, N.Y. Post (Mar. 7, 2017).

19 Amazon, Alexa and Alexa Device FAQs, https://www.amazon.com/gp/help/customer/display.html?nodeId=201602230.

20 Jefferson Graham, “Alexa, Call 911” Won’t Work. Here’s What Will, USA Today (July 19, 2017).

21 Andy Greenberg, A Hacker Turned an Amazon Echo into a “Wiretap”, Wired (Aug. 1, 2017).

22 Siri and Alexa Are Spying on Us, Gotham Girl (Aug. 26, 2017).

23 In reaching to turn off virtual assistants, some may have flashbacks to any number of science-fiction books and movies, such as 2001: A Space Odyssey by Arthur C. Clarke and Stanley Kubrick, where the artificial intelligence device resists and actively defends itself when people seek to turn it off. Our advice: notwithstanding your fears, summon your courage and turn off your virtual assistants when you do not want them listening to and recording what you are saying.

24 Alexa, How Much Is My Privacy Worth?, Salem News (Aug. 1, 2017).

25 Jake Swearingen, Can an Amazon Echo Testify Against You?, N.Y. Mag. (Dec. 27, 2016).

26 Kevin Murnane, How to Delete the Recordings of Your Interactions With Alexa and Google Home, Forbes (Oct. 2, 2017).

27 Radu Tyrsina, Cortana Gets New Competition from Alice, Courtesy of Yandex, Windows Report (Oct. 4, 2017).

28 Brian Heater, Yandex Introduces Alice, an Alexa-Like Assistant That Speaks Russian, TechCrunch (Oct. 11, 2017); Pradeep, Yandex Releases Its Cortana Competitor in Russia, MSPowerUser (Oct. 11, 2017); David Reid, Russia Launches Its Own Version of Amazon Alexa With ‘Near-Human Levels’ of Speech Recognition, CNBC (Oct. 10, 2017).

29 Kaveh Waddell, The Privacy Problem With Digital Assistants, The Atlantic (May 24, 2016).

30 Nick Perry, In Overseas Remarks, Roberts Says Technology Poses Challenge for Courts, N.Y. L.J. (July 26, 2017).