Course Leader, MA Interaction Design Communication, London College of Communication, University of the Arts London, UK
Reference this essay: Goatley, Wesley. “The Voice of the Underworld: Magical and Political Narratives of the Smart Speaker.” In Language Games, edited by Lanfranco Aceti, Sheena Calvert, and Hannah Lammin. Cambridge, MA: LEA / MIT Press, 2021.
Published Online: March 15, 2021
Published in Print: To Be Announced
ISBN: To Be Announced
Repository: To Be Announced
This essay interrogates the use of language in describing machine learning-enabled smart speakers and voice assistants, such as the Amazon Echo and its Alexa technology, arguing that particular dominant narratives obfuscate the deeper function of these devices, an obfuscation compounded by the limits of access inherent to their design and operation. This critique will begin by assessing claims made by powerful actors in the smart speaker market regarding these technologies, and how choices in their language can be seen to strategically endow these devices with near-supernatural capacities. Drawing on Judith Williamson’s study of magic as a referent system in advertising, I will argue that this language conjures promissory visions that compound the limits of access inherent to these devices, and that it serves practices of data exploitation and attempts at market dominance. As a propositional response to these conditions, I will examine my own installation artwork The Dark Age of Connectionism: Captivity, which explores the machinic voice in both text-to-speech and speech-to-text technologies, as an intervention upon the narratives examined in this essay. Through this, I will demonstrate the value in challenging these conditions, and how such challenges offer new avenues for knowledge production in this context. I will conclude by arguing that, when interrogating these technologies and the limits of access inherent to them, language is particularly potent both as a site of examination and as a tool in critical artistic practice.
Keywords: Smart speaker, smart assistant, voice-user interface, data art, critical data studies, artistic research
Limits of Access
It is an interesting moment when the interrogation of technology takes on a more literal form, as is possible with the ‘smart speaker’ range of consumer devices such as the Amazon Echo and Apple HomePod.  These networked devices are increasingly common examples of voice-user interface technologies, and are primarily concerned with language in two dimensions: both the language of the user interacting with the device, and the language of the ‘smart assistant’ the user is supposedly interacting with, such as Amazon’s ‘Alexa’ or Apple’s ‘Siri’. Both voices are becoming increasingly pervasive in consumer technological contexts; Siri is now a component of all Apple operating systems across their entire range of devices and Amazon have made arrangements with a range of manufacturers to produce Alexa-enabled cooking appliances, light switches, bathroom fittings, and door locks, alongside their own Echo range of products. 
In spite of the increasing presence of these voices, critical examination of smart speaker devices and their smart assistants is problematized by the limits of access inherent to their design and function. An initial point of obfuscation is the form of the smart speakers themselves; products such as the Amazon Echo and Apple HomePod feature minimal points of user interaction, and a trend towards seamless design principles which obscure crucial components such as microphones. That this lack of access is intentional is made explicit by the fact that devices such as the Amazon Echo are not user-serviceable, with the warranty being declared void under conditions of repair or alteration by its owner. 
Even when these exteriors are penetrated, the networked nature of these devices provides a substantial obstacle to a critical interrogation of their function. A typical use scenario begins with a ‘wake word’ (e.g. “Alexa” or “Hey Siri”) being detected by the smart speaker, triggering the immediate broadcasting of the subsequent user request over the Internet to a data center. The content of the request is then analyzed through a speech-to-text process, and a potential response is generated. The reply is then synthesized in the assistant’s voice and sent back over the Internet to emerge from the device, suggesting a seamless, local performance.
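For readers interested in the mechanics of this performance, the interaction flow described above can be sketched in pseudocode. This is a minimal, purely illustrative sketch; all function names are hypothetical, and the actual on-device and cloud-side implementations are, as discussed below, proprietary and inaccessible:

```python
# Illustrative sketch of the smart speaker interaction loop:
# audio is monitored locally, and only an utterance beginning with
# a wake word is sent over the Internet to a data center, where
# speech-to-text analysis, response generation, and voice synthesis
# all take place off-device.

WAKE_WORDS = ("alexa", "hey siri")


def detect_wake_word(transcript: str) -> bool:
    """Local, on-device check: does the utterance begin with a wake word?"""
    return transcript.lower().startswith(WAKE_WORDS)


def cloud_round_trip(request_text: str) -> str:
    """Stand-in for the remote pipeline: the request is analyzed,
    a response is generated, and a synthesized voice is returned."""
    return f"[synthesized voice] Response to: {request_text!r}"


def handle_utterance(transcript: str):
    """Only wake-word-prefixed speech leaves the device; anything
    else is (nominally) discarded locally and returns None."""
    if not detect_wake_word(transcript):
        return None
    return cloud_round_trip(transcript)
```

The salient point of the sketch is structural rather than technical: everything of consequence happens inside `cloud_round_trip`, a function whose body the user cannot inspect.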
Examining this performance and its technical details becomes impossible for those outside the manufacturers themselves when the algorithms and machine learning technologies which underpin such capacities are closely-guarded intellectual property. Even the data centers themselves limit access, being located on well-secured private property, often buried underground to reduce the cost of cooling this already incredibly energy-consumptive infrastructure, or surrounded by armed guards and fences.
What is known is that Apple’s and Amazon’s terms of service each make it clear that they can legally use the voice recordings sent to their data centers through this process towards further, unspecified ends and retain them for undisclosed lengths of time. Terms and conditions such as these are heavily critiqued in fields such as critical data studies for the forms of data exploitation and threats to privacy they allow, and the disproportionate risks posed to vulnerable or marginalized groups. The breadth of Amazon and Apple’s data collecting practices has drawn particular scrutiny from privacy campaigners and regulatory bodies, amongst others. The value of such data (and the analytics thereof) to a company like Amazon is perhaps best exemplified by a statement from their former chief scientist that “it’s like an arms race to hire statisticians nowadays.”
In the context of smart speakers, voice data has the potential to become a new front in the exploitation of personal data at the corporate, state, and geopolitical levels. For example, given that internet communication passing through US soil falls under the telecommunications jurisdiction of that country, and that Amazon and Apple operate data centers in the US, users of these devices could find the sentiment and content of their speech to be a factor in a successful border crossing into that country (or potential forced extradition from it). Such speculation is justifiable when border restrictions are already imposed through examination of other forms of personal device data.  With secondary searches and examinations already disproportionately targeting marginalized communities at international borders such as the US,  Finland  and Russia,  these groups are among those most at risk from this new vector of violence.
There are many other examples of governments explicitly seeking and achieving access to personal data such as this, as evidenced in the data leaked by the NSA whistleblower Edward Snowden in 2013,  even while members of government security agencies such as GCHQ have accused the tech industry of far more invasive forms of data collection and surveillance.  This debate aside, what is far more certain is that there has been a marked rise in the last ten years in recorded requests from international law enforcement agencies for access to personal data from companies such as Google,  Amazon  and Apple,  including requests made for recordings from smart speaker devices such as the Amazon Echo.  This creates another vector of critical concern for marginalized groups in countries such as the US and UK, where substantial markets for smart speakers exist alongside well-documented policing biases regarding the searching of ethnic minority and Black communities. 
Given these conditions, the synthesized voice is best understood not simply as an assistant who lives in your home, but as a voice that emanates from the literal underworld of the data center; a place where, in a perverse exchange, the voice of the human user is stored, processed and analyzed towards unknown, yet potentially violent, ends.
Given these limits of access and their potential ramifications, a critical examination of these technologies and the work that they do is both pertinent and challenging. There is, however, a third form of language concerning these technologies that is both accessible and inextricably tied to their function: that of how they are described by their manufacturers through their advertising, which attempts to inform the ways these devices are understood in the world and the abilities they are perceived to possess.
In what follows I will examine online advertisements from Amazon and Apple regarding their Echo and HomePod products, respectively, towards exploring what forms of knowledge and methods of critique can be proposed in reflection upon them. While other products and manufacturers such as Google’s ‘Google Home’ and Microsoft’s ‘Cortana’ assistant have a visible presence in this market, I have chosen to analyze Amazon and Apple’s advertisements in this essay, given that these two companies have the largest global market share in smart speaker sales and smart assistant users respectively. It is my hope that this study may prove of use to further scholarship in this area, particularly in the similarities and divergences between these products and those originating in non-Western markets such as China, given the rapid international growth of Baidu’s smart speaker and voice assistant technologies. 
The Myth, and Strategy, of Smartness
Like other applications of machine learning technologies, the smart speaker field of products is permeated with hype and hyperbole. For example, the term ‘smart’ is a central component of how both Apple and Amazon frame the capacities of the HomePod and Echo; it prepends both the speaker and the assistant, despite the fact that neither can be said to truly possess intelligence, knowledge, wisdom, or similar properties that might constitute a smart human. This is a condition endemic to the contemporary use of the term ‘smart’ to describe other non-conscious objects such as phones, locks, and bombs. It has a strategic property when espousing the capacities of algorithms, artificial intelligence, and cities, the latter being a site where the term comes under particular criticism. The political and economic geographer Alberto Vanolo responds to the lack of specificity for what makes a city ‘smart’ by framing the term as an “evocative slogan lacking a well defined conceptual core,” an explicitly tactical gesture through which “proponents of the smart city are allowed to use the term in ways that support their own agendas.” Given this, it is notable that neither Amazon nor Apple provide definitions for the term ‘smart’ in the description of their smart speakers, smart assistants, or the ‘smart homes’ they claim to interface with.
Halpern, Mitchell and Geoghegan chart the use of ‘smart’ in technological contexts from the twentieth century onwards, arguing that it is inherently entangled with discourses of technologically-mediated control and governmentality, where “the agents of this smartness often remain obscure.” The authors argue that the source of this ‘smartness’ is not in devices, but populations; through “the algorithmic manipulation of billions of [data] traces left by thousands, millions, or even billions of individual users” that have fed contemporary developments in machine learning technologies, among others.
The value of this data to companies such as Amazon is suggested by supply-side analytics estimating that the Echo was produced, and its infrastructure run, at a considerable loss when it was first released in order to establish a user base from which to collect data.  Their claim that the first-generation Echo would be “always getting smarter” through its interactions with users  was therefore built upon a need to convince audiences of the value of the object and its ‘smart’ capacities at launch. This creates a seemingly paradoxical circumstance: the promise of smartness is a discursive strategy to enable the gathering of data needed to perform the very capacities that Amazon claim make the Echo ‘smart.’
This strategy would appear to be working as intended, given a recent press release that claims “Alexa is now even smarter.” It is notable that there is no quantification or description of this increase in smartness, and the subject of this claim is not the device, but the abstract entity of ‘Alexa.’ This frames Alexa as an individual, one whose increasing capacities are presented here without a seeming limit, or end, to ‘her’ continuing development.
The narratives of smartness as a strategic tool and the boundless capacities of the smart assistant can be seen at work in Apple’s framing of their HomePod smart speaker as one whose capabilities exceed those of its human users, bordering on the omniscient. Central to their HomePod’s website is their claim that the device is “the ultimate music authority” through “the intelligence of Siri.” At the core of this is the voice command “Hey Siri, play something I’d like,” which Apple describe as producing a playlist of “your favorite songs, as well as new tracks similar to the ones you like” when the HomePod user also has a subscription to their Apple Music service, one that they claim “unlocks virtually every song you can imagine.” 
The authority Apple are asserting through this description both extends the capacities of the device into the realm of predicting musical taste, and promotes a second product: Apple Music. Such ‘branded music experiences’ tied to algorithmic authority have been proposed as being explicit mechanisms for data collection, behavioral influence, and market dominance. The primacy of market dominance here is further evidenced throughout the official HomePod website, where, in spite of Spotify having almost double the market share of Apple Music, the platform is only mentioned once, with far more visibility given to Apple’s own, rival, platform.
This attempt to expand market dominance through connecting users to platform ecologies has been noted as an increasingly prevalent strategy in Silicon Valley, as can be seen in Amazon’s description of the Echo as being “better with [Amazon] Prime,” their subscription service spanning their e-commerce site, music, and streaming video platforms. When the Echo’s capacities are claimed to be enhanced in this way, much as how an Apple Music subscription purportedly allows access to profound powers, smartness is a strategy with its roots in the banal and everyday reality of capitalism: that of the desire for market dominance, and the access to more data that this represents.
Apple’s framing of the HomePod as capable of knowing what you want before you do demonstrates a narrative of the supernatural that is consistently invoked in relation to the HomePod and the Echo. Apple claim their device is “an incredible listener,” that it “knows how to work a room,” and describe its chipset as an advanced ‘brain’; each phrase at once suggesting human-like capacities, while implying they may be, in fact, beyond those of the consumer. Amazon follows similar logics with claims that the Echo “hears you from any direction,” and that its “thousands of skills” will protect your home, amongst other feats. 
The connection of both of these devices with the narrative of the smart home serves to further extend the capacities of these devices into beyond-human and supernatural territory, such as the telekinetic ability to “switch on the lamp before getting out of bed or dim the lights from the sofa…all without lifting a finger.”  Crucially though, these powers remain the domain of the device, not the user. For example, in the suggested voice control of a smart thermostat through the command “Hey Siri, make the room cooler,”  the phrase encapsulates a distinct separation between user and action; instead of ‘reduce the temperature of the Nest thermostat by three degrees’, the user is kept from both the mechanism of control and its granularity through the request. This twist in the narrative of the power of smartness is echoed in wider critiques of the ‘smart home’, which argue that it produces a passive and infantilized domestic subject who relies upon multiple corporate actors for the basic functions of the home.  This presence of power and the paradoxical denial of it to the user is borne out in the imagery on the Amazon Echo homepage, in which users are shown issuing commands, but the Echo itself is absent from the scene; it having seemingly become a ghost that now haunts the house. 
Magic, Spells, and Rituals
The supernatural overtones seen here are encapsulated by Apple’s description of the HomePod as “quick and magical”; suggesting both simplification for the user and the extra-human, supernatural prowess of the product. In the context of the advertisement of these products and the claims made by their manufacturers, Judith Williamson’s examination of the discourse of magic in advertising provides a compelling framework for analyzing this language. 
Williamson describes magic in the context of advertising as not simply the use of the literal imagery of sorcery, but a referent system that performs transformations of products; giving them powers that the consumer neither controls nor is required to understand, but accesses through them. A key factor in this is that magic itself is its own justification; the explanation of any magical act inevitably recourses to “it’s magic,” [emphasis original] denying the need for details, analysis, or justification. The strategic value of this lies in obfuscation: magic can be “used to misrepresent any system of production,” and through this, to “mis-represent our relation to the world around us.” [emphasis original] 
Williamson’s analysis is remarkably applicable to smart speakers, when “the act of saying the product’s name…is thus a spell which provides a short cut to a larger action, performed not by us but by the product.” In the context of smart speakers, these larger actions are the promise of the near-supernatural feats of the smart speaker seen so far: prediction, telekinesis, and omniscience. This devolution of control to the simplicity of the spell, or magic word, produces a paradox of disempowerment in the user, who accesses “a vast source of external power, though…we never produce or control the forces we have learned to tune in to.” The only control that remains is that “which we are given back in the surrogate form of spells and promises.” For smart speakers, the wake word is the spell that is given, and smartness is the promise that it holds.
Williamson’s study suggests a narrative one could call the ‘magic of smartness’ that both amplifies the obfuscations inherent to these technologies, and perpetuates a distance between the user and the mechanisms of the device’s function. Facets of smart speaker advertisement such as the outsourcing of your own music taste to the ‘ultimate authority’ of the HomePod and Apple Music, the infantilizing almost-control of the smart home, and the pervasively unqualified claims to smartness illustrate how this narrative allows for data collection to not only continue but expand, and further reinforces the device as one whose internal function and politics are kept at literal and metaphorical distance from the user.
Interventions and Demonstrations
To close this gap between the user and the device is to challenge this discourse of the ‘magic of smartness’, and expose the promises of the manufacturers to critique. Keller Easterling calls such acts the ‘re-designing of disposition’: exposing the reality of a situation against what is promised by probing “the ways in which power says something different from what it is doing.” In the context of smart speakers, exposing breakages and limitations in these devices defies the supernatural promises made by their manufacturers, and offers a challenge to the limits of access by provoking new opportunities for knowledge about them.
As an example of a propositional response to these conditions through practice, I will now examine my own installation artwork The Dark Age of Connectionism: Captivity,  which explored the machinic voice in both text-to-speech and speech-to-text technologies, as an intervention upon the narratives examined in this essay. I am far from alone in pursuing practice-based research in this field, which includes critical design practitioners  and groups such as Feminist Internet who, since 2018, have been hosting practice-based workshops critiquing conditions such as gender bias present in voice assistant technologies.  Though the earliest iteration of this work was exhibited in April 2017,  the iteration I will be discussing is a later development created through ongoing engagements with this field of enquiry.
The Dark Age of Connectionism: Captivity
The Dark Age of Connectionism: Captivity was an installation comprising a ring of seven microphones surrounding an Amazon Echo hanging above a small speaker, with additional speakers mounted in the ceiling above. The microphones were positioned in a way that explicitly signaled their presence to the audience, capturing sounds such as footsteps, the rustle of clothing, noises from phones, as well as speech. Whenever a sound was detected by the microphones, the voice of ‘Siri’ emanated from the small speaker to ask a question to ‘Alexa’, which the Echo attempted to respond to.
These questions were randomly selected from a list of over three hundred possible questions contributed by a wide range of artists, researchers, and smart speaker owners for this installation, with their content centered upon the present and historical context of the device. Examples of the questions included ‘Alexa, can law enforcement agencies gain access to your records?’, ‘Alexa, what’s your algorithm for vocal tract normalization?’, and ‘Alexa, how long a break do workers in Amazon warehouses get every hour?’. If a new sound was detected by the microphones while a question was being asked it would be interrupted by a new question, creating a constant stream of partial questions and half-responses between the two voice assistants.
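The installation’s triggering and interruption logic can be sketched as follows. This is an illustrative reconstruction only; the class and method names are hypothetical, and the three sample questions are drawn from those quoted above:

```python
import random

# Illustrative sketch of the installation's logic: any sound detected
# by the microphones triggers a randomly selected question, and a new
# sound interrupts whatever question is currently being voiced,
# producing the constant stream of partial questions and half-responses
# described above.

QUESTIONS = [
    "Alexa, can law enforcement agencies gain access to your records?",
    "Alexa, what's your algorithm for vocal tract normalization?",
    "Alexa, how long a break do workers in Amazon warehouses get every hour?",
]


class Installation:
    def __init__(self, questions):
        self.questions = questions
        self.current = None  # the question currently being voiced, if any

    def on_sound_detected(self):
        """A sound from the microphones interrupts any question in
        progress and begins a new, randomly chosen one. Returns the
        interrupted question (or None) and the new question."""
        interrupted = self.current
        self.current = random.choice(self.questions)
        return interrupted, self.current
```

In practice the full question pool numbered over three hundred; the sketch simply shows how interruption, rather than turn-taking, structured the exchange between the two synthesized voices.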
The questions themselves functioned as a method of exposing the limitations of these technologies as a counter to the ‘magic of smartness.’ The Echo could frequently not parse questions framed in conversational tones, in spite of the claims to being a conversational interface. This was not merely an artefact of the synthesized voice of ‘Siri’ asking the questions, as certain trends were notable; for example, questions that involved adding items to an Amazon wish list or shopping basket were more likely to be successfully interpreted by the Echo. Such events suggest that the speech-to-text system designed by Amazon has been developed with particular emphasis on successfully processing purchases, rather than on sustaining a meaningful conversation with ‘Alexa.’ Questions that included database manipulation commands within them, or drew attention to landmark legal cases with wide-ranging privacy implications for smart speaker devices, demonstrated that an adversarial position could also be adopted through this process.
Exposing these limitations and failures through questions is an application of Andrew Barry’s notion of the ‘demonstration.’ Barry draws attention to the dual meaning of ‘demonstration’: both a protest against something, and a display of “the possibility of a real object…a way of showing what can or might be done.” In this work, the act of questioning exposes the technical gap between the supernatural prowess proposed by the manufacturers of these devices, and their reality as limited, inconsistent devices, while the content of the questions articulates the densely layered political realities of these devices.
While there are substantial limits of access inherent to these technologies, and obfuscating narratives perpetuated by their manufacturers, they are not beyond critique. Examining the promissory language used in their advertising exposes the strategic discourse of magic which underpins claims to their capacities, and critical practice offers a potent site for re-contextualizing these technologies to critique these claims. The Dark Age of Connectionism: Captivity challenges the mythic power of smartness through exposing the banal and everyday reality of the limited capacities of the smart assistant and the biases in its design. This is present not only in how few questions the ‘conversational’ smart assistant can actually respond to, but notably in how some of the few intelligible responses it gives are to questions that involve adding an object to a shopping list, exposing the non-neutrality of its design. Through this, the voices of the assistants themselves become tools to critique claims made about these technologies by their manufacturers.
Combining investigation of how these technologies are presented through advertising with propositional interventions through practice demonstrates that, in spite of the challenges to critical interrogation posed by these technologies and the limits of access to them, there are always questions to ask, and actions to be taken, which may provoke new knowledge.
Wesley Goatley is a sound artist and researcher based in London, UK. His critical practice examines the aesthetics and politics of data, machine learning, and voice recognition technologies and the power they have in shaping the world and our understanding of it. His work is exhibited and performed internationally, including venues such as Eyebeam in New York, The Nam June Paik Art Center in Seoul, and the Victoria & Albert Museum in London. He was awarded an EMARE/EMAP residency prize with Impakt Festival in 2017, and has been artist in residence at Extrapool Nijmegen and Queen’s University in Kingston, Canada. In 2019 he received his doctorate in Creative and Critical Practice for his practice-led thesis titled “Critical data aesthetics: Towards a critically reflexive practice of data aestheticisation.” He is Course Leader of MA Interaction Design Communication at the London College of Communication, University of the Arts London.
Notes and References
 “Echo Dot,” Amazon, accessed October 26, 2020, https://www.amazon.com/dp/B01DFKC2SO/ref=ods_xs_dp_oop; “HomePod,” Apple, accessed October 26, 2020, https://www.apple.com/uk/homepod/.
 Taylor Martin, “5 Ways You Can Add Alexa To Your Car Right Now,” Cnet, April 30, 2018, https://www.cnet.com/how-to/ways-you-can-add-alexa-to-your-car-right-now/; Brian Bennett, “All the New Alexa Products From CES 2019,” Cnet, January 9, 2019, https://www.cnet.com/news/all-the-new-alexa-products-from-ces-2019/.
 “One Year Limited Warranty for Amazon Devices,” Amazon, accessed October 26, 2020, https://www.amazon.co.uk/gp/help/customer/display.html/ref=hp_left_ac?ie=UTF8&nodeId=201311110.
 Nick Seaver, “Algorithms as Culture: Some Tactics for the Ethnography of Algorithmic Systems,” Big Data & Society 4, no. 2 (2017): 1–12.
 Vincent Mosco, To the Cloud: Big Data in a Turbulent World (London: Routledge, 2014); Ingrid Burrington, “Why Amazon’s Data Centres are Hidden in Spy Country,” The Atlantic, January 8, 2016, https://www.theatlantic.com/technology/archive/2016/01/amazon-web-services-data-center/423147.
 Rob Kitchin, Data Revolutions (London: Sage, 2014); Andrew Iliadis and Federica Russo, “Critical Data Studies: An Introduction,” Big Data & Society 3, no. 2 (2016): 1–7; danah boyd and Kate Crawford, “Critical Questions for Big Data,” Information, Communication and Society 15, no. 5 (2012): 662–679.
 Safiya Noble, Algorithms of Oppression (New York: NYU Press, 2018); “Data for Black Lives,” Data for Black Lives, accessed October 26, 2020, https://d4bl.org/; “As If,” Ramon Amaro, accessed October 26, 2020, https://www.e-flux.com/architecture/becoming-digital/248073/as-if/; “Vulnerable Bodies: Relations of Visibility in the Speculative Smart City,” Debra McKinnon and sava saheli singh, accessed October 26, 2020, https://mappinginjustice.org/vulnerable-bodies-relations-of-visibility-in-the-speculative-smart-city/.
 Ian Bogost, “Apple’s Empty Grandstanding About Privacy,” The Atlantic, January 31, 2019, https://www.theatlantic.com/technology/archive/2019/01/apples-hypocritical-defense-data-privacy/581680/; Jon Porter, “EU Opens Amazon Antitrust Investigation,” The Verge, July 17, 2019, https://www.theverge.com/2019/7/17/20696214/amazon-european-union-antitrust-investigation-third-party-seller-marketplace.
 Charles Duhigg, “How Companies Learn Your Secrets,” New York Times, February 16, 2012, https://www.nytimes.com/2012/02/19/magazine/shopping-habits.html.
 Olivia Solon, “US Border Agents Are Doing ‘Digital Strip Searches.’ Here’s How To Protect Yourself,” The Guardian, March 31, 2017, https://www.theguardian.com/us-news/2017/mar/31/us-border-phone-computer-searches-how-to-protect.
 Yvonne D. Newsome, “Border Patrol: The U.S. Customs Service and the Racial Profiling of African American Women,” Journal of African American Studies 7, no. 3 (2003): 31–57.
 Eeva-Kaisa Prokkola and Juha Ridanpää, “Border Guarding and the Politics of the Body: An Examination of the Finnish Border Guard Service,” Gender, Place & Culture 22, no. 10 (2015): 1374–1390.
 Anssi Paasi, Territories, Boundaries and Consciousness: The Changing Geographies of the Finnish-Russian Border (New York: Wiley, 1996).
 Ewen Macaskill and Gabriel Dance, “NSA Files Decoded: What the Revelations Mean for You,” The Guardian, November 1, 2013, https://www.theguardian.com/world/interactive/2013/nov/01/snowden-nsa-files-surveillance-revelations-decoded#section/1.
 Mark Brown, “Tech Firms Know More About Us Than Any Spy Agency—Ex-GCHQ Chief,” The Guardian, October 8, 2019, https://www.theguardian.com/uk-news/2019/oct/08/tech-firms-know-more-about-us-than-any-spy-agency-ex-gchq-chief.
 “Request for User Information FAQs,” Google, accessed October 26, 2020, https://support.google.com/transparencyreport/answer/9713961?hl=en-GB.
 “Law Enforcement Information Requests,” Amazon, accessed October 26, 2020, https://www.amazon.com/gp/help/customer/display.html?nodeId=GYSDRGWQ2C2CRYEF.
 “Transparency Report,” Apple, accessed October 26, 2020, https://www.apple.com/legal/transparency/.
 Sidney Fussell, “Meet the Star Witness: Your Smart Speaker,” Wired, August 23, 2020, https://www.wired.com/story/star-witness-your-smart-speaker/.
 Shaka Yesufu, “Discriminatory Use of Police Stop-and-Search Powers in London, UK,” International Journal of Police Science & Management 15, no. 4 (2013): 281–293; Andrew Gelman, Jeffrey Fagan and Alex Kiss, “An Analysis of the New York City Police Department’s ‘Stop-and-Frisk’ Policy in the Context of Claims of Racial Bias,” Journal of the American Statistical Association 102, no. 479 (2007): 813–823.
 “Smart Speaker With Intelligent Personal Assistant Quarterly Shipment Share from 2016 to 2019, by Vendor,” Statista, accessed October 26, 2020, https://www.statista.com/statistics/792604/worldwide-smart-speaker-market-share/.
 “2019 Voice report: Consumer Adoption of Voice Technology and Digital Assistants,” Microsoft, accessed October 26, 2020, https://about.ads.microsoft.com/en-us/insights/2019-voice-report.
 “Canalys: Baidu Replaces Google to Become Number Two in Smart Speaker Market in Q2 2019,” Canalys, accessed October 26, 2020, https://www.canalys.com/newsroom/smart-speaker-market-q2-2019.
 Jo Ann Oravec, “Artificial Intelligence, Automation, and Social Welfare: Some Ethical and Historical Perspectives on Technological Overstatement and Hyperbole,” Ethics and Social Welfare 13, no. 1 (2019): 18–32.
 Robert G. Hollands, “Will The Real Smart City Please Stand Up? Intelligent, Progressive or Entrepreneurial?” City 12, no. 3 (2008): 303–320; Gillian Rose, “Screening Smart Cities: Managing Data, Views and Vertigo,” in Compact Cinematics, eds. Pepita Hesselberth and Maria Poulaki (London: Bloomsbury, 2017): 177–184.
 Albert Vanolo, “Smartmentality: The Smart City as Disciplinary Strategy,” Urban Studies 51, no. 5 (2014): 884.
Orit Halpern, Robert Mitchell, and Bernard Dionysius Geoghegan, “The Smartness Mandate: Notes Toward a Critique,” Grey Room 68 (2017): 110.
 Ibid., 116.
 Leslie Hook, Richard Waters, and Tim Bradshaw, “Amazon Pours Resources Into Voice Assistant Alexa,” Financial Times, January 17, 2017, https://www.ft.com/content/876ede9c-d97c-11e6-944b-e7eb37a6aa8e.
 Amazon, “Echo Dot.”
“Alexa is Now Even Smarter—New Features Help Make Everyday Life More Convenient, Safe, and Entertaining,” Amazon, accessed October 26, 2020, https://press.aboutamazon.com/news-releases/news-release-details/alexa-now-even-smarter-new-features-help-make-everyday-life-more.
 Apple, “HomePod.”
 Jeremy Wade Morris and Devon Powers, “Control, Curation and Musical Experience in Streaming Music Services,” Creative Industries Journal 8, no. 2 (2015): 106–122.
“Share of Music Streaming Subscribers Worldwide As of the First Half of 2018, by Company,” Statista, accessed October 26, 2020, https://www.statista.com/statistics/653926/music-streaming-service-subscriber-share/.
 Nick Srnicek, Platform Capitalism (Cambridge: Polity Press, 2017).
 Amazon, “Echo Dot.”
 Apple, “HomePod.”
 Amazon, “Echo Dot.”
 Apple, “HomePod.”
Sarah Kember, iMedia (London: Palgrave Macmillan, 2016); Sarah J. Darby, “Smart Technology in the Home: Time For More Clarity,” Building Research & Information 46, no. 1 (2018): 140–147.
 Amazon, “Echo Dot.”
Apple, “HomePod.” For more examples of magic used explicitly to describe computational technologies, see Tobias Revell, “What’s It Doing?—STRP Eindhoven,” accessed October 26, 2020, https://tobiasrevell.com/What-s-It-Doing-STRP-Eindhoven; and William A. Stahl, “Venerating the Black Box: Magic in Media Discourse on Technology,” Science, Technology, & Human Values 20, no. 2 (1995): 234–258.
Judith Williamson, Decoding Advertisements: Ideology and Meaning in Advertising (London: Marion Boyars, 1978).
 Ibid., 140.
 Ibid., 141.
 Ibid., 144.
 Ibid., 142.
 Keller Easterling, Extrastatecraft: The Power of Infrastructure Space (London: Verso, 2014), 214.
Wesley Goatley, The Dark Age of Connectionism: Captivity, installation, Impakt Festival, Utrecht, 2017.
Carlos Orti Roig, “The Ultra Human,” September 4, 2019, https://vimeo.com/357876506; Beatriz Lacerda, “Definitely Not a Sarcastic Alexa,” January 27, 2019, https://vimeo.com/313657894.
 “Feminist Internet,” Feminist Internet, accessed October 26, 2020, https://feministinternet.com/projects/.
 “The Haunted Random Forest Festival: Finding Human Ghosts in Algorithms,” accessed October 26, 2020, https://hauntedrandomforest.tumblr.com/.
 Andrew Barry, Political Machines (London: Athlone Press, 2001), 178.
Amaro, Ramon. “As If.” Accessed October 26, 2020. https://www.e-flux.com/architecture/becoming-digital/248073/as-if/.
Amazon. “Alexa is Now Even Smarter—New Features Help Make Everyday Life More Convenient, Safe, and Entertaining.” Accessed October 26, 2020. https://press.aboutamazon.com/news-releases/news-release-details/alexa-now-even-smarter-new-features-help-make-everyday-life-more.
Amazon. “Echo Dot.” Accessed October 26, 2020. https://www.amazon.com/dp/B01DFKC2SO/ref=ods_xs_dp_oop.
Amazon. “Law Enforcement Information Requests.” Accessed October 26, 2020. https://www.amazon.com/gp/help/customer/display.html?nodeId=GYSDRGWQ2C2CRYEF.
Apple. “HomePod.” Accessed October 26, 2020. https://www.apple.com/uk/homepod/.
Apple. “Transparency Report.” Accessed October 26, 2020. https://www.apple.com/legal/transparency/.
Barry, Andrew. Political Machines. London: Athlone Press, 2001.
Bennett, Brian. “All the New Alexa Products From CES 2019.” CNET. Accessed October 26, 2020. https://www.cnet.com/news/all-the-new-alexa-products-from-ces-2019/.
Bogost, Ian. “Apple’s Empty Grandstanding About Privacy.” The Atlantic. Accessed October 26, 2020. https://www.theatlantic.com/technology/archive/2019/01/apples-hypocritical-defense-data-privacy/581680/.
boyd, danah, and Kate Crawford. “Critical Questions for Big Data.” Information, Communication and Society 15, no. 5 (2012): 662–679.
Brown, Mark. “Tech Firms Know More About Us Than Any Spy Agency—Ex-GCHQ Chief.” The Guardian, October 8, 2019. Accessed October 26, 2020. https://www.theguardian.com/uk-news/2019/oct/08/tech-firms-know-more-about-us-than-any-spy-agency-ex-gchq-chief.
Burrington, Ingrid. “Why Amazon’s Data Centres are Hidden in Spy Country.” The Atlantic. Accessed October 26, 2020. https://www.theatlantic.com/technology/archive/2016/01/amazon-web-services-data-center/423147.
Canalys. “Canalys: Baidu Replaces Google to Become Number Two in Smart Speaker Market in Q2 2019.” Accessed October 26, 2020. https://www.canalys.com/newsroom/smart-speaker-market-q2-2019.
Darby, Sarah J. “Smart Technology in the Home: Time For More Clarity.” Building Research & Information 46, no. 1 (2018): 140–147.
Data for Black Lives. “Data for Black Lives.” Accessed October 27, 2020. https://d4bl.org/.
Duhigg, Charles. “How Companies Learn Your Secrets.” New York Times. Accessed October 26, 2020. https://www.nytimes.com/2012/02/19/magazine/shopping-habits.html.
Easterling, Keller. Extrastatecraft: The Power of Infrastructure Space. London: Verso, 2014.
Feminist Internet. “Feminist Internet.” Accessed October 26, 2020. https://feministinternet.com/projects/.
Fussell, Sidney. “Meet the Star Witness: Your Smart Speaker.” Wired, August 23, 2020. Accessed October 26, 2020. https://www.wired.com/story/star-witness-your-smart-speaker/.
Gelman, Andrew, Jeffrey Fagan, and Alex Kiss. “An Analysis of the New York City Police Department’s ‘Stop-and-Frisk’ Policy in the Context of Claims of Racial Bias.” Journal of the American Statistical Association 102, no. 479 (2007): 813–823.
Goatley, Wesley. The Dark Age of Connectionism: Captivity. Installation. Utrecht: Impakt Festival, 2017.
Google. “Request for User Information FAQs.” Accessed October 26, 2020. https://support.google.com/transparencyreport/answer/9713961?hl=en-GB.
Halpern, Orit, Robert Mitchell, and Bernard Dionysius Geoghegan. “The Smartness Mandate: Notes Toward a Critique.” Grey Room 68 (Summer 2017): 110.
“The Haunted Random Forest Festival: Finding Human Ghosts in Algorithms.” Accessed October 26, 2020. https://hauntedrandomforest.tumblr.com/.
Hollands, Robert G. “Will The Real Smart City Please Stand Up? Intelligent, Progressive or Entrepreneurial?” City 12, no. 3 (2008): 303–320.
Hook, Leslie, Richard Waters, and Tim Bradshaw. “Amazon Pours Resources Into Voice Assistant Alexa.” Financial Times, January 17, 2017. Accessed October 26, 2020. https://www.ft.com/content/876ede9c-d97c-11e6-944b-e7eb37a6aa8e.
Kember, Sarah. iMedia. London: Palgrave Macmillan, 2016.
Kitchin, Rob. The Data Revolution: Big Data, Open Data, Data Infrastructures and Their Consequences. London: Sage, 2014.
Lacerda, Beatriz. “Definitely Not a Sarcastic Alexa.” January 27, 2019. https://vimeo.com/313657894.
MacAskill, Ewen, and Gabriel Dance. “NSA Files Decoded: What the Revelations Mean for You.” The Guardian, November 1, 2013. Accessed October 26, 2020. https://www.theguardian.com/world/interactive/2013/nov/01/snowden-nsa-files-surveillance-revelations-decoded#section/1.
McKinnon, Debra, and sava saheli singh. “Vulnerable Bodies: Relations of Visibility in the Speculative Smart City.” Accessed October 26, 2020. https://mappinginjustice.org/vulnerable-bodies-relations-of-visibility-in-the-speculative-smart-city/.
Martin, Taylor. “5 Ways You Can Add Alexa To Your Car Right Now.” CNET, April 30, 2018. https://www.cnet.com/how-to/ways-you-can-add-alexa-to-your-car-right-now/.
Microsoft. “2019 Voice Report: Consumer Adoption of Voice Technology and Digital Assistants.” Accessed October 26, 2020. https://about.ads.microsoft.com/en-us/insights/2019-voice-report.
Mosco, Vincent. To the Cloud: Big Data in a Turbulent World. London: Routledge, 2014.
Newsome, Yvonne D. “Border Patrol: The U.S. Customs Service and the Racial Profiling of African American Women.” Journal of African American Studies 7, no. 3 (2003): 31–57.
Noble, Safiya. Algorithms of Oppression. New York: NYU Press, 2018.
Oravec, Jo Ann. “Artificial Intelligence, Automation, and Social Welfare: Some Ethical and Historical Perspectives on Technological Overstatement and Hyperbole.” Ethics and Social Welfare 13, no. 1 (2019): 18–32.
Orti Roig, Carlos. “The Ultra Human.” September 4, 2019. https://vimeo.com/357876506.
Paasi, Anssi. Territories, Boundaries and Consciousness: The Changing Geographies of the Finnish-Russian Border. New York: Wiley, 1996.
Porter, Jon. “EU Opens Amazon Antitrust Investigation.” The Verge, July 17, 2019. https://www.theverge.com/2019/7/17/20696214/amazon-european-union-antitrust-investigation-third-party-seller-marketplace.
Prokkola, Eeva-Kaisa and Juha Ridanpää. “Border Guarding and the Politics of the Body: An Examination of the Finnish Border Guard Service.” Gender, Place & Culture 22, no. 10 (2015): 1374–1390.
Revell, Tobias. “What’s It Doing?—STRP Eindhoven.” Accessed October 26, 2020. https://tobiasrevell.com/What-s-It-Doing-STRP-Eindhoven.
Rose, Gillian. “Screening Smart Cities: Managing Data, Views and Vertigo.” In Compact Cinematics, edited by Pepita Hesselberth and Maria Poulaki, 177–184. London: Bloomsbury, 2017.
Seaver, Nick. “Algorithms as Culture: Some Tactics for the Ethnography of Algorithmic Systems.” Big Data & Society 4, no. 2 (2017): 1–12.
Solon, Olivia. “US Border Agents Are Doing ‘Digital Strip Searches.’ Here’s How To Protect Yourself.” The Guardian, March 31, 2017. https://www.theguardian.com/us-news/2017/mar/31/us-border-phone-computer-searches-how-to-protect.
Srnicek, Nick. Platform Capitalism. Cambridge: Polity Press, 2017.
Stahl, William A. “Venerating the Black Box: Magic in Media Discourse on Technology.” Science, Technology, & Human Values 20, no. 2 (1995): 234–258.
Statista. “Share of Music Streaming Subscribers Worldwide As of the First Half of 2018, by Company.” Accessed October 26, 2020. https://www.statista.com/statistics/653926/music-streaming-service-subscriber-share/.
Statista. “Smart Speaker With Intelligent Personal Assistant Quarterly Shipment Share from 2016 to 2019, by Vendor.” Accessed October 26, 2020. https://www.statista.com/statistics/792604/worldwide-smart-speaker-market-share/.
Wade Morris, Jeremy, and Devon Powers. “Control, Curation and Musical Experience in Streaming Music Services.” Creative Industries Journal 8, no. 2 (2015): 106–122.
Williamson, Judith. Decoding Advertisements: Ideology and Meaning in Advertising. London: Marion Boyars, 1978.
Yesufu, Shaka. “Discriminatory Use of Police Stop-and-Search Powers in London, UK.” International Journal of Police Science & Management 15, no. 4 (2013): 281–293.