Cybernetics and Conversational Interfaces

Dates: October 2017 to December 2017
Team: Scott Dombkowski
Work Type: Academic

Cybernetics and Conversational Interfaces is a paper that employs cybernetic frameworks and historical conversational interfaces to examine contemporary conversational interfaces and propose potential approaches to their design.

Conversational interfaces (CIs) provide "a means or place of interaction" (OED Online, 2017) for the exchange of "thoughts and words" (OED Online, 2017) between two or more systems (a person, a piece of technology, etc.). CIs include your texting client, Slack client, search engine, and bank chatbot, and they connect you with friends, help desk agents, and virtual assistants. Conversational interfaces have existed in one form or another for the last fifty years (Eckler, 2016), but have only recently reached their current level of ubiquity (bots, Alexa, etc.) (Willens, 2017; DataArt, 2017). This pervasiveness can be seen in messaging apps overtaking social media apps in popularity (Pangaro, 2017) and in voice recognition technology being adopted at a higher rate than ever before (Willens, 2017). "Microsoft CEO Satya Nadella likened ... [this] transition ... to previous revolutions such as the introduction of the graphical user interface, the Web, and mobile Internet." (Følstad & Brandtzæg, 2017, p. 39)

While such interfaces allow for ever-increasing possibilities, they are also "failing in their most basic form, conversation" (Dubberly & Pangaro, 2009), leading some to describe them as "overhyped and under-delivering" (Simonite, 2017). In this paper, I will employ cybernetic frameworks and historical conversational interfaces to examine and propose potential approaches to address these shortcomings.

To carry this out, we first need a better understanding of conversation and interfaces.

Conversation

Dubberly and Pangaro (2009) note that models of interaction span a spectrum: "at one extreme ... simply reactive systems, such as a door that opens when you step on a mat or a search engine that returns results when you submit a query. At the other extreme is conversation. Conversation is a progression of exchanges among participants." (Dubberly & Pangaro, 2009) Such "progression" (Dubberly & Pangaro, 2009), or "continuous action conceived or presented as onward movement through time" (OED Online, 2017), is achieved in a portion of the interchanges on conversational interfaces today. But while today's CIs have, for the most part, effectively provided a medium for "progression" (Dubberly & Pangaro, 2009) in human-to-human interactions, they have yet to augment human-to-human interaction or allow for progression in human-to-computer interaction. This lack of augmentation and progression is precisely the reason we see so many "under-delivering" (Simonite, 2017) conversational interfaces.

This absence is a result of conversation being a "highly complex type of interaction ..., for conversation is the means by which existing knowledge is conveyed and new knowledge is created." (Dubberly & Pangaro, 2009) Conversation not only communicates and creates knowledge; it also has the ability "to coordinate action ... to reach agreement ... [and] begin an exchange" (Pangaro, 2017), while allowing for the creation of "shared history, relationship[s], trust, and unity" (Pangaro, 2017), ultimately making it "the foundation for community, commerce, culture, government, and society." (Pangaro, 2017) Such complexity makes today's conversational interfaces' inability to augment and progress interaction more understandable.

Interfaces

Interfaces were first defined as "a dynamic boundary condition describing fluidity according to its separation of one distinct fluid body from another" by James Thomson in his research on fluid dynamics (Hookway, 2014). Thomson's use of the word "fluidity" (Hookway, 2014), or "the quality of flowing easily and clearly" (OED Online, 2017), is worth noting: a well-designed conversational interface should allow a user to "easily and clearly" (OED Online, 2017) interact with the other "distinct" (Hookway, 2014) system.

Hookway further argues that while an interface "might seem to be a form of technology, it is more properly a form of relating to technology, and so constitutes a relation that is already given, to be composed of the combined activities of human and machine." (Hookway, 2014) Such a distinction should be fundamental when examining conversational interfaces. While it is easy to get lost in the advancement of technology in today's climate, a CI should be considered in terms of the relations it prescribes between itself and those interacting with it.

Yesterday's Conversational Interfaces

Attempts to address the complexity and form of conversation have been undertaken in conversational interfaces for the last fifty years. Kaplan (2013) argues that a conversational interface should not only address that complexity, but be "an intelligent interface." This concept of "intelligence" (Kaplan, 2013) is seen throughout conversational interface literature and is the driving force behind the belief that CIs can drive conversation and learning (Dubberly & Pangaro, 2009). To gain a better understanding of just how these interfaces prompt progression, we can examine a number of precursory conversational interfaces: Musicolor, ELIZA, Urban5, and The Coordinator.

Musicolor (1953)

Musicolor was "a sound-actuated interactive light show" (Bird & Di Paolo, 2008) designed by Gordon Pask. "Pask's initial motivation for building the system was an interest in synesthesia and the question of whether a machine could learn relations between sounds and visual patterns and in doing so enhance a musical performance." (Bird & Di Paolo, 2008) Ultimately, he created a machine in which "the performer 'trained the machine and it played a game with him. In this sense, the system acted as an extension of the world with which he could co-operate to achieve effects ... [he] could not achieve on his own.'" (Bird & Di Paolo, 2008) Musicolor is especially noteworthy because it provides an example of a CI that disrupts the black-box model we see in the majority of today's conversational interfaces. Its users were aware of its interpretation of their performance, allowing them to reevaluate their actions.
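
Bird and Di Paolo (2008) describe how the machine adapted to its performer, shifting its internal thresholds as input grew repetitive, in effect getting bored and pushing the musician toward novelty. The toy sketch below illustrates that feedback loop only in spirit; the habituation rule and constants are my own illustrative assumptions, not Pask's circuitry.

    # Toy sketch of a Musicolor-style habituating channel. Repetitive
    # input raises the firing threshold (the machine "gets bored"), so
    # only novel, stronger input keeps the lights responding. The update
    # rule and constants are illustrative assumptions, not Pask's design.
    class MusicolorChannel:
        def __init__(self, threshold=0.5):
            self.threshold = threshold

        def step(self, intensity):
            # Fire the light if the input exceeds the current threshold.
            fired = intensity > self.threshold
            if fired:
                self.threshold += 0.15  # habituate to repeated stimulation
            else:
                self.threshold = max(0.1, self.threshold - 0.01)  # recover slowly
            return fired

    channel = MusicolorChannel()
    for beat, intensity in enumerate([0.8] * 10):  # a repetitive performance
        print(beat, channel.step(intensity))       # fires early, then falls quiet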

ELIZA (1966)

ELIZA was a system designed by Joseph Weizenbaum that allowed "human correspondents" (Weizenbaum, 1966) to communicate through a typewriter with a simulated psychotherapist. "This mode of conversation was chosen because the psychiatric interview is one of the few examples of categorized dyadic natural language communication in which one of the ... [participants in the psychiatric interview] is free to assume the pose of knowing almost nothing of the real world" (Weizenbaum, 1966) and allows "the speaker to maintain his sense of being heard and understood." (Weizenbaum, 1966) ELIZA ultimately led Weizenbaum to be "revolt[ed] that the doctor's patients actually believed the robot really understood their problems ... [and that] the robot therapist could help them in a constructive way." (Wallace, n.d.) Regardless, ELIZA demonstrates how influential the establishment of an environment in which a user is comfortable is on the outcome of a conversation.
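
Mechanically, ELIZA worked by keyword spotting: decomposing the input against a template and reassembling captured fragments into a canned reply. A minimal sketch in that style follows; the rules here are invented for illustration and are not Weizenbaum's original DOCTOR script.

    import random
    import re

    # Illustrative keyword/decomposition/reassembly rules in the style
    # of ELIZA. These rules are invented for this sketch; they are not
    # Weizenbaum's original DOCTOR script.
    RULES = [
        (re.compile(r".*\bI need (.*)", re.IGNORECASE),
         ["Why do you need {0}?", "Would it really help you to get {0}?"]),
        (re.compile(r".*\bI am (.*)", re.IGNORECASE),
         ["How long have you been {0}?", "Why do you think you are {0}?"]),
        (re.compile(r".*\bmy (.*)", re.IGNORECASE),
         ["Tell me more about your {0}."]),
    ]
    DEFAULT_REPLIES = ["Please go on.", "Can you elaborate on that?"]

    def respond(utterance):
        # Decompose the input against the first matching template, then
        # reassemble the captured fragment into a canned response.
        for pattern, templates in RULES:
            match = pattern.match(utterance)
            if match:
                fragment = match.group(1).rstrip(".!?")
                return random.choice(templates).format(fragment)
        return random.choice(DEFAULT_REPLIES)

    print(respond("I am worried about my job"))
    # e.g. "Why do you think you are worried about my job?"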

Urban5 (1973)

Urban5 was designed by Nicholas Negroponte and MIT's Architecture Machine Group to "study the desirability and feasibility of conversing with a machine about environmental design project... using the computer as an objective mirror of the user's own design criteria and to form decisions; reflecting formed from a larger information base than the user's personal experience." (Negroponte, 1970) It achieved this through the use of "instructions" and "two languages...: graphic language and English language. The graphic language [used] the abstract representation of cubes (nouns). The English language was text appearing on the screen (verbs)." (Pertigkiozoglou, 2017) Urban5 provides an example of how an understanding of an interface affects the quality of exchanges for that specific CI.
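
Urban5's split between a graphic language of cubes (nouns) and typed English (verbs) can be read as a small command grammar. The hypothetical sketch below illustrates only that noun/verb structure; the verbs and data model are invented for illustration, and Urban5's actual command set was far richer.

    # Hypothetical sketch of Urban5's two-language structure: cubes are
    # the graphic "nouns," typed English commands are the "verbs." The
    # verbs and data model are illustrative, not Negroponte's design.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Cube:  # the graphic noun
        x: int
        y: int
        z: int

    def interpret(verb, cube, scene):
        # Apply an English verb to a cube within the design scene.
        if verb == "ADD":
            scene.append(cube)
            return f"Added cube at ({cube.x}, {cube.y}, {cube.z})."
        if verb == "DELETE" and cube in scene:
            scene.remove(cube)
            return "Cube deleted."
        return "I do not understand."  # the interface declares its limits

    scene = []
    print(interpret("ADD", Cube(0, 0, 0), scene))
    print(interpret("DELETE", Cube(0, 0, 0), scene))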

The Coordinator (1988)

The Coordinator was a system designed by Terry Winograd to "provide facilities for generating, transmitting, storing, retrieving, and displaying messages that are records of moves in conversations." (Winograd, 1987) Unlike Musicolor, which interprets the data input into its system, The Coordinator allows "people [to] do the interpretation of natural language, and let[s] the program deal with explicit declarations of structure" (Winograd, 1987). Whereas a typical conversational interface provides "a uniform command to initiate a new message [(texting, email)], The Coordinator system provides options for opening conversations that have different implicit structures of action." (Winograd, 1987) For example, "when Request is selected, templates appear prompting the user to specify an addressee, others who will receive copies, a domain, which groups or categorizes related conversations, and an action description, corresponding to the subject header in traditional mail systems." (Winograd, 1987) If a user were to select a different option, they would be provided with a different template designed for that specific move. The Coordinator demonstrates how making a user's line of thought more visible to the other systems interacting with them can advance a conversation in a more beneficial direction.
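
The Coordinator's key move is structural: each message is typed by the conversational act it performs rather than left as free-form text. The sketch below casts that idea as a data structure; the Request fields follow Winograd's description quoted above, while the other move types are representative speech acts, not his exact menu.

    # Sketch of Coordinator-style typed conversation moves. The Request
    # fields follow Winograd's description quoted above; the other move
    # types are representative speech acts, not his exact menu.
    from dataclasses import dataclass, field
    from enum import Enum
    from typing import List

    class Move(Enum):
        REQUEST = "request"
        PROMISE = "promise"
        OFFER = "offer"
        DECLINE = "decline"

    @dataclass
    class ConversationMessage:
        move: Move                    # explicit structure of action
        addressee: str
        copies: List[str] = field(default_factory=list)
        domain: str = ""              # groups related conversations
        action_description: str = ""  # akin to a subject header

    msg = ConversationMessage(
        move=Move.REQUEST,
        addressee="alice",
        domain="budget-planning",
        action_description="Send Q3 figures by Friday",
    )
    print(msg.move.value, msg.action_description)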

Today's Conversational Interfaces

Musicolor, ELIZA, Urban5, and The Coordinator not only laid the groundwork (development of technology, the GUI, etc.) for the recent influx of conversational interfaces available today, but also illustrate the directional shift from exploratory inquiries to the more commercial applications we see in contemporary CIs. One such interface is Google Allo, "a smart messaging app that helps you say more and do more." (Google, 2017) One way Allo addresses the complexity of conversation is with its "Smart Reply" functionality "[that] lets you keep the conversation moving with a single tap by suggesting text and emoji responses based on your personality" (Google, 2017). Another of these interfaces is Facebook M. Similar to Google Allo's Smart Reply, M is a piece of functionality within a messaging platform, but unlike Smart Reply it goes beyond an algorithm. Facebook M utilizes "Human trainers [who] gamely do their best when they receive tough queries like 'Arrange for a parrot to visit my friend'" (Simonite, 2017) that would be impossible for a machine learning algorithm alone. Alexa and Siri, which were part of the "first generation of virtual personal assistants ... conceived in response to improved speech recognition, faster wireless speeds, the cloud computing boom, and a new type of consumer" (Tuttle, 2015), also belong in such a discussion.
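
As a loose illustration of what a smart-reply layer does, one can think of it as scoring a set of candidate replies against the incoming message and surfacing the top few. The sketch below is generic and deliberately simple; it is not Google's published design for Allo, and the candidates and scoring rule are invented.

    # Loose illustration of a smart-reply layer: score a fixed candidate
    # set against the incoming message and surface the top suggestions.
    # This is a generic sketch, not Google's published design for Allo.
    import re
    from collections import Counter

    CANDIDATES = ["Sounds good!", "Sorry, I can't.", "On my way!", "Thanks!"]

    def tokens(text):
        return Counter(re.findall(r"\w+", text.lower()))

    def score(message, reply):
        # Toy relevance score: count of words shared by message and reply.
        return sum((tokens(message) & tokens(reply)).values())

    def suggest(message, k=3):
        return sorted(CANDIDATES, key=lambda r: score(message, r), reverse=True)[:k]

    print(suggest("Are you on your way?"))  # ranks "On my way!" first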

These examples show the significant development in conversational interfaces over the past six decades. Still, while today's CIs provide functionality existing only in the dreams of Musicolor's and ELIZA's creators, they have yet to provide "a means or place of interaction" (OED Online, 2017) between two systems that allows for intelligent "interchang[ing of] ... thoughts and words" (OED Online, 2017). As a result, you still cannot interact with a conversational interface that truly addresses conversation.

Modern CIs are negatively affected by a limited "interchang[ing] ... of thoughts and words" (OED Online, 2017) because "software is poor at understanding language and the world, so virtual assistants, such as Siri or Alexa, must be explicitly programmed to handle any given task." (Simonite, 2017) This leads to numerous misunderstandings between users and their interfaces, and between users connected through interfaces. Additional misunderstandings materialize because of users' and interfaces' contrasting mental models. For instance, Facebook M receives numerous unachievable requests because "with limited, fully automated assistants like Siri or Alexa, people tend to settle into using a few functions they find to work reliably. [But w]ith M," (Simonite, 2017) a user's notion of what is possible is flawed, leading to ineffective "interchang[ings] of thoughts and words." (OED Online, 2017) Today's conversational interfaces also lack the ability to explain themselves. For instance, a user will never really understand how Allo's smart replies are generated, because how Allo determines your "personality" (Google, 2017) remains an open question. Additionally, if a user were to wish to influence the intelligence provided by a conversational interface, they would most likely be unable to do so because no direct method exists to affect that intelligence.

Pangaro (2011) argues that conversational interfaces have become pieces of "'collaboration software' based on information theory, sending predictable messages that maintain old thinking, telling us what we already know. As a result, new conversations to create new language and tame wicked problems become scarce and more expensive." To understand how these CIs can become the setting for the creation of "new language" (Pangaro, 2011) and the taming of "wicked problems" (Pangaro, 2011) we need to better understand what allows for a more productive conversation.

Cybernetics and Conversation Theory

Pangaro, Pask, and others construct their understanding of conversation on cybernetics. Cybernetics allows for the modelling of "communication and intention in a common frame" (Pangaro, 2017). Ernst von Glasersfeld (1995) argues that it is a "way of thinking ... [that] involves concepts," their formation, and the creation of relationships between them. One particular cybernetic framework, Gordon Pask's (1976) Conversation Theory, presents a "formalism for describing the architecture of interactions or conversations, no matter where they may arise or among what types of entities." (Pangaro, 2002) Conversation Theory is noted as "a critical, crowning methodology of cybernetics because it closely tracks the most complex, complicating, ineffable, and sometimes intractable aspects of systems, their design, and their taming" (Pangaro, 2008, p. 36) and provides a lens into conversational interfaces and their successes and failures in facilitating conversations.

Dubberly and Pangaro (2009) simplify Pask's (1976) theory into six main tasks: the opening of a channel, a commitment to engagement, the construction of meaning, evolution, a convergence on agreement, and an action or transaction. To illustrate these six tasks, we can consider a hypothetical conversation.

In order for that hypothetical conversation to open, it must be initiated through a message. This initial message can come in many different forms (text message, alert, etc.), but it needs to be in a context and language understood by the other participant[s]. The conversation also requires engagement from all participants. Dubberly and Pangaro (2009) note that this "commitment may amount to nothing more than continuing to pay attention", but it must be the result of "each participant ... see[ing] value in continuing the conversation". Within the conversation, messages need to be conveyed through shared language to allow the other parties to understand them. Participants draw on their experiences and knowledge to, ideally, create a shared understanding. If a message is conveyed in a manner unintelligible to the other, the conversation will break down. Furthermore, if a message is conveyed in a manner that allows for its misunderstanding, the conversation will never result in a shared understanding and will ultimately fail (Dubberly & Pangaro, 2009). The conversation may also facilitate change in both participants. Participants could "hold new beliefs, make decisions, or develop new relationships, with others, with circumstances or objects, or with themselves." (Dubberly & Pangaro, 2009) Dubberly and Pangaro (2009) also note that whether a conversation is effective is based on whether "the changes brought about by conversation have lasting value to the participants." A participant in the conversation may also be influenced to "confirm [their] understanding" (Dubberly & Pangaro, 2009). This can be achieved through additional messages being conveyed back and forth until participants "have reached 'an agreement over [an] understanding'" (Dubberly & Pangaro, 2009). Finally, the conversation may result in an action or transaction from one or both participants. This could come in the form of a purchase or a new appreciation for a person or artifact. (Dubberly & Pangaro, 2009)
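
Read as a protocol, these six tasks suggest a simple state machine: a conversation advances only when the current task succeeds and breaks down otherwise. The sketch below is my own schematic rendering of Dubberly and Pangaro's sequence, not a formalism they provide.

    # Schematic state machine for Dubberly and Pangaro's six tasks of
    # conversation. The sequence is theirs; casting it as a state machine
    # (and the failure behavior) is an illustrative assumption.
    from enum import Enum, auto

    class Task(Enum):
        OPEN_CHANNEL = auto()
        COMMIT_TO_ENGAGE = auto()
        CONSTRUCT_MEANING = auto()
        EVOLVE = auto()
        CONVERGE_ON_AGREEMENT = auto()
        ACT_OR_TRANSACT = auto()

    def run_conversation(succeeds):
        # Advance through the tasks in order; return the task where the
        # conversation broke down, or ACT_OR_TRANSACT if it completed.
        for task in Task:
            if not succeeds(task):
                return task  # e.g. unshared language: breakdown
        return Task.ACT_OR_TRANSACT

    # A conversation that fails while constructing meaning:
    outcome = run_conversation(lambda t: t != Task.CONSTRUCT_MEANING)
    print(outcome)  # Task.CONSTRUCT_MEANING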

Requirements for Conversation

Dubberly and Pangaro (2009) distill these steps into five main "requirements for conversation": "[the] establish[ment] of [an] environment and mindset", "[the] use of shared language", "[an] engagement in mutually beneficial, peer-to-peer exchange", "[a] confirmation in shared mental models", and "[an] engagement in a transaction - [the] execution of cooperative actions." While some conversational interfaces do address a number of these requirements, you would be extremely hard-pressed to find one that addresses all of them. I propose that the direct application of these requirements will address the vast majority of the issues with CIs voiced earlier.

So exactly how does a conversational interface address the five "requirements for conversation" (Dubberly & Pangaro, 2009) while providing "a means or place of interaction" (OED Online, 2017) between two systems? To examine this question, we will look at each individual requirement and pull from historical conversational interfaces to see how those requirements might be addressed in future CIs.

"[The] establish[ment] of [an] environment and mindset"

By establishing a common "environment and mindset," (Dubberly & Pangaro, 2009) a conversational interface provides a context and language for the successful exchange of "thoughts and words." (OED Online, 2017) Urban5 resolved this through clear "instructions." Through these instructions, users became aware of the restrictions of the application and their purpose within the application. ELIZA was also successful in addressing this requirement by establishing the context of a psychiatric appointment. Users were able to immediately recognize the limits of the interface, allowing them to concentrate on the successful "interchang[ing] ... of thoughts and words." (OED Online, 2017)

In the majority of today's conversational interfaces, limits are not established and a wide array of functionality is available. Boundaries created with thoughtful intention, whether during the opening of a channel, the establishment of a commitment to engagement, or the construction of meaning, will allow CIs to support more concrete "interchange[s] ... of thoughts and words." (OED Online, 2017)

"[The] use of shared language"

By establishing a "shared language" (Dubberly & Pangaro, 2009), conversational interfaces provide users the understanding needed for an effective exchange of "thoughts and words." (OED Online, 2017) Urban5 again provides a successful example, this time of creating an environment for "[the] use of shared language." (Dubberly & Pangaro, 2009) Its main mode of manipulation was a block, and because users and the interface shared an understanding of a block and its capabilities within the environment, a "shared language" (Dubberly & Pangaro, 2009) was established. ELIZA is again similar, in that it employed the language of the user to construct a dialogue between that user and the interface. Both examples show how the establishment of a shared understanding can allow for a more effective exchange.

The same cannot be said about the majority of today's conversational interfaces. Within these interfaces, it is not uncommon for multiple mechanisms to represent the same or very similar things. It is also common for these interfaces to generate intelligence and employ that intelligence in exchanges with a user. What is not common is an understanding of how these systems generate this intelligence and whether it reflects a true appreciation of the user's language. Such an understanding could be conveyed during the construction of meaning or the convergence on agreement. Regardless, CIs with "shared language[s]" (Dubberly & Pangaro, 2009) would make generating such an understanding much more achievable.

"[An] engagement in mutually beneficial, peer-to-peer exchange"

By engaging in "mutually beneficial, peer-to-peer exchange[s]" (Dubberly & Pangaro, 2009), a conversational interface provides the climate for the successful exchange of "thoughts and words." (OED Online, 2017) ELIZA was particularly effective in creating "[an] engagement in mutually beneficial, peer-to-peer exchange." (Dubberly & Pangaro, 2009) Implementations of "categorized dyadic natural language communication" (Weizenbaum, 1966) like ELIZA, or similar instruments, especially when users are committed to engaging in a conversation, would allow for improved interactions on conversational interfaces and potentially improve these interfaces' "naturality." (Lopez, Quesada, & Guerrero, 2017) Interfaces would thereby provide environments for improved "interchanges" (OED Online, 2017), and a greater willingness from users to interact with CIs would in turn let the systems powering those interfaces provide improved responses and improved "exchange[s]" (Dubberly & Pangaro, 2009) with users.

"[A] confirmation in shared mental models"

By confirming "shared mental models" (Dubberly & Pangaro, 2009), especially during the convergence on agreement, conversational interfaces afford the successful exchange of "thoughts and words." (OED Online, 2017) The Coordinator provides a successful example of "confirm[ing] ... shared mental models" (Dubberly & Pangaro, 2009): its users were explicitly aware of what type of statement others were delivering, allowing for a better understanding. One can also look at research by Hirokazu Shirado and Nicholas A. Christakis (2017), in which "bots acting with small levels of random noise and placed in central locations meaningfully improve the collective performance of human groups, accelerating the median solution time by 55.6%". This research is one example of how "even simple artificial intelligence (AI) agents can serve a teaching function, changing the strategy of their human counterparts and modifying human–human interactions, and not just affecting human–bot interactions" (Shirado & Christakis, 2017). The Coordinator and Shirado and Christakis's research show the necessity of established "shared mental models" (Dubberly & Pangaro, 2009) and how such mental models could be made more concrete with an understanding that "spoken language is the most sophisticated behaviour of the most complex organism in the known universe." (Moore, 2007)
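
In Shirado and Christakis's experiments, networked players had to settle on colors distinct from their neighbors', and the most helpful bots paired a locally greedy strategy with a small chance of acting randomly. The sketch below is a minimal rendering of such a noisy-greedy agent; the game setup is simplified, and the 10% noise level stands in for their "small levels of random noise" rather than reproducing their exact parameters.

    # Minimal sketch of a noisy-greedy bot for a graph-coloring
    # coordination game in the spirit of Shirado & Christakis (2017).
    # The game setup is simplified; the 10% noise level is illustrative.
    import random

    COLORS = ["red", "green", "blue"]

    def bot_choice(neighbor_colors, noise=0.10):
        # With probability `noise`, pick uniformly at random: the small
        # behavioral noise that helps groups escape local optima.
        if random.random() < noise:
            return random.choice(COLORS)
        # Otherwise play greedily: the color least used by neighbors.
        return min(COLORS, key=lambda c: neighbor_colors.count(c))

    print(bot_choice(["red", "red", "blue"]))  # usually "green"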

"[An] engagement in a transaction - [the] execution of cooperative actions"

A conversational interface's ability to produce the setting for the "execution of cooperative actions" (Dubberly & Pangaro, 2009) influences its ability to provide an environment for the successful exchange of "thoughts and words." (OED Online, 2017) One conversational interface successful in achieving this is Musicolor. Pask's design shows how cooperative action between a system and its users can be the result of a specific implementation. Musicolor was able to create a dialogue through musical interaction between musicians and itself, which in turn resulted in users committing to engage with the system. Conversational interfaces can invoke similar cooperation to conjure the "thoughts and words" (OED Online, 2017) that their systems require to create exchanges beyond the "predictable" (Pangaro, 2011) that we see in today's CIs.

Musicolor also illustrates how an interface can display its understanding of a user's intention in an evolving manner to build trust among that interface's users. This building of trust would aid in the creation of "cooperative action" (Dubberly & Pangaro, 2009) and ultimately allow for the successful "interchange ... of thoughts and words." (OED Online, 2017)

Negative Implications of Conversational Interfaces

However, it is also important to note the potential negative implications of such development. To examine these, we can look at ELIZA designer Joseph Weizenbaum's evolving view of his creation. As years passed, Weizenbaum's view of ELIZA shifted from pride to disgust. He saw how "the computer's intellectual feats ... [were explained by users] by bringing to bear the single analogy available to them, that is, their model of their own capacity to think" (Weizenbaum, 1976, p. 10), eventually leading these same users to think of ELIZA as a person and not as a system. He also saw how ELIZA revealed a "tendency to treat responsive computer programs as more intelligent than they really are" (Pruijt, 2006, p. 521). This made him "realize that this newly created reality was and remains an impoverished version of the older one, for it rests on a rejection of those direct experiences that formed the basis for, and indeed constituted, the old reality." (Weizenbaum, 1976, p. 25) It ultimately led Weizenbaum to believe "that, however intelligent machines may be made to be, there are some acts of thought that ought to be attempted only by humans." (Weizenbaum, 1976, p. 13)

Such issues must be considered in the creation of a conversational interface, for "man's capacity to manipulate symbols, his very ability to think, is inextricably interwoven with his linguistic abilities." (Weizenbaum, 1976, p. 184) Conversational interfaces should not limit a human's thought and reality, but provide an interface for the advancement of thought.

Conclusion

Today's conversational interfaces have led some to believe that "it might be impossible to create a ... system that would be capable of a sustained and productive language-based interaction with a human being." (Moore, 2016) While "spoken language is the most sophisticated behaviour of the most complex organism in the known universe" (Moore, 2016), it is clear that a focus on the five "requirements for conversation" (Dubberly & Pangaro, 2009) and an awareness of historical conversational interfaces would enhance conversational interfaces' ability to provide the space for improved "interchange[s] ... of thoughts and words." (OED Online, 2017)

Those conversational interfaces able to successfully address the "requirements for conversation" (Dubberly & Pangaro, 2009) and acknowledge that an "interface comes to define human agency" (Hookway, 2014) would be ready-to-hand. Such interfaces would have their "own kind of sight, by which its manipulation is guided" (Hookway, 2014) and would enhance the "interchange [of] ... thoughts and words" (OED Online, 2017) in a "sustain[able] and productive" (Moore, 2016) way. They might even help bring about a time when "human brains and computing machines will be coupled together very tightly and that the resulting partnership will think as no human brain has ever thought and process data in a way not approached by the information handling machines we know today" (Licklider, 1960, p. 4), defining an era that "should be intellectually the most creative and exciting in the history of mankind." (Licklider, 1960, p. 4)

References

Bird, J., & Di Paolo, E. (2008). Gordon Pask and his maverick machines. In P. Husbands, O. Holland, & M. Wheeler (Eds.), The Mechanical Mind in History (pp. 185-212). Cambridge, MA: MIT Press.

Brownlee, J. (2016, April 4). Conversational Interfaces, Explained. Retrieved from https://www.fastcodesign.com/3058546/conversational-interfaces-explained.

Shirado, H., & Christakis, N. A. (2017). Locally noisy autonomous agents improve global human coordination in network experiments. Nature, 545. Retrieved from https://www.nature.com/articles/nature22332.

DataArt. (2017, September 2). Chatbots and AI-Powered Conversation Interfaces: a New World to Be Conquered. Retrieved from https://hackernoon.com/chatbots-and-ai-powered-conversational-interfaces-a-new-world-to-be-conquered-cb282841ad69.

Dubberly, H. & Pangaro, P. (2009, May 1). What is conversation? How can we design for effective conversation?. Retrieved from http://www.dubberly.com/articles/what-is-conversation.html.

Eckler, D. (2016, April 6). Conversational User Interfaces. Retrieved from https://medium.com/the-mission/the-future-of-cui-isn-t-conversational-fa3d9458c2b5.

Følstad, A., & Brandtzæg, P. B. (2017). Chatbots and the new world of HCI. interactions, 24(4), 38-42.

Google. (2017). Google Allo - A smart messaging app. Retrieved from https://allo.google.com/.

Hookway, B. (2014). Interface. Retrieved from https://ebookcentral.proquest.com.

Kaplan, R. (2013, March 21). Beyond the GUI: It's Time For a Conversation User Interface. Retrieved from https://www.wired.com/2013/03/conversational-user-interface/.

Knight, W. (2016). Conversational Interfaces: Powerful speech technology from China's leading Internet company makes it much easier to use a smartphone. Retrieved from https://www.technologyreview.com/s/600766/10-breakthrough-technologies-2016-conversational-interfaces/.

Licklider, J. C. (1960). Man-computer symbiosis. IRE transactions on human factors in electronics, (1), 4-11.

Lopez, G., Quesada, L., & Guerrero, L. A. (2017, July). Alexa vs. Siri vs. Cortana vs. Google Assistant: A Comparison of Speech-Based Natural User Interfaces. In International Conference on Applied Human Factors and Ergonomics (pp. 241-250). Springer, Cham.

Moore, R. K. (2007). Spoken language processing: Piecing together the puzzle. Speech Communication, 49, 418-435.

Moore, R. K. (2016). Is spoken language all-or-nothing? Implications for future speech-based human-machine interaction. In Jokinen, K. & Wilcock, G. (Eds.), Dialogues with Social Robots. Singapore: Springer.

Negroponte, N. (1970). The Architecture Machine: Towards a More Human Environment. Cambridge, MA: MIT Press.

OED Online (2017, June). Conversation, n. Retrieved from http://www.oed.com/view/Entry/40748?rskey=O8uZc5&result=1&isAdvanced=false.

OED Online (2017, June). Fluidity, n. Retrieved from www.oed.com/view/Entry/72093

OED Online (2017, June). Interface, n. Retrieved from http://www.oed.com/view/Entry/97747?rskey=3czY2h&result=1&isAdvanced=false.

Oremus, W. (2016, September 21). Google's New Messaging App Is "Smart." But Should You Use It?. Retrieved from http://www.slate.com/blogs/future_tense/2016/09/21/google_s_new_messaging_app_allo_is_smart_but_should_you_use_it.html.

Pangaro, P. (2002). The Architecture of Conversation Theory. Retrieved from http://pangaro.com/L1L0/ArchCTBriefly2b.htm.

Pangaro, P. (2008). Instruction for design and designs for conversation. In Luppicini, R. (Ed.), Handbook of Conversation Design for Instructional Applications (pp. 35-48). Hershey, PA: Information Science Reference.

Pangaro, P. (2011, September 11). Design for Conversations & Conversations for Design. Retrieved from http://www.pangaro.com/conversations-for-innovation.html.

Pangaro, P. (2017). Conversation is more than Interface [PDF document]. Retrieved from http://www.pangaro.com/ixda2017/index.html.

Pask, G. (1976). Conversation Theory: Applications in Education and Epistemology. Amsterdam: Elsevier.

Pertigkiozoglou, E. (2017, February 19). 1973: Nicholas Negroponte and Architecture Machine Group MIT. Retrieved from https://medium.com/designscience/1973-a1b835e87d1c.

Pruijt, H. (2006). Social interaction with computers: An interpretation of Weizenbaum's ELIZA and her heritage. Social science computer review, 24(4), 516-523.

Simonite, T. (2017, April 14). Facebook's Perfect, Impossible Chatbot. Retrieved from https://www.technologyreview.com/s/604117/facebooks-perfect-impossible-chatbot/.

Tuttle, T. (2015, October 27). The Future of Voice, What's Next After Siri, Alexa, and Ok Google. Retrieved from https://www.recode.net/2015/10/27/11620032/the-future-of-voice-whats-next-after-siri-alexa-and-ok-google.

von Glasersfeld, E. (1995). Radical Constructivism: A Way of Knowing and Learning. London: The Falmer Press.

Wallace, R. (n.d.). From ELIZA To A.L.I.C.E.. Retrieved from http://www.alicebot.org/articles/wallace/eliza.html.

Weizenbaum, J. (1966). ELIZA -- A computer program for the study of natural language communication between man and machine. Communications of the ACM, 9(1), 36-45.

Weizenbaum, J. (1976). Computer Power and Human Reason: From Judgment to Calculation. San Francisco: W. H. Freeman.

Willens, M. (2017, May 9). The state of voice in five charts. Retrieved from https://digiday.com/media/state-voice-five-charts/.

Winograd, T. (1987). A Language/Action Perspective on the Design of Cooperative Work. Retrieved from https://hci.stanford.edu/winograd/papers/language-action.html.