Napoli, Philip M.

Abstract

Communication scholars have recently begun to recognize and investigate the importance of algorithms to a wide range of processes related to the production and consumption of media content. There have been few efforts thus far, though, to connect these developments to potentially relevant bodies of existing theory and research. This article seeks to address this gap by exploring the utility of institutional theory as a potentially useful analytical framework for continued inquiry into the role of algorithms in the operation of media systems, and by offering some suggestions for ways in which an institutional analytical frame can be extended into algorithmic contexts.

Algorithms are widely recognized as playing an increasingly influential role in the political, economic, and cultural spheres (Bucher, 2012a; Pasquale, 2010; Steiner, 2012). Algorithms have particularly prominent roles in the media sector, where the processes of media production and consumption are increasingly automated and algorithmically dictated (see, e.g., Mager, 2012). The key function that algorithms are performing in the media sector is to enable decision outputs derived from the analysis of the enormous quantities of data that can now be gathered in a media environment of extreme interactivity, in which audiences' engagement with media leaves a growing array of capturable and quantifiable traces (see, e.g., Napoli, 2011). In this regard, algorithms can be seen as both consequence and cause of the "Big Data" phenomenon that dominates contemporary technology discourse, and that is particularly pronounced in the media sector (see, e.g., Mayer-Schonberger & Cukier, 2013). Essentially, the increasing prominence of algorithms in a variety of decision-making contexts can be seen as a response to growing quantities of available data, as well as a motivator for media organizations to gather ever more data from every available source to feed into the massive processing capacities of these algorithms.

But, as is to be expected in these early stages of an emergent area of inquiry, there has been relatively little discussion at this point of useful theoretical frameworks for understanding algorithms and their role in contemporary media systems (for exceptions, see Anderson, 2013; Webster, 2010). This article seeks to address this gap via the application of institutional theory to the algorithmic turn (a phrase borrowed from Uricchio, 2011) in media production and consumption. Institutional theory has frequently been employed in the study of media organizations and practices (see, e.g., Cook, 2005; Schudson, 2002; Sparrow, 1999), and so as algorithms play an increasingly prominent role in these spheres, considering their roles, functions, and development through an institutional analytical framework would seem to be a natural extension of an established line of inquiry. This article starts from the premise that there are certain evolving roles and functions that algorithms serve in the dynamics of contemporary media systems that are fundamentally institutional in nature. The goals here are, first, to link explicitly the concept of institutions (media institutions in particular) with algorithms and the roles they are playing in the dynamics of media production and consumption.
Developing this connection between algorithms and institutions provides the basis for the article's second objective, which is to apply some specific lines of institutional theory to the algorithmic turn in media production and consumption. The concluding section will consider both the practical and research implications of approaching algorithms from an institutional perspective.

Institutional theory and the study of media institutions

It is important to note at the outset that institutional theory constitutes a very broad tent, encompassing everything from economics-driven agency theory (see, e.g., Alchian & Demsetz, 1972) to political science grounded rational choice theory (see, e.g., Moe, 1990), to more sociologically oriented theoretical approaches such as social constructivism (see, e.g., Berger & Luckmann, 1966) and rationalization (for a more detailed overview, see Scott, 2008). Given the theoretical and disciplinary breadth that characterizes the field, it is perhaps not surprising that institutional theory and research have long been characterized as possessing a high degree of definitional ambiguity and interpretive inconsistency (DiMaggio & Powell, 1991a; Phillips & Malhotra, 2008; Scott, 2008).

One of the core definitional inconsistencies that has characterized institutional research involves whether institutions are conceptualized in very concrete terms as formal, complex organizations, or more abstractly as formal or informal routines, norms, rules, or behavioral guidelines (Jepperson, 1991). These conceptualizations can become intertwined, particularly within the context of the study of media institutions. For instance, the institution of journalism has traditionally resided at the intersection of complex and evolving formal organizations (e.g., news outlets) and equally complex and evolving norms and procedures related to the professional practice of journalism (although some would argue that journalism is becoming deinstitutionalized; see Napoli, 2009). Similarly, what has been termed the "institutionally effective" audience (i.e., the media audience as manifested in the norms, cognitions, and practices of media markets and organizations) resides at the intersection of the behaviors of specific media organizations (e.g., audience measurement firms, media buying agencies) and established norms, cognitions, and values that have gained traction across participants in the audience marketplace (Ettema & Whitney, 1994; Napoli, 2003, 2011). In such instances, formal organizations often function as the unit of analysis and/or context for understanding the establishment, evolution, and effects of formal or informal routines, norms, rules, or behavioral guidelines (e.g., Abrutyn & Turner, 2011). By the same token, the role of routines, norms, rules, or behavioral guidelines often can serve as an important point of entry or context for understanding the behavior of organizations. Reflecting the presence of this kind of intertwining of institutional elements within the media sector, this article will operate from an analytical position that embraces both definitional approaches, rather than engage in the long-running debate over the appropriateness of one definitional approach over the other.

From a broad definitional standpoint, institutions can be broken down into three components: regulative, normative, and cultural-cognitive (Scott, 2008). The regulative dimension refers to the ways in which institutions "constrain and regularize behavior" (Scott, 2008, p. 52).
This dimension entails a focus on elements such as regulatory processes, rule-setting, and sanctioning activities. The normative dimension refers to the role of social values and norms and how they contribute to the definition of goals and objectives, as well as the appropriate means of pursuing them. This dimension entails a focus on elements such as common beliefs and values within organizations and communities. The cultural-cognitive dimension refers to shared interpretive frames and conceptions of reality. This dimension entails a focus on the mechanisms via which shared meaning and knowledge are created and disseminated. As will become clear, all three of these dimensions, to varying degrees, resonate within the role and function of algorithms in media production and consumption. Through the examination of these various elements, institutional theory seeks to explain phenomena such as commonalities in the structure and behavior of organizations; the role of conventions, routines, and habits in individual and organizational behavior, and how those reflect or deviate from the pursuit of rational interests; and the construction and evolution of laws, rules, interests, and environmental cognitions (see Scott, 2008).

Institutional theory in the media sector

Institutional approaches to the media have a long history (see Moe & Syvertsen, 2007; Schudson, 2002). The overwhelming majority of media institutions scholarship that has employed institutional theory has focused on the news media (e.g., Benson, 2006; Cook, 2005, 2006; Lowrey, 2011; Lowrey & Woo, 2010; Napoli, 1997; Schudson, 2002). Much of this work has focused on processes of "gatekeeping," via which decisions about which content to disseminate to the public are reached (Moe & Syvertsen, 2007). However, reflecting the broad scope of institutional theory outlined above, media institutions scholarship also has examined realms such as regulation and policymaking (Galperin, 2004), cultural production (Ahlkvist, 2001; Guzman, 2005; Kim, 2012), technology development (Flanagin, Flanagin, & Flanagin, 2010; Hrynshyn, 2008), and the construction of audiences (Napoli, 2003, 2011). Key points of focus for this body of research involve how organizational and supraorganizational forces affect media organization and industry structures, behavioral patterns, environmental cognitions, and ultimately (and perhaps most significantly), content. In many ways, it is this need to understand the institutional forces that affect content outputs and flows that is the driving force behind this body of research, given the political and cultural impacts of various forms of media content. From this standpoint, a key dimension of media institutions research to date is the extent to which it has compellingly illustrated that the media function as a political and cultural institution (Sparrow, 2006), an institutional breadth that, as will become clear, is equally applicable to the role and function of algorithms.

Institutionality and algorithms

When we consider algorithms through the lens of institutional definitional frameworks and characteristics, the functionalities and effects of algorithms map quite closely onto those of institutions in general, and media institutions in particular. This argument is in many ways an extension of Katzenbach's (2011) argument that media technologies should be thought of as institutions.
As he illustrates, media technologies have a regulatory dimension (constraining and facilitating communicative behaviors and preferences) that is a key characteristic of institutional structures. Media technologies are able, through the characteristics of their design, to both constrain and facilitate communicative practices and preferences, and thus essentially provide base structures and parameters that regulate the production, distribution, and consumption of content. At the same time, the development of these technologies emerges from—and is shaped by—social processes, thereby reflecting a duality that is often identified as a defining characteristic of institutions (see, e.g., Giddens, 1984). Algorithms can be characterized similarly, in terms of the extent to which they have the capacity to directly structure user behaviors, shape preference formation, and influence content production decisions. All of this is achieved through mechanisms that are technological in nature but that are developed and frequently refined and recalibrated within complex social processes that are impacted by organizational and supraorganizational environmental conditions (see also Goldman, 2006; Grimmelman, 2008/2009; Jiang, 2014).

Whether the unit of analysis is media technologies or algorithms, theoretical support for an institutional conceptualization can be derived from existing theoretical frameworks that focus on the social dimensions of technology, such as actor-network theory (e.g., Latour, 2005). A central proposition of actor-network theory is that agency need not be restricted to humans. Nonhuman actors, such as media technologies (see, e.g., Couldry, 2008; Plesner, 2009) or, as is the contention here, algorithms, operate on equal footing with human actors to affect social conditions. Algorithms possess what scholars of organizations and technology have termed "material agency"—the capacity for nonhuman entities to act absent sustained human intervention (Leonardi, 2012, p. 35). This theoretical perspective helps establish a definitional approach to institutions that accommodates what are proposed here as the institutional functionalities of algorithms.

Algorithms in many ways epitomize the complex intermingling of human and nonhuman actors that is central to an actor-network theory perspective on institutions. While it is certainly the case that human agency is central to the creation and ongoing modification and recalibration of algorithms, it is also the case that, as Ullman (1997) has emphasized, over time, and with the disparate inputs of an expanding number of individuals within specific, compartmentalized contexts, an algorithmic system becomes more difficult for any one person to understand in its entirety, and thus to some extent "takes on a life of its own" (p. 117). This latter point also highlights the dynamic nature of algorithms (and software more generally; see Leonardi, 2012; Manovich, 2013), as they are constantly adjusted in efforts to improve their performance in accordance with specific criteria. In this regard, algorithms fit quite well within an actor-network theoretical framework (which emphasizes fluidity and change), perhaps better than within other areas of institutional theory, which tend to emphasize continuity, permanence, and path dependency (Zucker, 1977).
Of course, when we talk about algorithms it is important to distinguish their institutional role and function (which can be stable) from their underlying mechanics, which generally are quite dynamic. Google's search algorithms, for instance, are adjusted 500–600 times per year (MOZ, 2013), which does not necessarily undermine the institutional capacities in which these algorithms serve. This is comparable to the situation in, for example, law, where individual laws are frequently added, eliminated, or changed within the context of a stable, overarching institutional structure (La Torre, 2010).

The institutionality of algorithms is inherent in Lawrence Lessig's (2006) widely embraced notion that "code is law" (p. 6). Lessig (2006) employed this analogy to illustrate the ways in which the programming that controls the operation of communications networks and platforms is a powerful tool for regulating the behavior of users in ways that are not always obvious. If we extrapolate from Lessig's (2006) metaphor, systems of laws are, of course, widely understood as institutions (La Torre, 2010), reflecting the regulative dimension that is central to how institutions are defined. Algorithms can, of course, be thought of as a form of code. And so, if code is law, and law is an institution, then the institutionality of algorithms naturally follows, particularly given that, as will be illustrated below, algorithms often serve a similar function in terms of regulating individuals' behaviors (in addition to serving other core institutional functions). This perspective is supported in Jepperson's (1991) definitional assessment of the concepts of institutions and institutionalization, in which he notes the following: "Within any system having multiple levels or orders of organization … primary levels of organization can operate as institutions relative to secondary levels of organization. A microcomputer's basic operating system appears as an institution relative to its word-processing program (especially to a software engineer)" (p. 147). Here, through the use of a computer software example to illustrate the operation of institutions, the logic of code (and thus, by extension, algorithms) as institution becomes a bit more concrete. Extending this analogy, we can think of algorithms functioning at a primary level of organization, providing the parameters within which subsequent functions of the decision-making systems and the organizations in which they are embedded are carried out.

An additional shared characteristic between institutions and (many) algorithms is their tendency toward opacity and the ways in which individuals and organizations react to this opacity. As many observers have pointed out, the internal operation of algorithms is a combination of complexity and intentional opacity (in order to protect competitive advantages and to prevent "gaming" of the systems) (e.g., Diakopoulos, 2014; Elgesem, 2008). Consequently, the notion of the algorithm as "black box" has frequently been raised (Gillespie, 2014). Importantly, though, this black box nature of algorithms does not appear to meaningfully inhibit their being embraced and employed as decision-making surrogates (at least not yet). In this regard we see an interesting parallel with Berger and Luckmann's (1966) well-known characterization of institutions as often inherently opaque, and yet: "The objective reality of institutions is not diminished if the individual does not understand their purpose or their mode of operation" (p. 60; see also Scott & Orlikowski, 2012).
The algorithmic turn in media and its institutional connections

The next step is to delve more deeply into the ways in which algorithms are functioning in the media sector, and to see how these functions connect with foundational articulations of what institutions are and what they do. For the purposes of this overview (which can only capture the tip of the iceberg in terms of the full extent of the algorithmic turn that is occurring in the media space), this discussion will be divided into the role of algorithms in: (a) media consumption and (b) media production.

The algorithmic turn in media consumption

One of the key functions that algorithms perform in contemporary media consumption is to assist audiences in the process of navigating an increasingly complex and fragmented media environment. Central to this navigation process are the typically algorithmically driven search, recommendation, and content aggregation systems that facilitate searching for and selecting content in an environment of such extreme content abundance that technologically unaided forms of search and navigation are no longer practical or effective (Anderson, 2006). These algorithmically driven systems are, of course, central to search engines, social media platforms, and content aggregators such as Amazon, iTunes, YouTube, Pandora, and Netflix. To some extent, one could argue that content has become commodified, with the real value residing in the systems that users can employ to navigate through and select from the wealth of available content.

From an institutional standpoint, it is important to emphasize, as Webster (2011) does, the extent to which these systems focus attention in particular ways and "structure decision making within certain bounds" (p. 50). In this regard, these algorithmic systems exhibit characteristics that are inherently institutional in nature. In his application of Giddens' (1984) structuration theory to the new media environment, Webster (2011) illustrates how individuals and institutions mutually construct the media environment. According to Giddens (1984), a key element of structuration is the notion of duality, which refers to the extent to which agents and structures (translated in the context being examined here to individuals and institutions) mutually reproduce the social world. Central to Webster's (2011) analysis are "user information regimes"—his terminology for the algorithmically driven search and recommendation systems that users rely upon. The notion of duality is of particular relevance here, as these user information regimes construct and constrain how users perceive and engage with their media environment. At the same time, the activities of these users feed into (in the form of user data), and to some extent construct (in terms of how these user data are employed by algorithms in the functioning of the search and recommendation systems) these user information regimes and the roadmap to the media environment that they provide. We see these patterns, for instance, in the fact that Netflix users rely quite heavily on the service's recommendation system in the process of selecting video programming (Keating, 2012), while at the same time the design of recommendation systems can influence the content feedback that users provide—altering the opinions that serve as inputs to, and thus ultimately the outputs of, the recommendation system (Cosley et al., 2003).
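This feedback loop can be made concrete with a deliberately simplified simulation. The sketch below is hypothetical and assumes nothing about any particular platform's system; it simply shows how a recommender that surfaces whatever is already popular, combined with users who mostly choose from what is surfaced, concentrates attention on an early-leading handful of items.

```python
import random

random.seed(42)

N_ITEMS = 20                 # hypothetical catalog of items
K = 5                        # items surfaced to each user
popularity = [1] * N_ITEMS   # all items start with equal counts

def recommend(counts, k):
    """Surface the k currently most popular items (a crude 'information regime')."""
    return sorted(range(len(counts)), key=lambda i: counts[i], reverse=True)[:k]

for _ in range(10_000):      # each iteration is one simulated user session
    slate = recommend(popularity, K)
    if random.random() < 0.9:            # most choices come from the recommended slate
        choice = random.choice(slate)
    else:                                # a minority of choices ignore the slate
        choice = random.randrange(N_ITEMS)
    popularity[choice] += 1              # the choice feeds the next round's ranking

top_share = sum(sorted(popularity, reverse=True)[:K]) / sum(popularity)
print(f"Share of all selections captured by the top {K} items: {top_share:.0%}")
```

Even this crude loop reproduces the basic dynamic described by Webster (2011) and Cosley et al. (2003): the regime structures users' choices, and those choices restructure the regime.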
Specific patterns of mutual influence (i.e., duality) in the intersection of users and user information regimes are already being identified. One important pattern, for instance, is a certain amount of reflexivity that is inherent in much algorithmically driven media consumption. For instance, as Bucher (2012b) has illustrated in her analysis of Facebook’s GraphRank algorithm, the algorithm monitors users’ behavior to find the most interesting patterns. Once these patterns are found, they are fed back to the users via Facebook’s News Feed. “Consequently, even more users will apparently act in the way that the algorithm predicts” (Bucher, 2012b, p. 14). Further, the dynamics of many search, recommendation, and navigation algorithms emphasize popularity as a key criterion in generating results (see Jones, 2012; Webster, 2011), which again leads to a certain reflexivity in their operation. Popular content is what is most frequently and prominently recommended, thus further enhancing its popularity relative to other available content, and inhibiting less popular content from gaining popularity (see Cho & Roy, 2004). These algorithmic media consumption tools operate in ways that have the same kind of inherently political implications as more traditional media institutions such as the news media (Gillespie, 2014). Consider a case such as Twitter’s Trends list, which provides users with a list of the most popular topics currently being discussed on the platform. This list is an algorithmically generated output from over 250 million tweets sent daily that serves in part to guide the media consumption behavior of Twitter users (pointing them toward popular topics). The broader significance of this algorithm was illustrated via the recent controversy over the seemingly premature disappearance of the Occupy Wall Street movement from the Twitter Trends list (Gillespie, 2011). In response to charges of politically motivated censorship, Twitter released details regarding the operation of its Trends algorithm. The company noted that the Trends algorithm is not based on a simple calculation of the most used terms, but rather takes into account factors such as whether the term is recently surging in popularity, the clustering patterns of the users of the term, and the ratio of unique tweets to retweets (Gillespie, 2011). The fact that the presence or absence of the Occupy Wall Street movement in the Twitter Trends list was a widely discussed, controversial, and politically significant topic illustrates the extent to which the list and the algorithm that generates it possess institutional characteristics. The controversy recalls more traditional media criticism that would frequently focus on the presence or dearth of media coverage of specific issues, individuals, or organizations, within the context of the significant agenda setting effects that such patterns could have. From this perspective, the magnitude of the Twitter Trends controversy highlights how the list represents a fundamental mechanism by which individuals and organizations form their cognitions of the online public sphere and the political dynamics reflected there. This brings us to the other notable dimension of this example—the explicitly political and cultural ramifications of the output of this particular algorithm, which is in keeping with the well established understanding of the media as both political and cultural institution (see, e.g., Cook, 2005; Sparrow, 2006). 
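The general logic that Twitter described, weighting a term's recent surge against its own baseline rather than its raw volume, can be illustrated with a minimal, hypothetical scoring function. This is not Twitter's actual Trends algorithm; the terms and numbers are invented solely to show why a heavily used but steady topic can rank below a smaller but sharply spiking one.

```python
def surge_score(recent_count: int, baseline_count: int) -> float:
    """Score a term by how far its recent volume exceeds its own historical baseline."""
    return recent_count / max(baseline_count, 1)

# term: (mentions in the last hour, average mentions per hour over prior weeks)
terms = {
    "#LongRunningMovement": (9_000, 8_500),  # very large volume, but flat over time
    "#SuddenBreakingEvent": (3_000, 40),     # smaller volume, sharp spike
}

for term in sorted(terms, key=lambda t: surge_score(*terms[t]), reverse=True):
    print(f"{term}: surge score {surge_score(*terms[term]):.1f}")
```

On raw counts the first term dominates; on surge relative to baseline it does not, which is precisely the kind of design choice that the Trends controversy made visible.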
Pasquale (2010) contends that this political and cultural influence of certain algorithmically driven search and recommendation systems has reached such a point that they should be treated, from a policymaking standpoint, as “essential cultural and political facilit[ies]” (p. 402), analogous to traditional essential facilities such as utilities or telephone companies. Such a perspective certainly reinforces the institutional character of these algorithmically driven systems. Another example of explicitly political resonance has been demonstrated in recent research showing that the manipulation of search engine rankings can potentially affect election outcomes, and that such manipulations can easily be conducted surreptitiously (Epstein & Robertson, 2013). Findings such as these reflect increasing concerns about the influence of such algorithmically driven platforms, due in large part to the already ingrained perceptions among users that such search returns represent objective and reliable representations of relevant online content (not unlike the prevailing perceptions that long have surrounded the institution of journalism). In a call for the imposition of public interest obligations on search engines (essentially, treating search engines comparably to another prominent media institution—the broadcast media), Laidlaw (2008) emphasizes that search engines are “authoritative and reliable, and shape public opinion and meaning” (p. 124). In these ways, the algorithms that are at the core of search engines are functioning in a political capacity similar to established media institutions. In the realm of social media, content recommendation algorithms are mediating not just the consumption of media content, but also the dynamics of individual social relations and interactions. In their examination of the role of Facebook in the formation of political groupings, Langlois, Elmer, McKelvey, and Devereaux (2009) emphasize the ways in which software and protocols (i.e., algorithms) function as “actors that intervene directly in cultural and communicational processes,” and thus need to be thought of as “a new type of actor … [that] fundamentally changes the dynamics of the constitution of issues and their publics” (p. 429). In a similar vein, Beer (2009) characterizes algorithms as having “the capacity to shape social and cultural formations and impact directly on individual lives” (p. 994). In these characterizations, we see strong parallels between the functioning of algorithms and established understandings of the functioning of institutions. According to North (1981), institutions “provide the framework through which human beings interact. They establish the cooperative and competitive relationships which constitute a society” (p. 20). Clearly, this is increasingly what algorithms are doing, particularly via their centrality to media platforms such as social media sites and applications. These examples are all reflective of the ways in which algorithms have been identified as having a “governmental power” and a “gatekeeping function” (Bucher, 2012b, pp. 8–9) and thus operate as an extension of the institutional functionalities that long have been associated with traditional media (Gillespie, 2014). Consider the parallels with DiMaggio and Powell’s (1991a) statement that “Institutions do not just constrain options; they establish the very criteria by which people discover their preferences” (p. 11). 
This statement perfectly encapsulates the functionalities of content search and recommendation systems in users' media consumption behaviors.

The algorithmic turn in media production

Algorithms are also playing an increasingly prominent role on the production side of the media equation. As the media environment grows more complex, with audiences increasingly fragmented and empowered, and with a growing array of technologies and platforms at their disposal, media organizations are increasingly turning to data and algorithms to help them effectively navigate this environment (Davenport & Harris, 2009). Two of the primary functions that algorithms are performing in the media production realm at this point are: (a) serving as a demand predictor and (b) serving as a content creator.

Looking first at the realm of demand prediction, in this "Big Data" era, media organizations have an ever-expanding supply of data on audiences' media consumption patterns and preferences to draw upon (Napoli, 2011), and algorithms play a central role in producing decision outcomes from these stores of data. The motion picture industry, for instance, has begun to rely on predictive software packages such as Epagogix, which employs algorithms to predict the success of prospective film projects based upon the plot elements contained within individual film scripts, linking these content characteristics with historical data on box office grosses (Davenport & Harris, 2009; Gladwell, 2006). Similarly, Netflix has been developing its slate of original programming by feeding its enormous trove of audience behavior and ratings data into a predictive algorithm that then identifies the type of original programming most likely to succeed (Carr, 2013; Leonard, 2013). The inputs in this case are obviously very different from the inputs being utilized by a system such as Epagogix, but the outcome is essentially the same—algorithmically derived performance forecasts that increasingly dictate production decisions.

Perhaps the most controversial application of such algorithmically driven demand predictors has been in the realm of journalism. In some cases (such as Patch, AOL's failing hyperlocal news venture), algorithms that analyze demographic, social, and political variables related to individual communities and their demand for local news have been used to determine where local news outlets will be established (Tartakoff, 2010). In such cases, the very existence of local news operations is, to some extent, algorithmically dictated. In many other cases (including Patch), news organizations are increasingly relying on analyses of various forms of user behavior and feedback data to calibrate more precisely their news gathering and reporting activities. Many newsrooms now operate with comprehensive and immediate feedback related to various aspects of online news consumption, ranging from page views, to time spent on a site/story, to ratings, to the volume and valence of comments (see, e.g., Anderson, 2010; Anderson, 2011b). But the issue here goes beyond the availability and analysis of new forms of audience data. Rather, the specific concern is the role that algorithms play in making sense of these data, and how these algorithmic analyses then affect content decision-making. Consider, for instance, the case of "content farms." Content farms mine search engine data to estimate demand for content on various topics, and then produce that content rapidly and cheaply in order to meet that demand (Bakker, 2012).
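Whether the inputs are script elements, viewing histories, or search queries, these demand predictors share a basic logic: fit a model to historical performance data and use that model to score new candidates. The sketch below is a hypothetical illustration of that logic only; the features, numbers, and simple linear model are invented and do not represent Epagogix's, Netflix's, or any content farm's actual system.

```python
import numpy as np

# Historical projects: invented content features
# (star lead present, franchise tie-in present, runtime in hours).
X_hist = np.array([
    [1, 1, 2.0],
    [0, 1, 1.8],
    [1, 0, 2.2],
    [0, 0, 1.6],
    [1, 1, 1.9],
])
y_hist = np.array([320, 210, 180, 90, 300])   # past performance, arbitrary units

# Fit a simple linear model (performance ~ features) to the historical data.
A = np.hstack([X_hist, np.ones((X_hist.shape[0], 1))])   # add an intercept column
w, *_ = np.linalg.lstsq(A, y_hist, rcond=None)

# Score two hypothetical new projects; the forecasts then drive the greenlight decision.
candidates = np.array([
    [1, 1, 2.1],   # star lead and franchise tie-in
    [0, 0, 1.7],   # neither
])
forecasts = np.hstack([candidates, np.ones((len(candidates), 1))]) @ w
for features, forecast in zip(candidates, forecasts):
    print(f"features {features} -> forecast {forecast:.1f}")
```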
Once again, the process is algorithmically driven. Leading content farm Demand Media, for instance, feeds its algorithm three types of data: (a) popular search terms from search engines, (b) the ad market for keywords (i.e., which keywords are currently being sought and for how much), and (c) the competitive environment (in terms of content that is already available online) (Roth, 2009). The output then represents a prediction of the type of content for which there is the highest unmet audience and advertiser demand, and Demand Media produces that content accordingly (Anderson, 2011a). In cases such as these, content production decisions are increasingly being algorithmically dictated in contexts in which traditional institutional norms emphasized a decision-making process based less on audiences’ expressions of their interests and wants and more on professionally established criteria regarding audiences’ informational needs in order to be better-informed citizens (Anderson, 2011a). This transition essentially has as its core a tension between two fundamentally different guiding institutional norms related to the practice of journalism and the formulation and application of news values. The content farm example also represents a tension between different, but intersecting, algorithmic institutional structures. Specifically, in an effort to combat the extent to which “low quality” content farm articles appeared prominently in its search returns, Google adjusted its algorithms to reduce the relevance of “low-quality” sites (defined as sites that copy content or provide low value-add to users) (Tartakoff, 2011). Here, the mechanisms for algorithmic media consumption are being adjusted in response to a system of algorithmic media production that, ironically, relies heavily on data from users’ algorithmically driven media consumption behaviors. As this example, along with the more recent example of Facebook adjusting its algorithm to downplay the visibility of “viral” content such as photo memes (see Ingram, 2013) illustrates, certain algorithms operate as institutional structures that other content providers must try to navigate effectively. The algorithmic turn in media production is, in some instances, being expanded in ways that go beyond demand prediction and extend into the realm of content creation. Essentially, the direct human element in the process of content creation is, in some contexts, being eliminated. This is not to say that the human element is being eliminated from content creation. Algorithms are human creations. Rather, the point here is that the human role in content creation is migrating from a direct to an indirect role. Algorithms have been developed and employed to perform comparably to human content creators in areas such as poetry and music composition (Steiner, 2012). They are also playing an increasingly prominent role in areas of online content creation such as tweets, where a large number of tweets are automatically generated by algorithmically driven bots (Chu et al., 2011). This model is at the core of Narrative Science, a start-up based around a software package that can generate complete news stories once it is fed the core data around which the stories will be based (e.g., sporting event scores/stats, company financial reports, housing data, survey data) (Lohr, 2011). In this realm of algorithmic media production, we once again see strong intersections between the functionality of algorithms and the functionality of institutions. 
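The data-to-text side of this development can likewise be illustrated with a deliberately minimal sketch. The following is not Narrative Science's software; it is a hypothetical template-based generator showing, in miniature, how structured inputs (here, an invented game score) can be turned into readable prose without a human directly writing the sentences.

```python
def game_recap(home: str, away: str, home_score: int, away_score: int) -> str:
    """Map a structured box score onto a narrative template (ties ignored for simplicity)."""
    winner, loser = (home, away) if home_score > away_score else (away, home)
    high, low = max(home_score, away_score), min(home_score, away_score)
    if high - low <= 3:
        template = "{w} edged {l} {hi}-{lo} in a game decided in the final minutes."
    else:
        template = "{w} cruised past {l}, {hi}-{lo}."
    return template.format(w=winner, l=loser, hi=high, lo=low)

# The "reporting" is fully delegated to the template logic once the data are supplied.
print(game_recap("Springfield", "Shelbyville", 78, 75))
print(game_recap("Springfield", "Shelbyville", 95, 70))
```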
Consider, for instance, Douglas's (1986) statement: "Institutions encode information. They are credited with making routine decisions, solving routine problems, and doing a lot of regular thinking on behalf of individuals" (p. 47). As this section has illustrated, in the realm of media production, common decisions regarding what content to produce and how to produce it are being delegated to algorithms. Given the extent to which media institutions research has focused over the course of its history on understanding the processes via which such culturally and politically significant decisions are reached within media organizations, incorporating the algorithmic turn into such institutional analytical frameworks is now clearly vital.

Algorithms and institutional theory

The previous section sought to forge some basic definitional and functional connections between algorithms and institutions. This section seeks to build upon these connections by exploring some lines of institutional theory that seem to have direct relevance to the role, function, and development of algorithms in the dynamics of media consumption and production. Thus, the goal here is to illustrate that the institutionality of algorithms is not just definitional, but theoretical as well.

Institutional isomorphism

Institutional isomorphism refers to the tendency for organizations in a particular field to resemble one another across a variety of dimensions. Explanations for this tendency include: (a) coercion, which involves the pressures exerted on organizations by other organizations upon which they are dependent, and by cultural expectations in the society within which the organization functions (DiMaggio & Powell, 1991b); (b) mimetic processes, which involve organizations responding to uncertainty in their environment or objectives by modeling themselves on similar or more successful organizations in their field; and (c) normative pressures, which involve the processes of professionalization (education, training, acculturation) that result in increased similarity across organizations (DiMaggio & Powell, 1991b).

This theoretical framework may help explain some patterns we see in the role algorithms are playing in media production and consumption. For instance, despite employing different Web crawling procedures and ranking algorithms, major search engines often exhibit high levels of similarity in their search returns (Hindman, 2009), even though many different interpretive approaches can be taken to the concept of relevance (Van Couvering, 2007). Such findings raise the question of whether processes of institutional isomorphism may be at work, perhaps a product of normative pressures that reflect high levels of similarity in the professional training and acculturation of those developing the relevant algorithms. Institutional isomorphism in the realm of algorithmic media may even occur across algorithmic and nonalgorithmic media platforms. For instance, as Gillespie (2011) points out, the emphasis on novelty that has characterized news reporting in the traditional media sector seems to be reproducing itself in the online space in contexts such as the design of the Twitter Trends algorithm (discussed above). Similarly, Rogers (2004) has demonstrated how Google search returns tend to map quite closely with the issues and sources characteristic of mainstream media.
Both coercive forces (related to the need to meet the established expectations of news consumers) and mimetic processes (related to efforts to respond to uncertainty in audience demand for content) would seem to be potentially credible explanations for these patterns. It also seems reasonable to consider algorithms as a potential driver of institutional isomorphism. To the extent that more and more content producers are, for instance, relying upon algorithmic demand prediction models that are derived from historical data on audiences' exhibited consumption and appreciation patterns, there would seem to be the possibility that different organizations are increasingly likely to produce similar outputs, given the extent to which they are all essentially data mining the same history and producing content on the basis of those results, in an effort to cope with the persistent uncertainty of audiences' demand for content. For instance, the more motion picture studios that rely on Epagogix to determine their production slate, the more likely it would seem that these studios will produce similar films, as the same algorithm and underlying data are driving their decision-making. In this regard, algorithmically driven institutional isomorphism essentially results in diminished diversity of content output.

Social constructivism

Social constructivism addresses the ways in which institutions provide shared meanings and cognitions, which then serve as important mechanisms for guiding behaviors. This theoretical perspective emphasizes that social reality is a product of social processes directed at the establishment of shared knowledge and belief systems (Scott, 2008). The cultural-cognitive dimension of institutions emphasizes "the extent to which behavior is informed and constrained by the ways in which knowledge is constructed and codified. Underlying all these decisions are socially constructed models, assumptions, and schemas" (Scott, 2008, p. 68). Algorithms serve as prime examples of constructors and codifiers of knowledge, particularly in contexts such as search engines, which play a central institutional role in aggregating, categorizing, organizing, and presenting information (Halavais, 2009; Vaidhyanathan, 2011).

This theoretical perspective points to the importance of understanding the social construction of algorithms, given that, as Berger and Luckmann (1966) have emphasized, "To understand the state of the socially constructed universe at any given time, or its change over time, one must understand the social organization that permits the definers to do the defining" (p. 116). It seems reasonable to think of algorithms as "the definers" that "do the defining," in many contexts. Understanding the social organization underlying these definers means understanding the social processes underlying the construction of the algorithms that play an increasingly influential role in the social construction of knowledge. From this standpoint, the institutionality of algorithms encompasses not only their role and function in media production and consumption, but also the analytical flip side, which is the institutional forces that may impact the development and deployment of algorithms. These particular definers are, of course, uniquely technological in their orientation, which highlights the relevance of a related line of social constructivism—the social construction of technological systems.
Technological systems have been described as having messy, complex problem-solving components and as being both socially constructed and society shaping (a recurrence of the duality theme that is central to our understanding of both institutions and algorithms) (Hughes, 2012). Their components include a physical component, organizations, and regulations and laws (Hughes, 2012). Importantly for this context, they are also defined to include both hardware and software (Constant, 2012). Algorithms would seem to fit quite well within this definitional framework, and so we can perhaps think of algorithms as technological systems that play an increasingly integral role in the social construction of knowledge.

This social constructivist approach to algorithms can be seen in Van Couvering's (2007) research examining how the cultural schemas possessed by those engaged in algorithm development and maintenance affect the thought processes and procedures in their work. Van Couvering (2007) illustrates how market schemas (oriented around competitive and revenue-generating concerns) and science/technology schemas (oriented around norms of objectivity, innovation, and experimentation) dominate the conceptualizations of "quality" in the design, assessment, and modification of search algorithms. In this way, we see how specific social values and norms get reflected in the design of algorithms—important initial steps toward understanding the social construction of these technological systems that are becoming increasingly central to knowledge construction.

Conclusion

This article has attempted to serve a number of functions. First, it has made a case for recognizing the institutionality of algorithms, in that they facilitate and constrain the behaviors and cognitions of both media organizations and media users. In making this case, this article also has attempted to illustrate a number of ways in which institutional theory can meaningfully inform our understanding of algorithms and their construction and usage in the media sector, and can serve as a useful theoretical framework for future research. In developing these arguments, this article also has illustrated a number of specific contexts in which the political and cultural implications of the operation of algorithms in the media sector are quite pronounced—strengthening the institutional connection between algorithms and traditional media institutions.

It is important to emphasize, however, that the goal here is not to argue that all algorithms should be thought of as institutions, but rather to make the case that an institutional analytical frame is appropriate and valuable in those increasingly common instances in which algorithms are serving in capacities that intersect with those of traditional media institutions. Certainly, not all algorithms perform the type of specific institutional functions discussed here, even when we focus on the relatively narrow confines of the media sector that is the focus of this analysis. Perhaps more important, however, is that not all algorithms that might fall within the scope of this discussion have achieved the kind of widespread social acceptance or significance that is fundamental to the connotative meaning of the concept of an institution.
At the same time, one could certainly argue that (at least in some contexts) it may be too early to approach algorithms from an institutional perspective; that they have yet to demonstrate sufficient permanence in the dynamics of media production and consumption that is also a key definitional element of institutions. Consider, for instance, that Barnes & Noble’s latest version of its Nook e-reader specifically touts that the book recommendations that users receive are generated by actual humans, rather than algorithms (Pierce, 2013). This approach is obviously a strategic response to Amazon’s algorithmically driven recommendation system, and suggests that media organizations may already be finding it advantageous to position themselves in opposition to algorithmically driven processes of media production and consumption. These efforts to delineate the relevant boundaries of this analytical framework are admittedly a bit muddy. A useful contribution of future research would be to clarify these boundaries, both conceptually and empirically. Nonetheless, at a point in time in which algorithms are being assessed and critiqued in terms of issues such as their demonstrated objectivity or bias (Goldman, 2006); their role in censoring audiences’ access to content (Gillespie, 2011; Pariser, 2011), and even in terms of the extent of their First Amendment rights (see, e.g., Benjamin, 2013; Wu, 2013) (all issues that have been central to the analysis of traditional media institutions), an institutional orientation toward algorithms and their role in the dynamics of media production and consumption seems appropriate. As these specific contexts of algorithmic debate and critique illustrate, there are likely practical policy implications that arise from approaching algorithms and their functionality from an institutional perspective. For instance, an institutional perspective on algorithms may provide a more compelling basis from which policymakers, scholars, and policy advocates can justify regulatory interventions or even the imposition of public service obligations of one form or another. From this standpoint, it is perhaps telling that algorithmically driven organizations such as Google and Facebook have, at various points in time, steadfastly resisted being characterized as media companies (see, e.g., Carr, 2011; McMains, 2012; Ulanoff, 2014). Such rhetorical positioning can easily be interpreted as an effort to minimize the institutional nature and scope of what these companies do, and thereby better insulate them from the type of policy interventions that are well established in relation to traditional media organizations, and that are premised, at least in part, on the institutional functionalities of these organizations in the political and cultural spheres (Napoli, 2001). As an extension of the institutionality of algorithms put forth here, future research should delve more deeply into this process of institutionalization and explore the process by which both media organizations and media users embrace algorithmically driven decision-making tools. Such research could ideally explain both how and why “the delegation of personal autonomy or of trusteeship to materialized technical systems is, in many situations, preferred to handing them over to other humans” (Joerges & Czamiawska, 1998, p. 370). 
Some scholars have emphasized that future institutional research should shift away from the current focus on the outcomes or products of institutional influence and focus more on the processes via which institutions are established (Suddaby, 2010). Such a direction would seem particularly important during this period of what appears to be fairly rapid institutionalization of algorithmically driven decision-making in the media sector. We know little, at this point, for instance, about the organizational dynamics surrounding the adoption and usage of algorithmic tools in the media sector. Are there intraorganizational tensions and, if so, how are they being resolved? How are established professional norms, identities, and practices adapting? How are algorithmic tools becoming legitimized in organizational processes? Future research should also delve more deeply into the organizational dynamics of algorithmic development, deployment, and calibration. Institutional theory could prove useful in this regard in terms of illuminating the ways in which algorithms themselves are institutionally generated.

Greater attention to this subject may be able to revive an area of media scholarship that has, by some accounts, grown dormant. Some critics of the field of media sociology (from which the bulk of media institutions research has emerged) have contended that the field "largely stalled" after the many path-breaking institutional analyses of news organizations in the 1970s and 1980s (Kaplan, 2006, p. 173). While one could easily take issue with this characterization, it does seem reasonable to contend that an emphasis going forward on the "sociology of the algorithm" (Anderson, 2011c, p. 6) would represent a vital new direction for media sociology that reflects this important, emergent institutional context.

References

Abrutyn, S., & Turner, J. H. (2011). The old institutionalism meets the new institutionalism. Sociological Perspectives, 54(3), 283–306. doi:10.1525/sop.2011.54.3.283
Ahlkvist, J. A. (2001). Programming philosophies and the rationalization of music radio. Media, Culture & Society, 23(3), 339–358. doi:10.1177/016344301023003004
Alchian, A. A., & Demsetz, H. (1972). Production, information costs, and economic organization. American Economic Review, 62, 777–795. doi:10.2307/1815199
Anderson, C. (2006). The long tail: How the future of business is selling less of more. New York, NY: Hyperion.
Anderson, L. (2010). AOL and its algorithm. Columbia Journalism Review. Retrieved from http://www.cjr.org/feature/aol_and_its_algorithm.php?page=all
Anderson, C. W. (2011a). Deliberative, agonistic, and algorithmic audiences: Journalism's vision of its public in an age of audience transparency. International Journal of Communication, 5, 529–547.
Anderson, C. W. (2011b). Between creative and quantified audiences: Web metrics and changing patterns of newswork in local U.S. newsrooms. Journalism: Theory, Practice, and Criticism, 12(5), 550–566. doi:10.1177/1464884911402451
Anderson, C. W. (2011c, October). Understanding the role played by algorithms and computational practices in the collection, evaluation, presentation, and dissemination of journalistic evidence. Paper presented at the 1st Berlin Symposium on the Internet and Society, Berlin, Germany.
Anderson, C. W. (2013). Towards a sociology of computational and algorithmic journalism. New Media & Society, 15(7), 1005–1021. doi:10.1177/1461444812465137
Bakker, P. (2012). Aggregation, content farms, and Huffinization: The rise of low-pay and no-pay journalism. Journalism Practice, 6(5–6), 627–637. doi:10.1080/17512786.2012.667266
Beer, D. (2009). Power through the algorithm? Participatory web cultures and the technological unconscious. New Media & Society, 11(6), 985–1002. doi:10.1177/1461444809336551
Benjamin, S. M. (2013). Algorithms and speech. University of Pennsylvania Law Review, 161, 1445–1493.
Benson, R. (2006). New media as a "journalistic field": What Bourdieu adds to new institutionalism, and vice versa. Political Communication, 23, 187–202. doi:10.1080/10584600600629802
Berger, P. L., & Luckmann, T. (1966). The social construction of reality: A treatise on the sociology of knowledge. New York, NY: Anchor Books.
Bucher, T. (2012a). Want to be on top? Algorithmic power and the threat of invisibility on Facebook. New Media & Society, 14(7), 1164–1180. doi:10.1177/1461444812440159
Bucher, T. (2012b). A technicity of attention: How software "makes sense." Culture Machine, 13, 1–23.
Carr, D. (2011, March 20). The evolving mission of Google. The New York Times. Retrieved from http://www.nytimes.com/2011/03/21/business/media/21carr.html?_r=0
Carr, D. (2013, February 24). Giving viewers what they want. The New York Times. Retrieved from http://www.nytimes.com/2013/02/25/business/media/for-house-of-cards-using-big-data-to-guarantee-its-popularity.html?pagewanted=all&_r=0
Cho, J., & Roy, S. (2004, May). Impact of search engines on Web page popularity. Paper presented at the WWW2004 Conference, New York, NY.
Chu, Z., Gianvecchio, S., Wang, H., & Jajodia, S. (2011). Who is tweeting on Twitter: Human, bot, or cyborg? Proceedings of the 26th Annual Computer Security Applications Conference (pp. 21–30). New York, NY: Association for Computing Machinery.
Constant, E. W. (2012). The social locus of technological practice. In W. E. Bijker, T. P. Hughes, & T. Pinch (Eds.), The social construction of technological systems: New directions in the sociology and history of technology (Anniversary ed., pp. 217–236). Cambridge, MA: MIT Press.
Cook, T. E. (2005). Governing with the news: The news media as a political institution (2nd ed.). Chicago, IL: University of Chicago Press.
Cook, T. E. (2006). The news media as a political institution: Looking backward and looking forward. Political Communication, 23, 159–171. doi:10.1080/10584600600629711
Cosley, D., Lam, S. K., Albert, I., Konstan, J. A., & Riedl, J. (2003, April). Is seeing believing? How recommender system interfaces affect users' opinions. Paper presented at the CHI 2003 Conference, Fort Lauderdale, FL.
Couldry, N. (2008). Actor-network theory and media: Do they connect and on what terms? In A. Hepp, F. Krotz, S. Moores, & C. Winter (Eds.), Connectivity, networks and flows: Conceptualizing contemporary communications (pp. 93–111). Cresskill, NJ: Hampton Press.
Davenport, T. H., & Harris, J. G. (2009). What people want to know (and how to predict it). MIT Sloan Management Review, 50(2), 23–31.
Diakopoulos, N. (2014). Algorithmic accountability reporting: Investigation of black boxes (Tow/Knight Brief). New York, NY: Tow Center for Digital Journalism. Retrieved from http://towcenter.org/algorithmic-accountability-2/
DiMaggio, P. J., & Powell, W. W. (1991a). Introduction. In W. W. Powell & P. J. DiMaggio (Eds.), The new institutionalism in organizational analysis (pp. 1–38). Chicago, IL: University of Chicago Press.
DiMaggio, P. J., & Powell, W. W. (1991b). The iron cage revisited: Institutional isomorphism and collective rationality in organizational fields. In W. W. Powell & P. J. DiMaggio (Eds.), The new institutionalism in organizational analysis (pp. 63–82). Chicago, IL: University of Chicago Press.
Douglas, M. (1986). How institutions think. Syracuse, NY: Syracuse University Press.
Elgesem, D. (2008). Search engines and the public use of reason. Ethics and Information Technology, 10, 233–242. doi:10.1007/s10676-008-9177-3
Epstein, R., & Robertson, R. E. (2013, May). Democracy at risk: Manipulating search rankings can shift voting preferences substantially without voter awareness. Paper presented at the annual meeting of the Association for Psychological Science, Washington, DC.
Ettema, J. S., & Whitney, D. C. (1994). The money arrow: An introduction to audiencemaking. In J. S. Ettema & D. C. Whitney (Eds.), Audiencemaking: How the media create the audience (pp. 1–18). Thousand Oaks, CA: Sage.
Flanagin, A. J., Flanagin, C., & Flanagin, J. (2010). Technical code and the social construction of the Internet. New Media & Society, 12(2), 179–196. doi:10.1177/1461444809341391
Galperin, H. (2004). Beyond interests, ideas, and technology: An institutional approach to communication and information policy. The Information Society, 20, 159–168. doi:10.1080/01972240490456818
Giddens, A. (1984). The constitution of society: Outline of the theory of structuration. Berkeley, CA: University of California Press.
Gillespie, T. (2011, October). Can an algorithm be wrong? Limn, issue 2. Retrieved from http://limn.it/can-an-algorithm-be-wrong/
Gillespie, T. (2014). The relevance of algorithms. In T. Gillespie, P. Boczkowski, & K. Foot (Eds.), Media technologies (pp. 167–194). Cambridge, MA: MIT Press.
Gladwell, M. (2006, October 16). The formula: What if you built a machine to predict hit movies? The New Yorker. Retrieved from http://www.newyorker.com/archive/2006/10/16/061016fa_fact6
Goldman, E. (2006). Search engine bias and the demise of search engine utopianism. Yale Journal of Law & Technology, 9, 111–123. doi:10.1007/978-3-540-75829-7
Grimmelman, J. (2008/2009). The Google dilemma. New York Law School Law Review, 53, 939–950.
Guzman, T. (2005). The little theatre movement: The institutionalization of the European art film in America. Film History, 17, 261–284. doi:10.1353/fih.2005.0020
Halavais, A. (2009). Search engine society. Cambridge, England: Polity Press.
Hindman, M. (2009). The myth of digital democracy. Princeton, NJ: Princeton University Press.
Hrynshyn, D. (2008). Globalization, nationality and commodification: The politics of the social construction of the Internet. New Media & Society, 10(5), 751–770. doi:10.1177/1461444808094355
Hughes, T. P. (2012). The evolution of large technological systems. In W. E. Bijker, T. P. Hughes, & T. Pinch (Eds.), The social construction of technological systems: New directions in the sociology and history of technology (Anniversary ed., pp. 45–76). Cambridge, MA: MIT Press.
Ingram, M. (2013, December 9). The scorpion and the frog: Who wins as Facebook makes tweaks to its newsfeed algorithms. Gigaom. Retrieved from http://gigaom.com/2013/12/09/the-scorpion-and-the-frog-who-wins-as-facebook-makes-tweaks-to-its-newsfeed-algorithms/
Jepperson, R. L. (1991). Institutions, institutional effects, and institutionalism. In W. W. Powell & P. J. DiMaggio (Eds.), The new institutionalism in organizational analysis (pp. 143–163). Chicago, IL: University of Chicago Press.
Jiang, M. (2014). The business and politics of search engines: A comparative study of Baidu and Google's search results of Internet events in China. New Media & Society, 16(2), 212–233. doi:10.1177/1461444813481196
Joerges, B., & Czamiawska, B. (1998). The question of technology, or how organizations inscribe the world. Organization Studies, 19(3), 363–385. doi:10.1177/017084069801900301
Jones, J. (2012, October). Creating networks through search: PageRank, algorithmic truth, and tracing the Web. Paper presented at the annual meeting of the Association of Internet Researchers, Salford, England.
Kaplan, R. L. (2006). The news about new institutionalism: Journalism's ethic of objectivity and its political origins. Political Communication, 23, 173–185. doi:10.1080/10584600600629737
Katzenbach, C. (2011). Technologies as institutions: Rethinking the role of technology in media governance constellations. In M. Puppis & M. Just (Eds.), Trends in communication policy research (pp. 117–138). Bristol, England: Intellect.
Keating, G. (2012). Netflixed: The epic battle for America's eyeballs. New York, NY: Portfolio.
Kim, J. (2012). The institutionalization of YouTube: From user-generated content to professionally generated content. Media, Culture & Society, 34(1), 53–67. doi:10.1177/0163443711427199
La Torre, M. (2010). Law as institution. Law & Philosophy Library, 90, 97–134.
Laidlaw, E. B. (2008). Private power, public interest: An examination of search engine accountability. International Journal of Law & Information Technology, 17(1), 113–145. doi:10.1093/ijlit/ean018
Langlois, G., Elmer, G., McKelvey, F., & Devereaux, Z. (2009). Networked publics: The double articulation of code and politics on Facebook. Canadian Journal of Communication, 34, 415–434.
Latour, B. (2005). Reassembling the social: An introduction to actor network theory. Oxford, England: Oxford University Press.
Leonard, A. (2013, February 1). How Netflix is turning viewers into puppets. Salon. Retrieved from http://www.salon.com/2013/02/01/how_netflix_is_turning_viewers_into_puppets/
Leonardi, P. M. (2012). Materiality, sociomateriality, and socio-technical systems: What do these terms mean? How are they different? Do we need them? In P. M. Leonardi, B. A. Nardi, & J. Kallinikos (Eds.), Materiality and organizing: Social interaction in a technological world (pp. 25–48). New York, NY: Oxford University Press.
Lessig, L. (2006). Code and other laws of cyberspace 2.0. New York, NY: Basic Books.
Lohr, S. (2011, September 10). In case you wondered, a real human wrote this. The New York Times. Retrieved from http://www.nytimes.com/2011/09/11/business/computer-generated-articles-are-gaining-traction.html?pagewanted=all&_r=0
Lowrey, W. (2011). Institutionalism, news organizations and innovation. Journalism Studies, 12(1), 64–79. doi:10.1080/1461670X.2010.511954
Lowrey, W., & Woo, C. W. (2010). The news organization in uncertain times: Business or institution? Journalism & Mass Communication Quarterly, 87(1), 41–61. doi:10.1177/107769901008700103
Mager, A. (2012). Algorithmic ideology. Information, Communication & Society, 15(5), 769–787. doi:10.1080/1369118X.2012.676056
Manovich, L. (2013). Software takes command. New York, NY: Bloomsbury.
Mayer-Schonberger, V., & Cukier, K. (2013). Big data: A revolution that will transform how we live, work, and think. New York, NY: Houghton Mifflin Harcourt.
McMains, A. (2012, March 27). Is Facebook a media or tech company? Adweek. Retrieved from http://www.adweek.com/news/advertising-branding/facebook-media-or-tech-company-139242
Moe, T. (1990). Political institutions: The neglected side of the story. Journal of Law, Economics, and Organizations, 6, 213–253. doi:10.1093/jleo/6.special_issue.213
Moe, H., & Syvertsen, T. (2007). Media institutions as a research field. Nordicom Review, 28, 149–167.
MOZ (2013). Google algorithm change history. Retrieved from http://moz.com/google-algorithm-change
Napoli, P. M. (1997). A principal-agent approach to the study of media organizations: Toward a theory of the media firm. Political Communication, 14(2), 207–219. doi:10.1080/105846097199443
Napoli, P. M. (2001). Foundations of communications policy: Principles and process in the regulation of electronic media. Cresskill, NJ: Hampton Press.
Napoli, P. M. (2003). Audience economics: Media institutions and the audience marketplace. New York, NY: Columbia University Press.
Napoli, P. M. (2009). Navigating producer-consumer convergence: Media policy priorities in the era of user-generated and user-distributed content. Communications & Convergence Review, 1(1), 32–43.
Napoli, P. M. (2011). Audience evolution: New technologies and the transformation of media audiences. New York, NY: Columbia University Press.
North, D. (1981). Structure and change in economic history. New York, NY: Norton.
Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. New York, NY: Penguin.
Pasquale, F. (2010). Dominant search engines: An essential cultural & political facility. In B. Szoka & A. Marcus (Eds.), The next digital decade: Essays on the future of the Internet (pp. 401–418). Washington, DC: TechFreedom.
Phillips, N., & Malhotra, N. (2008). Taking social construction seriously: Extending the discursive approach in institutional theory. In R. Greenwood et al. (Eds.), The SAGE handbook of organizational institutionalism (pp. 702–719). Thousand Oaks, CA: Sage.
Pierce, D. (2013, October 30). Barnes & Noble’s new Nook GlowLight is faster, lighter, and full of ideas. The Verge. Retrieved from http://www.theverge.com/2013/10/30/5044378/barnes-nobles-new-nook-glowlight-lighter-faster-full-of-ideas
Plesner, U. (2009). An actor-network theory perspective on changing work practices: Communication technologies as actants in newswork. Journalism, 10(5), 604–626. doi:10.1177/1464884909106535
Rogers, R. (2004). Information politics on the Web. Cambridge, MA: MIT Press.
Roth, D. (2009, November). The answer factory: Demand Media and the fast, disposable, and profitable as hell media model. Wired. Retrieved from http://www.wired.com/magazine/2009/10/ff_demandmedia
Schudson, M. (2002). The news media as political institutions. Annual Review of Political Science, 5, 249–269. doi:10.1146/annurev.polisci.5.111201.115816
Scott, W. R. (2008). Institutions and organizations: Ideas and interests (3rd ed.). Los Angeles, CA: Sage.
Scott, S. V., & Orlikowski, W. J. (2012). Great expectations: The materiality of commensurability in social media. In P. M. Leonardi, B. A. Nardi, & J. Kallinikos (Eds.), Materiality and organizing: Social interaction in a technological world (pp. 113–133). New York, NY: Oxford University Press.
Sparrow, B. H. (1999). Uncertain guardians: The news media as a political institution. Baltimore, MD: Johns Hopkins University Press.
Sparrow, B. H. (2006). A research agenda for institutional media. Political Communication, 23, 145–157. doi:10.1080/10584600600629695
Steiner, C. (2012). Automate this: How algorithms came to rule our world. New York, NY: Portfolio.
Suddaby, R. (2010). Challenges for institutional theory. Journal of Management Inquiry, 19(1), 14–20. doi:10.1177/1056492609347564
Tartakoff, J. (2010, August 17). AOL’s Patch aims to quintuple size by year-end. paidContent. Retrieved from http://paidcontent.org/2010/08/17/419-aols-patch-aims-to-quintuple-in-size-by-year-end/
Tartakoff, J. (2011, February 24). In latest anti-“content farm” move, Google changes its search algorithm. paidContent. Retrieved from http://paidcontent.org/2011/02/25/419-in-latest-anti-content-farm-move-google-changes-its-algorithm/
Ulanoff, L. (2014, January 30). Facebook Paper is content – but don’t call Facebook a media company. Mashable. Retrieved from http://mashable.com/2014/01/30/facebook-paper-app-analysis/
Ullman, E. (1997). Close to the machine: Technophilia and its discontents. San Francisco, CA: City Lights Books.
Uricchio, W. (2011). The algorithmic turn: Photosynth, augmented reality and the changing implications of the image. Visual Studies, 26(1), 25–35. doi:10.1080/1472586X.2011.548486
Vaidhyanathan, S. (2011). The Googlization of everything (and why we should worry). Berkeley, CA: University of California Press.
Van Couvering, E. (2007). Is relevance relevant? Market, science, and war: Discourses of search engine quality. Journal of Computer-Mediated Communication, 12, 866–887. doi:10.1111/j.1083-6101.2007.00354.x
Webster, J. G. (2010). User information regimes: How social media shape patterns of consumption. Northwestern University Law Review, 104(2), 593–612.
Webster, J. G. (2011). The duality of media: A structurational theory of public attention. Communication Theory, 21, 43–66. doi:10.1111/j.1468-2885.2010.01375.x
Wu, T. (2013). Machine speech. University of Pennsylvania Law Review, 161, 1495–1533.
Zucker, L. G. (1977). The role of institutionalization in cultural persistence. American Sociological Review, 42(5), 726–743. doi:10.2307/2094862
© 2014 International Communication Association
TI - Automated Media: An Institutional Theory Perspective on Algorithmic Media Production and Consumption JF - Communication Theory DO - 10.1111/comt.12039 DA - 2014-08-01 UR - https://www.deepdyve.com/lp/oxford-university-press/automated-media-an-institutional-theory-perspective-on-algorithmic-02XMXWBsOa SP - 340 VL - 24 IS - 3 DP - DeepDyve ER -