
Chapter Seven

Computationalism, Striation, and Cultural Authority

Mass computerization is part of a complex world-historical politics in which reciprocal desires to see the world as computable and to see computer technology as an ultimate achievement of modernity walk hand-in-hand. It is no accident that globalization includes a massive migration of rural minorities toward cosmopolitan centers, and it is no accident that this migration is most typically accompanied by an explicit embrace of computer technology—which is said, it is critical to remember, to be a great supporter of distributed knowledge and productivity—and a loss of (and often concomitant disparaging of) minority languages and cultures. Such metropolitan migrants are among the most forceful in suggesting that this movement needs to be seen not as a loss, but rather as the rightful attainment of modern opportunity that can only be realized in majority languages and technologies.

It cannot and should not be a goal of critical and political reading to deny anyone the right to change their life, to seek economic opportunity, to learn new languages and cultures. Yet at the same time it cannot be the job of criticism simply to allow the surface articulations of such movements to stand of and for themselves, if and when there are also underlying political and historical forces at work. To the contrary, only by detailing exactly the nature of these forces do we hold out any hope of managing them. Our received histories of transitions that are accepted as transparent in the public ideology—the Early Modern transition to print culture, for example, or the colonial annexation of much of the world not already part of major empires—have been shown with a certain amount of conclusiveness to be both more contingent and more politically shaped than we are (or can be) taught in school. Yet when it comes to our own cultural frame, we are often content to accept the presentist extension of exactly this ideological picture.

From both internal and external perspectives, computers demonstrate their overweening commitment to striation as a grounding cultural principle. Despite the insistence of neoliberal writers on the liberatory potential of computers, the computer investment in striation is worn quite plainly on its sleeve, perhaps hidden in plain sight. While we must often look to not-yet-developed paradigms and marginal practices for signs of what looks like a return of nomadicity or smoothness, striation is everywhere in computers, at levels from the smallest to the largest. The process of what Armand Mattelart calls “networking the world” (2000) is itself a massive and underappreciated act of striation. After all, in principle, the world was always networked, by any number of analog means; it was then further networked by analog electronic technologies such as land phones, the telegraph, radio, and so on. Today one can scarcely avoid proclamations that the world is being networked as if it never had been before, as if global communication is now possible and had always before been impossible; as if we had always previously been perfectly isolated individuals and computer technology now allows us, finally, to speak with each other.

Of course this is highly odd, since nothing of the sort is true, even if computer technology does enable an acceleration of worldwide communication. Yet perhaps even more than communication itself, what computerized networking entails is the pinpoint location of each object and individual in a worldwide grid. The cellphone, a vanguard technology of computerization although not always recognized as such, relies on digital technologies such as packet-switching to transfer partial messages from point to point. We are told that cellphones free us to move anywhere we wish, like nomads, no longer tethered to a central location; no doubt there is some truth to this. But at the same time the cellphone itself, precisely demarcated via a numeric identity akin to the Internet’s IP number, becomes an inescapable marker of personal location, so much so that, far more often than with land-line phones, cellphone users are routinely asked why, at any hour of the day or night, they failed to answer their phone—as if the responsibility for such communications lies with the recipient and not with the maker of the call. If I call you I expect (rather than hope) that you will answer. What excuse is there, after all, to be “away from” a device that is always in one’s company?

In this example, a previously smooth space—the space from which one felt free not to answer the phone, to be away from the phone, perhaps even to use the inevitable periods of unavailability as a means to use space and time for one’s own ends—now becomes striated, and in this sense visible only as its striation becomes visible. This is an important characteristic of striation, in that we do not simply have “smooth” or “free” spaces that we resist letting be incorporated into striation—rather, we typically rely on the smoothness of space (and time) not even recognized as such for important parts of culture and personal practice. This smoothness, as with the “lack of a toothache” in Buddhism, becomes apparent only when it is removed. For some of us, striation comes on without recognition at all—thus we have had for years cellphone advertisements that present as a virtue the ability for these devices to keep one in constant contact with one’s employer, as if this were an obvious virtue for all persons rather than for employers. It is beside the point that some employees enjoy this relatively unmediated contact with their jobs—except at the highest level, this sort of contact is largely uncompensated and, over time, extremely stressful for the individual worker.

Spreadsheets, Projects, and Material Striation

Among the exemplary tools of computational striation is one that has long been considered among the personal computer’s most important applications, namely, the spreadsheet. Conceptually, a spreadsheet is somewhere between a word processor, which is generally capable of creating ordered tables of information, and a true database, which can perform extensive computations on stored data. In practice, the majority of word processors and other text-processing applications, including ones for the web, rely heavily on a table model that is not unlike the basic table at the heart of spreadsheets; database applications consist of tables and relations between them. As such, it is slightly inaccurate to think of these three main, everyday computer applications as entirely different in kind; it might be more accurate to think of them, along with other applications that are currently part of the Microsoft Office suite, as similar bundles of capabilities in which different aspects are brought to the foreground. This is one reason why it was crucial to Microsoft to own the entire suite of what are now Office applications, in that the potential always exists for one way of looking at the capabilities to swamp the others unexpectedly.
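
The kinship among these applications is easy to make concrete. The following sketch (in Python, with invented figures, and with no relation to any vendor’s actual implementation) renders one small dataset three ways: as an inert word-processor table, as a spreadsheet grid in which certain cells hold formulas computed from other cells, and as a relation to be queried. The data are the same throughout; only the foregrounded capability differs.

```python
# Hypothetical illustration: one dataset, three "bundles of capabilities."

# 1. Word-processor table: an inert grid of text; layout is foregrounded.
doc_table = [
    ["Quarter", "Revenue", "Costs"],
    ["Q1", "1000", "800"],
    ["Q2", "1200", "900"],
]

# 2. Spreadsheet: the same grid, but some cells hold formulas recomputed
#    from other cells; computation is foregrounded.
cells = {"B2": 1000, "C2": 800, "B3": 1200, "C3": 900}
formulas = {
    "D2": lambda c: c["B2"] - c["C2"],  # profit, Q1
    "D3": lambda c: c["B3"] - c["C3"],  # profit, Q2
}

def evaluate(ref):
    return formulas[ref](cells) if ref in formulas else cells[ref]

# 3. Database: the same data held as a relation and queried rather than
#    laid out; storage and retrieval are foregrounded.
quarters = [("Q1", 1000, 800), ("Q2", 1200, 900)]
profitable = [q for (q, rev, cost) in quarters if rev - cost > 250]

print(doc_table[1])                    # ['Q1', '1000', '800']
print(evaluate("D2"), evaluate("D3"))  # 200 300
print(profitable)                      # ['Q2']
```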

Spreadsheets are a chief example of computational thinking and of striation because they existed before computers proper; they became nearly ubiquitous with the rise of business and personal computing and eventually came to redefine significant aspects of business and personal conduct; they also take extensive advantage of real computational resources. Spreadsheets are one of the chief computational applications to walk hand-in-hand with scientific management (see Liu 2004 for a thorough discussion of this connection) and to remake the world in a computerized image, while passing largely unnoticed in the public eye. Today nearly all corporate bodies are organized in large part according to the logic of spreadsheets, including educational institutions and other nonprofit entities. Along with databases and project management software, spreadsheets make possible and encourage the capitalization of almost all resources, far beyond the ordered capitalization of physical labor that Taylor studied so carefully.

In theory, a spreadsheet is nothing but a table: a simple, ordered document composed of rows and columns. The canonical spreadsheet application is one familiar to any corporate employee and most investors: the balance sheet. Balance sheets existed long before computers, but it is a fundamental mistake to see them as a mere accounting artifact of interest only to financial professionals inside of companies. No doubt there was some truth to such an understanding before the widespread use of computers, when the terminology and, perhaps more importantly, the data underlying balance sheets were inaccessible to all but finance professionals and perhaps the president and CEO of a company. In those days, the data for balance sheets could be collected only with great effort, close to the SEC filing deadline, and were of use only in the highest-level decision making.

Balance sheets (and other financial tools like income statements, annual reports, etc.) and the computer were more than made for each other: they are computational devices that existed long before computers became physical objects. With the advent of computers, the thinking behind balance sheets could be widely expanded and implemented at every level of corporations and other organizations, rather than existing as esoteric tools for only the initiated. Today, to be the manager of a segment of any size within a modern corporation means essentially, at least in part, to manage a spreadsheet. This spreadsheet is not merely an abstraction used for reporting purposes; it is a literal device that is connected, usually by direct networking, into the master spreadsheets used by financial employees to assemble the organization’s official balance sheet. While the information in the manager-level spreadsheet might, in general, be kept hidden from subordinates, it is in some ways more real than the persons employed by the company. It frequently includes information of vital importance to the employees, for example productivity rates, employee costs, costs vs. revenue generation, and so on, and yet is in general considered part of “management information” to which only manager-level employees are entitled access.
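
Computationally, the “literal device” described here is little more than hierarchical aggregation. A minimal sketch, with hypothetical departments and figures, of how manager-level sheets roll up into a master sheet in which employees survive only as a salary line:

```python
# Minimal sketch of spreadsheet roll-up; all departments and figures invented.
from collections import defaultdict

# Each manager maintains a departmental sheet of line items.
department_sheets = {
    "engineering": {"salaries": -480_000, "equipment": -60_000, "revenue": 900_000},
    "support":     {"salaries": -220_000, "revenue": 150_000},
}

def roll_up(sheets):
    """Aggregate every departmental line item into the master sheet."""
    master = defaultdict(int)
    for sheet in sheets.values():
        for line, amount in sheet.items():
            master[line] += amount
    return dict(master)

# In the master sheet, persons appear only as the aggregate "salaries" line.
print(roll_up(department_sheets))
# {'salaries': -700000, 'equipment': -60000, 'revenue': 1050000}
```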

When managers meet for regular supervisory duties with their managers, it is the spreadsheet reality and not the everyday physical activity of employees that is the most “real” object of discussion. Here hierarchy and striation are clearly visible: whatever the corporation may say with regard to labor law and other constraints about who is “management” and who is a “worker,” it is when spreadsheets become the focus of discussion that it becomes clear who is part of the management and ownership of a corporation and who works for it. This is among the brightest lines in contemporary corporate life, and it raises profound questions, not under direct consideration here, about the nature of the contemporary corporation and its political and economic power in our social lives. Often, the managers and owners of a company, who may themselves constitute less than 1 percent of the total number of employees of a given company, see the reality portrayed by and manipulated via spreadsheets as the “actual” reality of the company, and the rest of the corporate activities as a kind of epiphenomenon connected to the spreadsheet reality only via metaphoric and metonymic processes. These extensions are much more ephemeral and expendable than most employees themselves may understand, and they may be distributed widely across the globe: not just because the work of a company can be shipped all over the world via computer networks, but because actual work is now a kind of extension of the company’s central mission, which is exclusively visible via the company’s spreadsheet reality.

The spreadsheet reality is profoundly striated: it is arguably just the application of striation to what had previously been comparatively smooth operations and spaces. The spreadsheet encourages everyone in management to think of the parts of a company as “resources” that can be understood as “assets” or “liabilities”; each human being is more liability than asset, regardless of rhetoric about “human assets” or “human resources.” One of the most interesting characteristics of spreadsheets is their division not just of space but of time into measurable and comparable units. The common quarterly balance sheet is always presented in terms not just of the current quarter of operations but of the prior quarter, and this information is of critical importance to investors and analysts who follow each particular company. What has happened to revenues, costs, and inventory over the recent quarters? Are they increasing or decreasing? Companies almost always want to be in a position where revenues, sales, and assets are increasing, and where costs are not increasing unless justified by new products or services that will result in even greater revenues. From this view the human activity of employees—which in some cases can represent the bulk of their active hours for years—appears as little more than distant decimal places in calculations rarely present on the “summary” or “master” financial statements.
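
The temporal striation at work here is elementary arithmetic. A worked example, on invented quarterly figures, of the comparative logic investors and analysts apply:

```python
# Quarter-over-quarter comparison; all figures invented (in thousands).
quarterly = {
    "revenue":   [1_050, 1_120, 1_090],
    "costs":     [  830,   870,   910],
    "inventory": [  400,   380,   415],
}

for line, values in quarterly.items():
    # Each quarter is read only against the prior one.
    deltas = [cur - prev for prev, cur in zip(values, values[1:])]
    trend = "increasing" if deltas[-1] > 0 else "decreasing"
    print(f"{line}: {values} -> quarter-over-quarter {deltas} ({trend})")
```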

While spreadsheet thinking encourages a typically computational focus on a Platonic, numeric reality that in some ways supersedes experiential reality, it is not at all the only computational means used by contemporary organizations to striate and structure the everyday world. Within corporations and other large organizations, among the most significant means for managing human beings and their actions is a device loosely known as project-management software, most famously represented by an application available in the “professional” versions of Microsoft Office, namely, Microsoft Project. Microsoft Project is similar to a spreadsheet program, and in fact its functions can be and are performed with spreadsheets, but it has been adapted to the specific needs of “project management,” which can generally be understood as a euphemism for “human activity management.” While activities of many kinds can be and are tracked via project-management software, its chief purpose is to follow and track the actions of individuals, who are represented as computable symbols in the application. These symbols also represent “dependencies” and other relational concepts, providing the project manager with a visual representation of any sort of large-scale institutional activity.

Any sort of activity can be represented in a project management document, and this software is frequently used as a kind of large-scale to-do list for employees, but its main purpose is to provide project managers with computational, hierarchical, and striated authority over the human beings contributing to a given project and to the company as a whole. The project might be as small as a simple software upgrade or as large as the creation of an entirely new product. The project manager in such situations often occupies an interesting liminal position between management and employee, and the document he or she maintains is typically connected both to central corporate resources and to the spreadsheets that are management’s main concern. The manager and project manager may have supportive and interested meetings with employees at various levels, but often their main concern is with the “project reality” and its relationship to the “spreadsheet reality,” which is often enough their own main point of responsibility with regard to the company’s structure.
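
Underneath such software sits, in essence, a dependency graph. The sketch below (task names and assignees invented; the graph machinery is Python’s standard graphlib) shows how activity, reduced to labeled nodes and “dependencies,” yields a computed order of work. The point is not the modest algorithm but the ontology: persons appear only as labels attached to computable symbols.

```python
# Hypothetical project plan as a dependency graph.
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on.
project = {
    "design review":  {"draft spec"},
    "implementation": {"design review"},
    "documentation":  {"design review"},
    "QA pass":        {"implementation"},
    "release":        {"QA pass", "documentation"},
}

# Human beings enter the model only as strings attached to tasks.
assignees = {"draft spec": "A. Worker", "implementation": "B. Worker"}

# The software derives the order of activity from the graph alone.
print(list(TopologicalSorter(project).static_order()))
# e.g. ['draft spec', 'design review', 'implementation', 'documentation',
#       'QA pass', 'release']
```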

No doubt, project structuring via heavily striated applications like Microsoft Project contributes to efficiency and can aid employees in tracking their tasks; there is little doubt that certainty about what has been accomplished can contribute to a sense of employee well-being and to the documentation of services rendered to the corporate entity. But the main point of project management software is to provide the benefits of striating smooth space to those at the top of the hierarchy, and in some cases to abstract the capital and other resources out of human collectivities only insofar as they contribute to the numerical spreadsheet reality on which management must focus. As such, the facts of day-to-day life in the corporate entity become less and less the actual activities of human beings and more and more the facts (or actually symbols and figures) included in the project management document. At higher levels of management it is quite possible to fulfill corporate requirements well by interacting only with spreadsheets and project documents; all that is required in addition is to mask, via the human performance of lower-level managers, the degree to which this computational reality supersedes social experience.

In The Laws of Cool (2004), Alan Liu draws attention to the ways that Frederick Taylor’s system of “scientific management” was applied first to blue-collar physical work, especially in Henry Ford’s production plants (91-5), and then “migrated to the white-collar environment” (96) through fields like industrial psychology and employee testing, ultimately producing the famous “organization man” (Whyte 1957) of the 1950s whom Liu rightly associates with the first wave of comprehensive computerization in the form of industrial mainframes. In this environment, Liu writes, “there arose a . . . universe of information-for-information that manifested itself not just as more information but as a mental construct of that information: an ‘overview,’ ‘visible’ rendering, ‘voice,’ or ‘reflection.’ This construct was secondary and figurative relative to the ‘real’ work at hand . . . But secondary and figurative overlapped with primary and real” (108). “The definitive feature of the mainframe era,” Liu writes, “was precisely that it insisted that IT conform to organizational patterns optimized for the earlier automation paradigm.” Citing James Beniger, Liu suggests that “computerization in the mainframe era was the logical extrapolation of the apparatuses of ‘generalized control’ that originally fostered the great bureaucratic organizations of industrial society” (112).1 Liu’s goal is to account for the “culture of cool” surrounding computers that clearly did develop as early as the 1970s and which in modified form continues to this day. While it is no doubt accurate to see a change in the methods and implementation of computation as a society-wide technology in the various decades since the 1950s, it is nevertheless clear that the kinds of organizational control and global surveillance that were so obvious in personnel management then may have been pushed out of the spotlight, but have also continued to be developed unabated. Those structures of control are no less effective for appearing less personalized and rigid on the surface than they did in the 1950s: to the contrary, I am arguing that they are more effective and perhaps more insidious precisely because they appear to be more individualized today (see Zuboff 1988 for a few hints along similar lines).

An organization fueled by the use of Microsoft Project and other similar tools is not necessarily the one best positioned for corporate success. Rather, the widespread use of Microsoft Project reveals a thoroughgoing belief in computationalism, showing how deeply the belief in chess-style striation can filter down to the level of nearly every employee of the contemporary organization. It is by now familiar that software tools enable doctors, lawyers, and other professionals to track precisely how each minute of each day is spent, to whom the time can be billed, and at what rate. Subjectively, such precise oversight of every moment of every day seems likely to produce high levels of stress and paranoia, and it threatens to overtake the relatively “smooth” space of the apparently “self-scheduling” mid-level office employee with a thorough striation according to which each moment of time must be accurately documented, perhaps even via automated logging tools. Subjectively, again, the response of most employees to the sight of their time so precisely scheduled, tied to project income and expense, “rolled up” into global pictures of corporate finance and resource management, can only be understood as profoundly dehumanizing in just the way that the most stringent Fordist management of physical labor is. But the effects of global surveillance via Fordist shop management (as distinguished from the theoretically more benign intent of Taylor’s original plans) were in part countered by organized labor and its collective revolt against such rigid supervisory regimes. Today’s computerized methods of employee tracking and surveillance are both more effective and less visible, and they have so far proved less amenable to organized resistance, despite the apparent computational knowledge and interest of the people whose work is so closely monitored and controlled.
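
The billing striation just described can be sketched in a few lines; the entries, clients, and rates below are invented. Every interval of the day is assigned to a client at a rate and rolled up into revenue, and any unassigned minute shows up, implicitly, as an inefficiency:

```python
# Minute-level billing roll-up; all data hypothetical.
time_log = [  # (employee, client, minutes)
    ("associate_1", "client_a", 90),
    ("associate_1", "client_b", 45),
    ("associate_2", "client_a", 240),
]
rates_per_hour = {"client_a": 300, "client_b": 450}

invoices = {}
for employee, client, minutes in time_log:
    billed = minutes / 60 * rates_per_hour[client]
    invoices[client] = invoices.get(client, 0) + billed

print(invoices)  # {'client_a': 1650.0, 'client_b': 337.5}
```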

In this sense, it is a triumph of what Liu calls the “culture of cool” that computational employees can accept, and in some cases help to implement, the tools and methods that contribute to extremely high-stress work environments, severe productivity demands, and real-time, invasive surveillance of work activities (and personal activities conducted at or during work-time). It should be no surprise that, as tools become more sophisticated and as organized labor is increasingly disempowered through a variety of methods—including the prevalent view in white-collar firms that all employees are salaried “management” and therefore not subject to hourly and weekly work limits—the effective methods developed by Ford in particular should return to the workplace. What is perhaps surprising is that many workers would come to accept, implement, defend, and support the methods that are used to restrict, manage, and in some cases automate and outsource their own employment.

Tools for Authority

One area of computing that has so far remained outside of critical attention is the proliferation of engineering protocols which are used by corporations for the implementation of large-scale systems (large both quantitatively and geographically). The most well-known of these are called “Enterprise Resource Planning” (ERP) and “Customer Relationship Management” (CRM), though these are only two examples of a widespread approach in business computing. Some of the best-known software manufacturers in the history of commercial computing, including SAP, BAAN, Microsoft, Oracle, PeopleSoft, Computer Associates, and others, have derived much of their income from these protocols. They have been explicitly developed and implemented for the purpose of defining and controlling the system of social actions and actors; they constitute a sizable fraction of the work taken up by the thousands of engineering students produced by today’s institutions of higher education. They are the “skills” that computer scientists and engineers develop to “use” in the world at large, and they are always construed and described as such. They are “tools,” supposedly free of ideological weight, and so far largely free from critical scrutiny. Like other such phenomena, on reflection they turn out to be among the most precise and deliberate structures of political definition and control, and they remain so far largely outside of the purview of cultural interpretation and understanding.

ERP is one name for a system of strategies in business computing that began to be explicitly articulated in the 1980s, though their roots extend back further than that. A perspective from inside industry shows the clear connections of ERP-style technologies not to information processing as much as to trends that we now identify as forming the so-called Industrial Revolution itself:

Instead of being responsible for and capable of building an entire product from start to finish, the Industrial Revolution spawned the division of work into specific defined tasks. The division of tasks (division of labor) led to greater efficiency, productivity, and output—the ultimate aim of the Industrial Revolution. This, in turn, led to specialization in different areas, such as machine operators, assemblers, supervisors, industrial engineers, and so on. Whether this specialization led to the discipline of inventory and materials control is not clear, but we can assume that certain people were allocated the task of purchasing material for the business, and others were employed to sell the finished product. . . . This logic of breaking down the responsibility for production between different functional areas is evident in ERP systems, even though their objective is to integrate all operations and support more efficient sharing of data about business processes. (O’Gorman 2004, 26)

Computing does not merely enable accountants to keep better records of monetary transactions; it provides whole new levels and kinds of power and control over money. This power and control, as businesses almost immediately realized, could be extended to every aspect of running a business; today, ERP vendors see

“backoffice” functions (such as operations, logistics, finance, and human resources) and “nontransaction-based systems” or “front-office” functions (such as sales, marketing, and customer service), as integral components of ERP systems. These inclusions result from the emergence of Supply Chain Optimization (SCO), or Supply-Chain Management (SCM) and Customer Relationship Management (CRM) strategies and systems. . . This “beyond the corporate walls integration” [is sometimes referred to] as extreme integration. (Sammon and Adam 2004, 7)

“Enterprise Resource Planning” sounds vague until one realizes that “Enterprise” means just “organization,” especially “big organization,” and “resource” refers to every constituent of the organization, whatever its real-world label (a pattern that is reflective of object-oriented programming approaches more generally; see Golumbia 2001); ERP thus refers to software designed to allow business executives and IT managers to subject every aspect of their organization to computerized control.

By computerized control we are referring not to simple recording and tracking, but just as much to the development of control and planning structures that would not be available to the business without the kind of large-scale planning and allocation made possible by computers. The kinds of constituents that can be identified by computers are to greater and lesser degrees formalized; they are “business models,” abstractions from the real world that allow the computer to manage them as representations; and the strategies suggested by computation are those that the computer can most directly provide. ERP business process analyses look for so-called “inefficiencies” in the system, finding places, for example, where a resource is sitting idle when it could be doing work. We know that “profit” will never fail to be included as the primary value toward which the system is skewed. In some systems of discourse, cultural studies has uncovered what seem to be covert marks of orientations toward capital; in ERP systems these orientations are explicit.
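
Both moves can be shown schematically. In the sketch below (classes, figures, and the utilization target are all invented, and no actual ERP package is represented), heterogeneous constituents, human and otherwise, are first modeled as instances of a generic “resource,” and the model is then scanned for exactly the kind of “inefficiency” described above:

```python
# Hypothetical "business model" of resources plus an inefficiency scan.
from dataclasses import dataclass

@dataclass
class Resource:
    label: str             # the real-world label is incidental to the model
    cost_per_week: float
    utilization: float     # fraction of capacity currently "doing work"

inventory = [
    Resource("milling machine", 2_000, 0.55),
    Resource("staff engineer",  3_500, 0.92),
    Resource("warehouse bay",   1_200, 0.30),
]

TARGET = 0.80  # invented utilization target
for r in inventory:
    if r.utilization < TARGET:
        idle_cost = r.cost_per_week * (TARGET - r.utilization)
        print(f"{r.label}: idle cost ~{idle_cost:.0f}/week")
# milling machine: idle cost ~500/week
# warehouse bay: idle cost ~600/week
```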

The inefficiencies exploited by technologies like ERP are almost strictly economic inefficiencies. Economic in this sense refers to the entire process of software representation: from modeling the real world as a series of economically valued objects, to assigning putative quantitative values to phenomena that, in the real world, seem not to have direct quantitative equivalents (this despite the fact that some features of the environment can be “construed” as consisting in quantifiable elements like food inputs). The world represented within the ERP infrastructure departs from our world precisely in that the unconscious, irrational, and largely interactive nature of social and personal life is replaced with explicit, conscious inputs. The maximization of these conscious inputs represents the maximization of efficient output—maximizing profit, or winning the game. (ERP is as much ideology as it is actual software, and the literature of ERP is replete with vendors selling their customers elaborate ERP packages that are ultimately rejected by the customer; see Sammon and Adam 2004.)

Most of the time, in the “real world,” one does not win at business in quite this way. In fact it is fairly hard to say just how one “wins” at business, or just what “winning” at business is. Is it owning the business, selling it for a profit, becoming rich? Working for many years and retiring? Is the business run in the interests of the managers, the Board of Directors, the shareholders, the workers, the customers? In many cases, it seems the ERP approach actually does attempt to answer these questions, by offering up enticing and increasingly abstract values for apparently quantifiable problems. Human values that are hard to quantify are, often as not, simply ignored. The movements of “shell corporations,” “tax shelters,” “A/R and A/P,” and “subsidiaries” can all be manipulated, but “human happiness” and “social well-being” must be actively imposed by human managers. “Sustainable business model,” “reasonable growth,” “nondestructive interaction with the environment”—all seem as well to be values that a business can pursue, but which are not likely to be default settings for enterprise software.

In recent models of enterprise-wide software, control and monitoring mechanisms move beyond the representable “resources” and toward a more loosely defined object known as the customer. The best-known post-ERP software paradigm is Customer Relationship Management (CRM), which is a marketing term for a suite of contact, maintenance, and knowledge software modules. This integrated suite of applications, which ranges from simple contact management functions (the names, addresses, phone numbers, etc., of customers) to highly focused personalization features, is designed “to gain efficiencies in direct mail campaigns and to communicate more effectively and relevantly with” consumers. The software helps companies to “develop dialogues” with consumers “based on their needs, encourage more loyalty and retention, and generate more financial success” (CPM Marketing Group 2001). “An important hallmark of a CRM system,” the same press release for this widely used software goes, “is its ability to track an organization’s outbound and inbound contacts with its customers in a closed-loop feedback system . . . These communication records are matched against individuals and households within the database to access activities, behaviors, and service utilization, and to calculate ROI” (ibid.).
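
The “closed-loop” arithmetic at issue is itself very simple. A minimal sketch, on invented campaign data, of outbound contacts matched against inbound responses and reduced to the single quotient of ROI:

```python
# Hypothetical closed-loop campaign accounting.
contacts = {"hh_01", "hh_02", "hh_03", "hh_04"}   # households contacted
responses = {"hh_02": 480.0, "hh_04": 150.0}      # matched purchases, by household

campaign_cost = 200.0
revenue = sum(amount for hh, amount in responses.items() if hh in contacts)
roi = (revenue - campaign_cost) / campaign_cost

print(f"response rate: {len(responses) / len(contacts):.0%}, ROI: {roi:.2f}")
# response rate: 50%, ROI: 2.15
```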

This is not the storage and maintenance of bare information, but the interpretation and subsequent management of information to maximize “return on investment” (ROI). By ROI we mean, simply, profit (despite some accounting mechanisms designed to make it seem slightly otherwise). We are looking to minimize our own resources while maximizing the appearance of providing resources to the consumer. Using CRM, we can “value data and its ability to predict future behavior”; “leverage value-based services to support targeting, cross-selling/up-selling other financial products” (whatever “vertical” we happen to be operating within), and develop additional ways to “track and measure results to calculate ROI” (Gauthier 2001). Thus the entirety of the company-customer encounter is finally reduced to its quantitative equivalents, the human being reduced to virtually nothing but an actor reading knowledge-based scripts. One can imagine being unable to determine whether one’s interlocutor is a live human being or a set of phrases taped from a live encounter—an odd if dramatic example of an apparently spontaneous Turing machine. What motivates such a machine, perhaps more purely than it can motivate a human being, is plain on the surface: “the data used to manage your customer relationships are always clean and up-to-date. . . . The result is a marketing system that boosts short-term marketing return and helps build customer relationships that improve long-term profitability” (Fair, Isaac n.d.).

Profitable customer management begins with knowing your customers. Yet few companies have the cross-channel knowledge required for consistent, personalized customer management and marketing decisions. With MarketSmart, information is collected at all your push and pull touchpoints—including email, direct mail, stores, inbound and outbound telemarketing, Web sites and kiosks. As a result, you now can have a complete picture of each customer’s behavior and preferences—a picture that will help drive profitability across all of your channels and product lines.

MarketSmart integrates powerful decisioning tools that help you create, test and execute strategies across your business. By combining Fair, Isaac’s gold-standard analytics with best-of-breed software tools, MarketSmart offers you unparalleled decisioning power. It also gives you the ability to analyze marketing results and integrate strategies across channels. Imagine knowing which elements of a multi-channel marketing campaign are effective and which aren’t. Or automatically triggering personalized Web pages and offers based on a visitor’s clickstream and purchase history. Or how about matching the right action to the right credit account at just the right time. You can with MarketSmart. (Fair, Isaac n.d.)

The corporate author of this product description, Fair, Isaac, is the country’s leading vendor of credit-scoring software and credit scores, an explicit formalization of the customer relationship built on just such mechanisms, setting the quantitative lives of most consumers beyond their apparent intentional control. The development of elaborated systems of social control such as personal and small business credit scoring represents the sacrifice of knowledge and control to quantitative formalizations. Who has given their consent to this control, and under what circumstances and with what understanding has this consent been given?

Nowhere are the effects of CRM more paradigmatically evident than in health care, especially in so-called managed care. The inefficiencies exploited by managed health care are precisely based on the use of ERP and CRM simulations and management systems. Today, nearly everyone in the United States knows at least something of what it is like to have one’s health managed by one or more of these systems—systems that respond not to comprehensible human processes but to abstract gaming simulations, where responses to our behavior are dictated not by precisely inefficient—unformalized—human systems, but by gaming conjectures spinning several “moves” ahead of us in a game—one structurally very similar to an RTS game—that we are often not aware we are playing. The response may be: take this medicine instead of that, which may in no way reflect the likelihood of the medicine to treat our problem, but rather may be an opening gambit against our requesting proper treatment for whatever underlying (possibly environmental) condition needs to be addressed. Unless one is prepared to treat the health care system as an “AI opponent” in the “game,” one is likely not going to receive maximal care for one’s state of bodily health, but rather a set number of resources dictated by the “AI opponent.”

There is no doubt that adding CRM to health care represents an extension of insurance. We all understand—don’t we—that insurance is the addition of statistical profit tables to pools of investment, advance payment for risk that either pays off handsomely or, sometimes, goes against the house. When a number of Illinois area hospitals implemented CRM systems, the “closed-loop feedback system” of communication enabled a range of “target campaign management solutions for each provider” (CPM Marketing Group 2001). “The campaigns focused on education and awareness, wellness, disease management and intervention in clinical areas such as cardiology, women’s services, orthopedics, neurology, pediatrics and senior services. To attain program goals, [the hospitals] used the CRM database and software to segment the audience for each campaign, to ensure messages reached the right individuals” (CPM Marketing Group 2001). The whole infrastructure presses on what is beyond awareness, instead of true dialogue with a human doctor we trust, who understands the range of our health issues and can present us with integrated information. Even if such a tool would be useful to a doctor, does advice that is predicated on “exploiting inefficiencies” by “segmenting markets” with the goal of “maximizing ROI”—does this advice really fall into the realm of “informed consent”?

CRM makes it possible for health care providers to focus on “bottom line benefits” by affecting the consumer’s “propensity to use External Service Providers” (Gauthier 2001). It helps the organization to exploit already existing strategies involved in “acquir[ing] and retain[ing] profitable customers.” In the best cases, of course, “patient-centric relationships are possible within an integrated systems, people, and process environment,” but in practice these multivariate goals are subordinated to ROI. In one case study, a “physical therapy practice uses CRM to accelerate orthopedic surgeon approval for continued care, and therapist productivity. A large physical therapy practice wanted to increase quality and consistency of clinical documentation to increase therapist capacity, referrals.” The solution involved “predefined fields and drop-down menus to simplify process, accuracy,” allowing “quicker approvals for orthopedic surgeons on modalities, progress and follow-on treatments; increased patient referrals” and “increased patient/therapist time, increasing practice revenues while maintaining quality treatment” (Gauthier 2001). In a second case study, a “large hospital seeks to target market customers for elective LASIK eye surgery/vision correction in highly competitive, non-differentiated market.” Here CRM “data mining software that uses legacy system patient data to identify high-propensity characteristics that will help predict future services behavior” allows “more efficient marketing campaigns and greater return on investment” (Gauthier 2001).

In a third and most telling case study presented by this CRM consultant, a “pharmaceutical company uses CRM to promote and create a community around its leading drug to build brand value and customer loyalty.” Here, an “Internet Community” is created “with focus on ‘trusted advisor’ role to insulate the brand from competing drugs”; this is because “customer relationship and retention represent primary lever of profitability,” because (despite the fact that) “educated patients depend on value of information, not just medication” (Gauthier 2001). So despite the fact that the information provided by the CRM system may be wrong—in other words, the competing product may in fact be more appropriate for any given individual’s condition—the CRM system is structured to maintain “retention.” But “targeted marketing campaigns” satisfy the “educated” consumer’s need for “information,” which must be of high value yet may nevertheless be false, or at least tailored to obviate those factors that might make a competitive product better.
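
The data-mining step in the LASIK case study above, in which “legacy system patient data” are scored for “high-propensity characteristics,” can be sketched as follows. The records and the linear scoring weights are invented; a deployed system would fit such weights statistically, but the reduction is the same: a patient becomes a score, and a score becomes a marketing decision.

```python
# Hypothetical propensity scoring over legacy patient records.
patients = [
    {"id": 1, "age": 52, "prior_elective": True,  "income_band": 3},
    {"id": 2, "age": 29, "prior_elective": False, "income_band": 1},
    {"id": 3, "age": 44, "prior_elective": True,  "income_band": 2},
]

def propensity(p):
    # Invented weights standing in for a fitted statistical model.
    return 0.02 * p["age"] + 0.5 * p["prior_elective"] + 0.15 * p["income_band"]

# Only the "high-propensity" segment receives the campaign.
targets = [p["id"] for p in patients if propensity(p) > 1.2]
print(targets)  # [1, 3]
```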

Understood as both a business model and a specific kind of software application, CRM is most often implicated in the kinds of business practices that are typically understood by the term globalization. CRM helps to implement a hard-and-fast division between the sovereign intelligence that runs organizations and the various intelligences that work for them. A small coterie of management typically has access to the setting of parameters in CRM software, while more employees use the software or help to maintain it. As such, CRM software (and its affiliates) has contributed to the Hobbesian picture of a corporation as run by a small, even oligarchical, group of princely leaders and a large, undifferentiated group of workers (whom labor laws have typically not yet recognized in this fashion).
