Wednesday, July 11, 2018

Reading :: A Civic Entrepreneur

A Civic Entrepreneur: The Life of Technology Visionary George Kozmetsky
By Monty Jones


George Kozmetsky (1917-2003) “is most widely known today for two accomplishments -- taking the early steps that propelled the business school at the University of Texas at Austin toward its current position as an internationally prominent institution, and playing a central role in the economic transformation of Austin from a sleepy college town in the mid-twentieth century to its present-day status as a center of high-technology research, development, and manufacturing” (p.3). In the second role, Kozmetsky founded the IC2 Institute and its unit, the Austin Technology Incubator, and played a pivotal role in bringing the Microelectronics and Computer Technology Corporation (MCC) to Austin in the early 1980s. This thick biography (461pp plus end matter) covers his entire life, from his birth as the child of Russian immigrants in Seattle to his death from ALS.

For my purposes, the most important parts of this story relate to Kozmetsky’s hand in Austin’s economic transformation. Kozmetsky was at one point one of the richest men in the US, thanks to his cofounding of Teledyne, and one unofficial reason that he was selected as Dean of UT’s business school was that the University expected him to be a donor in addition to courting donors. He did not disappoint on either count, but his personal donations often strategically advanced his own objectives. One example was the Institute for Constructive Capitalism, which he began planning in 1966 and established in 1977 (p.293), largely through his own donations. This “think and do tank,” which had “an emphasis on practical, hands-on economic development activities as well as academic research” (p.293), was meant to promote a constructive capitalism in line with Kozmetsky’s politically liberal views, involving collaboration among government, business, education, and labor (p.297). The term “capitalism” fell out of favor, so in 1985 the Institute received a new name: the IC2 Institute (named for innovation, creativity, and capital) (p.305).

IC2 evolved over the years. Originally, its goals were to “underwrite advanced research on issues such as determining the role of capitalism in society, encouraging business enterprises to contribute more toward solving societal problems and improving life, nurturing entrepreneurship and gaining a better understanding of the role of small business in capitalist society, and promoting better public understanding of business, including improving business education in public schools.” In the 1980s, its scope shifted “into a more proactive role of spurring economic development” (p.296). In the 1990s, Kozmetsky focused on technology commercialization pursued as a commitment shared by stakeholders in government, business, education, and labor (p.297). Kozmetsky remained director until 1995; after stepping down, he remained closely involved in its leadership (p.300).

IC2 had six characteristics that, Kozmetsky said, made it a unique research organization: (1) it dealt with unstructured problems; (2) it attacked those problems with interdisciplinary “think teams”; (3) it went beyond traditional business ed subjects; (4) it linked theory and practice via interdisciplinary conferences, initiatives, and “experienced practitioners”; (5) it tried new ways of solving problems while remaining in a university environment; (6) it transferred research results to other institutions (pp.301-302).

IC2 developed several innovations that were eventually transferred to the business school. One was Moot Corp, a business plan competition (p.314). Another was the Master’s Degree in Science and Technology Commercialization (MSTC), which began in 1996 and was one of the earliest degree programs of its type, and was moved to the business school in 2014 (p.382). (Note: The MSTC was one of the models for the HDO program.)

Kozmetsky wanted to build up the regional capacity for developing new firms, based in the philosophy that it’s better to create new firms than to “steal” existing ones (p.336). Among his efforts was the Center for Technology Venturing, a 1988 joint project between IC2 and the business school’s Bureau of Business Research (p.336). In 1989, this Center spawned the Austin Technology Incubator (p.337), which “provided companies with strategic advice, mentoring, financing, marketing, public relations assistance, employee benefit programs, and office and manufacturing space for as long as three years” (p.339). The Center for Technology Venturing also spawned the Texas Capital Network in 1989; “by 1994 it had expanded nationwide and changed its name to the Capital Network” (p.338).

Kozmetsky also “led a group of entrepreneurs in establishing the Austin Entrepreneurs’ Council” in 1991 (p.345), which in turn spawned the Austin Software Council, a group that in 1998 separated from IC2; in 2002 it changed its name to the Austin Technology Council (p.347).

IC2 also did work for NASA in 1993, laying the ground for NASA’s “nationwide network of technology transfer programs by the end of the century” (p.381).

As you’ll notice, I’ve mainly focused on the Austin parts of the Kozmetsky story. But the entire book is interesting reading, painting Kozmetsky as a driven, focused, generous, yet sometimes flawed person. He was a visionary strategist and he also excelled at the “retail” parts of being dean -- maintaining relations with local and national government agencies and external stakeholders. On the other hand, he was not always effective at the “management” parts of being dean, such as identifying and addressing faculty grievances.


If you’re interested in technology, entrepreneurship, leadership, technology transfer, technology commercialization -- or Austin -- definitely pick up this book.

Reading :: The Sustainability Edge

The Sustainability Edge: How to Drive Top-Line Growth with Triple-Bottom-Line Thinking
By Suhas Apte and Jagdish Sheth


This business book tackles the question: How do you build a business around sustainability? That is, rather than building a business around simply maximizing business profits and/or maximizing shareholder value, how do you maximize benefits for all stakeholders, and how do you do it as a source of competitive advantage (p.16)? As the authors state, “Today, the best companies are generating every form of value that matters: emotional, experiential, social, and financial. And they’re doing it for all their stakeholders, not because it’s ‘politically correct’ but because it’s the only path to long-term competitive advantage” (p.17).

This angle may remind my readers of the socially responsible capitalism of Kozmetsky or, more cynically, Boltanski and Chiapello’s argument that capitalism incorporates its critiques. In any case, the book is built around the “sustainability stakeholders framework,” which attends to “triple-bottom-line thinking”: direct impact (consumers, customers, employees), indirect impact (NGOs, governments, media), and enabler impact (suppliers, investors, communities) (p.25). Most of the book involves describing each kind of impact and each stakeholder, providing illustrations from major companies such as Clorox.

To be honest, most of the heavy lifting in the book is done by the framework described above (depicted at the beginning of most chapters) and the attendant Stakeholder Sustainability Audit in the appendix. The (singular) bottom line of the book is that companies will be more sustainable if they identify a way to balance the needs of all the listed stakeholders. In other words, this book could be easily summarized in an HBR article, but the illustrations make it easier to apply.

Should you read this book? If you’re trying to formulate a business model, I’d suggest skimming it.

Wednesday, June 27, 2018

Reading :: New Wealth

New Wealth: Commercialization of Science and Technology for Business and Economic Development
By George Kozmetsky, Frederick Williams, and Victoria Williams


George Kozmetsky was dean of the School of Business at the University of Texas as well as the founder and first director of the IC2 Institute. Since I've been working with IC2 and specifically with technology commercialization, I thought I'd better pick up this 2004 book.

Kozmetsky was an enthusiastic promoter of the development of socially responsible capitalism. In this book, he and the other two authors describe a research agenda for understanding technology-based enterprise creation, "with the initial goal of identifying those variables apparently critical in the creation of businesses where success was based on the commercialization of technologies, application or both" (p.13). Their research, they say, confirmed:

  • technology as a type of wealth, one that may need new measurements
  • the need for technology policy
  • the interactions among markets, no one of which is wholly insulated from others
  • the need for effective management and entrepreneurial training
  • technology transfer as a process
  • fast-company design
  • new management strategies
  • the consequent need for enhanced applied research (p.14)
They list policy implications (pp.14-15), which amount to finding ways to encourage technological activities through public policy that appropriately harnesses private talent and enterprise.

Throughout the book, they discuss relevant concepts, often drawing from other IC2 and IC2-adjacent publications. For instance, Chapter 8 is about creating the technopolis; it summarizes the insights from Smilor, Kozmetsky, and Gibson as well as Gibson and Rogers. Chapter 9, Adoption of Innovations, treads the same ground as Rogers.

They also clarify some pieces that I haven't seen discussed elsewhere. For instance, they succinctly summarize factors of technology commercialization: 
  1. "Technology is a constantly replenishable national resource."
  2. "Technology generates wealth, which in turn is the key to economic, social, and political power."
  3. "Technology is a prime factor for domestic productivity and international competitiveness."
  4. "Technology is the driver for new alliances among academia, business, and government."
  5. "Technology requires a new managerial philosophy and practice." (p.62)
In technology commercialization, R&D results are "transformed into the marketplace as products and services in a timely manner" (p.65). Traditionally, "industrial laboratories concentrate on mission-oriented products and universities confine themselves primarily to basic research and teaching," but this approach is inadequate, resulting in fewer opportunities, more layoffs and closures, a weaker global position, poorer regional and local development, and poorer growth opportunities. "Since 1996, a new paradigm has been emerging ... [which] includes institutional developments involving academia, business, and government technology venturing." This new paradigm involves "accelerating the successful commercialization of innovation in a competitive environment" (p.65). (For examples, see my recent papers on technology entrepreneurship education.) 

Relatedly, the authors have a chapter on industrial parks and incubators. This chapter includes a short history of the IC2 Institute's Austin Technology Incubator, which was founded in 1989 and moved into the MCC building in 1995 (p.85).

Chapter 15 reviews "The Austin Model"; I want to note this chapter for later, but I won't review it.

Finally, the book concludes with Chapter 20, "Toward Capitalism with Conscience." Specifically, "we will consider the 'conscience' of capitalism as that of avoiding or rectifying inequities in the sharing of wealth and prosperity" (p.200). The authors draw on Milton Friedman here in claiming an interdependence between free enterprise and freedom (p.201). More skeptical readers may be reminded of Boltanski and Chiapello's claim that capitalism always reconfigures itself to incorporate its critiques.

In all, this was a useful book for me in terms of understanding IC2, ATI, and Austin as well as technology commercialization's raison d'être more broadly. If you're interested in such things, definitely pick it up.

Reading :: Posthumanism

Posthumanism: Anthropological Insights
By Alan Smart and Josephine Smart


This slim book (98pp. plus end matter) provides a useful, accessible introduction to posthumanism, a term that I have been hearing but have been until now unmotivated to explore. Spoiler alert: it involves Haraway, Hayles, Latour, Maturana & Varela, Pickering, Wrangham, and others I've reviewed and written about. So, although the term has been a bit of a question mark for me, it encompasses a great deal of familiar material.

The authors note that for some, "posthumanism is mostly about how new technologies are changing what it means to be human," but for them, "we have always been posthuman" in the sense that "becoming human involved our intimate interaction with more-than-human elements" such as fire and bacteria (p.2). "Becoming human involved the adoption of new extrasomatic technologies (i.e., things that go beyond our bodies and their basic abilities) and fundamental changes in our microbial ecologies. ... Inhabiting the globe required collaboration with plants and animals" (p.3).

Posthumanism, as the authors put it, denotes both posthuman-ism (after humans) and post-humanism (after the Western humanist tradition, with its emphases on Western-defined secularity, rationality, and human progress) (p.4).

Not surprisingly, actor-network theory constitutes a big chunk of the discussion, with the authors essentially claiming that Latour's "modernity" is roughly equal to their "humanism" (p.23). The authors are interested in the poststructuralist critique of the coherence of the individual leveled by Latour as well as Derrida, Foucault, Haraway, Althusser, and Deleuze & Guattari (p.52). But the authors also point to other lines of thought, such as distributed cognition and Haraway's cyborg anthropology (p.77).

All in all, I appreciated the straightforward simplicity of this book. The authors manage to lay out a clear, well-illustrated account of posthumanism, which is quite a trick given some of the abstruse philosophical sources from which they draw (I'm thinking of Deleuze and Guattari here). They draw relationships among the lines of thought that contribute to posthumanism, and they abstract some basic principles for us. If, like me, you have been wondering about the term, this book is a strong introduction; pick it up.

Reading :: Naturalistic Decision Making

Naturalistic Decision Making
Edited by Caroline E. Zsambok and Gary Klein


This book was originally published in 1994 based on the Second Naturalistic Decision Making Conference that year. It was reprinted in 2009.

Naturalistic decision making (NDM), as Caroline Zsambok argues in Chapter 1 ("Naturalistic Decision Making: Where are We Now?"), "is the way people use their experience to make decisions in field settings" (p.4, her emphasis). NDM studies suggest that "the processes and strategies of 'naturalistic' decision making differ from those revealed in traditional field research" (p.4). For instance, in NDM, "the focus of the decision event is more front-loaded, so that decision makers are more concerned about sizing up the situation and refreshing their situation awareness through feedback"—in contrast with traditional decision making, which "emphasizes understanding the back end of the decision event—choosing among options" (p.4).

Key contextual factors of NDM, Zsambok says (quoting Orasanu & Connolly, 1993), are:

  1. "Ill-structured problems"
  2. "Uncertain, dynamic environments"
  3. "Shifting, ill-defined, or competing goals"
  4. "Action/feedback loops"
  5. "Time stress"
  6. "High stakes"
  7. "Multiple players"
  8. "Organizational goals and norms" (p.5)
We can see how these relate to Klein's later books, which are reviewed on this blog. Interestingly, many (especially 4) are also related to John Boyd's OODA loop, with potential interaction between these two lines of inquiry. (It looks like this connection has been explored somewhat in the literature.) Zsambok also notes the connections with research on expertise (p.9) and the difference between cognitive and behavioral task analysis (p.13; see also Crandall et al.). 

Gary Klein discusses applications of NDM in Chapter 5, "An Overview of Naturalistic Decision Making Applications." Here, he notes that "The initial impetus behind the NDM movement was to describe what people do, whereas the motivation behind traditional decision research was to improve the way people made decisions" (p.49). NDM research "tries to describe the strategies proficient decision makers are doing, and does not yet have any central claims about what might lead to implications for improving decision quality" (p.50). (Klein later felt comfortable producing such claims, leading to his string of books.) He identifies reasons that NDM might be better applied to decision quality than traditional approaches:
  • "Classical methods do not apply in many naturalistic settings."
  • "Experienced decision makers can be used as standards for performance."
  • "Naturalistic Decision Making tries to build on the strategies people use."
  • "Experience lets people generate reasonable courses of action."
  • "Situation awareness may be more critical than deliberating about alternative courses of action."
  • "Decision requirements are context specific." (p.50)
Zsambok takes up this theme in Chapter 11, "Naturalistic Decision Making Research and Improving Team Decision Making." Based on research, she asserts that good decision-making teams "monitor their performance and self-correct; offer feedback; maintain awareness of roles and functions and take action consistent with that knowledge; adapt to changes in the task or the team; communicate effectively; converge on a shared understanding of their situation and course of action; anticipate each other's actions or needs; and coordinate their actions" (p.112). NDM field studies validate these assertions (p.112) and specifically the idea that teams share mental models (p.113). 

In Chapter 13, "Cognitive Task Analysis," Sallie E. Gordon and Richard T. Gill argue for cognitive task analysis as opposed to behavioral task analysis. Whereas BTA focuses on what people do externally, CTA attempts to capture their cognitive work as well (p.132). CTA analysts try to capture a subset of these:
  • "Concepts and principles, their interrelationships with each other, and their relationship to the task(s)."
  • "Goals and goal structures"
  • "Cognitive skills, rules, strategies, and plans."
  • "Perceptual learning, pattern recognition, and implicit or tacit knowledge."
  • "Mental models"
  • "Problem models"
  • "How novices move through all of the above in various stages to become expert."
  • "Difficulties in acquiring domain knowledge and skills."
  • "Instructional procedures useful for moving a person from novice to expert." (p.132)
In all, this was a useful look at how NDM researchers were positioning their approach against traditional decision making in 1994. We can see here why Klein positions his subsequent books the way he does, specifically pursuing CTA in field studies. We readers from other fields, especially those with a strong field research tradition, may find it odd that some of these arguments have to be made—but the way in which they are made helps us to understand how NDM developed in the subsequent years. 

Wednesday, June 20, 2018

Reading :: Streetlights and Shadows

Streetlights and Shadows: Searching for the Keys to Adaptive Decision Making
By Gary Klein


I just reviewed the methodology text that Klein coauthored; this book is a chance to see his methodological approach in action. Here, Klein focuses on how we make decisions in ambiguous situations. This question is actually quite hard to investigate in the lab, since "systematic errors aren't so serious outside the lab"; indeed, "reasoning strategies let us do many kinds of tasks without consciously or subconsciously performing calculations to perform an estimate" (p.59). So Klein turns to the scenarios that he always turns to in his popular/summary books: air traffic controllers and pilots, firefighters, NICU nurses, etc. (I would complain that he rehashes these scenarios too much across books, but I understand why he does so—they're all great illustrations, and the books use them to make related-but-different arguments to related-but-different audiences.)

Much of this book goes over principles that Klein addresses in his other books, so I'll just highlight a few standouts.

Klein points out that experts avoid data saturation by self-selecting which data to seek. That is, they know which data are most relevant and they shut out the extraneous data, making them more effective (p.133). In fact, he says, "there is never a right amount of information" and "we would be better off if we stopped worrying about getting the right amount of information and instead tried to see the meaning in the data that we do have" (p.135).

People need feedback—"feedback is essential for helping people become more skilled." But feedback itself isn't sufficient (p.165): outcome feedback (what was the result?) does not improve performance as much as process feedback (what were the cause-effect relations in the performance?) (p.166).

Problems with emergent goals—so-called wicked problems (p.212)—include things such as business models (p.213). As Klein puts it, "when facing wicked problems we have to re-define the goals as we try to reach them. ... No amount of thinking and analysis will make these goals well defined. In such cases, we are going to have to figure out the goals as we go along. The faster we can learn, the more successful we'll be" (p.223, his italics). Yet, he points out, many in this situation will instead "try to increase their control over events" and will thus "stumble into goal fixation" (p.223). In such situations, he advocates "Management by Discovery": "when we face complex conditions we should expect to revise and replace goals on the basis of what we learn" (p.224).

Overall, this book is readable and valuable. It's a little less valuable if you've read Klein's other books, since there's a lot of overlap, but his angle here is different—to dispel myths about decision making. If you're interested in how people make decisions in ambiguous situations (for instance, when entrepreneurs evaluate their business models), definitely pick it up.

Reading :: Working Minds

Working Minds: A Practitioner's Guide to Cognitive Task Analysis
By B. Crandall, G. Klein, and R.R. Hoffman


I've discussed Gary Klein's work before, and specifically how much I appreciate his attitude of trust and respect toward his participants. Klein's work focuses on how experienced professionals (such as firefighters, NICU nurses, and soldiers) make intuitive decisions in high-stakes, high-pressure environments.

To research such cases, Klein needed an ecological approach that allowed him to get at situated decision making in cases in which the participants couldn't necessarily articulate their assumptions, options, or triggers. At the same time, Klein couldn't just follow firefighters around—the events he wanted to study were just too rare, and when they did happen, he didn't want his team to get in the way of rescue operations.

The approach that Klein and his partners developed for such cases is called cognitive task analysis (CTA), which "helps researchers understand how cognitive skills and strategies make it possible for people to act effectively and get things done," according to the back of this book. The book is, as the subtitle states, "A Practitioner's Guide to Cognitive Task Analysis." That is, it describes CTA and the situations in which it could be useful; it offers tools and strategies for performing CTA; and it discusses how CTA brings value to the participants. In this sense, it reminds me of Beyer and Holtzblatt's Contextual Design, a similar methodology book written by consultants for practitioners (although addressing different situations with a different methodological approach).

What struck me about Working Minds, though, was that the coauthors had developed a qualitative approach within psychology. As the authors note, in psychology and human factors, analysis typically happens quantitatively; students have little qualitative research training and use "preset plays" based on common statistical tests (p.107). "However, many CTA methods generate data that do not fit easily into standard statistical approaches" (p.107), and this is a problem since "quantification typically means stripping a body of data of its contextual links and decomposing it in order to assign numerical values" (p.108). At the same time, qualitative methods emerging from sociology, anthropology, and education tend to be focused on "topics that do not have a cognitive focus, such as analysis of social processes or attitudes surrounding terminal illness" (p.108).

Faced with this disjuncture, the authors set out to develop a suitable qualitative research approach for psychology's foci. Like many qualitative research approaches, this approach is not linear, with oscillations between structuring data and identifying meaning (p.110). It involves four main steps: preparation; structure data; discover meaning; identify/represent key findings (p.111). And the analysis involves creating "an audit trail that links raw data to eventual outcomes" (p.113). That is, it looks a lot like structured qualitative case study research.

In Chapter 8, the authors "introduce a level of cognitive phenomena—referred to as macrocognition— that emerges when we shift the focus to natural contexts. These are the types of cognition that CTA methods are uniquely designed to capture" (p.131). They discuss this level of cognition in terms of purpose, prior experience, situation, challenge, tools, team members, and organizational constraints (p.132). Macrocognition, they say later, is a "collection of cognitive processes and functions that characterize how people think in natural settings," as opposed to microcognition, which is "studied using carefully controlled methods and procedures" and is supposed to investigate basic, universal features (p.136). Think here of the contrast between Klein's contextualized field interviews and Kahneman's word problems — or the contrast between laboratory measures of executive functions and ecologically valid measures. As the authors assert, "individuals make decisions but so do teams" and "decision making often depends on artifacts" (p.136). Cognitive activity, the authors assert (citing Hutchins), is "distributed across multiple agents as part of a stream of activity" (p.157).

Overall, I found this book to be rewarding. The authors have identified a need for a qualitative methodology in psychology, oriented to decision-making; they have drawn when appropriate from qualitative traditions in adjoining disciplines; but they have also recognized the differences between those methodological orientations and the one they need. They have carefully and responsibly developed and validated an approach that works for their objectives. And they have articulated it clearly and well—the book is well organized and easy to read. The result is a good intro for practitioners, but I think it would also be suitable for a methods class (with suitable framing). If you're interested in qualitative methodology, and especially if you're wondering why someone would pursue qualitative methods instead of quantitative ones, check it out.

(catching up)

I've been blogging much less regularly lately, about once a month. That's not a function of my reading so much as it is a function of my schedule: there's only so much time in the day, and the Wednesday mornings that I usually blog have been taken up with other things. Consequently, the books have been piling up.

Currently waiting to be blogged are:

  • three books on decision-making (psychology)
  • one classic book on human-computer interaction (based in anthropology)
  • one book on wealth generation (business)
  • a biography of a business leader
In addition, I'm reading a book describing a theory of the origin of language and in my Unread pile are books on posthumanism, the textual society, decision making, value, and sustainability. I'm hoping to clear my blogging backlog so I will be prepared to discuss those books as well. Stay tuned!

Wednesday, June 06, 2018

Reading :: R&D Collaboration on Trial

R & D Collaboration on Trial: The Microelectronics and Computer Technology Corporation
By David V. Gibson and Everett M. Rogers


I've been studying entrepreneurship in Austin, so I picked up this 1994 book about MCC, "America's first major, for-profit R&D consortium," which was launched in 1982 and which arguably laid the foundations for Austin's current status as a technology hub (or "technopolis," a term we will discuss more in a moment). This book, written by IC2 senior research fellows David V. Gibson and Everett M. Rogers, uses archival materials and 9 years' worth of retrospective interviews to tell the story of this consortium from the viewpoints of leaders in technology, politics, and academia. The story ends well for Austin, although MCC itself struggled through much of its history and ceased operations in 2004, a decade after this book was published.

MCC was "the United States' first major, for-profit R&D consortium," launched "by a select group of U.S. computer executives to help save their industry from Japanese competition. They collaborated in planning, implementing, and funding MCC" (p.xv). It had rules: it did not seek US governmental funding and it did not allow foreign firms to join. It also skirted US antitrust law, at least until the 1984 National Cooperative Research Act was passed in reaction to it (p.xv). The book examines the lessons of this consortium, specifically in terms of forming such R&D alliances; understanding cross-organization technology transfer and commercialization; and public/private collaboration to develop jobs and capabilities (p.4).

To understand the need for MCC, we have to recall that in the early 1980s, the US technology industry was worried that Japanese tech companies—which cooperated closely via initiatives such as the VLSI Project—would overwhelm the capabilities of US firms (p.9). This fact worried not just the US tech industry but also the US federal government, which did not want its military technology to rely on foreign companies for "essential semiconductor and computer components" (p.10).

In fact, the authors examine various examples of research consortia, beginning with English research associations in 1917, then moving to Japanese consortia modeled on those associations in the 1950s, then small-scale US consortia and larger Japanese projects in the 1970s (p.14). They discuss MCC's formation in Chapter 2, but we'll rejoin the story in Ch.3, "MCC Comes to Texas."

MCC's site selection committee, headed by Admiral Bobby Inman (who is currently on faculty at the University of Texas), considered dozens of sites before narrowing them down to four: Raleigh-Durham, San Diego, Atlanta, and Austin. That was the order in which the cities were preferred at the beginning of the final selection process. But by the end of the process, a different order emerged: Austin first, then Atlanta, Raleigh-Durham, and finally San Diego (p.99). When MCC announced that it was coming to Austin, observers in the high-tech industry reacted in disbelief: UT was nowhere near the top three universities in electronics and computer science research (Carnegie Mellon, MIT, Stanford), and Austin was perceived as a backwater (p.103).

The authors tell an instructive story about how the selection order changed and how Austin was eventually selected. But before they do, they discuss the notion of the technopolis:
The modern technopolis interactively links technology commercialization with public and private sectors to promote regional economic development and technology diversification. Four factors are fundamental in the development of a region as a technopolis: (1) the achievement of scientific preeminence in technology-based research, (2) the development of new technologies for emerging industries, (3) the attraction of major technology companies, and (4) the creation of home-grown technology companies. (p.100)
 Using this framework, the authors argue that Austin won out in large part "because of the planned-for excellence of its research universities in microelectronics research and graduate education, which coincided with MCC's research agenda" (p.105). Local and regional leaders, including business school dean George Kozmetsky, Governor Mark White, San Antonio mayor Henry Cisneros, and Ross Perot, coordinated closely to put together a package that involved endowing professorships in computer science and electrical engineering at UT (p.117). This was a good move: the proximity to a top research university was critical to the MCC site selection group (p.124), perhaps more critical than direct financial incentives for the consortium—although Texas offered these too:
However, what won for Texas was how its incentives were structured, which reflected how well the Texas leaders obtained and used information about MCC. The Texas incentive package came largely from the private sector, statewide, not from state and local taxes, and the incentives were structured so as to be an investment in the future of Texas as a state, its universities, and business development rather than funds given to MCC. (p.148)
About a third of the incentives went toward constructing a building for MCC, owned by the UT System, on land belonging to UT (p.148). Others included $15m for endowed UT positions in electrical engineering and computer science; 30 new faculty positions in microelectronics and computer science; and $2m for purchasing new equipment for research and teaching in these areas (p.149). Texas A&M made similar commitments, but with less specificity (p.150). "In the eyes of MCC's site visitors, the university component of the Texas incentive represented a brilliant strategy" — characterized as "'what we can do together with MCC to improve university research in microelectronics'" (p.158).

Critically, UT "agreed to triple the size of its microelectronics research program and establish 30 new endowed professorships in electrical engineering and computer science" via a two-week process (pp.158-159). This is perhaps the biggest miracle of all, if you are familiar with the inner workings of universities, and it was (obviously) accomplished by ignoring typical decision-making procedures (p.159).

Texas wanted MCC, and its leaders put in the effort, did the homework, and talked to the right people to make it happen. As the authors point out, acquiring MCC set the conditions for Austin to become a technopolis. Specifically, it built the educational infrastructure for Austin's technology focus; attracted companies and people working in technology; spun off new technology companies via the Austin Technology Incubator (established in 1989; see p.271); and set priorities for retaining such companies.

In Ch.5, the authors go on to discuss the necessity and problems of technology transfer, which they characterize (rightly, in my view) as a type of communication:
There is usually agreement ... that (1) technology is not just a "thing," and (2) transfer is a profoundly human endeavor. Essentially, "technology" is information that is put to use in order to accomplish some task, the knowledge of how to do something. "Transfer" is the movement of technology via some channel from one individual or organization to another. So technology transfer involves the application of knowledge, putting a tool to use.
The transfer of technology is a particularly difficult type of communication, in that it often requires collaborative activity between two or more individuals or functional units that are separated by a range of barriers. ... we can think of technology transfer as an interactive process with a great deal of back-and-forth exchange among individuals over an extended period of time. (p.333)
The authors characterize technology transfer as having four levels: quality R&D; acceptance; implementation; and application (p.335).

Skipping a bit, let's get to an aside about the establishment of the Austin Technology Incubator (ATI). "ATI had been formed in 1989 as an alliance of public and private interests to nurture technology-based companies for regionally based job growth and economic development. The IC2 Institute, The University of Texas at Austin, and the Institute's director, George Kozmetsky, had launched ATI as an experiment in business, academic, and government collaboration." (pp.413-414). In 1989,
the idea of a regionally based technology incubator was being championed by Dr. George Kozmetsky. To Kozmetsky, such an incubator would facilitate public/private collaboration at the regional level and it would spur economic development, fill vacant office space, train entrepreneurs, and create high-value jobs. The facility, which came to be called the Austin Technology Incubator (ATI), would act as a 'lightening [sic] rod,' linking talent, technology, capital, and business know-how to market needs. (p.451)
The authors also give a thumbnail history of IC2, which Kozmetsky founded in 1977, while still dean of the business school (p.453).

Overall, the book is just what I was looking for. In providing a history of MCC, it also provides a history of Austin's emergence as a technopolis, including backgrounds for institutions with which I am working—IC2, ATI—and greater insights into people who continue to be associated with them. It covers relevant subjects, such as technology transfer and infrastructure. And it's well told. If you're interested in technology, entrepreneurship, civic development, or Austin, definitely pick it up.

As a side note, this book made me think more about the question of cities competing to be sites for companies. Austin is currently one of the cities competing to be Amazon's second headquarters, and critics focus on the question of whether Amazon's potential contribution to the city actually outweighs the city's incentive package. But these deals have broader effects than raw revenue. Structured well—like the MCC deal—these deals can be realized in infrastructure-building that sets the city up for long-term success far beyond that of an individual company. After all, MCC closed its doors in 2004, but Austin remains a technopolis.

Wednesday, May 09, 2018

Reading :: The New Rhetoric

The New Rhetoric: A Treatise on Argumentation
By Chaim Perelman and L. Olbrechts-Tyteca


Perhaps you read this book, or excerpts of it, in grad school. I didn't, but I've seen it cited enough that I thought I should. So I picked it up a few years ago, and started and abandoned it at least twice before I was able to bear down and get through it.

Not that it's a bad book. It's just the equivalent of Goffman's Frame Analysis. Here's what I said about Goffman in that review:
Here, even more than in Goffman's other books, it becomes clear that Goffman is a cross between Aristotle and Art Linkletter. Like Aristotle, he likes to exhaustively taxonomize the subject he's describing—in this case, frames. And like Art Linkletter, he is an inveterate gossip, pulling examples of frames and frame ruptures from everywhere he can (odd newspaper stories, magazines, television shows, books on cons and magic, and repeatedly from Dear Abby columns) in addition to published research. 
Perelman and Olbrechts-Tyteca run a similar playbook, although their examples come from philosophers and sermons rather than Dear Abby columns. Their goal in this postwar book was to describe non-formal argumentation, specifically examining audiences and shared values, and they reached back to the forgotten Greco-Roman rhetorical tradition to do so. This was fairly radical stuff in 1958, when the French original was published, but in 2018 the arguments in The New Rhetoric have become so foundational to contemporary rhetorical theory that they hardly seem radical. As a result, reading the book now is a tedious exercise, at least for me.

The book is divided into three sections:

  1. The framework of argumentation
  2. The starting point of argument
  3. Techniques of argumentation
In this review, we'll spend most of our time on the Introduction. Here, the authors draw a line in the sand with their first two sentences:
The publication of a treatise devoted to argumentation and this subject's connection with the ancient tradition of Greek rhetoric and dialectic constitutes a break with a concept of reason and reasoning due to Descartes which has set its mark on Western philosophy for the last three centuries.
Although it would scarcely occur to anyone to deny that the power of deliberation and argumentation is a distinctive sign of a reasonable being, the study of the methods of proof used to secure adherence has been completely neglected by logicians and epistemologists for the last three centuries. (p.1, their emphasis)
Descartes, they say, "made the self-evident the mark of reason, and considered rational only those demonstrations which ... extended ... the self-evidence of the axioms to the derived theorems" (p.1). Thus deliberation and argumentation were neglected. (They echo Aristotle, who says that rhetoric comes into play when the truth cannot be known.)

What resulted was an understanding of rational science as incompatible with probable opinions—it is "a system of necessary propositions" in which "agreement is inevitable," and thus in rational science, "disagreement is a sign of error" (p.2). Thus "logicians and modern philosophers have become totally disinterested in our subject" (pp.4-5), and the authors instead draw on studies of persuading, convincing, and deliberation from Greek, Latin, and Renaissance authors (p.5). They focus on the proofs that Aristotle termed "dialectical," but since "dialectic" had taken on a different meaning due to Hegel, they lump the original meaning's focus on the probable into "rhetoric." "It is in terms of an audience that an argument develops," they emphasize (p.5). And that is the focus of this book.

Furthermore, the authors primarily examine printed texts, preserving the idea of audience but neglecting "mnemonics and the study of delivery or oratorical effect" (p.6).

They also restrict themselves to incidents in which language is used (excluding silent examples, rewards, and punishments) and specifically used to communicate (excluding blessings and curses) (p.8). They acknowledge the persuasive effects of nonlinguistic elements, but these go beyond their study (pp.8-9). (Note that contemporary rhetorical studies have gone past these boundaries, using and extending the authors' principles.) Within these linguistic bounds, the authors characterize different argument structures (p.9).

In Part I, The Framework of Argumentation, the authors discuss the conditions under which rhetoric applies. "All argumentation aims at gaining the adherence of minds, and, by this very fact, assumes the existence of an intellectual contact" (p.14, their emphasis). And "For argumentation to exist, an effective community of minds must be realized at a given moment" (p.14). That community includes the audience, defined as "the ensemble of those whom the speaker wishes to influence by his argumentation" (p.19, their emphasis). "The audience, as visualized by one undertaking to argue, is always a more or less systematized construction" (p.19). Rhetoric as an academic exercise has been addressed to conventional, stereotyped audiences, and "it is this limited view of audience ... which is responsible for the degeneration of rhetoric" (p.20).

The knowledge of the audience, they say, "cannot be conceived independently of the knowledge of how to influence it" (p.23). At the same time, the speaker must also adapt to the audience (p.23).

The authors draw a distinction between persuasion and argumentation:
We are going to apply the term persuasive to argumentation that only claims validity for a particular audience, and convincing to argumentation that presumes to gain the adherence of any rational being. (p.28)
Audiences can include universal audiences, single interlocutors, and the subject himself (p.30). Some quirks:

  • The universal audience is "often merely the unwarranted generalization of an individual intuition" (p.33). 
  • When engaging with the single hearer, discourse degenerates into dialogue (p.35). 
With this base, the authors get into Part 2, The Starting Point of Argumentation. And here is where the book begins to strongly resemble Frame Analysis: the authors describe a principle, then provide various examples. I won't go through this section in detail—it's meant to function as a reference.

Let's rejoin the authors for the conclusion:
Instead of basing our philosophy on definitive, unquestionable truths, our starting point is that men and groups of men adhere to opinions of all sorts with a variable intensity, which we can know only by putting it to the test. These beliefs are not always self-evident, and they rarely deal with clear and distinct ideas. The more generally accepted beliefs remain implicit and unformulated for a long time, for more often than not it is only on the occasion of a disagreement as to the consequences resulting from them that the problem of their formulation or more precise definition arises. (p.511)
Should you pick up this book? Yes—eventually. It is a bit of a slog, and a reader with a background in contemporary rhetoric will find parts to be self-evident. But it's still rewarding for contemporary readers and an invaluable foundation for the study of rhetoric. 

Wednesday, April 04, 2018

Reading :: Creating the Technopolis

Creating the Technopolis: Linking Technology Commercialization and Economic Development
Edited by Raymond W. Smilor, George Kozmetsky, and David V. Gibson


This 1988 collection developed from a 1987 international conference held at the University of Texas at Austin. I picked it up primarily to understand how the Austin entrepreneurial ecosystem developed and how IC2 figured into it.

In the Preface, the editors write that the term technopolis "reflects a balance between the public and private sectors. The modern technopolis is one that interactively links technology commercialization with the public and private sectors to spur economic development and promote technology diversification" (p.xiii). The Introduction puts it a little differently: "Sometimes referred to as a technology center or a high-tech corridor or triangle, the technopolis appears to be an emerging worldwide phenomenon" (p.xvii). Technopoleis include Route 128, Silicon Valley, the Research Triangle in North Carolina, and the Austin-San Antonio corridor. Authors in this collection discuss each of these technopoleis, but also technopoleis in Japan, China, England, and southern Europe as well as US locations such as upstate New York and Phoenix.

For me, the most important chapter was Ch.10, "The Austin/San Antonio Corridor: The Dynamics of a Developing Technopolis." Here, Smilor, Kozmetsky, and Gibson discuss the development of this corridor, using the "technopolis wheel" (p.146) to analyze the different factors involved in sustaining it. This wheel includes anchors such as University, Large Corporations, Emerging Companies, Federal Government, State Government, Local Government, and Support Groups. Among other information that was valuable (at least to me) were a bar graph of high tech manufacturing companies in Austin, 1945-1985 (p.155) and a timeline of companies founded in or relocated to Austin, 1955-1985 (p.157). The authors also recapitulate the MCC story, which I'll cover in depth in another book review.

Should you pick up this book? To be honest, it is most useful for (a) historical insight into late-1980s perspectives on high-tech regional development and (b) heuristics for understanding current high-tech regional development. If you're interested in either, yes, grab a copy. Otherwise I don't think it's a crucial collection.

Reading :: The New Spirit of Capitalism

The New Spirit of Capitalism
By Luc Boltanski and Eve Chiapello


In his influential 1905 book The Protestant Ethic and the Spirit of Capitalism, Max Weber argued that capitalism works because of an ethos—labor must be performed as a calling, pursued with virtue and proficiency rather than for enjoyment and enrichment. It is this selfless performance of capitalism that makes it work as a system. And Weber laments:
The Puritan wanted to work in a calling; we are forced to do so. For when asceticism was carried out of monastic cells into everyday life, and began to dominate worldly morality, it did its part in building the tremendous cosmos of the modern economic order. This order is now bound to the technical and economic conditions of machine production which to-day determine the lives of all the individuals who are born into this mechanism, not only those directly concerned with economic acquisition, with irresistible force. Perhaps it will so determine them until the last ton of fossilized coal is burnt. In Baxter's view the care for external goods should only lie on the "saint like a light cloak which can be thrown aside at any moment." But fate decreed that the cloak should become an iron cage. (p.181)
Writing at the other end of the century, in 1999 (the original, French publication date), Boltanski and Chiapello agree with Weber's basic thesis but argue that capitalism continues to reinvent itself. They argue that the "spirit of capitalism" is the "ideology that justifies engagement in capitalism" (p.8) and that this ideology has periodically had to change in order to address and incorporate critiques (p.19). In fact, the authors identify three "spirits" of capitalism at different periods—familial, bureaucratic, and globalized—each of which was in tune with its time period (p.19). The third spirit, the one we are living through today (or at least were in 1999), must restore meaning to the accumulation process, combined with social justice (p.19).

More broadly, they say, critiques function as a motor for capitalism, which must align with other values to survive. Capitalism relies on its enemies' critiques to identify moral supports, which it then incorporates (p.27). (For a quick example, think in terms of social entrepreneurship.) In rhetorical terms, capitalism concedes critiques and adjusts its argument to address them. Paradoxically, this means that capitalism is the most fragile when it is triumphant (p.27)—when it doesn't have a critique to incorporate.

To substantiate this analysis, the authors turn to a corpus of management books read in France in the 1990s. (They limit their claims to the French management context, but acknowledge that these claims may be more broadly applied as well.) These management texts emphasize ideas that may be familiar to readers of this blog: networked organizational structure, distributed leadership, projectification, self-direction, trust (Part I, Ch.I). These lead to workers managing themselves and pursuing personal development, autonomy, freedom, and fulfillment (p.90)—the ethical critiques of bureaucratic capitalism being incorporated into globalist capitalism.

One might object here that the corpus is biased toward growth areas: familial and bureaucratic capitalism have their places and do specific jobs well, but since their principles are more established, we won't see a lot of new management books focused on them. In contrast, new information and communication technologies enable new organizational approaches to emergent objects, and thus we see a glut of new management books addressing them. The authors do not address this objection head-on, but they do acknowledge in the next chapter that successive organizational and technical innovations and managerial modes gradually transformed mechanisms, and that the corpus reflects an attempt to unify these mechanisms into a coherent vision (p. 103). The term "network" is frequently used in the corpus to impose coherence on these highly disparate elements (p.103). They charge that the notion of network absolves us from positing or addressing the idea of justice: in a networked world, low-status people are simply excluded (p.106). The authors do discuss network analysis and Latourean and Deleuzian sociotechnical networks—unfortunately conflating these (p.150; see also p.356).

In a network world, the authors say, our focus is no longer on saving money as in Weberian (familial) capitalism, but rather on saving time: it must be spent on the best connections and reinvested immediately (p.152).

In Part II, the authors examine the history of labor in the second half of the 20th century in France, specifically the negotiations between trade unions and employers. In this telling, the trade unions in 1968 saw compromise as an exit lane from capitalism (p.182), but management addressed critiques by accommodating demands for social justice (p.183), providing profit-sharing in lieu of control/power (p.184), improving working conditions to quell rebellion (p.185), and replacing autonomy with security (p.190).

There is much more to the book, but let's skip to the conclusion, which presents these axioms:

  1. "Capitalism needs a spirit in order to engage the people required for production and the functioning of business." (p.485)
  2. "To be capable of mobilizing people, the spirit of capitalism must include a moral dimension." (p.486)
  3. "If it is to survive, capitalism needs simultaneously to stimulate and to curb instability." (p.487)
  4. "The spirit of capitalism cannot be reduced to an ideology in the sense of an illusion with no impacts on the world." (p.488)
  5. "Capitalism has a constant tendency to transform itself." (p.489)
  6. "The principal operator of creation and transformation of the spirit of capitalism is critique (voice)." (p.489)
  7. "In certain conditions, critique can itself be one of the factors of a change in capitalism (and not merely in its spirit)." (p.490)
  8. "Critique derives its energy from sources of indignation." (p.491)
It would be a little facile to say that capitalism succeeds because it listens to critique and addresses it. Boltanski and Chiapello, I think, rather argue that capitalism identifies damaging critiques and incorporates changes to defang those critiques so that it can retain and legitimize its essential focus on the accumulation process. It is more nimble, more flexible, and more supple in argumentation than its competitors. 

Overall, I think the book is a solid piece of work, although the authors have staked a lot on their reading of the management corpus, and I agree that they may not be able to generalize their conclusions beyond France. I am also not thrilled with how they have conflated different uses of "network," which I think muddies the analysis. Like other books in this vein, this one also endorses a grand narrative in which changes can be traced to a single actor (capitalism) rather than multiple factors in tension (ex: information and communication technologies, changes in transportation, the broadening of infrastructure, etc.). Nevertheless, it provides a much-needed rethinking of Weber's original thesis and provides a smart critique of the management literature. I wish I had read it before writing All Edge, although I think I would have—er—incorporated the critique rather than fundamentally changing my argument. If you're interested in capitalism, the so-called new economy, or Weber, take a look.

Wednesday, March 28, 2018

Reading :: The Cambridge Handbook of Cultural-Historical Psychology

The Cambridge Handbook of Cultural-Historical Psychology
Edited by Anton Yasnitsky, René van der Veer, and Michel Ferrari


To be honest, I finished this book last summer, but I have been waiting for a substantial block of time to spend on it. My block of time today is no longer than usual, but I want to get this book off my Review shelf and back to my office shelf, so here goes.

In the Introduction, helpfully subtitled "What is this book and what is it about?", Anton Yasnitsky and René van der Veer lay out the book's intent: an edited handbook for higher education, focused on the cultural-historical psychology of the Vygotsky-Luria Circle as well as associated work by Vygotsky's predecessors, contemporaries, and later followers (p.1). They note that "cultural-historical psychology" was coined as a slur on Vygotsky's theory, but was picked up and appropriated by Vygotsky's followers (p.2). Nevertheless, Vygotsky-derived "cultural-historical psychology is firmly grounded in the belief shared by a great many researchers who postulated the necessity and possibility of an integrative psychological science of cultural-historical and bio-social development" (p.2), a belief "in the possibility of a holistic human science of mind, body, and consciousness in their inseparable unity and in cultural and historical development" (p.3).

The book is structured in six parts: theory; method; child; language and culture; brain; and cultural-historical applications beyond psychology. I won't thoroughly explore each, but I will pull out specific chapters for discussion.

Ronald Miller, "Introducing Vygotsky's cultural-historical psychology"
This chapter summarizes part of the argument Miller made in his book: the centrality of signs in Vygotsky's cultural-historical psychology. In this chapter, Miller discusses some of Vygotsky's thought, including his "law of sociogenesis" (that "every function in the cultural development of the child appears on the stage twice ... at first as social, then as psychological" — Vygotsky 1998, p.169, quoted in Miller p.26) and his law of "transition of a function from outside inward" in which "the social means becomes the means of individual behavior" (Vygotsky 1998, p.170, quoted in Miller p.26). These lead to Vygotsky's self-declared major discovery that "word meaning changes and develops" (Vygotsky 1987, p.245, quoted in Miller p.28)—a discovery that leads Vygotsky to identify the stage of adolescence in which abstract concepts are formed (p.28). After some valuable discussion that I won't summarize here, Miller notes that Vygotsky "puts paid to the view that practical activity and everyday experience provide a sound basis for understanding or explaining the psychological underpinnings of human action, let alone the view that conceptual understanding derives from or is an extension of everyday practical experience" (p.40). This argument is the core of Miller's brief against activity theory, which was developed by Leontiev in part to harmonize with the Stalinist ideological requirement of practicality (see Krementsov on this emphasis on practicality, and see Leont'ev and Zaporozhets for an early example of AT developing to address this emphasis).

Janette Friedrich, "Vygotsky's idea of psychological tools"
This chapter focuses on Vygotsky's use and development of the notion of psychological tools, which the author believes is essential in Vygotsky's writings—although the phrase isn't used in Thought and Language, the concept itself is fundamental (p.48). The author also argues that the idea is shared in the writings of Vygotsky's contemporaries, Kurt Goldstein and Karl Bühler (p.48).

Vygotsky's view was that "all higher psychological functions—such as voluntary attention or logical memory—originate with the help of psychological tools, and thus constitute mediated psychological phenomena," and thus the unit of analysis must include not just stimulus and response but also mediator (p.48).

Such psychological tools differ from physical (work) tools in their directiveness: "this [psychological] tool is a means subjects have of influencing themselves, a means of self-regulation and self-control" (p.50). Psychological tools are signs, but they are not just signs. They have three other characteristics:
A psychological tool (1) is an artificial adaptation that (2) has a non-organic (that is, social) nature, and (3) is destined to control one's psychological behavior and that of others. (p.51)
Friedrich goes on to argue that to understand psychological tools, we must understand the difference between mediated activity and mediating activity that Vygotsky introduced in his book on higher mental functions, based on Hegel. "Work tools and psychological tools both fall under the more general concept of mediating activity" (p.53) in which nature acts on nature (p.54). But when we intervene directly in nature via an instrument, we are involved in mediated activity (p.54).

Friedrich goes on to examine psychological tools in the works of Goldstein, who conducted aphasia research and became interested in detours: strategies "that patients develop to do everyday tasks that their illness prevents them from performing normally" (p.56). She also examines the works of Bühler, specifically his 1934 "masterpiece, Theory of language, in which language is defined as a mediating instrument" (p.58). She argues that a dialogue among the three is possible. Friedrich concludes that "Vygotskian psychological tools do not exist over and above their use by an individual" (p.61).

Ekaterina Zavershneva, "The problem of consciousness in Vygotsky's cultural-historical psychology"
Zavershneva has been examining Vygotsky's notebooks. Here, she discusses how Vygotsky understood consciousness, arguing that "we may even assume that [Vygotsky's cultural-historical theory] is the most notable contribution to general psychological theory to date" (p.65). Yet Vygotsky, she says, offered three distinct models of consciousness (p.66):

  1. as a reflex of reflexes (1924-1926) (pp.66-68)
  2. as a system of secondary connections between higher psychological functions (1927-1931) (pp.69-74)
  3. as a dynamic semantic system (1932-1934) (pp.74-78)
She argues that understanding the first two models is important for truly understanding the third (p.66). Nevertheless, here, I'll skip to the second model: the system of secondary connections between higher psychological functions. Zavershneva notes that from 1927-1930, "Vygotsky was studying isolated psychological functions, but not consciousness per se and as a whole" (p.69). "The earliest variant of the idea of a mediated action" and of word meaning is in Vygotsky's papers of 1926 (p.69). But "By the end of the 1920s, Vygotsky gradually came to the conclusion that 'psychological tools' cannot be 'built into' any single 'higher psychological function' because a person is an integral being and in every act of human behavior all psychological processes are manifested" (p.70). This led Vygotsky to attempt to explain consciousness holistically, so he introduced the principle of a system to his psychology in his 1930 paper "On psychological systems" (p.71). From this point on, Vygotsky "repeatedly criticized his previously held views as incomplete and even erroneous" (p.74). "The idea of the sign as the mediator between nature and culture was still used as a heuristically useful abstraction, but it gradually shifted to the notion of the 'system of psychological functions'" (p.74). 

This led to the third model, in which Vygotsky "introduced a theoretical innovation—the notion of consciousness as a dynamic semantic system" —but could not theoretically develop it due to his untimely death in 1934 (p.74). Zavershneva traces Vygotsky's development of this innovation from word meaning to sense to perezhivanie ("intellectual and emotional life experience")—but notes that this "theorizing remained at the level of mere speculation, and Vygotsky's theory of consciousness was not developed any further than a sketch of a promising future theory" (p.78). 

In the latter half of the chapter, Zavershneva discusses Vygotsky's later work on types of concepts, noting a possible interplay with Lewin's work via two former Lewin students who moved to Moscow and worked under Vygotsky (p.84). In a footnote, she argues that "the members of the Vygotsky Circle frequently used the conceptual apparatus of Kurt Lewin's theory and, thus, by doing this they merged Vygotskian theory with Lewin's topological and vector psychology" (p.85). 

In summary, Zavershneva notes that Vygotsky's three models are united by the leading role of speech; systematic and semantic organization; and the origin of consciousness in the acquisition of social norms of behavior (p.92). But as Galperin argued in 1935, this intellectual system remains incomplete: "it did not have a theory of motivation, affect, and volition" or "a well-developed theory to explain the interrelations of the personality with the environment" (p.92). 

Aaro Toomela. "Methodology of cultural-historical psychology"
Toomela has published several pieces critical of activity theory, and this piece is no exception, although AT is not directly in Toomela's sights throughout most of this piece. He argues that in cultural-historical psychology, "science is always about facts and about the way the facts are interpreted" (p.102). 
In sum, in Vygotskian cultural-historical psychology scientific activity is understood as study of the world that is based simultaneously on method and methodology. Method is the procedure of study, the technical aspects performed. ... Methodology, in turn, is the study of the method of scientific cognition that determines why the study is conducted, what is the place of science and its nature; it is a philosophy of scientific cognition. (p.106)
After some discussion of Vygotskian method and methodology, Toomela critiques activity theory, arguing in a footnote that AT "cannot be theoretically defended" and pointing to his previous articles (p.117). He notes that in a previous exchange, Engestrom ignored his methodological arguments and instead noted how broadly cited AT is; Toomela claims that this indicates that "a reason why Activity Theory is doing so well might be that its methodological foundation is underdeveloped. Activity Theory is based on a psychologically implausible theory that environment somehow directly determines the development of mind" (p.117). Toomela complains that "the quality of the theory is not in the arguments, it is in the number of citations" (p.117).

Jaan Valsiner and Rene van der Veer. "Encountering the border: Vygotsky's zona blizhaishego razvitiya and its implications for theories of development"
The zone of proximal development (ZPD in English, ZBR in Russian) is a frequently cited notion that Vygotsky used in the early 1930s. Here, the authors discuss its origin in Bergson (p.153) and Vygotsky's uptake of the concept as he moved into the study of pedology (p.155). They note that Vygotsky was interested in crisis periods in child development: Vygotsky identified crisis periods at ages 0, 1, 3, 7, 13, and 17 (p.156), and "it is during these periods that the emergence of higher levels of psychological organization take place" (p.156). Each crisis has an involution process, then a culmination point "that is the locus at which the dialectical synthesis is accomplished" (p.156). Implicated in this process is the ZBR, which cannot be studied directly in the present; "it refers to the hidden processes of the present that may become explicated in reality only as the present becomes the (nearest) past, while the (nearest) future becomes the present" (p.161).

The authors argue that in its recent uptake, the ZBR/ZPD has lost its original context and has become ontologized—it "is assumed to exist as an entity among other psychological functions," not as a "dynamic process of emergence" but as "a static depiction of some process of teaching and learning" (p.167). It's not the help that is important in the ZBR, they say, it is the horizon (p.167).

Galina Zuckerman. “Developmental Education”
The author discusses education and human development from a Vygotskian perspective, noting two laws of human development that Vygotsky formulated: “The law of the transition from natural to cultural forms of behavior that are mediated by tools, signs, and symbols” and “The law of transition from cooperative, interpsychological to individual, intrapsychological forms of behavior” (p.178).

Under the first law, the author discusses the self-development involved in mediation. The gap between the wish and the action, she says, yields the fruit of civilization (p.179).

Anke Werani. “A review of inner speech in cultural-historical tradition”
Werani examines inner speech in the Vygotskian tradition (p.273), noting that understanding involves words, thoughts, and motivations (p.280). The most valuable thing in this chapter is the table on p.282, which summarizes the functions of inner speech explored by Vygotsky, Luria, Halperin, Ananev, Sokolov, and Achutina; the author discusses this Soviet tradition of inner speech research as well as its uptake in the West.

Eugene Subbotsky. “Luria and Vygotsky: Challenges to current developmental research”
Here, Subbotsky reviews Vygotsky’s distinction between lower and higher mental functions (LMFs and HMFs). In Vygotsky’s account, LMFs are innate, nonmediated, involuntary, and isolated, while HMFs are socially organized, mediated by the social world, voluntarily controlled, and linked (p.297). Vygotsky opposed this view to that of Gestalt psychology, which relies on universally structured laws of perception; Vygotsky objected: how did these laws develop? (p.297).

From here, Subbotsky moves to recent work on executive function (EF), which is “a complex cognitive construct” consisting of “working memory, inhibitory control, and attention flexibility” (p.300). Yet in contemporary work, EF is studied as exclusively cognitive, not related to social and cultural contexts (p.300). He argues that such contexts are extremely important to development and developmental disorders, and calls for alternative approaches to EF; he nominates Vygotsky’s approach to the development of conscious action (p.301). Subbotsky moves from Vygotsky to Luria, whose foundational EF work was based on Vygotsky’s (p.304). As a side note, Subbotsky claims that Vygotsky-Luria HMFs were “the first primitive version of what later became known as ‘mental models’” (p.308).

Aaro Toomela. “There can be no cultural-historical psychology without neuropsychology. And vice versa.”
Toomela’s second chapter uses the sort of title Toomela enjoys using — a declarative sentence that draws a clear line in the sand. Here, Toomela argues that the Vygotsky-Luria approach to cultural-historical psychology “is pregnant with promises for many new discoveries that may lead to fundamental changes in our understanding of the human mind” (p.315).

Toomela bases this chapter on his own readings of the authors’ works and acknowledges that this reading is different from others (p.316)—by which he means not just Vygotskian scholars, but more directly, scholars in neuropsychology (e.g., p.328). The conventional interpretation, he says, is wrong: “Vygotsky (and Luria) consistently followed not the linear cause -> effect but the differentiated-holistic structural-systemic way of scientific thinking” (p.335).

Vygotsky and Luria argued that semiotically mediated thinking develops hierarchically and therefore “there can be more or less developed cultures depending on the hierarchical forms of word-meaning-structure of development that are available in a particular culture” (p.338). However, contemporary cultural-historical psychologists such as Cole and Wertsch disagree and label this thinking as ethnocentric. Toomela rejects this recent view for at least two reasons. The first is that the recent activity theory-flavored approaches are founded on non-Vygotskian “linear environment->individual relationship thinking” (p.338). The second is that “the conclusions of the activity theory are based on superficial similarities between tasks and task performances. Activity theory rejects a priori the possibility that mental structures underlying external task performance may be different” (p.338). (I guess I have been out of this loop for the past 20 years, since this assertion seems new to me.)

And I think this is where I’ll leave the review. This handbook covers an extraordinary range of viewpoints and draws on a broad set of disciplines to expand our understanding of cultural-historical psychology. Should you read it? If you’re interested in Vygotskian theory, or activity theory as an outgrowth of it — of course.

Wednesday, March 21, 2018

Topsight 2.0 > From analysis to design

At the end of the first edition of Topsight, I exhorted readers to use their new insights—generated from a field study, processed through analytical models—to design better solutions. But taking that next step, I said, went beyond the scope of the book.

That ending always bothered me for two reasons.

One was that, although I could point to other resources, readers would likely find it difficult to join their Topsight-generated insights with a given design approach.

The other was that I did know how to join these insights with a design approach—participatory design. I had written several articles on PD, one of which is my most broadly cited article, and I had already integrated PD methods into my field methods class. But when I was writing the first edition of Topsight, I hadn't worked out these connections, and I wanted the book to get out there.

In the intervening five years, however, I developed materials for better integrating the Topsight approach with PD. Among other things, I articulated the connection between Topsight-generated insights and design approaches and I emphasized the "fail faster" aspect of design work. The latter was influenced by my recent work with entrepreneurs, who (at the early stages, when their offering is still malleable) must continually reposition their offering to interest stakeholders—a process that encompasses design as well as argument, application, and financial model.

These insights fed directly into Topsight 2.0. In this second edition, I add an entire new section—six chapters—discussing how to turn Topsight-generated insights into design decisions. The section covers PD techniques such as prototyping, organizational games, and future workshops, providing step-by-step directions and discussing when each might be brought into play. And, critically, it discusses how to feed the results of these techniques back into the design process so that readers can continue to develop insights and iterate on them quickly.

To be honest, there is a ton of information on using prototyping, and much of it goes deeper than I can in Topsight 2.0. But you'll be hard-pressed to find much material on organizational games—an intriguing technique for understanding organizational relationships and routines, one that is a great match for the Topsight approach and that I discuss in detail here. Similarly, future workshops can help stakeholders to understand the deeper contradictions underlying their organizations so they can talk through these contradictions—but this technique also does not have a lot of published material.

Why this dearth of material? I think it's because design research has moved away from organizations and toward consumer software and products, an arena in which organizational games and future workshops don't make as much sense.

But for readers of Topsight 2.0, who want to design new solutions in the context of an organization, organizational games and future workshops are a great fit. If that sounds like you, please pick up a copy and let me know what you think!

Wednesday, March 14, 2018

Topsight 2.0 > Now under Kindle Unlimited

I just announced that Topsight 2.0 is now available on Amazon.com. Much more content, but the same price, so that I can get this book in the hands of people who need it.

For the same reason, I'm happy to announce that the Kindle version of Topsight 2.0 is now up, and it's listed under the Kindle Unlimited program. If you're signed up for Kindle Unlimited, you can download and read it for the low price of $0.00 (USD).

If you're not signed up for Kindle Unlimited, you can buy it for the same price as the original Topsight for Kindle — $7.99. That's lower than the $19.99 print price, and you can get it in your hands immediately!

I hope you try out Topsight 2.0 in either format. I'm excited about the book and its additions. Let me know what you think!

Topsight 2.0 > I have an announcement

Five years ago, I decided to try an experiment. What if I published a research methods book—a book that described how I conduct qualitative field methods for workplace studies? What if I made it as simple and accessible as possible? And what if I kept the price low so that people could access it easily?

The response was better than I hoped. Topsight has been used in graduate and undergraduate classes across North America. It has sold globally. It has been used in industry. Right now it's sitting on a perfect five-star rating on Amazon, with comments such as "Topsight is my favorite book. Hands down" and "THE book to buy for conducting research and writing a report." I'm thrilled that the book has been useful.

I'm also gratified that Topsight is being recommended by professors to their Ph.D. and MA students—and surprised that it is increasingly being cited in scholarly research (28 times as of today).

But.

I've been using Topsight for those five years to teach the principles of workplace research to BA and MA students, and as a reference for my Ph.D. students. And through those activities, I've noted some areas in which Topsight could be made even better.

For instance,

  • the early chapters discuss organizations, but not in as much detail as I would like.
  • the chapter on coding data is critical, but it isn't that easy to follow. 
  • the section on modeling provides several models, but doesn't give advice on how to build one's own customized data models.
  • the interim report isn't well aligned with the advice I give in the instructions.
  • the book ends by suggesting that students go on to engage in design—but doesn't talk about basic approaches to design, such as prototyping, organizational games, or future workshops.
Topsight is good, but it can be even better. 

I'm happy to announce that it now is. 

Topsight 2.0 has just been launched on Amazon.com. It's reformatted, it's a lot longer, and it addresses the points above as well as others. Better yet, it's the same list price—which makes my margins a little thinner, but keeps the book accessible to the people who need it.

As of right now, Topsight 2.0 is available in print; within the next two days, the Kindle version will also be available. It'll have the same content and the same features as the print version—and it'll also be the same price as the original Topsight for Kindle. 

Over the next few days, I'll be blogging more about the new features in Topsight 2.0. I hope you'll pick it up, and please don't hesitate to let me know what you think!

Wednesday, March 07, 2018

Reading :: Bodies in Flux

Bodies in Flux: Scientific Methods for Negotiating Medical Uncertainty
By Christa Teston


Christa Teston has been applying the tools of rhetoric to medical practice for a while, authoring a series of articles that specifically focus on how medical professionals use agreed-upon methodological principles to work across fields. In this book, she pulls that work together, using Annemarie Mol's work on multiple ontologies to theorize this cross-field work. Here, Teston focuses squarely on methodology: how it is used to generate authority (p.1) and negotiate uncertainty (p.2).

Specifically, Teston looks at cancer care, and four ways to negotiate uncertainty in this activity: "evidential visualization, evidential assessment, evidential synthesis, and evidential computation" (p.2). Evidence, here, is understood as fundamentally rhetorical, and "backstage methods" (in the Goffman sense) allow medical professionals to "coproduc[e] evidential order from biological chaos" (p.15). Medical professionals use these methods to deal with inevitable flux, creating "evidential attunement," which "necessitates entanglements between human, nonhuman and computational actors" (p.15).

To develop this argument, Teston draws on "the so-called new materialist and nonhuman turn" (p.18). Perhaps Teston dislikes the term "new materialism" as much as I do: after name-checking Latour, Callon, and Bennett, she adds, "Although some have called this brand of materialism new, others (i.e., those who align themselves with materialist feminists) would suggest that there is nothing new about this materialism" (p.18). She goes on to discuss others in this vein, such as Hekman, Pickering, and Deleuze & Guattari, then concludes, "In this book, I locate material-discursive intra-actions between humans and nonhumans at the seat of method" (p.18).

She adds that although medical care strives for certainty, "this book unearths reasons for how and why it is that cancer care is not and can never be an objective science." She is not critiquing the "black boxes" of cancer care, which "are essential" because they "do the hard work of stabilizing, qualifying, and mobilizing 'future use of ideas and facts' while aggregating and mobilizing alliances (Danius 2002, 41-42)" (p.22). Rather, she seeks to examine how this work happens rhetorically, demonstrating rhetorical theory's explanatory power (p.22).

To reach this goal, Teston draws on case studies in which she investigates method attunement in cross-field medical work related to cancer care. She describes these qualitative studies and draws on rhetorical tools and concepts—Toulmin analysis, stasis theory, enthymematic reasoning, kairos, phronesis—to take apart the suasive work happening in each. And I appreciate that she presents these cases with a high degree of methodological explanation, demonstrating the rigor of each one.

On the strength of these cases, Teston concludes that "defining and diagnosing disease is a kind of quixotic empiricism" (p.169). Within these cross-field cases, she says, evidences result from rhetorical attunements—"In medical practice, rhetoric is a material-discursive performance that involves dissecting corporeal differences and similarities into manageable bits and bytes. Rhetoric is a material-discursive act of designing and deploying algorithmic protocols capable of predicting and communicating about possibility" (p.171).

Overall, the book was well done, both methodologically and theoretically. Teston offers a materialist approach to understanding medical rhetoric in particular and methodology in general. If you're interested in medical rhetoric, scientific rhetoric, materialist approaches to rhetoric, or methodology, definitely take a look.

Wednesday, February 28, 2018

Reading :: Exploring Semiotic Remediation as Discourse Practice

Exploring Semiotic Remediation as Discourse Practice
By Paul A. Prior and Julie A. Hengst


In this 2010 collection, the authors discuss semiotic remediation—that is, how symbols can be taken up in an activity and, through that taking up, "produce altered conditions for future action" (p.1). This work is grounded in the dialogic semiotics of Voloshinov and Bakhtin (pp.2-3), the notion of remediation as discussed in Bolter & Grusin, and genre theory's taking up of genre assemblages (pp.7-8).

The collection features contributions from rhetoric and writing studies; communication studies; speech and hearing; anthropology; and cognitive sociology. In this review, I'll zero in on two of those contributions.

Julie Hengst's "Semiotic remediation, conversational narratives and aphasia" examines the phenomenon of aphasia, in which acquired brain damage results in lost ability to understand and/or express speech. "Clinical accounts often describe individuals with aphasia as being able to communicate better than they talk, that is, as individuals whose communicative competence is better than, though masked by disruptions in, their language abilities" (p.109). Aphasia, Hengst argues, "disrupts not only the isolated performance of individuals but also the typical communicative practices of all participants in an interaction," and thus "individuals with aphasia and their communication partners must work together to reorchestrate the semiotic resources of communicative interactions and redistribute the burden of meaning-making in interaction" (p.109). That is, because communication is social, aphasia-related communication disruptions are addressed socially. In a social performance,
communicative competence can exceed linguistic performance in interactions of individuals with aphasia. The ability of individuals with aphasia to engage in complex, frame-shifting discourse practices so successfully and yet with sometimes quite limited linguistic signaling also helps us to see beyond the bright lights of language, to recognize how much communicative weight other semiotics can and routinely do bear. (p.110)
Hengst illustrates this point with "narrative tellings" taken from semistructured interviews and observations of pairs that included an individual with aphasia and one without (p.117). Through descriptions and transcriptions, Hengst demonstrates how the participants combined narrative, gestures, and other symbolic resources (such as a map) to jointly tell stories.

Paul Prior's "Remaking IO: Semiotic remediation in the design process" examines how a multimedia design team jointly produced objects (p.207). Specifically, Prior examines this team in terms of situated practice, in which writing is treated as a verb (activity) rather than a noun (artifact) (p.209). By closely examining videos of the team's design interactions—presented here as screen grabs combined with the transcript—Prior gives us a detailed picture of these interactions. In one case, "the drawing/text on the whiteboard ... involved at least 29 different actions that touched the surface of the whiteboard, movements made by two people ... over a period of less than three minutes of interaction" (p.219). And these movements were coordinated with further movements, including a laptop screen and gestures in the air. "Inscription at the whiteboard then emerged in sequential, temporal, co-present interactive acts; it represented writing-as-activity rather than writing as only artifact" (p.219).

Prior connects this empirical work back to the notion of chronotopic lamination that he had developed earlier with some of the other contributors to this book: "the simultaneous management of multiple social frames and footings as laid out by Goffman ... and Goodwin and Duranti..." (p.228). Here, chronotopes are linked or laminated. Chronotopes can be representational (narrative) or embodied (experiential), but they can also be embedded in affordances, such as when the chronotope of the road is embedded in "such sociomaterial forms as roads, signage, maps, and inns for travelers" (p.228). And "in activity all three of these chronotopic dimensions are necessarily fused" (p.229, his emphasis).

Thus "inscription and semiotic production [are] both situated in local interaction and dispersed across time" (p.233). The team Prior studied used multiple mediational means to semiotically remediate their design processes. But, he adds, "this kind of heterogeneous and heterochronic mix of mediational means, this kind of semiotic remediation, is a pervasive feature of human social practice, not an anomalous development of the digital age" (p.233). People demonstrate "semiotic agility" in switching between semiotic worlds; "managing multiplicity is simply part of everyday existence" (p.233).

Overall, this was a really interesting and useful book. I haven't done justice to all of the contributors in this review, but the two chapters I have overviewed should give you an idea of what the rest of the book offers. Pick it up!