Thursday, September 1, 2011
Getting Over "Social Media"
I don't think we need a new definition of social media. I think we need to get over it.
When we are able to accept it as no more exceptional than ordinary conversation, it will finally achieve the status of an unremarkable, unnoticed, natural, and ubiquitous human activity. It will become simply, as Brian Solis points out, "media."
The end-point of the evolution of "social media" is its disappearance from our collective consciousness. It's when nobody ever asks about it or thinks about it, much less promotes it or professes to understand it better than other people. It's when the phrase becomes meaningless. Once everybody is a "social media expert," nobody will be—and that's when it will have achieved all it can.
I detest the phrase "social media."
All media are "social." The word means "relating to human society and how it's organized, relating to the way people in groups interact and behave toward one another, living (or preferring to live) as part of a community."
All communications media have a role in organizing society, uniting or dividing communities, and establishing standards of behavior. It's the nature of communication, defined by James Carey as "a symbolic process whereby reality is produced, maintained, repaired, and transformed." "Society," Carey tells us, "is possible because of the binding forces of shared information circulating in an organic system." The purpose of all media is to share information and, thus, bind society together—to be social.
While traditional media may seem to lack the participation of the audience as producers—generally considered a defining characteristic of "social media"—even newspapers invite letters to the editor, radio stations broadcast calls from listeners, and American Idol asks viewers to vote.
People certainly talk with one another about the content of all media. We join reading clubs or chat about books with our friends. We discuss TV programs at the water cooler. We buy things, sell products, vote, and form relationships based on media messages.
Thus, "traditional" media's social aspects extend beyond its physical or audio-visual manifestations, and I think it's wise to think of any medium as including not only those manifestations, but also its extended social influence. In that sense, some part of the "Arab Spring" uprisings and the recent demonstrations at our local rapid transit stations here in San Francisco are not merely the results of, but also components of communications media. Cause and effect are parts of the same phenomenon, and part of any medium is its intended or unintended social effects. We are all radio, and it is us.
We use the term "social media" to lump together all manner of Internet-enabled, audience-participation communication solutions: Facebook and LinkedIn, Quora and Pandora, Twitter and Foursquare, Yelp, Digg, Flickr, Google Groups, multi-user games, and hundreds if not thousands more. I propose that this "lumping" does a disservice, distracting our attention from the unique attributes, functionality, and uses of each.
Some respondents to a LinkedIn question on the subject proposed that the coffee houses and pubs of old were the social media of their day. So were newspapers, as the literate few read them to groups of friends and neighbors who discussed their contents. Nobody bothered to call all of these "social media" or needed to think of them as anything other than what they were.
"Social media," the term, serves only two purposes, it seems to me. It's a handy buzzword to give cachet to the products of entrepreneurs and thereby capitalize on press and public fascination with such things; and it's shorthand to express that a particular medium is new, participatory and, probably, Internet-enabled. It's a disguise, not a description. It conceals the social aspects of all media.
I'm rather tired of hearing about social media; I'd prefer to use it instead of talking about it. And I wish everyone else would do the same.
Friday, April 16, 2010
No, I'm Not Shampoo
Lot of talk these days about "personal branding." Tom Peters is one of the better known business gurus doing the talking, and at Amazon.com there are pages of books by other experts on the topic.
Brenda Bence, MBA, is one of them: an "internationally-recognized branding expert" who has worked for Procter & Gamble and Bristol-Myers Squibb and as a motivational speaker and executive coach. She has created an industry around the personal branding fad that includes several books all titled with variations on "How YOU Are Like Shampoo."
In the first of these, she says, "I firmly believe that people—just like shampoo and other products—are brands, too." Ms. Bence goes on to remind us that Brad Pitt, Mel Gibson, and Britney Spears are individuals with very specific personal brands, with the implication that the rest of us surely want to be just like them.
I doubt that. In fact, I doubt any of them are entirely happy being "just like them." Their celebrity brand serves a publicity purpose but, probably to their own dismay, it is not a reflection of themselves as three-dimensional human beings. And it is not the secret to their success in show business. Don't be fooled: there's a lot more behind Brad Pitt's success than the gloss of his celebrity. Gibson is hired for his whole self, not merely his public brand.
Celebrity is a facade that some wear with grace, others not so much. But what they bring to those who hire them is not their personal brand—it's their talent and their work ethic and their humanity, with all its strengths and quirks.
But back to our shampoo marketer/executive coach...
Bence goes on to dismiss the reasonable objection that the rest of us can't be like those people—Pitt, Gibson, Spears and the rest—because, unlike us, they are celebrities. She suggests that the only thing that makes them "different" from the rest of us is that they all employ image specialists to manage their brands.
But there's hope, says Bence: We don't need expensive help to manage our personal brand's perception, we just need to read her book and take her advice.
The first bit of that advice is that "perception is reality in marketing ... it doesn't matter who you think you are. What matters is how others perceive you."
So to Ms. Bence, personal branding is all about managing perception, not about substance: about perceived value, not real value; image, not integrity.
And that's why I don't believe one does, or should, create and market a personal brand. The term is meaningless and the very idea dehumanizing, inappropriate, and dishonest.
"Personal branding" is a two-dollar name we give to the age old act of posing-to-impress. We use such high-falutin' phrases to make ourselves seem (or feel), more knowledgeable, sophisticated, and fashionable. (We don't look for jobs anymore, we "network." We don't act ourselves our improve ourselves, we "develop our personal brand.")
"Personal branding" is the most currently hip in a long string of self-help management techniques, except that it is not about self-improvement, but conveys something less genuine: self-packaging.
The only value of the phrase is that it gives us a slightly different way to think and talk about our ambitions and how to achieve them—modeling the process on tricks pioneered by the "hidden persuaders" of yore. I'll grant that. But it's an inherently dangerous model that can make us less than what we are—not more.
The phrase has the ring of scientism and enlightened, dispassionate management—but also the accompanying smell of fraud, exploitation, and fakery. It reeks of the rudest ambition and the most unseemly self-absorption. It sounds dishonest and beneath the dignity of human beings.
Branding is for corporations, not people. It is the creation of meaning around a business or product that is otherwise devoid of meaning and differentiation. In practice, branding is more the manipulation of image, less the creation of substance. It's something we do to cattle, potato chips, the aforementioned celebrities, and cosmetics. It's what Ms. Bence did for shampoo at Procter & Gamble.
What real people do is engage with other people and build their reputations—through good works and value, through their contributions to the success of others, through their humanity, and by their demonstrated integrity.
Branding is too restrictive for anything as versatile and deliciously unpredictable as a human being. Despite what all the gurus proclaim, I am most decidedly not a brand; I am me, take it or leave it. Today and in this place with these people I am one me; tomorrow, elsewhere, or with others I will be another. My generation fought against the grey flannel suit, the organization man, the pigeonhole, the stereotype, the glass ceiling—and won. We won the right to not be pigeonholed or defined by others, and it would be hypocritical and foolish to do that to ourselves.
We should not think of ourselves as brands—and should not want to—any more than we should think of our faces as logos, our beliefs as positioning, our character as our "unique selling proposition," or our friends, colleagues and associates as a network or the "value chain" we bring to the market. I am not a definable collection of features and benefits, not a platform or an ecosystem, but an ocean of possibility. My name may be my word, but I refuse to call it a brand promise. My sizzle is not for sale.
Personal success does not come from packaging, but from performance; not from buzz, but from respect; not from a marketing strategy, but from a consistent habit of goodwill, kindness, and humor.
We are hired for the value we provide others, for our honor, honesty and reliability—not because we have succeeded in creating an appealing "personal brand." Branding may get us in the door as objects to be ogled, but we will be judged for something else: our true selves; our unadorned substance; our un-spun character; our raw, naked, unpackaged, and unpretentious humanity.
I have nothing against genuine and sincere self-improvement, no quibble with the value of learning and skills development, and certainly no problem with ambition, nor any argument against the self-promotion necessary to get what one wants. But let's leave the branding to objects that cannot engage with others on their own behalf. We're better than that.
Thursday, January 28, 2010
The Message of Silence

As is tradition, the audience punctuated the speech with repeated standing ovations, for about 17 of which Republicans joined their Democratic colleagues. The State of the Union message is one of the very few presidential responsibilities specified in the Constitution. That the members of his party frequently stand and applaud the President's words—especially the fighting ones—is an unwritten rule. When those words are about the Country's greatness or the valor of American heroes, the rule applies to Members on both sides of the aisle. But regardless of how enthusiastic or how bipartisan the ritual standing and clapping is, it tells us nothing we don't already know.
Any impact of those ovations paled in comparison to the unanimous silence that met the long conclusion of the President's speech. For a full five minutes and fifty seconds, Mr. Obama called on government, business, and the press to act with the dignity, and demonstrate the values, of the American people. And for all that time, the audience was hushed, still, attentive, and perhaps even contemplative. The Members of Congress responded as a chastised child would, listening to a parent's quiet, wise, and reasoned counsel.
It was, for me, an emotional and rhetorically effective few minutes. Several times, Obama paused for four or five poignant seconds to let his words sink in.
Whether his message and plea will have any practical effect on the tone of debate or the progress of legislation in Washington, whether it will turn the opposition from obstructionism to governance, is yet to be seen. I am hopeful, but not optimistic.
But those riveting few minutes of respectful silence spoke very clearly about the nature of leadership, the stature of the President, and the seriousness of the situation in which we find ourselves.
(Listen to the last 5:50 here.)
Saturday, January 23, 2010
To Be, Or To Do?
It used to be that word processors were machines that people used for the single purpose of preparing documents. Then personal computers came along, absorbed the functionality of those dedicated devices, did the job better, and did other things, too.
Appliances like word processors, telephones, fax machines, GPS navigators, scanners, calculators, games, televisions, radios—and even computers—aren't just discrete gadgets anymore. Not necessarily.
They are functionalities that are built into myriad devices. They are capabilities, not contraptions. They are things that are done, not gizmos that do a thing. Their physical forms have dissolved away into the digital soup of possibilities; their potential floats freely, to be sucked up into other, more complex forms.
Making voice calls is only one functionality of the smart phone, and it's a function that's available as well in computers and automobiles—and will be in your television, too, if it isn't already. Television isn't just a box in your living room; it's a function that's available in your computer, your phone, and your game console.
It isn't just computers that can connect to the Internet. So can a cell phone, a refrigerator, a home irrigation system—anything that's equipped with digital communications functionality and the necessary software.
A doorknob can be connected to the Internet. But there's no reason to do that, unless connecting to the Internet makes it in some way a better doorknob—or provides some valuable benefit: increased security, or useful information. In many buildings, doorknobs connect to security systems, and some of those use the Internet to send information about who's entering, when.
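To make that concrete: what turns a doorknob into a "better doorknob" is nothing more exotic than a bit of software behind a sensor. Here is a minimal sketch in Python (every name, field, and address in it is invented purely for illustration, not taken from any real product) of a door reporting who entered, and when, to some security service.

```python
# A toy sketch, not a real product: the "connected doorknob" reduced to
# a few lines of software. The endpoint URL and field names are invented.
import json
import time
import urllib.request

SECURITY_ENDPOINT = "https://example.com/security/events"  # hypothetical service

def report_entry(badge_id: str) -> None:
    """Tell the (imaginary) security system who entered, and when."""
    event = {"door": "front", "badge": badge_id, "time": time.time()}
    request = urllib.request.Request(
        SECURITY_ENDPOINT,
        data=json.dumps(event).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)  # the doorknob "phones home"

# Wired to a door sensor, that's the whole trick:
# report_entry("badge-1234")
```

The point isn't the code; it's that "doorknob" here has become a function that can be poured into anything with a network connection.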
So this raises questions: If your computer can be a telephone and your telephone can browse the Internet, what is a telephone? What is a computer? What is a calculator? A game? A radio? A television? Or a doorknob?
What they are not (or no longer need to be) is single-purpose, stand-alone gadgets. They are functionalities that are absorbed into other things; they are things that can absorb other functionalities. That's the result of information of all kinds in digital form, of the ubiquity and power of microprocessors to deal with that information, of software to tell those microprocessors what to do, and of communication networks that connect discrete systems to others.
I doubt there's any reason to connect my toaster to the Internet, any benefit that's worth the effort or expense. I don't need a hammer that can find a hardware store through Google when it knows I'm running short of nails. Some tools will continue to be single-purpose and rather dumb gadgets that don't connect to anything—or need to.
But devices that communicate and deal with information are dissolving and becoming functionalities of other things. It wasn't so long ago that people wondered what computers could possibly be used for. Many of us struggled to justify buying the things. Now that they have sucked up so many capabilities from the digital soup, we wonder how we ever lived without them.
At one time, we thought that "digital convergence" meant that you could handle just about any kind of information on a computer. We thought it was a threat to industries that delivered information through other means: publishing, broadcasting, telephone and cable companies.
I made a movie for Bill Gates (see "Digital Convergence") to explain this perception in humorous ways and describe how the media and communications industries were reacting to the menace.
Now we know better: that convergence is not so much a threat to these industries, as to their old business models and product lines. It is an opportunity to transform both and add value to their offerings.
Television can get out of the box in the living room, and has done so. Telephones run applications and games; know where they are in the world; retrieve, store, and present information and entertainment. They have become re-defined, and much more useful and valuable. Books and magazines, even in their present ink-on-paper form, can be interactive communication systems of greater value and relevance—if publishers embrace and promote technologies that are already available and ask the question: What is a book? What is a magazine?
The only threat is to those who persist in the old definitions of what things are, and who think that things are and will always be just things—objects instead of functions, nouns, rather than verbs.
Ulysses S. Grant and Buckminster Fuller both said, "I am a verb." The objects around you are saying the same thing. Are you listening?
Friday, December 11, 2009
Pain English
I suspect the reason business people so often resort to gobbledygook in their writing is not just laziness or habit, but that they're afraid specific language is too confining and restrictive. Ambiguity seems more inclusive. It relies on the reader to fill in the blanks of meaning. These would-be communicators fear that explicit, precise, and detailed writing might not pique every reader's interest or encompass every possible interpretation.
Problem is, it doesn't say anything either.
You don't have to look hard for examples. As I started to write this, IBM spammed me with an email they hope will draw my attention to an article in their newsletter. They say the piece is about how that company helps "organizations make strategic decisions that enhance competitive advantages, create new sources of value, improve revenue growth and develop the change programs necessary to meet business or mission objectives."
Well, who wouldn't be interested in all—or at least some part of—that? Aren't those things that all businesses want to do? But the email doesn't give me a clue what IBM's consultants really offer, or how they deliver it, or what they've done for somebody else.
On the contrary, the message I get—loud and clear—from their spam is that IBM doesn't know a thing about my business or me. They think of Bob Kalsey and Bravura Films as just another organization with problems and issues no different from those faced by any other. By trying to offer everything, they offer nothing at all. That's the ROI for ambiguity: nothing.
These thoughts are inspired by a question today on LinkedIn, where I occasionally try to contribute to the conversation. Loren Hicks referred there to a help-wanted ad that includes the phrases “The role will leverage all aspects of the offer matrix” and “ … will include presenting and evangelizing xxx’s offering …”
Loren asks how people respond to such a thing and whether they'd apply for the job—whatever it is.
My response was:
I would be pleased to apply for a role that leverages all aspects of the offer matrix and proud to present and evangelize the company's offerings, especially if the aspects are of the unique, cost-effective and robust next-generation aspect type.
At the end of the day, though, the matrix would have to be flexible, scalable and optimized in terms of metrics that deliver value added outcomes, and the offerings would, hopefully, be easy to use, world-class and unique—as well as focused on high performance innovations from a leading provider of new and improved feature sets. Heck, the bottom line is, I'd give 110 percent commitment—or more—to such a win-win partnership of all stakeholders. Wouldn't you?
I have to give credit for many of the buzzwords I used to David Meerman Scott and Dow Jones, who created and provide (under Creative Commons) a list of the things. Just in case you've forgotten some of them and don't want to leave any out of your next spam.
Scott is as ferocious about blather and baloney as I am, and you might want to read his Gobbledygook Manifesto, in which he points out another reason business writers default to those words: they don't really understand their products, or how customers use them.
Shame on them.
Monday, September 14, 2009
Social Networking
Names are kind of funny. We like to name things, because it gives us the illusion of understanding them and the hope we will ultimately control them. When we name a disease, for example, we begin to think that one cure – if we're lucky or smart enough to find it – will remedy all instances of the malady. Unfortunately we often mistake symptoms for diseases and forget that a symptom may have many causes. Cancer comes to mind. Or the common cold. We forget, too, that the disease might be entirely imaginary – caused by mass hallucination or hysteria.
"Social networking" has a lot in common with diseases in those respects. It isn't a single thing, nor is there a single way to deal with its many instances. Giving it a serious-sounding and techno-babbly kind of name may make us feel as though it's one thing and that we understand it, but those impressions are false. It might even be imaginary, brought on by exposure to the radiation of computer monitors and Blackberry LCD screens.
We like to categorize things, too: also so we can understand and control them. Ever since Linnaeus we've tried to categorize the flora and fauna of the Earth, for instance. But we've embarrassed ourselves many times because the categories we've invented have sometimes turned out to be meaningless and those in which we've chosen to place a thing have not always been the most appropriate.
"Social networking" is a whole jungle of creatures and they don't all belong in the same part of the zoo. Facebook and Flickr, YouTube and Twitter may be cousins, branches on the Internet family tree, but should they be in the same cage and fed the same diet? Maybe they're all in the kingdom "Digital," the phylum "Internet," and the class "Social," but are they all in the order "Advertising Medium?"
Marketers are some of the most dedicated namers of things. One of them famously seized on a clinical-sounding name, "halitosis," in order to sell an elixir for bad breath. Many folks who sell advertising and technology consulting services have latched onto "social networking" as a way to foment a profitable combination of greed and fear, dread and avarice in the marketplace for their wares. It helps that nobody really knows what "social networking" is (it can mean anything you want it to mean), but everybody wants to turn it to his own advantage or save himself from its potential ravages.
We sure do like to try to turn everything we encounter to our advantage, and that can bring good results or otherwise. Thankfully some guy long ago saw a spiny lobster crawling around and said "I don't care what it looks like, I'm gonna eat the thing." But it's not a good idea to leave infants unattended around cans of paint thinner.
Many thirsty folks these days are thinking seriously about swallowing the social networking Kool-Aid. I guess we'll find out how that turns out.
Friday, August 21, 2009
A View of Mt. Twitter
Doc Searls is a most interesting fellow and he has a wonderful sense of metaphor. The other day he wrote that tweets have "the impact of snow on water" while "blogging is geology."
Tweets, as you know unless you've been comatose for a while, are those usually trivial and often incomprehensible mini-messages that some folks like to send out into cyberspace from their phones or computers in hopes of relieving their feelings of inadequacy and/or irrelevance.
Some folks in Iran did a lot of tweeting after the recent elections there, you may recall, and there were important social and political reasons to demonstrate their relevance. Their tweeting served the common good, to be sure, which shows that twittering offers real and potential benefits. Still, so much drivel, so much snow on water.
Not that there's anything wrong with drivel. It serves a purpose. It's part of the glue that holds people together, and there's value in that.
Twittering is sort of like the "Active SETI" project that attempts to send messages to intelligent aliens (should there be any) elsewhere in the universe. Both twitterers and the Active SETI people assume somebody may be out there listening for signs of intelligent life, though tweet-makers sometimes seem to be less concerned about the intelligent part. In the case of SETI, the subtext behind the messages is "You are not alone," while for tweeters it's often "I am here."
Doc's point, if I may be so bold to hazard an interpretation, is that tweets are ephemeral – part of the babble of the human brook flowing by. Blogs, on the other hand, become part of the record of human experience, just as sediments become a record of biological and geophysical events.
No doubt part of the appeal of Twitter is that it's so darned hip. But another part is that it IS ephemeral, which makes it a low-risk form of communication. Tweets aren't as likely as blog or Facebook postings to come back and haunt us someday. They go away pretty quickly, almost as fast as the remarks we make in conversation, so we can be spontaneous and frivolous and not fear that others may use our words to our detriment in the future – to make us seem supercilious or trivial or careless or worse.
I think Twitter may be changing blogging, making its recording function more significant and its reporting function less so. People use Twitter now to point others to things they find interesting or provocative and to publish trifles – things they might have formerly done with weblogs. Blogs are, I think and hope, becoming a medium for more carefully considered and painstakingly prepared messages. Blogs may become more worthy of the preservation that is part of their nature. They may become more interesting. They may even remain interesting to the cultural archaeologists who will dig around in them in the future to find out what people were like back in the early part of the 21st century.
It's a commonplace that each new medium adopts some of the characteristics of those that preceded it. Television, before it found itself, was a lot like radio – but with pictures. It's also true, though, that new media change those that are already in use. Radio became something different when TV came along. Twitter is a new medium that has taken to itself some of what was once the purview of the blogosphere, and I expect that blogging will change, now that the "frivolous" stuff we can't stop ourselves from producing finally has another place to go.
Of course, all this presumes that Twitter will persist long enough to make an impact beyond the few million digitally devout souls who use it now. Or that something else will take over its niche. It seems important enough to survive, but I wonder if its importance might be an illusion.
Twitter may seem important mostly because people talk about it. When people talk a lot about something, marketers perk their ears up, wonder if they can use it to sell stuff, and start sniffing around like dogs around a sandwich bush. When people with money in their pockets start sniffing like that, the cadre of consultants sees an opportunity to transfer some of that cash into their own pockets. Those consultants join the crowd of talkers. And pretty soon you've got a phenomenon on your hands, and pretty soon after that it becomes a mania.
There is a marketing principle that says the best way to success is to stake a claim on top of some mountain, where the mountain is an idea or a proposition or a gizmo or what marketers call a "category." Stake a claim at the top where you can be most visible. Many companies and would-be gurus are battling for control of and visibility atop Mt. Twitter – which seems to be about the highest peak on the horizon these days. I wonder, though, if Mt. Twitter is a real mountain or just another hill piled so high with curious marketers and hungry consultants that it has the look of an actual mountain without the granitic core to hold its own against the forces of erosion.
Time will tell. It always does.
Thursday, July 23, 2009
The Future of the Newspaper

Certainly technology will bring profound changes to newspapers and to the ways in which people experience news and information. But whether newspapers survive in their present physical form or some other, I expect them to evolve in significant ways if current trends continue. Their evolution, I believe, will take inspiration from the media with which they compete.
People often decry the bias of news sources, yet the most biased commentators are often the most popular. What we publicly decry may be exactly what we privately crave. We are naturally predisposed to accept and agree with interpretations of facts that support our preconceptions and, similarly, to distrust and differ with those that conflict with them. A trend in information media, facilitated by a greatly expanding number of media outlets, is toward increasing segmentation along socio-political lines.
It's a disturbing development, rooted in the profit motive that is essential to the current model of commercial information providers. Communication was once called "the glue that holds society together." It has become, instead, an adhesive that more tightly bonds individuals of particular socio-political leanings to one another, rather than a unifier of human society as a whole. Success in mass media once required providers to appeal to a broad audience but now it is possible to thrive in a niche.
I expect that xenophobia will spread from talk radio and cable punditry to newspapers. A 2004 study of young adult readers by the Readership Institute found, not surprisingly, that "people want to read about people like themselves in their local daily newspaper," and that "There is less interest (in) coverage of groups to which one does not belong." Perhaps newspapers will become more overtly opinionated in their coverage, cater more to the xenophobic tendencies of their readers, and position themselves more as the voices of specific identity groups. Already we are seeing more opinion, gossip, and biased analysis creeping from the op-ed pages into formerly hard news sections; more column inches throughout local papers topped with the photos and by-lines of their own celebrity pundits.
Another trend, though hardly a new one, is information as entertainment. By far the most popular newspaper features are the comics and sports pages. In advertising and news content, readers under 35 prefer information about "things to do," such as recreation and local activities, and "ways to get more out of one's life," such as health and fitness features. Reports of events from around the world are instantly available on the Internet, through Twitter, and on radio and television. Newspapers are unable to compete with the immediacy and pungency of these other media, so we can expect their focus to shift away from event reporting in favor of lifestyle features, amusement, and the narcissistic concerns of their audience.
A third trend I will call "informer-as-celebrity." I think that in a strange way Walter Cronkite is to blame – not personally, but because of the value he brought as an individual to the CBS television network. The other networks competed against Cronkite's highly successful "that's the way it is" reporting not by doing a better job of authoritative, credible coverage, but by emphasizing the personalities of their own anchors. They fought substance with style, and it proved to be a successful strategy.
NBC created "The Huntley-Brinkley Report" to succeed its "Camel News Caravan," tellingly replacing the name of the program sponsor with the names of its anchors. Chet Huntley and David Brinkley were superb newsmen, but their network traded on their personalities rather than their journalistic acumen. National and local news outlets followed suit and polyester-haired anchormen (and later, women), along with clownish weather and sports reporters filled the airwaves with happy-talk news programs. The spokespersons became the medium and largely the message of broadcast information.
The spawn of the ménage à trois of these trends – socio-political segmentation, information-as-entertainment, and informer-as-celebrity – is hostility-as-entertainment. What appeals to a large audience about cable television's motley crew of bloviators is the anger and rage they express and the gleeful pleasure they take in bitterness, insult, derision, and obstinacy. "Yellow" journalism – sensationalism, scandal-mongering, and unprofessional practices – has a long tradition; it's nothing new, and it's always masqueraded as "real" news. In the past, it has been part of a newspaper's overall brand and only occasionally identified with a specific reporter or columnist. We may well see more – and more outrageous – sensationalism as newspapers experiment with ways to emulate the appeal of their broadcast competitors. And I expect that a breed of bullying celebrity journalist "stars" will become more important to each newspaper's brand.
One can expect other trends as newspapers cater to their perceptions of audience demands.
Young people think newspapers are too big; they prefer concise, bite-size news. According to the 2004 Readership Institute survey, this group tends to agree that: “I wish this newspaper had fewer pages,” “It has too many special sections,” “It tries to cover too much,” and “Too many of the articles are too long.” The same organization's study of a broader reader group similarly concludes that people who feel overwhelmed by news tend to read newspapers less.
Motivated to expand – or maintain – their readership, newspapers seem to believe that their regular, devoted readers can be counted on to continue their newspaper habit, so they are catering more to "lighter readers" – ironically by providing less: fewer pages, shorter articles, and more limited coverage.
Younger readers also say that they highly value "dynamic visual treatment," and newspapers are certainly trying to cater to this with their colorful eye-candy designs – just as cable news relies heavily on high-tech graphics and the endless repetition of dramatic imagery.
Whatever physical form the newspaper takes in the future, we can expect news delivery media to: target segmented audiences; appeal to narcissism, xenophobia, and the thrill of sensationalism; rely on celebrity pundits; deliver less news more concisely; and do it all with dazzling graphics.
Sorry, news fans, but "that's the way it is."
Friday, November 7, 2008
Mark Twain Doesn't Live Here
The website statistics tell me that just yesterday Google sent ten people here (darned near a single-day record) who had typed one variation or another of that saying into the search field. (Yahoo sent a total of none, which may indicate why that company is on the skids.)
I typed the first part of the phrase into Google myself, just now, and this blog came up third on the results page. Kind of gratifying, I guess. Another blogger over at the Humanities Division at Northwest College has put a link to my "It ain't..." post on their website, and that seems to have brought some folks here, too. (To return the favor: it's here.)
I don't know what it is that fascinates so many people about a thing that Mark Twain may have -- or may not have -- said. But on the same day this week, people in California, Illinois, British Columbia, our Nation's Capital, England, Texas and even Vietnam all demonstrated some curiosity about my favorite aphorism.
I've written in this space about John McCain and why the McCainines lost the election. Real important and insightful stuff, I thought. But nobody seems curious about that.
I've posted some stories that I've passed off as humor, and few people seem to give a hoot.
Somebody checked in from Durham, North Carolina, didn't see what they were looking for, and bounced away in under a second, while a devoted fan in San Francisco visited three times yesterday, looked at three pages each time, and spent all of eight minutes here -- probably looking for the exit.
One individual dropped by to find out something about Arthur C. Clarke, who I happened to mention in one post, and stuck around for 17 minutes to peruse 6 pages. This is an example of how the Internet can get you off track. Whoever that was got distracted by other things and totally forgot why he or she came into the room. I sometimes do that myself, so I understand the feeling.
If there were some way to make a buck off people's curiosity about "It ain't what you know..." I would sure like to know what it is. More than that, though, I'd like to find out why people in so many places in the world are so darned interested in it. Must be important enough to them that they spend their valuable time on Google tracking down the phrase.
Google Analytics doesn't let me know who you are, but it shows me a little bit about how visitors got here, where they hail from, and even what browsers they use. I wish it would give me some insight into what the heck you're doing here, what you were thinking.
So, do this for me if you'd be so kind: Leave a comment and let me know why you dropped by. What were you looking for that you did or didn't find? I won't be offended if you got here by mistake; most of my visitors probably did.
Wednesday, November 5, 2008
How John McCain Lost
Perhaps the biggest no-no in a political campaign is to allow oneself to be defined by the opposition. That didn't happen to McCain; he did it to himself.
Seems to me that many people are more influenced by the persona a candidate projects than by the candidate's stands on specific issues or his professed beliefs and values. Even specific deeds, such as McCain's ill-advised selection of his running mate, are viewed (at least subconsciously) more in the larger context of what they reveal about the general character of the man, his overall essence, and less as insights into his decision-making abilities or other specific attributes.
McCain identified himself, repeatedly and with uninhibited relish, as the underdog. I don't think he could have prevented himself from doing so. It's his nature. (Surely some psychoanalyst is working on a book about McCain's psyche and its roots, so I'll leave the scrutiny of his id and ego to the shrinks. They can speculate about the "victim syndrome" and how it relates to his ancestry, his family's early disappointment with him, his imprisonment, and all that other psychobabble rubbish.)
While "everybody loves an underdog" and we may root for them at times, most of us don't really believe that an underdog is the right choice for the "top dog." I think that view is programmed in our genes. (More cud there, with my compliments, for the shrinks to chew on.)
In what ways did he act the underdog?
-- He viciously and unfairly attacked his opponent when he might have stood proudly on his own achievements. He snarled about irrelevancies and yapped at Obama's heels--while the latter stood firm and resolute, composed and presidential.
-- He emphasized trivial, inconsequential chinks in his opponent's armor.
-- He partnered with an insubstantial running mate of trifling accomplishment and minimal intellect, who likewise yipped about petty matters--another underdog who proudly self-identified as something akin to a "pit bull."
-- He introduced us to his friends and most ardent supporters, Joe the Plumber and a mangy gang of rabid hounds, and together they gave the impression of a pack of growling mongrel misfits more suited to a kennel than the White House.
-- He appealed to the insecurities of factions of the electorate: people who felt like underdogs themselves and thought McCain's mongrels were "just like us."
-- He whined about being treated unfairly--a common tactic of frail children who are incapable of defending themselves.
-- He repeatedly raised the specter of the usual bogeymen: higher taxes, socialism, terrorism--rather like a hound barking at the wind in the trees.
-- He charged his opponent with the crime of celebrity--implying that he himself was the antithesis of a superstar, the runt of the litter.
-- He self-consciously lowered himself to a more humble plane than he deserves by constantly addressing the public as "my friends." I don't know whether he did this because of an irritating rhetorical tic or as a desperate ploy to gain acceptance, but either way the habit made him seem pathetic.
But to appear pathetic ("provoking feelings of pity") and feeble was apparently his goal. For he actually TOLD us--on many occasions and most frequently as the contest came down to the final days--that he WAS an underdog, and proud to be one.
And we listened, and we believed him, and we followed the bigger and better-bred dog.
Monday, October 6, 2008
It's a Debate, Wink, Wink
What the listener failed to understand is that political "debates" are hardly the same as the kind you might have engaged in when you were in high school. They are theatrical performances in which one's ability to convey predetermined messages--irrespective of the topic at hand--is greatly prized and highly rewarded. Avoiding the question is an important skill for politicians and diplomats--they do it all the time. While I am no fan of Ms. Palin, I give her some credit for her ability in that regard, although her execution remains a bit clumsy.
In a preview of the debate for the San Francisco Chronicle, staff writer Joe Garofoli described the techniques of Bridging, Hedging, Hooking and Flagging -- all designed "to maximize performance." None of these tactics will win points for your debating team, but they're the stuff that political jousting is made of. "Bridging," says Garofoli, is "Used to avoid answering directly and pivot to one's main messages. Example: 'I understand your point. The more important issue is ... (insert key message)' or 'No. I'd like to explain ... (insert key message).'"
I doubt Palin read Garofoli's unsolicited advice ("If you're stumped, don't be obvious about steering the questions back to a safe knowledge harbor"), but she did just that several times. Natural talent, I presume.
Another thing that counts on the stump but not so much in high school debates is body language. Gestures, tics and physical appearance can win or lose points with the electorate. Nixon perspired: bad. Gore scoffed and snorted: bad. George H.W. Bush checked his watch: bad.
Both Palin and Joe Biden used gestures well last week: she with her wagging head and hypnotic eye contact, he by cupping a hand to his ear to dramatize that he "hasn't heard" the difference between McCain and Bush, pounding the lectern to underscore his side's determination to end the war, and pointing a finger for emphasis each time he said, "Let me say that again..."
I have to give the advantage to Biden, though -- and not only because I find Palin frightening. Her gestures underscored her positioning as "just plain folk," which I take to mean "inadequate for the job," while his helped to articulate his commitment, sincerity and strength. Her smiles were broad, but seemed disingenuous.
And her winks! My God, those WINKS!

For a commentary on those, please see "Sarah Palin, all-American cheerleader" by Tim Kingston and Lisa Moore from this morning's Chronicle on sfgate.com. I'm not sure Sarah's eye-squinches represent "the promise of power in exchange for sex," but they sure seem manipulative to those of us who don't consider the moose-hunter from Wasilla to be a hottie.