January 10th, 2013 by Gwen McCarter
Tech innovation is the stuff of magic when it presents us with the devices we never knew we couldn’t live without, or when it fulfills our collective geek dreams to see Star Trek become reality. (Now that we have the iPad mini with FaceTime, here’s hoping teleportation is next.)
And then there are times when perfectly useful and good technology morphs into something hideous and terrifying. In the case of The Terminator (or, more accurately, in Terminator 2: Judgment Day, which gave me many a nightmare), machines were positioned to take over the world because of one scientist’s desire to see his research through to its full potential. (If Joe Morton’s character couldn’t have foreseen the consequences in 1991, fair enough, but this wipe-humans-off-the-earth scenario has apparently become so much of a pressing concern that Stephen Hawking recently joined an “anti-robot apocalypse think tank” called The Cambridge Project for Existential Risk.)
A much more common and seemingly benign version of this shift happens when a popular tech innovation is stretched just a little too far. Take high-definition television (or the floundering 3-D fad, for that matter). In contrast to the charming pixelation that we were willing to tolerate for decades, HD has felt nothing short of amazing as a way to make pictures clearer and experiences closer. But the time has come when companies are attempting to make that experience so close that it’s no longer comfortable. Instead of interacting with a story through a filter that leaves something of its world to the imagination, instead of looking at a news anchor through a lens that kindly leaves his or her pores out of the picture, we’re confronted with an in-your-face, never-before-so-intense degree of the hyperreal.
This morning, I came across news that one of the products being showcased at the 2013 Consumer Electronics Show is a horrifying ultrahigh-definition television set. New content will have to be created specifically for this device because it contains four times as many pixels as current-generation HD TVs (a difference that would allow viewers to see the veins on a leaf, for example…or on Lady Grantham’s hands, should she go without gloves). According to NPR, only about 50 films have been shot using an ultra-HD camera since 2004.
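For the curious, the "four times" figure checks out with some quick back-of-the-envelope math, assuming the new set uses the standard 4K ultra-HD resolution and current HD means standard 1080p:

```python
# Pixel counts: current-generation HD (1080p) vs. 4K ultra HD.
hd_pixels = 1920 * 1080        # 1080p HD: 2,073,600 pixels
ultra_hd_pixels = 3840 * 2160  # 4K ultra HD: 8,294,400 pixels

# Twice the width and twice the height means four times the pixels.
print(ultra_hd_pixels // hd_pixels)  # → 4
```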
I can imagine the reasoning behind the decision to introduce this 110-inch monstrosity relied on the idea that people love HD, so extreme HD must be the ticket to even greater success. More is more! That logic certainly applies to some things in the world, and not everyone despises the ultrareal experience as much as I do, but tech innovators are at risk of missing an important trend: the reintroduction of the human.
As a counterpoint to all things glitz, glam, corporate, and mass-produced, many of us have been gravitating for some time toward objects and experiences that are handmade, flawed, underground, and singular (hello, Etsy! hello, letterpressed and microbrewed everything!). The point isn’t necessarily to displace perfect objects completely, but neither is it to only fill our homes with handcrafted goods from Brooklyn. It’s more so to strike a balance, taking advantage of tech advancements while making sure we don’t lose touch with the human condition in the process.
In a December New Yorker piece, Trent Reznor discussed a nascent project — a collaboration with Beats Electronics that will launch a new kind of streaming music service as a follow-up to the likes of Spotify and Pandora. The idea behind the project (known informally as Daisy for now) is to offer suggestions to the listener through a combination of algorithms AND expert curation. According to Reznor, there was a need for this type of service because the first generation’s model “has begun to feel synthetic.” In another article, Reznor was quoted as saying Daisy would be a platform “in which the machine and the human would collide more intimately.”
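To make the algorithms-plus-curation idea concrete, here’s a toy sketch of one way a machine score and a human touch might collide. Everything here — the function name, the boost weight, the sample tracks — is my own illustrative assumption, not anything from Daisy’s actual design:

```python
# Hypothetical blend of algorithmic scores with a human curator's picks.
# A curator's endorsement nudges a track's score upward, so hand-picked
# songs can surface above purely machine-ranked ones.

def blend_recommendations(algo_scores, curator_picks, curator_boost=0.3):
    """Rank tracks by algorithmic score (0-1), boosting curator picks."""
    blended = {}
    for track, score in algo_scores.items():
        bonus = curator_boost if track in curator_picks else 0.0
        blended[track] = min(1.0, score + bonus)
    # Best-scoring tracks first.
    return sorted(blended, key=blended.get, reverse=True)

algo = {"Track A": 0.9, "Track B": 0.5, "Track C": 0.7}
picks = {"Track B"}  # the human curator loves Track B
print(blend_recommendations(algo, picks))  # Track B jumps ahead of Track C
```

The design point is just that the human signal enters as a weight alongside the algorithm, rather than replacing it — the "intimate collision" Reznor describes.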
In this new year, I’m looking forward to seeing more projects pushing in that same direction, more thinking that doesn’t thumb its nose completely at technology but that helps us reclaim some humanity in our digital lives.
Meanwhile, Daisy is set to come out in early 2013, and I will be watching The Graduate in glorious, romantic Technicolor on my regular old high-def TV tonight.
= = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = =
The PARAGRAPH Project is a marketing research and strategy firm based in Durham, NC. We are, at times, a strange brew. But this is what works for us — and inevitably, it works for our clients. The types of people who work at PARAGRAPH are strategists, anthropologists, artists, engineers, entrepreneurs, negotiators, students and builders. Herein lies our value. We are able to look at problems from many different perspectives and apply this diverse point of view to solutions for our clients. After all, if we conduct the same research in the same ways as our competitors, what advantage do we gain? By using old research methodologies in new ways and inventing new methodologies unique to each client’s research objectives, we quickly explore more territory to find insights often overlooked. We believe creativity is the missing link between useful information and actionable inspiration.
February 4th, 2011 by Gwen McCarter
This month’s edition of Fast Company reports the triumphant return of the matchbook-as-marketing-tool. You might be wondering what the big deal is, but this is exactly the type of trend I like to see.
For one thing, it demonstrates creativity — in this case, a reinvention of expectations and purpose. After all, with more and more states jumping on board the smoking ban bandwagon (35 of them, at last count), matches might have seemed as if they were on their way out. And we’re not talking about your big 250-count “strike anywhere” boxes, which will live on as long as we have gas stoves, power outages, candle-lit dinners, and wood-burning fireplaces. (By the way, has the design of that box ever changed?) No, we mean the diminutive boxes, books, and occasional tubes that we’ve all received over the years at restaurants, bars, hotels, weddings, fashion boutiques, and who knows where else. And after all this time, to breathe new life into something as seemingly functional as a matchbook certainly requires a little ingenuity. What we’re seeing here is a reevaluation of what the object means.
Of course matches “have a certain charm,” as Fast Company notes. But the imaginative part of this matchbook revamp is more than that, and it’s beautifully clear: these objects are tiny pieces of physical culture. Products of a particular time and place, they are not just a tool; they are, simultaneously, an experience.
Picture this: when a company gives me a matchbook with a distinctive design, they’re giving me an invitation to remember them over and over again, and to recall the (hopefully) amazing evening out I had at their new restaurant. I want to keep hold of those memories, and so I hang onto the matchbook. It has become infused with meaning of one sort or another, so I don’t throw it away as I might an event flyer. And by keeping the matchbook, I get reacquainted with the brand each time I use it. In short, these matchbooks are a way to reinforce unique and intimate brand experiences with a tangible, unmistakable token.
Not too shabby for an object that had allegedly seen its heyday come and go. It’s yet another inventive example of repurposing becoming all the rage.
January 5th, 2011 by Gwen McCarter
For weeks, I’ve been musing about what it means to be bold today. If you think this task has the potential to snowball into an awkwardly grandiose endeavor…well, you’d be right. And so, I’ve also been thinking about how to avoid that problem: It’s usually helpful to ask smaller, more pointed questions, seek fewer ostensibly comprehensive answers, and look around to see what’s in the air. All good tactics for evading pointless speculation, and for achieving something concrete and timely.
I suppose I’m a product of my experiences, and after all my years in school studying cultural difference, it’s clear that what I’m really interested in is not the definition of boldness, but rather how boldness gets enacted in the world. (If you have anecdotes of your own, post ‘em here!)
You see, this whole thing started a little while ago when friends of ours asked us to name a few of the Triangle’s boldest leaders. And what made that question interesting to me was the difficulty I had answering it. Of course, some aspects of boldness never seem to change: courage, defiance, and a rare ability to shake things up in a way that inspires others to do great things, too. But it also feels as if talking about boldness has become a more perplexing task.
Why? Simply put: because it’s easier to give the impression of being bold today without actually delivering. And as a result, the purpose of boldness has become murky. More than anything else, what’s missing is a greater emphasis on action. If there’s one sure thing about boldness, it’s that no one will know you’re a bold thinker if you aren’t a bold actor, too.
To illustrate the point, we need only think about noise. Chatter. A veritable din. We live in a society where more people are free to voice their opinions than ever, and everyone with Internet access also has a soapbox within reach. In many ways, this democratization via technology is empowering. And as Malcolm Gladwell wrote last October, it’s not our imagination that social media outlets such as Facebook, Twitter, and various blogging platforms are “making it easier for the powerless to collaborate.”
But Gladwell also warns against mistaking online activity for real-world action. The digital setting is often confusing because boldness online can feel both satisfying and effortlessly productive. If we want to use the example of activism, social movements that grow online can amass a follower base of millions. All the same, the palpable impact of those virtual efforts can be an entirely different story. Gladwell happens to cite the Save Darfur Coalition’s Facebook page as one place where participation is high but commitment and investment are relatively low (he puts group membership at nearly 1.3 million and the average donation at 9 cents). But the same could be said of a number of other initiatives — social media-based or otherwise — that don’t or can’t place enough emphasis on backing their bold online campaigns up with tangible follow-through.
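Gladwell’s two numbers imply a striking total, which is worth working out. (Rounding his "nearly 1.3 million" to an even 1.3 million is my approximation; the 9-cent average is his figure.)

```python
# Rough total raised, implied by Gladwell's Save Darfur figures.
members = 1_300_000       # "nearly 1.3 million" group members (rounded)
avg_donation_cents = 9    # average donation of 9 cents per member

total_dollars = members * avg_donation_cents // 100
print(total_dollars)  # → 117000
```

Around $117,000 from over a million "activists" — which is exactly his point about participation without commitment.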
So, for most everyone, it wouldn’t hurt to spend a little more time in action. At the same time, a single bold act cannot be your end game; it needs to be well conceived as part of a larger strategy, supported by other, more sustained initiatives.
For example, when it comes to bold fashion statements, the trick is to be arresting. Inciting people to discourse is a good thing. But that type of boldness still can’t stand on its own. There has to be more depth. Think about Lady Gaga’s raw meat dress from last September. That garment has earned its keep in the popular imagination — for better or for worse — but it’s not all the Lady herself has to offer. It’s just one part of a public persona that also includes hit songs, popular music videos, and sold-out live performances.
If there’s a lesson here, it’s that boldness needs a purpose. Being bold for its own sake might sound like a positive thing, but it should make sense in the grand scheme of your personality — or your brand’s personality, for that matter. Today, seizing someone’s attention with a stunt is not enough. You have to get people talking, get them moving, and keep them that way. So go ahead and experiment with being bold, but make sure you can keep the revolution going.
October 15th, 2010 by Gwen McCarter
Word on the street is that the new CEO-led Healthy Weight Commitment Foundation wants to cut 1.5 trillion calories per year from the U.S. marketplace by 2015.
What gives? Isn’t modern life supposed to be about having as many options as possible, not fewer?
The weird part about this is not that someone is trying to make the world a better place. (We’re cool with that one.) It’s that it seems like people aren’t being trusted to make their own decisions.
Will we soon enter a dystopian future where everyone is forced to eat exactly the same thing, with the fare simply disguised in such a way as to make us feel like we can still exercise some semblance of choice?
(I’m channeling the 1985 movie Brazil here in all of its satirical glory. Imagine the scene where the character Sam [actor Jonathan Pryce] is confronted with his lunchtime choices: eight piles of vibrantly colored slop. They might all be eye-catching, but they’re still just slop. Begrudgingly, Sam orders the number 3 — the “steak.” It arrives as three ice cream scoops of pink mush. Maybe that’s a step up from the green piles that form his mother’s number 8?)
Every five or ten years, there is a new taboo product that you’d better not even think about eating. Eggs, anyone? At one point, eggs had too much cholesterol. And then it was okay to eat egg whites, because the evil really lay in the yolk. Now, aside from the occasional salmonella scare, eggs are back in — yolk and all.
Do consumers want to be saved? Maybe so, especially when it comes to keeping the paint on children’s toys lead-free — but that’s a matter of protecting against threats that aren’t obvious to the naked eye. And the issue of how to best protect children is a nut that needs to be cracked somewhere else. I suppose the real question here is whether consumers want to be saved from themselves, which is exactly what limitations on unhealthy foods would ostensibly try to do.
What people desire are real options, not ten flavors of bland. With very few exceptions, we all have changing tastes, and we all want the ability to throw caution to the wind every now and then. Putting people in a dietary box is not the way to win hearts, minds, stomachs, or much else.
Last week’s Economist hit the proverbial nail on the head: “Kraft promises to reduce the sodium content of its North American products by an average of 10% by 2012. But will anyone eat them? It is individuals, not corporations, who hold the nation’s spoons.”
Just you try telling the millennial generation that you’re going to take enormous flat screen televisions off the market because they should spend their time doing something more constructive. Try restricting the signals on smart phones to only allow web access for a few hours per day, because people should interact more with other people, face to face.
Educating about healthy options is one thing, but prescribing lifestyle choices is an entirely different animal. Even the Cookie Monster has come out to say that cookies are a “sometimes food.” But the point is we can still have a cookie if we feel like it.
Go tell your customers that you know what’s best for them when it comes to food, technology, or anything else. And watch how fast the riots ensue.
August 25th, 2009 by admin
The relationship between any product and its customer can be broken down into a series of steps, starting with desire and ending with consumption. However, the number of steps and the level of consumer involvement at each step differ for every product.
Once you understand all the steps and the role of each, two forms of innovation can take place.
1. Remove Steps from the Process. This is the option that gets all the ink. Making things simpler for customers is the most basic form of innovation. Take a walk down the beer aisle and you’ll see what we mean. The twist-off cap and the fridge pack are two examples. These innovations were born from the search for inefficiencies or unnecessary motions that could be removed from the process.
2. Add Steps to the Process. Although often overlooked, savvy marketers realize it can sometimes be beneficial to add steps to the process. Trader Joe’s, Target and Fresh Market actually added an extra step for beer customers by allowing them to assemble their own six-packs. It takes more time and adds complexity to the shopping process. But it creates new value for customers.
So, there you go. Two simple paths to innovation. Now you give it a shot.
July 1st, 2009 by admin
There has been some healthy debate surrounding Chris Anderson’s latest book, Free. I’d like to join in and give the world a definitive answer regarding technology’s ultimate impact on the pricing of intellectual property, but I left my crystal ball at the track.
A not-so-subtle fact lost in this debate is that price is not the same thing as cost.
Every consumer decision – even those without a price attached – comes with a cost. Sometimes it’s lost time. Sometimes it’s a drop in status. Sometimes it’s a lack of convenience. Sometimes, in the case of most free online services and applications, the cost is allowing yourself to be exposed to ads. Or having to deal with a lackluster user experience, wasting time digging for what you’re looking for. (Yeah, I’m talking to you, YouTube.)
There are thousands of free products and services available to the American consumer right now. Many won’t survive despite the too-good-to-be-true price.
Pricing models are important. But giving your product away doesn’t guarantee success. What’s more important is making sure the value of your product exceeds the costs to consumers (both monetary and otherwise).
Sometimes free costs too much.
May 15th, 2009 by admin
The one thing that the whole Miss California dust-up has taught us is that controversy is powerful… sometimes more powerful than competence.
Her divisive statements forced people to take sides. They made those passionate about the issue speak up and join in the conversation.
Miss (long-forgotten) North Carolina was the most competent of the contestants. She left with the crown. But Miss California will be leaving with more speaking engagements and endorsement deals – for better or worse.
Being competent is rarely enough.
April 9th, 2009 by admin
“I don’t have a Twitter, a Facebook or anything like that. I kind of value people not knowing where I am or what I’m doing.” – Zac Efron
Marketers sometimes forget that trends aren’t unidirectional. It’s a lot like Newton’s third law: when the public runs off in one direction, there is usually an equal and opposite movement simultaneously taking place.
As Twitter and Facebook continue to get a lot of ink, keep your eyes open for a growing movement toward privacy and anonymity. Being un-famous, un-trackable, and un-reachable may become the most desirable thing in the world.
April 3rd, 2009 by admin
Over the past couple of decades, it seems we’ve evolved our market research practices to weed out respondents with extreme biases. We don’t want to include anyone in our research who rejects our product or uses too much of it. We don’t want anyone who is too shy or too talkative. Too young or too old. Too savvy or too inexperienced.
We go to extreme lengths to capture the opinions of the “average” customer. However, now more than ever, it’s the biased customer – not the average customer – that is driving our businesses.
Morgan Spurlock was biased. He never would have met the focus group criteria. But his film arguably changed how McDonald’s does business more than any other piece of consumer research that company has conducted over the past 50 years.
Marketers who try to eliminate biases from their research are sacrificing inspiration for consensus.
Each marketer has to ask themselves this key question: Would I rather hear one uniform opinion from eight identical people or eight different opinions from eight different types of people?
To truly come up with innovative solutions and ideas, we need to find mechanisms for harvesting diverse viewpoints.