www.TrendsInAdvertising.com
Brought to you by Communicus
Design for Your Viewer and Their Goals
Mar 18


The rise of infographics, scorecards, dashboards and the need for clarity in reporting has made data visualization one of the most important analytical topics of the last few years. At Communicus, we’ve thought hard about visualization and how to employ it as effectively as possible. A focus on the audience and their goals is a great place to start any design project. Understanding the way visualized data is going to be used is critical for creating the highest-impact and most engaging reports and presentations. Below, we highlight three possible goals that our typical audiences might have in mind when engaging with data: Decision-making, Discovery, and Drama.

Decision-making

We all know that in many industries decision-making has become increasingly data-driven over the last few decades. In this time, we’ve also seen a massive increase in the amount of data available to help inform these decisions. Visuals for decision-making are most powerful when they provide all of the most relevant data in a format that is as simple as possible. KPI scorecards are often a good option for this goal. They typically provide a structure that allows for a high density of critical information, while keeping the visualizations themselves streamlined, predictable, and intuitive.

Discovery

Data discovery is the use of visualizations to aid in data analysis. It has become increasingly clear over the last decade that visuals are not simply for displaying information to others, but for better understanding the data you have gathered. Discovery-focused visuals should be dynamic, interlocking, and should allow the analyst using them to move quickly from insight to insight to get to answers and solutions. When designing for discovery, focus on maximizing ease of transition between complementary visualizations, and provide as streamlined a user experience for interactive options as possible.
Drama

Visuals for storytelling are critical for conveying insights based on complex analytics to those who are less quantitatively inclined, have less exposure to data, or are simply more interested in the big picture than the minutiae of a particular data set or project. Used correctly, graphs and charts can bring your data to life and add critical drama to the story you are trying to tell. Visuals for drama are also often the most difficult to make well. As Tufte and others stress in their work, when designing for storytelling, graphs should maximize the data displayed and minimize non-data (Tufte calls this the data-ink ratio). While it can often be tempting to add aesthetic bells and whistles to our visuals, these almost always have the effect of diminishing the story being told by the...
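For readers unfamiliar with the term, Tufte’s data-ink ratio mentioned above has a simple standard definition (paraphrased here from The Visual Display of Quantitative Information):

```latex
\text{data-ink ratio} \;=\; \frac{\text{data-ink}}{\text{total ink used to print the graphic}}
```

Maximizing this ratio, within reason, is the practical meaning of “minimize non-data”: erase ink that carries no data, and erase redundant data-ink, before reaching for decoration.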

The Perils of Copy Testing in Today’s Advertising Environment
Mar 16


Advertisers spend millions of dollars copy testing ad executions before in-market launch, often testing in rough stages to avoid producing a commercial that could be a weak performer. This research investment is widely regarded as prudent, ensuring that money isn’t wasted on advertising that may not perform. However, for the copy testing investment to be sound, it has to result in good decisions – which means that the predictive metrics provided must be solid. Unfortunately, the underlying principles upon which the major copy testing systems are built are no longer suited to today’s advertising environment. This mismatch between how advertising works and what copy testing measures has resulted in unacceptably wide predictive margins of error. Copy testing can identify some of the losers, but it gives far too many ill-performing ads a passing grade, resulting in millions, even billions, in wasted ad dollars. Click HERE for...

Copy Testing Without Accountability
Oct 22


Most advertisers would love to know in advance whether the new ads in which they invest so much are actually going to work in-market. So they copy test, often testing several different concepts, selecting a winner, and then fine-tuning on the basis of the diagnostic feedback they gain. But how well does copy testing work?

Two of the larger copy testing systems have been developed and refined – reverse engineered, really – on the basis of those companies’ in-market measurement systems. For reasons discussed below, there is a fairly wide margin of error in the predictive accuracy of these systems, but advertisers who use them do at least improve their odds of success.

Then there is the rest of copy testing – the non-accountable systems. Copy tests that claim to predict how well an ad is going to work in-market (either via traditional survey designs or biometric measurement) but have no in-market feedback loop cannot provide any degree of confidence in their predictions. If you keep testing ads, keep picking winners based on better scores on your metrics, but haven’t validated and refined your algorithms, how can a client be sure that your system works?

The ‘diagnostics’ provided by these non-accountable copy testing systems are equally suspect. The supplier can tell you with a high degree of confidence how to make the ad better. But if ‘better’ means it will score higher in their system, and scoring higher in their system has no apparent correlation with in-market success, you’re refining to the copy test, not to the real world.

Based in part on frustration with traditional copy testing, another new category of diagnostic systems has emerged. Because we know that advertising often works most powerfully on an emotional level, advertisers have been experimenting with brain wave analyses, facial expression tracking and other means of observing how ads make people feel.
Some have found this feedback to be of value as they seek to improve their ads’ ability to connect and persuade. These emotion-based measurement systems are, however, a long way from providing predictions of in-market performance, with most not even attempting to build quantitative links to actual in-market results.

But even the copy testing systems that do have the feedback/refinement loop struggle mightily to provide reliable predictions. You’d think it wouldn’t be so hard – Will this ad engage? Will the brand be remembered? Will it persuade? Copy testing fails in large part because it does not acknowledge that the world has changed. First, people don’t watch TV the way they used to – there are fewer families gathered around the TV watching together,...

Connecting consumer preconceptions and advertising engagement
Oct 02


Why is it that some individuals choose to engage with a particular ad while others ignore it altogether? That’s the million-dollar question. Creators of advertisements strive to ensure that their ads capture the attention of the widest possible swath of consumers within their target audience. That said, not all consumers are created equal when it comes to the likelihood of engaging with a specific ad to which they have an exposure opportunity. Click HERE for...

Will Your New Creative Approach Succeed In-Market?
Sep 24


A recent article in Ad Age discussed an issue we’ve been hearing a lot about from clients lately – the problems with old copy testing methods in a changing world. The topic of copy testing has always been polarizing. One camp firmly believes in copy testing’s ability to pick winners, provide actionable diagnostic feedback, and thus mitigate risk and ensure advertising that works in-market. The other camp has always firmly believed that copy testing isn’t actually very successful at predicting winners, instead rewarding formulaic advertising and stifling creativity.

Dramatic shifts in the world of advertising have put even more pressure on copy testing: advertisers need more rapid feedback in a fast-moving world, and the expansion of online branded content requires ever more copy to be produced, including copy that is highly customized for specific audience segments. Many campaigns that have been copy tested fail to produce in-market success, lending credence to the argument that copy testing doesn’t work all that well. (Of course, this raises the question of how much worse the overall advertising environment might be if copy testing didn’t exist…) On the other hand, there are countless non-tested approaches that crash and burn (or fizzle and die) when they hit the market, suggesting that not all brave anti-copy-testing senior executives are all that prescient either.

It would be great if all advertising decision makers had the intuition and foresight to identify the potential power of ‘Just Do It’ – to name just one of the countless non-copy-tested classics. This individual would also need to be able to identify with certainty those ideas that look good on paper but will fail to connect – either because they are actually small, non-breakthrough ideas or because they are clever but don’t speak effectively to the hearts and minds of the target audience.
But, unfortunately, the kind of advertising wisdom and judgment that can make those calls – not to mention the senior management culture that’s okay with a no-copy-testing scenario – isn’t found in many companies. I would suggest that the odds of launching a successful new campaign – or even of producing strong executions within an existing campaign – can be improved dramatically by advertisers who take a more disciplined approach to examining and learning from past in-market results, both successes and failures. Examining actual marketing results in depth, and documenting the foundational learning that results, can lead to both dramatic and incremental improvements that are highly predictive of both short-term and long-term advertising success. No, I’m not talking about simply re-executing what’s worked in the past. To the...

TV Commercials: Why Creative is King
Apr 10


Of course it’s important to create great ads; everybody knows that. Why, then, is the most common theme among advertisers who want better sales results the belief that they need a bigger ad budget? Some believe that their advertising can’t be expected to work all that well because they’re being outspent by the competition. Additionally, much of advertising research is focused on counting and optimizing exposure opportunities, and attempting to correlate exposures with sales results. Most advertisers and ad researchers behave as if there’s more power in how much you spend and where you spend it than in the creative that you get in front of those eyeballs. Nothing could be further from the truth.

Communicus norms for the branded engagement achieved by advertising campaigns across a wide range of spending levels make this clear. The ‘average’ $20mm campaign achieves branded engagement (the first step toward reaching sales goals) of 53%. An advertiser who invests $20mm in media for a multimedia campaign with great creative can achieve branded engagement of 80 to 85%. Other, less successful advertisers who spend the same $20mm achieve branded engagement of less than 10%! Increase spending by 50%, to $30mm, and the average campaign shows a 6-point bump in awareness. A doubling of the ad budget nets the advertising another 5-point gain. All told: double your ad budget to gain 11 points, or spend the same amount on your campaign, improve your creative, and potentially achieve 800% better results. The choice is obvious.

When it comes to the creative power of TV specifically, three years of Communicus Super Bowl findings confirm that executional excellence is paramount. Only one in five Super Bowl ads persuades those who engage with the commercial to change their behavior and behavioral intentions.
The fact that the other four out of five ads don’t trigger the same results isn’t because of the media buy, the position during the game, or placement within the commercial pod. It’s because the creative wasn’t persuasive. When it comes to engagement, Super Bowl commercial performance ranges from a low of 10% to a high of 74%. Branding ranges from 9% to 99%. And these differences can’t be explained by commercial position either – even in 2014, a blow-out game, commercials that aired in the fourth quarter were just as likely to succeed as those that aired at the beginning of the game. The difference between success and failure is the commercial creative itself. Our advice to advertisers and ad researchers: Stop asking, “How much should we spend to get the results we want?” and start asking, “How can we make the dollars that we are spending...
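The spend-versus-creative arithmetic quoted above can be sketched as a quick back-of-the-envelope check. The figures are the norms cited in this post, with the quoted gains read as percentage points; the variable names and structure are purely illustrative:

```python
# Back-of-the-envelope check of the spend-vs-creative comparison.
# All figures are the norms quoted in the post; reading the awareness
# gains as percentage points is an interpretive assumption.

AVG_ENGAGEMENT_20MM = 0.53   # 'average' $20mm campaign
GREAT_CREATIVE = 0.825       # midpoint of the quoted 80-85% range
WEAK_CREATIVE = 0.10         # "less than 10%"

# Spending path: +50% budget adds ~6 points, doubling adds another ~5.
bump_at_30mm = 0.06
bump_at_40mm = 0.05
engagement_doubled_budget = AVG_ENGAGEMENT_20MM + bump_at_30mm + bump_at_40mm
# roughly 0.64 - an 11-point gain for twice the spend

# Creative path: same $20mm budget, better creative.
relative_gain = GREAT_CREATIVE / WEAK_CREATIVE
# roughly 8x weak-creative performance - the "800% better" in the post

print(f"Doubled budget: {engagement_doubled_budget:.0%}")
print(f"Creative uplift vs weak creative: {relative_gain:.1f}x")
```

Under this reading, doubling the budget moves the average campaign from 53% to about 64% engagement, while fixing the creative at the original spend can move a bottom-tier campaign to the 80%+ tier.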
