Our July 2021 Facebook Ads Tests

By Antoine Dalmas | Facebook Ads Guide

Every month, our expert team of Media Buyers performs a series of Facebook Advertising Tests to offer you expert tips on how to optimize your campaigns and strategies.

To be the first to receive our test results, subscribe to our iOS 14 Squad Newsletter! Click here to subscribe for free and begin taking advantage of our expert tips.

iOS Placements: July 30, 2021

Here is our latest test, delivered on July 30, 2021.

Across a panel of 12 clients (lead gen & e-com), we noticed an interesting pattern following the rollout of iOS 14.

In April, 6 out of 12 clients were spending more money on Android mobile than on iOS.

In May, it was 7 out of 12.

In June, 10 out of 12 clients spent more money on Android phones than on iOS.

And from May to June, Android CPMs rose by an average of $2, while iOS CPMs increased by only $0.50.

On our side, we did absolutely nothing to make that happen; we were still on Automatic Placements, so it was the platform that made the call. And we can safely assume we're not the only ones seeing it.

As we know, Facebook can be lazy sometimes: it keeps hammering on whatever has worked before, even when it no longer does. So we need to know how to take back control and force it to try something different.

Unlike iOS, Android retains all of its attribution and tracking advantages, so it's natural that Facebook generates more results there and concentrates spend on that OS.

But despite all this, Apple isn't going away. Better still, its inventory is getting a lot cheaper!

What if we gave Facebook no choice, and launched 100% Apple campaigns with no Android at all?

Here are the results of this test from July 30, 2021.

--------------------------

TEST #1 - APP CLIENT - 7 DAY TEST.

Context: We tested this initiative for a client with a mobile application.

So we spent a week running an app install campaign, where the goal was to generate as many installs as possible at the lowest cost.

Results: After spending almost $100, we generated only 6 installs at a CPI of roughly $16, which is far too high. Our usual CPI is about $4.50.
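For reference, here is how the cost metrics quoted throughout these write-ups break down. This is a minimal sketch in Python (the function names are ours, purely for illustration), using this test's numbers as the example:

```python
# Minimal sketch of the cost metrics used throughout these tests.
# Names are ours, for illustration only.

def cost_per_result(spend: float, results: int) -> float:
    """CPI, CPL, or CAC: total spend divided by installs, leads, or customers."""
    return spend / results

def roas(revenue: float, spend: float) -> float:
    """Return on ad spend: attributed revenue divided by spend."""
    return revenue / spend

# Test #1's numbers: almost $100 spent for 6 installs.
print(round(cost_per_result(100, 6), 2))  # ~16.67, i.e. the ~$16 CPI above
```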

Conclusion: We do not approve this test because the CPI is too expensive.

-------------------------------------------------------------------------

TEST #2 - E-COM CLIENT - 6 DAY TEST.

Context: In June, we ran a 100% iOS campaign for an e-commerce client for 6 days, at a cost of about $150 per day.

Results: We ran into many problems with Facebook's reporting that made our job more difficult. But from what little we could see, this campaign only ever matched our existing one and never surpassed it.

Conclusion: We do not approve this test.

-------------------------------------------------------------------------

TEST #3 - E-COM CLIENT - 2 DAY TEST.

Context: We ran an iOS-only campaign in competition with our Evergreen campaign, using the same ad sets but targeting all devices. We found that the CPM was very low with the iOS-only audience, but sales conversions did not follow.

Results: After €380 of spend, the iOS-only CPM came out to €1.41, vs. €2.46 for the all-devices campaign (with the same spend). But the ROAS was 1.1 points lower!

Conclusion: We do not approve this test.

-------------------------------------------------------------------------

TEST #4 - E-COM CLIENT - 7 DAY TEST.

Context: We already had a website Conversion campaign with a Purchase objective and wanted to compare its results with a campaign that targeted iOS phones only.

Results: The campaign generated no sales, and its other KPIs were not very good either. So we paused it on July 12, after spending $110.

Conclusion : The test was inconclusive because no sales were generated.

-------------------------------------------------------------------------

This is a big blow for the iOS 14 Squad, as this is now the second test we do not recommend repeating. It makes no sense to force Facebook to spend money on iOS devices, even if the CPM is extremely low…

We do, however, draw a lesson from this. The other failed test involved the attribution window (1-day click). So there is little point in running tests that force Facebook to work with less volume.

Our work must now focus on testing creatives, funnels, campaign objectives, and other such levers!

1-Day Click Attribution: July 23rd, 2021

Here are the results of our third test as of July 23rd, 2021.

As you must know, Facebook is an intelligent platform.

At least, that's what you'd think: when you ask Facebook to optimize for clicks, that's what it does. The same goes for add to carts and purchases.

But what if we ask the platform to shorten the customer buying journey?

That’s what we asked ourselves when 1-day click attribution was implemented.

With this option, a conversion only counts if it happens within 24 hours of the click. When you're used to 28-day click & view windows, such a prospect is quite scary!

But by choosing this option, are we telling Facebook to look for people who will convert within 24 hours, and convincing the platform that our promotional offer is appealing enough to accomplish such a mission?
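To make the mechanics concrete, here is a minimal sketch of how a click-based attribution window works: a conversion is only credited to the ad if it happens within N days of the click. This is our own illustration of the concept, not Facebook's actual implementation:

```python
from datetime import datetime, timedelta

def is_attributed(click_time: datetime, conversion_time: datetime,
                  window_days: int) -> bool:
    """Credit the conversion to the ad only if it happened within
    `window_days` of the click."""
    delay = conversion_time - click_time
    return timedelta(0) <= delay <= timedelta(days=window_days)

# Hypothetical example: a purchase made ~45 hours after the ad click.
click = datetime(2021, 7, 1, 12, 0)
purchase = datetime(2021, 7, 3, 9, 0)

print(is_attributed(click, purchase, window_days=1))  # False: lost under 1-day click
print(is_attributed(click, purchase, window_days=7))  # True: counted under 7-day click
```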

As you’ve certainly guessed by now: For our 3rd experiment, we thought we’d put 1-day attribution to the test.

Here are the test results for several different clients as of July 23, 2021.

-----------------------------

TEST #1 - E-COM CLIENT - 3 DAY TEST.

Context: We put a 7-day click, 1-day view campaign in competition with a 1-day click campaign. Both campaigns had the same audience, creatives, budget, and so on. We used a Broad audience (no interests) and a budget of $75/day.

Results:

1-day click:

  • Investment: $212
  • Add To Carts: 4 at $35.43
  • Purchases: 0
  • ROAS: 0

7-day click, 1-day view:

  • Investment: $141
  • Add To Carts: 49 at $4.34
  • Purchases: 4

Conclusion: After nearly 72 hours of delivery, it is clear that the 7-day click, 1-day view attribution window was better. In fact, the results of the 1-day click campaign were 88% poorer ($35.43 per add to cart vs. $4.34).

-------------------------------------------------------------------------

TEST #2 - E-COM CLIENT - 8 DAY TEST.

Context: For this client, we ran a campaign with a Purchase objective - 1-day click. Our goal was to generate as many purchases as possible the day of the ‘click.’

Results: After spending more than $300 on this campaign, we generated 0 purchases and 0 add to carts.

Conclusion: We didn’t succeed in generating any results, despite our elevated budget. We do not recommend this test.

-------------------------------------------------------------------------

TEST #3 - E-COM CLIENT - 4 DAY TEST.

Context: We created a new campaign similar to the campaign we were already running for this client, except we changed the attribution window to 1-day click and also opted for a very different budget.

Results: The results of this new campaign are currently lower than those of the 7-day click/1-day view campaign (a ROAS difference of 0.34 points).

Conclusion: We do not recommend this test.

-------------------------------------------------------------------------

TEST #4 - E-COM CLIENT - 4 DAY TEST.

Context: The results of this test were so bad that we ran the initial campaign for less than 24 hours. We then created a 1-day click ad set within a CBO campaign instead, and the results still weren't great.

Results:

  • 1-day click: Spend = 248; ROAS = 0.97
  • 7-day click: Spend = 350; ROAS = 2.37

Then, after switching to the CBO ad set:

  • 1-day click: Spend = 178; ROAS = 1.13
  • 7-day click: Spend = 436; ROAS = 2.55

Conclusion: We do not recommend this test.

-------------------------------------------------------------------------

TEST #5 - E-COM CLIENT - 8 DAY TEST.

Context: We had a campaign that had been performing quite well but suffered a significant drop in results. We decided to test a new campaign (similar to the last) with a 1-day click attribution model.

Results: We spent a total of $147.03 and generated only 1 purchase. The ROAS was less than 1, so we stopped this campaign earlier than intended (July 5th).

Conclusion: Failure.

-------------------------------------------------------------------------

This is our third test and the first time the results are entirely unanimous: we do not recommend this strategy.

Unlike with conversion objectives, it appears that Facebook cannot optimize for an attribution window. Especially since, while we were running our tests, the attribution for iOS 14 opted-out users changed from 1 day to 7 days, which had an immediate (and positive) impact on our campaigns.

That being said, what can we learn from this test more globally?

Attribution is the key to success.

You need to find ways to keep tracking your sales and attributing them to specific components of your advertising. Anything that gives you a longer view is incredibly beneficial, whether it's UTM parameters or a cohort-based attribution system.
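As an example of the UTM approach, here is a minimal sketch of tagging a landing-page URL so that your analytics tool can tie sales back to a specific campaign and creative. The URL and parameter values are hypothetical:

```python
from urllib.parse import urlencode

def with_utm(base_url: str, source: str, medium: str,
             campaign: str, content: str) -> str:
    """Append standard UTM parameters so sales can be attributed
    to a specific campaign and creative in your analytics tool."""
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
        "utm_content": content,  # e.g. which creative or ad variant
    })
    return f"{base_url}?{params}"

# Hypothetical example:
print(with_utm("https://example.com/offer", "facebook", "paid_social",
               "summer_sale_broad", "video_ad_v2"))
# https://example.com/offer?utm_source=facebook&utm_medium=paid_social&...
```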

Broad Audience: July 16th, 2021

As promised, here are the results (as of July 16th, 2021) of our latest test.

Among all of the issues brought about by iOS 14, the ones that troubled us the most were those related to Lookalike audiences. 

With iOS 14 looming over us all, we immediately noticed that our Lookalike audiences simply weren’t converting as much as usual...

...and there’s a logical explanation for that!

With the Facebook Pixel losing performance points, Custom and Lookalike audiences were bound to follow suit. That’s why we asked ourselves the following question: Wouldn't Facebook perform better without any targeting limits in place?

The platform can no longer share all of the information we need...but that doesn't mean it can't still access it.

With that in mind, we decided to test Broad audiences for several client campaigns – no interest-based or geographical specifications. We wanted to see if Facebook would be able to outperform one of its most important tools.

Here are the results as of July 16th, 2021.

----------------------------------------------------

TEST #1 - E-COM CLIENT - 10-DAY TEST.

Context: The first client we launched this test for was an e-commerce client (that sells only 1 product) for a Conversion (Purchase objective) campaign. We tested our Broad audience against 2 Lookalike audiences of prospects that had already converted (3% and 6%). 

Results: The Broad audience held up for about 10 days and used up $9,200 in budget (64% of our total budget), generating a CAC of $71.06, well above the $55 limit set by our client. The two other audiences hit CACs of $53.58 and $37.97, respectively.

Conclusion: Inconclusive test.

----------------------------------------------------


TEST #2 - LEAD GEN CLIENT - 1 MONTH TEST.

Context: For this Lead Gen client, we tested a Broad audience against 2 large interest-based audiences.

Results: Very quickly, the Broad audience received the majority of our budget, hitting a CPL of €13.10 for over €2,900 in spend. In comparison, our highest-performing audience for this client hit a CPL of €11.39 for €1,400 in spend. Overall, the Broad audience was certainly more expensive, but it kept getting better week after week. The month after we initially launched this test, the Broad audience hit a CPL of €12.13, making it the highest-performing audience for this campaign thus far.

Conclusion: This test certainly proved to be more nuanced. The Broad audience initially seemed less effective, yet it allowed us to scale the campaign by over 76% in just one month. We were thus able to centralize all of our efforts during a period in which all of our other audiences were becoming more costly.

----------------------------------------------------

TEST #3 - E-COM CLIENT - 20-DAY TEST.

Context: For this client, we decided to pit a 5% Lookalike audience (previous buyers) against a 100% Broad audience with no specific targeting filters (interests, location, etc.) for a Conversion campaign with a Purchase objective.

Results: During the first few weeks, both ad sets performed quite well and generated similar CPAs. Admittedly, the Lookalike ad set achieved a better ROAS. However, shortly after, the campaign ran out of steam, and we saw a drop in results for both ad sets.

Conclusion: Upon reflection, both audiences work well together. One generates a profit while the other provides us with plenty of new customers for our retargeting campaigns. We recommend this test!

----------------------------------------------------

TEST #4 - E-COM CLIENT - 7-DAY TEST.

Context: We tested a Broad audience against a 3% LAL Purchase audience. The campaign also had a stacked remarketing audience. 

Results: The Broad audience received a budget of €2,185, compared to €1,158 for the 3% LAL Purchase audience, and achieved a ROAS almost 1 point greater.

Conclusion: We recommend this test! It outperformed one of our best audiences for this client. 

----------------------------------------------------

TEST #5 - LEAD GEN CLIENT - 14 DAY TEST.

Context: For this client, we tested a Broad Audience against a 3% Lookalike audience of prospects who had previously registered for our client’s webinar.  The campaign also had a stacked remarketing audience. 

Results:

  • Broad audience: €3,023 budget; CPL = €26.06
  • LAL audience: €4,158 budget; CPL = €21.66

Conclusion: We won’t be performing this test again for this client. 

----------------------------------------------------

TEST #6 - E-COM CLIENT - 14 DAY TEST.

Context: For this subscription-based e-commerce client, we tested a Broad audience against an Interest-based audience that historically has worked very well for us. 

Results:

  • Broad audience: $1,599 budget
  • Interest-based audience: $2,115 budget

The Interest-Based audience’s ROAS was slightly higher than that of the Broad audience, but not by much.

Conclusion: Inconclusive test for both audiences. We would have to redo this test using another campaign and offer.

----------------------------------------------------

TEST #7 - E-COM CLIENT - 7-DAY TEST.

Context: For this e-commerce client, we put a LAL audience against a Broad audience (no interests) using CBO. Except for the audience, all other campaign elements were identical.

Results:

Lookalike audience:

  • Investment: $9,060
  • Purchases: 41 at $220 each

Broad audience:

  • Investment: $10,696
  • Purchases: 48 at $222.84 each

The Lookalike audience’s ROAS was slightly higher than that of the Broad audience, but not by much.

Conclusion: Inconclusive test; both audiences yielded very similar results.

----------------------------------------------------

TEST #8 - E-COM CLIENT - 7 DAY TEST.

Context: We have an Evergreen campaign for this client that has been running for several months. The campaign promotes the client's products (at full price!), with the products changing every season. We've also been testing a Broad audience against a 6% Lookalike audience (Buyers) for several months now.

Results:

BROAD 

  • Amount Spent: $6,823.71
  • ROAS: 3.07
  • Purchases: 182 
  • CAC: $37.49

LOOKALIKE 6%

  • Amount Spent: $13,413.22
  • ROAS: 2.54
  • Purchases: 249
  • CAC: $53.87

Conclusion: Upon reflection, we can safely say that this test worked. We recommend testing a broad audience vs. your best Lookalike audience. This will allow you to reach two different audiences over a long period of time, if needed.

---------------------------------------------------

We also conducted 4 other tests, which yielded the following results:

  • 2 passed the test (Clients: E-commerce & Infopreneur)
  • 1 failed (Client: Lead Gen)
  • 1 was inconclusive (Client: E-commerce)

As you can see, we were very curious to see if Broad audiences would be a saving grace for us, which explains why we conducted so many tests.

Based on the results we presented to you today, we will continue to test Broad audiences for different clients and campaigns and we encourage you to do the same!

In our opinion, the best way to determine whether Broad audiences will work for you will be to put audiences that already work well for you (that you use regularly) in direct competition with new Broad audiences.
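Here is a minimal sketch of how such a head-to-head could be scored once both audiences have spent real budget. The 20-conversion threshold and the names are our own assumptions, not a rule taken from these tests:

```python
def compare_audiences(control: dict, challenger: dict, min_conversions: int = 20) -> str:
    """Compare a proven audience against a new Broad audience on cost per
    acquisition, refusing to call a winner on too little data."""
    for aud in (control, challenger):
        if aud["conversions"] < min_conversions:
            return f"Keep testing: {aud['name']} has too few conversions to judge."
    cpa = {aud["name"]: aud["spend"] / aud["conversions"]
           for aud in (control, challenger)}
    winner = min(cpa, key=cpa.get)
    return f"{winner} wins on CPA (${cpa[winner]:.2f})"

# Using Test #8's spend and purchase figures from above:
print(compare_audiences(
    {"name": "Lookalike 6%", "spend": 13413.22, "conversions": 249},
    {"name": "Broad", "spend": 6823.71, "conversions": 182},
))  # Broad wins on CPA ($37.49)
```

If the two CPAs land very close together, treat the match-up as inconclusive, as several of the tests above did.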

Lead Ads: July 9th, 2021

As promised, here are the results of our first test, which ended July 9th.

To give you a bit of context, we realized rather quickly that iOS 14 affects the Facebook pixel and Conversion Lead campaigns across all clients.

Costs have increased for VIP Lists, Lead Magnets, and other such campaigns for no good reason (well, we know why…), which is why we wanted to find answers to this problem first.

Faced with a Facebook pixel and campaigns that aren’t performing/converting as well, we easily decided on our first ‘test subject’: Lead Ad Campaigns.

While Lead Ad campaigns can be less appealing and less complete than landing pages, they're actually much more effective in dealing with the challenges of iOS 14. Because Lead Ads are 100% integrated into Facebook and don’t require prospects to leave the platform, you don’t lose any conversion tracking data.

Please note that the attribution setting for these campaigns is 7-day click (and not 1-day view).

Below are the test results for several different clients up until July 9th, 2021.

———————————————————————————

TEST #1 - LEAD GEN CLIENT - 7 DAY TEST

Context: For this client, we already had a Lead Generation campaign for a free webinar that redirected prospects straight to the client's website. It was performing quite well, but we wanted to acquire more leads. So, we launched a Lead Gen campaign directly on Facebook.

Results: After spending $40 on this campaign, we only acquired 1 lead on Facebook. Thus, we decided to pause the campaign rather than let it run until the end.

Conclusion: Inconclusive. We were able to generate much better results on the website directly.

———————————————————————————

TEST #2 - E-COM CLIENT - 5 DAY TEST

Context: We ran a Lead Ad campaign alongside a Conversion campaign that historically cost $4/lead.

Results: After just 5 days of spending the same amount of budget for both campaigns, the Lead Ad had a CPL of $4.06 and the Conversion campaign a CPL of $7.68.

Conclusion: Conclusive results that speak for themselves.

———————————————————————————

TEST #3 - LEAD GEN CLIENT - 14 DAY TEST

Context: For this client, the Dynamic Ad Campaign and Standard campaign (with a lead conversion objective) we were running were beginning to fail. As costs began to add up, we knew we had to test a Lead Ad campaign.

Results: With the same budget, the Lead Ad campaign cost $3.91/lead, versus $5.04/lead and $6.93/lead for the other two campaigns.

Conclusion: Conclusive test.

———————————————————————————

TEST #4 - LEAD GEN CLIENT - 2 MONTH TEST

Context: For this particular client, we decided to test out a Lead Ad campaign back in May after our Lead Magnet campaign began to die down.

Results: The average cost for our Lead Ad Campaign was $22.33/lead, while our Magnet - Broad Audience Campaign cost $25.34/lead. Our old Evergreen Campaign (no longer running) cost $28.63/lead.

Conclusion: Conclusive test.

———————————————————————————

TEST #5 - LEAD GEN CLIENT - 1 MONTH TEST

Context: For this client, we decided a little competition wouldn't hurt. We decided to launch a Lead Ad campaign to see how it would perform compared to our existing Lead Conversion campaign.

Results: Over the last 30 days, the Lead Conversion campaign generated 12 leads at $82.88 per lead (about $1,000 in ad spend). This campaign is no longer running. In comparison, the Lead Ad campaign generated 12 leads at $51.25 per lead, for only $615 in spend.

Conclusion: Conclusive test. We’re keeping the Lead Ad campaign!

———————————————————————————

Based on these results, we are 95% sure that you should try this test out for yourself.

Why only 95%? Two caveats:

  • Our testing schedule varied depending on the client/test (from 1 week to 2 months)
  • The majority (almost all) of the clients we performed these tests for are Lead Gen clients


Subscribe to the iOS 14 Squad and receive our tests for free now!