I did not get the (complete) results I was expecting! Of course, that’s why we test, isn’t it?
As you recall (if not, see part 1 and part 2), I went into this test with the 3 leading ads from the previous round:
- Paying too much for SEO? Not getting results? Download the action plan and start saving 95%. bit.ly/UcGbHq
- How to Save 95% on SEO: bit.ly/UcGbHq
- Paying too much for SEO? Not getting results? Download the action plan. Save 95% compared to SEO services. bit.ly/UcGbHq
Then I reduced the audience size by cutting out followers of certain sources that appeared to be too generic. (This action is likely what upset the purity of the results. After all, I was futzing with a second variable.)
So What Happened With This Test?
My overall engagement rate stayed the same as the previous round at 0.40%. The best ad remained the same, #1 as shown above. But this time it performed much better than the other two ads, with greater than 90% confidence over #3 and more than 95% confidence over #2. Clearly ad #1 is the best ad, having come out on top through two rounds of testing.
Twitter congratulated me for this ad. See the nifty picture from the email they sent me. It’s telling that an engagement rate of 0.40% is considered a good number.
Where this test fell short of my expectations was that the cost per engagement went up instead of down. It nearly doubled, with the CPE ending up at $1.05. I spent $40, made 9,475 impressions, and got 38 engagements. But ZERO CLICKS to my website!
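If you want to check the math, cost per engagement is just spend divided by engagements. Here is a quick back-of-the-envelope sketch using the numbers above (the variable names are mine, not anything from Twitter's reporting):

```python
# Quick check of the Promoted Tweets numbers reported above
spend = 40.00        # dollars spent on this round
impressions = 9475   # impressions Twitter reported
engagements = 38     # engagements Twitter reported

cpe = spend / engagements                    # cost per engagement
engagement_rate = engagements / impressions  # engagement rate

print(f"CPE: ${cpe:.2f}")                         # -> CPE: $1.05
print(f"Engagement rate: {engagement_rate:.2%}")  # -> Engagement rate: 0.40%
```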
So, while ad #1 was good for engagement, it did not produce clicks to the website. Furthermore, the engagement did not produce any new Twitter followers. Ultimately, that makes the ad a failure both now and in terms of future opportunity.
Hmm, Now This Is Interesting!
When I began this experiment, I figured I would spend $100 to see what I could learn. At this point I still had $10 left. I had thought I might take that $10 and apply it to the best-converting ad, driving all of that traffic to the website.
But with the miserable click-through rate, I figured that would just be like donating $10 to Twitter.
So I set up a Promoted Account campaign rather than another Promoted Tweets campaign. Since my last $40 resulted in no clicks to the website and no new followers, I figured I would use my final $10 to see if I could pick up followers. That would at least give me an opportunity for future connection.
I used my winning ad to promote my account. But in this case I wasn't paying for engagement, only for new followers, and I bid a maximum of $1 per follower. That was much less than Twitter recommended to maximize my reach. (In this case I didn't need to maximize my reach.)
I ended up with an engagement rate just a little lower than in the previous test, but the overall cost per engagement dropped to 45 cents. Of the 22 engagements this campaign generated, 15 were follows, which works out to a cost per follow of 67 cents. (I don't particularly like paying for follows, but it's better to get a follow than no results from advertising.)
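The same back-of-the-envelope arithmetic applies here, assuming the full remaining $10 went into this campaign (again, the variable names are my own):

```python
# Quick check of the Promoted Account numbers reported above
spend = 10.00      # the last $10 of the budget (assuming all of it was spent)
engagements = 22   # engagements Twitter reported for this campaign
follows = 15       # engagements that were new follows

print(f"CPE: ${spend / engagements:.2f}")          # -> CPE: $0.45
print(f"Cost per follow: ${spend / follows:.2f}")  # -> Cost per follow: $0.67
```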
While I did not achieve the results I was hoping for, this last test, using the Promoted Account campaign, turned out to be more interesting than I expected. Maybe next time I will try this approach for round two, or reserve more funds for this part of the process.
There you have it. My $100 test is exhausted. In the end I engaged with a total of 149 Twitter users. Out of that I got 16 followers at @upatdawnllc and 6 visits to the offer promising 95% savings on SEO.