
Magpie testing results part 2

The second major ad campaign launched on the Twitter-powered Magpie service is over, and I have been poring over the results to try to make sense of them and refine the campaigns, or possibly abandon the service altogether. As discussed in a previous post, I found the Magpie test campaigns I ran for the sent2null shop and numeroom sites to be more efficient at targeting relevant users than the Facebook tests I performed last month. However, getting clicks on a link doesn't necessarily convert into signed-up users at numeroom or a purchase of a product at the sent2null shop. This latest round shows that clearly in both campaigns, which I'll discuss separately below.

sent2null shop ad campaign #2

After getting a good number of clicks during the trial, I decided to fund a $40 campaign with Magpie for the ad I had run during the test campaign. The actual ad text:

Check out sent2null shop for getting custom limited edition tees. Sarcastic, graphic and geek designs reign.

According to the Magpie statistics the ad was clicked 553 times and shown to 11,051 Twitter followers, for a click rate of 5%, which is quite good. Unfortunately, none of those clicks turned into a purchase at the site. There can be many reasons for that: my designs could be unappealing or too expensive, the link could have failed for many followers, the followers clicking could be coming from outside the US and unable to purchase using a credit card, or the keywords I targeted could be bringing people who don't mind perusing designs rather than people who want to purchase shirts. The difference is important. The possibility of foreign clicks is ruled out by the per-campaign geographic metrics Magpie makes available, which confirm that an overwhelming majority of clicks came from the US. Link failures are possible but likely very rare, and shouldn't account for much of the gap. That leaves two possibilities: unappealing designs on my part, or badly targeted keywords selecting viewers rather than prospective purchasers. Assuming my designs are good enough and not overly expensive (most shirts under $25), the targeting could be the problem, so I adjusted the keywords to favor words more likely to attract purchaser clicks over viewer clicks.

The new keywords for the next campaign will be:
custom-clothing funny geek limited-edition math nerd nerd-fashion sarcastic science shirts t-shirts tees witty

I reduced the set to less general words and created hyphenated compounds to describe specific targets. The words "geek", "funny", "science", "math", "nerd" and "witty" are not shirt-specific, but they should reach users who would find my designs, which focus on those topics, more appealing. The explicit use of "shirts", "t-shirts" and "tees" should ensure more specific targeting of followers who are actually looking to purchase items related to the other terms. So I expect both a higher click-through rate and, finally, a positive conversion rate from this new campaign; we shall see.

numeroom ad #2

Simultaneously with the sent2null ad, I purchased the same amount of ad spend for the following numeroom ad:

group chat in any language, share any file, link rooms, follow friends, IM, message, securely, join free:

This campaign had the following keywords:

attach bookmark business-collaboration chat chat-room customer email embed feed file file-share files follow following invite language link secure service translation

and is subject to similar reasons for failing to generate conversions (in this case measured simply by the number of users who create a free numeroom account after coming to the site). The design could be confusing, the link redirect could fail, or the chosen keywords could be targeting the merely curious rather than those interested in using the service. The click numbers for this campaign show it was delivered to 8,650 followers, of whom 216 are reported to have clicked through to the site, a rate of about 2.5%. In the time the ad ran, however, not a single user targeted by the campaign created a free account. Though I think it is unlikely that the same fate would befall two different campaigns for two completely different businesses by coincidence, it could point to a common problem in the targeting of the campaigns. So this campaign will also be changed to drop the overly general terms and make the keywords more specific for the next run. The new keywords, which I hope will target people who will sign up for the free trial, are:

business-collaboration chat chat-room web-chat customer-service email feed file-share language-translation translate collaborate web-im group-chat

This shorter, more focused set of keywords should pull in more account sign-ups from those who click through.
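For reference, the click figures quoted above work out as follows. This is just the back-of-envelope arithmetic in a small Python sketch; the input numbers are the ones Magpie reported, nothing else is assumed:

```python
# Click-through and conversion arithmetic for the two Magpie campaigns,
# using the figures reported in the post.
campaigns = {
    "sent2null shop": {"shown": 11_051, "clicks": 553, "conversions": 0},
    "numeroom":       {"shown": 8_650,  "clicks": 216, "conversions": 0},
}

for name, c in campaigns.items():
    ctr = c["clicks"] / c["shown"]         # click-through rate on impressions
    conv = c["conversions"] / c["clicks"]  # conversion rate among clickers
    print(f"{name}: CTR {ctr:.1%}, conversion {conv:.1%}")
# sent2null shop: CTR 5.0%, conversion 0.0%
# numeroom: CTR 2.5%, conversion 0.0%
```

Both campaigns cleared a respectable click-through rate; the problem is entirely in the second ratio.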

So here I go on round 3 of the great marketing and sales expedition. When the campaigns are over I'll provide an analysis which hopefully will finally show some good news in the form of actual conversions from the clicks!

As a final check that the metrics Magpie reports match what is occurring at the actual links, I have enabled Google Analytics on the pages targeted by the Magpie ad links for both campaigns. Google Analytics allows a detailed breakdown of visitors to a page, including the ability to track their navigation history once they land on the site; this will let me determine whether users are getting hung up on a particular page, so that I can potentially redesign pages to make them clearer. It will also let me keep Magpie honest as I compare the reported clicks against the clicks actually received at the site. This should be very revealing of the ultimate source of the lack of conversions for both sites, and will cost me only $80.00, not bad considering all the detailed visibility I am getting into the campaigns and the demographics of visitors.
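The honesty check itself is just a comparison of two hand-copied numbers, one from each dashboard. A minimal sketch, where the `ga_visits` value is a hypothetical placeholder (the real figure won't exist until the next campaign runs):

```python
# Compare clicks Magpie reports against visits Google Analytics records
# for the same landing page. Both services only expose these counts in
# their dashboards, so the inputs would be copied down by hand.
def discrepancy(reported_clicks: int, ga_visits: int) -> float:
    """Fraction of reported clicks that never showed up in Analytics."""
    return (reported_clicks - ga_visits) / reported_clicks

# 553 is Magpie's reported figure for the sent2null campaign;
# 540 is a made-up Analytics count purely for illustration.
print(f"sent2null: {discrepancy(553, 540):.1%} of clicks unaccounted for")
```

A small positive discrepancy is expected anyway (blocked scripts, bounces before the page loads), so only a large gap would suggest the reported numbers are inflated.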

stay tuned.


