How Artifact helped Wiland cut the cost of composing daily newsletters by 64%…
…and enabled them to meet their goal of increasing newsletter output.
Wiland specializes in building and delivering digital audiences for some of the top brands in the world. By predicting the consumer response to campaigns, they help advertisers, agencies, and non-profits improve their click-through and conversion rates, increase their return on ad spend, and increase revenue from direct mail prospects.
Each day, Wiland sent out 23 email newsletters, each with a unique brand and audience, and each featuring 4-6 unique stories from around the web tailored to its brand. Two content editors were tasked with aggregating relevant stories from Feedly and then populating them into HTML email templates to be showcased in the newsletters. This involved manually copying and pasting each story’s headline, short description, thumbnail image, and URL into the appropriate positions within the HTML, all while ensuring the email newsletter’s syntax remained intact.
The Two-Phased Approach
The priority for phase 1 was to build an application that remedied the cumbersome labor of populating the HTML email templates. Our first task was to rewrite the email newsletter templates to improve rendering across various email clients—they rendered especially poorly in multiple versions of Outlook. During this process, we added syntax that signaled to the application where each content element should be positioned when a story was inserted.
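The case study doesn't show the exact marker syntax the templates used, but the idea can be sketched with hypothetical `{{slotN.field}}` tokens: each slot's tokens mark where a story's fields belong, so insertion becomes a simple substitution that can't break the surrounding HTML.

```python
# Hypothetical sketch -- the real marker syntax used in Wiland's templates
# isn't shown in the case study. Assume tokens like {{slot1.headline}}.

TEMPLATE = """
<td>
  <a href="{{slot1.url}}"><img src="{{slot1.image}}" alt=""></a>
  <h2>{{slot1.headline}}</h2>
  <p>{{slot1.description}}</p>
</td>
"""

def insert_story(template: str, slot: int, story: dict) -> str:
    """Replace one slot's placeholder tokens with the story's fields."""
    for field in ("headline", "description", "image", "url"):
        token = "{{slot%d.%s}}" % (slot, field)
        template = template.replace(token, story[field])
    return template

html = insert_story(TEMPLATE, 1, {
    "headline": "Example headline",
    "description": "A short description.",
    "image": "https://example.com/thumb.jpg",
    "url": "https://example.com/story",
})
```

Because the editors never touch the HTML directly, the template's tags and attributes stay intact no matter what story is inserted.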
The application allowed the team to select a brand and then view a list of all the existing newsletters for that brand, along with a preview of the selected newsletter. We then built a content scraper: when a URL was entered, the tool would automatically scrape the title, description, and all images on the story page (up to approximately 8) and present this data in editable fields for the team to confirm or modify as needed. If the data looked good, they would choose a position (slots 1-6) and click an Insert button, and the story would be inserted into the newsletter. The preview would refresh to display the change.
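The extraction step of a scraper like this can be sketched with only the standard library. This is an illustrative sketch, not Wiland's implementation: the real tool also fetched the page over HTTP, while this version parses an already-downloaded HTML string and collects the page title, meta description, and image URLs.

```python
# Illustrative sketch of the scraper's extraction step, using only the
# standard library. The real application also handled HTTP fetching.
from html.parser import HTMLParser

class StoryScraper(HTMLParser):
    MAX_IMAGES = 8  # the tool collected up to roughly 8 images per page

    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self.images = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")
        elif tag == "img" and len(self.images) < self.MAX_IMAGES:
            src = attrs.get("src")
            if src:
                self.images.append(src)

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

scraper = StoryScraper()
scraper.feed("""<html><head><title>Big Story</title>
<meta name="description" content="What happened."></head>
<body><img src="/a.jpg"><img src="/b.jpg"></body></html>""")
```

The scraped fields would then be surfaced as the editable confirm-or-modify form described above.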
The scraping functionality alone had a significant impact by:
- Reducing the number of copy-and-paste operations
- Removing the need to store HTML templates on the file system
- Eliminating the burden of preserving the integrity of the HTML syntax when editing a newsletter
Time cost savings: ~3 hours per day.
We also built an image store that gathered the images and preserved them for 6 months, even if the source link became unavailable.
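The retention rule behind a store like this is straightforward; here is a minimal sketch (class and method names are made up for illustration): keep a local copy of each scraped image alongside a timestamp, and purge copies older than roughly six months.

```python
# Minimal sketch of the image store's retention rule. The class shape is
# hypothetical; 183 days stands in for "roughly six months".
from datetime import datetime, timedelta

RETENTION = timedelta(days=183)

class ImageStore:
    def __init__(self):
        self._images = {}  # url -> (image bytes, stored_at)

    def put(self, url: str, data: bytes, now: datetime) -> None:
        self._images[url] = (data, now)

    def get(self, url: str):
        entry = self._images.get(url)
        return entry[0] if entry else None

    def purge_expired(self, now: datetime) -> int:
        """Remove entries past retention; return how many were removed."""
        expired = [u for u, (_, t) in self._images.items() if now - t > RETENTION]
        for u in expired:
            del self._images[u]
        return len(expired)

store = ImageStore()
t0 = datetime(2024, 1, 1)
store.put("https://example.com/a.jpg", b"...", t0)
removed = store.purge_expired(t0 + timedelta(days=200))
```

Serving images from the store rather than the source page is what keeps old newsletters rendering even after the original link dies.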
The priority for phase 2 was to improve the organization of the 100+ brands and their respective newsletters and to aggregate the Feedly stories into the application itself. When the second phase was complete, the content editors could select a brand and see all of the newsletters currently being composed versus the ones that had already been sent.
We then connected the application to the Feedly API and two mailing APIs. The team could prepare a newsletter through the scraper, the Feedly aggregator, or a combination of both. When a newsletter was ready, they could select which mailing API to send it through, choose the appropriate recipient list(s) with all relevant inclusion and exclusion rules, and schedule the send for a future date.
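The inclusion/exclusion step before a send reduces to simple set arithmetic: the final audience is the union of the included lists minus the union of the excluded lists. A sketch, with list names invented for illustration:

```python
# Illustrative sketch of inclusion/exclusion rules applied before a send.
# The list names here are made up; they are not Wiland's actual lists.

def resolve_recipients(lists: dict, include: list, exclude: list) -> set:
    """Union the included lists, then subtract the excluded ones."""
    included = set().union(*(lists[name] for name in include))
    excluded = set().union(*(lists[name] for name in exclude)) if exclude else set()
    return included - excluded

lists = {
    "subscribers": {"a@example.com", "b@example.com", "c@example.com"},
    "vip": {"d@example.com"},
    "unsubscribed": {"b@example.com"},
}
audience = resolve_recipients(lists, ["subscribers", "vip"], ["unsubscribed"])
```

The resolved audience would then be handed to whichever mailing API the editor selected, along with the scheduled send time.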
The relief was felt.