Best Practices for Data Aggregation and Analytics

Mobile Marketing Magazine sat down with Stuart Moncada, vice president of products at Ad-Juster, Inc., to talk about the current state of data aggregation and analytics.

MM: What's the current issue in the market?

SM: First off, there is huge fragmentation in the industry: there are dozens and dozens of partners doing the same or very similar things, which means a lot of partners offering you services, such as monetization, each providing their own data. The second issue is a lack of standardization, which makes it very complicated and time-intensive for a publisher to understand what's going on in the industry and how to manage their business.

MM: Can you explain more about the disconnect between gathering data and the final analysis?

SM: Gathering data is step one of the overall analysis. We actually break that down into a four-step process that we call GTNR, which stands for Gather, Transform, Normalize, and Report, with reporting (or analysis) as the last step. The reporting and analysis are the end result, but you have to start with actually gathering the data, which you would think is straightforward, but it's not when you're trying to gather data across 20 different partners. Some may have an API, but a lot of them will not. Some don't even have capabilities for automated scheduled reporting. So just gathering data is a labor-intensive process for publishers to handle.

After that, you would want to transform and normalize the data so you can aggregate it and have an overall holistic view of your business. Gathering is just step one, and at the end, after you do the transformation and normalizing, you end up with a clean, aggregated data set. That's when you can begin to do the analysis and the reporting, but there are several steps before that which are not always straightforward if you're working with a lot of different partners.
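[To make the GTNR flow concrete, here is a rough Python sketch that walks a couple of toy partner reports through gather, transform, normalize, and report steps, ending with a combined eCPM. The partner names, field names, and figures are hypothetical illustrations, not Ad-Juster code. -Ed.]

```python
# Minimal, hypothetical sketch of the Gather -> Transform -> Normalize -> Report flow.
# Partner names, field names, and numbers are illustrative only.

def gather():
    """Step 1: collect raw rows as each partner reports them (API, CSV export, etc.)."""
    return {
        "partner_a": [{"Revenue (USD)": 120.0, "Paid Impressions": 48000}],
        "partner_b": [{"earnings": 75.5, "imps": 31000}],
    }

# Per-partner field maps: each partner's column names -> one shared schema.
FIELD_MAPS = {
    "partner_a": {"Revenue (USD)": "revenue", "Paid Impressions": "impressions"},
    "partner_b": {"earnings": "revenue", "imps": "impressions"},
}

def transform_and_normalize(raw):
    """Steps 2-3: rename fields to the shared schema and tag each row with its source."""
    rows = []
    for partner, partner_rows in raw.items():
        mapping = FIELD_MAPS[partner]
        for row in partner_rows:
            clean = {mapping[k]: v for k, v in row.items() if k in mapping}
            clean["partner"] = partner
            rows.append(clean)
    return rows

def report(rows):
    """Step 4: aggregate the clean data set, e.g. eCPM = revenue / impressions * 1000."""
    revenue = sum(r["revenue"] for r in rows)
    impressions = sum(r["impressions"] for r in rows)
    return {"revenue": revenue, "impressions": impressions,
            "ecpm": round(revenue / impressions * 1000, 2)}

if __name__ == "__main__":
    print(report(transform_and_normalize(gather())))
```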

MM: What are some best practices for aggregating data from all your demand partners?

SM: Step one is to define your business goals. If you haven't defined your business goals, then you really don't have a clear idea of which data you should be pulling. Ask yourself, ‘What analysis do I want to perform?’, ‘How do I want to slice the data?’, ‘How do I want to aggregate?’ Many times, defining your goals can be something as simple as saying, “I want to optimize eCPM across all of my demand partners.” Sometimes you can get a little more specific, such as building a strategy around a new product, optimizing the performance of your video, or bolstering your programmatic direct business.

Once you define your business goals, you want to work with your different partners and identify your partner data matrix. What that means is: what are the different data points that I'm going to get from each partner? If you have a high-level goal to optimize your eCPM or your revenue, that's pretty straightforward: you know you want to get your revenue and impressions from each partner. But if you're trying to do something more complicated or more nuanced, like growing your programmatic direct business, then you need to make sure each of your partners is giving you transaction-type data. This will allow you to determine, differentiate, and split out your programmatic direct revenue from your open exchange revenue, and maybe even your direct revenue. So that's when generating a partner data matrix comes into play, where you define, for each of your partners, which data points you need in order to measure and optimize the business goals you defined in the previous step.
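[A partner data matrix can be as simple as a table of which fields each partner exposes, checked against the fields a given goal requires. The sketch below is a hypothetical illustration of that idea, not a prescribed format or Ad-Juster tooling. -Ed.]

```python
# Hypothetical partner data matrix: which fields each partner can report.
PARTNER_DATA_MATRIX = {
    "partner_a": {"revenue", "impressions", "transaction_type"},
    "partner_b": {"revenue", "impressions"},
    "partner_c": {"revenue", "impressions", "transaction_type", "video_completes"},
}

# Fields each business goal requires before it can be measured.
GOAL_REQUIREMENTS = {
    "optimize_ecpm": {"revenue", "impressions"},
    "grow_programmatic_direct": {"revenue", "impressions", "transaction_type"},
}

def coverage_gaps(goal):
    """Return, per partner, the fields still missing for the chosen goal."""
    needed = GOAL_REQUIREMENTS[goal]
    return {p: sorted(needed - fields) for p, fields in PARTNER_DATA_MATRIX.items()}

print(coverage_gaps("grow_programmatic_direct"))
# {'partner_a': [], 'partner_b': ['transaction_type'], 'partner_c': []}
```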

After that, you need to figure out how you're going to Gather, Transform, Normalize, and Report. This is the actual work of gathering the data, transforming the different fields across the different partners, normalizing and cleaning it up, and then analyzing and reporting on it. That's where a platform like ProgrammaticIQ from Ad-Juster really helps automate and streamline the process.

The last step we recommend is to iterate and improve. Everything is constantly changing in the digital advertising space, especially on the programmatic side, so as the data from your partners changes and as your goals change, you want to make sure you have a system set up that you can iterate on and that's not super rigid. Make sure that adding a new data point a partner exposes in their API does not require weeks or months of effort because engineering resources are needed. If you have an automated platform or tool you're using, you should be able to toggle that additional field and incorporate it into your reporting.
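[One way to keep a setup easy to iterate on is to drive the report schema from configuration rather than code, so exposing a newly available partner field is a one-line toggle. The snippet below is a hypothetical sketch of that idea only. -Ed.]

```python
# Hypothetical config-driven schema: adding a field a partner starts exposing
# should be a toggle, not an engineering project.
REPORT_FIELDS = {
    "revenue": True,
    "impressions": True,
    "transaction_type": True,
    "viewability": False,  # flip to True once the partner's API exposes it
}

def active_fields():
    """Fields currently included in gathering and reporting."""
    return [name for name, enabled in REPORT_FIELDS.items() if enabled]

print(active_fields())  # ['revenue', 'impressions', 'transaction_type']
```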

MM: Where does Ad-Juster come in? What’s unique about Ad-Juster’s solution? 

SM: When I think about the tools that a publisher has, there are two ends of a spectrum. On one end, you have very generic visualization tools that have nothing to do with the digital advertising space, like Tableau or Looker, or maybe ‘Do-It-Yourself’ methods that you're building in-house. If a publisher is using one of those tools or that alternative, they're still spending a lot of resources gathering the data, pushing it into Tableau, figuring out the mappings, doing normalizations, and maintaining connectors if they're building them. Plus, if they're doing it manually, it's an ongoing maintenance effort.

The one thing those tools do nicely is that they have very flexible and dynamic visualizations and dashboards. What's unique to Ad-Juster, and specifically ProgrammaticIQ, is that we bring a lot of digital advertising and programmatic intelligence. We have the specific connectors to the ecosystem that allow you to automatically pull the data, and then there are smart connections that will map it and normalize it automatically for you. We also have very flexible dashboards with drag-and-drop capabilities. It's very dynamic: you can extend the data sets and create custom fields very easily, like in those visualization tools. So we offer the programmatic and digital advertising intelligence, combined with the nice visualizations and custom dashboards, so that you get it all in one tool and you don't have to spend a ton of resources building connectors and maintaining them. Publishers exist to create content, not necessarily to build technology for reporting and analytics.

MM: How does this compare to other BI tools and workflows in the industry?

SM: Some of the tools that I mentioned, like Tableau and Looker, don't have specific connectors for the industry. If you're using a tool like Tableau or Looker, a lot of times a publisher has to hire or assign engineering resources to build out the API connectors to pull in and gather the data. So one way we differ from those tools is that we have those specific connections for all these systems, so you don't have to worry about pulling or aggregating the data. We also have what we call smart connections, which will automatically map the different fields across these platforms so that you don't have to spend weeks on it. There are other BI tools in the market that are specific to ad ops, but you still have to spend a significant amount of time doing these mappings and maintaining them on an ongoing basis, whereas we built our ProgrammaticIQ product to have that intelligence built into the connections so that it does it automatically out of the box.
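[To give a sense of what automatic field mapping can look like in principle, one simple approach is a synonym table that resolves each partner's column names to a canonical schema. This is an illustrative sketch only, not a description of how ProgrammaticIQ's smart connections are implemented. -Ed.]

```python
# Illustrative only: one simple approach to automatic field mapping,
# using a synonym table keyed by normalized column names.
SYNONYMS = {
    "revenue": "revenue", "earnings": "revenue", "estimated_revenue": "revenue",
    "impressions": "impressions", "imps": "impressions", "paid_impressions": "impressions",
}

def canonical(column_name):
    """Map a partner's column name to the shared schema, or None if unknown."""
    key = column_name.strip().lower().replace(" ", "_")
    return SYNONYMS.get(key)

print(canonical("Paid Impressions"))   # impressions
print(canonical("Estimated Revenue"))  # revenue
```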

If you go toward the other side of the spectrum, to the BI tools that are specific to ad ops, their dashboards and UIs are a bit rigid. You don't have full flexibility to build any visualization on any data set and field. Only certain fields are exposed, or they require engineering requests, while our goal is to empower the user to visualize any data that the system ingests. Any data set that you pull in, or any custom field that you create, like a calculation or formula, you can visualize in a dashboard or add to a report.
