Imagine this conversation:
Joe: “I just got the wireframes for the new site filtering tool. We need an analytics BRD and Tech Spec so developers can begin work.”
Anna: “But my team of developers is working on a priority project through November, then goes into a 3-month code freeze!”
Joe: “K, well, this new site feature is also a priority, and we need tracking on the new filtering tool.”
Mike: “Our reporting needs to focus on the KBOs that just came down from the top. How does the new filtering tool relate to conversions? What business decisions can we make if we know it’s being used?”
Susan: “Speaking of priorities, we know that conversion tracking on mobile is broken, and has been since September. Can we prioritize getting THAT fixed?”
Dan: “But we’ve been grading our personalization efforts using that report! We need to get that fixed, like… yesterday!”
As painful as that conversation feels, can you imagine how much more painful it is in places where those conversations are NOT happening? I know that no one wants more meetings in their schedule, but a regular, FOCUSED check-in between key stakeholders can make all the difference. But who should attend such meetings?
We’ve seen a lot of value in establishing an Analytics Model. Some folks may call it a Center of Excellence (CoE); for others it may still just be called the “analytics team”. Whatever you call it, the important thing is to really think out the roles, goals, processes, and responsibilities so that this team, and their data, can really drive the conversation rather than “be driven”. I’m going to call this the Data Core Team.
To start, figure out who is going to be on your Data Core Team. I’ve seen this filled by a single person, and I’ve seen a team of 6 or more. Either way, with however many people you have, you’ll need to fill these roles:

Solution Owner

The Solution Owner is the “business requirements” gatekeeper. Their world is one of Key Business Objectives (KBOs) and Key Performance Indicators (KPIs). They:

  • gather reporting requirements
  • give focus to the solution by running reporting requests through a value-driven filter, prioritizing work that will provide truly actionable data to their organization
  • interface with executives, product managers, and analysts to make sure their data practice aligns with their company’s business objectives and roadmap
  • work with the Implementation Architect to design a solution that will suit their reporting needs
  • are in charge of keeping implementation documentation in a centrally accessible place

Implementation Architect

The Implementation Architect owns the technical side of the solution. The Solution Owner decides whether something is worth tracking; the Implementation Architect figures out how to make it happen. They:

  • know the tools of their trade. For instance, for an Adobe Analytics implementation, they’d know when to use an eVar instead of a prop, or how to set the products string. For a Google Analytics implementation, they know when to use an event versus a custom metric, and the best practices behind event categories, actions, and labels
  • make decisions and enforce standards for variable maps and data architecture. Often, the decisions they make are a bit arbitrary: for most folks, it doesn’t REALLY matter which JavaScript object holds your pageName, or whether you use eVar41 or eVar42. The important thing is that someone is in a position to make that decision and keep it standard.
  • administer any Tag Management Solution their company uses, perhaps just controlling access and settings standards, or perhaps going so far as to be the editor and publisher of changes.
  • work with the Data Steward to document what is needed from site developers.
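As a concrete illustration of the kind of standard an Implementation Architect might enforce, here’s a minimal sketch of a variable map kept in code. This is an assumption-laden example, not any vendor’s API: the concept names (`filterType`, `filterValue`) and the slot assignments (eVar41, eVar42) are hypothetical, and the point is simply that the mapping lives in exactly one authoritative place.

```javascript
// Hypothetical variable map: one authoritative mapping from business
// concepts to analytics variable slots, so nobody has to guess whether
// "filter type" lives in eVar41 or eVar42. All names here are made up
// for illustration.
const VARIABLE_MAP = {
  pageName:    { adobe: "s.pageName", ga4: "page_title" },
  filterType:  { adobe: "eVar41",     ga4: "filter_type" },
  filterValue: { adobe: "eVar42",     ga4: "filter_value" },
};

// Resolve a business concept to its slot for a given platform,
// failing loudly when someone asks for an unmapped concept.
function slotFor(concept, platform) {
  const entry = VARIABLE_MAP[concept];
  if (!entry || !entry[platform]) {
    throw new Error(`No ${platform} slot mapped for "${concept}"`);
  }
  return entry[platform];
}

console.log(slotFor("filterType", "adobe")); // "eVar41"
```

The value of a map like this is less the code than the governance: when a new tracking request comes in, the Implementation Architect extends the map once, and every tag and report references the same slot.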

Data Steward

The Data Steward works with site developers to apply the analytics solution to the site. As the person charged with owning the data for your site(s), they have more of a technical understanding of analytics and how it fits into site development. They:

  • may not be a developer themselves, but they need to understand the processes developers use, the overall way the site works, and to be able to make informed decisions about data layers, tag management, JavaScript frameworks, SDKs…
  • work closely with the Implementation Architect to design and deploy a solution that works, given your site’s architecture and developer resources.
  • interface with site engineers and developers and represent their interests to the rest of the Data Core Team.
  • own Data Quality- they run the QA processes and help maintain implementation health with regular audits.
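The data layer decisions the Data Steward weighs in on, and the QA audits they own, can both be sketched in a few lines. This is a minimal, vendor-neutral illustration under assumed field names (`page.name`, `filter.type`, and so on), not any particular data layer specification:

```javascript
// A minimal data layer object the site might expose for the tag
// manager to read. The structure and field names are hypothetical.
const dataLayer = {
  page: { name: "search:results", type: "search" },
  filter: { type: "color", value: "blue" },
};

// A QA check of the kind a Data Steward might run during an audit:
// verify that required fields are present and non-empty before
// trusting the data the tags will send.
function auditDataLayer(dl, requiredPaths) {
  const missing = [];
  for (const path of requiredPaths) {
    // Walk dot-separated paths like "page.name" through the object.
    const value = path.split(".").reduce((obj, key) => (obj || {})[key], dl);
    if (value === undefined || value === "") missing.push(path);
  }
  return missing; // an empty array means the audit passed
}

console.log(auditDataLayer(dataLayer, ["page.name", "filter.type"])); // []
```

In practice a check like this would run in automated tests or a tag-auditing tool rather than by hand, but even a simple script makes "own Data Quality" a concrete, repeatable task instead of a hope.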

Report Administrator

The Report Administrator does a lot of the housekeeping needed to get data to the end users within their org. They:

  • interface with the report users, ensuring they have the access and training they need
  • distribute reports, create logins, and provide access to training
  • may serve a PM role within the Data Core team, keeping track of upcoming initiatives and timelines.

I’d say it’s rare for these responsibilities to actually be split among 4 people as I’ve described here. The important thing is that you have clear ownership of each responsibility, and that this Core Team works closely together as the single source of the “Big Picture”.
Each role may need to pull in other resources freely. For instance, if your company doesn’t have an Adobe Analytics implementation expert, your Implementation Architect may hire outside consultants to help. In general, this doesn’t mean outsourcing the OWNERSHIP of your implementation architecture: each company still needs an internal owner with the motivation and access to move the solution in the right direction. No matter how excellent your consultants are, they will never be able to own your implementation as well as someone internal could. A good consultant, however, will support that internal resource, providing industry knowledge and guidance, and investing in the future of your org’s analytics practice by training internal resources. Basically, outside consultants should be tasked with making internal owners look like Rock Stars.
If this seems like a bit much, or it’s hard to sell your organization on such an investment of resources, consider this: the bullet points above, as well as other, more specific tasks, aren’t negotiable. They are all things that inherently need to happen to have an analytics solution. What we frequently see is that when not enough resources are assigned to these tasks, reporting can still happen, but the net amount of effort is higher (because there was no forward-thinking master plan and folks have to make it up as they go) and the value of the reporting is lower. I promise, you will get a return if you invest in getting the right resources and support for your Analytics Practice.
This, of course, doesn’t cover everything you’d see done in a healthy Analytics Practice. I’d love to hear from readers if I left off anything they view as critical, and what they’ve seen work well or not work well!

About the author

Jennifer Kunz

I'm a principal Analytics Engineer with 10 years of experience implementing digital analytics. I work with Matt Alexander and our team of Analytics Engineers to ensure our clients have quality, actionable data by helping with everything from JavaScript and app development to governance and processes. You can find me on Twitter at @jenn_kunz.
