A simple, straightforward and inexpensive approach to establishing an impactful contact center analytics program is to ‘define’ some foundational elements. Let’s take a look at the three main steps.
Step One: Define Data Source
Establish an explicit understanding of ‘where’ data is being sourced from, and that only ‘decision-quality’ data will be accepted.
Under normal operations, data should arrive on a recurring, predictable schedule for analysis and departmental applications. When possible, always take a quantitative approach and anchor your outcomes directly to the data.
When asked ‘why’, our objective is to be able to respond confidently with, “This decision is based upon our understanding of these metrics aligned with historical trends…” At that point, the ‘variable’ is simply analytics and not source material.
Envision data as ‘seeds’ and your mind as a well-prepared ‘field’ for planting.
Bad seeds have never produced anything worthwhile; they waste valuable time and resources and, in some instances, suffocate dreams entirely.
Thus, demand decision-quality data and refrain from cheap shortcuts that fall back on a qualitative approach based on:
- ‘Known’ bias of leadership
- Misinformation on social media
- Thoughts from a popular blogger
- Momentary ‘flavor’ of public opinions
- Strong opinions of a relative or friend
I’ll stop there.
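One lightweight way to enforce the ‘decision-quality only’ rule is an automated gate at intake, so incomplete or implausible data is rejected before anyone builds a decision on it. Here is a minimal sketch in Python; the field names, freshness window, and sanity checks are all hypothetical stand-ins for whatever your program actually requires.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical decision-quality rules; tune these to your own program.
REQUIRED_FIELDS = {"interval_start", "calls_offered", "calls_handled", "aht_seconds"}
MAX_AGE = timedelta(days=2)  # data older than this is stale for planning purposes

def is_decision_quality(record: dict) -> bool:
    """Accept a data record only if it is complete, fresh, and sane."""
    if not REQUIRED_FIELDS <= record.keys():
        return False  # incomplete: required fields are missing
    age = datetime.now(timezone.utc) - record["interval_start"]
    if age > MAX_AGE:
        return False  # stale: too old to drive a staffing decision
    if record["calls_handled"] > record["calls_offered"]:
        return False  # implausible: handled more contacts than were offered
    return record["aht_seconds"] > 0
```

A record that fails the gate goes back to the source for correction; it never enters the analysis.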
Once it is clearly determined ‘who’ will be providing our data, then we need to determine ‘how’ said data will be analyzed, reported and applied.
Step Two: Consistent Data Analysis
Come to full agreement on an immutable formula that governs ‘how’ all departments apply specific data in instances of data overlap.
We can all be viewing the same data; however, if each department has its “favorite” staffing formula and every department responds to leadership with its own version of staffing capacity…there are suddenly five different outcomes for a single weekend-coverage request.
At this point — quite self-inflicted — the emails, messages in Teams, and phone calls will never stop!
Eventually, given this myriad of conflicting choices, a game of ‘Data Whack-A-Mole’ begins until there is only a single option left; most often, this is the outcome that leadership prefers.
But, when following this guidance produces a disastrous result, instantaneously a blitzkrieg of fingers points to the sacrificial data lamb. (FYI: Unless it involves Joey Ramone, the term “blitzkrieg” almost always carries a negative connotation.)
We waste valuable time and resources figuring out ‘where’ someone else went wrong while the inevitable erosion of trust occurs. That’s the worst part: someone else’s mistake can call data integrity into question for everyone.
To avoid all of this, assign the primary overseer of the data to determine what formula will be used, present that formula to all applicable parties, and explicitly agree via email, pinkie promises, a vandalized bathroom stall, tattoos with dates, whatever doesn’t get lost, that this and only this formula will be how it is analyzed in all germane matters.
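That single agreed-upon formula can live in one shared module that every department imports, instead of in five competing spreadsheets. Below is a hedged sketch using Erlang C, the classic contact-center staffing calculation; your agreed formula may well differ, and the 20% target-wait threshold is a hypothetical example, not a recommendation.

```python
from math import factorial

def erlang_c(agents: int, traffic_erlangs: float) -> float:
    """Probability that an arriving contact must wait (Erlang C formula)."""
    if agents <= traffic_erlangs:
        return 1.0  # unstable queue: every contact waits
    a, n = traffic_erlangs, agents
    top = (a ** n / factorial(n)) * (n / (n - a))
    bottom = sum(a ** k / factorial(k) for k in range(n)) + top
    return top / bottom

def agents_needed(calls_per_hour: float, aht_seconds: float,
                  target_wait_prob: float = 0.2) -> int:
    """Smallest head count whose wait probability meets the agreed target."""
    traffic = calls_per_hour * aht_seconds / 3600  # offered load in Erlangs
    n = max(1, int(traffic) + 1)
    while erlang_c(n, traffic) > target_wait_prob:
        n += 1
    return n
```

When leadership asks five departments about weekend coverage, all five call `agents_needed` and produce the same number, because the formula has exactly one home.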
Therefore, once the data source and analytical approach are defined, it is vital that single ownership be assigned within each applicable department.
Step Three: “One Source of Truth”
Establish undeniable guidelines on ‘who’ is the sole point of contact to report and answer for specific data analysis.
The Department of Treasury, when conducting counterfeit recognition training, doesn’t have bankers or tellers examining thousands upon thousands of fake bills. Instead, they have them study a legitimate bill so closely that any variant is immediately identified. In that same mindset, having a single entity fully culpable for specific data analysis greatly increases the probability that when data is incomplete or incorrect it is identified, returned for correction, and goes no further.
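Single accountability can be made mechanical with an ownership registry: every metric maps to exactly one team, and any request for that metric routes to that team and nowhere else. A minimal sketch, with hypothetical metric and team names:

```python
# Hypothetical registry: each metric has exactly one accountable owner.
DATA_OWNERS = {
    "forecast": "workforce_management",
    "schedules": "workforce_management",
    "training_calendar": "training",
    "quality_scores": "quality_assurance",
}

def owner_of(metric: str) -> str:
    """Return the one team allowed to answer for a metric."""
    owner = DATA_OWNERS.get(metric)
    if owner is None:
        raise KeyError(f"No owner registered for '{metric}'; register one before asking.")
    return owner
```

The point of the registry is less the lookup than the failure mode: a metric with no registered owner stops the request cold, instead of quietly being answered by four different teams.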
My brilliant team has two primary mantras.
- “Numbers are hard” [shared frequently when the data ‘story’ isn’t easily digestible]
- “One Source of Truth” [don’t ask four other departments for an analysis when mantra #1 arrives in your Inbox]
This is of vital importance, because trouble arrives anytime a member of the planning team attempts to ‘play’ forecaster.
Or when a member of training ‘updates’ schedules for several teams to make room for the next training sessions and notifies no one in workforce management.
Or when leadership reaches out to different teams (including non-SMEs) in a double-blind experiment to see what numbers are produced, and then uses whatever numbers suit their purposes.
A confusion buffet is unleashed, and those interactions never end well.
“Stay in your lane!” is not just a frantic command screamed by parents at offspring learning to navigate a vehicle. Again, so much more could be accomplished through collaboration if we weren’t required to invest hour after hour re-addressing “who is supposed to do what” and dealing with the misunderstandings that result from not respecting data stewardship.
Instead, know what you do to the degree that being referred to as an SME [Subject Matter Expert] isn’t just another acronym you’ve inherited by virtue of your team assignment; it speaks to the exemplary outcomes and high expectations of everything you apply your time toward for the benefit of others.
In conclusion, many large-scale data disasters can be easily prevented by implementing these simple and required safeguards.
- Take the time.
- Define these steps.
- And may you only hear of blitzkriegs in your life while enjoying the Ramones.
Honored to learn and serve alongside you.