What are the benefits of data mining? Imagine your company as a field full of haystacks, with each haystack representing a store. The pieces of straw making up each stack are the thousands of transactions generated by that store each week. Now, picture yourself standing at the edge of that field, and you have been tasked with finding the “rotten” straws of hay. How can you possibly find those bad pieces in the millions before you? How do you even know which haystacks to look at?
Before the advent of transaction analysis programs and data mining, you probably would have seriously thought about forgetting the whole thing and redoing your resume. But with the proper transaction analysis program, it’s like those bad pieces are now neon orange, and you can see them poking through the stacks if you look at them from just the right angle.
How do you find that correct angle? Sometimes, to see things just right we need to move around, squint, and periodically take a look from varying distances. It takes a lot of effort to find that perfect “angle.” In other words, the development of a transactional analysis program doesn’t happen overnight.
At Rite Aid we started our program by first rolling out a front end tool, NaviStor, to our field users only. Next, we added a pharmacy analysis tool, NaviScript, which was also only for field use.
As we expanded the program and began to reap its benefits, we felt that we had left our field users standing all alone on the edge of the hayfield. They received training on the systems, but after that, they were somewhat on their own as we had yet to set up a sufficient support structure. In order to help them better use the system, we started building a corporate team that was tasked with supporting our field users.
Cross-Training Corporate Analysts
The support that this corporate team provides the field loss prevention managers (LPMs) is an essential piece of the puzzle. Today, we have one data-mining analyst for every division, and they are cross-trained on both front end and pharmacy. This training includes store training. They are available to offer assistance, or to just be a fresh pair of eyes.
We have also found that an essential element of effective support is an understanding of what the field users experience on a day-to-day basis. So, periodically, each data-mining analyst will travel with an LPM. They have traveled to average suburban stores and to stores in the big cities, so they can understand how different environments can affect a store’s business. While traveling with an LPM, they may sit in on an interview or assist with research, but the main goal is to see the business from the other side of the data. This enables them to have a better understanding of the big picture and helps them to provide superior support.
Every new field user goes through training at the corporate office with the corporate data-mining team, in addition to their field training. We maintain a one-to-three trainer-to-user ratio to ensure that when they complete their training, the LPMs have a solid grasp of the system.
They are trained on NaviStor, NaviScript, and NaviScript RX Inventory (another tool we added to help us analyze the pharmacy), as well as NaviCase, our case management system. Occasionally, the team does follow-up and advanced training. Typically, we will gather LPMs who have similar needs and put together a custom class when needed.
Avoiding Duplication of Effort
LPMs may spend four hours per week on transactional analysis, whereas data-mining analysts spend all of their time on it, either directly or indirectly. The corporate team also supports the field by having access to data and tools that the LPM does not have, such as credit card encryption and older transactions that have “rolled off” the system.
One challenge of building a data-mining team was to develop a program that would eliminate duplication of effort. We did not want our analysts doing the exact same things that LPMs could do on their own. Some of this is automatically controlled by the level of research needed. The analysts perform the “deep diving” research on back-end tables, delving into the raw data for more intensive research. In other words, they figure out which haystacks have a significant number of neon orange straws. This research may be driven by something an LPM discovers, a request by corporate management, or an extension of something suspicious the analysts themselves may have uncovered.
Generating Key Performance Indicators
We have created several hundred key performance indicators (KPI) to help analyze where potential loss is occurring. Our front end KPIs summarize data at both the store and associate levels. Based on how our pharmacy business operates, we summarize pharmacy data only at the store level.
KPIs group similar transaction or activity types together, such as all returns. The KPIs can be just that simple, or they can be much more complex with varying degrees of criteria. A few examples of these criteria would be no sales, voids, or filled prescriptions. As with any type of analysis, sometimes it is best to look at numbers and dollars, and at other times, you might find that looking at averages and percentages yields the best results. Therefore, the same piece of data can be viewed several different ways, which will help you achieve that elusive “right” angle.
In addition to basic KPIs, which group all transactions of a certain type together, it is also valuable to create exception-based KPIs, which establish an additional level of criteria or threshold. By doing this, you pull the most probable issues into an exception report. An example would be to “flag” any associate who had refunds higher than a certain dollar amount and/or over a certain percentage. The purpose here is to find those associates or stores that are outside of the normal ranges and then to research the reason why.
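An exception KPI like the refund example above can be sketched in a few lines. This is a minimal illustration only; the field names, dollar limit, and percentage limit are all hypothetical, not an actual reporting schema.

```python
# Hypothetical exception KPI: flag any associate whose weekly refunds exceed
# a dollar threshold and/or a percentage of their sales. All values here are
# illustrative, not real thresholds.

REFUND_DOLLAR_LIMIT = 500.00   # flag if weekly refunds exceed this amount...
REFUND_PERCENT_LIMIT = 0.05    # ...or exceed 5% of the associate's sales

def refund_exceptions(weekly_totals):
    """weekly_totals: list of dicts with 'associate', 'refunds', 'sales'."""
    flagged = []
    for row in weekly_totals:
        refund_rate = row["refunds"] / row["sales"] if row["sales"] else 0.0
        if row["refunds"] > REFUND_DOLLAR_LIMIT or refund_rate > REFUND_PERCENT_LIMIT:
            flagged.append({**row, "refund_rate": round(refund_rate, 3)})
    return flagged

week = [
    {"associate": "A101", "refunds": 120.00, "sales": 9800.00},  # within norms
    {"associate": "A102", "refunds": 640.00, "sales": 7200.00},  # over $500
    {"associate": "A103", "refunds": 80.00,  "sales": 1100.00},  # over 5%
]
print(refund_exceptions(week))
```

Note that the dollar test and the percentage test catch different people: A102 trips the dollar limit on high volume, while A103's refunds are small in dollars but large relative to sales.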
Again, the complexity of these types of criteria can vary. The key in writing successful exception KPIs is establishing the correct criteria or thresholds. Using the term “correct” does not imply that there are wrong criteria, but that there is a certain level of flags you do not want to exceed. You always want to make sure the results are reduced to a reasonable number of exceptions to be researched; in other words, you want to narrow twenty haystacks down to two.
A review of historical data will assist in setting the correct level. You may also need to adjust your criteria or threshold based on geographic area. For example, a threshold for low average sale that works in the Northeast region may produce too many exceptions in the South. You will lose the value of the tool if too many exceptions are generated.
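One simple way to set a threshold from historical data, per region, is a percentile cutoff: flag roughly the top slice of each region rather than applying one national number. The sketch below assumes fabricated historical refund totals and an illustrative 90th-percentile rule.

```python
# Illustrative sketch: derive a per-region exception threshold from historical
# data, so each region yields a manageable number of flags. All figures are
# made up for illustration.

def percentile_threshold(values, pct):
    """Return the value at roughly the `pct` percentile of the observations."""
    ordered = sorted(values)
    index = int(pct * (len(ordered) - 1))
    return ordered[index]

# Fabricated historical weekly refund totals per associate, by region.
history = {
    "Northeast": [90, 110, 130, 150, 170, 420, 95, 105],
    "South":     [60, 70, 80, 85, 90, 300, 65, 75],
}

# Flag anything above roughly the 90th percentile within each region.
thresholds = {region: percentile_threshold(vals, 0.90)
              for region, vals in history.items()}
print(thresholds)
```

Because each region gets its own cutoff, a lower-average-sale region like the South in this fabricated data ends up with a lower threshold instead of drowning in exceptions under a Northeast number.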
Developing a Weekly Process
As we have progressed, we have transitioned most of what we consider our run-of-the-mill KPIs to the field and have begun to move our corporate analysts to the deeper level of research mentioned previously. LPMs or field users typically locate all issues relating to returns, price modifications, and price verifications. Recently, we have rewritten our LPMs’ “dashboard,” which is the home page that they use for research.
We began this project with three goals in mind. First, to develop a method to make every minute our LPMs spend using the tool valuable; that is, we did not want them to take a random approach to the research, but rather a focused, ordered approach. Second, to develop a more systematic method for LPMs to perform their research, which would also address the need to reduce duplication of effort between the corporate and field users. Third, we felt we had an opportunity to increase our case count.
As our transaction analysis program developed over time, we found that there were far too many KPIs for the LPMs to be able to research them all every week. So, how do you ensure that all the valuable KPIs are covered without ignoring any of them? You provide a process to help your users stay on top of things.
On a weekly basis, our LPMs use NaviStor, NaviScript, and the RX Inventory tools. There are two parts to their weekly process.
Focus KPIs. First, they have an assigned weekly focus. There are a group of front end KPIs, a group of pharmacy KPIs, and a set of KPI graphs for each week. The KPIs selected for each given week support or complement one another. For example, while researching price verifies, you would also want to see those price verifies followed by no sales, as well as all no sales and also low dollar (or penny) sales. Another example would be all returns, which includes bottle returns, non-scanned returns, and returns on prepaid cards.
By establishing preset groupings for designated weeks, you create a research flow. By rotating their focus week to week, a fresh view is created. This flow allows our LPMs to cover every focus area in a five-week period, and LPMs across the country are researching the same KPIs on the same week. This enables our corporate team to know exactly where the focus is for any given week.
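The rotation itself is just a repeating five-week cycle. The sketch below shows the idea with hypothetical focus-group names; the actual groupings and their order are not specified in this article.

```python
# Minimal sketch of a five-week focus rotation: every LPM researches the same
# KPI group in a given calendar week, cycling through all groups in five weeks.
# Group names are illustrative placeholders.

FOCUS_GROUPS = [
    "Returns",
    "Price modifications",
    "Price verifications and no sales",
    "Voids",
    "Pharmacy dispensing",
]

def focus_for_week(week_number):
    """Map a 1-indexed calendar week number to its KPI focus group."""
    return FOCUS_GROUPS[(week_number - 1) % len(FOCUS_GROUPS)]

print(focus_for_week(1))  # Returns
print(focus_for_week(6))  # Returns (the cycle repeats)
```

Because the mapping depends only on the week number, every LPM in the country lands on the same focus without any coordination beyond the calendar.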
Exception KPIs. Although the majority of associates are honest, you wouldn’t want to wait five weeks to find that associate who had the opportunity to steal, rationalized the need, and then began stealing cash by creating false refunds. So where is the balance? That is where the second part of the LPMs’ weekly research comes in—exception KPIs.
We like to think of exception KPIs as our “back-up plan.” As mentioned previously with exception KPIs, you are seeking those far above the norm in one or more areas. For example, any cashier who has more than five price verifies followed by a no sale in one week.
Additionally, there could be a sequence of events on a register (or, in our case, during the prescription dispensing process) that would almost never be acceptable. An example would be a price modification on a prepaid card. You may choose to have this type of transaction flag as an exception every time it happens.
Our exception KPIs are based on one week’s worth of information, as opposed to the weekly focuses, which gather five weeks. This way, every week you see the stores or associates that have risen to the top.
Piloting the New Dashboard
As with any new idea, we needed to test our theory. Since the primary goal was to assist our field users, who better to run our new and improved dashboard through its paces than a group of experienced users? Therefore, we pulled together about ten percent of our users and showed them the new program, with strong emphasis on the fact that it wasn’t done yet and we needed their help.
For six weeks, our test group used the system. We held weekly conference calls so they could offer their input and we continued to fine-tune the “new face of exception reporting.” Results from our pilot program were very impressive; not only did we see a considerable increase in case count in the pilot area, but overall the feedback was very positive. One LPM said that he enjoyed using the system more now because he felt more successful.
With the assistance of experienced users, the program was finalized and rolled out to the rest of the field.
Adjusting for the Pharmacy
Currently, the majority of our exceptions are in the area of sales data, rather than pharmacy dispensing. Analysis of pharmacy data presents a unique set of challenges. From a loss prevention perspective, we wanted to create KPIs that showed us prescriptions that fell outside of the realm of normal pharmacy business. But what is “normal” in the pharmacy business? Each prescription has its own set of rules, based on third-party coverage, dispensing amounts, and dosage. Also, special programs and circumstances, such as government and state programs, as well as local promotions impact payment expectations.
A procedural issue in the pharmacy can cost a company millions, whether it is in shrink or loss of gross profit. In addition, unlike the front end, pharmacy inventory has nothing to do with the point of sale. Inventory is tracked through the replenishment system and is decremented when the pharmacist “QAs” (quality assures) the prescription. Our corporate and field users keep this difference in mind when dealing with the pharmacy side of the business.
Influencing Changes in Procedures
Another advantage of transaction analysis programs is the ability to influence the change in policies and procedures. Influencing this change is a two-fold process.
First, by being able to pinpoint exact methods that allow losses to occur, we can partner with the appropriate departments to initiate corrective action. Whether it is in the front end or pharmacy, sometimes our own store systems can create an opportunity for loss by not having the necessary restrictions, such as the requirement of approval codes. Through analysis of the data, we can show where the problems are occurring and how often.
Second, you need a superior case management program. The analysis of data only shows the existence of a possible issue. The cases recorded in a case management system based on that data are proof that the issue is now a problem. By combining the two, you can institute change.
A top-notch case management system can provide the much needed ability to pinpoint areas of opportunity, as well as case tracking. Our field users enter their cases upon discovery. They use our case management system, NaviCase, to enter their cases and update the file as the case progresses. We provide weekly and monthly reports, based on our users’ data entry, to various levels of management to track pending, as well as closed, cases.
Everything can be tracked, from the average life of a case to procedural issues that are causing a loss.
Also, by tracking how the case was discovered, you can easily extract the data to measure the success of your various programs. You can easily delineate what percentage of your cases was discovered via transactional analysis, employee tips, or other means.
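Measuring program success by discovery method amounts to a simple breakdown of closed cases. The sketch below assumes each case record carries a discovery-method label; the categories and counts are fabricated for illustration.

```python
# Illustrative sketch: what share of cases came from each discovery method,
# assuming every case records how it was found. Data is fabricated.

from collections import Counter

cases = (["transactional analysis"] * 14
         + ["employee tip"] * 4
         + ["other"] * 2)

def discovery_breakdown(case_sources):
    """Return each discovery method's share of total cases, as a percentage."""
    totals = Counter(case_sources)
    return {method: 100 * count / len(case_sources)
            for method, count in totals.items()}

print(discovery_breakdown(cases))
```

A report like this, run monthly against the case management data, makes it easy to show management which programs are actually producing cases.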
There is also a benefit to be found in case follow-up on procedural issues. For example, the data may suggest there is an issue, but the investigation could show that it was merely a procedural problem.
Sharing the Knowledge
Although our tools are primarily intended to be loss prevention tools, many other departments have benefited from them. Our internal assurance department uses NaviStor and NaviScript for research and investigations, as well as NaviCase for their case management. Pharmacy operations benefit from having access to NaviScript and RX Inventory. And accounting regularly uses the coupon KPIs as well as others. Although the transactional data is available in other places, our tools simplify the viewing and reporting of this data. Therefore, we frequently provide data to marketing and store operations. After all, we are all trying to accomplish the same goal.

A successful exception-reporting program is one that continually incorporates new data, queries, key performance indicators, exceptions, and, most importantly, new ideas. It needs to evolve, not only to stay a step ahead of those who are causing the loss, but to keep it fresh for the users. Because transaction-based reporting is completely based on data, it is easy for it to become mundane if you don’t keep it spicy.
If creating a transactional analysis program is your company’s goal, be sure to do your homework. Before you begin, outline what you are trying to accomplish, where you have resources, and where you need them.
Another important thing to keep in mind is that you can’t build a successful program in a vacuum. In other words, you can’t build a user-friendly system without involving the users. Talk to your users and let them help select the software package that is right for your company. Or, if you’re building a system in-house, bring in your users during the development stage to get their input.
Most importantly, make sure you take the time to find those unique angles that will make your data glow. This often involves a lot of work—rehashing data, redoing criteria, and sometimes even chucking out your initial ideas and starting over from scratch. But in the end, all of your time and effort will be worth it. Armed with your new transaction analysis program, you will soon find that standing next to that big hayfield is not such a daunting prospect after all.