Inside the analytics projects of five of the nation's best
Data is king in the enterprise, so it’s no wonder business analytics remains one of the top disruptive technologies today.
Analytics platforms give organizations insights so they can make better business predictions. Big data and analytics software is on a growth trajectory. The market is expected to grow at a five-year compound annual growth rate (CAGR) of 13.2 percent through 2022, according to research firm IDC, which is also forecasting worldwide revenues for analytics software will reach $189.1 billion this year.
Not only should organizations be taking stock of the data they are creating and capturing, but they should be applying “novel analytics” and developing unique data that can be monetized, advises the International Institute for Analytics in its report on Predictions and Priorities for 2019.
Drexel University’s LeBow College of Business recognizes organizations that do just that – demonstrate innovation in analytics. The Drexel LeBow Analytics 50 is a national recognition of industry analytics distinction where 50 companies are honored for their use of analytics to solve business challenges.
The judging panel comprises LeBow research faculty and industry practitioners. Nominations are judged on the complexity of the business challenge, the analytics solution implemented and the solution’s impact on the organization. Honorees are recognized at a biennial awards ceremony at Drexel University.
Here is a look at the analytics challenges and outcomes of five of the Drexel LeBow Analytics 50 winners for 2019.
CareFirst BlueCross BlueShield
The Federal Employee Program (FEP) actuarial team supported by CareFirst BlueCross BlueShield in Reston, Va., is tasked with pricing benefits modifications and new benefits for 36 BCBS plans around the country. This was becoming more of a challenge, however, because the FEP actuarial team had an out-of-date mainframe-based tool for pricing analysis that was not keeping up with their business agility and functionality needs.
The FEP Operations Center processes almost one million claims per day for 5.4 million subscribers and family members, says Len Rosenblum, senior director for FEP Products and Delivery. The team was increasingly finding that the tool was not adequate for complex pricing analysis.
The tool “had just run its course and didn’t have configurability to do what the actuaries were doing today,’’ such as the ability to change deductible amounts and quickly assess the impact based on real claims data, he says.
As a result, the actuarial team spent much of its time working around the system’s limitations, pulling information together from multiple sources and then manually combining the data to perform the required analysis.
In early 2018, the FEP Operations Center rolled out a new Benefit Pricing System (BPS) that enables actuaries to analyze benefit category usage from prior years’ data and evaluate the impact of changes to the copay, coinsurance, deductibles and catastrophic maximum limits.
The system has three main components. Data loading and summarization integrates the prior years’ claims data into Hadoop and summarizes it by benefit category. Data entry lets users enter cost share scenarios that apply copay, coinsurance, deductible and catastrophic maximums to each benefit category. Reporting and analytics lets users run reports directly against Hadoop that display selected metrics by multiple benefit category combinations under the chosen cost share scenario.
Each month, BPS processes and summarizes more than 500 million claims from the previous two years. Then, as reports are run, BPS simulates the claims adjudication process by applying business rules to the summarized claims on the fly, showing key metrics for the selected scenarios and benefit category combinations.
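In rough terms, that scenario-based adjudication can be sketched as follows. This is a minimal illustration only; the benefit category, cost-share fields and figures below are hypothetical, not CareFirst’s actual schema or business rules.

```python
# Minimal sketch of applying a cost-share scenario to claims that have
# already been summarized by benefit category. All field names and
# numbers are hypothetical -- not CareFirst's actual schema or rules.

def apply_scenario(summary, scenario):
    """Split allowed charges per benefit category into the member's
    cost share and the plan-paid amount under one pricing scenario."""
    results = {}
    for category, s in summary.items():
        rules = scenario.get(category, {})
        copay = rules.get("copay", 0.0) * s["claim_count"]
        after_copay = max(s["allowed_charges"] - copay, 0.0)
        deductible = min(rules.get("deductible", 0.0), after_copay)
        coinsured = (after_copay - deductible) * rules.get("coinsurance_rate", 0.0)
        member_share = copay + deductible + coinsured
        # Cap the member's out-of-pocket cost at the catastrophic maximum.
        cat_max = rules.get("catastrophic_max")
        if cat_max is not None:
            member_share = min(member_share, cat_max)
        results[category] = {
            "member_share": round(member_share, 2),
            "plan_paid": round(s["allowed_charges"] - member_share, 2),
        }
    return results

summary = {"skilled_nursing": {"claim_count": 100, "allowed_charges": 50_000.0}}
scenario = {"skilled_nursing": {"copay": 25.0, "deductible": 500.0,
                                "coinsurance_rate": 0.10,
                                "catastrophic_max": 6_000.0}}
print(apply_scenario(summary, scenario))
```

Under this shape, changing a deductible and re-running a report amounts to editing the scenario and re-applying it to the same summarized claims, which is what lets actuaries assess the impact of a change quickly.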
Now, actuaries can change and add benefits categories more dynamically and in a more granular and flexible way than in the past, making it easier to do analysis on them, Rosenblum says.
For example, coverage for skilled nursing facilities is a relatively new benefit “and in the past there was no grouping, so we created a new grouping for that,’’ he says. Although the actuaries could manually collect the costs to understand how to price that benefit, “now they can look at skilled nursing facilities claims in greater detail and in combination with other claim characteristics.”
He points to time savings as a significant ROI for the team. “They’re getting better quality data and it is more [refined] to what they want to see,” he says. The actuaries can also more easily import data into Excel.
Looking ahead, Rosenblum says the team is planning to add more benefit categories and provide additional data to continue to enhance the actuaries’ ability to assess multiple changes, such as product, deductible, catastrophic maximum and Medicare, within one analysis in support of their pricing responsibilities.
City of Raleigh, N.C.
Human resources data is highly sensitive, and the city of Raleigh, N.C., was using a patchwork of disparate methods to collect information for analysis in areas such as hiring. In some cases, data was not tracked at all, or was inaccessible or incomplete.
Data was hosted both in an on-premises PeopleSoft ERP system and in a government-specific recruiting system in the cloud, and both needed to be accessed securely, says Bonnie Danahy, enterprise data & productivity manager for the city’s information technology department.
“We like to track why people are leaving the city to improve ourselves,’’ she says. “We want to make sure we’re building the right data visualization and narrative” in an initiative through the city of Raleigh’s analytics program dubbed “CORA.”
To become a more data-driven organization and provide actionable insights, a tool was needed to help collect, aggregate and visually display data in a secure environment. The tool also needed to help staff make better decisions about staffing well ahead of posting an open position.
The city’s Enterprise Data Management (EDM) team conducted a pilot a couple of years ago evaluating five tools before selecting Microsoft Power BI as its primary analytics tool.
The team created an enterprise data warehouse that combines data extracted from its ERP and recruiting systems.
“By doing this and being able to apply analytics to the data, we were able to find issues with people, processes and tools,’’ Danahy says. For example, EDM team members found instances where both systems let them do things they shouldn’t -- like storing two ethnicities for a person when only one should be allowed, she says.
That meant reporting was inaccurate, she says. “Or, we’d discover something in a business process,’’ like an item someone wanted to analyze but there was no requirement to fill out a particular field. Then when the data was put on the dashboard “we knew there was a process problem.”
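Once data from both systems lands in one warehouse, a check of that kind is straightforward to automate. The sketch below is hypothetical -- the field names and records are invented for illustration -- but it shows the general shape of a “one value per person” rule check:

```python
# Hypothetical sketch of a cross-system data-quality check: flag records
# that violate a "single value per person" rule, such as two ethnicities
# stored for one employee. Field names and data are invented.
from collections import defaultdict

def find_multi_value_violations(records, key_field, value_field):
    """Return ids that carry more than one distinct value for a field
    the business rules say should be single-valued."""
    seen = defaultdict(set)
    for rec in records:
        seen[rec[key_field]].add(rec[value_field])
    return {k: sorted(v) for k, v in seen.items() if len(v) > 1}

records = [
    {"employee_id": 101, "ethnicity": "A"},
    {"employee_id": 101, "ethnicity": "B"},  # conflicting entry
    {"employee_id": 102, "ethnicity": "A"},
]
print(find_multi_value_violations(records, "employee_id", "ethnicity"))
# {101: ['A', 'B']}
```

Running checks like this as data is loaded is what surfaces both the data errors and the upstream process gaps Danahy describes.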
Whereas previously, staff had to use “a lot of manual processes to manufacture the same data sets, we were able to automate that,’’ Danahy says. Now, there is greater access to information like termination data associated with overall turnover rate and analysis of the reasons for short-tenure employees.
Users do not need to be PeopleSoft experts, she adds. Power BI is part of the Office 365 suite “so it has a very similar look and feel to the tools that [staff are] used to using every day, and that helps with adoption.”
The city now has a comprehensive picture of data including current vacancies, the number of referrals for each position, time to hire, and time to source. Staff also have visibility into diversity demographic analysis of applicants, referrals, and hires.
For current employees, there is now a more complete set of demographics in one place, including years of service, salary equity, and retirement eligibility.
Moving forward, the EDM team will be able to do predictive modeling to spot hiring trends. Danahy says IT will continue refining the analytics process and add more metrics to CORA to support HR business practices.
“We’ve rolled out Power BI [beyond] HR and we’re working on performance and operational metrics for almost every department,’’ she says.
FICO
The value of data and analytics lies in the insight and predictive foresight they provide. While FICO is often associated with financial credit scores, the analytics company does a whole lot more.
In 2016, FICO launched its first driver safety score. Despite advances in safety technology, human error accounts for 94 percent of crashes, per the National Highway Traffic Safety Administration.
So FICO created a predictive analytic model called the FICO Safe Driving Score, which uses telematics-based driving data to predict the likelihood of future driving incidents.
“We are a data analytics company, helping businesses make better decisions through data. FICO provides value by applying our analytic skills and domain expertise to new business problems, distilling vast amounts of data down into a meaningful and actionable score,” explains Can Arkali, senior director of analytics.
The company says the score provides a consistent and objective measure of driver risk and safety based on driving behavior characteristics including acceleration, braking, cornering, speeding and cellphone distraction.
The application is geared toward commercial fleets and aggregates anonymized data from a platform called Mentor, developed by FICO’s partner, eDriving. Drivers are then ranked and given a risk score based on their driving behaviors.
The higher the score, the more likely a driver is to display safe driving behaviors in the future. “The FICO Safe Driving Score was developed off of several hundred thousand drivers and millions of trips,’’ says Arkali. “We are continually enhancing the model through independent data validations and monitoring trends in driving behavior data collected through Mentor®” on a weekly basis from thousands of drivers, he says.
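FICO’s actual model is proprietary, so the sketch below is only a shape-of-the-idea illustration: hypothetical weights over the behavior categories the company names, squashed onto a bounded scale where a higher score means a safer driver. The weights, scale and formula are all assumptions for illustration.

```python
# Illustrative sketch only: FICO's actual model is proprietary. This shows
# the general shape of a telematics risk score -- weighted behavior event
# rates mapped onto a bounded scale where higher means safer.
import math

# Hypothetical penalty weights per event per 100 miles; not FICO's values.
WEIGHTS = {"hard_braking": 0.8, "hard_acceleration": 0.5,
           "hard_cornering": 0.6, "speeding": 1.0, "phone_distraction": 1.2}

def safe_driving_score(events_per_100mi, lo=300, hi=850):
    """Map weighted event rates to a score in [lo, hi]; fewer risky
    events per mile yields a higher (safer) score."""
    risk = sum(WEIGHTS[k] * events_per_100mi.get(k, 0.0) for k in WEIGHTS)
    # Squash unbounded risk into (0, 1], then scale so low risk -> high score.
    p_safe = math.exp(-0.1 * risk)
    return round(lo + (hi - lo) * p_safe)

calm = {"hard_braking": 0.5, "speeding": 0.2}
risky = {"hard_braking": 6, "speeding": 8, "phone_distraction": 4}
print(safe_driving_score(calm), safe_driving_score(risky))
```

In a real model the weights would be fit against outcome data (collisions), which is what the independent validations Arkali mentions are for.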
For commercial fleets, the goal is to reduce preventable accidents and the associated repair, downtime and liability costs and, most importantly, to ensure drivers arrive home safely every night. The application is used only with the driver’s consent, FICO says. The Mentor platform provides a “playlist” of short, interactive training videos customized for each driver to promote safe driving behaviors and reduce driving risk, he says. Drivers who complete the training consistently receive the highest scores (that is, the lowest risk), according to FICO.
Because it’s challenging to predict future collisions from telematics-based driving data alone, Arkali says that “The next step is to obtain and leverage as much contextual information as possible” from external factors such as road and weather conditions, traffic flow, or even a measure of the driver’s mood, which can strongly influence driving behavior.
The incorporation of this data will make the FICO Safe Driving Score a more complete solution for the commercial fleet market, he says. “In addition, continuous validations of the FICO Safe Driving Score against collisions will help introduce incremental changes and can make the model a valuable tool in both the commercial and personal insurance market.”
Marathon Petroleum Corporation
Increased energy efficiency is top of mind for Marathon Petroleum Corp. (MPC), which wanted to create valid fuel-use and emissions-reduction targets for the marine organization’s ocean-going fleet.
On the heels of a successful project that reduced fuel consumption and emissions in its “brown water” fleet (vessels that travel on rivers) between 10 percent and 20 percent, MPC wanted to apply what officials learned to its “blue water” vessels – which travel offshore, such as in the U.S. Gulf of Mexico.
In 2017, the fleet conducted a six-month analytics pilot to obtain historical speed, mileage, and fuel data for at-sea travel. Working with a third-party partner, the goal was “to slow down vessels to reduce speed and use less fuel and at the same time, make sure we were meeting our schedule of deliveries of fuel products or raw materials like crude oil,’’ explains MPC data scientist Melanie Clarke.
The data was used to establish baselines for fuel usage and typical speeds for three classes of vessels with varying efficiency levels. Collaborating with the captains and maintenance experts, MPC set economical operating conditions and speeds for each vessel class. For the six-month period, schedulers identified trips to run at economic speeds and gathered fuel consumption data for those trips.
A team of MPC analytics professionals then analyzed data including historical speed, fuel consumption and conditions.
“We did some analysis on vessels when they leave one port to find out how long it would take to get to the next port,’’ says Clarke. “We found there was a good portion of time they had to wait before they could get into the dock to load or unload cargo, so we knew we had time in our schedule to slow down and save fuel.”
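The underlying arithmetic is simple to illustrate. The sketch below uses made-up numbers and the common approximation that a ship’s propulsion fuel per trip scales roughly with the square of its speed; it is a hedged illustration of the slack analysis Clarke describes, not MPC’s actual model.

```python
# Rough sketch of the slack analysis described above, with made-up numbers.
# If a vessel historically waits at the dock anyway, that wait is slack
# that can be traded for a lower transit speed and less fuel burned.

def economic_speed(distance_nm, normal_speed_kn, avg_dock_wait_hr):
    """Speed that consumes the observed dock wait as extra transit time."""
    normal_transit_hr = distance_nm / normal_speed_kn
    return distance_nm / (normal_transit_hr + avg_dock_wait_hr)

def fuel_savings_fraction(normal_speed_kn, slow_speed_kn):
    """Fuel per trip scales roughly with speed squared (power rises with
    the cube of speed, but the slower trip also takes longer)."""
    return 1 - (slow_speed_kn / normal_speed_kn) ** 2

# Hypothetical leg: 600 nm normally run at 12 knots, with an average
# 10-hour wait at the destination dock.
v = economic_speed(distance_nm=600, normal_speed_kn=12, avg_dock_wait_hr=10)
print(round(v, 1), round(fuel_savings_fraction(12, v), 2))
```

On these invented numbers, absorbing the 10-hour wait lets the vessel steam at 10 knots instead of 12, cutting the leg’s fuel burn by roughly 30 percent under the square-law approximation, which is the kind of trade-off the schedulers were exploiting.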
The results were significant, and MPC implemented the economic speed as the norm for all of its ocean-going vessels. Since implementation, MPC’s ocean-going vessel fuel usage has decreased by 20% at sea, saving the company more than $2 million last year. The fuel optimization enabled MPC to save 1.18 million gallons of diesel fuel, which prevented 10,000 tons of greenhouse gas emissions in 2017, according to Travis Vollmar, MPC’s IT manager of data science and emerging technologies.
This project has inspired similar marine analytics efforts with third-party shipping providers. For MPC, the project’s success highlighted the potential of advanced analytics, resulting in analytic growth opportunities across the company.
“Enhancing overall efficiency was key to optimizing the supply chain. Focusing in on vessel speed and fuel usage helped to minimize bottlenecks, particularly with vessels stacking up at certain destination facilities,” says Vince Petrella, marine logistics and commercial manager at MPC. “Using analytics helped us prove we can achieve efficiency.”
Looking ahead, Petrella says he’d like to see where analytics can be used in other areas of the supply chain to “hone in and tweak other parts of the system to gain additional efficiency.”
The blue water project has opened up possibilities for other groups within the organization, says IT business analyst Lindsey McPherson.
“Something seemingly simple can lead to big changes and leads to other groups doing similar projects,’’ she says.
United Way Worldwide
The world’s largest privately funded nonprofit organization, United Way Worldwide conducts nearly a dozen annual and biennial worldwide studies to gauge performance and analyze outcomes in all aspects of its business, from fundraising activities to operational efficiency and the effective use of human capital.
“We collect information on the impact United Ways are having on communities … their investments in communities, the outcomes of those investments and what the United Ways are achieving, as well as strictly operational information,’’ says Lisa Wilder, research director.
Among the data collected is market research “to understand how the community sees us and how well we’re communicating our message and understanding what the community wants from United Way,” she adds.
But staff was finding that dissemination and uptake of the information from United Way studies to a network of 13,000 employees at 1,800 global membership franchises was challenging, at best. The performance data the organization was sharing was presented in static monthly reports, which were not conducive to helping people interpret, understand and immediately apply the findings.
“They all exist in their own ecosystem and if someone wanted information about us we had to give them spreadsheets to present a 360-degree picture,’’ Wilder explains. Staff wanted the ability to tie study results together in a simple, accessible way in the United Way network, “to tell our story and the story of the network and communicate in a simple way.”
After spending several months looking at different business intelligence data visualization vendors, in late 2017 United Way launched the PerformanceLink Portal, a self-service analytics application that allows tens of thousands of users, regardless of their technical abilities, to quickly visualize the data to reveal performance by population groupings for any dimension they want to see.
“The goal of this initiative was to democratize data and provide access to it’’ in a self-service model, says Lisa Bowman, chief marketing officer. That way, if a local United Way wants to implement a specific program or initiative, it has access to the information and the best practices, rather than having to ask the research group to dig through raw data and interpret it -- either correctly or incorrectly -- and then have them come back with additional questions, she says.
Users can generate scorecards depicting variables such as volunteer involvement, investor trends, impact offerings, or operating efficiency. And, they can view dashboards that highlight these metrics at a particular United Way affiliate or see graphs that depict how they are doing on a variety of dimensions relative to their peers.
With the portal in place, in June 2018, United Way rolled out Salesforce Philanthropy Cloud, a digital platform that allows people to become “citizen philanthropists,” says Bowman.
The data from Philanthropy Cloud will be housed on the portal and “we will create visualizations that can pull out the insights within the information,’’ she says. This will help local United Ways understand how they can best serve the donors in their community going forward.
Now, United Way is better equipped with the insights needed to serve 61 million people each year with targeted and relevant services. Detecting patterns in the data not only enables more targeted and successful fundraising initiatives, but also provides insight into how communities allocate funds to maintain and promote critical health, education and financial stability work.