Taming the Wild West of embedded analytics

Nearly 90 percent of execs who’ve dabbled in predictive analytics say the technology delivers positive ROI, according to a 2015 study by Forbes Insights. Yet only 13 percent of practitioners see their organizations as highly advanced in its use. And even among those proficient pioneers, the field is like the Wild West, says Michael Goul, professor of information systems and associate dean of research. That’s because it’s still relatively new, untamed and a little bit lawless.

But Goul and two colleagues — Professors Raghu Santanam and Robert St. Louis — are the self-appointed new sheriffs in town. As such, they’ve spent the past two years tracking best practices in analytics programs and mapping out a reference model for companies to follow.

Goul presents ongoing studies and progress reports to the Society for Information Management’s Advanced Practices Council, a group of CIOs who support and learn from the work of leading IT researchers. The reference model provides a framework and common language analytics professionals can use throughout the organization. That supports process standardization, something that’s not even close to being in place in today’s analytics wilderness.

A land of rugged individualists

Some organizations have centralized analytics teams working almost as a consulting agency to help other departments throughout the firm. Many, however, have a bunch of rugged individualists who’ve staked their claims and are now eking out a wide variety of department-specific, business-building code.

An article on analytics in marketing in the April 21, 2016 issue of Ad Age noted, “‘Math Men’ are replacing ‘Mad Men.’ CMOs face pressure to operate like numbers-driven CFOs. Marketing departments are hiring data scientists.”

Human resources teams are using analytics, too. An article published by the Society for Human Resource Management covered the great results Xerox Corp. enjoyed when it discovered that call center workers with relevant experience were no more successful than those without it. That vastly expanded the company’s candidate pool. Further research showed that creative types and those who belonged to multiple social networks tended to stick around longer. That information helped Xerox cut attrition by 20 percent and save a bundle on the company’s $5,000-per-employee training costs.
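At their core, attrition analytics like Xerox’s are classifiers: they score each candidate or employee on a handful of attributes and predict how likely that person is to leave. Below is a minimal, hypothetical sketch of such a scorer; the features echo the findings above, but the weights and function are illustrative inventions, not Xerox’s actual model.

```python
# Hypothetical attrition-risk scorer in the spirit of the Xerox example.
# Weights are illustrative only, not drawn from any real model.
import math

def attrition_risk(is_creative: bool, num_social_networks: int,
                   years_experience: float) -> float:
    """Return an attrition-risk score between 0 and 1."""
    # Creativity and belonging to multiple social networks reduce risk;
    # prior call-center experience gets zero weight, as Xerox found it
    # had no bearing on success.
    z = (0.5
         - 0.8 * is_creative
         - 0.3 * num_social_networks
         + 0.0 * years_experience)
    return 1 / (1 + math.exp(-z))  # logistic squashing into (0, 1)
```

A creative candidate active on three social networks scores a lower risk than a non-creative candidate with none, regardless of experience; in practice the weights would be fit to historical hiring and attrition data rather than set by hand.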

In the health care field, Montgomery, Alabama-based Baptist Health Care has been using predictive analytics to identify patients’ risk of complications from illness or surgery, driving enhanced patient care as well as improving the organization’s own revenue cycle and denials management.

“Companies are building all kinds of these analytics and they’re embedding them into their business processes all over the place,” Goul says. “The marketing team is working on them. The supply chain team is working on them. And the question for the overall company now is, ‘Are all these things in sync?’”

They weren’t for one cellphone company Goul has studied. “This company found out that they were spamming customers because every division was trying to interact with them. It was overloading customers with communication, so the company centralized their analytics functions to get control,” he explains.

Along with different departments engaged in analytics, organizations often have different technologies at work. “Some vendors are really good at supply chain analytics. Others are good at marketing analytics or other applications,” Goul notes. “You may wind up cobbling things together after you’ve made technology acquisitions in several different areas.”

He adds that the integration often results in a cumbersome kluge that might be at odds with one goal of predictive analytics: moving very quickly. “The push is on in the platform side to try to integrate better, but the push is on in the functional areas to embed more and more analytics and do it faster and faster,” Goul says.

Wrangling at warp speed

Goul says that analytics professionals work in a pressure cooker. They juggle mitigation of risk, an increasing number of projects, technology dispersion, skilled-worker shortages and “pressure to deploy as fast as they can.”

Consider the world of high-frequency stock trading. Right now, NASDAQ estimates approximately half of all market activity is executed by algorithmic traders. Back in 2007, the percentage was even higher — around 60 percent — and Information Week explained why everyone was trying to execute with more and more speed. According to an article it published at the time, “A 1-millisecond advantage in trading applications can be worth $100 million a year to a major brokerage firm.”

Given this need for speed, Goul has outlined the capabilities companies must have in place to actually execute their giddy-up analytics imperatives. It starts with what Goul and team call the “champion” capability. “Under ‘champion,’ there’s a set of enablers that you have to have,” he explains. “You have to have an analytics culture, the right platforms for building analytics and the right rewards for people.”

Speaking of people, getting the right ones might be easier said than done. Currently the tech job site DICE has more than 17,000 analytics job openings posted. According to a study by McKinsey & Company, “By 2018, the U.S. alone may face a 50-percent to 60-percent gap between supply and requisite demand of deep analytic talent.” Not surprisingly, staffing and resource commitment are among the enablers Goul outlines.

Also falling under the “champion” capability is “awareness.” As Goul explains, “You have to know what your competitors are up to, whether or not what you’re doing is current, where you are and where you want to be in terms of your maturity and how you’ll get from point A to point B.”

Another critical set of capabilities comes under the moniker “build,” and it involves assuring that you’ll be building on the proper IT infrastructure and policies. “Your IT capabilities have to make it all possible.”

Still, your investment is likely wasted unless the analytics infused in your applications are aligned with business strategy. Consequently, Goul’s next set of ingredients focuses on capabilities supporting alignment. “This refers to how you coordinate analytics to support the strategy of the business,” Goul notes.

As an example, he points to how the electric power sector is now using predictive analytics to match electricity supply with demand and coordinate all the different players now in the market by using what the industry calls “virtual power plants.” These resources track the capacity and ramp-up or ramp-down speeds of things like rooftop solar installations, customer-owned storage devices, electric vehicles and even loads — which are anything and everything that uses electricity — and then control all those things for the good of the grid.

How? The virtual power plant systems bump those aforementioned data points up against weather data, electricity demand forecasts and the real-time needs of the power system, such as its frequency and voltage levels, which must be precisely controlled. The result is a computerized, algorithm-driven way to produce the capacity the grid needs without fossil fuel-based generation plants. This fits in with most utility strategies in light of potential increased regulation on greenhouse gas emissions, not to mention public preference in an increasingly environmentally aware world. Now, there’s alignment.
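The core decision inside such a system can be sketched as a simple dispatch loop: given a forecast shortfall, commit distributed resources in order of how fast they can ramp. The resource names, capacities and figures below are hypothetical, chosen only to mirror the examples in the text, not drawn from any actual virtual power plant product.

```python
# Illustrative virtual-power-plant dispatch sketch. Given a forecast
# shortfall in kilowatts, commit distributed resources to cover it,
# fastest-ramping first so grid frequency is restored quickly.
# All resource names and numbers are hypothetical.

def dispatch(shortfall_kw, resources):
    """Return a list of (resource name, kW committed) covering the shortfall."""
    plan = []
    for r in sorted(resources, key=lambda r: r["ramp_seconds"]):
        if shortfall_kw <= 0:
            break  # demand is covered
        commit = min(r["capacity_kw"], shortfall_kw)
        plan.append((r["name"], commit))
        shortfall_kw -= commit
    return plan

fleet = [
    {"name": "customer battery", "capacity_kw": 50, "ramp_seconds": 1},
    {"name": "rooftop solar curtailment", "capacity_kw": 30, "ramp_seconds": 5},
    {"name": "EV charging deferral", "capacity_kw": 40, "ramp_seconds": 10},
]
print(dispatch(70, fleet))  # batteries first, then solar; EVs untouched
```

A production system would weigh weather forecasts, demand forecasts and real-time frequency and voltage readings rather than a single shortfall number, but the shape of the decision, ranking and committing resources against a target, is the same.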

Go West … or not

The last set of capabilities on Goul’s list comes under the header “decide.” He explains: “A lot of analytics are automated, and the decisions that are being made are automated, too. Does your senior management support all of the decision-making capability you’re embedding? Do you have the right skills and culture in place for that? Are your assumptions behind the decisions reasonable? Can the assumptions be tested? Are there more reasonable alternatives?”

Among other things, capabilities under the “decide” header dictate senior management’s role in making go or no-go decisions. This set of capabilities also deals with governance issues. Says Goul, “You have to have a set of policies in place for data accuracy. And then, of course, you need security and compliance activities” defined and followed.

Once all these capabilities are in place, organizations can keep their analytics current, relevant and effective by making sure those analytics support two sets of organizational activity. The first is “sense and respond,” by which Goul means how well organizations can detect changes or problems and quickly act to fix them. The second set of activity analytics should support is “predict and prescribe,” which Goul says relates to an organization’s ability to evaluate an embedded analytic and determine that it will provide value in the long run.

The team’s current reference model builds on a deployment methodology Goul created two years ago. He calls it “DEEPER,” an acronym representing the steps taken to “design” a campaign based on predictive models, “embed” the offer or analytics into business processes, “empower” employees to support the campaign, measure “performance,” “evaluate” results and “retarget” future efforts. This methodology was initially used to support analytics in marketing applications.

“Reference models are used in a lot of different places,” Goul says, adding that they’re intentionally technology agnostic and abstract so that they can be customizable. “What you’re doing is getting everybody to talk around the same points and solve the problems that you have. The goal is to improve your processes, get more standardization and keep value going.”
