Cover Story

Assembling the data-driven supply chain

Integrated, quality-controlled data boosts operational supply chain analytics.

by Bill Tobey

If analytics is the new competitive differentiator, then no field of business operations promises a more challenging contest of applied imagination than supply chain management. Visionary companies in many different industries are already deploying advanced supply chain analytics to gain an edge on their competitors. These innovative applications depart from conventional supply chain management metrics in several significant ways:
> Many are built directly into operational business systems as in-line decision support for front-line personnel, rather than as off-line, after-the-fact tools for management use alone.
> Reporting intervals, and the data refresh processes behind them, are moving from monthly or weekly batch cycles to intra-day and even intra-hour updates.
> They often address unprecedented levels of granular information, driving multiple-order-of-magnitude increases in the volumes of data being aggregated, integrated and managed.
> Many integrate analytical outputs with management objectives, providing critical context for front-line decision makers.

Few observers follow emerging supply chain innovations more closely than Blake Johnson, consulting assistant professor of management science and engineering at Stanford University. "These changes are taking place now because it's only recently that companies have had sufficient command of their data to collect and integrate it automatically, process it to generate key insights for managers, then propagate high-level guidance down through the organization to align front-line actions and decisions," Johnson explains.

Balancing performance and risk
Earlier this year, Johnson helped organize and moderate a special one-day meeting of the Stanford Global Supply Chain Forum, which explored the range and diversity of data-driven innovation in supply chain management. Presenters from different industries offered views of their supply chain challenges and the quantitative approaches they were exploring to increase operating efficiency, reduce costs and control risk exposure.

"What you see is a very common set of pain points across supply chains these days, driven by a combination of increasing complexity, greater rates of change and a relentless demand for speed and performance," Johnson says. "On one hand you have the requirements of lean operations and other efficiency initiatives; on the other you have the complexity and latency of global supply and distribution networks. There is a natural tension between performance and risk in supply chain operations that is a nearly universal management problem in large organizations today."

Risk is inherent in the fast pace of business change. Like every other aspect of business operations, the supply chain must adapt constantly to unforeseen developments, some driven by shifts in the marketplace and some by shifts in world events.

"Just recently I was talking to a high-tech company that was trying to understand the impact of the Chinese earthquake on their supply chain," Johnson says. "They were trying to find out what components they were sourcing in the affected area, and it was going to take them weeks or even months to figure out. That's not a problem you can solve just once, because even in the best of times supplier relationships come and go—sometimes suddenly. So everyone needs supplier and sourcing information that's timely, reliable, detailed and quickly accessible."

Another unavoidable source of risk is rapid change in demand levels—witness the plummeting sales of light trucks, vans and SUVs that automakers are experiencing as fuel prices explode off the charts. As organizations have moved to demand-focused supply chains, they've become much more serious about their demand data, and forecast analytics has become an area of intense development activity. The core issues are how to acquire relevant data about orders and other indicators of future demand, and how to feed those signals into supply chain planning and execution.

Many applications, one shared requirement
"Another thing that was evident from the Stanford Forum presentations was that most companies are really just tackling one piece of the puzzle," Johnson points out. "Some start with the manufacturing piece and some with the supply or demand pieces, so everyone wants to know what the other guy is doing."

On the manufacturing side, one presenter profiled a large-scale initiative to centralize and consolidate new and existing management analytics for fabrication, assembly and testing processes. In a massive data consolidation and integration effort, the firm has created an enterprise repository that supports business, manufacturing and sourcing applications with quality-controlled data from a single source.

On the demand management side, Cisco described a new demand planning process that uses statistical analysis of customer order data to generate accurate, detailed demand projections at the stock keeping unit (SKU) level. The resulting forecasts are then integrated with the organization's demand forecasting and supply planning processes, as well as with the enterprise systems that support them.

Other, more process-centric presentations revealed widespread use of fast, lightweight prototype development to quickly and inexpensively evaluate the return on investment (ROI) potential of new data analytics and supply chain management concepts. "The idea is to do something quickly and dynamically in response to specific issues and challenges as they evolve," Johnson says. "It may not be perfect or entirely robust, but it lets us see whether this is a piece of functionality that we should invest in developing as a more robust piece of software.

"This is where supply chain people get really excited, because this is replacing a process where you would scrounge up a little data, put it in a spreadsheet, and that was the end of the story," Johnson continues. "Then you'd wait until an enterprise application came along that did something vaguely similar. Today, people are getting access to an incredible range of data they can leverage quickly and easily with smaller-scale tools. From an IT perspective, there are some control issues, but there's also real excitement that there's a way to identify real business needs, and with the right policies in place, to respond to them in a much more rapid and cost-effective way."

One requirement for this dynamic development is a readily accessible source of accurate, timely and detailed data, integrated across functional and organizational supply chain domains. "If that data is in an enterprise data warehouse [EDW] and is quality-controlled through master data management [MDM], then we can easily feed enterprise-quality data into a range of smaller, lighter, lower-cost, easier-to-build applications," Johnson says. "So what we're seeing is an interesting evolution of those types of applications, and the organizational processes that go along with them."

Critical innovation in the data management infrastructure
In fact, data access has always limited the development of analytical management in the supply chain. Acquiring sourcing information across product lines, channels, companies and geographies on a detailed and timely basis was nearly impossible: the data was locked up in customer-facing systems that sourcing analysts couldn't reach, and the analysts often lacked the relationships needed to arrange extracts. Simply getting the data to support useful analysis was an almost insurmountable obstacle.

"So what we see is that innovation in supply chain analytics is closely linked with a high level of integration in the data management infrastructure," Johnson says. (See figure, above.) "Instead of applications and data isolated within operational domains, there's a clear trend toward data integration in an enterprise data warehouse, over a service-oriented application architecture, with data standardization and quality control based on master data management."

The EDW is widely employed to meet the conventional objectives of centralized data consolidation and fast access, but some applications are less conventional. For example, some manufacturers are shifting analytical workloads from their applications to the data warehouse platform itself, avoiding network congestion related to frequent, large data transfers between the data warehouse and an application.

"This is a radical departure from the traditional software model," Johnson says. "It's just a core assumption that the data stays in the database, the analytical functionality lives in the application, and the necessary data for analysis is passed back and forth. But when the data volumes become extremely large, it clogs all the pipes and the process breaks down. It's just not efficient to move that much data over the network to the application. Personally, I will be very interested to see how this trend develops over time. Will we build bigger pipes to move larger data volumes to the applications more easily and less disruptively? Will we move more functionality into places where the data resides?"

MDM is the other indispensable component of centralized data quality management. "Master data supports all transaction and decision processes," Johnson insists. "If there are quality problems in the master data, all of your analytic outputs will be suspect. Master data is the first rung on the ladder to advanced analytic capabilities.

"The companies that are driving analytical innovation in the supply chain have realized that data needs to be managed as a business asset, not as IT infrastructure. When legacy systems are retired or new ones developed, the standards established under master data management are invaluable in reducing implementation time, maintaining access to data, and accelerating change."

Managing supply chains in an unmanageable world
The impact of quantitative management techniques on supply chain efficiency and risk exposure can be quite impressive. "We used to fly blind through this complex, dynamic environment, with very little knowledge about current or near-future conditions," Johnson says. "Now we're gaining access to detailed data in a timely fashion. We're doing analytics on data that give us visibility into evolving conditions and guidance on the most efficient and profitable actions to take. It's a completely different way to run your business.

"I think Cisco is a powerful example of that impact," he continues. "In the past they've made headlines with some very high-profile inventory write-downs. Today they're applying very sophisticated analytics to very granular and dynamic forecast processes across their entire product portfolio. It's a change that takes supply chain planning and analysis down to the level where execution decisions are made. It guides the planning and execution processes, then flows the performance information that results into the organizations' higher-level management processes."

In starting out, be strategic
For companies just beginning to drive analytics into the supply chain, Johnson suggests some strategic discipline in selecting an entry point. "It would be great to be able to roll out an enterprise data warehouse and MDM end-to-end across the entire organization and have it done tomorrow," he says. "You need to be strategic, though, and be sure you start with the highest-impact points. Deploy there first and generate success, understanding and ROI. Then have a trajectory for broadening the deployment over time.

"There's just so much opportunity in the supply chain waiting to be tapped," Johnson adds. "I know a lot of supply chain people have always assumed they would never have more than a very rough cut at the data they needed. When they have reliable, accurate, detailed and timely data without a huge expenditure of customized work in every acquisition, the world looks very different to them. There is a lot of low-hanging fruit within easy reach." T

Freescale Semiconductor: Slicing cycles, boosting yield

During a presentation at the Stanford Global Supply Chain Forum, Mike Hackerott, an IT architect with Freescale Semiconductor, described large-scale data integration and analytical applications spanning a complex manufacturing enterprise.

Two critical metrics drive optimization efforts across Freescale's supply and manufacturing operations: cycle time and yield, the latter being the ratio of finished, tested product per input unit of raw material. Over a period of 15 years, the company deployed dozens of distributed data marts to track and report these measures along various segments of a manufacturing process that comprises hundreds of discrete operations. More recently, management opted to consolidate that information on a single Teradata/SAS environment to provide an integrated view of operations, support inter-departmental benchmarking and combine manufacturing with business information in a single reporting environment.
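For readers unfamiliar with the two metrics, the sketch below computes them per manufacturing lot: yield as good output per possible unit of input, and cycle time as elapsed days from lot start to finish. The lot records and field names are hypothetical, invented for the example rather than drawn from Freescale's systems.

# Minimal sketch of the two metrics, computed per lot. Records are hypothetical.
from datetime import datetime

lots = [
    {"lot": "A17", "wafers_in": 25, "good_die_out": 21_300, "die_per_wafer": 900,
     "start": datetime(2008, 5, 1, 8, 0), "finish": datetime(2008, 6, 12, 14, 30)},
]

for lot in lots:
    possible_die = lot["wafers_in"] * lot["die_per_wafer"]
    yield_pct = 100.0 * lot["good_die_out"] / possible_die          # output per input unit
    cycle_days = (lot["finish"] - lot["start"]).total_seconds() / 86_400
    print(f"lot {lot['lot']}: yield {yield_pct:.1f}%, cycle time {cycle_days:.1f} days")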

The new reporting environment is remarkable for the volume of data it integrates and the rate of inflow. Thousands of data points are produced for each chip, most of them generated by automated systems and equipment. Most reporting stations upload to the data warehouse at 15-minute intervals, and manufacturing operations run 24x7 worldwide. Net growth in warehouse data volume averages 20GB per day.

Integrated reporting and analysis have given Freescale managers a significantly better view of production flow and efficiency. "The ability to quantify all those thousands of incremental yield measurements has been extremely valuable in managing costs and increasing outputs," said Hackerott. "I can also tell you that replacing all those legacy systems with a single enterprise [data] warehouse was cost-effective in and of itself. The cost reduction was huge."

—B.T.

Cisco: Consensus forecasts blend analytics and insiders' insight

For many Cisco products, the supply chain is the production line. So much of the company's production is outsourced—85% to 90% annually—that its manufacturing organization was recently re-branded Cisco Global Supply Chain Management. The group's 9,000 employees support 30-plus business units and 250-plus product families, processing more than 250,000 orders quarterly. Brad Tallman, director of global demand planning, and Anne Robinson, senior manager of analytic forecasting and modeling, recently told an audience at the Stanford Global Supply Chain Forum how Cisco is using advanced analytics to fill a longstanding gap in its planning process: a unit-denominated demand forecast for every Cisco product.

Until recently, supply chain planning at Cisco began with dollar-focused outputs of the annual corporate financial plan. These were forced into unit forecasts in an error-prone process that "lacked statistical rigor." To improve discipline and accuracy, the company assembled a team of 13 analysts and charged it with developing a unit-based forecast for every Cisco product.

Using SAS as its primary analytical platform, the team created an automated process that applies various statistical models to a diverse set of inputs—historical demand data, cost, price, product hierarchy, sales theater, customer segment and life cycle attributes. In a monthly 24-hour production run, the new process generates item-denominated forecasts for more than 24,000 products, in weekly buckets, for a 24-month horizon. Validation and exception handling are accomplished in group reviews with representatives from finance and product marketing. The new process has given the Cisco supply chain organization a complete, unit-level demand forecast for a two-year planning horizon, a more transparent work process and noticeable reductions in forecast error and bias.
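Cisco's production process runs on SAS, but the general pattern is easy to sketch: fit a simple statistical model to each product's historical weekly demand, then project unit forecasts forward in weekly buckets. The minimal Python example below uses single exponential smoothing and invented demand series; it is an illustration of the idea, not Cisco's implementation.

# Minimal sketch of per-SKU forecasting in weekly buckets (not Cisco's SAS
# process). Demand series and smoothing constant are illustrative.
def exponential_smoothing_forecast(history, horizon_weeks, alpha=0.3):
    """Single exponential smoothing: the level tracks recent demand, and the
    forecast for every future week is the final smoothed level."""
    level = history[0]
    for demand in history[1:]:
        level = alpha * demand + (1 - alpha) * level
    return [round(level)] * horizon_weeks

weekly_orders = {                      # units ordered per week, per product
    "SKU-ROUTER-100": [120, 135, 128, 150, 140, 155, 149, 160],
    "SKU-SWITCH-48":  [60, 58, 65, 70, 66, 72, 75, 71],
}

horizon = 104                          # 24-month horizon, roughly 104 weekly buckets
forecasts = {sku: exponential_smoothing_forecast(hist, horizon)
             for sku, hist in weekly_orders.items()}

for sku, fc in forecasts.items():
    print(sku, "next 4 weeks:", fc[:4])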

—B.T.

Supply chain show and tell

The Stanford Global Supply Chain Management Forum is a research partnership between industry, the School of Engineering and the Graduate School of Business at Stanford University that advances the theory and practice of global supply chain management. Working with approximately 30 industrial organizations, the Stanford Forum engages a broad cross-section of leading and emerging industries to identify, document, research, develop and disseminate best practices in an increasingly global business environment.

On Feb. 6, 2008, a notably diverse group of management and IT professionals gathered at Stanford University to compare their experiences, aspirations and challenges in analytically optimized sourcing operations.

Unleashing the Business Value of the Data-driven Supply Chain was a one-day interactive forum co-hosted by the Stanford Global Supply Chain Forum and Teradata. Moderators Jin Whang and Blake Johnson assembled a slate of innovative organizations and individuals to document the business value, technology enablers and best practices that are emerging from data-driven management initiatives across a range of industries and supply chain domains.

Among the forum findings:
> Companies are increasingly leveraging execution-level data in operational applications and automated processes to increase decision accuracy, reduce costs and improve operating performance.
> The resulting data sets are very large and widely distributed, and they require integration and analysis to yield value. But the incremental business value that can be realized is also very large and widely available.
> Innovation in supply chain analytics is closely associated with a high level of integration in the data management infrastructure. A rough architectural standard is emerging based on very large-scale data integration in an enterprise data warehouse, a service-oriented application architecture, and rigorous quality control based on master data management.
> While most participating companies are conducting tightly targeted pilot programs and prototype developments, activity is widespread across industries and supply chain domains. Presenting companies included Cisco, Freescale Semiconductor, Harrah's, HP, Google, Intel and Capgemini, which represented a consortium of digital entertainment producers and distributors.

—B.T.

Bill Tobey, a senior technology writer based in Salt Lake City, covers the business applications of information technologies.

Teradata Magazine, September 2008
