I cannot sleep. It is hard to quiet an unquiet mind. Somewhere deep inside me is the need to write this post.
Yesterday, I spoke at the Eye for Transport conference on the Big Data opportunity in supply chain. (The slides are available on slideshare.) Before I start, let me give a preamble. I hate the term Big Data. It is overhyped and overused. As a result, it has lost meaning. I writhe in my seat when I hear it. It is even more painful when I hear others speak about it to drive commercial aspirations without a firm grounding in reality.
When I go to an analytics conference today, I feel like the dumbest person in the room. The rate of change is incredible. I firmly believe that supply chains in the next decade will look vastly different from those we see today. I think that we are poised for the third act of supply chain technologies. It is a new world of Best-of-Breed Technologies. The reason? Today, we are stuck. Nine out of ten companies are not making progress at the intersection of inventory turns and operating margins. Our supply chains are not able to meet the requirements of the business. Frustration abounds.
For the supply chain leader, the noise of change is deafening—rising complexity, demand volatility, new business models, commodity volatility, ethical supply chains, the collaborative economy—yet our processes and technologies are staid and unyielding. We are paralyzed by a history that has been transcribed into a canonical myth of “best practices.”
“Rubbish!” I say.
So, in summary, today we don’t have a big data problem. Instead, and more excitingly, I believe that we have a big data OPPORTUNITY! It is not driven by data volumes. Few companies that I work with have databases larger than a petabyte. Instead, it is driven by the opportunity to use new forms of data and to improve the speed of decision-making with increased data velocity.
The Real World
Yesterday, Abby Mayer and I presented an update in a webinar on the two-year research project, The Supply Chain Index. As I look at the progress of the companies that I have worked with over the last decade, the stories hop off the page. They are stories that I can never tell because of NDAs. My voice is caged by my promises to my customers.
In one peer group, there are two companies. One company is outperforming the other in very volatile times. The company that is pulling ahead was an early adopter of in-memory analytics for self-service by business users. This company can do “what-if” analysis and on-demand reporting. They implemented ERP early and stabilized the implementation. They are now working on the adoption of new technologies that many would call “Big Data.” The other company has implemented ERP three times, and badly. In this second company, it takes five days for a business leader to get a custom report. All of the custom reporting is delivered as a service through IT.
The first company sits on one of the largest databases of channel data in the consumer products industry. They can see channel data daily by item. The second company reads market data through syndicated sources. They can see only more aggregated data with a two-week latency. I believe that data matters. Every company that I work with that has invested in channel-data sensing tells me that it is a project that pays for itself in days, not weeks. Yet, the number one question that I get is, “What is the business proposition of building a demand signal repository?”
In the real world, we don’t wake every morning knowing what questions to ask. The markets are ever-changing. We see data, observe patterns and want to learn more. The business leaders at the second company are at a clear disadvantage. I can see it in their numbers. However, it is hard to package what I see into a nice, neat ROI package to delight a CFO.
When I think about the big data opportunity, it is not the world that we have today. Instead, it is about the advantages that we can garner in this next generation of technologies. For me, it is about data lakes, clouds and streams. The financial, insurance and e-commerce industries are leading. They are paving a way for manufacturers to rethink their processes. I want to break down the barriers for adoption. What do I mean?
- Clouds. In supply chain circles, data clouds get the most buzz. We have seen the impact, and the effects are far-reaching: the cloud enables new business models for B2B network providers, in-memory optimization and new forms of virtualization. Concurrent optimization allows us to paint outside of traditional Advanced Planning System (APS) frameworks. It is powerful; however, this is the concept that I find the least exciting when I think about the big data opportunity.
- Data Lakes. The ability to mine data in data lakes or pools to unearth new opportunities through new forms of analytics using nonrelational techniques is exciting. The mining of structured and unstructured data together ignites my thinking. Today, we do not have an average customer; yet, we broad-brush markets. Our abilities to sense outside-in and to listen to the market are too limited. For example, why are we not using Google search trends as a causal factor for forecasting? Or mining sentiment data for quality insights? The market data is there. We are not using it. Why? It does not fit into our paradigm. We are frozen in our thinking on inside-out processes that are powered by the relational database thinking of traditional enterprise applications. Our functional silos are struggling to get the technologies of yesteryear to work.
- Streams. This is the concept that I find the most exciting. Streaming data is pervasive in e-commerce and also core to the future of supply chain technologies. Why do I feel this way? Most companies are so busy recording transactions into nice, neat rows and columns that the streams are hidden. But, encased in the data, we have order streams, shipment streams, payment streams…. The list can go on and on. The evolution of big data techniques can allow us to sense changes in these streams quickly and change course. The same principle of streaming data from a temperature sensing RFID device can be applied to streaming data within the enterprise. The ability to write APIs to data streams and the possibilities for supply chain excite me.
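To make the stream idea concrete, here is a minimal sketch of sensing a shift in an order stream by flagging readings that stray from a rolling baseline. The numbers, the window size, and the threshold are all invented for illustration; this is a thought experiment, not any vendor's implementation.

```python
# A minimal sketch of stream sensing: flag any reading that deviates
# sharply from a rolling baseline of recent observations.
from collections import deque

def detect_shifts(stream, window=4, threshold=0.3):
    """Yield (index, value) wherever a value strays more than `threshold`
    (as a fraction) from the mean of the previous `window` observations."""
    history = deque(maxlen=window)
    for i, value in enumerate(stream):
        if len(history) == window:
            baseline = sum(history) / window
            if abs(value - baseline) / baseline > threshold:
                yield i, value
        history.append(value)

# Hypothetical daily order counts with one sudden spike.
orders = [100, 102, 98, 101, 99, 100, 150, 103]
print(list(detect_shifts(orders)))  # flags the spike: [(6, 150)]
```

The same few lines of logic apply whether the stream is orders, shipments, payments, or temperature readings from an RFID device; only the source of the data changes.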
I believe that we are attacking many business problems too narrowly. Let me give you an example that uses these three concepts. Does master data management give you a headache? We are hard-coding master data into relational systems; and as we do, we lose the context. What if we could place enterprise reference data into a data lake and assemble it through rules-based ontologies and cognitive learning?
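As a toy sketch of that what-if: instead of hard-coding a master key, records from different silos sit in the lake as-is and a declarative rule decides which ones describe the same entity. The records, field names, and the normalization rule below are all invented for illustration.

```python
# A toy sketch: assemble master data with a matching rule rather than
# a hard-coded relational key. All records and rules are invented.
records = [
    {"source": "erp", "sku": "A-100", "desc": "Widget, blue, 12oz"},
    {"source": "crm", "sku": "a100",  "desc": "Blue Widget 12 oz"},
]

def normalize(sku):
    """One example rule: strip punctuation and case before matching."""
    return sku.replace("-", "").upper()

# Group records that the rule says describe the same entity.
golden = {}
for rec in records:
    golden.setdefault(normalize(rec["sku"]), []).append(rec["source"])

print(golden)  # {'A100': ['erp', 'crm']}
```

A real rules-based ontology would carry many such rules, with the context of each source preserved rather than discarded at load time.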
What Are the Barriers?
The problem is us. There are three primary obstacles for the line-of-business leader:
- The first barrier is that we, as manufacturers and distributors, are cheap. The great minds working on big data opportunities are working in industries where investments in technologies are viewed as mandatory to drive innovation. They are not hampered by the need for a fixed ROI with a three-year payback proven by a two-year pilot.
- The second barrier is that we have to retool our minds to understand the opportunity. It requires a new language and embracing new concepts. It requires education. Spend time teaching your team about the new concepts. Here are some to start with: Hadoop, YARN, MapReduce, R, Nonrelational Data, Unstructured Data, Rules-Based Ontologies, Canonical Integration and Cognitive Learning. Business leaders need to spend time learning the concepts and brainstorming the use cases. This shift will not come from the IT department. They are too constrained with fixed budgets, mountains of requests, and traditional vendor interaction. It will also not come from discussions with the traditional vendors that you find pasted all over the airports. Instead, it is found in conferences that are primarily attended by e-commerce pure plays, home entertainment companies, financial, and insurance institutions. Form a small team to go learn about the new opportunity.
- The third barrier is how we think about technology. We are hard-wired to think about technology as a fixed project with a set of defined deliverables. As a result, we cannot be open to the outcome of what we can learn by testing and learning with new forms of analytics and techniques to seize the big data opportunity. The companies that are doing this well have cross-functional teams that are funded with innovation seed dollars to test new technologies against a business problem. However, there is a major difference. The projects are small and iterative. They are not the massive consulting projects of yesteryear. I love my podcast with Fran O’Sullivan from IBM that explained this very succinctly.
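For teams starting on the vocabulary above, one term worth demystifying is MapReduce. A toy, single-process sketch captures the pattern: map emits key/value pairs, a shuffle groups them by key, and reduce aggregates each group. The log lines are invented; this is the concept, not real Hadoop code.

```python
# A toy, single-process sketch of the MapReduce pattern.
from collections import defaultdict

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word in every line."""
    for line in lines:
        for word in line.split():
            yield word, 1

def reduce_phase(pairs):
    """Shuffle and reduce: group pairs by key and sum the counts."""
    grouped = defaultdict(int)
    for key, value in pairs:
        grouped[key] += value
    return dict(grouped)

# Hypothetical shipment-status log lines.
shipments = ["late ok ok", "late late ok"]
print(reduce_phase(map_phase(shipments)))  # {'late': 3, 'ok': 3}
```

The reason the pattern matters is that the map and reduce steps can be farmed out across hundreds of machines; frameworks like Hadoop handle that distribution so that the analyst writes only these two small functions.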
I will stop now. Room service is knocking. Coffee is in my future, and soon….
This morning, I will be presenting on the opportunity for social data in the supply chain, a subject that many of you know has been top of mind for many years. Hopefully, I will sleep tonight….
How Can We Help?
If you would like to better understand the Big Data Opportunity in Supply Chains, please join us in our quest to help others think differently. Join us in our research study on Big Data and join us at our Global Summit in Scottsdale, AZ on September 10-11, 2014. It is only 89 days away…
At this conference, a number of business leaders have asked for a private whiteboard brainstorming session: a session amongst peers to rethink the opportunity in supply chain analytics. I am looking forward to it. I hope that it is standing room only. I hope to see your face in the room….