
Which Boat Do You Sail to Cross the Blue Ocean?

by Lora Cecere on August 6, 2012

In a recent post, Time to Paint Outside the Lines, I advocated that we needed to expand our current concepts of supply chain management. I challenged readers to rethink conventional processes and to think outside the lines, redefining them using the new capabilities of mobile, social, cloud-based computing and more advanced analytics. That post was about the "what." In a discussion with a client, I was challenged to write about a third dimension: the delivery of the services, or the "who." Here I share my insights.

When we start to paint outside the lines, we begin to enter the world of blue oceans.  By definition, a blue ocean is a new market that is uncontested. For the deliverer of services, it is a vast opportunity. Full of hope and promise, the deliverer of services is bullish and aggressive on how they will cross the blue ocean. For the user of technology, it is a situation fraught with indecision, risk and uncertainty.

Blurring of Lines

The lines are blurring on packaged application delivery. (When I use the term 'supply chain management,' I mean the broad definition: cash, inventory, information and product flows from the customer's customer to the supplier's supplier.) My goal is to help clients build the end-to-end (E2E) supply chain.

Mobility, cloud-based computing, advanced predictive analytics, and the Internet of Things offer us the ability to deliver new and improved solutions. By definition, Software as a Service (SaaS) applications open the door to this innovation, giving us new opportunities to deliver value in the areas of pervasive computing and analytics. The traditional software licensing model, always held back by the delivery of user-based enhancements, can now be untethered and cast free to deliver new applications through SaaS delivery. Market requirements are driving it: processes need to be designed outside-in, and horizontal business processes are needed to enable a level of agility that is not possible in today's organization.

As I attend conference after conference, it seems to me that everyone is talking the talk, but with one foot in the first phase and one foot in the next, trying to figure it all out. While Silicon Valley is still in a love-fest with social applications, I see companies slowly realizing that social for the sake of social is too limiting. It is about SO MUCH more than digital marketing. Likewise, it is not mobile for the sake of mobile; it is about pervasive computing and real-time information. There is also a growing recognition that it will not happen by sticking mobile and social data into the outdated models of CRM and SRM. These applications were defined too narrowly to sense and translate market signals into enterprise workflows. The delivery of services and products in these new, more pervasive models requires the redefinition of enterprise applications. The traditional definitions of Enterprise Resource Planning (ERP) and Advanced Planning Systems (APS) are slowly becoming legacy.

After the first two decades of digital marketing, companies are now starting to ask questions about digital business. They want to know how to transform their very transactionally focused enterprise applications into solutions that can sense and deliver a more agile response. They want to turn to Oracle and SAP, but these very sales-driven organizations are well tuned to deliver traditional solutions, not to help users cross these blue oceans. They would like to turn to the traditional supply chain planning vendors like JDA and Infor, but they find that these organizations are busy trying to harmonize and rationalize many acquisitions and have lost many of their thought leaders. Deep within their IT groups, companies may reach out to the conventional analytics vendors like Teradata and SAS, but they quickly find that these organizations are used to selling servers and analytical tools and lack a deeper understanding of enterprise application processes.

The Phases

As we progress, I feel that there are three phases. While we can argue about the names, please read past the labels to understand the broader points, and then let's engage in a discussion.

  • Phase I. The Efficient Organization.  The first phase of enterprise applications is ending. It is where companies have invested and know best. The focus was on transactional efficiency.  In this phase, the organization was defined from the inside-out and the order-to-cash cycle was automated. (In most cases, it was very rigid. The focus was on control.) Decision support was layered on top of the transactional systems to improve decision making using order and shipment data. This era is ending. Leaders now realize that the dream of ERP II and building the end-to-end supply chain on the back of ERP and B2B connectivity was too limiting.
  • Phase II. Digital Business.  The redefinition of processes outside-in from market to market is the phase that we are entering. It will be enabled by cloud-based computing, business-process outsourcing, and pervasive computing.  New forms of predictive analytics will enable listening (e.g., sentiment analysis and text analysis) to understand the questions that we do not know to ask, and systems will be able to adapt through horizontal process orchestration. This movement to listen, test and learn and bidirectional horizontal process management is just beginning. It is the new blue ocean. It is the era of digital business.
  • Phase III. Systems of Commerce for E2E Value Networks.  As systems evolve, companies will come to realize that there needs to be a greater focus on value-based outcomes and inter-enterprise systems of record to better manage bifurcated trade. This phase will no longer be about industry-specific applications. Instead, it will enable the process flows of end-to-end value networks. For example, in healthcare, the shift will move from efficient sickness (checking patients in and out of the hospital and lowering the admission rate) to sensing the body and focusing on health and wellness. Likewise, in transportation, the focus will shift from selling cars to safe transport, using sensors to guide vehicles with improved safety and lower carbon footprints. We are already seeing this shift in Performance-based Logistics (PBL) in the Department of Defense.

Users are confused. They want to know, "Which horse do we ride to cross the blue ocean?" Simply speaking, the Best-of-Breed Service Provider will be the best bet. Here are my predictions:

  • Consultants Will Stumble. As the gravy train of ERP implementations winds down, more and more consultants are attempting to build software. This includes traditional consulting partners like Accenture, IBM, Infosys and Wipro. I do not believe that they will be successful. The client model for consulting is just too strong. While they fundamentally understand the client relationship for the delivery of services, they lack an understanding of product marketing and product development. Of the four, IBM will do the best. They have a long history of building software, but they have struggled to market it and capitalize on its potential. While they will continue to have success in the areas of analytics, data mining and retail, they will struggle to penetrate the deeper areas of enterprise applications. I believe that each will have some initial success selling SaaS solutions, but will wake up within the year and realign their skills to contribute to the market in a greater ecosystem play (e.g., putting SAP solutions into cloud-based delivery systems). I think that they would be better served to combine business-process outsourcing with global centers of excellence targeting large business problems like the Race for Africa for consumer products or the Redefinition of the Cold Chain for biologic products.
  • Best-of-Breed Vendors Will Prevail. For me, the most exciting news is coming from the Best-of-Breed Providers. I am bullish about the opportunities for E2Open, Enterra Solutions, Llamasoft, Kinaxis, ModelN, Predictix, RedPrairie, Signal Demand, Steelwedge, Terra Technology, and SmartOps to bring industry-specific solutions with greater depth to the market. They will push the envelope on the delivery of industry-specific SaaS solutions, and they will do it faster with support from their clients. I also believe that vendors like Arkema, Aspen Tech, John Galt, and Logility will continue to gain mind-share with mid-market companies through industry-specific solutions. These solutions are becoming mainstream, helping to fill the gaps that the extended ERP solutions cannot fill due to cost and depth of solution.
  • Oracle and SAP Will Follow.  While Oracle and SAP will talk "blue ocean talk," internally they will struggle to "walk the walk." Neither has been successful at driving partnerships, and each is handicapped by a very strong sales-centered (as opposed to market-driven) model. I predict that consulting companies like Converge, Neoris, and Infosys will align with SAP to deliver on many of the blue ocean opportunities that are available, whether in mobility (through the SAP acquisition of Sybase) or in taking HANA into emerging markets. I think that they will be more nimble in their realignment than Accenture, IBM or Wipro. With market success, Oracle will follow. Salesforce.com will be relegated to improving sales efficiency, and Microsoft, despite having promising software, will continue to struggle to penetrate the enterprise software market.
  • Conglomerates Will Circle the Drain. The JDA and Infor models will continue to consolidate, and the solutions will progress, but slowly. They will continue to be a good fit for software evolution of existing implementations, but they will not be the horse to ride across the blue ocean.
  • Analytic Companies Will Be Best Supporting Actors. Greenplum, SAS, Teradata and IBM will continue to help with analytic applications, but they will bring up the rear. None of the analytic vendors really understands how to sell and market supply chain applications to line-of-business leaders.
  • Business Process Outsourcing Will Grow. The use of analytics and the evolution of business process outsourcing for multi-tier processing will continue to grow. Witness the work that Capgemini and Genpact are doing on retail deductions, or Accenture on consumer insights.

My Take

So, as we set our sails for new places and plan to navigate blue oceans, be sure that you are working with partners that can help you get there. Long term, it will take a village. Short term, it will be carried on the backs of best-of-breed providers. Sailing the waters of enterprise applications for supply chain management is always choppy, but it is time to look ahead.

I look forward to your thoughts. Anchors aweigh!

Big Data Supply Chains: Boosting your Vocabulary

by Lora Cecere on August 18, 2011

Earlier this week, I started a blog series on Big Data Supply Chains.  This is the second blog post in the series.

In my prior post, I argued that if we are going to build effective architectures from the customer's customer to the supplier's supplier, we need to embrace the concepts of Big Data Supply Chains. This includes using new types of data and exploiting the increasing power of computing.

I also believe that we need to up the ante and change the game. What do I mean? With these advancements in computing, we have a new opportunity to redefine the output, the goal and the cycle of supply chain technologies. We also have the opportunity to change the plumbing. The 1990s definition of integration is obsolete.

I believe that an architecture that combines Enterprise Resource Planning (ERP), Advanced Planning Solutions (APS) and Supply Chain Execution (SCE) systems plus Business Intelligence (BI) is not sufficient. Why? Today, supply chain architectures respond. In most cases, it is not even an intelligent response; it is a DUMB, SLOW and often INACCURATE response. Current technologies help us make better decisions either through the use of optimization in planning or through improved visibility of enterprise transactions.

The data is dirty. The latency of information is long. Most companies have invested in enterprise technologies on a project basis, and for most users, satisfaction is low. It should be no surprise that Excel is the number one planning application.

Today's technologies are primarily about supply. Deep solutions for demand are needed and represent an untapped opportunity. I believe that the future of supply chain technologies will define processes from the outside-in, based on a deep and comprehensive solution for demand: solutions that sense, shape and drive a profitable response bidirectionally from sell-side to buy-side markets.

If used correctly, I believe that the emerging technologies can drive a more intelligent response than we were able to achieve in the 1990s through optimization alone. Through the concepts of Big Data Supply Chains, we can harness the power of computing to help our supply chain networks not just respond, but dynamically sense, listen and learn. The more advanced companies will fine-tune their architectures to sense, listen, test, shape and drive continuous learning. It is the dawning of a more agile supply chain platform. Machine-to-machine learning can help our supply chains continuously learn.

New approaches are emerging, if we can be open to the outcome. It is a time to learn, unlearn and relearn. The other day, I was interviewing a VP of Supply Chain about the future of supply chain technologies. I asked him, "If you had a magic wand, how would you describe what the supply chain technologies of the future would look like?" His response: "Lora, I don't know. I am frustrated. I just know that what we have does not work very well. Somehow, we need to be able to have a more agile sensing platform. Our current architectures are too rigid and the response is too late." For reference, he works at a global company that is very advanced in supply chain thinking. They have 19 instances of SAP for ERP, have gone through five different Advanced Planning (APS) solutions, and have superlative systems for order management, warehouse management and transportation management. They were also early adopters of multi-tier inventory optimization and strategic modeling technologies.

If you buy my argument, it is time to retool and learn new jargon. There is a powerful opportunity for line-of-business leaders to lead and define the Art of the Possible for Big Data Supply Chains within their organizations. Here are new terms to know:

Big Data Supply Chains.  Each person that you talk to will define this differently. When it is used in a business context, ask what the user means. There is no standard definition, but in general it means a dataset that is too large and unwieldy for conventional relational database techniques of capture, storage, search, visualization and sharing. It is the world of terabytes, exabytes and zettabytes of data.

Columnar Store.  A type of database management system that stores information by column rather than by row. Columnar databases enable in-memory processing, column pruning and compression. They enable outrageous compression factors; it is not uncommon to compress a terabyte of traditional row-store data into tens of gigabytes. The advantage is the ability to aggregate similar data to increase computational speed. The SAP HANA architecture is an example of the advances being made in in-memory processing through columnar store architectures. It has advantages and disadvantages: I believe that SAP HANA will help us with visualization of large data sets, but it is far from a panacea to redefine supply chain architectures. IBM, too, provides columnar database capability to speed data warehouse queries. The IBM Smart Analytics Optimizer provides this capability with the DB2 relational DBMS on z/OS (mainframes), with related technology for Informix data warehouses (e.g., the Informix Warehouse Accelerator).
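The row-versus-column idea is easy to sketch in a few lines of plain Python. This is a toy illustration, not a real DBMS; the order records, field names, and run-length scheme are my own invention to show why column layouts compress so well: values within one column are similar, so repeats collapse.

```python
def to_columns(rows, fields):
    """Pivot row-oriented records into a column-per-field layout."""
    return {f: [r[f] for r in rows] for f in fields}

def run_length_encode(column):
    """Compress a column as (value, count) pairs -- effective when
    neighboring values repeat, as they often do within one column."""
    encoded = []
    for v in column:
        if encoded and encoded[-1][0] == v:
            encoded[-1] = (v, encoded[-1][1] + 1)
        else:
            encoded.append((v, 1))
    return encoded

# Hypothetical order records for illustration only.
orders = [
    {"region": "EMEA", "sku": "A1", "qty": 10},
    {"region": "EMEA", "sku": "A1", "qty": 12},
    {"region": "EMEA", "sku": "B7", "qty": 3},
    {"region": "APAC", "sku": "B7", "qty": 5},
]
cols = to_columns(orders, ["region", "sku", "qty"])
print(run_length_encode(cols["region"]))  # [('EMEA', 3), ('APAC', 1)]
```

Four region values shrink to two pairs; at warehouse scale, that same effect (plus dictionary encoding and the like) is where the dramatic compression factors come from.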

Fuzzy Logic.  A form of computer reasoning that is approximate, versus binary logic that is fixed and exact. It enables decision making that is not "black and white," where the best answer lies in the range between completely true and completely false. While optimization drove business intelligence in the 1990s, new forms of pattern matching and fuzzy logic will be combined with artificial intelligence to drive new ways to sense, act and respond. For an early solution in this area, check out Enterra Solutions.
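A tiny example makes the "range between true and false" concrete. This is my own toy membership function (not Enterra's method, and the thresholds are invented): instead of a binary "stock is low / not low," we grade the truth of "stock is low" on a 0-to-1 scale.

```python
def membership_low_stock(days_of_supply, low=5.0, high=20.0):
    """Fuzzy membership for 'stock is low': 1.0 means definitely low,
    0.0 means definitely not low, linear in between.
    Thresholds are illustrative assumptions, not industry standards."""
    if days_of_supply <= low:
        return 1.0
    if days_of_supply >= high:
        return 0.0
    return (high - days_of_supply) / (high - low)

for dos in (3, 12.5, 25):
    print(dos, "days of supply ->", membership_low_stock(dos))
```

A downstream rule can then act on the degree of truth (e.g., expedite replenishment in proportion to the membership value) rather than on a hard cutoff.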

Hadoop.  A framework designed to support data-intensive distributed applications across thousands of nodes and petabytes of data. It is often referred to as open-source Apache Hadoop and is developed by a global community using Java. Yahoo is the largest contributor. It is new and largely unproven for use by product manufacturers. IBM builds on Apache Hadoop with its InfoSphere BigInsights product to provide an analytic infrastructure for massively distributed data.

MapReduce.  MapReduce is the programming model at the core of Hadoop. Introduced by Google in 2004, this software framework uses the map and reduce functions common in functional programming to speed the processing of large data sets through distributed computing on clusters of computers. It makes the processing of distributed semi-structured data easier. There are few use cases for the supply chain yet, but Teradata's acquisition of Aster Data opens up new possibilities to combine MapReduce and SQL to solve big data supply chain problems.
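The shape of the model is easier to see in code than in prose. Below is the canonical word-count example in plain single-machine Python, a sketch of the map/shuffle/reduce phases only, not Hadoop itself (in a real cluster, each phase runs in parallel across many nodes):

```python
from collections import defaultdict

def map_phase(document):
    """Map: emit (key, value) pairs -- here, (word, 1) for every word."""
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    """Shuffle: group values by key, as the framework does between phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: combine each key's values -- here, summing the counts."""
    return {key: sum(values) for key, values in grouped.items()}

docs = ["demand demand supply", "supply chain"]
pairs = [p for d in docs for p in map_phase(d)]
print(reduce_phase(shuffle(pairs)))  # {'demand': 2, 'supply': 2, 'chain': 1}
```

Because map emits independent pairs and reduce only ever sees one key's values, both phases can be split across machines without coordination; that is the trick that lets the model scale to petabytes.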

Pattern Recognition.  Pattern recognition uses techniques like fuzzy logic to recognize data that resembles other data and to identify patterns in large data sets.

"R."  An open source programming language and environment for statistical computing and graphics. It has been widely adopted by statisticians for developing statistical software and for data analysis. R is not well suited for big data problems unless you like to write tons of code, and companies will be constrained by R's in-memory architecture. It has been widely adopted in bioinformatics but has yet to penetrate the larger analytics market. Still, the open source nature of R will enable data-centric processes.

Natural Language Processing.  Techniques for harnessing the power of unstructured, electronic text data in machine learning.
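The very first step of most NLP pipelines is turning free text into structured features a model can consume. Here is a minimal bag-of-words sketch using only the Python standard library; the call-center sentence is invented for illustration, and real systems layer stemming, entities and sentiment on top of this:

```python
import re
from collections import Counter

def bag_of_words(text):
    """Tokenize free text and count word frequencies -- the simplest
    bridge from unstructured text to machine-readable features."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return Counter(tokens)

# Hypothetical call-center log entry.
log = "Customer reports the shipment arrived late. Late delivery again."
features = bag_of_words(log)
print(features["late"])  # 2
```

Even this crude representation is enough for a classifier to start flagging, say, delivery complaints in a stream of service tickets.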

Ontology.  A rules-based approach for semantic association and category relations. We are seeing the use of rule-based ontologies in the evolution of sentiment data (SAS), supply chain execution (Enterra Solutions) and supply chain risk management (Dun & Bradstreet/Open Ratings).

Semi-structured Data.  A form of data which contains both structured and unstructured components. It does not conform to the formal structural definitions of relational database tables and data models, but may contain some defined fields, such as a subject line or date, in addition to free-format text, such as the body of an email.
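The email example from the definition can be shown directly: Python's standard-library parser separates the defined fields from the free-form body. The message itself is a made-up sample for illustration.

```python
from email.parser import Parser

# A hypothetical supply chain email: structured headers + unstructured body.
raw = """\
Subject: Late shipment on PO 4711
Date: Thu, 18 Aug 2011 09:30:00 -0500

Carrier reports the container is delayed at the port.
"""

msg = Parser().parsestr(raw)
print(msg["Subject"])      # the structured part: a defined field
print(msg.get_payload())   # the unstructured part: free-format text
```

The headers slot neatly into database columns; the body needs the natural language techniques above. That mix is exactly what makes semi-structured data awkward for purely relational tools.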

Unstructured Data.  A data set without a pre-set structure. Unstructured data abounds in call-center logs, social listening, contract, servicing and warranty data, and risk management applications. Early applications harnessing the power of unstructured data for the supply chain include Dun & Bradstreet's Open Ratings and SAS's Social Media Analytics application for social media listening.

Tomorrow, we will put these definitions to work in what I think could be new and exciting applications. Let me know what you think. Do you think that Big Data Supply Chains are the next rung on the ladder of evolution? Or do you think that this is a revolution that will make the current solutions obsolete?