Archives For Analytic Value Management

Value Management covers how your organization scores return on investment from exploratory analytics and modeling, including a periodic assessment of the value of the effort, categorized by where it applied across the value chain. This allows you to assess whether your program is aligned with corporate strategy and to communicate your value proposition broadly to stakeholders. This category also includes how you identify the business case at the project level, which must be a rigorous part of project assessment.

Why Decision Science?

November 11, 2016

Cognitive decisions are at the heart of monetizing advanced analytics through data science. They are the functional reason why advanced analytics models and related business rules are developed and implemented. Cognitive decisions include management decisions and insights as well as transactional decisions in service processes. The full value of analytics demands that decision analysis be in scope for every project. Without a new kind of decision process, there is little to no operational gain from applied analytics.

IoT is your job. Data Science has always been greedy for complex data and has a pretty good handle on how to process it for insights and predictions.

For most of us and most projects, the practicality of getting IoT data into a model, and having it available to execute run-time algorithms, has been the barrier. IoT data is meaningless without algorithms to process it and to provide information, predictions, and optimizations from it.

IoT is exciting and will change the fundamentals of businesses and industries.  The technology is interesting and very dynamic.  All of this has implications for your analytic operation and practice.

The more interesting and challenging future of IoT (and also part of your job): what are the new processes, user roles, use cases, management scenarios, and business cases for IoT? Who will manage the IoT function ‘X’ of the future, and what will that role look like?

The other important reason for you to pursue IoT is to keep your data scientists engaged and retained. Many are still working on the same types of projects, methods, and tools that have been around for ten-plus years. Not every project is interesting and challenging, and some of your data scientists are getting bored.

This means research and discussions with your colleagues, sponsors and stakeholders (while you are still working in the pre-IoT world). Enjoy!

Edward H Vandenberg

Your stakeholders lack a shared understanding of the methods and practice of advanced analytics.  You start out with a trust deficit when explaining how the mathematics will improve business results.

To build trust, start in advance by building an ordinary business relationship with operations management. Then share stories of analytics successes and how they were achieved (ideally those that you have directed). Then coach your stakeholders to interpret model results by simplifying the complex model validation process.

Gradually build an Arena of shared understanding for how models can help operations arrive at a better performance state. This is hard work, and not the stuff of algorithms and data, but it is almost as important.

Look at the Johari Window for expanding the Arena of trust.

Edward H. Vandenberg

You cannot articulate the business case for an analytics project unless you know something about the technology.  This is the fundamental disconnect between novice business executives and the Analytics Executive.  You must be able to bridge that gap so that excitement can be generated, but with the right level of expectations, and a credibility that garners confidence.

The Analytic Executive must know the science at a reasonable level. Fortunately, a novice level of knowledge is attainable. Most executives are analysts by nature, and data science for executives can be learned.

Analytics Annuity

January 15, 2013

The beauty of analytical models is that they create an annuity value. Unlike many other back-office activities of a firm, a powerful prediction model, built once, creates a stream of cost reductions or revenue enhancements out of thin air, year after year, with a bit of care and feeding. When determining funding priorities, your team’s expenses (people, tools, data) are part of a cash flow rather than a one-time expense. This may be a concept to share with finance, but it is a relevant concept to gain agreement on within the organization.
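The annuity framing above can be made concrete with a simple net-present-value calculation. This is an illustrative sketch only; the figures (build cost, annual savings, upkeep, discount rate) are hypothetical, not drawn from any real project.

```python
def model_npv(build_cost, annual_savings, annual_upkeep, years, rate):
    """Net present value of a model treated as an annuity:
    a one-time build cost followed by a discounted stream of
    net annual benefits (savings minus care-and-feeding costs)."""
    pv = -build_cost
    for t in range(1, years + 1):
        pv += (annual_savings - annual_upkeep) / (1 + rate) ** t
    return pv

# Hypothetical example: $300k to build, $500k/year savings,
# $50k/year maintenance, 5-year life, 8% discount rate.
value = model_npv(build_cost=300_000, annual_savings=500_000,
                  annual_upkeep=50_000, years=5, rate=0.08)
```

Even with modest assumed numbers, the discounted stream of recurring benefits dwarfs the one-time build cost, which is the point to make with finance.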

You cannot outsource your reference models to consultants without having the internal, or at least independent, ability to inspect and validate the model yourself. An outside consultant has an incentive to make the model look better than it really is, and there are many layers of technical work that most consultants for hire will approach differently, including rigorous sampling and cross-validation to ‘break’ the model.

A bad model is not always obvious and you are committing a very high level of trust to an outside firm if they are doing all the work.  At a minimum, you should retain a holdout set of data to validate the model yourself.
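The holdout idea above can be sketched in a few lines: set aside a random slice of your data before any of it goes to the vendor, then score the delivered model on that retained slice yourself. This is a minimal illustration with hypothetical function names, not a prescribed validation process.

```python
import random

def split_holdout(records, holdout_frac=0.2, seed=42):
    """Randomly set aside a holdout slice before sharing data
    with an outside firm. Returns (shared, retained)."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    shuffled = records[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - holdout_frac))
    return shuffled[:cut], shuffled[cut:]

def holdout_accuracy(model_predict, holdout):
    """Score the delivered model on data it has never seen.
    `holdout` is a list of (features, label) pairs."""
    hits = sum(1 for x, y in holdout if model_predict(x) == y)
    return hits / len(holdout)
```

Usage: call `split_holdout` once, hand only the `shared` portion to the consultant, and keep `retained` under your own control until their model is delivered for scoring.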

More broadly, outsourcing your model development relegates the activity to a commodity play. It may also, incidentally, invite IT into the picture, since outsourcing is very much a part of their business model.

Net: have your own capabilities to cover the core science. Don’t outsource the entire project. Let the consulting firm know up front that you will keep a holdout set. Do not accept a ‘black box’ algorithm. Do not give the consultant any data until you have agreed on the terms of a fixed price with contingency. Do not go for value-based pricing and give away money needlessly. Do not go for a subscription model unless you must host it. The market currently favors buyers.

IT spent many years developing a quality process, methodology, and assessment metrics. It is a credit to some forward-thinking folks that we have the CRISP-DM model as a standard methodology. Should a CMM measurement be applied to model development? It may not be possible to put that framework around a scientific effort, but it would help distinguish qualified practitioners and consulting firms. Perhaps an individual training certification? But that ignores the inherently creative, non-structured thinking that characterizes many highly qualified data scientists. In general, quality and maturity need a framework that is more technical and real than any I’ve seen.

Edward H. Vandenberg