Archives For Execution Management

This is a broad list of activities and management responsibilities for how analytics work gets done. It includes Governance, and the templates and processes used to scope and execute model development and integration projects. Note that I consider analytics work to consist largely of projects to develop and deploy models. For some companies, most analytics is an ongoing set of tasks to improve data and models. But since most work gets done in project teams, and model development is a discrete period of effort, I focus on the project paradigm for analytics.

Analytics executives should be reading Race Against the Machine (Brynjolfsson and McAfee, 2011).

I will quote from the book to raise the point that process re-engineering is critical to analytics return on investment.

“The most productive firms reinvented and reorganized decision rights, incentive systems, information flows, hiring systems, and other aspects of organizational capital to get the most from the technology… The intangible organizational assets are typically much harder to change, but they are also much more important to the success of the organization.”

This is partly why analytics needs to rise to the level of a corporate function, with staff-level executive leadership, so that it can move the organization to re-engineer itself for the technology.

Edward H. Vandenberg

IoT is your job. Data Science has always been greedy for complex data and has a pretty good handle on how to process it for insights and predictions.

For most of us and most projects, the practicality of getting that data into a model, and having it available to execute run-time algorithms, has been the barrier. IoT data is meaningless without algorithms to process it and provide information, predictions, and optimizations from it.

IoT is exciting and will change the fundamentals of businesses and industries.  The technology is interesting and very dynamic.  All of this has implications for your analytic operation and practice.

The more interesting and challenging future of IoT (and also part of your job): what are the new processes, user roles, use cases, management scenarios, and business cases for IoT? Who will manage the IoT function ‘X’ of the future, and what does that role look like?

The other important reason for you to pursue IoT is to keep your data scientists engaged and retained.  Many are still working on the same types of projects, methods, and tools that have been around for ten-plus years.  Every project is interesting and challenging, but some data scientists are getting bored.

This means research and discussions with your colleagues, sponsors and stakeholders (while you are still working in the pre-IoT world). Enjoy!

Edward H. Vandenberg

This is a challenging subject.  Here is a start:

1) Getting to failure mode quickly – analytics effort is too valuable to spend on pointless work, so the need to stop it quickly is great.  Every project has a failure mode, and the faster you can find it in an analytics project, the more mature the effort is.  Every part of the project has a failure mode. (Side note: how different this is from many ‘technical’ projects, especially IT projects.)

2) Not making the same mistake twice – look to Dr. John Elder of Elder Research for common mistakes made by talented people.  The least a mature team can achieve is not repeating mistakes in their work.

3) Catching your own mistakes – closely related, but creating a mechanism to catch common errors is its own level of maturity (see the sketch after this list).

4) Repeatability – mature efforts have a level of repeatability across challenging and varied projects.  This is a high mark.  The data and problem statement present unique problems that work against any repeatable effort or thought process.  TBD on whether this capability makes it into the CMM model for Advanced Analytics.
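To make point 3 concrete, here is a minimal sketch of such a mechanism: an automated audit that screens modeling inputs for a few well-known, repeatable mistakes before any fitting begins. The specific checks, names, and thresholds below are illustrative assumptions, not a prescribed checklist.

```python
import numpy as np
import pandas as pd

def audit_model_inputs(train: pd.DataFrame, test: pd.DataFrame, target: str) -> list:
    """Collect warnings for common, repeatable modeling mistakes.

    Assumes a numeric target column; the checks shown are examples only.
    """
    warnings = []

    # Mistake: test rows leaking into the training data.
    overlap = pd.merge(train, test, how="inner")
    if len(overlap) > 0:
        warnings.append(f"{len(overlap)} rows appear in both train and test")

    # Mistake: a feature that is a near-copy of the target (leakage).
    numeric = train.select_dtypes(include=[np.number]).drop(columns=[target], errors="ignore")
    for col in numeric.columns:
        corr = numeric[col].corr(train[target])
        if pd.notna(corr) and abs(corr) > 0.99:
            warnings.append(f"'{col}' is suspiciously correlated with the target ({corr:.3f})")

    # Mistake: silently modeling on mostly-missing columns.
    for col, frac in train.isna().mean().items():
        if frac > 0.5:
            warnings.append(f"'{col}' is {frac:.0%} missing")

    return warnings
```

A team that adds a check like these each time it catches a new class of error is building exactly the kind of maturity point 3 describes.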

I invite the reader to extend and revise.

Industrial Analytics

February 20, 2013

What it is.

When you have moved beyond R&D and have repeatability and assurance in developing and implementing models, you are industrializing.  This does not sound too attractive to many data scientists, but it is what executives are expecting.

Think of this as controls for risk, quality and productivity (i.e. costs).

If you can’t offer an industrial strength department, you will not get much attention from the executive staff.

How to industrialize will be the subject of future posts.

You cannot outsource your reference models to consultants without having the internal, or at least independent, ability to inspect and validate the model yourself.  The outside consultant has an incentive to make the model look better than it really is.  There are many layers of technical work that a consultant for hire (most, anyway) will approach differently than you would, including the rigorous sampling and cross-validation needed to ‘break’ the model.

A bad model is not always obvious, and you are committing a very high level of trust to an outside firm if they are doing all the work.  At a minimum, you should retain a holdout set of data to validate the model yourself, as sketched below.
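Here is a minimal sketch of that minimum: scoring a vendor-delivered model on a holdout set the consultant never saw. The file names, the scikit-learn-style predict_proba() interface, and the acceptance threshold are assumptions for illustration, not a reference to any particular engagement.

```python
import joblib
import pandas as pd
from sklearn.metrics import roc_auc_score

# Holdout data withheld from the consultant (hypothetical path and column).
holdout = pd.read_csv("holdout.csv")
X, y = holdout.drop(columns=["outcome"]), holdout["outcome"]

# The delivered artifact; a model you cannot load and inspect is a black box.
model = joblib.load("vendor_model.pkl")

# Score on data the vendor never touched and compare to their claimed metric.
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"Holdout AUC: {auc:.3f}")
if auc < 0.70:  # an acceptance threshold agreed up front, for illustration
    print("Model underperforms on unseen data; revisit before accepting delivery.")
```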

More broadly, outsourcing your model development relegates this activity to a commodity play.  And it may incidentally invite IT into the picture, as outsourcing is very much a part of their business model.

Net: have your own capabilities to cover the core science. Don’t outsource the entire project.  Let the consulting firm know up front that you will keep a holdout set.  Do not accept a ‘black box’ algorithm.  Do not give the consultant any data until you have agreed on the terms of a fixed price with contingency.  Do not go for value-based pricing and give away money needlessly.  Do not go for a subscription model unless you must host it.  The market currently favors buyers.

IT spent many years developing a quality process, methodology, and assessment metrics.  It is a credit to some forward-thinking folks that we have the CRISP-DM model as a standard methodology.  Should a CMM measurement be applied to model development?  It may not be possible to put that framework around a scientific effort.  But it would help distinguish qualified practitioners and consulting firms.  Perhaps an individual training certification?  But that ignores the inherently creative, unstructured thinking that characterizes many highly qualified data scientists.  In general, quality and maturity need a framework that is more technical and real than any I have seen.

Edward H. Vandenberg

Analytics is a Science

November 10, 2012

The recently applied nomenclature for advanced analytics creates a useful distinction and a challenge to businesses.  First, analytics is a science: essentially complex hypothesis discovery using quantitative theory and methods.  As a science, it naturally has its own processes, people, and tools, unlike other business activities. The challenge to business executives is: where does science fit in a non-scientific company?  Advanced analytics does not have a natural home in most companies.  That is leading to organizational incoherence when it comes to growing the analytics function beyond a point solution or a functional silo, in marketing or pricing for example.

Analytics executives must make the case that they manage a scientific function.  As such, it has its own organic management and structure.  It is not really part of another function.  If so, the question still stands: where does the advanced analytics function belong in the corporate structure?

There is a reasonable case, being made now by several large companies, that analytics should be its own strategic business unit, with leadership aligned up to a chief executive.  I suggest that more companies will establish the role of Chief Analytics Officer.

In another post, I will discuss where this level of leadership will come from, given the current level of maturity of this science.

Edward H. Vandenberg