Archives For Practice Management

Why Decision Science?

November 11, 2016

Cognitive decisions are at the heart of monetizing advanced analytics. They are the functional reason why advanced analytics models and related business rules are developed and implemented. Cognitive decisions include management decisions and insights as well as transactional decisions in service processes. Realizing the full value of analytics demands that decision analysis be in scope for every project. Without a new kind of decision process, there is little to no operational gain from applied analytics.

‘Working’ Analytics

December 11, 2015

‘Working’ Analytics: a useful term for building deployable models that solve problems with minimal cost and complexity. Almost by definition, ‘Big’ is not ‘working’ analytics; it is something else. When things get big, they get costly and complex. They become impractical to operationalize, much less to use in day-to-day operations. A foundational principle of data science that pre-dates ‘Big’ is parsimony, also known as Occam’s razor.

For data scientists: ask yourself whether you want to be a ‘working’ practitioner or a developer of complex, inexplicable and mostly unused solutions. You can certainly build complex solutions, but your job is to make them simple.

For employers, it is tempting to believe in ‘unicorns’: a wickedly complex algorithm that creates a discontinuous shift in your industry and crushes the competition for years to come. Instead, think about hiring people with the attitude and habit of contrarian thinking (e.g. putting a camera on a phone). Hire a blend of ‘working’ practitioners with a philosophy of parsimony and ‘explorers’ who will thrash data and models regardless of where it takes them.

There are many, many working problems to solve while you are looking for your unicorn.

On this subject, a useful (and challenging) concept from Oliver Wendell Holmes:

“I would not give a fig for the simplicity this side of complexity, but I would give my life for the simplicity on the other side of complexity.”

Edward H. Vandenberg

Your stakeholders lack a shared understanding of the methods and practice of advanced analytics.  You start out with a trust deficit when explaining how the mathematics will improve business results.

To build trust, start in advance by building an ordinary business relationship with operations management. Then share stories of analytics successes and how they were achieved (ideally ones you have directed). Then coach your stakeholders to interpret model results by simplifying the complex model-validation process.

Gradually build an Arena of shared understanding of how models can help operations reach a better performance state. This is hard work, and not the stuff of algorithms and data, but it is almost as important.

Look at the Johari Window for expanding the Arena of trust.

Edward H. Vandenberg

Share things that can (must) be shared

  • Specialized Talent
  • Infrastructure
  • Some datasets
  • Tools


Focus services to deliver on demand

  • Domain knowledge
  • Data and systems expertise
  • Capacity for high demand customers


Standardize things that will help deliver consistent quality

  • Methodology
  • Project Practices
  • Role Descriptions


Give synergy to the effort

  • Complementary skills and knowledge
  • Knowledge sharing and imagination
  • Contrarian viewpoints


Control Risks Formally

  • Skills definition
  • Independent Quality Review of the models and interim work products
  • Checks on conflicts of interest and influence
  • Management accountability (project level)
  • Sign-off and approval process
  • Ethical Standards


Develop Enterprise Assets

  • Reusable datasets
  • Documented models


An Enterprise Identity and Voice

  • Organizational voice
  • A place for people with unique skills to belong
  • Promote identity, value and scope of the work
  • Tell the story to the enterprise

You cannot outsource your reference models to consultants without the internal, or at least independent, ability to inspect and validate the model yourself. An outside consultant has an incentive to make the model look better than it really is. There are many layers of technical work that a consultant for hire (most, anyway) will approach differently than you would, including rigorous sampling and cross-validation intended to ‘break’ the model.

A bad model is not always obvious, and you are extending a very high level of trust to an outside firm if they are doing all the work. At a minimum, you should retain a holdout set of data to validate the model yourself.
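The holdout idea can be sketched in a few lines of Python. This is only an illustration, not a prescribed procedure; the function names and the 20% split are my own choices, and `model` stands in for whatever scoring artifact the consultant delivers.

```python
import random

def holdout_split(records, holdout_frac=0.2, seed=42):
    """Reserve a holdout set the consultant never sees.

    records: list of (features, label) pairs.
    Returns (train_set, holdout_set); only train_set leaves your hands.
    """
    rng = random.Random(seed)          # fixed seed so the split is reproducible
    shuffled = records[:]              # copy; don't disturb the caller's data
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - holdout_frac))
    return shuffled[:cut], shuffled[cut:]

def holdout_accuracy(model, holdout):
    """Score the delivered model on data it was never trained on."""
    hits = sum(1 for features, label in holdout if model(features) == label)
    return hits / len(holdout)

# Illustrative use: a toy dataset where the label is simply parity.
records = [(i, i % 2) for i in range(100)]
train_set, holdout_set = holdout_split(records)
# Hand train_set to the consultant; keep holdout_set.
accuracy = holdout_accuracy(lambda x: x % 2, holdout_set)
```

If the consultant’s reported performance is far above what you measure on your own holdout, that gap is the conversation to have before sign-off.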

More broadly, outsourcing your model development relegates this activity to a commodity play. And it may incidentally invite IT into the picture, since outsourcing is very much a part of their business model.

Net: have your own capability to cover the core science. Do not outsource the entire project. Let the consulting firm know up front that you will keep a holdout set. Do not accept a ‘black box’ algorithm. Do not give the consultant any data until you have agreed on the terms of a fixed price with contingency. Do not go for value-based pricing and give away money needlessly. Do not go for a subscription model unless you must host it. The market currently favors buyers.

IT spent many years developing quality processes, methodologies and assessment metrics. It is a credit to some forward-thinking folks that we have CRISP-DM as a standard methodology. Should a CMM-style maturity measurement be applied to model development? It may not be possible to put that framework around a scientific effort, but it would help distinguish qualified practitioners and consulting firms. Perhaps an individual training certification? But that ignores the inherently creative and unstructured thinking that characterizes many highly qualified data scientists. In general, quality and maturity need a framework that is more technical and real than any I have seen.

Edward H. Vandenberg