Archives For Analytics Program Management

Program Management includes several critical management and work activities that you or your managers must be accountable for. Consult the Program Management Body of Knowledge for a detailed view of this level of organization and apply it to your enterprise analytics group.

This is a challenging subject.  Here is a start:

1) Getting to failure mode quickly – analytics effort is too valuable to waste on pointless work, and the need to stop doing it is great.  Every project, and every part of a project, has a failure mode; the faster you can find it in an analytics project, the more mature the effort is. (Side note: how different this is from many ‘technical’ projects, especially IT projects.)

2) Not making the same mistake twice – look to Dr. John Elder of Elder Research for common mistakes made by talented people.  The least a mature team can achieve is not repeating mistakes in its work.

3) Catching your own mistakes – closely related, but creating a mechanism to catch common errors is its own level of maturity.

4) Repeatability – mature efforts show a level of repeatability across challenging and varied projects.  This is a high mark: the data and the problem statement present unique problems that work against any repeatable effort or thought process.  TBD on whether this capability makes it into the CMM model for Advanced Analytics.

I invite the reader to extend and revise.

Industrial Analytics

February 20, 2013

What it is.

When you have moved beyond R&D and bring repeatability and assurance to developing and implementing models, you are industrializing.  This does not sound too attractive to many data scientists, but it is what executives are expecting.

Think of this as controls for risk, quality and productivity (i.e. costs).

If you can’t offer an industrial strength department, you will not get much attention from the executive staff.

How to industrialize will be the subject of future posts.

In my experience, most data scientists aspire to a career path that includes interesting work, valued by their organization, that makes a difference.  They mostly do not grow into managers or expand their scope of authority.  On the one hand, executives don’t necessarily need to worry about creating a full career growth path for these unique employees; moreover, there simply aren’t many management roles in this narrow operating area.

On the other hand, it can be problematic to fit these individuals into the overall human resources model.

Confirm with your own data scientists what their desires and career expectations are.  I submit that most of them do not want to manage, but do want a span of control over analytics work that counts.

There is a technical level of leadership you should honor within the team.  More experienced and seasoned scientists naturally have some control over more junior staff. That leadership is important, even if the technical leader does not formally manage his or her team members.

You cannot articulate the business case for an analytics project unless you know something about the technology.  This is the fundamental disconnect between novice business executives and the Analytics Executive.  You must be able to bridge that gap so that excitement can be generated, but with the right level of expectations, and a credibility that garners confidence.

The Analytics Executive must know the science at a reasonable level.  Fortunately, a novice level of knowledge is attainable.  Most executives are analysts by nature, and data science for executives can be learned.

Analytics Annuity

January 15, 2013

The beauty of analytical models is that they create an annuity value.  Unlike many other back office activities of a firm, a powerful prediction model, built once, creates a stream of cost reductions or revenue enhancements out of thin air, year after year, with a bit of care and feeding.  When determining funding priorities, treat your team’s expenses (people, tools, data) as part of a cash flow rather than a one-time expense. This may be a concept to share with finance, but it is a relevant concept to gain agreement on within the organization.
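As a back-of-the-envelope illustration of the annuity framing, the build cost, the recurring care-and-feeding cost, and the recurring benefit can be folded into a simple discounted cash flow.  All figures and names below are hypothetical, not benchmarks:

```python
# Hypothetical sketch: a model's "annuity" value as a net present value.
# One-time build cost, then a discounted stream of (benefit - upkeep)
# over the model's useful life. All numbers are illustrative assumptions.

def model_annuity_npv(build_cost, annual_benefit, annual_upkeep,
                      discount_rate, years):
    """Net present value of a predictive model treated as an annuity."""
    npv = -build_cost
    for year in range(1, years + 1):
        npv += (annual_benefit - annual_upkeep) / (1 + discount_rate) ** year
    return npv

# Example assumptions: $500k to build, $1.2M/yr benefit, $150k/yr
# care and feeding, 10% discount rate, 5-year useful life.
value = model_annuity_npv(500_000, 1_200_000, 150_000, 0.10, 5)
print(f"NPV: ${value:,.0f}")
```

Framed this way, the model reads to finance as a multi-year cash flow rather than a one-off project expense, which is exactly the agreement the annuity argument is after.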

Even mature firms may turn to outside consultants for special projects or to add capacity for high-priority models.  The marketplace is flooded with firms eager to engage, some experienced, some not. Get the firms to give you a mock business presentation with model results that they have actually delivered.  See if their presentation rings true and answers all the questions.  Make sure that they tried to ‘break’ their model; in other words, that they tested the sample dataset for sampling bias.
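One simple probe for the sampling bias mentioned above is to check whether the training sample’s category mix matches the population it claims to represent.  A minimal sketch, with hypothetical categories and counts:

```python
# Hypothetical sketch: compare a training sample's category proportions
# against the population it is meant to represent. A large gap on any
# category is a red flag for sampling bias.

from collections import Counter

def proportion_gaps(population, sample):
    """Absolute difference in category proportions between a
    population and a sample supposedly drawn from it."""
    pop_counts, samp_counts = Counter(population), Counter(sample)
    categories = set(pop_counts) | set(samp_counts)
    return {
        c: abs(pop_counts[c] / len(population) - samp_counts[c] / len(sample))
        for c in categories
    }

population = ["claim"] * 900 + ["no_claim"] * 100   # 10% positive rate
biased_sample = ["claim"] * 50 + ["no_claim"] * 50  # 50% positive rate
gaps = proportion_gaps(population, biased_sample)
print(gaps)  # both gaps are 0.4: the sample badly misstates the mix
```

A vendor who has genuinely tried to ‘break’ their model should be able to show you exactly this kind of comparison for their sample.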

If a firm asks to get your data, you must come to terms on the intellectual property.  If they are learning from your data because they have not done the work before, you should own all of the resulting work.  If they ask for data in a specified form, it is a sign that they have worked on this kind of problem statement before and bring some of their own IP to the table.  The deal you work out with them should recognize that, because the chances are better that they will deliver a better model in less time.

Bottom line: don’t let consulting firms learn from your data and then sell you back the learning, unless you will own all of that intellectual property.  That means the model algorithm, the input dataset and the data transformations.

If you are developing your business unit for ‘industrial’ scale, you will want to create an analytics business architecture.  It is primarily a way to communicate scope, plans, strategy, milestones etc. around the many aspects of advanced analytics that you should be addressing as you mature the unit.  This is a visual of the planning boxes that make up the current and future state of advanced analytics for your company.  It will also be a way to establish roles and responsibilities inside your business unit as it grows.

What is it?  It’s a diagram, on a single page that links ‘buckets’ of functions into a coherent and compelling picture.  In the spirit of open source, the category list of this blog represents a business architecture.  It’s a fully elaborated business unit thought process for parts of this science that need your attention and a way to educate others for activities outside of models and algorithms.

I invite you to elaborate on my version. This science is changing and growing significantly.  As such, what you need to plan for will change over time and be dynamically reshaped.

You cannot outsource your reference models to consultants without having the internal, or at least independent, ability to inspect and validate the model yourself.  The outside consultant has an incentive to make the model look better than it really is.  There are many layers of technical work that most consultants for hire will approach differently, including rigorous sampling and cross validation to ‘break’ the model.

A bad model is not always obvious, and you are extending a very high level of trust to an outside firm if they are doing all the work.  At a minimum, you should retain a holdout set of data to validate the model yourself.
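A minimal version of that holdout check, withholding a slice of data the consultant never sees and then scoring their delivered model on it yourself, might look like the following.  The model interface, the split fraction, and the toy data are all assumptions for illustration:

```python
# Hypothetical sketch of retaining a holdout set and independently
# scoring a delivered model on it. The .predict interface and the toy
# records are assumptions, not a specific vendor's deliverable.

import random

def split_holdout(records, holdout_fraction=0.2, seed=42):
    """Shuffle and split records; the holdout slice is never shared."""
    shuffled = records[:]
    random.Random(seed).shuffle(shuffled)
    cut = int(len(shuffled) * (1 - holdout_fraction))
    return shuffled[:cut], shuffled[cut:]

def holdout_accuracy(model, holdout):
    """Score the delivered model on data it has never seen."""
    correct = sum(1 for features, label in holdout
                  if model.predict(features) == label)
    return correct / len(holdout)

class AlwaysPositive:
    """Stand-in for a consultant's delivered model."""
    def predict(self, features):
        return 1

records = [((i,), i % 2) for i in range(100)]  # toy data, half positives
shared, holdout = split_holdout(records)       # only `shared` leaves the building
acc = holdout_accuracy(AlwaysPositive(), holdout)
print(f"holdout accuracy: {acc:.2f}")
```

The key discipline is in the split, not the scoring: the holdout records never leave your control, so the consultant cannot tune, even inadvertently, against them.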

More broadly, outsourcing your model development relegates the activity to a commodity play.  And it may incidentally invite IT into the picture, as outsourcing is very much a part of their business model.

Net: have your own capabilities to cover the core science. Don’t outsource the entire project.  Let the consulting firm know up front that you will keep a holdout set.  Do not accept a ‘black box’ algorithm.  Do not give the consultant any data until you have agreed on the terms of a fixed price with contingency.  Do not go for value-based pricing and give away money needlessly.  Do not go for a subscription model unless you must host it.  The market currently favors buyers.

IT spent many years developing quality processes, methodologies and assessment metrics.  It is a credit to some forward-thinking folks that we have the CRISP-DM model for a standard methodology.  Should a CMM measurement be applied to model development?  It may not be possible to put that framework around a scientific effort, but it would help distinguish qualified practitioners and consulting firms.  Perhaps an individual training certification?  But that ignores the inherently creative, unstructured thinking that characterizes many highly qualified data scientists.  In general, quality and maturity need a framework that is more technical and real than any I’ve seen.

Edward H. Vandenberg

To some senior executives, analytics is mistaken for an IT function. This can lead to a misalignment of the analytics function or business unit. The alignment question is critical to getting work done, hiring people and communicating to internal stakeholders. Truthfully, most IT executives also probably think analytics is another IT service (or would like it to be). Analytics is a science; it does not fit into the IT business model and will likely never perform well within IT. But clearly technology is critical to data science. I propose that IT establish a special service practice, organized and staffed specifically to enable the analytics function. If not that model, then analytics should have its own technology staff, reporting up to the analytics executive.  Either way, just as the organization overall needs to mobilize and re-engineer itself to fully exploit analytics, IT must step up to its critical but supporting role for analytics.