Archives For Analytics Program Management

Program Management includes several critical management and work activities that you or your managers must be accountable for. Consult the Program Management Body of Knowledge for a detailed view of this level of organization and apply it to your enterprise analytics group.

EMC IT Proven

By Dr. Lena Tenenboim-Chekina, Senior Data Scientist, EMC IT

Smart data visualization is proving to be an essential tool in maintaining increasingly complex Big Data systems in the cloud.

The adoption of Big Data tools and technology relies heavily on distributed, scaled-out computing. One of the main differences in this setting is that the system operates as a whole on top of several independent hosts. These hosts coordinate their actions with limited information, and as a result maintenance complexity increases significantly. One way to overcome this challenge is smart data visualization, which helps IT experts and management pinpoint the source of problems quickly.
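To make the idea concrete, here is a minimal sketch of the data prep behind such a visualization: given a per-host metric, flag the host that deviates most from the cluster, the kind of signal a heat map or bar chart would surface at a glance. The host names, the error-rate metric, and the numbers are all made up for illustration.

```python
# Sketch (hypothetical data): flag the host whose error rate deviates
# most from the rest of the cluster -- the signal a per-host
# visualization would make obvious at a glance.
from statistics import mean, stdev

# error rates observed per host over the same window (made-up numbers)
error_rates = {
    "host-01": 0.011, "host-02": 0.013, "host-03": 0.092,
    "host-04": 0.010, "host-05": 0.012,
}

mu = mean(error_rates.values())
sigma = stdev(error_rates.values())

# z-score each host; anything far above the cluster mean is a candidate
z_scores = {h: (r - mu) / sigma for h, r in error_rates.items()}
suspect = max(z_scores, key=z_scores.get)
print(suspect)  # host-03 stands out
```

In practice the same per-host scores would feed a chart rather than a print statement; the point is that the aggregation, not the plotting library, is what lets an operator pinpoint the misbehaving host quickly.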

The need for smart visualization is not unique to this problem. Representing complex data as a concise picture which tells decision-makers a story is a key part of any data analytics or data science project. Valuable results of a rigorous analysis may…


Who has it in your org today? Will they listen to the data science?

Your stakeholders lack a shared understanding of the methods and practice of advanced analytics.  You start out with a trust deficit when explaining how the mathematics will improve business results.

To build trust, start in advance by building an ordinary business relationship with operations management. Next, share stories of analytics successes and how they were achieved (ideally ones you have directed). Then coach your stakeholders to interpret model results by simplifying the complex model validation process.

Gradually build an Arena of shared understanding for how models can help operations arrive at a better performance state. This is hard work, and not the stuff of algorithms and data, but it is almost as important.

Look at the Johari Window for expanding the Arena of trust.

Edward H. Vandenberg

Don’t expect the new VP of Advanced Analytics to do it all. When processes and decision-making have to change to exploit insights from data, the line managers, with executive sponsorship, must carry the weight. Without that, you have models that work on paper but aren’t used in operations. It’s tempting to blame the analytics leader, but that is misplaced. Operations managers often resist doing things in a new way, despite the math telling them otherwise. And the VP of Analytics likely has no power to change that, short of appealing to the executive staff, which is not a popular move for an analytics lead who must also evangelize for new projects and problems to solve.

Also, commanding a small specialized team is not normally seen as a position of clout, despite the title and sponsorship.  Managers of large operations (financial and headcount) naturally have more influence in most organizations, even if they have a peer title to the advanced analytics leader.

In defense of the operations-level manager, they are rightfully reluctant to be accountable for mathematics that is probabilistic and hard to understand. They are also, understandably, protective of the operational influence that analytics can have within their business units. This creates tension that stalls analytics ROI.

What’s the answer? For analytics to truly be exploited, operations management must step up: understand the science more, be ready to believe in it, and lead their operations to adopt it. That means they are hired for it, trained for it, and managed for it. How an organization mobilizes for that falls under the leadership of a Chief Analytics Officer and a full program management approach to advanced analytics. Deploying advanced analytics must be seen as the path to promotion for career operations managers.

Secondly, the advanced analytics effort must include Test and Learn experiments for every model pilot, to help prove the in-use value of models beyond validation on historical data. This is a natural extension of the model development work.
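One common shape for such a Test and Learn experiment is a pilot-versus-control comparison. The sketch below is illustrative only: the group sizes and win counts are invented, and a two-proportion z-test is just one reasonable way to judge whether a pilot's lift over business-as-usual is more than noise.

```python
# Minimal test-and-learn sketch (illustrative numbers): compare a
# model-guided pilot group against a business-as-usual control group
# with a two-proportion z-test.
from math import sqrt
from statistics import NormalDist

pilot_n, pilot_wins = 2000, 260      # model-guided decisions
control_n, control_wins = 2000, 210  # business as usual

p1, p2 = pilot_wins / pilot_n, control_wins / control_n
p_pool = (pilot_wins + control_wins) / (pilot_n + control_n)
se = sqrt(p_pool * (1 - p_pool) * (1 / pilot_n + 1 / control_n))
z = (p1 - p2) / se
# one-sided p-value: chance of a lift this large if the model added nothing
p_value = 1 - NormalDist().cdf(z)
print(round(z, 2), round(p_value, 4))
```

A result like this, from live operations rather than historical backtests, is what earns the model credibility with the operations managers who must ultimately adopt it.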

Cognitive Decision Science must be part of your modeling solutions. Understanding how people make decisions and what their typical biases are is critical to developing decision support models.

Daniel Kahneman’s “Thinking, Fast and Slow” will provide some guidance on this subject.

The types of decisions follow the data types, each with its own defect modes. The best models inoculate the business against the decision bias that is always present when people make decisions. If you are working in a ‘flow’ transaction environment, making good decisions consistently and at speed simply does not happen without data analytics support.

Share things that can (must) be shared

  • Specialized Talent
  • Infrastructure
  • Some datasets
  • Tools

 

Focus services to deliver on demand

  • Domain knowledge
  • Data and systems expertise
  • Capacity for high demand customers

 

Standardize things that will help deliver consistent quality

  • Methodology
  • Project Practices
  • Role Descriptions

 

Give synergy to the effort

  • Complementary skills and knowledge
  • Knowledge sharing and imagination
  • Contrarian viewpoints

 

Control Risks Formally

  • Skills definition
  • Independent Quality Review of the models and interim work products
  • Checks on conflict of interests and influence
  • Management accountability (project level)
  • Sign-off and approval process
  • Ethical Standards

 

Develop Enterprise Assets

  • Reusable datasets
  • Documented models

 

An Enterprise Identity and Voice

  • Organizational voice
  • A place for people with unique skills to belong
  • Promote identity, value and scope of the work
  • Tell the story to the enterprise

As the leader of an analytics business unit, the scope of the methods and projects you should plan to deliver include:

  • Supervised and unsupervised modeling
  • Operations Research/Optimization
  • Design of Experiment
  • Statistical Quality Control
  • Simulation
  • Forecasting
  • Text Mining (flow verbatim and text corpus)
  • Link Analysis
  • Big Data techniques (MapReduce/Hadoop)
  • Heuristics (complex business rules)
  • Process Mining
  • Cognitive Decision Analysis
  • Visualizations (Mental Modeling)
  • Interpretation of dense signals (voice and image)
  • Interpretation of flow data (click streams, verbatim, dialogues and diaries)
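To give one item from this list some texture, the MapReduce pattern behind the Big Data line can be shown in miniature with the classic word-count example, expressed in plain Python rather than on a Hadoop cluster; the two-function shape, not the toy data, is the point.

```python
# Toy illustration of the MapReduce pattern: word count as an explicit
# map step (emit key/value pairs) and reduce step (aggregate per key),
# in plain Python -- no cluster required to see the computation's shape.
from collections import defaultdict

def map_phase(docs):
    """The 'map' step: emit a (word, 1) pair for every word seen."""
    for doc in docs:
        for word in doc.lower().split():
            yield word, 1

def reduce_phase(pairs):
    """The 'reduce' step: sum the emitted counts per key."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["big data big models", "big decisions"]
counts = reduce_phase(map_phase(docs))
print(counts["big"])  # 3
```

On a real cluster the framework shards the map work across hosts and shuffles pairs by key before the reduce step, but the programmer's contract is exactly these two functions.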

You will have to work to uncover projects and understand their value propositions. Even more challenging, you will have to explain why your operations managers need these types of analytics to improve their operating results.

Lastly, you will need to bring the talent, tools and processes together to perform this type of work for your organization.  An exciting and challenging prospect.

Credited to R blogger Drew Conway

A great graphic depicting the interesting mix of qualities in data scientists.

via The Data Science Venn Diagram.

Thanks to Joe Baird for this insight on the fit between traditional CIOs and the advanced analytics function. I would extend the argument further. Today’s CIO and IT organizations are not positioned to build, run, and promote advanced analytics inside their organizations. There are synergies between IT and advanced analytics at the tactical level, but at the leadership level most CIOs don’t know the territory well, and their experience is not aligned with data science. Analytics is not technology; it is technical. It is convenient to refer to analytics as a science to help make this distinction. When considered this way, most would agree that IT is not aligned with a scientific effort.

The Chief Insight Officer, suggested by Baird, is a leader hired by senior management to drive analytics forward.

CIOs and their management will not want to be left out of the excitement and value proposition proffered by advanced analytics. It will be a challenge for IT executives and managers to ‘bolt on’ much insight and knowledge after careers and experience dedicated to the traditional IT function.

Recommended Reading

The CIO in the Age of Analytics: From Infrastructure to Insight | Joe Baird.

This is a challenging subject.  Here is a start:

1) Getting to failure mode quickly – analytics effort is too valuable and the need to stop doing pointless work is great.  Every project has a failure mode and the faster you can find it in an analytics project the more mature the effort is.  Every part of the project has a failure mode. (Side note: how different this is from many ‘technical’ projects, especially IT projects).

2) Not making the same mistake twice – look to Dr. John Elder (Elder Research) for common mistakes made by talented people. The least a mature team can achieve is not repeating mistakes in its work.

3) Catching your own mistakes – closely related, but creating a mechanism to catch common errors is its own level of maturity.

4) Repeatability – mature efforts have a level of repeatability across challenging and varied projects.  This is a high mark.  The data and problem statement present unique problems that work against any repeatable effort or thought process.  TBD on whether this capability makes it into the CMM model for Advanced Analytics.

I invite the reader to extend and revise.