

Big Data & the U.S. National Institute of Standards & Technology

By Charles Caldwell | March 27, 2014

So it looks like the Big Data hype has reached NIST, and they are setting out to tame it by delivering a framework that brings rigor to data science.

Frameworks are great.  They help me solve problems without first having to figure out all the aspects I should consider.  I use a checklist to pack for trips (don’t forget socks or my razor or headphones for the plane), and a standard grocery list (makes it easy to see what is missing from the fridge). MBAs and MSIT degrees are all about frameworks.  Porter’s Five Forces, SWOT, ITIL… a framework for everything.

That said, I wonder what NIST is going to tackle in this framework.  “Data Science” has had frameworks in place for a long time now.  Frameworks like CRISP-DM or SEMMA have guided the stats-heavy analysts of the past.  Why is a new framework needed?  Maybe it’s to account for new architectural issues and technology complexities, and introduce best practices for dealing with those.  And while that aspect is important and needed, I’m not sure that is the pain point that most businesses are encountering in their implementation of big data.

The organizations that have been most successful with big data aren’t doing anything new.  In fact, they do the same thing that all organizations successful with data-driven initiatives have always done: they understand how data-driven analytics are used to solve real-world problems that have some value associated with them.  Sure, they get good at managing the technologies involved.  They come to understand the unique people and process issues.  But the factor that so many miss, and that frankly is often obscured behind adherence to a framework, is the identification of potential business value and an understanding of how the analytics might produce it.

All technologies serve to help us “elevate” a constraint in a system (to borrow the language of the Theory of Constraints).  Every system has a limiting step, and the throughput of that system will never be more than what the constraint can handle.  Technology allows us to increase system throughput by dealing with that constraint in some way.  Analytics, in particular, contributes to this process in one of three ways (a short sketch after the list makes the first two concrete):

  1. Identify where the constraint actually exists, which is particularly useful in complex systems like supply chains.
  2. Monitor the overall system to ensure that the constraint point is as fully utilized as possible and the rest of the system is synchronized to it.
  3. “Elevate” the constraint by delivering the right information at the right time.  Increasingly in the “knowledge economy”, entire business models are reworked by doing this.  (Think of Amazon driving Borders out of business.)
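To make that concrete, here is a minimal sketch in Python of the first two points.  The fulfillment steps and their hourly capacities are entirely hypothetical; the point is just the arithmetic: a serial system’s throughput is the capacity of its slowest step, and capacity anywhere else is slack until the constraint is elevated.

  # Hypothetical four-step fulfillment process; each value is the
  # step's capacity in units per hour (made-up numbers for illustration).
  steps = {
      "receive_order": 120,
      "pick_and_pack": 45,
      "quality_check": 80,
      "ship": 100,
  }

  # (1) Identify the constraint: the step with the lowest capacity.
  constraint = min(steps, key=steps.get)
  throughput = steps[constraint]
  print(f"Constraint: {constraint} ({throughput} units/hr)")

  # The whole system can never move faster than the constraint,
  # no matter how much capacity the other steps have.
  print(f"System throughput: {throughput} units/hr")

  # (2) Monitor: the excess capacity each other step carries is
  # unusable until the constraint itself is elevated.
  for step, capacity in steps.items():
      print(f"{step}: {capacity - throughput} units/hr of slack")

The third point doesn’t reduce to arithmetic like this; elevating the constraint is a business-model change, which is exactly why it starts with understanding the business rather than the technology.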

Big Data hype is the same hype we had with BI in the ’90s, scorecards at the turn of the millennium, and advanced analytics in the last decade.  The hope is that the magical technology, plus some set of best practices and highly skilled resources, will comb through your business data and yield gold from base metal.  Sure, there is always a cool story about a totally unexpected insight that yields some benefit.  But the folks who excel at this stuff always start with a strong grasp of their business, a solid working theory of where the limiting factors are, and a sense of how data and analytics, no matter how big or advanced, can help them identify the constraint, optimize and monitor it, or elevate it to change the game.

It will be cool to see the framework from NIST in 2016.  I’m sure it will have a lot of value to add in dealing with the complex landscape “Big Data” presents.  If you can’t wait, grab a good operations person, map out your business model, and start with a theory of where your constraints might be.  That is the rigor you need as the basis for all your analytics and data efforts.  Big or small, predictive or descriptive.  Even once the NIST framework is published.

 

About the Author

Charles Caldwell is the Senior Director, Global Solutions Engineering at Logi Analytics. Charles came to Logi Analytics with a decade of experience in data warehousing and business intelligence (BI). He has built data warehouses and reporting systems for Fortune 500 organizations, and has developed high-quality technical teams in the BI space throughout his career. He completed his MBA at George Washington University with a focus on the decision sciences and has spoken at industry conferences on topics including advanced analytics and agile BI.
