
Expert Q&A: Applying Critical Thinking When Evaluating Analytics Solutions (Part 1)

By Michelle Gardner | July 24, 2018

If you’ve ever tried to research analytics solutions, then you’ve probably gotten lost in a dizzying array of awards, user reviews, analyst reports, and ratings. So how do you tell which of these are real and unbiased—and which are just advertisements in disguise? To find out, we recently hosted a webinar featuring Jen Underwood, Founder of Impact Analytix, on how to apply critical thinking when evaluating Business Intelligence (BI) solutions.

>> Related: 2018 BI Buyer’s Guide <<

First, what is critical thinking and why is it important when evaluating a new embedded analytics solution?

Much of our thinking, if left unquestioned, can be biased or distorted from our own experiences. We may only have partial information, or we may have emotional connections that we don’t even recognize.

Critical thinking, on the other hand, allows us to:

  • Think independently
  • Understand logical connections between ideas
  • Identify and evaluate arguments
  • Detect inconsistencies and common mistakes in reasoning
  • Solve problems systematically
  • Reflect on the justification of one’s own beliefs and values

Often when I look at industry analysis, I find the most valuable nuggets in the challenges or negative opinions in a report (if there are any). The happy, fluffy comments from customers aren’t always real. It’s the negative ones, if they’ve made it past the filter, that will give you information you won’t find anywhere else.

What is important to remember when looking at analyst rankings or expert reviews?

Different sources of information will have different perspectives. The exact same embedded analytics solution can be reviewed by different experts with very different results. Once you start to look at the motives and methodologies behind the research, that’s when you’ll uncover the truth in the information. For instance, sometimes reviewers are testing solutions hands-on, and sometimes they’re not.

What’s the most important bias to look out for?

Probably the most influential bias to watch for is “following the money.” Many things in this world are biased depending on where the money is coming from. Question whether there are any vested interests behind these different sources of information. Look at clarity, accuracy, depth, and breadth. There are many different levels of solution analysis that you’ll see.

What questions can help determine the accuracy of a given report?

  • What is the author’s viewpoint and purpose? What key questions or issues are raised?
  • What information or data is used? How was it gathered? Does it accurately reflect the larger population?
  • What conclusions are made, and are they justified? What assumptions are made, and should they be questioned? Do the claims address the topic’s depth or complexity?

Beware of commonly used “chart tricks” that may skew how a report looks. For instance, grids are really easy for buyers to understand, but aren’t always a good scoring format. Also look out for “apples to oranges” comparisons, changes in criteria or axis scale placement, and inexplicable omissions. If a really high-profile report on analytics providers is missing Google or Amazon Web Services, for instance, that omission can speak volumes about the report’s reliability.

Stay tuned for the second part of our interview with Jen Underwood, and get more tips in the full on-demand webinar.


About the Author

Michelle Gardner is the Content Marketing Manager at Logi Analytics. She has over a decade of experience writing and editing content, with a specialty in software and technology.
