Characteristics of a Good Visual Analytics and Data Discovery Tool

Visual Analytics and Data Discovery allow the analysis of big data sets to find insights and valuable information. This goes far beyond classical Business Intelligence (BI). See this article for more details and motivation: “Using Visual Analytics to Make Better Decisions: the Death Pill Example”. Let’s take a look at the characteristics that matter when choosing the right tool for your use cases.

Visual Analytics Tool Comparison and Evaluation

Several tools for Visual Analytics and Data Discovery are available on the market. Three of the best-known options are Tableau, Qlik and TIBCO Spotfire. Use the following list to compare and evaluate the different tools and make the right decision for your project:

  • Ease of use and an intuitive user interface that lets business users create interactive visualizations
  • Various visualization components such as bar charts, pie charts, histograms, scatter plots, treemaps, trellis charts, and many more
  • Connectivity to various data sources (e.g. Oracle, NoSQL, Hadoop, SAP Hana, Cloud Services)
  • True ad-hoc data discovery: real interactive analysis via drag-and-drop interactions (e.g. restructuring tables or linking different data sets) instead of “just” visualizing data sets via drill-down / roll-up in tables
  • Support for data loading and analysis with alternative approaches: in-memory (e.g. RDBMS, spreadsheets), in-database (e.g. Hadoop) or on-demand (e.g. event data streams)
  • In-line and ad-hoc data wrangling functionality to put data into the shape and quality that is needed for further analysis
  • Geoanalytics using geo-location features to enable location-based analysis beyond simple layer map visualizations, e.g. spatial search, location-based clustering, and distance and route calculation (see the distance-calculation sketch after this list)
  • Out-of-the-box functionality for “simple” analytics without coding (e.g. forecasting, clustering, classification)
  • Out-of-the-box capabilities to realize advanced analytics use cases without additional tools (e.g. an embedded R engine and corresponding tooling)
  • Support for integrating additional advanced analytics and machine learning frameworks (such as R, Python, Apache Spark, H2O.ai, KNIME, SAS or MATLAB); see the clustering sketch after this list
  • Extensibility and enhancement with custom components and features
  • Collaboration between business users, analysts and data scientists within the same tool without additional third-party tools (e.g. ability to work together in a team, share analysis with others, add comments and discussions)
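
To make the geoanalytics item more concrete: distance-based features such as spatial search ultimately boil down to great-circle math. The following is a minimal, illustrative Python sketch (not tied to any specific tool) that uses the haversine formula to keep only the records within a given radius of a reference location; the names, coordinates and the 50 km threshold are made up for the example.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # 6371 km = mean Earth radius

# Illustrative spatial search: keep only customers within 50 km of a store.
# All names and coordinates below are invented for the example.
store_lat, store_lon = 48.1372, 11.5756
customers = [("A", 48.7758, 9.1829), ("B", 48.2082, 11.6680), ("C", 50.1109, 8.6821)]
nearby = [c for c in customers if haversine_km(store_lat, store_lon, c[1], c[2]) <= 50]
print(nearby)  # only the customers inside the 50 km radius
```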
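
To illustrate the difference between “simple” analytics without coding and integrating an external framework: conceptually, an embedded R or Python data function receives the current data table, runs a model, and hands the result back as new columns. The sketch below is a hypothetical Python example using pandas and scikit-learn k-means for customer segmentation; the column names and sample values are invented, and the actual integration API differs from tool to tool.

```python
import pandas as pd
from sklearn.cluster import KMeans

# Hypothetical input: the analytics tool hands the current data table to the
# script as a DataFrame. Column names and values are invented for this sketch.
df = pd.DataFrame({
    "annual_spend":    [120, 150, 900, 950, 5000, 5200],
    "orders_per_year": [2,   3,   12,  10,  40,   38],
})

# Segment the customers into three clusters and return the labels
# as a new column the tool can visualize (e.g. color by segment).
model = KMeans(n_clusters=3, n_init=10, random_state=42)
df["segment"] = model.fit_predict(df[["annual_spend", "orders_per_year"]])
print(df)
```

In real deployments, this kind of logic is usually wired in through the tool’s own integration point, for example Spotfire data functions or Tableau analytics extensions, rather than a standalone script; the exact mechanism varies by product.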

Take a look at the visual analytics tools available on the market with the above list in mind and select the right one for your use cases. Also keep in mind that you usually want to put the insights into action afterwards, e.g. for fraud detection, cross-selling or predictive maintenance. Therefore, think about “How to Apply Insights and Analytic Models to Real Time Processing” when you start your data discovery journey.
