A Framework for the Evaluation of Data Analyses and Visualization Tools

2017 SIAM Conference on Computational Science and Engineering

Abstract. Data analysis and visualization (DAV) tools play an essential role in supporting knowledge discovery in scientific fields. DAV tools enable analyzing data and building visualizations to foster a better cognitive understanding of data, supporting the visual-information-seeking mantra of “Overview first, zoom and filter, then details-on-demand.” DAV systems allow users to interact with data through mouse clicks, menus, functions, and screen touches, generating visual structures that reveal patterns, trends, and outliers; users can click on areas of interest and expand into greater detail. In spite of the wide development of DAV tools, the use of visualization systems by people other than the tool designers appears quite limited due to usability violations. The lack of a standard set of usability rules for understanding this problem has significant effects on the adoption and trustworthiness of results. A user-centered design (UCD) approach, built on well-defined rules for evaluating the usability of DAV tools, therefore provides an impartial way to gauge usability and thereby the effectiveness of DAV tools, and this is where this study seeks to make a contribution. The contribution of this study is threefold: (i) extensions to Nielsen’s rules to fit the domain of DAV tools; (ii) a framework for inspecting DAV tools; and (iii) a comprehensive evaluation of the usability of selected DAV tools. Using the extended version of Nielsen’s metrics, we evaluated four tools that address different scientific needs to varying degrees and touch on three broad categories: general, specialized, and cross-cutting. The first is Jupyter, a general computational environment for executing code and retrieving visual and analysis results from remote systems. The second is UV-CDAT, a framework developed for analyzing and visualizing climate data. The last two, VisIt and ParaView, provide algorithms optimized for analyzing, querying, and visualizing scientific data. To keep the problem domain constrained, we evaluate how data analysis and visualization tasks are met by each of these tools. In summary, the proposed evaluation metrics provide a mechanism to evaluate DAV tools for usability, thereby quantifying effectiveness and leading to higher adoption, a lower barrier to entry, and greater insight.

Authors