I recently spoke with David Doney, vice president of internal audit at SIRVA, Inc., about why analyzing data and presenting it visually are important competencies as the economy digitizes.
David will be speaking in our free webinar, Best Practices in Data Visualization, on May 28 at 11:00 a.m. CDT.
APQC: What do organizations need to make better, data-driven decisions?
David: Organizations should have a data strategy, because how a company manages information is increasingly a competitive differentiator. Data should be viewed as an asset to be optimized. Key elements might include: 1) Obtaining executive support for major analytics initiatives; 2) Implementing a data analysis framework or process; 3) Sharing information with vendors and customers to simplify doing business; 4) Building analytical capabilities, from a central analytics group outward; and 5) Moving key data used to run the business into data warehouses, both for cost-effective retrieval and to establish “one version of the truth.”
APQC: In your experience, what are the most important things to keep in mind when visualizing data?
David: It’s about the audience. What message are you trying to send with the information display (i.e., the table or chart)? What alternatives are the users considering, and how does the display help them make that decision? The data contains a story, and it’s up to the analyst to understand that story well enough to communicate it for the user’s intended purposes. While the analyst may try dozens of displays to understand the messages in the data, they must then filter out the noise and deliver only the signal to the decision maker.
APQC: What techniques do you find make data presentations most effective?
David: Each information display should have a key takeaway for the user, which can be neatly summarized with a short caption. This helps ensure the message and display are tightly linked. If the message or linkage isn’t clear, you may not have the right display. Further, sequence the displays to follow a storyline. For example, if you display a comparison (e.g., a trend of rising revenues), the user will logically ask for the cause (e.g., increasing volume or price per unit), which would guide the displays that follow.
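The comparison-then-cause sequence David describes can be sketched numerically. The snippet below decomposes a year-over-year revenue change into a volume effect and a price effect; all figures and variable names are hypothetical, chosen only to illustrate the drill-down.

```python
# Hypothetical data: a rising revenue trend (the comparison) broken down
# into its drivers, volume and price per unit (the cause).
years = [2017, 2018, 2019]
units_sold = [1000, 1100, 1150]       # volume
price_per_unit = [50.0, 52.0, 56.0]   # price

revenue = [u * p for u, p in zip(units_sold, price_per_unit)]

# Exact decomposition: delta_revenue = volume_effect + price_effect,
# where volume is valued at the prior year's price and the price change
# is valued at the current year's volume.
for i in range(1, len(years)):
    delta_revenue = revenue[i] - revenue[i - 1]
    volume_effect = (units_sold[i] - units_sold[i - 1]) * price_per_unit[i - 1]
    price_effect = (price_per_unit[i] - price_per_unit[i - 1]) * units_sold[i]
    print(f"{years[i]}: revenue {delta_revenue:+,.0f} "
          f"(volume {volume_effect:+,.0f}, price {price_effect:+,.0f})")
```

Each printed line is the kind of key takeaway that could caption the follow-up display: the same total change, attributed to its causes.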
APQC: With the growing access to large amounts of data, how can data analysts ensure they are using the right data analysis framework?
David: A framework can be defined as a problem-solving method, which we test against reality and modify as needed. A generic data analysis framework includes eight phases: 1) Requirements; 2) Planning; 3) Data collection; 4) Data processing; 5) Data cleaning; 6) Exploratory data analysis; 7) Modeling; and 8) Reporting. These phases are iterative. Both the data science and intelligence fields have useful frameworks that can be tailored to build your own.
Keeping user requirements in mind helps ensure you capture the data users actually need. Further, mechanisms to ensure awareness of new data sources (e.g., cell phone location information shared with businesses) may help the analyst guide user expectations as well.
By applying formal project discipline to data analysis efforts and updating the framework accordingly, the organization can readily apply lessons learned from one project to the next.
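The eight phases above can be sketched as an iterative pipeline. The phase functions and the stopping rule below are placeholders (assumptions for illustration), not a prescribed implementation; the point is that the phases run in order and the whole cycle repeats as requirements evolve.

```python
# The eight phases of the generic data analysis framework, run as a loop.
PHASES = [
    "requirements",
    "planning",
    "data collection",
    "data processing",
    "data cleaning",
    "exploratory data analysis",
    "modeling",
    "reporting",
]

def run_phase(name, state):
    """Placeholder: a real phase would transform the project state
    (gather requirements, clean data, fit a model, etc.)."""
    state.setdefault("log", []).append(name)
    return state

def analysis_project(iterations=1):
    """Run all phases in order; 'iterations' stands in for a real
    stopping rule such as 'until user requirements are satisfied'."""
    state = {}
    for _ in range(iterations):
        for phase in PHASES:
            state = run_phase(phase, state)
    return state

state = analysis_project(iterations=1)
```

Treating each pass as a small project with a logged history is one way to apply the formal project discipline described above, so lessons from one iteration carry into the next.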
Join us for our May webinar to hear David discuss:
- how to identify and communicate the key message types contained in a dataset,
- best practices for using bar, pie, histogram, and scatterplot displays,
- analytical techniques such as indexing and normalization, and
- a framework for conducting a data analysis project.