Quality during Design
Quality during Design is the podcast for engineers and product developers navigating the messy front end of product development. Each episode gives you practical quality and reliability tools you can use during the design phase — so your team catches problems early, avoids costly rework, and ships products people can depend on.
You'll hear solo episodes on early-stage clarity, risk-based decision-making, and quality thinking, along with conversations with cross-functional experts in the series A Chat with Cross-Functional Experts.
If you want to design products people love, in less time, at lower cost, and with a whole lot fewer headaches — this is your place.
Hosted by Dianna Deeney, consultant, coach, and author of Pierce the Design Fog. Subscribe on Substack for monthly guides, templates, and Q&A.
Data Visualization Tips to Improve Analysis Skills
Can visualizing your data be the game-changer you've been missing?
Discover why plotting isn't just a step in data analysis, but a crucial practice that can reveal uniformity, natural variations, and even potential flaws in your test methods. Learn about the importance of recognizing multiple failure modes and how to avoid common pitfalls such as mishandling outliers and making incorrect assumptions. This episode is packed with actionable advice to enhance your decision-making process.
Visit the podcast blog for extra links.
If your team is still catching problems too late — let's talk.
→ Schedule a free discovery call: Dianna's calendar
Want insights like this?
→ Subscribe to my newsletter: qualityduringdesign.substack.com
Get the full framework.
→ Pierce the Design Fog
ABOUT DIANNA
Dianna Deeney is a quality advocate for product development with over 25 years of experience in manufacturing. She is president of Deeney Enterprises, LLC, which helps organizations and people improve engineering design.
Always Plot Your Data for Analysis
Speaker 1: Hi there, I'm Dianna Deeney with Quality During Design. Right now, as this podcast episode releases, I'm at a conference: the ASQ Reliability and Risk Division's Reliability, Maintenance, and Managing Risk Conference in Pittsburgh, Pennsylvania. I'm looking forward to presenting on both days of the conference, to meeting people and seeing their presentations, and to bringing back some of what I learn to you in future podcast episodes and especially in the email newsletter. This isn't the first time I've attended or presented at this conference. I attended a couple of years ago and was inspired to do a podcast episode about plotting data. Since I'm at the same conference again, I wanted to revisit that episode, so I'm sharing it from the archive. "Always Plot the Data" was the original title. I got some feedback after it released that people thought it was a nice one, so enjoy, and I'll talk to you in the next podcast when I return from the conference.

We're getting test data back from the lab and the numbers are looking pretty good. Our test results are within our requirement limits, so let's write it up and call it good. But hold on, let's plot it out first. Let's talk about plots, why they're important, and what we can do with them, after this brief introduction.

Hello and welcome to Quality During Design, the place to use quality thinking to create products others love, for less. My name is Dianna. I'm a senior-level quality professional and engineer with over 20 years of experience in manufacturing and design. Listen in, and then join the conversation at qualityduringdesign.com.
Speaker 1: I attended a conference last week for reliability engineers. Well, it was hosted by the ASQ Reliability and Risk Division; it was the Reliability, Maintenance, and Managing Risk Conference. While at the conference, I met some very interesting, very friendly people, and I sat in on presentations of useful case studies and interesting ideas about reliability. One of the presenters was Dr. Wayne Nelson. He is an expert on reliability and statistical methods, has won several awards, and has published books and papers on statistical methods. He was a highly respected contributor to the conference. He gave a couple of presentations that I sat in on, and they had a particular theme, which he told us about, too. His theme was: always plot your data.
Speaker 1: As a reliability and quality practitioner, I think my go-to is to plot the data, but I never really thought of it as something I would mention that I do. I just kind of do it. I thought it was a good reminder for the reliability engineers at the conference, but it's also something good to talk about with design engineers. So let's talk about why we want to always plot our data. It doesn't matter if our data is discrete or continuous, or if it's counts or measures. We can still plot it, and it helps us understand what we're looking at.
Speaker 1: The first thing I look at in a plot is how uniform our results are. If they're not uniform, that gives me some clue as to what's happening behind the data. Is there natural variation in our product? It could come from the materials themselves, from the way the product is made, or even from the way it's assembled. It could also be a stack-up of design tolerances, where everything is made within spec but the design allows for variation in the end product. Or is it our test method introducing issues? Could it be that the way we hold the part isn't ideal? Or maybe the way we're holding or positioning it during the test isn't really stressing the area we mean to test, but instead is putting stress on a different part and skewing our results. And does it look like we're dealing with multiple failure modes? Are they competing failure modes? We talk about competing failure modes, what they are, what they look like, and how to deal with them in a previous episode of the Quality During Design podcast. I'll link to it in the show notes.
Speaker 1: One type of plot may not be enough. I've found that plotting the data once sort of starts a breadcrumb trail. One plot will show me the data and may highlight something interesting. Then we follow the breadcrumb and start digging a little deeper. We may add more inputs into the database and generate another plot that helps us investigate what we're looking at. Now, things to watch out for in our plots: these are some common gotchas.
Speaker 1: One thing to watch out for is outliers. I know they're pesky and don't make for a pretty plot, but we don't delete or eliminate them. Rather, an outlier is a source of something interesting or a telltale sign of something wrong. It could be another failure mode, maybe a new one, so we're going to check the test results or re-examine our samples. It could be those test method issues we talked about, or a special cause of failure in a manufacturing method. Maybe it's not the natural variation of our process, but something that happened during the production of the parts we tested. What was it, and do we need to prevent it from happening again in the future?
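One concrete way to flag outliers for investigation, rather than deleting them, is the interquartile-range (IQR) rule. The episode doesn't prescribe a particular rule, so this is just a minimal pure-Python sketch; the sample readings, the quartile convention, and the 1.5 multiplier are illustrative defaults, not anything from the show.

```python
# Flag, don't delete: mark points outside Q1 - 1.5*IQR .. Q3 + 1.5*IQR
# for follow-up (re-check the test setup, re-examine the sample, etc.).

def quartiles(values):
    """Estimate Q1 and Q3 by linear interpolation (one of several conventions)."""
    s = sorted(values)

    def q(p):
        idx = p * (len(s) - 1)
        lo = int(idx)
        hi = min(lo + 1, len(s) - 1)
        return s[lo] + (idx - lo) * (s[hi] - s[lo])

    return q(0.25), q(0.75)

def flag_outliers(values, k=1.5):
    """Return the points that fall outside the IQR fences, for investigation."""
    q1, q3 = quartiles(values)
    iqr = q3 - q1
    low, high = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < low or v > high]

# Hypothetical test readings with one suspicious value:
data = [5.1, 5.3, 5.0, 5.2, 5.4, 5.1, 9.8, 5.2]
print(flag_outliers(data))  # → [9.8]  (flagged for investigation, not removed)
```

The point of returning the flagged values, rather than a cleaned dataset, is that each one becomes a question to answer: new failure mode, fixture problem, or special cause in production?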
Speaker 1: Another gotcha with plotting is understanding the nature of your data before you start plotting it. Sometimes the data will inform how we plot it, so that's one thing to consider. And some plots, like probability plots, assume that your data meets certain criteria. You may have to plot or test your data against those criteria before you can trust the results of the plot and start making decisions with it. Lots of other plots, like run plots and scatter plots, don't require this; we just need to be aware of which plots make assumptions through the equations used to generate them. Something I learned from a coworker, and honestly through some software training, is to go through the preferences and assumptions of whatever software you're using to generate the plot. Go through each window and look at the inputs and the assumptions you're making when generating the plot. That will give you some indication of whether there's something else you need to account for.
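One example of an assumption baked into a probability plot is the plotting-position formula used to place the ordered data points. Benard's approximation to median ranks is one common convention in reliability work; this pure-Python sketch shows that calculation only, and isn't a reference to any particular software's method.

```python
# A probability plot places the i-th of n ordered observations at an estimated
# cumulative probability. Benard's approximation to median ranks,
# (i - 0.3) / (n + 0.4), is one common convention in reliability work.

def median_ranks(n):
    """Approximate median-rank plotting positions for n ordered failure times."""
    return [(i - 0.3) / (n + 0.4) for i in range(1, n + 1)]

# For 5 ordered failure times, the middle point plots near the 50th percentile:
print([round(r, 3) for r in median_ranks(5)])  # → [0.13, 0.315, 0.5, 0.685, 0.87]
```

Different software may default to a different plotting-position formula, which is exactly why it pays to click through the preferences and see which one is being used before trusting the fitted line.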
Speaker 1: Another gotcha with plots, and this is something Dr. Nelson pointed out: he worked with an engineer who was comparing four different things, but the plots he was comparing didn't have the same axes. As engineers, we know to be careful to do calculations in the same units of measure, and it's the same thing with plots. When we create multiple plots and compare them, we want to make sure that we're using the same units of measure, that the zero location is the same on each plot, and that we're using the same limits and range on each axis. We want to compare like with like. Otherwise it can skew our view of the data and lead us to wrong decisions.
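One way to enforce "compare like with like" is to compute a single axis range from all of the datasets and apply it to every plot. A minimal sketch, where the helper name, the padding, and the sample values are illustrative rather than from the episode:

```python
# Compute one shared axis range covering every dataset, optionally anchored
# at zero, so side-by-side plots use identical limits and compare fairly.

def shared_axis_range(datasets, include_zero=True, pad=0.05):
    """Return (low, high) limits that cover all datasets, with a little padding."""
    low = min(min(d) for d in datasets)
    high = max(max(d) for d in datasets)
    if include_zero:  # keep the zero location consistent across plots
        low, high = min(low, 0.0), max(high, 0.0)
    span = high - low
    return (low - pad * span, high + pad * span)

# Hypothetical results from two test groups on quite different scales:
group_a = [3.1, 4.2, 3.8]
group_b = [12.5, 11.9, 13.0]
print(shared_axis_range([group_a, group_b]))  # one (low, high) pair for both plots
```

Feeding the same pair of limits to every chart, whatever plotting tool you use, prevents the axes themselves from exaggerating or hiding differences between the groups.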
Speaker 1: There are two common plots that reliability engineers look at: the probability density function and the cumulative distribution function. We get into these two plots more in another episode of Quality During Design, and that one's a video episode, so you can see a picture of what I'm talking about. I'll link to it in the show notes. Also, there are lots of different kinds of plots, and you may not be familiar with all of them. I'm familiar with a lot of plots and I'm sure I don't have all of them covered, but just try one that you think will help you.
Speaker 1: Plots are a tool to help us decipher important information from data so we can make decisions. So if the plot we choose first doesn't help, try it a different way; try a different plot. So what's today's insight to action? When you get data, go ahead and preview it to see how things might be looking, and then plot it out. You'll really get to see how things are looking and will be able to make better decisions from your data. If you like the content in this episode, visit qualityduringdesign.com, where you can subscribe to the weekly newsletter to keep in touch. This has been a production of Deeney Enterprises. Thanks for listening.
Podcasts we love
Check out these other fine podcasts recommended by us, not an algorithm.
Speaking Of Reliability: Friends Discussing Reliability Engineering Topics | Warranty | Plant Maintenance
Reliability.FM: Accendo Reliability, focused on improving your reliability program and career
Reliability Hero
MAINSTREAM Community
Manufacturers Make Strides
Martin Griffiths
The Manufacturing Executive
Joe Sullivan
The Antifragility Reframe
Dr. Frank L. Douglas
The SAFE Leader with Mark McBride-Wright
Mark McBride-Wright
Coaching for Leaders
Dave Stachowiak