Apr 2, 2024

In part 3 of this series, John Dues and host Andrew Stotz talk about the final 5 lessons for data analysis in education. Dive into this discussion to learn more about why data analysis is essential and how to do it right.

TRANSCRIPT

0:00:02.4 Andrew Stotz: My name is Andrew Stotz and I'll be your host as we continue our journey into the teachings of Dr. W. Edwards Deming. Today I'm continuing my discussion with John Dues who is part of the new generation of educators striving to apply Dr. Deming's principles to unleash student joy in learning. This is episode 23 and we're talking about goal setting through a Deming lens. John, take it away. 

0:00:30.8 John Dues: It's good to be back, Andrew. Yeah, in the first episode of this four-part series, we talked about why goal setting is often an act of desperation. And if you remember, early on I proposed four conditions that organizations should understand about their systems prior to ever setting a goal. Those four were capability, variation, stability, and then by what method are you going to improve your system. And then in the last episode, I introduced the first five of the 10 key lessons for data analysis. Remember, these lessons were set up to avoid what I call arbitrary and capricious education goals, which are basically unreasonable goals set without consideration of those four things: the system's capability, variation, and stability, and whether there's a method for improvement. So, it might be helpful just to recap those first five lessons. I'll just list them out, and folks who want to hear the details can listen to the last episode.

0:01:31.8 JD: But lesson one was: data have no meaning apart from their context. So, we've got to contextualize the data. Lesson two was: we don't manage or control the data; the data is the voice of the process. The data over time shows us what's happening, and we don't really have control over that data. We do have control over the underlying process. Lesson three was: plot the dots for any data that occurs in time order. So, take it out of a two-point comparison, take it out of a spreadsheet, and put it on a line chart that shows the data over time. Lesson four was: two or three data points are not a trend. So again, get beyond the typical limited two-point comparison: this month and last month, this year and last year, the same month last year, this week and last week.

0:02:25.6 JD: And then lesson five was: show enough data in your baseline to illustrate the previous level of variation. We want to get a sense of how the data is changing over time, and we need a baseline amount of data, whether that's 12 points, 15 points, or 20 points; there are different takes on that. But somewhere in that 12-to-20-point range is really the amount of data we want to have in our baseline, so we understand how it's moving up and down over time naturally. At the outset of those two episodes, we also talked about putting process behavior charts, like the ones we've viewed in many of our episodes, at the center of this methodology. We put them at the center because the chart is a great tool for looking at data over time, just like we've been talking about.
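
To make that concrete for readers following along, here is a minimal sketch of how the central line and natural process limits of a process behavior chart (an XmR chart) are computed from a baseline, using Wheeler's standard 2.66 scaling factor on the average moving range. The weekly attendance figures are invented for illustration.

```python
# Minimal XmR (process behavior) chart limits from a 12-point baseline.
# The weekly attendance rates below are invented for illustration.
data = [94.2, 93.1, 95.0, 92.8, 94.5, 93.7,
        92.1, 94.9, 93.3, 94.0, 92.6, 93.8]

center = sum(data) / len(data)                    # central line (mean)
moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)  # average moving range

upper = center + 2.66 * mr_bar  # upper natural process limit
lower = center - 2.66 * mr_bar  # lower natural process limit
print(f"center={center:.1f}, limits=({lower:.1f}, {upper:.1f})")
# -> center=93.7, limits=(89.6, 97.8)
```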

0:03:11.4 JD: And I think when you use this methodology, and when you start to fully grasp it, you start to be able to understand the messages that are actually contained in the data. You can differentiate between actual special events, those special causes, and the everyday ups and downs, what we've called common causes. And in so doing, we can understand the difference between reacting to noise and recognizing actual signals in that data. So, I think that's a good primer to then get into lessons six through 10.

0:03:51.2 AS: Can't wait.

0:03:53.3 JD: Cool. We'll jump in then.

0:03:56.1 AS: Yeah. I'm just thinking about my goal setting and how much this helps me think about how to improve it. And I think one of the biggest pieces that's missing, that we talked about before, is by what method. Many people think that they're setting strategy when, in fact, they're just setting stretch targets with nothing under them. And they achieve them by luck, or are baffled why they don't achieve them, and then they lash out at their employees.

0:04:31.4 JD: Yeah, I mean, that goes back to one of those four conditions of goal setting: capability. You have to understand how capable your system is before you set a goal. It's fine to set a stretch goal, but it has to be within the bounds of the system. Otherwise, it's maybe not an impossibility, but a mathematical improbability. That's not good. Like you're saying, it's not a good way to operate if you're a worker in that system. So, lesson six then, to continue the lessons.

0:05:06.8 JD: So, lesson six is "the goal of data analysis in schools is not just to look at past results, but also, and perhaps more importantly, to look forward and predict what is likely to occur in the future," right? That's why putting process behavior charts at the center is so important: they allow you to interpret data in a way that takes variation into account, to classify the variation as routine (common cause) or exceptional (special cause), and to turn our focus to the behavior of the underlying system that produced the results. And it's this focus on the system and its processes that's then the basis for working toward continual improvement.
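
One concrete way to read lesson six: the limits computed from the baseline are a prediction of where future results will fall, so long as the system is unchanged. A small sketch, with the limits carried over from the baseline example above and the new weekly values invented for illustration:

```python
# Classify new points against limits predicted from the baseline.
# A point beyond a limit signals a special cause; everything inside
# the limits is routine (common cause) variation.
def classify(points, lower, upper):
    return [
        "special cause" if (x < lower or x > upper) else "common cause"
        for x in points
    ]

new_weeks = [93.5, 91.2, 98.5]  # invented future observations
print(classify(new_weeks, lower=89.6, upper=97.8))
# -> ['common cause', 'common cause', 'special cause']
```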

0:06:00.6 AS: And I was just thinking about number six, the goal is to predict what is likely to occur in the future. And what's likely to occur in the future is exactly what's happening now, or the trend that's happening, unless we change something in the system, I guess.

0:06:16.4 JD: Yeah. And that's why just setting the stretch goal is often disconnected from any type of reality, because we have this idea that somehow something magical is going to happen in the future that didn't happen in the past. And nothing magical is going to happen unless we are intentional about doing something differently to bring about that change.

0:06:39.5 AS: And that's a great lesson for the listeners and the viewers. It's like, have you been just setting stretch targets and pushing people to achieve them? And not really understanding that your role is to recognize that you're going to get the same result unless you start to look at how to improve the method, the system, that type of thing.

0:07:05.0 JD: Yeah. And usually when you have those stretch goals, you've looked at what happened last year, and you base the stretch goal on that. But perhaps, for the last three or four years, the data has been steadily decreasing, right? And you can't see that if you haven't charted the data over the last three or four years, hopefully beyond that. So, you have no idea. Or it could have been trending positively, and you may undershoot your stretch goal because you missed a trend that was already in motion because of something that happened in the past.

0:07:44.8 AS: You made a chart for me, a run chart of my intake for my Valuation Masterclass Bootcamp. We've been working on our marketing, and I presented the chart to the team, and we talked about the capability of our system. Based on that, for me to say I want 500 students when we've only been getting 50 is just ridiculous. And that helped us all to see that if we're going to get to the next level of where we want to be, we've got to change what we're doing: the method we're using to get there, the system we're running and operating. Or else we're going to continue to get this output. And so if the goal is to predict what is likely to occur in the future, and we don't make any changes, it's probably going to continue to be like it is in that control chart.

0:08:42.8 JD: Yeah. And that example is, in a nutshell, the System of Profound Knowledge in action in an organization. You're understanding variation in something that's important to you: enrollment in your course. You're doing that analysis with the team, so there's the psychological component. And you're asking, what's our theory of knowledge, what's our theory for how we're going to bring about some type of improvement? And so now you're going to run probably something like a PDSA. So you have all those lenses of the System of Profound Knowledge that you're bringing together to work on that problem. And that's really all it is, in a nutshell.

0:09:22.2 AS: Yeah. And the solution's not necessarily right there. Sometimes it is, but sometimes it's not, and we've got to iterate. Okay, should we be doing marketing in-house, or should we be using an outsourced service? What if we increase the volume of our marketing? What effect would that have? What if we change to this method or that method? Those are all things that we are in the process of testing. I think the hardest thing in business, in my opinion, is to test one thing at a time.

0:09:58.5 JD: Yeah.

0:09:58.7 AS: I just want to test everything.

0:10:00.4 JD: Yeah. Yeah. I read in Toyota Kata, which I think we've talked about here before, the book about Toyota's improvement process. I don't know if this is totally always true, but basically they focus on single-factor experiments for that reason. Even in a place as complex and as full of engineers as Toyota, they largely focus on single-factor experiments so they can actually tell what it is that brought about the change. I mean, I'm sure they do other, more complicated things; they'd have design of experiments and those types of things. But by and large, their improvement process, the Toyota Kata, is focused on single-factor experiments for that reason.

0:10:48.1 AS: And what's that movie, the sniper movie, where they say, slow is smooth and smooth is fast, or something like that? Like, slow down to speed up. I want to go fast and do all of these tests, but the fact is I'm not learning as much from that. And slowing down and doing single-factor experiments to try to think about how we influence the future is fascinating.

0:11:20.9 JD: Yeah, absolutely.

0:11:22.4 AS: All right. What about seven?

0:11:23.2 JD: Lesson seven. So, "the improvement approach depends on the stability of the system under study," and there are really two parts to this. What approach am I going to take if the system is producing predictable results and performing pretty consistently? It's stable; there's only common cause variation. And then what happens if you have an unpredictable system? So, two different approaches, depending on what type of system you're looking at in terms of stability. The one thing to recognize, in thinking about something like single-factor experiments, is that it's a waste of time to try to explain noise, to explain common cause variation, in a stable system, because there's no simple, single root cause for that type of variation. There are thousands or tens of thousands of variables impacting almost any metric, and you can't isolate that down to a single cause.

0:12:17.5 JD: So, we don't try to do that in a common cause system that needs improvement. Instead, if the results are unsatisfactory, we work on improvements and changes to the system, right? We don't try to identify a single factor that's the problem. What we do is work to improve a common cause process or system by working on the design of the actual system, including the inputs and throughputs that are a part of it. And to your point, based on your content knowledge of that area, or maybe by bringing in a subject matter expert, you start to think about what's going to make the biggest difference, and then you test those things one at a time, basically. That's the approach. And then if you're working in an unpredictable system, a system that's unpredictable because there are special causes in your data, then it's really a waste of time to try to improve that system until it's stable again. The way you do that is, at that point, there is something so different about the special cause data that you try to identify the single cause behind those one or two data points. When you've identified it, you study it, and then you try to remove that specific special cause. And if you've identified the right thing, it becomes a stable system at that point, right?
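
A rough sketch of that decision logic, using the simplest possible stability test (no point falls outside the natural process limits); real charts also apply run rules, so treat this as illustrative only:

```python
def is_stable(points, lower, upper):
    """Simplest stability check: no point falls outside the limits."""
    return all(lower <= x <= upper for x in points)

def improvement_roadmap(points, lower, upper):
    if is_stable(points, lower, upper):
        # Common cause system: don't hunt for a single root cause;
        # redesign the system itself and test changes one at a time.
        return "work on the design of the system (test one change at a time)"
    # Special causes present: find, study, and remove them first,
    # then return to systemic improvement once the system is stable.
    return "identify and remove special causes before improving the system"
```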

0:13:51.9 AS: I was thinking that there's no sense in trying to race your boat if you've got a hole in it. You've got to fix the special cause, the hole, and then focus on, okay, how do we improve the speed of this boat?

0:14:06.5 JD: And the key is recognizing the difference between these two roadmaps toward improvement. And I think in education, for sure, there's a lot of confusion and a lot of wasted effort, because there's really no knowledge of this approach to data analysis. And so people do their own things. There's a mismatch between the type of variation that's present and the type of improvement effort being undertaken. I think the most typical thing is there's a common cause system, and people think they can identify a single thing to improve. Then they spend a lot of time and money on that thing, and it doesn't get better over time, because it was the wrong approach in the first place.

0:14:55.9 AS: Number eight.

0:14:57.6 JD: Number eight. So, number eight is "more timely data is better for improvement purposes." We've talked about state testing data a lot. It's only available once per year, and results often come after students have gone on summer vacation, so it's not super helpful. We really want more frequent data so that we can understand whether some type of intervention that we're putting in place has an effect. I think the most important thing is that the frequency of the data collection needs to be in sync with the improvement context. It's not always that you need daily data or weekly data or monthly or quarterly data, whatever it is; it just has to be in sync with the type of improvement you're trying to bring about. And no matter what the frequency of collection, the other big thing to keep in mind is: don't overreact to any single data point, which, again, I see over and over in my work. Ultimately, the data allows us to understand the variation and the trends within our system, whether that system is stable or unstable, and then what type of improvement effort would be most effective. And, again, in my experience, just those simple things are almost never happening in schools. Probably in most sectors.

0:16:25.9 AS: Can you explain a little bit more about being in sync with the improvement context? Like, maybe you have an example of that so people can understand.

0:16:34.2 JD: Well, yeah. So, you mean the frequency of data collection?

0:16:39.0 AS: Yeah. This idea of, like, what would be out of sync?

0:16:44.7 JD: Well, a lot of times what happens is there might be a system in place for collecting some type of data. Let's say, like, attendance. They report student attendance on the annual school report card. So, you get that attendance rate, but that's like the state test scores: it's not that helpful to get it on the report card after the year has concluded. But the data is actually available to us in our student information system. And so, we could pull it at a different frequency and chart it ourselves, and not wait on the state report card...

0:17:27.5 AS: Because attendance is happening on a daily basis.

0:17:31.0 JD: Happening on a daily basis. So, daily would be pretty frequent, but if we did collect the data daily, and we certainly can do that, it could help us see patterns in the data on certain days of the week. That could be something that goes into our theory for why our attendance is lower than we'd want it to be. You could do it weekly if the daily collection is too onerous on whoever's being tasked with doing it. With weekly data, it would take you 12 weeks, but in 12 weeks you have a pretty good baseline of what attendance looks like across this particular school year. So, when you're talking about improvement efforts, I think something daily or something weekly is the target, so that you can actually try some interventions along the way. And...

0:18:29.3 AS: And get feedback.

0:18:31.1 JD: And get feedback. Yeah, yeah. And you could also peg it to something that's further out, and you could see over time whether those interventions that are impacting the more short-term data collection are actually impacting things on the longer term as well.
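
As a sketch of what pulling that data at a chosen frequency might look like: hypothetical daily attendance records rolled up into weekly rates, ready to plot. The record layout is invented, not any particular student information system's export format.

```python
from collections import defaultdict
from datetime import date

# Hypothetical daily records: (date, students_present, students_enrolled).
daily = [
    (date(2024, 1, 8), 451, 480), (date(2024, 1, 9), 447, 480),
    (date(2024, 1, 10), 455, 480), (date(2024, 1, 11), 442, 480),
    (date(2024, 1, 12), 449, 480), (date(2024, 1, 15), 460, 480),
]

weekly = defaultdict(lambda: [0, 0])
for day, present, enrolled in daily:
    year, week, _ = day.isocalendar()  # roll daily counts up by ISO week
    weekly[(year, week)][0] += present
    weekly[(year, week)][1] += enrolled

for key in sorted(weekly):
    present, enrolled = weekly[key]
    print(f"week {key}: {100 * present / enrolled:.1f}% attendance")
```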

0:18:49.1 AS: And I guess it also depends on what the priority is. Let's say that attendance is not a big issue at your particular school. Therefore, we look at it on a monthly basis, and we look to see if something significant is happening, but otherwise we've got to focus on another idea. And if attendance becomes an issue, we may go back to daily and ask, is it a particular day of the week? What can we learn from that data?

0:19:20.0 JD: Yep, that's exactly right. And then the next step would be, in lesson nine, and this is why the charts are so important, that you can clearly label the start date for an intervention directly on the chart. So, once you've chosen an intervention or a change idea, you clearly mark it in your process behavior chart. I just use a dashed vertical line on the date the intervention started, and I also put a simple label that captures the essence of that intervention right on the chart, so I can remember what I tried or started on that particular day. And then, because you're going to continue adding your data points, that allows the team to easily see the data that comes after the dashed line, and it becomes pretty apparent, based on the trends you're seeing, whether that intervention is working, right?
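
In a plotting library like matplotlib, that annotation is a couple of lines. A sketch; the weekly rates, the week-13 start date, and the "weekly calls home" label are all invented for illustration:

```python
import matplotlib.pyplot as plt

# Invented weekly attendance rates; an intervention starts at week 13.
rates = [93.2, 92.8, 93.5, 92.1, 93.0, 92.6, 93.4, 92.9, 93.1, 92.4,
         93.3, 92.7, 94.1, 94.5, 94.0, 94.8, 94.3, 95.0, 94.6, 94.9]
weeks = list(range(1, len(rates) + 1))

plt.plot(weeks, rates, marker="o")
plt.axhline(sum(rates[:12]) / 12, linestyle=":")   # baseline central line
plt.axvline(x=12.5, linestyle="--", color="gray")  # dashed intervention marker
plt.text(12.7, max(rates), "weekly calls home")    # simple label on the chart
plt.xlabel("Week")
plt.ylabel("Attendance rate (%)")
plt.show()
```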

0:20:21.2 JD: If it's attendance, I may try... Say I do a weekly call to parents to tell them what their individual child's attendance rate is. Then we can see, once we start making those weekly calls, whether that seems to be having an impact on attendance rates over the next few weeks. And then I can actually look, too, we've talked about the patterns in the data, for certain patterns that tell me whether there's a significant enough change to say, yeah, this is a signal that this thing is actually working. It's not just that it increased; the attendance rate could go up, but that in and of itself isn't enough. I want to see a signal. And by signal, I mean a specific pattern in the data, like a point outside the limits.

0:21:17.3 JD: I want to see eight points in a row, in the case of attendance, above the central line. Or I want to see three out of four that are closer to a limit, the upper limit, than they are to that central line. And again, we've talked about this before: those patterns are so mathematically improbable that I can be reasonably assured, if I see them, that an actual change has occurred in my data. And because I've drawn this dashed line, I can tie the time period of the change back within that dataset to determine whether something positive happened after I tried that intervention.
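
Those detection rules can be checked mechanically. A sketch implementing them as described here (a point beyond a limit, eight successive points on one side of the central line, and three out of four successive points closer to a limit than to the central line); the function names are ours, not from the episode:

```python
def beyond_limits(xs, lower, upper):
    # Rule 1: any point outside the natural process limits.
    return [i for i, x in enumerate(xs) if x < lower or x > upper]

def run_of_eight(xs, center):
    # Rule 2: eight successive points all on one side of the central line.
    return [
        i for i in range(7, len(xs))
        if all(x > center for x in xs[i - 7 : i + 1])
        or all(x < center for x in xs[i - 7 : i + 1])
    ]

def three_of_four_near_limit(xs, center, limit):
    # Rule 3: three of four successive points closer to a limit than to center.
    return [
        i for i in range(3, len(xs))
        if sum(abs(x - limit) < abs(x - center) for x in xs[i - 3 : i + 1]) >= 3
    ]
```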

0:21:56.7 AS: You just think about how many cycles of improvement and interventions you can do in a system, and how far along you will be a year later.

0:22:12.3 JD: Yes, yeah. And "cycles" is exactly the right word, because, I didn't mention it here, but at the point you draw that vertical line, when you're going to run an intervention, you're going to do that through the PDSA cycle, the Plan-Do-Study-Act cycle. That's your experiment, where you're testing one thing to see what impact it has on the data. So if I was going to boil continual improvement per Dr. Deming down to two things, it's: put your data on a process behavior chart, and combine it with a PDSA cycle to see how to improve that data. That's continual improvement in a nutshell, basically, those two tools.

0:22:51.7 AS: Gold, that's gold. All right. Number 10.

0:22:55.3 JD: Last one, lesson 10: "the purpose of data analysis is insight." This comes from Dr. Donald Wheeler, and he basically teaches us that the best analysis is the simplest analysis that provides the needed insight. What he would say is: plot the dots first on a run chart. Once you have enough data, turn it into a process behavior chart. That's the most straightforward method for understanding how our data is performing over time. And this approach is much more intuitive than storing the data in tables, because the patterns become much more apparent when we're using these time-sequence charts. And again, I know I've said this before, but I keep repeating it because I think it's the essence of continual improvement to do those two things. Yeah.

0:23:47.1 AS: And what's the promise of this? If we can implement these 10 points that you've highlighted in relation to goal setting, what do you think is going to change for me? I mean, sometimes I look at what you've outlined and I feel a little bit overwhelmed, like, God, that's a lot of work. I mean, can I just set the freaking goal and people just do it?

0:24:13.2 JD: Yeah. Well, I think this is, in essence, a better way. I mean, this is really the wrap-up here. One, when you understand the variation in your chart, you actually understand the story, the true story, that's being told by your data. And so many people don't understand the true story. They sort of make it up, and maybe that's too strong, but they don't have the tools to see what's actually happening in their system. So if you really want to see what's happening in your system, this is the way to do it. That's one thing. I also... I tried many, many things before I discovered this approach, but I didn't have any way to determine whether something I was trying was working or not.

0:25:07.1 JD: I didn't have any way to tie the intervention back to my data. So what most people do is tell the story that a thing is working if they like it, and if they don't want to do it anymore, they tell the story that it's not working. But none of it's actually tied to scientific thinking, where I tie the specific point at which I tried something back to my data. So that's another thing: I can actually tell whether interventions are working or not. Or, I always try not to use definitive language: scientifically, I have a much better likelihood of knowing whether an intervention is working or not.

0:25:47.7 JD: So I think the process behavior chart especially, and the way of thinking that goes with the chart, is probably the single most powerful tool that we can utilize to improve schools. And we can teach this to teachers. We can teach this to administrators. We can teach this to students; they can learn how to do this.

0:26:07.1 AS: Yeah. And I think one of the things I was thinking about is start where you have data.

0:26:12.3 JD: Yeah. Start where you have data.

0:26:14.2 AS: Don't feel like you've got to go out there and go through a whole process of collecting all this data. Start where you have data. And even if attendance is not your major issue, let's say, but you have good attendance data, it's a good way to start to learn. And I suspect that you're going to learn a lot as you start to dig deeper into it. And then that feeds into, I wonder if we could get data on this and that, to understand better what's happening.

0:26:41.4 JD: There are so many applications, so many applications. I mean, even just today, we were talking about this: a hundred percent of our students qualify for free and reduced lunch because we have a school-wide breakfast and lunch program. We get reimbursed for the number of meals that are distributed. And sometimes there's a mismatch between the number that are distributed and the number we order, just because of attendance and transportation issues and things like that. But the federal government only reimburses us for the meals we actually distribute to kids. So if we over-order, we have to pay out of our general fund for the meals we don't get reimbursed for. I'm just bringing this up because we were looking at some of that data just today, that mismatch, and even an area as simple as that is ripe for an improvement project.

0:27:40.7 JD: Why is there a mismatch? What is happening? Prior to having this mindset, this philosophy, I would just say, well, they need to figure out how to get the numbers closer together. But you actually have to go there, watch what's happening, and come up with a theory for why we're ordering more breakfasts and lunches than we're passing out. It could be super, super simple: maybe no one ever told the person distributing the lunches that we get reimbursed this way, so they didn't know it was a big deal. I don't know whether that's the case or not; that's purely speculation. Or it could be, oh, we want to make sure every kid eats, so we significantly over-order each day. Well, that's a good mindset, but maybe we could back that off: we're always going to have enough food for kids to eat, but we're also not going to spend lots of extra money paying for lunches that don't get eaten. So there are all different things; even something operational like that is ripe for an improvement project. And the great thing is, if you can study that problem and figure out how to save that money, which by the end of the year could be thousands of dollars, you can reallocate it to field trips, or class supplies, or books for the library, or art supplies, whatever, you know? So that's why I think this methodology is so powerful.

0:29:02.1 AS: Fantastic. That's a great breakdown of these 10 lessons. So John, on behalf of everyone at the Deming Institute, I want to thank you again for this discussion. And for listeners, remember to go to deming.org to continue your journey. You can find John's book, Win-Win: W. Edwards Deming, the System of Profound Knowledge, and the Science of Improving Schools, on Amazon.com. This is your host, Andrew Stotz, and I'll leave you with one of my favorite quotes from Dr. Deming: "People are entitled to joy in work."