You’ve done it! You’ve gone through your process! You’ve identified your problem, you’ve thought up a batch of possible solutions, you’ve chosen one of them and you’ve tested it out. You have a tidy little pile of data sitting in front of you. Now it’s time to make the magic happen: REVIEW
When we look at our data, regardless of whether we’re on the engineering or the physician side of the fence, we usually ask ourselves the same questions:
- Does my data address the original question?
- Did I leave anything important out?
- Did I accidentally throw any bias in?
- Is my data of sufficient quality to form a reasonable assessment?
This takes us back to our freshman-year Scientific Method basics. Regardless of what you do with your data from here, you are probably using a little common sense to make sure that your approach was valid. Engineers and physicians are both subject to blind spots, bias and errors. In fact, let’s review Error Types while we’re on the subject:
- Type I Error – Concluding that an effect or finding exists when it actually doesn’t, sometimes called a false positive
- Type II Error – Failing to detect an effect or finding that actually exists, sometimes called a false negative
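To make the two error types concrete, here is a small simulation sketch (my own illustration, not from this series): a toy detector flags an “effect” whenever a sample mean crosses a threshold, and we count how often it cries wolf on pure noise (Type I) and how often it misses a real effect (Type II). The threshold, sample size, and effect size are all arbitrary choices for the demo.

```python
import random

random.seed(42)

def mean(xs):
    return sum(xs) / len(xs)

def detect_effect(sample, threshold=0.5):
    """Toy detector: flag an 'effect' when the sample mean exceeds a threshold."""
    return mean(sample) > threshold

trials = 10_000
type_1 = 0  # false positives: effect flagged when none exists
type_2 = 0  # false negatives: a real effect goes undetected

for _ in range(trials):
    # No real effect: 20 noisy measurements centered on 0.
    null_sample = [random.gauss(0, 1) for _ in range(20)]
    if detect_effect(null_sample):
        type_1 += 1

    # Real effect: 20 noisy measurements centered on 0.7.
    effect_sample = [random.gauss(0.7, 1) for _ in range(20)]
    if not detect_effect(effect_sample):
        type_2 += 1

print(f"Type I rate (false positive): {type_1 / trials:.3f}")
print(f"Type II rate (false negative): {type_2 / trials:.3f}")
```

Notice the trade-off baked into the threshold: raising it makes false positives rarer but misses more real effects, and vice versa. That tension is exactly why the review step matters.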
When we review our data, both Type I and Type II errors are problematic. Everyone is vulnerable to them—engineers, physicians, chemists, psychologists… No one is immune. The more interesting point about errors may be how we ferret them out.
As we mentioned in an earlier article, it is entirely possible that the Engineer and the Physician will each find problems within the system, but their individual ability to pick these problems out may be subject to their own personal experience in similar situations. We had talked about how an engineer may find a technical problem with a biowidget while a physician may find a clinical issue with the same device. Both problems may cause the biowidget to fail. Both may be identified by either party. However, due to their individual expertise, will the engineer notice the clinical problem as quickly as the physician? And will the physician spot that technical problem? Maybe. Maybe not… It is important during the review phase to ensure that these types of errors are identified appropriately. If you can’t trust your analysis, you can’t trust your next decision…
THE FINAL STEP: REPEAT
Once you’ve gathered your data and performed your analysis, you have a decision to make: Did you find an acceptable solution? Or do you need to repeat the process, starting again with IDENTIFY? As any scientist will tell you, repeating this cycle until you find the solution you need is ultimately the whole point of the Scientific Method. Testing may reveal that a possible solution was completely off base. If that’s the case, you may need to start from scratch (eliminating, of course, that choice from your possible-solution list when you go through the loop again). More often, you’ll find that your possible solution was simply incomplete: parts of it helped, while other aspects didn’t work. A scientist takes all of that information along for the ride on the next pass through the loop to further improve the design. With enough revolutions through this cycle, both our Engineer and our Physician should be able to come up with something that works. Will their solutions be the same? Not necessarily, but successful solutions often share common elements.
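That repeat loop can be sketched as pseudocode-style Python. This is purely my own illustrative sketch of the logic described above; the function names (`generate_candidates`, `test_solution`) and the result structure are hypothetical stand-ins, not anything defined in this series.

```python
def solve(problem, generate_candidates, test_solution, max_cycles=10):
    """Toy sketch of the identify -> test -> review -> repeat cycle.

    Candidates that fail outright are struck from future passes;
    partially useful results are carried forward as lessons learned.
    """
    eliminated = set()
    lessons = []

    for cycle in range(max_cycles):
        # IDENTIFY possible solutions, informed by earlier passes.
        candidates = [c for c in generate_candidates(problem, lessons)
                      if c not in eliminated]
        if not candidates:
            break

        choice = candidates[0]           # choose one possible solution
        result = test_solution(choice)   # test it and gather data

        if result["acceptable"]:         # REVIEW: did it solve the problem?
            return choice
        if result["useful_parts"]:       # incomplete: keep what helped
            lessons.append(result["useful_parts"])
        else:                            # completely off base: eliminate it
            eliminated.add(choice)
    return None                          # no acceptable solution found
```

The key design point is that the loop never revisits a fully eliminated candidate, but it does feed partial successes back into the next round of candidate generation, just as described above.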
So our REVIEW/REPEAT piece of the puzzle looks about the same for our Engineer and our Physician! Just to complete our previous running table:
And just to refresh from where we originally started:
We’ve made it through our Basic Problem-Solving Model! In our next article, we’ll summarize our past few discussions and dive a little deeper into our Engineer vs. Physician differences. No homework this time! Let’s give your friendly and incredibly patient colleague from another discipline a break.
Photo Credit: ar130405 | Pixabay