Thursday, March 28, 2024

Niall Downey | Cardiothoracic Surgeon and Commercial Airline Pilot

by Editor

Niall Downey FRCSI qualified as a doctor in Dublin in 1993 and undertook subspecialty training in cardiothoracic surgery. He now flies transatlantic passenger aircraft for Aer Lingus, and combines his medical and aviation experience by teaching safety management.

Watch the video or listen to the podcast.

Welcome to this conversation with Doctor, or should I say, Captain Niall Downey. I caught up with him after he had finished a teaching session with intensive care nurses.

Niall Downey: I qualified as a doctor in Dublin in 1993 and went through the surgical training system.  I did the surgical training program in Belfast and got my fellowship with the Royal College of Surgeons in Ireland in 1998 and began subspecialty training in cardiothoracic surgery.   

In 1999, four and a half thousand of us applied for the Aer Lingus cadet pilot training program and I was number 11 of the 38 selected that year. It meant I was able to leave healthcare and head off to Oxford, where Aer Lingus paid for my entire training and put me on twin-engined commercial jets at the end of it.

DMacA: As doctors we think a lot about comparisons between the airline industry and medicine.  Tell us about your interest in checklists, safety, and the parallels between medicine and the airline industry. 

ND: Part of the problem is that checklists were added in healthcare as an extra layer of bureaucracy and made life harder for staff. Anything that's a burden will be treated as such, and I think the whole philosophy behind checklists was misrepresented. In aviation we do checklists because we see them as our final safety net to protect ourselves and keep ourselves alive.

DMacA: Is it something in the personality of doctors or is it the culture of healthcare itself? 

ND: I think it's the culture of healthcare itself. A lot of these things are added as an extra layer of bureaucracy: something went wrong, so here's another step in the process, whereas in aviation we try to simplify things as much as possible. Take something as simple as how communication works. People think communication is very easy. It's not. There are three different elements: the person delivering the message, the message itself, and the person receiving the message. In aviation we focus firstly on very simple communication. Secondly, we emphasise closed-loop communication, so that when you give me an instruction, I read that instruction back to make sure I've got it right. I then action that instruction and my co-pilot double-checks that I've actually actioned it correctly. It's a closed loop: we are not finished with that message until it's gone through all the various stages. Communication is equally important in the airline industry and in medicine.
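For readers who like to see the structure explicitly, that loop can be sketched in a few lines of code. This is purely an illustration (every name here is invented for the example, not anything used in aviation software); the point it captures is that the exchange is not over until the read-back matches and the action has been independently checked.

```python
def closed_loop(send, read_back, action, verify, instruction, max_attempts=3):
    """Illustrative sketch of closed-loop communication.

    The loop is not 'closed' until the read-back matches the original
    instruction and the resulting action has been double-checked.
    """
    for _ in range(max_attempts):
        send(instruction)                 # 1. deliver the message
        echoed = read_back()              # 2. receiver repeats it back
        if echoed != instruction:
            continue                      # read-back wrong: re-send the message
        result = action(instruction)      # 3. receiver carries out the instruction
        if verify(result):                # 4. a second pair of eyes checks the action
            return True                   # loop closed: message definitely got through
    return False                          # still failing: escalate to someone else

# Toy usage: a receiver who reads back correctly closes the loop first time.
log = []
assert closed_loop(
    send=log.append,
    read_back=lambda: log[-1],
    action=lambda msg: f"done: {msg}",
    verify=lambda result: result.startswith("done"),
    instruction="set flaps 2",
)
```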

DMacA: Niall told me about an interesting acronym, NITS. 

ND: It stands for Nature, Intention, Time available and Special instructions. To give an example from aviation: if we have an engine fire, we fight the fire and, if we get it out, we then have to decide what we're going to do next. We'll then call the cabin crew to pass on the NITS brief.

When there's an emergency we have a fight-or-flight response, something physiologists call the amygdala hijack. The amygdala is a small area at the base of your brain and, whenever there's an emergency, it causes a startle reflex, shuts down your frontal cortex, and drops you back into primitive fight-or-flight mode. So we're trained that, when we contact the cabin crew, if we begin the message by telling them we're on fire, the chances are they're going to get a startle response and will not hear anything beyond the word fire. Our approach is to call the cabin crew saying: we've had a problem, I'm going to give you a NITS brief, tell me when you're ready. It gives them a chance to get over the shock that there's now an emergency they weren't expecting, to compose themselves, get a pen and paper, and be ready to take down the message, which they know is going to come in the NITS format.

The next important part of the process is: read that back to me. So, I'm now waiting on them to close the feedback loop by reading the information back. If they've missed anything or got anything wrong, I feed the information through a second time. If they're still getting it wrong, I move on to another crew member. We're in an emergency, we're time dependent, and we need to be sure the message has been understood, so I talk to the number two and we start the process again. But we don't finish the process until we're happy we've got a closed loop, that the message has definitely got through, because, once I finish my communication, I might not be speaking to that cabin crew member again until we're on the ground.
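Again purely as an illustration (the field names below are my own, chosen to mirror the acronym), a NITS brief is essentially a fixed-shape message whose delivery only counts once the read-back matches:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class NITSBrief:
    """The four fixed fields of a NITS brief; names invented for this sketch."""
    nature: str                # N: what has happened
    intention: str             # I: what the flight crew intend to do
    time_available: str        # T: how long the cabin crew have
    special_instructions: str  # S: anything non-standard required of them

def delivered(sent: NITSBrief, read_back: NITSBrief) -> bool:
    """The brief counts as delivered only when the read-back matches exactly;
    otherwise the pilot re-sends, or moves on to another crew member."""
    return sent == read_back

# Example brief for the engine-fire scenario described above.
brief = NITSBrief(
    nature="engine fire, now extinguished",
    intention="divert to the nearest suitable airport",
    time_available="about 15 minutes",
    special_instructions="prepare the cabin for a possible evacuation",
)
assert delivered(brief, brief)
```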

DMacA: One of the things that fascinated me was your description of hierarchies and the differences between hierarchies in the cabin crew and hierarchies in medicine.   

ND: There has to be a leader on board.  The Captain is in charge. The buck stops with him or her. They take ultimate responsibility but, if there’s a steep authority gradient, they’ll possibly not get the information they need.  I think we’ve all got stories of consultant surgeons that we’ve worked with who didn’t take well to correction, and didn’t appreciate when information that put them in a bad light was fed to them.  If there’s a very steep gradient, human nature being what it is, people are going to protect themselves.  If, when they fed back negative information before, they were shouted at, the chances are they’re going to keep their mouth shut the next time.  So, we try to have a very flat gradient.  It’s important that it’s not totally flat. Someone still has to be in charge, someone is still the captain, but if you have a relatively flat gradient, it means the most junior cabin crew member can contact me directly in an emergency, if necessary, and feed me information. 

DMacA: One of the most important messages I took from listening to you was the fact that error is acceptable.  Tell me a little about this concept that error is acceptable, error will happen, and that we need to try and prevent it happening.   

ND: Human nature being what it is, error is inevitable.  If error is inevitable, and we know we’re going to make the error, there’s no big deal when we do, there’s nothing to be ashamed of, so we can put our hand up.  In aviation we’ve got what’s called a “Just Culture”. Now, just to clarify, a lot of people think we have a ‘no-blame’ culture but that’s not true.  

Tomorrow I’m taking a 250 million dollar airplane to the States. If I get something badly wrong, I will be blamed, but the difference is that, if I put my hand up and admit to the mistake and cooperate in the investigation, I will not be disciplined and I will not be dismissed.  It’ll be accepted that I’ve been flying for 24 years, haven’t made that mistake before, so something must have been different today that made me make that mistake. 

We have a systems approach: we look at and analyse the system. If I tripped over something today, where was the tripwire, and how can we engineer it out of the system? And then, how can we educate people to look out for it as part of future error management? Again, we see it as a closed system. When we make a mistake, we investigate why it happened, we address it and try to rule it out in future by putting safety nets in place, and that's the case closed. With this approach, our results over the last 50 years have been absolutely phenomenal. What I've been trying to do for the last 12 years is to take exactly the same principles and processes and apply them to healthcare.

DMacA: You have been flying 24 years but tomorrow you’re going off to retrain again.  This is interesting in a medical context. I don’t know how many doctors retrain after 24 years.  Tell me about that idea and how that works in the airline industry. 

ND: We have a series of checks throughout the year. I'm in the simulator twice a year to check that I'm still up to speed. I have ground-based training on a cabin crew simulator. I've got a computer-based technical skills program that we do once a year and, once a year, I do a line check. That's a normal flight, but with a "Check" Captain sitting in the jump seat, supervising the whole flight, to make sure I'm doing it to the company standard.

DMacA:  It’s an interesting concept.  I can’t see that coming into medicine just yet. If an experienced surgeon were to have another experienced surgeon in the jump seat watching their surgical procedures, do you think that’s something we should be aiming towards? 

ND: Some people are talking about that. Atul Gawande, in Boston, who wrote The Checklist Manifesto and was the driver behind the WHO checklist, wrote an article in the New Yorker about four or five years ago suggesting that, as top-level athletes have a coach, why not top-level surgeons? He brought in the idea of having someone at his level sitting in with him, assisting him in surgery and suggesting ways of doing things better and differently, and he found it extremely useful. I suggested to one of my medical colleagues a few years ago that, when you're doing something new or novel, you should have someone very senior with you, like a co-pilot.

Tomorrow I'm doing my first flight on the A330 and I will have an experienced training Captain sitting beside me instead of the usual co-pilot. It is catching on in surgery, and the idea of recurrent training ought to catch on as well. I have recurrent training seven or eight times a year, where I have to prove that I can still handle things like an engine failure or hydraulic failures, so that I'm still up to speed and people are happy that I've done it relatively recently. The same principles, especially around simulation in surgery, are equally applicable in healthcare, and there has been more progress in the last few years than we thought possible maybe 10 years ago.

DMacA: With your joint interest in medicine and in the airline industry you’ve created this company, Frameworkhealth. 

ND: We set up the company 12 years ago and called it Frameworkhealth. The idea is that in aviation, when something goes wrong, we have a basic framework that we default back to. We already discussed the startle response and the amygdala hijack with the fight-or-flight response. The same thing happens on board: when something goes wrong, we have a startle reflex. To counteract that we have a concept which, in Airbus for example, we call the 'Airbus Golden Rules'. We immediately assess whether this is a matter of life or death in the next five seconds. If not, we sit on our hands. In healthcare we have ABC: Airway, Breathing, Circulation. In aviation we've got Aviate, Navigate, Communicate. Aviate: is the plane physically flying, or is it in a state where it cannot stay airborne? Navigate: are we flying somewhere safe, or are we pointed at a mountain? And Communicate: do we need to tell Air Traffic Control that we're going to do something wildly different from what we had been told to do up until now? We then look at the systems on board. Have we got our automatics in place? What's the appropriate level of automation? Is the airplane doing what I would expect it to do in this scenario? Once we've worked our way through that, which takes maybe 10 seconds, it gives us time to get over the startle reflex and re-engage our frontal cortex so that we can make a more measured response. The idea is that we have a framework to fall back on that applies to every scenario on board, regardless of the underlying problem.
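Because the framework is the same fixed sequence regardless of the underlying failure, it can be sketched as an ordered checklist. This is only an illustration of the idea, not Airbus procedure rendered in code; every name below is invented for the example.

```python
def run_framework(life_or_death_now, checks):
    """Illustrative sketch: the same ordered framework runs for every scenario.

    If the situation is immediately lethal we act at once; otherwise we sit
    on our hands and work through the fixed order, stopping at the first
    check that fails.
    """
    if life_or_death_now():
        return "act immediately"
    for name, is_ok in checks:        # the order never changes
        if not is_ok():
            return f"address first: {name}"
    return "stable: take time for a measured decision"

# Example run for an unexpected but non-lethal event.
action = run_framework(
    life_or_death_now=lambda: False,
    checks=[
        ("aviate: is the aircraft flying?", lambda: True),
        ("navigate: are we pointed somewhere safe?", lambda: True),
        ("communicate: does ATC need to know?", lambda: True),
        ("automation: is the right level engaged?", lambda: True),
    ],
)
print(action)  # -> "stable: take time for a measured decision"
```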

DMacA: With the benefit of that experience, if a hospital Chief Executive was interested in bringing these concepts into practice, what would be the first three steps? 

ND: Ideally, we should have more Human Factors professionals in healthcare. I would clarify that I'm not a Human Factors professional, but I am a professional who relies on my application of Human Factors to keep myself and over 300 passengers and crew alive. The UK has about 1.4 million staff in the NHS but perhaps 10 Human Factors professionals working among them. That needs expansion, but it's not going to happen overnight, so what I'd like to do is bring in people like pilots, who use Human Factors on a daily basis, and let us teach what we use. To the Chief Executive, the first thing I would say is: have a look at what we're doing. Start with the basics, or let your staff see a taster session like ours. Invest in trying to change your systems. Acknowledge that things are going to go wrong.

The biggest challenge is trying to change the culture. In healthcare, it's generally been 'name, blame, shame, and retrain'. That needs to change to a "Just Culture", and that has to come from the top. There's been a lot of talk of 'Duty of Candour' and 'Just Culture' in healthcare, but it takes the Chief Executive to walk the walk as well as talk the talk. As a Chief Executive, the next time there's a big incident in your hospital you have to show that this person is not going to be shamed, this person is not going to be disciplined, but this person is going to be welcomed as the most important member of the team. They now have something they can teach us, and can help us work out how we can do this differently. And then, try to apply an error management strategy to every member of staff, from the cleaners right the way up to the Chief Executive him or herself.

It's a whole-team sport; healthcare is not a spectator sport. Everybody's involved, so include all the staff, from the lowest rung of the ladder to the highest. That has to include the passengers and crew in my scenario, and the patients and their families in the healthcare scenario. The most important person on the team is the patient: they can tell you what's wrong, they can tell you what medications they're on, they can tell you if something looks different from what they normally expect, and they have a fair idea of their co-morbidities and how different things interact. It's a very foolish person who would disregard a patient's input.

DMacA: You’ve brought this all together in a book which is due out soon. 

ND: It should be out in May or June of 2023. I've decided to try to get a catchy title. When something goes wrong, and we make mistakes every day, our first reaction is Oops! That's why I've called the book "Oops! Why Things Go Wrong". Hopefully it'll be of interest to everybody, but of particular interest to healthcare, and will bring my two careers together so that I'll have contributed something of value back to my original career, which is where my heart still lies.

Doctor and Captain Niall Downey, there are very few people with your extensive experience in both fields. Thank you very much for sharing your insights with us today.
