

An interview with Christine Sublett: Looking to the future of healthcare cybersecurity


The Paubox Encrypted Interview Series allows us to chat with leaders in healthcare IT, compliance and cybersecurity to pick their brains on trends and best practices. In this Encrypted Interview, we chat with Christine Sublett, President and Principal Consultant of Sublett Consulting, on a wide range of cybersecurity topics including how regulation is catching up with innovation in healthcare, what healthcare organizations need to focus on in 2020, and more.

Topics

Early career and professional growth
On regulating healthcare and helping interoperability
What healthcare organizations should work on for 2020
The evolution of HIPAA regulations
Final thoughts

 

Early career and professional growth

Rick Kuwahara: Before you started your consultancy, you had a tremendous amount of experience in cybersecurity and healthcare, from Stanford Children's Hospital to being on the HHS Healthcare Cybersecurity Task Force. So, I'm curious: what is it about cybersecurity that has made you dedicate so much of your professional life to it?

Christine Sublett: Thanks, Rick. That's really a fantastic question. And I had this question ahead of time, and I was really glad I did, because it gave me an opportunity to think about why I've been in cybersecurity this long, and what it was about it that drew me in and has kept my interest through the years.

And in fact, someone recently said to me, "What do you want to do when you retire?" And I thought about it, and, well, actually I can't imagine retiring. I love what I do so much that maybe someday I'll work a little less, but I would miss this world so much that I can't imagine walking away from it.

When I think back to when I first got into cybersecurity, I was on the technical side of things, working in and running IT departments and helping companies build their technical programs and departments. The Security Rule was written in 1999 and went into effect in 2005. And so when you think about that, 20 years in technology is, I think, more like what? 1,000 years in everything else. [chuckle]

The world has changed so much, but the way I like to think about it is, "How do we really secure this data, but do it in a way that doesn't impede patient access or innovation?"

It really brought cybersecurity to life for me in a way that I hadn't thought about before. And over the years, we've watched the attack vectors change and attackers' motivations change, and they've become much more motivated and better funded than we could ever dream of being in the corporate world, or even worse, in healthcare.

The idea that we can stay a step ahead, I think is almost laughable, but even just staying on par with them is a full-time job for literally hundreds of thousands of people.

But it's such a fascinating world, and it changes so quickly, that it really keeps my interest and keeps me excited.

From the perspective of healthcare, there are so many horrifically bad implications when things go awry. And sure, it's bad when a tech company gets hacked and somebody's email and password and even potentially credit card information is stolen. But in healthcare, things can happen that could be devastating to our healthcare sector, like hacks of medical devices, where someone sitting on a couch half a world away hacks into a medical device and changes, for example, the amount of medication an infusion pump is delivering.

So, the opportunity for harm, real actual physical harm to an individual, is significant. And so you take something like that, or you take too many hacks and breaches in the healthcare environment, and people start to lose faith in our ability to actually keep their data safe or deliver their care safely. And that's also of concern.

 

Rick: Right. Great. Yeah, you're never gonna be bored. It's always gonna be...

 

Christine: No. [chuckle] And that's my goal, never to be bored.

 

Rick: So why did you end up starting Sublett Consulting, versus doing something like a corporate role or doing something in-house?

 

Christine: So I had moved from Stanford University to the Chief Security Officer role at Lucile Packard Children's Hospital at Stanford, building an information security program and developing a security strategy, and spent a few years there. Then I moved into the corporate world, into a chief security officer, VP of security role at a private healthcare company focused on disease management, and did that for a few years until the company was acquired by a company in the Midwest.

I stayed on and took over the CIO role and worked to integrate the two technology programs and security, privacy, and corporate compliance programs for both companies.

After about a year and a half of this, I realized two things.

First, I was spending more and more of my time talking with founders and folks who were starting, or thinking about starting, digital health companies about the security and privacy implications of the world they were entering. And the more I thought about it, the more I realized that the fast-paced, exciting world of digital health really appealed to me.

And this was going back almost nine years ago, so it really was the forefront of the digital health movement. Second, I realized that traveling back and forth to the Midwest every other week for work wasn't fitting well with my lifestyle. [chuckle] So between those two things, I decided to jump ship and start my own organization advising digital health and medical device startups, which is where I continue to work to this day.

 

Rick: Great. And what's the biggest challenge or mission that you've found working with the startups?

 

Christine: I think there are some challenges, but I will tell you, overall I find that working with early-stage companies, in a lot of ways, suits my personality more than working in a corporate environment. In corporate environments, probably in any industry but particularly in healthcare, things don't move nearly as fast.

Decisions to steer the ship in another direction aren't made very quickly. And that's one of the things I really love about early-stage companies: they do move quickly.

I think, though, that one of the significant challenges any consultant or advisor might face with an early-stage, or frankly any stage, company is that unless you're in-house, there are sometimes things you might not be privy to that are really relevant: things folks may assume you know, but because no one's actually explicitly told you, you don't.

 

On regulating healthcare and helping interoperability

Rick: Now that it's been a few years since you were on the HHS Health Care Cybersecurity Task Force, how do you think the government has been doing in supporting cybersecurity efforts? For example, the recent Stark waiver allowing for cybersecurity assistance.

 

Christine: Right, so that's a proposed rule. And just so I can level set: you're talking about the proposed policies by CMS on Stark and by the OIG on anti-kickback, creating a safe harbor for the donation and receipt of cybersecurity technology and services.

And first, let me say I am absolutely ecstatic at these proposed policies, and I'm hopeful that after feedback from some of us and different professional groups, that they will take our feedback under consideration, potentially make some changes, and push these through.

I spent a year serving on the HHS Health Care Cybersecurity Task Force in 2016 and 2017, and this was actually one of the recommendations in our task force report. It's also one of the things that, honestly, when we looked at the full scope of recommendations in the report, I think a lot of us thought might be among the most difficult to accomplish.

Historically, very few Stark waivers have been granted since Stark went into effect. And so, this is a really exciting thing.

And if it's okay, I'll describe a little bit about why this is so interesting and so important for all of us.

 

Rick: Yeah, that would be great.

 

Christine: Okay. And I'm going to talk specifically about healthcare here. The vast majority of healthcare in the US, about 90%, is delivered by practices of nine or fewer providers.

Most of the healthcare people are getting is not from a huge organization like a Stanford Hospital or a UC San Francisco or a New York Presbyterian or a Mayo. It's coming from a small provider in a small to mid-sized town, and the vast majority of these folks don't have someone on their staff with cybersecurity expertise.

And so, they either don't have the expertise, or they aren't willing or able to put forth the money for those cybersecurity resources. Or they can't find the resources, because in this country we're really short on the people who actually do that kind of work.

And so, it's a huge challenge, when you think about 90% of the healthcare being delivered by these organizations, where they have inadequate cybersecurity resourcing and technology.

And so, what this waiver, this change of policy, would do is allow organizations like... I don't want to name any specific vendor, but a cybersecurity vendor, to provide technology at no or low cost to providers.

Or it would allow a larger healthcare entity to provide these resources. Let's say you're a giant healthcare conglomerate and you have a bunch of community physicians who partner with you, but they're not part of your organization. Because of the anti-kickback rules, this giant conglomerate couldn't give a smaller community-based physician a firewall, or antivirus software, or any other type of cybersecurity technology. But this proposed policy change would allow these types of things to actually happen.

And the reason this is so important to all of us is that part of healthcare's aim right now, and for many years, has been this concept of interoperability: the ability to share data freely amongst providers so that patients can get the care they need when they need it.

And that's a wonderful goal, and something we really should be aspiring towards, but part of the problem there is when we start connecting these systems.

It's like the analogy of a chain: it's only as strong as its weakest link. And if your weakest link is a community provider with two computers, no passwords, no firewall, and no antivirus, where a hacker from Eastern Europe has taken up residence, you probably don't want to connect that to your systems.

But if you, as the big conglomerate, can say, "Now I'm confident that you have adequate security practices on your side," it makes the sharing of data, and the delivery of the kind of healthcare we want to deliver in this country, more of a reality.

In terms of other progress we've been making at the federal level, there is also a group called the Healthcare Sector Coordinating Council, and this group maintains a joint cybersecurity working group made up of government and industry partners.

And we've been working diligently since the task force report came out to address the recommendations in the report, and we have put out what I think is some incredibly fine guidance in a variety of areas, including medical device security, telehealth and telemedicine security guidance that's being finalized, and a whole series of wonderful guidance for industry.

And what makes this so unique is that the Joint Cybersecurity Working Group has been made up of professionals not only from healthcare, what we think of as patient-delivery healthcare, but also security consultants and technology vendors and folks from government, from HHS, DHS, and the FDA, the federal departments that have a stake in this game, as well as folks like the medical device manufacturers, digital health companies, and EHR companies.

And so, what's great about the different sets of guidance they're issuing is that nobody gets everything they want, but we all agree it's the right approach. A great example is the Joint Security Plan, a document of security best practices put together by medical device manufacturers and healthcare providers.

And what we've said in this document is that there are things a healthcare provider needs to do to ensure these devices are secure, and there are things a medical device company needs to do to ensure these devices are secure. And this is how we're going to work together to create an environment where we can use these devices safely in healthcare.

So, it's fantastic, because the medical device manufacturers have agreed to their set of requirements, and the healthcare providers have agreed to theirs. And so when a medical device manufacturer goes into a hospital and wants to sell their device there, the hospital can pull out this Joint Security Plan, the JSP, and say, "Here's a great checklist of all the things you should have built into your device from a security perspective; talk to us about where you are with these." And the medical device manufacturer hopefully has seen it before, and can say, "Here's exactly how we're doing that."

We're trying to create a situation, with all of these different sets of guidance, where healthcare providers can look at them, no matter their size, no matter how many security resources they have or don't have, and figure out how to start from where they are. That ranges from, literally, "We haven't done much, and we know we really need to do things; help us understand how to start," to more mature programs that already have some pretty amazing systems and processes in place but can still move a little further along.

 

Rick: That's great, and it sounds like it's really helping to address some of the gaps that get created when all these new innovative technologies come along. There's a gap in adoption, and also a gap in, like you said, the security knowledge needed to make sure that if you are implementing them, it's done in a safe way.

 

Christine: Exactly.

 

What healthcare organizations should work on for 2020

Rick: So as we wind down 2019, a lot of organizations are planning for next year. Are there any security areas that healthcare organizations may not be focusing enough on?

 

Christine: Definitely. When I think about what organizations should be planning for in 2020, I think that to a large degree, because of things like the task force report, which has now been out two and a half years, the work of the Healthcare Sector Coordinating Council, and, frankly, the sheer number of healthcare organizations being hacked or suffering some type of inappropriate disclosure of data or cybersecurity event, whether data is disclosed, or they've had a ransomware attack, or they've had a ransomware attack and can't function and are truly unable to deliver healthcare, which we're seeing more and more often throughout the world, organizations are just more aware of the problem and understand that they need to be more proactive.

I think most healthcare entities at least know that they have these issues they need to focus on.

A lot of organizations are starting at a fairly low level from a cybersecurity posture perspective, so doing a risk assessment to understand where the risks are, so they can address those gaps first, is critical.

Some organizations, particularly those that don't have cybersecurity leadership, often end up hearing about a technology, thinking it sounds like it might address some of their issues, and buying it. And they may get some value from it, but without understanding where the gaps are and the level of risk associated with those gaps, it's really hard to take a limited amount of dollars and human resources and attack the things that bring the biggest value from a risk reduction perspective.

And so having a program in place to help you understand the gaps, and the risk associated with them, is critical. You can have a list of 100 things that are truly security issues, but many of them probably don't present nearly the level of risk that a handful of them do.

So what are the things that really present that level of risk to the organization, and how do we address those? That's generally how I like to focus with the companies I advise: yes, we have a basket full of things [chuckle] we should be doing, but we can't do them all today. So how do we mitigate the greatest amount of risk in the shortest amount of time with the budget and resources we have to work with?
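
To make that concrete, here is a minimal, hypothetical sketch of the prioritization exercise Christine describes: score each gap by likelihood and impact, then spend a limited budget on the highest-risk items first. The gap names, scores, and costs below are invented for illustration, not drawn from any real assessment.

```python
# Hypothetical sketch: prioritize security gaps by risk under a fixed budget.
# Scores and costs are illustrative only, not from any real assessment.

from dataclasses import dataclass

@dataclass
class Gap:
    name: str
    likelihood: int   # 1 (rare) .. 5 (near-certain)
    impact: int       # 1 (negligible) .. 5 (severe)
    cost: float       # estimated remediation cost in dollars

    @property
    def risk(self) -> int:
        # Classic qualitative risk score: likelihood x impact.
        return self.likelihood * self.impact

def plan(gaps: list[Gap], budget: float) -> list[Gap]:
    """Greedily pick the highest-risk gaps the budget can cover."""
    chosen = []
    for gap in sorted(gaps, key=lambda g: g.risk, reverse=True):
        if gap.cost <= budget:
            chosen.append(gap)
            budget -= gap.cost
    return chosen

gaps = [
    Gap("No MFA on email accounts", likelihood=5, impact=5, cost=3_000),
    Gap("Unpatched legacy file server", likelihood=4, impact=4, cost=8_000),
    Gap("No security awareness training", likelihood=4, impact=3, cost=2_500),
    Gap("Verbose error pages on intranet app", likelihood=2, impact=1, cost=1_000),
]

for gap in plan(gaps, budget=10_000):
    print(f"risk {gap.risk:>2}  ${gap.cost:>7,.0f}  {gap.name}")
```

The point of the exercise is exactly what she says: a list of 100 findings collapses into the handful that buy the most risk reduction per dollar.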

The other thing I think companies in healthcare, as well as in other verticals, need to be doing much more of is incident response planning. I think for a long time, many companies have really believed they aren't a target. I truly cannot tell you how many times in the last 30 years I've heard, "We've never had a breach."

And I have to confess, every time I've heard that, there's a little voice in my head that says, "That you know of."

Because in many cases, companies that say this don't actually have the monitoring tools in place to even know if they've had one. So it's truly not a matter of whether a company hasn't had a breach, or thinks they haven't had one. And a lot of folks think that because they're small, they're not a target.

The reality is, everybody's a target. If you're on the internet, you're a target.


And so I think what we have to do is stop thinking like that, and start thinking more about how we will respond when we have an incident. Because we will all have incidents. Many of them will not reach the level of a breach, an inappropriate disclosure of information, or a system compromise, but we'll all have incidents: things that reach the level of what our organization considers something requiring an organized response.

And so I think companies need to spend a lot more time thinking about what their incident response plan looks like. Who are the members of the incident response team? How are you going to work through these incidents? When do you actually invoke your plan? And then, of course, what I think is one of the most important pieces: testing that plan.

That means having a real tabletop exercise, where you work through scenarios designed to help your organization bring the right people to the table, and literally walk through a pretend scenario from a cybersecurity perspective, using your incident response plan and understanding how you're going to react when it happens.

Because it is truly not an "if this will happen" question; it's a "when this happens" question. And frankly, these types of events are so prevalent now. It used to be believed that if an early-stage company had a breach or an inappropriate disclosure of information, that would be the end of the company.

What I can say is that I have seen several early-stage companies have reportable incidents, which they had to report to their customers, and the customers sometimes had to turn around and report to HHS OCR or other regulatory agencies, and it has not been the end of the company.

And the key really is how your company responds to the incident. The first time the members of your incident response team see that incident response plan should not be when you're working through a real incident.

It should already be something you've tested, where you've had the opportunity to evaluate it and go, "Oh my gosh, we should have had somebody else from legal at the table with us. Why didn't we have legal here?" Or, "We should have these other types of monitoring tools; how on earth are we going to know if this is happening if we don't have them?" Those types of questions should arise.

And so, it really gives a company, and a team, an opportunity to learn how to work together and work through these issues before there's a critical event.

 

Rick: It's a good point. And we see it in the news all the time, where one smaller company, like you said, has a massive attack and has to go out of business because they're locked out. Whereas another organization may have had the incident, but they were ready for something to happen: they had the incident response process in place, they were able to recover, nothing happened, knock on wood, there was no disclosure of information, and they didn't have to pay the ransom, because they were ready.

 

Christine: Exactly.

 

Rick: Great point. And a lot of these cyber attacks we're seeing hit healthcare, like those ransomware attacks, are heavily focused on people as the initial attack vector. So how much can be done, when you're doing your planning, to mitigate that risk? Or can anything be done at all?

 

Christine: Right. So, I do think there are things companies can do, and you're spot on about people.

25 years ago, I remember listening to some security people joke that if we could just take people out of the loop, our systems would be way more secure. And the truth is actually that is true, [chuckle] but we don't have that luxury.

So, there are really three things that I think are probably the top things companies can do here.

One, from a technology perspective, is to put in place some type of filtering capability that can actually identify the phishing or other social engineering emails that are trying to come into the organization.

Because you're right, that's the number one attack vector right now. And so, if we could at least stop some of these, and if we can use threat intelligence and indicators of compromise to filter out known-bad email addresses and other markers, we would at least have a leg up.
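
As a minimal illustration of that idea, and not any specific product, here is a hypothetical sketch that screens inbound messages against a blocklist of known-bad sender domains, which is about the simplest kind of indicator-of-compromise feed. The domain names and messages are invented for the example.

```python
# Hypothetical sketch: filter inbound mail against an indicator-of-compromise
# (IOC) feed of known-bad sender domains. All data here is invented.

from email.utils import parseaddr

# In practice this set would be populated from a threat intelligence feed.
BAD_SENDER_DOMAINS = {"paypa1-secure.example", "hr-benefits-update.example"}

def is_suspicious(from_header: str) -> bool:
    """Flag a message whose sender domain matches a known IOC."""
    _, address = parseaddr(from_header)
    domain = address.rpartition("@")[2].lower()
    return domain in BAD_SENDER_DOMAINS

inbound = [
    "Payroll <alerts@hr-benefits-update.example>",
    "Alice Chen <alice@trusted-partner.example>",
]

for sender in inbound:
    verdict = "QUARANTINE" if is_suspicious(sender) else "deliver"
    print(f"{verdict:>10}: {sender}")
```

Real filtering products layer many more signals on top of this, but the core loop of checking inbound mail against curated indicators is the same.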

Now, it's not going to stop everything, right? So, we do have to train our workforce.

I think there are two things I would do with the workforce. One is to train. There are some great security awareness programs out there to train individuals on how to recognize different types of social engineering attacks.

Certainly, not all social engineering attacks are phishing attacks, but phishing is by far the vast majority. Email is really the primary way a lot of account and credential compromises take place, and ransomware is coming in through some type of email as well.

Helping your workforce understand how to recognize these emails, and what to do if they suspect they've received one, means they can make good decisions about what they might be dealing with.

The other is to test. There are also some great programs and technologies out there for testing your workforce.

These products allow you to set up particular types of social engineering emails and actually test the workforce, to see if you catch anyone clicking on things or responding to things they shouldn't be. And I have seen organizations where, on the first test, 70% of the workforce clicks and maybe 30% of them actually enter their credentials.

Rick: Wow.

Christine: Right, I know. It's just staggering. [laughter] And I see numbers like this, and I'm thinking, "Oh gosh, this is not good."

But once you identify the users who clicked, and the users who also entered credentials, you give those folks further training, and you explain to them again, "This is why we're doing this." It's not a punitive thing. They're not in trouble. "But let us help you understand how to identify these things, so you don't do this when it's real."

And what I've seen is the amount of improvement between the first test and the second test is just, it's really significant.

At that one company I'm thinking of, where 70% clicked and 30% entered credentials, the second time fewer than 10% clicked and only 1% entered a credential.

That shows it was worth every cent of the training and the testing. And the fantastic thing is that these programs for testing and training your workforce on social engineering attacks are actually really inexpensive. It's probably one of the most cost-efficient ways for a company to reduce its risk.
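
To put numbers on that improvement, here is a tiny hypothetical sketch of the kind of metric a phishing-simulation program tracks between campaigns. The counts are invented, chosen to mirror the ballpark figures Christine cites: roughly 70% clicking and 30% entering credentials before training, dropping to under 10% and 1% after.

```python
# Hypothetical sketch: track click and credential-entry rates across
# phishing simulation rounds. Counts are illustrative only.

def rate(hits: int, recipients: int) -> float:
    """Percentage of recipients who took the unsafe action."""
    return 100 * hits / recipients

# (label, recipients, clicked, entered_credentials)
campaigns = [
    ("Before training", 200, 140, 60),   # 70% clicked, 30% entered creds
    ("After training",  200,  18,  2),   # 9% clicked, 1% entered creds
]

for label, n, clicked, creds in campaigns:
    print(f"{label}: {rate(clicked, n):.0f}% clicked, "
          f"{rate(creds, n):.0f}% entered credentials")
```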

 

The evolution of HIPAA regulations

Rick: That's great. So, I do have one follow-up question that we can kinda weave in. It goes back to your first answer. It struck me when you said the HIPAA Security Rule was written in 1999.

 

Christine: [laughter] A long time ago.

 

Rick: So how much does HIPAA have to evolve? There's HITECH and everything, but how does HIPAA have to evolve to meet the changing technology?

 

Christine: So, HIPAA has great opportunity for evolution.

The Security Rule was written in 1999 and went into effect in 2005, and the Privacy Rule was written in, I think, 1996 and went into effect in 2003. And as you mentioned, there was the HITECH Act, which did update some of those requirements.

But primarily on the security side, what it did was push the same set of security requirements from the covered entity down to the business associate.

Before the HITECH Act, the business associate only had those requirements if they were contractually obligated, based on their contract with the covered-entity customer. In many cases they already were, but this made it a direct legal requirement as well.


HIPAA was designed to address a fairly limited set of healthcare data. It certainly was never designed to cover all healthcare data. So currently we have a pretty significant gap in our regulatory framework from a healthcare data perspective.

And when you think about how that world has expanded in the last 20 to 25 years, HIPAA looks even less adequate today. I'm definitely looking forward to HHS issuing updates or additional guidance to HIPAA in 2020.

And of course, we'll have the opportunity to comment on these proposed rule changes. But HIPAA really is fairly narrow because, again, it doesn't cover most healthcare data; it covers a small subset. It doesn't cover any consumer information, any consumer healthcare data.

So when you use all of these different apps that are now collecting data from devices or input by patients or imported from a record you download from your healthcare provider, none of that's covered by HIPAA.

That data sits on your phone, and in many cases it's uploading to the app provider's cloud environment, where you probably know almost nothing about how they use your data or how they share it. And currently, HIPAA doesn't cover any of that.

And so, the way I view HIPAA and whatever changes are proposed is that unless they expand the scope of the data being regulated, we will still have significant gaps. And for the record, I don't expect HHS to expand the scope of HIPAA, and I don't know that HIPAA is the right place to address this versus an overarching privacy law.

Some states are trying to address this at the state level, with California the leader.

Our CCPA goes into effect January 1st, but it also has significant carve-outs. It doesn't cover non-profits. It doesn't cover HIPAA-covered data, which is exempt from it. And it doesn't cover any organization with revenue under $25 million.

If we really want to address healthcare privacy from an overarching perspective, a federal regulation is our best approach. It remains to be seen whether any of the proposed changes to HIPAA will address this, but frankly, I suspect not.

 

Rick: Right. And just to clarify, when you say it doesn't cover consumers: when a consumer downloads or has their own data, there's nothing. It's basically free for anybody after that. There's no regulation if it's, like you said, a Fitbit or something like that. They're not a covered entity, right? So they're not covered under the scope of HIPAA. Is that what you're referring to, or is it something...

 

Christine: Yeah. So, what I said is that consumer data is not covered.

 

Rick: Right. Just making sure.

 

Christine: If a customer enters data into an app on their phone that's not a HIPAA-covered app, then it's not covered by HIPAA.

Or if they download, even if a patient downloads their record from a covered entity, from a healthcare provider, and then they upload it to an app, it's not covered.

So your medical record is covered when it's sitting with your doctor, but once you download it and do something with it yourself, it's not covered.

And that's baffling to most people, including me. [laughter] So it's a question of, "Should privacy protections and security requirements follow the data?" And there are a lot of us who think that maybe yes, they should.

But at the very least, my primary concern is that as we look at appropriate security controls for healthcare data, regardless of whether it's covered by HIPAA or not, we implement appropriate controls, but do it in a way that does not make it difficult for individuals and their families to receive and share their data as they wish. Because ultimately, this is about as personal a set of data as you can imagine.

People should have the ability to do exactly what they want with it.

Final thoughts

Rick: How do you keep up with industry trends? Are there any good podcasts or blogs that you kind of follow?

 

Christine: I read an immense amount. I'm a voracious reader. And I've often thought if I didn't do cybersecurity, I'd want a job where I got to read and get paid for it.

At this point in my career, I spend a lot of time talking with some of my peers and with a lot of different cyber security leaders and others, particularly in the medical device and healthcare space.

And I think the bulk of what I pay attention to today is primarily medical device security and the different regulatory approaches to it, including how the FDA here in the US is approaching it, and how their counterparts in the EU, Japan, Canada, and other countries are, as well as a lot of the guidance published by different regulatory and policy entities throughout the world.

 

Rick: Great. And what do you do to de-stress and relax?

 

Christine: I am an ultra-distance bicycle racer, and I truly do my best thinking on a bike. In addition to doing my best thinking, I find it incredibly relaxing; I really think of it as my moving meditation time. Even my coach thinks I'm a little crazy, because he was a track racer, which is the short-distance end of bicycle racing, and I'm at the opposite end of that spectrum: my races tend to be between 200 and 500 miles.

 

Rick: Wow.

 

Christine: I know, that's pretty crazy. But I'm really fortunate in that I have an understanding family, and I'm sponsored by Hammer Nutrition. And so, cycling is really, really pleasurable when I'm well-fueled and well-supported at home.

 
