Navigating HIPAA compliance with confidence

August 5, 2024

In December 2023, the U.S. Department of Health and Human Services reported that the medical data of more than 88 million people was exposed in the first ten months of 2023. The 2018 Trustwave Global Security Report found that a single healthcare record sold for an average of $250.15, roughly 50 times the value of a stolen credit card, and that 92% of stolen records were criminally acquired. This represents a 9x increase over the past five years, affecting over 145 million people.

Patient healthcare information is the most sensitive, valuable, and prolific security challenge of the present day.

Thankfully, we have this information thanks to the oft-maligned HIPAA law. Truly innovative for its time and often updated over the years, it is a great accomplishment in privacy law.

However, like most laws, its implementation for a business can be fraught. Consulting on HIPAA has become its own industry, with an army of consultants and legal experts. In this episode of SecureTalk we delve into the second edition of “The Practical Guide to HIPAA Privacy and Security Compliance” with its authors, Rebecca Herold and Kevin Beaver. The discussion highlights the importance of a comprehensive approach to HIPAA compliance, common myths, and challenges facing healthcare organizations today. The episode also addresses the growing threat of cybercrime, the evolving landscape of data security, and practical steps organizations can take to safeguard patient information. A must-listen for professionals navigating the complex world of healthcare data security.

Additional resources:

View full transcript

Secure Talk - Rebecca Herold and Kevin Beaver

Justin Beals: Hello, and welcome to SecureTalk, a podcast where we explore the critical world of information security innovation and compliance. I'm your host, Justin Beals, founder and CEO of Strike Graph. Together with our expert guests, we'll provide you tools and tips to help your business.

Hi everyone, and welcome back to SecureTalk. This is Justin Beals, your host. I'm really pleased to be chatting with our guests today, two very deep experts in what we're going to discuss and certainly luminaries in their field. Today's conversation is going to be a lot about HIPAA and the implementation of this law, and privacy, security, and compliance broadly.

Our first guest today is Rebecca Herold. Uh, Rebecca has over 35 years of information privacy, security, and compliance expertise. She is the CEO of Privacy Professor and is a partner at Privacy and Security Brainiacs, a SaaS solution founded in 2021 with her son, Noah. Rebecca has published 24 different books on security, privacy, and compliance over her career.

We're also joined today by Kevin Beaver. So Kevin is an independent information security consultant, writer, and professional speaker, uh, as well as an expert witness with Atlanta, Georgia-based Principle Logic, LLC. Uh, Kevin has authored or co-authored 12 different information security books, including Hacking For Dummies.

And one of the books that we're going to be talking about today, the book that I got to read, is The Practical Guide to HIPAA Privacy and Security Compliance. Welcome to the podcast, Kevin and Rebecca.

 Rebecca Herold: Yes. 

Justin Beals: Thanks. Thank you. It's great to be here. Well, we're really glad for y'all to join us. I think that I certainly talk about this particular type of issue, HIPAA compliance.

A lot in my work, and I find it interesting because sometimes I'll categorize security compliance into two different realms. One is third-party risk management, where I'm a vendor and I'm trying to win a deal; one is regulatory, right? And both can keep organizations from operating in the places that they want to play, so they are critical accomplishments for organizations.

Um, I loved reading the book. I highly recommend it to any team that is considering implementing HIPAA. We're actually in the process ourselves of HIPAA compliance. And the book was an incredible guide to the real, as you say in the title, practical work of getting it implemented. So Rebecca, I wanted to start with, uh, you perhaps, uh, we'll let you field this question first.

It's now  been over 25 years since HIPAA was initially passed. And I'm curious how you feel about that legislation and our commitment in the United States to patient health care. 

Rebecca Herold: Yes, well, first of all, thank you for liking our book. I'm sure Kevin  and I both appreciate that very much. And yes, HIPAA was passed in 1996, but the actual privacy rules supporting the high level goals of the original regulation went into effect in 2003.

The security rule went into effect in 2005. And actually, it was really forward-looking when it passed. The privacy rule has many of the forerunners of what's in other privacy regulations worldwide, such as GDPR. For instance, HIPAA gives individuals rights to access their PHI, allows them to get an accounting of disclosures, obtain copies of their PHI, which is a type of personal data, right, like GDPR talks about, and correct their PHI. And that's protected health information, for your listeners who may not know what that is.

It also covers getting consent to use it and share it and so on, in addition to providing what was generally the first private-industry comprehensive regulation in the U.S. So, it really was a groundbreaker with regard to that. And over the years, HIPAA has been updated several times, most recently in April of this year.

It was a big mistake in my view, based on hindsight, that no meaningful penalties or sanctions were applied in the early years. And the reason was quite good. It was to give the organizations who were just learning about HIPAA a chance to get into compliance without being penalized, but, you know, giving them the benefit of the doubt, if you will.

But the majority of covered entities, or CEs for short, simply interpreted that if there was no actual penalty given for HIPAA non-compliance, then they wouldn't face any consequences for non-compliance. That gave most of the CEs, and their business associates when they became more involved, the perception that they didn't really have to follow it. The first resolution agreements didn't start until 2008.

So there was quite a gap there. So I guess that's, that's just a few starting thoughts about that. I guess Kevin, you probably have a few more to add to that. 

Kevin Beaver: Yeah, I would say, Rebecca, building on what you said, I'm a bit jaded. And there's something that I observed about HIPAA early on, in the mid-to-late 2000s, shortly after the security rule was passed.

And I was, you know, generating some work in that area. And obviously, our book had just come out, and everyone in the industry was talking about compliance. They said that they were compliant, they would, you know, push a lot of paperwork on you, including their notice of privacy practices.

But the interesting thing is that everywhere I went, and still go as a patient, and in many cases a lot of the things that I saw with my healthcare clients: it's smoke and mirrors. Dare I say that? You know, again, here's our notice of privacy practices. This is how we do HIPAA. And we're not going to do anything else because there's no teeth and we don't have any incentive.

It's interesting. Many organizations assume that they're compliant until it's shown that they're not, and they end up on the HIPAA wall of shame. But I think overall the term compliance, especially in regard to HIPAA, is often just a facade. And I'm not just seeing this in smaller providers; I'm talking about mid-sized clinics and hospitals and even very large organizations operating in the healthcare industry that are considered covered entities.

It was a widespread problem especially early on, and I'm still seeing it, because I think people just aren't getting the screws tightened down quite enough to feel the pain.

Rebecca Herold: Can I make a quick comment about that? Because I've seen that too, but I have also seen organizations that were happy to have it, because their security and privacy folks, mainly their security folks, were not able to get the funding they needed until they had it.

And so, yeah, I have seen them improve their security and privacy very much during that time. But I've also seen, just this past year, a cancer research doctor who has 25 employees and had never done anything for HIPAA compliance. He contacted me because he was getting questions from his patients who wanted to fulfill their rights under HIPAA.

And so he thought that he could get into compliance with, wait for it, $200: paying me to get him into full compliance and be his ongoing, you know, remote privacy and security officer. He thought that's all he should have to pay. And it's like, sorry.

 Kevin Beaver: It's that mindset. It's, it's so interesting, Rebecca, that you said that.

It's funny, well, it's not really funny, but I've been going through a terrible health situation over the past five years, seeing over a dozen different doctors trying to get to the bottom of many symptoms I've been experiencing most of my life. I was meeting with a neurosurgeon just a couple of weeks ago.

We were going over my MRI, and I asked if it would be okay to video record the conversation and what was being pointed out on my imaging. This neurosurgeon was quick to point over to the little poster hanging on the wall next to the unlocked computer.

The poster talked about this area being a no-phone zone: no pictures, no video allowed. It was literally me, my son, the neurosurgeon, and the neurosurgeon's assistant. And I was told that I couldn't do that because of HIPAA. And I suppose I would have exposed my own PHI to myself or something. I don't know.

But this is just one of many examples I've seen over the past couple of decades, where everyone is quick to talk about HIPAA. You know, they say, well, we can't do it because of HIPAA, but yet their networks are absolutely wide open. Their physical security is not good, and so on. And it doesn't make a lot of sense, but it is what it is.

And kind of getting back to what you're saying in terms of getting buy-in: the IT and, you know, security and privacy professionals, in many cases they weren't getting that buy-in. And I saw that as well with my own clients, because oftentimes the doctors at the top didn't want to spend money, or they wanted to spend $200 on becoming compliant.

 But it just doesn't work that way. 

Justin Beals: I love these examples of the mistakes in the way we even perceive the activity, because it happens a lot, right? To your point, Kevin, there are some mistakes that we make where things get super onerous, where we're misapplying, you know, the law in a situation because it's convenient for us to try and simplify it.

And we see that in security postures a lot, right? Like, well, the standard or the law asks for this, and so I'm going to buy a tool that's going to solve the problem, right? To your point, Rebecca, they're like, oh, if I just bring in an expert to talk to us about this a little bit, then obviously I've solved the problem, right?

And that's another way of minimizing it. And I constantly tell people that this is going to change the fabric of your operation, both day to day, like what you think about when you're doing your work, as well as how you envision the future of your organization. Yeah.

 Kevin Beaver: Everything. 

 Justin Beals: Tell me, I think that one of the things you pointed out is that the government was a little slow to start putting teeth to this law.

 And it  seems like we may have another situation like that. I'm just considering the parallel. You know, we've been hearing about CMMC for Department of Defense contracts for a really long time. But everyone we talked to says that they're putting off requiring implementation. And it seems to be a patchwork type of situation.

Is that what we saw with HIPAA, where, like, one big hospital leaned way in and, you know, everyone else thought they were buying a document repository tool that solved the patient healthcare information issue?

Kevin Beaver: I'll say that's absolutely what I'm seeing. It's just this patchwork of compliance, and one organization will do it this way, another organization will interpret it and do it that way. And it's just all over the map. It doesn't make sense, and I don't know why, because the HIPAA legislation is actually laid out quite nicely, even the early material that came out in the early two thousands. It's very simple to read.

It's very clear what's expected. But yet, you know, there were some mistakes made early on, again, with the lack of enforcement and with the whole concept of addressability and whatnot. And I think that led to a lot of picking and choosing of what people were going to do, or wanted to do, in order to become quote-unquote compliant.

 And here we are. 

Justin Beals: And Rebecca, you have a comprehensive attitude toward rolling out these programs. I think, in the book, a broader approach, right?

Rebecca Herold: Well, absolutely. Because, to what Kevin was just saying, I've heard that too a lot of times. Clients who come to me, both covered entities and business associates, they'll say, well, which ones do we actually have to, you know, do?

And it's like, you need to look at each of them and determine if it applies to your organization. Some of them are not optional. And then I get into, you know, being addressable versus being required, because that messes up so many of them. So, you know, I have to explain: addressable means that you have to look at what your business ecosystem is like, and you need to determine, do we have a risk with this particular aspect of handling protected health information?

If you do, then you need to actually implement that requirement. If you don't, then you need to document why you don't. You can't just go on your merry way and say, eh, we don't have to worry about it. No, you need to document it to show you've thought about it. And then you have to explain why. But an example of that would be encryption.

Encryption is addressable. And this is something I've run across for, you know, 20 years with HIPAA: people think, ah, that's optional. No, no, no, it is not optional. If you're sending PHI through the internet, that's risky. If you have PHI sitting stored on your server, it's vulnerable.

So that's when you need to look at, you know, what you need to do. But yeah, this picking and choosing: you have to look at things comprehensively, like you said. And I think once you speak with someone who's actually at a high enough level to be responsible and accountable for HIPAA compliance,

who understands that they need to incorporate this into their full security and privacy management programs and not look at it as a separate add-on, then it helps them to realize where they can start plugging in the specifics of the requirements, instead of just trying to do it all separately. Try to do it as weaving it into the fabric of your full security and privacy program.

Kevin Beaver: Exactly. It's got to be integrated into your business. I mean, it's literally a business function, not unlike HR, finance, or legal. It's got to be so interwoven within the business that it's just part of the culture.

It's just part of how you operate. And one last thing I wanted to say about the whole addressable versus required: the thing is, even if you believe that something doesn't apply, that a risk doesn't exist, it probably does. And you just haven't looked hard enough to find it. That's what I'm seeing.

Justin Beals: You know, one of the things that I like to tell people that we're working with, around any of these standards or laws like HIPAA, is that compliance is complex.

And any solution that looks like a magic wand, you need to be a little careful with. Because, I think to your point, Kevin, it's not just that it has impact on parts of the business; parts of the business have ownership over this work that they don't know of, like the VP of HR.

You know, certainly your IT manager will probably think, maybe I should encrypt this data, but they need to understand they literally have a part to play, an ownership in the work itself.

Kevin Beaver: They really do. There's been a big assumption, and I remember talking and writing about this over 20 years ago. I often see this mindset of, well, the IT and security team is taking care of it. This is an IT thing. Oh, that's above my head; that's not for me. So I'm sure IT and security have got their arms around it and everything is good. But I can assure you that is not the case most of the time. These IT and security professionals, they need help. They need support. They need not just the financial backing, but the political backing as well.

Rebecca Herold: Well, and also, HIPAA is absolutely not all technology. I mean, the HIPAA security rule has a whole administrative section that includes things that are human activities: operations that you have to establish, training and policies, procedures, and checking of things, you know.

It also has a physical section, and I see so many organizations not even addressing physical. And it's like, yes, that is a HIPAA issue. And they're like, it's physical, that has nothing to do with it. But no, you're wrong, because if you can physically get to a server, then you have a risk there. If someone can physically get to your phone, which for some reason has your entire database of patients on it, that's a physical access problem, in addition to the technology, which is also a problem. It's not an either/or, it's an "and" type of problem. You have multiple risks involved, and they all go beyond just technology.

Kevin Beaver: Absolutely. Justin, if I can chime in real quick on a quick anecdotal story, I was doing a security assessment for a very large and very well known hospital.

And part of that assessment was to do an internal review of their network environment. So basically, I found vulnerabilities in the network environment. I was able to get a remote command prompt and basically have complete control over a server by just running some exploit software, a tool that I use called Metasploit.

There were tons of network shares that had just absolute gobs of PHI scattered about the network. Well, I actually went on site, and I thought it was interesting. I just wanted to do a quick physical security review, because a lot of the assessments that I do I can do remotely, but for a hospital, and something this important, I wanted to go on site.

One of the very first things that I saw: I went and sat down in the lobby of the organization, and I'm looking over by some vending machines and whatnot, and I see there's a network port on the wall, literally an Ethernet port. So I thought, I wonder what that is. I wonder what I can do.

And again, I'm in the lobby, right across from the security desk where people check in, and there's a thousand people coming and going. I go over there and I plug in and I pull up a chair. I had full access to the entire network, including the vulnerabilities that I was able to exploit from my testing computer, that I'd already discovered. Full server access, network shares scattered about.

It was very eye-opening. And so, to Rebecca's point, yeah, physical security is absolutely huge. And make sure you check all those network ports, because they may just be live, and they may just be exposing everything without you even knowing it.

Justin Beals: I'm really excited that you still carry around a laptop that can connect to ethernet.

Kevin, that's not so common these days. 

Kevin Beaver: Oh, yeah, absolutely. Absolutely. 

Justin Beals: I think that especially the healthcare and data privacy space has changed a lot since 2019. As a matter of fact, I pulled a quote from the Seattle Times. I was wondering if y'all might respond to it. The quote is this:

“In December of 2023, the U.S. Department of Health and Human Services reported that the medical data of more than 88 million people was exposed in the first 10 months of 2023. The department also saw a 93 percent increase in large healthcare related breaches reported to the agency between 2018 and 2022”. 

That's a massive shift. That is enormous. And Rebecca, I'm just curious what you're seeing, as an expert, in this breach situation that the healthcare industry seems to be going through.

Rebecca Herold: Oh, absolutely. Well, just think about it. There is a lot more data being created and being collected as a result of many, many more digital types of computing products, many of which are within the healthcare space.

Medical devices didn't use to be computerized. Now, most medical devices are computerized. And so you have all these devices. You have telehealth, which is absolutely great. I mean, I'm here in Iowa, and telehealth was used and is still used primarily for rural areas of our state, because the hospitals and clinics there have been closed down, but you still need to get healthcare.

So you have a lot more devices. You also have a lot more business associates that are supporting things. Outsourcing is a big change. Once you outsource, all of a sudden you have all these additional servers that are storing copies of the data that you have. So now there's a preponderance of data, and then you have a lot more types of IoT devices that are simply being used within your digital ecosystem that you might not be aware of.

One of the things that I've done over the years: I've been, and been happy to be, a subject matter expert and researcher for the National Institute of Standards and Technology since 2009, and for the past three, four years I've been part of the IoT cybersecurity development team.

So I've had a chance to help co-write and author those standards, technical and non-technical. But those IoT devices, my goodness, those are collecting and sharing so much data. And then you add to that apps, which a lot of organizations don't think about. If you have an app on your phone, how many people read, first of all, what they're allowing that app to do on their phone?

Many of them are sucking the data out of your phone and sharing it with others. And then how many people download something, decide this isn't any fun, and then go on and forget about it? That app is still active and it's still sharing data. So there's a lack of awareness of even where that health data is going, and it makes it so important for HIPAA: even if it doesn't explicitly say you need to have an inventory of all your PHI, you have to know where you're collecting it, where you're storing it, and where you're sending it, in order to be able to map the full data life cycle. Then you can identify where the risky areas are throughout the life cycle.

And I've done that with a lot of organizations, and they are almost always so surprised when they see all of the different places where this data is located. And it's just, uh, interesting. I mean, I can go on and on about that, because there are so many areas throughout just that data and where it's created.

Now with AI, it's just, you know, opened Pandora's box for having PHI being used to train over a hundred different types of algorithms.
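The data-lifecycle mapping Rebecca describes, knowing where PHI is collected, stored, and sent, can be sketched as a minimal inventory structure. The system names and fields below are hypothetical illustrations, not a prescribed model:

```python
# Minimal PHI data-flow inventory: for each system that touches PHI,
# record the lifecycle stages involved and whether the data is encrypted.
# All system names here are hypothetical examples.
phi_inventory = {
    "patient-portal": {"stages": ["collect", "transmit"], "encrypted": True},
    "ehr-database":   {"stages": ["store"],               "encrypted": True},
    "billing-vendor": {"stages": ["receive", "store"],    "encrypted": False},
    "backup-archive": {"stages": ["store"],               "encrypted": False},
}

def risky_locations(inventory: dict) -> list[str]:
    """Flag systems holding or moving PHI without encryption."""
    return [name for name, info in inventory.items() if not info["encrypted"]]

print("Unencrypted PHI locations:", risky_locations(phi_inventory))
```

Surfacing those flagged entries is the point of the mapping exercise: as Rebecca notes, organizations are almost always surprised by how many places PHI actually lives.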

Justin Beals: It's tough, Kevin, because we're increasing the amount of data centralized in more and more stores. We're multiplying the surface area that we need to secure exponentially, it feels like.

Kevin Beaver: Yeah, absolutely. Rebecca made some great points. I'll add a couple of things, uh, based on what I've experienced personally and what I'm seeing professionally. I think, um, I don't know that the attacks are necessarily any different. You could say that the malware is more complicated.

Maybe some of the threat vectors are more complex. But the same stuff is still there, and the same things are still happening. Therefore, the same risk outcomes are occurring. I think, in many cases, there's a parallel to what we have in healthcare. We have better technology now. We have better imaging in healthcare.

We have, you know... I've had so much imaging over the past, uh, handful of years. CT angiogram is amazing technology. MRI, the resolution of that, especially if you have a good radiologist who knows what to look for. Even DMX, and DDA, I believe, is the successor to that. With all of this technology, we're actually seeing more stuff, and therefore it makes it seem like more things are happening.

I think it's been there all along. We're just getting better information and more information in that regard. But I'll also add that I think what we're seeing here is sort of the culmination of many bad decisions being made over and over throughout the years in the healthcare industry, the things that we've already mentioned.

The interesting thing with security is that we reap what we sow. Neil Peart, the late drummer from the band Rush, one of my favorite bands, he wrote a line in one of their songs called Natural Science. He said, “Time after time, we lose sight of the way our causes can't see their effects”. And I think that's what we're seeing in healthcare.

There's a lot of lip service, so to speak, being given to HIPAA privacy and security compliance, but there's not as much substance as there needs to be.

Rebecca Herold: Can I add just one point, though? There are some zero-days that are very significant, and I'll give you an example. I also do expert witness cases, and one of them that I did was so interesting because of the organization it targeted.

The folks there had had, you know, training about phishing and so on, but the attack did not have the red flags that the folks inside had learned. So they fell for it. It actually happened back around 2017; usually these cases come up, you know, years later, Kevin, you know that. But anyway, what happened was it was right in the same month that this new type of spoofing, impersonation of emails, came through, where everything looked good.

I mean, this was the first time that it was used. And so this organization lost, you know, millions of dollars, because somebody wired money and also gave the HR data to a criminal who truly passed the sniff test, if you will. It's like, well, yeah, because people didn't recognize it. So there are a few new things.

And I think that's a challenge for a lot of organizations: you do have a good security and privacy awareness program, and you're training your employees, and they truly want to do the right thing, but then when they apply what they know and it doesn't apply to the new threats, then, you know, it's hard to defend against that.

Kevin Beaver: It's hard to defend against something that's never happened. Yeah. Look at 9/11, you know. You could argue that that was a good example of that. So yeah, great point.

Justin Beals: And I think, to both of y'all's perspectives, something we talk a lot about is that this work is adversarial in nature, right? You know, you have another group, an adversary, like a sports situation.

They're always trying to, you know, break down that security or retrieve that data. And you're never going to always win in an adversarial situation. There will be, you know, wins and losses on either side. And sometimes, I think to your point about a zero-day, Rebecca, we do need to give ourselves some grace.

You know, we have adversaries. It isn't always just a lack of trying when a breach happens. You have a quote in your book, it's on page 38, um, from the Ponemon Institute. In 2013 they had an annual patient privacy and data security study, and you quoted that the study found the primary causes of breaches in 2013 were lost and stolen computing devices, 46%. The secondary was employee mistakes and unintentional actions, 42%. And the next most common were breaches by third parties, let's say like a hacker, 42%. But since that study, especially since 2019, we've seen a dramatic shift in the amount of data that's being breached by third parties, right?

It, it feels like, and please confirm, or let me know if you think I'm wrong, that cybercrime is on the rise in healthcare. 

Rebecca Herold: Well, cybercrime is expanding, for sure. So, all those things you talked about. And I think the third parties, um, it's like the business associates too, because that 42 percent is up to around maybe 48 percent now for business associates who are actually the cause of HIPAA breaches.

But just think about it, you know, from the old saying: why do you rob banks? Because that's where the money is. Why do you rob healthcare organizations for PHI? Because that's valuable. And what's a very effective way to hold a hospital hostage? Ransomware! Ransomware always frustrates me, because, you know, 35 years back when I started, that was back when we made backups, and had multiple backups, and had breach response and incident response plans.

So you could quickly get to those backups that were not connected to the network. I don't see that being done now. Instead, a lot of organizations are just depending upon insurance to pay the ransom, and I think that is a very bad practice, and it allows the crooks to go ahead and continue. And those medical devices we talked about, they're being attacked. IoT devices are being attacked, and cloud services too.

And not only that, the pathways where the data is passing through. Why do people still attach to unsecured Wi-Fi networks when they're sitting in a coffee shop or in the airport, ready to board? And it's like, come on, don't do this, you know, but...

Kevin Beaver: they're doing it.

Or even worse: why do they leave their laptops unlocked and go off to the bathroom, or outside to take a phone call, when they're at Starbucks? I see that all the time.

Rebecca Herold: Yes, Kevin, you know what? Why do people... I must look like I'm so dependable and honest, because they're always asking me to look after their stuff.

Me too, me too. Don't ask me, I could be a crook, you don't know me. Right.

Justin Beals: Um, one interesting aspect that Rebecca mentioned, and I'm curious, Kevin, your thoughts: the value of the data. That's shifted a lot lately. Patient healthcare information is worth a lot of money.

Kevin Beaver: It is. I don't know, I haven't seen the latest numbers, but I know that even the Ponemon Institute has assigned values to certain records and whatnot.

You know, healthcare records are worth X amount of dollars, for patients this and this and this. And I stopped paying attention to that kind of stuff, because it's so hard to keep up with. All I know is it's valuable, and we need to protect it, and we need to get better at all of this. But yeah, there's a huge market for this type of information.

There's, uh, a really large demand for these things to sell on black markets, and however else it's being used to commit fraud, whether it be healthcare fraud, you know, insurance fraud or IRS tax fraud, things like that. Going back to some of the stuff that Rebecca mentioned, I've always been a big believer in focusing on the basics of security, that is, the 20 percent of the vulnerabilities that are creating 80 percent of the risk, and thus the incidents and breaches and whatnot.

And it's things like proper vulnerability and penetration testing. It's software patching. It's passwords. It's data backups. And another really big one that I see, something that I mentioned, is basically inventorying PHI and understanding where it's located on the network so that you can properly protect it. These basics, these core essentials of security, have been around for decades, and they're still being ignored in many cases. I've got a really good book here called Security, Accuracy and Privacy in Computer Systems, written by a gentleman named James Martin.
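[Editor's note: the "inventorying PHI" basic Kevin describes can be illustrated with a minimal, hypothetical sketch. This is not from the book, and real PHI-discovery tools cover far more identifier types and data stores; this toy example only scans text files under a directory for two made-up patterns.]

```python
import re
from pathlib import Path

# Toy patterns for two common identifiers; real discovery tools use many more
# (names, dates of birth, insurance IDs, clinical notes, databases, ...).
PHI_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "mrn": re.compile(r"\bMRN[:#]?\s*\d{6,10}\b", re.IGNORECASE),
}

def inventory_phi(root):
    """Return {file_path: [pattern names found]} for .txt files under root."""
    findings = {}
    for path in Path(root).rglob("*.txt"):
        text = path.read_text(errors="ignore")
        hits = [name for name, rx in PHI_PATTERNS.items() if rx.search(text)]
        if hits:
            findings[str(path)] = hits
    return findings
```

Even a crude inventory like this makes the point: you cannot protect PHI you have not located.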

It's got a lot of the stuff that the HIPAA security rule covers. I'll share just a couple of things from the table of contents. It talks about security exposures, computer errors, security and systems programs, who's responsible for security, auditors, legal controls. The interesting thing about this book is that it was published in 1973.

These issues have been around for how long? Over 50 years. These challenges, these concepts, these very things that we still struggle with. The answers have been in books like this, in books like the HIPAA book that Rebecca and I wrote, in my book Hacking For Dummies. It's out there. 

So we have to get to the bottom of it. Why is it these things are not happening? And I think it's a human challenge. It's political issues. It's cultural issues. The problem almost always comes with hair on top, and that's why I shaved my head. It's a big challenge, and we need to get these things out there in the open and stop trying to implement more technology and put more policies in place and bring in more regulations. We need to just do the stuff that we know already needs to be done. 

Rebecca Herold: Well, if I could go from that health care value to that book, Kevin, in 1991 or '92, I created the first security policies at Principal Financial Group.

I used the military Rainbow Series. 

Kevin Beaver: I remember that. 

Rebecca Herold: And that's all that was there. But anyway, with regard to health care, I think there are two other issues. Too many organizations now, and this is something just in the last few years with artificial intelligence, you can't go a day without seeing AI mentioned at least 30 times, but anyway, people, even people who have access to patient data, are using that data to train AI algorithms.

That exposes PHI in so many ways, not only because it's used in the algorithms and creating biases and so on, but because now that data is sitting over there, unsecured. And then second, throughout my entire career, marketing has always wanted to get a hold of every type of personal data that any organization has.

I've seen it in all my clients. So when we talk about misusing PHI, for organizations who need to comply with HIPAA: have you given your marketing folks and sales folks training about what they can and can't use? Because I've found too many that simply have not established what should be, as Kevin called it, one of the core security rules, and that's: do not use real personal data for marketing and for sales.

So, you know, if we could even get a hold of those things and get them reined in, that could really stop a lot of breaches. Absolutely.  

Kevin Beaver: Great point. 

Justin Beals: Uh, Rebecca, I'm going to now be guilty of making that "back when I was young" comment, but I did work in AI, data science, and machine learning back then. We called it algorithmic programming a long, long time ago. And I think what you're reinforcing is that we have good answers for how to solve these problems. Because even back then, Kevin and Rebecca, I was very curious about a couple of things when we would develop models. One is, what's the life cycle of the data? 

So if we are taking in data, we're going to build a model, and it's going to make predictions, where is the data left? You know, in the model itself? Are we finding it in other places? What's that impact? And then the other thing I worry about is accuracy, which I think is in your book, Kevin. And I complain all the time about AI companies that don't publish accuracy measures.

You know, on the things that they're predicting. We need to know the probability that you're right or wrong before you do that work. Well, I wanted to dig into one other topic. We touched on it briefly in the beginning, but I think it's really important for organizations that think they're HIPAA compliant, are trying to be HIPAA compliant, or are caught up in HIPAA as a business associate.

But you state that “there is a common myth that HIPAA compliance is about technology that can simply be bought and put into place and magically make a CE or a business associate or a subcontractor HIPAA compliant. This is the furthest thing from the truth and any attempts at acquiring compliance in a box without applying an accompanying customization or thought are not only futile, but likely will leave your organization with huge compliance gaps”. Maybe we'll start with you, Kevin. This is, I think, a really important statement for our field. 

Kevin Beaver: Yeah, absolutely. I don't know if I wrote that or if Rebecca wrote that, but it's very true.

One of the first articles that I ever wrote was called “HIPAA security compliance doesn't come in a box”. That was back in 2001 or 2002, I believe. And there was a belief back then, and it's still around, that we can go buy all of this stuff, or we can hire a managed services provider, or we can bring in someone like Rebecca for $200, and all of a sudden be compliant. It simply isn't true. It just doesn't work that way. We're talking about fundamental business concepts that have to be changed, fundamental thought processes that have to be changed: the way that you work, the way that you interact with patients and whatnot. All of that. It's all within the fabric of your business.

It's not just an add-on thing that we can pull off the shelf or go buy from some vendor and be magically compliant. It always gets me, and Rebecca and I have talked about this in the past, when you see a lot of vendors out there claiming that their solutions are HIPAA compliant. Some of them will even go as far as saying, if you buy our solution, you will be HIPAA compliant. Wow, that's a really bold statement, and really dangerous.

But yeah, this is a very fundamental aspect of HIPAA that is misunderstood and misrepresented. Basically, people just don't get it, and we've got to move beyond that mindset. 

Rebecca Herold: And I might add, in case some of you tuned in late, I'm not going to get you into compliance for $200.

That was just a statement made earlier. But anyway, as far as HIPAA at the core, and this is what I always explain not only to my clients but to my students who take the HIPAA courses, you have to look at the risk management requirements within HIPAA, because really, that is the core of HIPAA. If you understand those, it's emphasizing that you need to determine where your risk exists around protected health information, how it's used and shared, and so on. 

Once you understand that, then you can start understanding better all these other things that radiate out of the risk management aspect. And also, you know, risk assessments are a subset of risk management.

So that's why you can't just go through an audit checklist and call it a risk assessment. Those are two different things. And I think once organizations understand that, they'll realize: even though I have a friend, Joe, who also has to be HIPAA compliant, I can't just use the results of his risk assessment and his HIPAA audit, apply them to my organization, and be in compliance too, because the organizations are inherently different.

No two organizations are the same. So organizations need to understand that, and once they do, that will shift their thinking a little bit, so they start realizing, "Oh, here's how we can apply our security policies to HIPAA now," and that includes the many organizations that have security policies, although there are many that still don't. Even though HIPAA might not say something specifically about multifactor authentication, we can actually use MFA to support our risk management program and mitigate the risk we identified within our risk assessment. 

So I think it takes a little bit of time, which technology cannot be a substitute for, to get people to understand what they need to do. Technology does not do everything for you. One last thing before we go on to your next question. It really disturbed me greatly, when ChatGPT all of a sudden started being used for everything from chatbots to everything else, to see a security vendor release a HIPAA security risk assessment tool. And it actually said in the marketing, which brings me to my other point, talk to your marketers before they make claims, but anyway, it said: use ours and you don't even have to worry about doing anything for HIPAA. Use our HIPAA tool, our risk assessment, just plug it in, let it go, and your risk assessment is done. And I read that. 

I thought, Holy cow, they are going to miss so much or they're going to get so many things wrong. Don't believe the promises folks. 

Justin Beals: Well, Kevin and Rebecca, this has been a very fruitful conversation for me. I've really enjoyed it. I deeply appreciate your work in this space professionally, and I want to give another plug for your book, “The Practical Guide to HIPAA Privacy and Security Compliance”.

I really enjoyed reading it. And I know that's weird, but of course, this is a professional concern we all have. I am not a privacy and security expert; I'm a CEO at my firm. And it gave me a perspective on how to be sure that we're implementing it the right way.

So thank you both for joining us today. I look forward to continuing this work together to build a better community in healthcare. 

Kevin Beaver: Absolutely. Thanks, Justin. 

Rebecca Herold: Thanks so much. So nice speaking with both of you guys today. 

Justin Beals: Likewise.

About our guests

Rebecca Herold and Kevin Beaver

Rebecca Herold is the CEO of Privacy & Security Brainiacs, a SaaS services business she founded in 2021. Rebecca is also the CEO and founder of The Privacy Professor consultancy. Rebecca has over 25 years of IT, security, privacy, and compliance experience and has authored 22 books. Rebecca supports all sectors with deep expertise in healthcare, government, industrial, financial, transportation, education, IoT, and consumer security, privacy, and compliance. Rebecca has been an expert for the NIST Cybersecurity for IoT Program for more than three years and was a NIST Privacy Framework Program expert for three years. Rebecca is also an expert witness for IT, security, privacy, and compliance topics, including IoT, digital surveillance, and healthcare cases.


Kevin Beaver, CISSP, is an information security consultant, writer, and professional speaker with Atlanta, GA-based Principle Logic, LLC. With over 33 years in IT and 27 years in security, Kevin specializes in independent security assessments and virtual CISO consulting work to help businesses uncheck the boxes that keep creating a false sense of security. He has written 12 books on security, including the best-selling Hacking For Dummies and The Practical Guide to HIPAA Privacy and Security Compliance. Kevin has written over 1,300 articles on security and regularly contributes to TechTarget's SearchSecurity.com and Ziff Davis's Toolbox.com. He has a bachelor's in Computer Engineering Technology from Southern College of Technology and a master's in Management of Technology from Georgia Tech. Kevin races cars in the SCCA Spec Miata class in his free time and enjoys riding dirt bikes and snow skiing.

Keep up to date with Strike Graph.

The security landscape is ever changing. Sign up for our newsletter to make sure you stay abreast of the latest regulations and requirements.