Episode Show Notes
Episode 3 features Ernie Edmonds, Sr. Managing Consultant of Cyber Physical Security Services at CMTC. Ernie explains what Zero Clear is, where it came from, and how it’s structured. In addition, Ernie debunks the common misconception that data security is a “big company” problem and discusses warning signs that an attacker may have access to a company’s data. Ernie concludes the conversation by offering suggestions for how SMMs can help secure their data as well as how they can better educate their employees to prevent attacks.
Ernie Edmonds is Sr. Managing Consultant of Cyber Physical Security Services at CMTC. Ernie has over 25 years of practice and leadership experience in the information assurance field. He holds numerous certifications, including Microsoft Certified Systems Administrator/Engineer, Certified Ethical Hacker, Certified Information Systems Security Professional, Forescout certifications, and many others. He has led some of the largest infrastructure and cloud datacenter deployments in the world as technical & product leads and has also worked with small and medium-sized companies.
00:00:00 - Introductions
00:01:21 - Definition of Zero Clear
00:05:21 - Types of bad actors
00:07:27 - Further discussion about the meaning of Zero Clear
00:11:26 - Types of data to protect
00:15:48 - No manufacturer is too small to be a target
00:21:49 - Who harvests data and why
00:26:39 - Seemingly innocuous data that can be used against you later
00:30:49 - What an SMM can do to secure their data
00:37:39 - Best way to educate employees about cybersecurity
00:41:07 - Warning signs that a company’s data has been compromised
00:43:21 - How employees who work from home can keep their data safe
00:46:07 - Advisability of establishing a relationship with local authorities before cyber incidents occur
00:48:50 - Precautions software developers should take during the early stages of a project
Gregg Profozich [00:00:02] In the world of manufacturing, change is the only constant. How are small and medium-sized manufacturers (SMMs) to keep up with new technologies, regulations, and other important shifts, let alone leverage them to become leaders in their industries? Shifting Gears, a podcast from CMTC, highlights leaders in the modern world of manufacturing, from SMMs to consultants to industry experts. Each quarter we go deep into topics pertinent to both operating a manufacturing firm and the industry as a whole. Join us to hear about the manufacturing sector’s latest trends, groundbreaking technologies, and expert insights to help SMMs in California set themselves apart in this exciting modern world of innovation and change. I’m Gregg Profozich, Director of Advanced Manufacturing Technologies at CMTC, and I’d like to welcome you. In this episode, I’m joined by Ernie Edmonds, Senior Managing Consultant of Cyber Physical Security Services at CMTC. Ernie explains what Zero Clear is, where it came from, and how it’s structured. In addition, Ernie debunks the common misconception that data security is a big company problem and discusses the warning signs that an attacker may have access to a company’s data. Ernie concludes the conversation by offering suggestions for how SMMs can help secure their data as well as how they can better educate their employees to prevent attacks. Welcome, Ernie. It’s great to have you here again today.
Ernie Edmonds [00:01:20] Thanks, Gregg. I appreciate it.
Gregg Profozich [00:01:21] Ernie, I’m excited about our conversation today and looking forward to hearing your perspectives and your insights. Let’s get started. We’re here to talk about Zero Clear. This is a new initiative. Can you define it for our listeners to get us off and set some context?
Ernie Edmonds [00:01:35] Sure. Zero Clear. When I came up with the concept, it was something … I spent a lot of time on the dark web, and I was quite amazed. I had only been there for a little while, but I was quite amazed at how much people were talking about people’s identity—whether it would be spoofing or whatever it would be—just how many people or organizations are listening to whatever is sent over the internet or through voice, like voice over IP, and it’s sent in the clear. People are harvesting those. It can be run-of-the-mill hackers, or hacktivists, whatever that would be. But then there’s nation-state actors in our own government, different agencies there, then other governments, and then private companies, whether that would be Amazon, Microsoft, Google, whomever. There’s just so much being collected. It’s not just collected; it’s being harvested. When I would see somebody post a bunch of information on the dark web, I would see this information flowing freely. This is people’s information, and organizations’ information, businesses’ information. It can be something pretty common like a tax ID, or it could be financial data—their bank account information, or what they bought, or what this person has bought for this organization. It’s just crazy what’s out there. It’s like, “We have to do something.” That’s what Zero Clear is about. It is simply sending nothing in the clear. When we talk about sending data, it’s data on the move. There’s clear text, and then there’s plain text and cipher text. Not to get into the details there, but you don’t really want to send anything—I say you, all of us—that can be intercepted and harvested by a third party. You may say something like, “Well, I don’t have anything that somebody would want.” Yes, you do. There is so much information that just flies around. By itself, all right, maybe no harm, no foul. 
But you put it together, and somebody is able to glean a really good sight picture of you, and your family, and where you live, and where you go when you do these things. It’s just creepy what’s going on. That’s the purpose of Zero Clear—to protect and not have anything, whether it would be data at rest like a hard drive or whatever. But especially when it’s on the move, sending nothing in the clear to where somebody could intercept it.
Gregg Profozich [00:04:05] You said a lot there. I want to take it apart and just make sure I fully understand and make sure our listeners have an opportunity to understand if it’s not clear to them at this point because it’s not clear to me, for sure. You’re saying that you in your work in cybersecurity on the white hat side have spent some time on the dark web and found lots of data floating around. That data that’s available out there can seem somewhat, I’m guessing, innocuous, but it’s one, or two, or three pieces of a puzzle. When I get 100 pieces of a 150-piece puzzle, I have a pretty good picture of what’s going on. It’s one of those kind of things where little pieces don’t seem like much, but when you have the ability to assemble them, you can.
Ernie Edmonds [00:04:41] The amount of data, like you’re saying, the data points, you get three data points, you’ve got a little bit of what’s going on, but you get to 100 data points, that’s usually significant, whatever the total would be. Humans have the ability to use inference. If I infer that somebody has this, that, and the other, I can figure out the rest of it. With the advent of AI, that’s going to come into more play as well. But we, as just typical humans, have a really good capability of doing that. You don’t have to have somebody’s entire life to figure out their whole life.
Gregg Profozich [00:05:13] One piece of data is a point, two is a line, and three is a trend. One hundred is what? One hundred is a whole lot of information.
Ernie Edmonds [00:05:20] Really good trend …
Gregg Profozich [00:05:21] It’s a really accurate trend. The more they collect … And they’re collecting everything. It sounds like it’s not individuals. It’s not your basic Matthew Broderick from WarGames back in the day, hacking things from his bedroom with his dial-up modem. This is organizations. These are people who are organized and structured, are they not?
Ernie Edmonds [00:05:38] It’s all of the above. If there is a hacker, they have a limited collection ability compared to, say, a government organization or a different nation-state, whether that would be China, or Venezuela, or whomever. Russia, they’re obviously a big player. With the tools that are available now, you can be a low-level hacker and still gain a lot of data, especially if you’re motivated towards an individual. Maybe you are mad at somebody, or something’s happened. If you have even just a script kiddie capability as a hacker, you can gain a lot of this information and then use it to your advantage and not theirs. It can be weaponized. That’s a term we use a lot—weaponization of information. It can be used against you. It’s already being used against you. Think about somebody like Amazon. You’ve got an Alexa or something like that. I don’t recommend any of those things. These things are always listening. You can tell they’re always listening because you’ll talk about a particular product, and the next thing you know, here it is in your inbox with “Here’s a sale.” Do you really need that product? If you buy it, is that really the best thing to do for your family or whoever you’re trying to provide for? That’s why I say it’s already being weaponized against you because separating you from your money is often not the best thing for your interests.
Gregg Profozich [00:07:03] It’s the best thing for their interests. In that sense, Google and Amazon are not a whole lot different than hackers. They just do it. They go about it differently.
Ernie Edmonds [00:07:12] Well, it’s not even a hack; you’re just giving them the information. Why wouldn’t they use it? You bought that device. You told it it was okay for it to listen.
Gregg Profozich [00:07:21] We’re paying them to listen to us and then use the information …
Ernie Edmonds [00:07:24] So that they can separate even more money from us.
Gregg Profozich [00:07:27] So we can pay them even more. Interesting. We talked a little bit about the data piece. You talked about data in the clear. I’m getting the sense that you mean don’t put anything out there, no matter how seemingly innocuous, where it can be picked up easily. But what does that really mean? Do you have to encrypt things? You only trust certain networks or certain tools? What does that mean?
Ernie Edmonds [00:07:46] I’ve got a signature on my email that says “Encrypt everything every time.” That’s what we’re talking about. It’s easy to talk about when data is sitting there, whether it would be on a hard drive or a USB, whatever that would look like, or even a database: encrypt it while it’s sitting there. Encrypt your hard drive. Encrypt the thumb drive, or buy a FIPS-certified encrypted thumb drive, if you really want to go that far. Same thing with a database. Encrypt the data either in the cell, or encrypt the data going into the cell, or whatever that would mean. That’s pretty easy. What gets to be harder in real life is protecting against things like voice-over-IP telephone conversations because those are usually sent in the clear. Sometimes, they’ll wrap an encryption wrapper around it. That’s good. But when you make a phone call, like on a cell phone, it is in the clear. It’s been in the clear since we changed from analog over to digital. We know that the people listening in on that phone conversation are, of course, the NSA and other government organizations. But if somebody has the ability to intercept that, whether on your local network, or even a spyware device snuck onto your phone, or whatever we were talking about, they can collect that data. It doesn’t just get collected; it gets harvested forever. Maybe it’s something that you’re saying right now that is completely a nothing statement. Well, what’s that going to look like in 20 years with political motivations and shifting tides in society? Can that be used against you, or someone you love, or your company, your family, friends, whoever that would be? What’s going to be done with that in the future? It will exist in the future. I had this conversation with my doctor not long ago talking about HIPAA. She was like, “Why does everything have to be encrypted?” We’re talking about my health on the phone right now, and everybody’s listening. She was like, “What?” She’s a doctor.
She doesn’t know that; she knows medicine. I don’t know anything about medicine. Coming together, we had a good conversation about this. In this case, it’s just the law. Either somebody didn’t understand technology when they wrote the law, or the technology has changed to the point that the law needs to be updated. Talking about HIPAA. But there’s a lot of other things, too. People talk to their attorney over the phone. What conversation is that a lot of times? Especially when you’re talking about child custody, or so-and-so cheated on so-and-so, or whatever that would be, or this company is trying to acquire whatever that would be. All these voice conversations are in the clear. You should not do that. You’re saying, “Well, how do you not use the phone?” Well, there’s voice over IP apps. You could use something as simple as Apple iMessage. I don’t really like Apple. I don’t like big tech in general. It is encrypted, and it’s tightly encrypted. To their credit, that’s a potential way that you could limit that, mitigate that exposure. If you’re not using Apple, then there’s other things, too. I use Signal. Signal Messenger is probably my favorite communications tool right now. It encrypts my voice. If I’m doing a video call, it encrypts that, too. Then my texts are also encrypted. This is true end-to-end client-side encryption. Signal doesn’t have the keys for any of it, so it’s always obfuscated. Assuming you don’t have something listening in like Alexa, or you got some spyware on your phone, or something like that, then you have an assurance that, at least on your side it’s protected. Now you have to worry about does Gregg have Alexa sitting in the background, on his desk. Hopefully not. I don’t think you do.
Gregg Profozich [00:11:22] I do not. I’ve talked to you guys too many times.
Ernie Edmonds [00:11:26] There you go. But when we’re talking about this … Both sides, you’ve got to protect yourself and protect the other person. There’s that. But then there’s email. Email’s always in the clear. You might say, “Well, I’ve got an encrypted connection. I installed the certs.” Okay, great. Your client to the server is encrypted, but what about server-to-server? You don’t know that. Traditionally, everything, email from server to server, was unencrypted. Talk about a gold mine, if you are able to sit on an interface or whatever that would look like. There’s the email aspect. You just look at any way that data can move. There’s text; there’s voice; there’s email, whatever that would be. Encrypt it everywhere that it moves, and then encrypt it everywhere that it resides. That’s the whole purpose of Zero Clear—to encrypt everything wherever it exists. There’s three states of data: there’s data in motion, data at rest, and data in use. Data in use, right now, where we are with technology, it’s not reasonable to have encryption for data in use. Now, there are some pockets where that can happen, but as a whole, data in use is not able to be encrypted right now.
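Ernie’s “encrypt it everywhere that it moves” rule can be made concrete in code. The sketch below is an editorial illustration, not from the episode: it uses Python’s standard-library ssl module to build a client-side TLS policy that refuses unencrypted, unverified, or legacy-protocol connections. The example.com host in the comments is just a placeholder.

```python
import ssl

# A TLS context for "data in motion": never connect in the clear,
# never accept an unverified peer certificate.
context = ssl.create_default_context()

# create_default_context() already enables these defaults; asserting
# them makes the Zero Clear policy explicit in the code.
assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname is True

# Refuse protocol versions older than TLS 1.2, which are considered weak.
context.minimum_version = ssl.TLSVersion.TLSv1_2

# A client would then wrap its socket like this (placeholder host):
#   import socket
#   with socket.create_connection(("example.com", 443)) as sock:
#       with context.wrap_socket(sock, server_hostname="example.com") as tls:
#           ...  # application data on `tls` is encrypted in motion
```

The same idea applies to the email point above: a client configured this way encrypts its own hop to the server, though the server-to-server hops remain out of your control.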
Gregg Profozich [00:12:43] But on both sides, when I’m storing it, when I’m sending it, it can be?
Ernie Edmonds [00:12:47] When you’re sending it, absolutely. Now, not every method will provide for encryption. Choose a method that does support encryption. That way, it closes that exploit vector. Again, this stuff is being harvested, and in the future, it can be weaponized against you, your family, your friends, your company. One of the big things right now is brand tarnishment. If you look at some of these lifelong brands, right now some of them are struggling significantly because of bad PR. Some of them thought they were doing a really good thing, but in public opinion, it was not a good thing for their customer base, and they’re paying the price. What does that look like for you in 20 years? It is being harvested. What if somebody replays this? Now you’re the bad guy, and you’ve ascended to where you’re the CEO of a company. Something you said 20 years ago could potentially kill your company because now this stuff’s out. That is something we have to be aware of.
Gregg Profozich [00:13:54] Things like Gmail, or Google Docs, or things like that, they’re not encrypted, right?
Ernie Edmonds [00:13:58] No, they are encrypted. There are contract vehicles that provide for Google not to look at your stuff. But if you’ve got just the plain free Gmail, they’re looking at your stuff. There’s a saying: If you’re not being charged for the product, you are the product. In this case, you are the product.
Gregg Profozich [00:14:13] I think a lot of the schools are using Google and Google Chromebooks, and things like that. They give you the Chromebook and all the homework is done on Google. Is that protected, or is that paper that you wrote as a 14-year-old going to come back and haunt you when you’re 44 running your company?
Ernie Edmonds [00:14:29] It may. We don’t know because we don’t know what the contract looks like. I do believe that if it’s covered under contract, it’s probably true. If something says—I’m going to paraphrase—we’re not going to listen to your stuff, we’re not going to look at your stuff, we’re not going to harvest and maintain your stuff, and collect and keep your stuff, I have every idea that if Google says that, Google doesn’t want bad PR, so Google’s going to do what the contract says. They don’t want to get sued, either. Same thing with Microsoft or whoever that may be. If it spells out that they’re not going to do it, then fine. But look at the tech that’s being used. Also, look at where the data is being stored. If it says it’s on a US server, then you’re bound by US law. If it’s Switzerland, Switzerland’s got a lot tighter privacy laws than the US does. Same with the EU under GDPR; they’re stronger than we are. For what it’s worth, California, whether you like California or not—there’s different people out there—we have really strong privacy laws compared to a lot of other areas of the country. But we’re still not up with GDPR. There is opportunity there. I would encourage people to get involved with that to protect themselves and everybody else as a collective society.
Gregg Profozich [00:15:48] Ernie, we’ve talked about a lot of things so far. I want to go into a little bit more about big company, small company. Is it a popular misconception that data security is a big company problem? Am I too small to matter? Can a manufacturer be too small to matter?
Ernie Edmonds [00:16:02] I hear that a lot. With my role at CMTC, I’m a delivery resource. I deliver cyber services to companies, and a lot of them are small to medium enterprises, small to medium manufacturers. Yes, it’s a big company problem, but it’s not exclusive to that. A lot of my companies will make something for the DOD or whatever branch of the government. Just the fact that they exist makes them a target for somebody like China or somebody who would like to gain a secret. We’re not talking about secret like classified. Just the fact that they have a contract or they are building this widget makes them a target, just because they exist. The company has intellectual property and the things that the company owns, but then there’s the people as individuals that make up this company. Each of those individuals will have their own skeletons in the closet, let’s say. Maybe somebody is going through a divorce, and it’s a nasty divorce. Well, maybe they can use that to get leverage over a person or socially engineer that person. Before you know it, they’re getting information from the small company that they shouldn’t have. Again, back to the innocuous information, it could be something as simple as a vacation schedule. Well, if I know that somebody works from home, and then I figure out that they’re going on vacation in the third week of July, nobody’s going to be home. Well, when am I going to attack their house and maybe get a copy of this encrypted data? Those are structured attacks. Yeah, it’s a big company problem; it’s a small to medium company problem; but it’s an individual problem, too, because individuals have so many things known or available about us that, added together, provide a really good sight picture for exploits.
Gregg Profozich [00:17:56] Ernie, you were talking about that targeted attack thing. Are there cases or are there potential cases that we’ve seen, things we see in the news, that can be tied together, if we knew what was really going on, that would give some information about how these organizations, how these bad actors work?
Ernie Edmonds [00:18:09] Yeah. Normally, what will happen is you’ll have a bad actor that’s motivated by something. Some of them just want to see the world burn. There’s not a lot you can do there. But often they have something in mind that starts small and grows into something bigger. You and I have known each other for years. My background is I’m a lifelong hacker. Started as a script kiddie and then made a career out of it. What happens in the attack cycle is you formulate an attack, and then you attack. Then you take that information, and you use that to seed a new attack because now you’ve got more information to use. Take a hypothetical example. Somebody is going through a divorce or whatever. They go on a dating site, hook-up site, whatever you want to call it, and the hook-up site gets hacked, and they get this person’s information. Well, now they can use what they find from that to maybe go to a bigger organization and potentially hack that. Of course, once you have passwords … It’s easy to reuse a password. I’m not going to really preach on that, but don’t use your passwords on more than one site. Then you look at connections. All right, within a couple of months or a couple of years after that, this other company gets hacked, or this government … Say they’re a government contractor. This government website gets hacked. You have to wonder: are they taking what they learned from one to use on another? Often the answer is yes because people do use the same passwords over and over. Just practice general good hygiene for password management. Humans are not good at it. Use a password manager or something like that. These things add up, to where something little can become something really big in some cases.
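The password-reuse point is easy to demonstrate. Below is a minimal Python sketch, an editorial illustration rather than anything from the episode, of what a password manager automates: one long, random, never-reused credential per site. The site names are hypothetical.

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Return one cryptographically random password."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# One unique password per site, so a breach of one site's database
# never yields a credential that works anywhere else.
site_passwords = {site: generate_password() for site in ("shop", "bank", "email")}
```

Because each credential is independent, the "seed a new attack" cycle described above stops at the first breached site instead of cascading.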
Gregg Profozich [00:20:04] That’s a scary thing, that they’re that organized and they’re that strategic about it. They’ll find one piece of information, go find other pieces of information. Maybe find something on the dark web. You get my medical records on the dark web, you figured out the person, you get their medical records off the dark web, you get their dating information off of the hack there, and then you can make inferences about what’s happening and put them in embarrassing situations potentially and have leverage on them to do …
Ernie Edmonds [00:20:31] It can be used that way. Look at it as far as if you are on one of those sites. Say you’re on a hookup site. Well, maybe the wife wouldn’t even care because you’re going through a divorce, but maybe her attorney would care. Now you’re in an even worse situation. Now you’re potentially under duress in this divorce. It makes you susceptible to somebody proposing something: “I would like for you to spy for me on your company,” or whatever that would be. Think of the old spy game. If somebody is a spy, they get approached by a person, and they say, “I want you to spy on your government,” or, “I want you to spy on your company,” whatever that would look like. But there can also be influence agents, people who don’t actually seem like they’re trying to do anything. Maybe it’s this guy going through the divorce. He’s the same guy again. What if somebody he finds really attractive starts talking to him, and they are of a different political leaning? Well, because he or she is attracted to them, now they start to have these thoughts about, “It’s okay that you would lie because I don’t see a lie as a black and white …”
Gregg Profozich [00:21:49] Interesting. We’ve talked a lot about the data on the dark web you mentioned in your intro conversation. Who harvests data and why?
Ernie Edmonds [00:21:57] Well, the why is whatever it would be. It could be that they just want to collect data on you, or they want to collect data on our country or on a defense manufacturer, whatever that would look like. Let’s start at the top: nation-state actors, like the United States of America. Three-letter agencies love to collect data. Some of them are even chartered for it; that’s what they do. There’s those people. Do you trust them or not? Well, I don’t really trust them, but I don’t really distrust them either. Since I’m a citizen, I think of them as my protector. They’re there for a reason. I give them the benefit of the doubt a lot of times. All right, check the box. The US government’s spying on me, but it’s okay, at least with what I know right now. I’m just working it through, not necessarily saying that’s how I think. There’s the US. Then there’s our allies. Yeah, we’re allies, but does Israel, does Germany, does Australia …? Name the country, they’re our ally. Yeah, they spy on us. We spy on them. It’s not a big secret. Yeah, okay. Now they’re looking at our stuff. Are they going to be an ally forever? Well, maybe, maybe not. Probably. We tend to have long-term relationships with our allies as a country. Now we’ve got the not-so-friendly ones: China, Russia, Cuba, Venezuela, North Korea, Syria, Iran. The list goes on for a while. Those countries are not our friends. They don’t pretend to be our friends unless they want us to buy their stuff. There’s nation-state actors. Then there’s the pseudo-nation-state actors. They’re not really a nation-state, but they might as well be. Then there’s big tech: Amazon, Google, Microsoft, and all of the subsidiaries that they own that people don’t even know they own. A Ring camera is one that’s in the news a lot. Word to the wise: avoid big tech whenever you can. It’s so creepy with these companies.
These companies are using this data in ways far worse for your interests, far more nefariously than what governments usually do. They’re also better at it. They’re agile in the extreme. Whatever works, they can globally deploy and scale in an instant, it seems. Avoid big tech whenever you can. Lastly, there are hackers of various levels. If it’s a script kiddie hacker, they might be able to do something, but what about hacktivists? In one of my previous career roles, I was head of security at a pet company, a pet food and pet supply company. We had hacktivists. Some of them thought that you just should not own an animal. They didn’t want you to own a dog, or a cat, or a ferret, or whatever that would be. We were a target just because … We didn’t sell dogs and cats; we would adopt them out—it was an ethical thing—but we did sell snakes and vermin, so to speak. They didn’t want us to sell those either. We would get targeted and attacked because of what our business was. That’s the hacktivist. It could be marijuana or whatever the hot button of the day is.
Gregg Profozich [00:25:15] I’m not that interesting. I’ve heard conversations with people about, “I don’t like Alexa. I’ll never have one. Do you really love it?” “Yeah. I just tell it to play me music while I’m cooking.” Well, great, it can do that. But I can play music on my smartphone just as easily; I just can’t talk to it. That kind of thing. I’m not that interesting. But are we all …? Is anybody not that interesting, or are the collectors of data after everybody, truly?
Ernie Edmonds [00:25:38] They’re after everybody. No, everybody’s not as interesting as everybody else. It just stands to reason that not everybody does have anything that many people would want. But you don’t know what you even have as far as what’s valuable to someone else. I know. I’m in the security space. I know what people would want vs. what they wouldn’t, but most people don’t think about that. They think about what’s interesting to me, not what’s valuable to someone else. If somebody just gleaned a little information from 10 different people, they have 10 different data points, back to our data point discussion. Now they can build on that and potentially find other people to exploit and other types of data that they didn’t even know existed previously. That can all be brought into a cohesive vision for something much bigger than what they even originally knew. Yeah, you do have something, but it’s based on what somebody else finds of value rather than what you think is of value.
Gregg Profozich [00:26:39] We were talking about data. Maybe we already answered this. What are some innocuous pieces of data that seem like nothing now? Can we get some concrete example of a piece of data that seems like nothing now that could be used to hurt you in the future?
Ernie Edmonds [00:26:53] Sure. Data of value. It can be, of course, your name, your address, your Social Security number. We know these things. Maybe your telephone number, or maybe your old telephone number, or maybe a previous address. If somebody’s trying to go through the password recovery … This was something that got exploited through Apple a while back. You don’t have to attack the iPhone if you can go to … What’s their backup thing that they use?
Gregg Profozich [00:27:21] Is it iCloud or something?
Ernie Edmonds [00:27:23] Yeah, iCloud. That took me a second, but I’m not an Apple person. If you can get in through password recovery or something like that, then that’s an avenue in. If somebody’s backing up their messages, well, now they’ve got all their messages. Certainly, they have email, and pictures, and stuff like that through iCloud. I used to use Apple. I haven’t in a long time. Ultimately, I decided to shed big tech. Apple’s big tech, although Apple is usually on the higher end of the big tech circle. Back to what is useful information. What’s your dog’s name? How many people use their dog’s name in their password? It’s still a lot. They’ve cautioned about it now for years and years, but I think it’s still pretty high. Or their kids’ names, or their kids’ birthdays, or the date they met their spouse, or where they met their spouse. That’s even a security question. Thankfully, Google is now leading the way and not using security questions anymore, because that information is already on the dark web. Anything you want to know like that is already out there. It’s been out there for years. Things like what do you like to eat, what do you like to drink, what do you like to watch? That’s the thing with Alexa. When you’re telling it what you like to watch, well, if you’re watching a bunch of K-Drama, there’s something there that is not typical for an American, because most Americans I know—it’s growing—right now, July 2023, do not watch K-Drama. That might be an avenue by which I can influence somebody. Maybe it’s a show, or it’s a particular actor, and now they get to see a pattern: he likes this person, or she likes that person, or they like this type of thing. Well, that can be used for marketing, whatever. Everything you do seems like it can be used against you. Just protect that information to where people don’t even have it to use against you, regardless of what it is.
Gregg Profozich [00:29:10] Just so our listeners are on the same page, K-Drama, just in case they don’t know what that means.
Ernie Edmonds [00:29:15] Korean drama. There’s J-Drama, Japanese drama. There’s K-Pop; there’s J-Pop, there’s R-Pop. R is Russian. R-Drama, Russian drama. It’s just whatever letter in front of it. It’s growing. These other countries are providing something that people would like. Like you said earlier, one data point, two data points, three data points is a pattern. Well, now you’ve gotten a pattern, and once you understand where the pattern is leading with more and more data points, now you can start to influence that person. This is all about influence when we start talking about hacking a human. Back to the Zero Clear thing. Just keep these data points to where they’re not a thing.
Gregg Profozich [00:29:58] Don’t put them out there in the first place. Be careful what you say.
Ernie Edmonds [00:30:02] Yeah. Just keep everything a secret.
Gregg Profozich [00:30:04] Social media: Facebook, Instagram, those kind of things.
Ernie Edmonds [00:30:07] They’re being used against you. We know that. When you try to talk to a 16-year-old kid now about a 45-minute topic, what’s the likelihood you’re going to make it 45 minutes? Their attention span’s blown, and it’s because of that. Is this the dumbing down of society? Maybe. Is that going to make us less able to respond to a threat from a nation-state actor once they get into their workplace and their career? Maybe. We have to look at stuff like that. We don’t know. I have suspicions, but I don’t know. They’re just suspicions. We need to have that analytical ability to make good decisions now so that it won’t hurt us later. Just keep this data from being a thing. Keep it from getting out there.
Gregg Profozich [00:30:49] We’ve been talking in general terms about the data; Zero Clear, the concept; what’s out there; how bad actors, and organizations, and nation-states can work to use this data and how they can exploit it. Let’s talk about small manufacturers, our main audience. What can a small and midsize manufacturer do to help secure their data?
Ernie Edmonds [00:31:05] Let’s start with keeping data at rest protected. If it’s on paper, a lot of manufacturers use … This may not be familiar to everybody in the audience. There’s this thing called a traveler. It goes with the job that’s being built as it moves through the factory, whatever. What data is on the traveler? It’s usually pretty sensitive data as it relates to the part or whatever it is. That data could be of interest to your competitor. Maybe it’s intellectual property, or maybe you’re building a widget for the US government, certainly a widget China would like to know how to make. There’s a target. If it’s paper, lock it up whenever somebody’s coming through. Do the same thing when somebody comes on the shop floor. If it’s not somebody who works there and they don’t have a need to know for this data or this information, hide it. At night, if you’ve got a janitor staff, or cleaning crew, whatever that would be, make sure to lock that data up to where China can’t put an agent on the cleaning crew and now they’re just snapping pictures, or making a photocopy, whatever it would be. That’s one way they can do it. Look at the paper. We’ve already talked about digital. Encrypt it at rest; encrypt it wherever it exists. How do you do that? Well, there’s full-drive encryption. There’s things like BitLocker by Microsoft. Again, big tech. Anyway, the product works pretty well. There’s other things, too. VeraCrypt, which comes out of a French company. It’s software-based, so it’s got its own quirks. Hardware encryption vs. software encryption there. They both have their pros and cons. You have to solution it through and see what is right for you. Both of them work well. At the end of the day, they’ll encrypt the data and keep a secret a secret. When you are sending data, most modern operating systems and applications have the ability to encrypt data on the move, like Windows file sharing. Some of my clients do use Windows.
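[Editor's note: BitLocker and VeraCrypt work at the whole-drive level, but the principle, ciphertext on disk with the key kept separate, can be sketched in a few lines. This is a toy illustration only (a one-time XOR pad), not production cryptography; in practice you would rely on the vetted tools Ernie names.]

```python
import secrets

def xor_with_pad(data: bytes, pad: bytes) -> bytes:
    # Toy one-time-pad XOR, illustrating "encrypt at rest" only.
    # Real deployments use vetted tools (BitLocker, VeraCrypt, AES libraries).
    return bytes(b ^ k for b, k in zip(data, pad))

plaintext = b"traveler: part specs, customer PO"  # hypothetical sensitive data
pad = secrets.token_bytes(len(plaintext))         # key material, stored apart from the data
ciphertext = xor_with_pad(plaintext, pad)         # what would actually sit on disk

# Without the pad, the ciphertext is unreadable; with it, fully recoverable.
assert xor_with_pad(ciphertext, pad) == plaintext
```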
Modern Windows has the ability to do encrypted SMB, Server Message Block. That’s native Microsoft file sharing. It has the ability to encrypt even on a local LAN segment, which is a good thing, because as we go from Zero Clear into zero trust, going from a trust-but-verify to a verify-then-trust model, a closed security model, that’s going to work well for you. Implement it now, and you’re already set for when you actually have to implement it. Good advice there. That’s one way to do it. What you do is look at what information is flowing and how it’s flowing. Back to the telephone conversation: what can you do rather than use the phone, or where can you encrypt the phone? If you’re using something like voice over IP, most of the modern ones do have the ability to encrypt. You just need to make sure that it’s true client-side encryption, to where the server can’t open it up and listen, because people work there. I think it was Ring that recently got … Maybe I’m throwing out the wrong name. Maybe I shouldn’t put out the name at all. Everybody who worked for the company had access to all the videos created on these doorbells. They could see what people were ordering, or who’s coming and going from the house, or whatever that would be. Not everybody in the company needed to see these videos. They got dinged on it. I think it cost them some money. Anyway, look at your security cameras. Are those encrypted? If it’s a quality camera, oh, my goodness. Quality cameras can see to a resolution where the footage can often be reproduced later on by a third party. That can be used against you. We talked about telephony. We talked earlier about email. Encrypt the email. You can encrypt the payload, or you can encrypt the transit, whatever that would be. There’s different ways to do it there. Lastly, I would say look at the way you text, because text is so pervasive in our society. Use something like iMessage or Signal. Remember, iMessage only works with Apple.
I don’t like that. I’d love it if they would open it up. I see why they don’t open it up, because it’s theirs. Signal is platform agnostic. It does have a desktop client. Usually, what I’ll at least mention to a client is that email is fine to set up a discussion or to set up a relationship with the client, but once that relationship has been established, switch over to Signal or something like that. There’s a desktop application for Signal. If I want to send you a file, Gregg—say it’s a gigabyte file, some big media files, whatever it would be that would be that big—I can send it to you instantly over Signal, end-to-end encrypted, and you get it on your desktop. Plus, we’ve got a full-size keyboard, so we’re not tapping around with our thumbs. Those can be used very effectively to replace these weaker tools and protocols that don’t allow you to keep things encrypted. Another thing you can do is stop answering the phone. The phone is an exploit vector for phishing, spear phishing, whatever phishing. If you stop answering the phone, that mitigates the exposure to that attack vector. I can’t remember the last time I actually answered the phone. People don’t like it. Hit me up on Signal if you want to reach me, and then we can talk all day. I’m not going to answer the phone, because the first thing I’m going to say is, “Hey, let’s switch to Signal.”
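[Editor's note: Signal's actual protocol, X3DH plus the double ratchet, is far more involved, but the core idea Ernie describes, that only the two endpoints can derive the key even though a server relays everything, can be sketched with a toy Diffie-Hellman exchange. The numbers below are deliberately tiny and insecure; this is an illustration of the concept, never something to deploy.]

```python
import hashlib
import secrets

# Toy Diffie-Hellman key agreement, illustrating "end-to-end" only.
# The prime is far too small for real security; use vetted libraries in practice.
P = 2**31 - 1   # a Mersenne prime, toy-sized
G = 7           # generator for the toy group

alice_secret = secrets.randbelow(P - 2) + 2   # never leaves Alice's device
bob_secret = secrets.randbelow(P - 2) + 2     # never leaves Bob's device

# Only these public values cross the (possibly hostile) relay server.
alice_public = pow(G, alice_secret, P)
bob_public = pow(G, bob_secret, P)

# Each side combines its own secret with the other's public value.
alice_key = hashlib.sha256(str(pow(bob_public, alice_secret, P)).encode()).digest()
bob_key = hashlib.sha256(str(pow(alice_public, bob_secret, P)).encode()).digest()

assert alice_key == bob_key   # same key on both ends, never transmitted
```

The relay sees only `alice_public` and `bob_public`; without one of the private secrets it cannot derive the shared key, which is the property that makes a server-in-the-middle unable to "open it up and listen."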
Gregg Profozich [00:36:36] Ernie, it sounds like you’re saying that … Going back to setting up a relationship, use email, use telephone. Voice over IP is fine to set a meeting to have a first get-together, but after that, everything, anytime you’re talking something substantive, it should be encrypted. Don’t send financial information, don’t send confidential information of any kind over email that’s not encrypted.
Ernie Edmonds [00:36:58] That’s good, but what’s better is just default to encrypted for everything regardless of what it is. That way, you don’t have to think about it. It becomes part of your MO, the way you operate. Now, if we are just talking about who’s going to win the baseball game, then okay, it’s a nothing thing, but we don’t have to think about it if we switch over during the same conversation and start talking about security. That’s something with my clients. I absolutely will not talk about security on anything unencrypted. If we’re talking about whatever and a security topic comes up, bam, we just got hacked by our own stupidity, so to speak. Just make everything encrypted.
Gregg Profozich [00:37:39] A lot of this I can take to heart. If I run a company, though, how do I educate my employees? How do I go about making sure that my employees understand the common tactics that attackers use to get the data?
Ernie Edmonds [00:37:55] There’s annual awareness training. We all know that. We sit there for two hours or whatever it is, and we’re just mind-numb at the end of it. Part of it is we learned something new, but then we’re waiting another year before it comes around again. A better model is a steady rain soaks. Being in southern California, we don’t get that much rain. What I did at this pet company … It was a big company, 25,000 employees with stores everywhere. In the break rooms, we had a monitor that would list upcoming holidays, or whatever information might be usable—HR is going to do this. We had job postings. That would be on there. Well, every week, I would change … I had two messages about security. One would be high-level philosophical; the other would be more tactical in nature. We had a year cycle. We had 50 of these, close enough to 52. The theoretical would be something like, “Passwords are like toothbrushes; you don’t share them.” We’d have this picture, a happy toothbrush that people see and like. That was one element. Then something else that was more tactical: “This is how you change your password.” Hit the three keys, Control-Alt-Delete, go to settings, however you change it on Windows. It would actually tell them how to do it. It was prescriptive in nature. It could be for Windows, or it could be for the scanning. We used to use these Telxon terminals that you have to log into. They’d have to change it there, because it didn’t support password complexity like a real operating system would. There was something for them to think about and something for them to do. We would change these things out. Like I said, we had 50 of each. They’d see each one once a year, but these things would show up on the screen every three or four minutes. As I would go into the break room at headquarters in San Diego, I would hear people talking about it.
It wasn’t even on the screen then, but they were talking about something that had been on a few minutes ago. That was permeating in, and it was really getting into the security fabric and security DNA of what matters. That was such a good thing. I would recommend that people do that: use the steady-rain-soaks approach, with these little tidbits of information trickling in all the time. That way, it builds your security posture quicker and better than an annual awareness training alone will do.
Gregg Profozich [00:40:24] Those are some great pieces of advice right there. I think that makes a lot of sense. It’s the constant reinforcement; it’s not the once-a-year training.
Ernie Edmonds [00:40:31] The whole thing with this smiley toothbrush, with the toothpaste curled up on top, whatever that toothpaste is: people would talk about the whole face. It had a smile. People liked it, especially people that are into dogs and cats. Those are often pretty wholesome people. It worked out really well for us. Figure out what resonates with your folks at your company, and then go with that. If you’ve got some super uber-technical group, maybe a toothbrush is not the thing. Maybe a different analogy would work better. Figure it out, and it’ll help you.
Gregg Profozich [00:41:07] What are some of the warning signs that an attacker may have already accessed your company’s data?
Ernie Edmonds [00:41:13] We have an F-35 fighter, and somebody else in the world has a fighter that looks a whole lot like an F-35. That would tell me that probably something happened. That’s a dead giveaway. But it can be assumed that somebody has your information already. That’s the thing. Right now, we’ve sent so much in the clear, and we’ve not protected it because of our culture. Americans are really, at the core, friendly people. That can be a weakness that can be exploited. It’s not a bad thing that we’re friendly, but it is a bad thing that we’re as trusting as we are, because it can be used against us. It is being used against us. There is going to be a culture shift required going forward. I think an analogy is the airport before 9/11; that’s not the world we live in anymore. As we get into more cyber, we’re going to have to stop giving out all this gratuitous information that can be used against us, that is being used against us. It seems like it would make us a colder place to live. Hopefully, there will be some way to balance that to where we still maintain what is important to us as Americans and as people, but at the same time, we’re not exposing things that don’t need to be exposed.
Gregg Profozich [00:42:36] But finding that new balance won’t happen until we find out exactly how bad showing certain information is and how it works its way through, I guess, if you think about it.
Ernie Edmonds [00:42:46] Probably. I hope it’s not too bad, what gets out. Once somebody gets burned, they become cautious. We’re people; we learn from things like that. That is going to happen to some degree. I don’t want us to end up like some other societies. I’m thinking of North Korea. I’ve never been there, but I hear stories about just how closed people are there, because any wrong word could get you or your family in real trouble.
Gregg Profozich [00:43:13] Mhmm.
Ernie Edmonds [00:43:14] Yeah. I don’t think we’d ever take it to that extreme, but I don’t want us to lose the essence of being a good people.
Gregg Profozich [00:43:21] How can work-from-home employees keep their data safe? How can businesses convey the importance of data security for their work-from-home staff?
Ernie Edmonds [00:43:29] A lot of the same things we’ve talked about can be used at home. Right now, I’ve got a room that I’m in. My monitor can be seen from the window. My blinds are pulled right now, even though we’re not talking about anything that isn’t going to be literally published out there for the world. The fact that I’m doing work, I’ve pulled the blinds. My neighbor, I don’t think she’s got binoculars, but maybe she does. I don’t want anybody to see what’s on my screen. I’ve got a big monitor, so it could be seen, but even with a small monitor, there are telephoto lenses, whatever that would look like. Protect against that. If you’ve got kids running around the house, don’t have sensitive work information on the screen while they’re in your office. Kids play around. That’s a no-no. If you leave the room to go fix lunch for the kids—I have kids—so shift out and go make a sandwich, or soup, or whatever, at least lock your workstation to where they can’t see it. If you’ve got maid staff coming through … I do have a maid staff. Whenever they clean the office, I’ll lock it. I’ll leave the room. They can clean it, and then I’ll come back. Don’t do work while they’re in there. One, she likes to talk, so it distracts me. I might as well just give up and go to a different room. Those are some of the things people can do at home. Just always be protective. When you get into government space, a kid isn’t a cleared person. That should be a reportable incident, but who’s going to report their seven-year-old kid, or report themselves because their seven-year-old came through? Common sense goes a long way. Things like pulling the blinds can have a real effect. Also, if you’re on an airplane … I’m a fairly tall guy. I’m not the tallest. If I’m on a plane, I always get an aisle seat. The person sitting a row in front of me, if they’ve got one of those privacy screen protectors, I can’t really see their screen because the angle’s just a little bit too far off.
But two rows up, I can see the screen. I can see everything going on on that screen even though they’ve got the screen protector. The person sitting directly in front of me has no chance of me not seeing it because, one, I’m tall, and I’m looking right at it. They’re three feet in front of me. Something to watch out for is who’s around you, what we would call situational awareness. That goes a long way. When you’re working at Starbucks, or your favorite coffee place, or at a restaurant, a lot of times people will back into a corner. Look up in the corner and see if there’s a camera up there, because that camera may be watching you. Just be aware of the technology, and of the people, and of the situations that could harm you.
Gregg Profozich [00:46:07] Great advice. How important is it, if it is important, for SMMs to establish relationships with local authorities before cyber incidents occur?
Ernie Edmonds [00:46:19] That’s a good question. There are some advantages to knowing that person as a person, whether it’s the FBI, or local police, or the cyber crimes group, whatever organization that would be. There can be some benefit there, because at least now the person knows who you are, and you know them. Small towns, I would imagine, would have less cyber capability, yet by knowing the chief of police or whoever that would be as a person, because you go to a club, a church, or whatever, it gives them some incentive to try to help you as a person. That’s really better in small towns than in larger cities; I think we’d all agree on that. Larger cities are going to have more adept capabilities in most cases. If you happen to live in West Side LA, you’ve got the Wilshire FBI cyber crimes group, which is really good. They’re really busy, though, so getting into a conversation with one of those people is hard. If you do file a report, God forbid something bad happened, it seems like it goes into a black hole. They’re so busy, but they’re a really good group. Manhattan also has a really good cyber crimes group, the Manhattan FBI field office. Then other cities have varying degrees of that. They can bring in higher-caliber people if they don’t have them locally, or train them up, or whatever that would look like. I don’t know how much value it adds to know them ahead of time, but have the information available to reach out when you need them. Have it written down on paper. One of the things that happens often with ransomware is … You’ve got two types of ransomware: content-denial ransomware, which encrypts your documents and stuff like that, and system-denial ransomware, which keeps you from logging on to your system at all. Write this stuff down, because if you can’t log into your system, and all your stuff’s in your system, you’ve …
Gregg Profozich [00:48:15] I can’t get to my web browser to search for the number for the FBI.
Ernie Edmonds [00:48:19] Exactly. Well, what if they hit your router to the point that your router’s dead? Now you can’t even get to the internet, period, even with your phone or whatever that would look like. You’d have to disconnect from Wi-Fi. Have a communications plan for something like this. This is part of having a good communications plan. Have the information—whether it would be email, or text, or phone, or whatever that would look like—to where when something happens, you will have the ability to reach out to somebody for help.
Gregg Profozich [00:48:50] Many of the SMMs that we work with design their own products and a lot of those products have software in them. When it comes to the software development, what are some of the precautions developers should take during the early stages of the development to protect their data and their customers’ data, the ones who buy the products?
Ernie Edmonds [00:49:06] Well, first of all, get a cyber architect, an information security architect, to start on the project when the project starts. A lot of times … We had scrum teams at this pet place and other places I’ve worked. We had scrum teams, development people. We would embed a security architect from the inception of the product, or the application, or whatever it would be. Right from the first meeting, the security guy or the security girl was there. That way, they can intercept these problems before they become a problem. What happens is if you bolt on security at the end, statistically speaking, it costs you far more; embed them in the beginning, and your ultimate security budget can be something like 30% of what it would have been if you’d tried to bolt it on late. Embed them early, and then have them look at how data is stored, how it’s transmitted, what data it is, what’s the sensitivity of it. If you’ve got a product, what comes to mind is a drone: can somebody get hold of a drone, hack the drone, and steal it? Now you’re out the product, somebody runs off with the drone, and somebody’s not pleased with the drone vendor. What kind of data can the drone see? That’s the thing with DJI right now. DJI is a drone manufacturer that, I believe, is based in China. They’re getting bad press because, apparently, China can see all your drone footage. At least, that’s the word on the street. I don’t know that for sure. Look at stuff like that. Can your tool or your application be used downstream if somebody does bake a backdoor or some sort of spyware into it? From a high-level concept, look at what all that stuff means. Of course, if you’re doing software, you’re going to have static code review, and then you’ll have component and runtime review, all the components added up together.
That’ll glean some insight into what it’s doing and give you a good sense of what your software looks like. Again, once you go out of software into something physical like this drone, or whatever that would be, or a car … I do drive a fairly tech-oriented car. Somebody hacks my car, I don’t know what I’d do. You have to look at stuff like that.
Gregg Profozich [00:51:25] Excellent. Ernie, I think we’ve talked about a lot of things here under the Zero Clear umbrella. The whole idea again—Zero Clear, have zero pieces of data, zero elements of data out in the clear where it’s not encrypted, or anybody could pick it up.
Ernie Edmonds [00:51:42] Encrypt everything every time.
Gregg Profozich [00:51:44] Encrypt everything every time because bad actors are out there collecting the data, and they will weaponize information and data as much as they can.
Ernie Edmonds [00:51:51] That’s correct. It’s being harvested so that if they can’t figure it out now, they can figure it out later, which is even worse.
Gregg Profozich [00:51:58] They’re storing it in databases, waiting to get more pieces so that they can make a better picture of the puzzle.
Ernie Edmonds [00:52:03] Correct. That’s exactly right. Just protect yourself, protect your family, protect your company, protect your career. Insert whatever noun or verb that you would use there. Protect yourself and protect everybody else.
Gregg Profozich [00:52:16] Protect yourself, protect your future.
Ernie Edmonds [00:52:18] Yeah. Protect your family. I’ve got kids. What if something comes out on me? I don’t think I’ve done anything wrong, but who knows? I was hacking for a long time. What if that comes out, and now my kids are having to answer for it in their professional career? “Oh, your dad did this.” What is that going to say for you?
Gregg Profozich [00:52:36] Reputation risk along that way.
Ernie Edmonds [00:52:39] Exactly. Or what if you’re already embedded? You’re the security officer for some big company, and now it comes out that your family has exposure to China. I don’t think I have any exposure there at all, but who knows what … Could it be added up to and misconstrued as?
Gregg Profozich [00:52:54] Those putting the data forth, if they’re trying to hurt you, are only going to put forth the pieces of data that tell the story they’re trying to tell to fit their narrative.
Ernie Edmonds [00:53:03] Yeah, telling their story, which may not be your story at all.
Gregg Profozich [00:53:06] Pulling stuff out of context. The more you give them, the more they can pull out of context and use in some way to benefit themselves or to hurt you.
Ernie Edmonds [00:53:12] That’s exactly right.
Gregg Profozich [00:53:13] To that end, from a company perspective for the small manufacturer, educate yourself, educate your employees, invest in some encryption, and remember that steady rain soaks. Keep the continuous messaging going on out there about why security is so important and thinking about just because we’re nice, humble, hardworking people doesn’t mean everybody is. There are bad actors out there who are trying to exploit anything they can.
Ernie Edmonds [00:53:35] That’s absolutely true. One thing you mentioned, invest in encryption technology: a lot of this stuff is open source. It’s not only free in most cases; it has the benefit of being analyzed and reviewed by the open-source community. You’ve got hundreds, thousands, tens of thousands of security people looking at this, so you can have much more assurance that there’s nothing in it that’s going to harm you. That’s the problem with closed-source software. It’s security through obscurity in some cases, whereas open-source software has often had open vetting for years.
Gregg Profozich [00:54:10] Well, if there’s no other closing comments, Ernie, I want to thank you for joining me today and for sharing your perspectives, insights, and expertise with me and with our listeners.
Ernie Edmonds [00:54:18] Thanks, Gregg. Appreciate it.
Gregg Profozich [00:54:20] To our listeners, thank you for joining me for this conversation with Ernie Edmonds on Zero Clear. Thank you so much. Have a great day. Stay safe and healthy. Thank you for listening to Shifting Gears, a podcast from CMTC. If you enjoyed this episode, please share it with others and post it on your social media platforms. You can subscribe to our podcasts on Apple Podcasts, Spotify, or your preferred podcast directory. For more information on our topic, please visit www.cmtc.com/shiftinggears. CMTC is a private nonprofit organization that provides technical assistance, workforce development, and consulting services to small and medium-sized manufacturers throughout the state of California. CMTC’s mission is to serve as a trusted adviser providing solutions that increase the productivity and competitiveness of California’s manufacturers. CMTC operates under a cooperative agreement for the state of California with the Hollings Manufacturing Extension Partnership Program, MEP, at the National Institute of Standards and Technology within the Department of Commerce. For more information about CMTC, please visit www.cmtc.com. For more information about the MEP National Network or to find your local MEP center, visit www.nist.gov/mep.