BLTnT Podcast

Episode 33

With Mykolas Rambus
July 16th, 2025


This week on the BLTnT Podcast, Matt Loria talks with Mykolas Rambus, Co-Founder and CEO of Hush, a company rewriting the rules of online privacy and security. He explains how Hush was born from a simple but urgent question: “What’s online about me that shouldn’t be?” and how AI is both a threat and a tool in the fight for privacy.

Together they dig into:

  • What CEOs Are Getting Wrong About Risk
    Why digital traceability is becoming a top-tier business risk—and how most leaders are still overlooking it.
  • Building a Culture of Safety at Hush
    Mykolas describes how his company blends AI with human service to protect high-risk individuals—from judges and journalists to everyday families.
  • Inside Equifax During the Breach
    Mykolas shares what it was like leading through one of the biggest data breaches in U.S. history—and what it taught him about transparency and resilience.

 

BONUS!! Want to hear more from Mykolas? … You can hear him live at the upcoming Auxiom Community Appreciation Event!

 

When: Thursday, August 14th | 8:00–11:30 AM
Where: Great Oaks Country Club – Rochester, MI

REGISTER HERE

 

Let’s dig in!

 

#DigitalLeadership #CyberSecurity #AIethics #DataPrivacy #BLTnTPodcast #AuxiomEvent #LeadershipMatters #MykolasRambus #OnlineSecurity #ExecutiveLeadership

(0:00) Welcome to the BLTNT podcast. I’m your host, Matt Loria, serving up real stories of business, (0:05) life, technology, and transformations. You’ll hear from interesting people about big changes (0:09) from career shifts to life-altering decisions, and the innovations that help make it all happen.

 

(0:14) It’s about sharing those lightbulb moments, pivot points, challenges overcome, and the journeys (0:19) that inspire us to think differently. If you’re on the lookout for insights to propel you forward, (0:23) stories that resonate, or just a bit of inspiration on your next BLTNT move,(0:27) you’re in the right place. Let’s dig in.

 

All right, welcome to this episode of the (0:40) BLTNT podcast. I’m Matt Loria, and I’m sitting here with my friend Mykolas Rambus. (0:44) Thank you for being here, Mik.

 

Thanks for having me. (0:47) I appreciate it. Mik is the CEO and co-founder of Hush, which is a new service dedicated to (0:52) helping families take back control of their financial, physical, and reputational security.

 

(0:58) The reason I’m having him on, obviously, we’ve got the fondness for cybersecurity here,(1:04) but Mik is also one of the nicest guys that I’ve met, and exceptionally well-dressed, (1:09) and has been teaching me a little bit of fashion sense here, and very, very smart guy, (1:14) and also sat ringside during the Equifax breach. Yes. (1:20) He’s got some battle scars and war wounds and a lot of amazing stories of coming out of MIT (1:27) and emerging to where he is today with a number of different businesses.

 

We’re going to get into (1:35) stalkers, AI scams, and why, in the next five years, Googling you just won’t work. (1:41) Sounds good. Okay.

 

Well, let’s hit it off here. (1:44) Tell us about Hush and what you guys are doing and why it’s important right now. (1:50) Yeah.

 

We started this business, Hush, actually coming out of my last company, WealthX. I’ve (1:55) been a startup entrepreneur for most of my career. There was this business we built, (1:59) which is basically the Bloomberg for private bankers, right? Imagine all the information (2:02) you wanted to know about Warren Buffett or Jeff Bezos, whoever it is, in a database structure.

 

(2:07) It turns out there are a lot of bad people who want that information. (2:10) No kidding. (2:11) Right? So we found ourselves being approached by organized crime, who wanted to get access.

 

(2:16) We said, thanks, but no thanks. Didn’t go over very well. We found ourselves approached by (2:20) foreign intelligence that wanted to get access to that database of roughly several hundred (2:24) thousand people to find tax optimizers, we’ll say, right? But cross-border tax cheats.

 

We said, (2:31) thanks, but no thanks to the intelligence agencies. Didn’t go over very well. (2:35) And then we had some oligarchs, some Russian oligarchs say, hey, I’m in your database.

 

I (2:38) want out. And we said, well, we can’t do that. If you’ve ever dealt with Russian oligarchs,(2:41) you don’t want to say no.

 

Right? And so, unfortunately, that led to some safety and (2:45) security issues for myself and team. We had to figure out how to get our own information, (2:49) that of our families off the internet. And the reality is there was nothing good that did the (2:52) job.

 

I remember talking to people in law enforcement, Interpol, journalists who covered (2:56) sensitive areas of the world, and they’d all say, there’s nothing you can do other than one by one, (3:01) like manually go get rid of your information. And so that was the vision or the idea, I guess, (3:05) behind Hush was, well, there’s got to be a better, smarter way. Let’s see if we can build it.

 

(3:10) Right? Can we build this thing that actually removes our own, but also other people’s(3:15) information off the internet? What we didn’t know is I think how dangerous things have gotten, (3:19) right? That was 15 years ago, 10 years ago. Think about how toxic today is by comparison. (3:26) And that’s where we are.

 

Wow. (3:29) So, well, I don’t know where to start with the questions. Do you have all your fingers(3:33) after saying no to the organized crime families? Thankfully, currently, yes.

 

(3:37) Okay, good. Good. Glad to hear that.

 

What did you do before the system existed? How did you (3:44) keep yourself and your family safe during the time? (3:48) Yeah. So if you dig into the details, like let’s say you Google yourself and people have either (3:53) Googled themselves or they Google somebody else. Everyone’s Googled somebody else, right? And so (3:57) if you look at those websites where your data is held, it’s usually a data broker, right? A data (4:01) broker is a company who’s in the business of aggregating information and reselling it, right? (4:05) If you look into the fine print, go to the legal conditions, terms of use, et cetera, (4:10) almost always it will read, hey, you can remove your information if you submit this form, (4:15) send an email, prove that you’re you.

 

And you can do that one by one. It is time consuming, (4:20) mind numbing. It’s very particular and no one’s got time to do that.

 

So that’s how we did it back (4:26) then. Now we’ve built technology to do it a lot faster and better. (4:30) And I’m assuming artificial intelligence has been your friend.

 

(4:33) It has been, yes. Both on the removal. So we have bots that will do that, but also on search.

 

So (4:39) part of the problem is people think when they Google themselves, that’s the internet, (4:43) and there’s a lot more to the internet than that, of course, as we well know, right? So if you think (4:46) about the internet like an iceberg, you’ve got the surface web and everything above the water, (4:51) if you will, is surface web. Google only indexes 15% of that. There’s still a lot of information (4:58) out there about people for skilled threat actors who know where to look, way more than Google.

 

(5:03) So yeah, AI helps us in finding the information about an individual in all these different places. (5:08) You know, I remember a time earlier in my sales career where I was Googling a prospect (5:15) and their name popped up on a spreadsheet and it had home address. I don’t remember (5:21) if it was last four or so.

 

It had something very identifying, but home address, cell phone number, (5:28) et cetera, et cetera. And two examples, one was it had their slip number at the marina that they (5:34) were located at. And then another was a very similar one, which was a marathon registration.

 

(5:42) So it even had their weight and age and all that sort of stuff. And I just couldn’t believe(5:48) such personal information was out there in a spreadsheet. Not only did I have that person’s (5:52) information, but I had the other 50 people that were in that race or the other 100 people that (5:57) were in that marina or whatever it was.

 

And I just couldn’t believe it. You’re thinking, (6:01) okay, how many times have I clicked on the opt-out for a company? But you don’t think of these (6:06) kinds of more just community-oriented places where you’re dropping your information. (6:14) That’s right. And that assumes, by the way, those companies give you, or those organizations give (6:17) you a way to opt out, right? A lot of the smaller ones, they don’t even really think about it.

 

(6:21) Yeah. They had no idea. I think this was somebody who probably had really good intentions to keep (6:26) the event organized, probably put it on a Google Drive or something like that, left it all open, (6:31) no password, no security.

 

That’s right. And somehow I inadvertently found my way into it. (6:36) Was chatting with the CEO of a tech business who we know, and in their case, it was their (6:41) kid’s school that had accidentally put the entire school directory online, right? So we (6:46) happened to find the hit because we’re looking for the individual and their family, but every (6:50) family’s name, all the kids’ names, the emails, the phone numbers, their addresses, all available (6:56) online, right? Wow.

 

And what happens with that, I’ll give you another story. You mentioned the (6:59) slip number, right? For the boat at the marina. I’m thinking of a venture capitalist who says, (7:04) one Saturday he gets a knock on the door and okay, a little unusual.

 

And a guy shows up at his front (7:08) door to pitch him an idea, right? And so he thought it was a bit odd and weird and said, thanks, (7:13) but no thanks. I’ve got an office and a process. His wife was pissed, right? And so that’s the kind (7:19) of stuff where you don’t want that information out there and people don’t recognize it.

 

Look, (7:22) I get it. We don’t talk about this kind of stuff at cocktail parties, but it happens all the time. (7:26) Wow.

 

Wow. Well, I’m hoping we can get into some more of the stories because I think people will(7:31) really connect with those and they’ll resonate with them to say, okay, could that happen to me? (7:37) Yes. So let’s go into a little bit of backstory here because really right out of college,(7:45) you got into a venture backed business, kind of your first gig.

 

(7:50) That’s right. Even before actually. So it was in the dorm room.

 

So this was senior year, (7:55) second semester and had an idea, actually had a few ideas, not sure which one to take. And (8:00) yeah, we’re very fortunate to get early backing from SoftBank and another Boston VC (8:04) in those early days to get going. You know, the funny thing, it was about mobile technology.

 

(8:09) So if you can believe, you know, we have smartphones everywhere. This was Palm Pilots(8:13) and so forth. But yes, it was a very good journey.

 

Learned a ton of how not to run a business back (8:18) then. Eventually survived. We ended up selling that business and moving on, but nevertheless, (8:23) yeah, there’s a lot of lessons learned those early days.

 

(8:25) Can you tell us about each of these businesses? How many of you formed and successfully sold? (8:31) So three venture backed. So two exited and now we’re on number three venture backed. Other (8:35) companies, real estate and so forth.

 

It’s been about six in total. My first one was way back in (8:39) high school doing computer parts and repair. So, you know, I learned.

 

(8:43) You were kind of in our business. (8:44) I was, I was a little bit of, you know, the lower end of what you do, right. But you know, (8:49) and, you know, adding memory to computers and fixing modems and those things, but (8:53) someone had to do it.

 

Right. And I figured, Hey, if I can make some money that way, all the better. (8:57) Great.

 

Where did you, well, first of all, should we ask you these questions? Like, (9:01) you know, what area, what area of the country did you grow up in? (9:05) Oh, sure. Ask me anything. I can’t always answer, but ask me anything.

 

I say that because (9:09) it’s interesting. You know, we all know, I mean, I don’t know about you. I can imagine if I were (9:14) growing up right now, everything’s recorded, right? Every video is there.

 

It’s all out there. (9:19) And so as we think about our content videos, podcasts, everything else, it used to be, (9:23) you had to find, you know, that one entry, like you said, you had to find that one PDF from the (9:27) race that had everyone’s details. Now you can ask chat GPT and say, show me everything you know (9:32) about this person and the work is done for you.

 

Right. So, so yes, you’re right. I am a little (9:36) more cautious or politely paranoid about the things that I share, but yeah, I grew up here, (9:41) born and raised in Michigan.

 

Right. So I’m very excited to have gone East to university to MIT. (9:47) And then, like you said, started that first company.

 

So tell us about that first company. (9:50) So that was, you know, I still remember the, the agreement behind the company is myself and four (9:56) classmates. The idea was to build a mobile application consulting company, right? So no (10:02) one really knew where mobile was going, kind of like AI these days where it’s a whole, you know, (10:06) sort of evolution and we didn’t know what to build as a product.

 

So we said, well, (10:11) we’ll be a consulting company. Right. And so we start off as a mobile app development consulting (10:16) company and we work for groups like MasterCard and some of the carriers like Verizon to go build (10:21) early mobile applications.

 

We even built an early navigation system that had voice and visual, (10:26) you know, it’s funny you think of that as okay, everyone does that. But back then it was really (10:29) hard to do. And so we built apps, right.

 

For different companies. We eventually turned that (10:34) into a product, basically a software development kit that companies could use to build their own (10:38) apps. And that’s what the company was ultimately sold on.

 

Great. How do you feel like that? Go (10:44) back to, you kind of made a parallel to that, that app dev boom time and also time of uncertainty to (10:53) the AI boom time that we’re in right now and complete uncertainty as well. And can you make (10:59) some parallels there of, of what business leaders are going through and what they’re challenged with, (11:05) with, with, you know, I need an app.

 

I mean, every, every business owner that I talked to today (11:10) basically says I need to be in AI. Yes. I don’t even know that, that, that everyone knows what (11:15) they mean by that, but they certainly don’t want to be left behind.

 

So can you, can you talk about (11:19) that a little bit? I mean, it, it feels like, you know, 2000 where people are saying I need to be (11:23) on the worldwide web. Right. Exactly.

 

I think you’re spot on. I think we saw a reaction to this (11:28) where people were hiring chief AI, you know, officers in their company sort of like chief (11:33) innovation, I guess you could say, but a lot of it is foundational, right? If your business (11:37) wasn’t on the internet in the early two thousands, you made that, that transition, that transformation.

 

And so forth with AI. (11:48) I think it’s, you know, it’s interesting. You saw the the jobs data come out recently, right.

 

(11:52) Where there’ve been a lot of companies that are not hiring. I think, you know, probably not said as (11:56) much. There are a lot of organizations, a lot of CEOs who are thinking, okay, how do we maneuver (12:00) through and use productivity from AI, let alone the innovations that come through it, to be able (12:06) to hold fast and think about our company growing without adding more people.

 

And so from (12:12) a CEO perspective, I think the first thing that a lot of folks are looking at is, okay, where can I (12:15) get the employee efficiencies and how’s that going to look? It will completely disrupt entry-level (12:20) employment, right? You think about law firms, you think about finance organizations, all of that (12:25) analysis and so forth is done really well, really quickly with AI. So we’re at the beginning of a (12:31) big disruption here. Let’s, let’s dig in on the law firm piece just a little bit here, because (12:36) that’s one that, that, that gets my brain really tied up.

 

I can see where artificial intelligence (12:44) today can make a, let’s say you’re, let’s just pretend there’s a scale of level one, two, three, (12:50) and four of, four is a top-end senior lawyer, knows kind of everything they need to know about (12:56) their craft. And then level one is that person fresh out of, fresh out of college. The level four (13:02) attorney had all of the benefit of those late nights, digging through documentation, et cetera, (13:08) et cetera.

 

And in one, they, they were learning how to find the information. Two, then they were (13:14) learning the information and three, learning how to apply it and make it part of a relevant argument (13:19) of some sort. If you take away steps one and two, how does the lawyer, how does that lawyer that’s (13:27) in college today ever get to level three and four? That’s a great question.

 

And I don’t know how, (13:32) I don’t know how either. I don’t know that they do. You’re absolutely right in the research.

 

(13:36) You know, in theory, the research is done for them. It’s aggregated. They can sift through (13:41) the results.

 

They can know how, and affect, the machine that has thought and gone about it. (13:45) So I’d like to think that it’s explaining, right, how they’ve done their work. So there would have (13:50) been wasted hours in finding articles that weren’t relevant, why they weren’t relevant, (13:54) right? That is substantially more codified, made efficient, but it’s a real challenge, right? I (13:59) think how do people get from here to there? I think there’ll be a lot fewer associates at law (14:03) firms going forward, right? Because that will be, you’ll still have associates, but they’ll be (14:07) technology enabled, right? And so they’ll be the ones who will eventually become the partners, (14:11) but it changes the game entirely.

 

Same for the big consultancies, right? Accenture, (14:16) you know, PwC and so forth. I think you have a lot fewer junior people doing the work because (14:22) it’s just not needed anymore. Yeah.

 

I was, I bumped into a former McKinsey consultant (14:26) who she’s now on her own working in the Detroit area here. And I said, I said, oh, that’s (14:31) interesting. I speak to you all the time through my, through my ChatGPT, because I’ll tell it (14:35) as a McKinsey consultant, answer this, answer this question.

 

And it’ll give me a McKinsey level (14:40) report, you know, out of there. And so the disruption is, is really interesting. But I (14:47) think that, you know, you have a point there where you said the associate will be technology (14:52) enabled, right? They’ll have to know how to go after that information and data and then know (14:58) how to sift through it themselves and then tell the bots what’s relevant to them or not.

 

That’s (15:03) right. So you know, just gaining that experience I think it’s going to be a lot of trial and error, (15:08) but it just will be the new way. It will.

 

You know, I think added to that, (15:13) the technology is there today, right? It’s gotten to where not only can it do the homework and do (15:18) the research, everything else, and explain, but also take on an organization’s persona or a partner’s (15:23) persona to say, okay, well, if you were being highly aggressive, let’s say about a given topic, (15:27) or if you wanted to be very risk averse as a simple example, right? Even from a branding (15:32) perspective, we say, look, I’d like XYZ tool to look at all of the things out there about our (15:37) company, Hush, everything we’ve said, all of our messaging and create new messaging, a white paper, (15:42) let’s say, that takes into account our, our particular way of storytelling and our language, (15:48) our words, everything else. It can do that. Right. And so I think you’ll have a use in law and other (15:52) places where the technology not only is doing the homework, but it’s also emulating how that (15:57) organization and its culture interface with the rest of the world.

 

Okay. What about, I’ll give (16:03) an example. A year ago, year and a half ago, I was at a seminar for AI, a three-day seminar, (16:11) and everything that this person taught us was about prompt engineering.

 

Okay. Creating the (16:16) perfect prompt. Within six months, everything that they told us was obsolete. What are you seeing (16:24) now as the next level of obsolescence in what we’re teaching ourselves about AI and how to use it? (16:30) Oh, gosh, it’s a great question because it is moving so fast, right? I’m in the business (16:36) broadly and it’s hard to keep up with all the changes that are going on.

 

(16:42) Gosh, probably right now it’s around content creation, right? So yes, there are the LLMs. Yes, (16:47) there’s the generative AI that we’ve seen in the last year and a half, two years. It’s getting (16:51) really easy, right? So Google recently published a new model, which allows for the creation of (16:57) video content far more easily than ever before.

 

This is in the last month, right? (17:01) And so I think, yes, it’s on prompts to a degree, but it’s the evolution of those. (17:07) I also think, as I mentioned earlier, the models are getting better at mimicking either individuals (17:12) or organizations, right? Which is actually one of the scary parts from a cybersecurity perspective, (17:17) because you can say, okay, we’ll mimic this person, right? And it can go and find all of (17:20) the content, all of the video, all of the audio, and then emulate that individual. (17:24) We talked about impersonation a little bit.

 

I know we will at some point, but (17:28) it’s getting really scary good. But from a learning perspective, it’s just staying current (17:33) on all the tools that are there. Yeah.

 

I mean, I feel like if I stay out of it for a week, (17:39) I’m lagging. Yes. It’s unbelievable.

 

What do you know about agentic AI? (17:48) So a fair amount. Maybe we should define that, define the agentic AI in your world. (17:56) Yeah, the non-academic version.

 

Basically, an interface where an AI, and again, I’ll use bot (18:01) for simplification, right, is engaging with an individual, taking the feedback or the information (18:06) that’s provided by that human, let’s say, that interface, taking it back, doing something with (18:11) it, changing or adjusting behavior or outcomes as a result, and continuing to have that back and (18:16) forth, right? The example I’d give is you could say a financial advisor, right, where someone is, (18:21) you know, robo-advising is kind of a concept where you first sign up for an account, (18:25) you answer some preferences and the rest, and maybe those things stay fixed pretty much, (18:29) right? And an agentic approach would be, again, hopefully someone doesn’t change with the weather (18:33) of the markets, but if they did, that back and forth continuous feedback would be very helpful. (18:38) That would be more of an agentic approach. Okay.

 

And then even a simple example that I’ll (18:42) give is just acting as an online agent. Yes. So, as an agentic, you know, agent.

 

That’s right. (18:48) Right there, seeking help. So, help desk, right? We know that this is coming in our world of (18:54) managed IT services, that agents, bots, will be part of the, you know, be part of the solutions.

 

(19:01) Absolutely, yes. And, you know, already are in so many places, right? Yes, very much.(19:05) Now, some of those just are not refined.

 

I mean, I had an interaction with the car dealership (19:10) the other day, and boy, did I get an apology call from them on that. I went in, tried to make (19:18) an appointment to bring my car in for an oil change. I tried to just simply put the information (19:24) in on a form, but it kept popping up a window to chat with me.

 

Finally, I just assumed that I (19:29) couldn’t put it in, that I needed to use the agent, and it represented that the agent was a real person. (19:35) And boy, was that, that was frustrating, because it wasn’t a real person. It was three fake names, (19:42) you know, that these are the, these are the current agents on call.

 

Sure. And when it, when it caused (19:48) difficulties for me, I finally sent an email to the company and said, hey, this is a terrible way (19:52) to get me to buy another car from you, because if you can’t get this piece right about me getting (19:57) a simple oil change, I don’t want to buy another vehicle from you. Yes.

 

So, they called me and (20:02) they talked me through it and they said, yeah, you know, we’re actually, we’re, we’re sunsetting (20:06) that system as soon as possible. One of our major complaints is that it represents that Jennifer (20:10) is a real person and Jennifer is not. So, one of the commitments that I’ve given here is that if we (20:15) ever engage with agentic AI or any sort of bot behavior, that the person has to know that they’re (20:22) talking to a bot.

 

Yes. I just don’t, I don’t, I don’t like the, the ethics of that otherwise. (20:28) I can agree with you entirely.

 

We don’t use them at our organization, right? The, the interface on (20:32) the, in the app, if you will, is, is to an actual human being. Although I can see a place where (20:38) humans like interacting with other individuals, right? They like that perception. That’s why (20:42) so many companies in agentic AI have put forth these names, right? Avatars and so forth.

 

It’s (20:46) a little weird. I would agree still entirely. But I could see a place where it is effectively(20:51) handed off where if the bot can’t do what it’s supposed to do, that it hands off to a person (20:55) in a seamless way, right? Right now, a lot of times it’s, I can’t help you.

 

I can’t figure (20:59) this out. Let me refer you to my colleague. I think we’ll start happening is it’ll simply be (21:03) passed off to a human.

 

They’ll continue the conversation and you’ll never know the difference. (21:07) Oh, interesting. Okay.

 

Yeah. So we’ll see where that goes. We haven’t gotten there yet.

 

Our, (21:12) our clients expect humans. I would, I would agree. I would agree.

 

So we’ve talked a little (21:17) bit about the, about kind of the stalker aspect of what Hush deals with, right. And kind of the (21:23) genesis of why Hush exists because of those, those stalker-like behaviors. But what about (21:29) the AI scam side of things? How is Hush kind of preventing, preventing that for the, for the (21:36) users? This episode of the BLTNT podcast is sponsored by Auxiom, business IT and cybersecurity (21:48) designed to outsmart chaos.

 

Empowered by Juniper Networks, automate your network with Juniper (21:53) Networks and the Mist AI platform, the world’s first AI driven wired and wireless network. (22:06) Sure. So you know, unfortunate truth, right.

 

That the identity theft market in the U S financial (22:13) fraud, right. Is bigger than the illicit drug trade, right. It is a, is a tremendous industry (22:18) that continues to grow, made it to Hollywood, right.

 

You saw in The Beekeeper, the movie on Netflix, (22:23) right. That’s just the, you know, a small window again, dramatized, about kind of what goes on, (22:28) but it happens all the time. Right.

 

So I think it’s one out of 17 Americans has their identity (22:32) stolen each year. There are people who are being trafficked, right. To do this.

 

So people often (22:38) ask me and us, well, gosh, who’s doing this and where do they find these people to do it? (22:41) Yeah. They must be such bad, bad humans, right? Exactly. Now, a lot of times they are literally (22:46) being held against their will, right.

 

And, and effectively, you know, sweatshops or worse, (22:50) right, to do this work. And then if that weren’t bad enough, technology is making it easier, (22:55) right.

 

For folks. So, back to AI, there’s something called FraudGPT, right. So we’re (23:00) all familiar with ChatGPT, how flexible that is.

 

Imagine, you know, FraudGPT, which we probably (23:05) shouldn’t show, right, cause it’s on the dark web, the tool set, but that tool is out there. (23:09) And so this goes back to, well, how do you find information about someone that’d be useful? How (23:13) do you find the answers to their bank challenge questions to unlock, you know, XYZ that they may (23:18) use their credit file, right.

 

Their two-factor authentication and so forth. But all of those (23:23) tools are there. And then there’s the impersonation that goes along with it.

 

Right. So most criminals (23:29) don’t really even have to bother with the dark web anymore. Or if they do, they get one piece (23:33) of information because most people share a ton on their social media accounts, right.

 

(23:38) Or from data brokers. And so all they need is several of those pieces of data to do what they (23:43) do. And again, that used to be a manual process.

 

Now AI can go grab those things and automate it (23:49) increasingly. So, so, so the AI and the, between the data mining and the AI, the secret questions, (23:57) like what was your first car? Were you a great name of your grade school teacher or (24:01) things like that? You’re saying those are being aggregated in there. (24:04) Correct.

 

That’s right. And it’s shockingly easy to do. Right.

 

So not hard to figure out (24:08) your dog’s name. It’s not. Right.

 

And people may say, well, I’m not on social media. I don’t, (24:12) I don’t post that. Well, yes.

 

What about your spouse or partner or your teenage kids? Right. (24:16) Or like even the friend who visited your house last week, one of your teenagers, friends who put it (24:21) on social media, right. You didn’t know about.

 

So there’s always that information leakage. (24:25) Yeah. And when I’m thinking back to, so our oldest son is getting married this weekend, (24:29) as I, as I mentioned to you.

 

And one of the ways he got in trouble as a kid was (24:33) his friend posted up a picture of him in front of the emergency room sign. (24:40) And my wife, my wife bumped into that on the web and that wasn’t, wasn’t too good. So (24:46) yeah, you can never, you can never, you can’t even personally (24:52) police everything about yourself that’s out there.

 

(24:54) It’s very hard to do. Yes, that’s right. Because it’s, yes, you can, you can police yourself,(24:59) but policing your family and your extended group is very difficult to do.

 

There’s that, you know, (25:04) even so Michael Dell, there’s this, this famous story about his daughter. So Michael Dell spends (25:09) a ton of money on private security has for a very long time. And so there’s a story about his (25:13) daughter getting onto a private jet, right.

 

And basically posting on social media about her (25:19) upcoming activities. And for people of very substantial means, privacy-wise, that’s the last (25:23) thing you would do, is say where you’re going to be, right. Made her security detail crazy.

 

(25:30) Yeah. So, so your daughter is in the room today while we’re filming. So is there a bit of advice (25:38) right there for her if she gets on the private jet? No, no, no posting.

 

Yes. No posting, right. (25:43) No posting.

 

Yeah. No social media, right. By design.

 

Yeah, exactly. And what are some of (25:47) the rules that you have for your own kids? Sure. So, so one of those is a delay devices, (25:52) right? So, you know, we don’t think we advise other people and look, every family is different.

 

(25:56) Everyone’s gonna make their own decisions. We’re very aware of that. We’re also aware that every (26:00) family is different from a risk perspective, right? What they’re worried about and what (26:03) they’ve experienced in the past, but generally speaking, delay, delay, delay devices and (26:07) smartphones in particular, right? There are other ways to stay in touch, whether it’s an Apple Watch (26:11) or a flip phone and so forth.

 

And we had that conversation earlier, right? So that that’s one (26:16) social media. You know, there’s a reason why you see so many people, (26:20) tech bros and otherwise who their families are not right on social media. So I think that’s (26:25) indicative of well, certainly.

 

And then there’s just awareness, right? You know, we often joke (26:29) at our business, we wish there was a Law & Order cyber crime, you know, edition, because people (26:34) don’t see the scams coming their way. They don’t see how the dominoes fall.

 

And so if you’re not (26:39) in this, like we are every day, you’re not looking for it, right? So having that awareness of, well, (26:44) how does someone’s two factor authentication get hacked when they think it’s very secure with (26:48) their bank or how did malware get on their phone? These are important conversations that we have at (26:53) least once a week in our house. Great. Great.

 

So you’re just making it a part of the, the norm, (26:57) just like somebody would say, don’t talk to strangers. That’s right. Yes.

 

Yes. Lock the door. (27:02) I mean, you know, we, we say all that stuff.

 

Don’t talk to strangers. You know, lock the door, (27:07) all these things that we drill into our kids over time. But these, these technical pieces are, (27:12) are added to that conversation.

 

They are in your house. They are indeed. Even something like (27:15) connecting to wifi when you go on a trip, you know, how many teenagers will find any wifi (27:19) they can get ahold of, right? Just connect.

 

And there are plenty of honeypots that are out there (27:24) that will pull and get access to that information. So we should explain. So a lot of our audience (27:29) are not it executives, their business executives.

 

So explain what a honeypot is. Sure. So let’s say (27:36) you go to some hotel resort or, you know, any place where you think is reasonably popular, (27:41) heck even an airport, for example.

 

Right. And you’re looking for the wifi. People assume (27:45) that the network named after, let’s say, that property is actually the property’s wifi. That might actually be a (27:52) hacker’s wifi node that they’re basically trying to encourage folks to connect to, and it has a similar (27:57) name.

 

Right. So, and once they do that, if your device doesn’t have all the recent protections, (28:01) they can attempt to penetrate your device, get access, right. And from there commit whatever(28:06) crime they’re looking.

 

Sure. And once they’re in that device, then when that device is reconnected (28:10) to your private network, back at your home, back at your business now, then they get in (28:15) even further. That’s right.

 

Yes. And you know, people often ask, well, it’s just a little old (28:18) me. Like what’s, what’s the, you know, why me? The answer is think of it just a massive net.

 

(28:23) This is about fishing, and not the social engineering term, phishing, but just casting a (28:28) very wide net. So if that happens for 10,000 people, hackers will say, well, who do we have? (28:33) Right. Okay.

 

Oh, there’s 50 people. This looks interesting for let’s go target them. Right.

 

So (28:37) it’s opportunity. Well, and sometimes the, the, I mean, just like in fishing, you’re catching a (28:42) small fish to then catch a big fish. That’s right.

 

Yes. So I think that’s the piece that people (28:46) always miss out on is, what’s little old me? Well, little old you may work for Bank of America.

 

(28:51) That’s right. Or little old, you might go, you know, into the take-home building, you know,(28:56) once, once a month, that’s right. Whatever that might be.

 

That’s right. Yes. You know something (29:01) we’ve talked about.

 

And so little fish, big fish, let’s talk about that a little bit is (29:06) right now, hush. The main focus for you is, and this is, this is on a life cycle (29:12) or a product life cycle that, that I think this is where you’re starting. And then (29:15) eventually it’ll become ubiquitous for, for others.

 

But right now you’re really protecting (29:20) and focused on high value, high net worth and high visibility clientele. Is that, is that correct? (29:27) Generally speaking? Yes. So there’s a few buckets.

 

One of course is yes. Private equity, (29:33) family office, right. And so forth.

 

We work with a lot of businesses and those businesses (29:38) are named brands, right? Probably can’t name them here, but you know, Fortune 500, (29:42) Fortune 1000 companies and their employees. And it could be any level, right? So it might be (29:46) the assistant treasurer, for example, or the accounts payable clerk, because they are targeted (29:52) by criminals as well. So yes, it may be with a big company, but those individuals. We give our (29:56) solution away to current and former members of the military.

 

We know that they face enhanced risks (30:03) and there are, frankly, just not effective tools out there for them. So it’s automatic for them and their (30:08) families. And then we have, we have some other programs for current and former public servants (30:13) as well.

 

As sadly you’ve seen, you know, in the news, there is far too much toxicity and, and danger, (30:20) right, for those who are serving today. So.

 

Sure. So and on that, on that life cycle of just kind (30:26) of where Hush is going, you start there and then you’ve told me that in five years, googling a (30:32) person won’t even really be a thing necessarily. That’s right.

 

That’s our vision. Certainly. Right.

 

(30:36) So taking identity theft, going back to that, right. So identity theft in the U.S. is twice(30:41) as bad as it is anywhere else in the world. Right.

 

It’s a tremendous thing. Billions and billions (30:46) of dollars are lost to this every year. Simply by working for a bank, a credit union, et cetera, (30:52) we expect that, yes, in five years’ time, your information won’t be findable.

 

Right. It’s (30:55) ludicrous that I can go look up anyone who works at Lawrence Livermore National Laboratories, (30:59) the ones who look after and engineer our nuclear program amongst others and find a ton of (31:04) information about all employees and their families at large. And by the way, so can other nation (31:09) states at the same time.

 

So that’s got to go away. We’re so worried about in the press these days, (31:16) retribution, retaliation for what’s happening militarily. That will probably be directed at (31:21) private sector companies as well.

 

Right. And so same thing. The best way to shut it down is to (31:26) start pulling back that information about employees and just make them quasi invisible.

 

(31:31) That’s right. Yes. And it’s quasi right in this technology age.

 

We all have to have some (31:36) digital footprint. Right. But we don’t necessarily need, you know, our kids information fully (31:42) available or dogs or otherwise.

 

I’ll give you another example as to why I think we’re getting (31:46) to this place where in five years time the information won’t be there. So ransomware, (31:52) we’ve all heard the term and seen the press releases from a breach perspective. We can (31:56) talk about, you know, any number of these big these big breaches.

 

It used to be that, hey, (32:01) we have your information, and unless you pay us ten million dollars, whatever the number is, to (32:06) Scattered Spider, whoever, they will release the information on the Internet. A lot of them have (32:10) been targeted at health care, right, the last year.

 

And now it’s if you don’t, we’re going to reach (32:13) out to your patients, let them know that we have their information to apply pressure to the (32:17) organization. Sure. Now, what law enforcement is sharing and we’ve seen is the criminals are (32:23) finding information about the employees of the organization, the things that scare them like (32:27) here’s where your kids go to school and saying, unless you pay up, we’re going to reach out and (32:31) find your family and otherwise.

 

Right. So we’re in a place where, yeah, two years from now, three (32:37) years from now, we’re already seeing this will happen in Minnesota. There will be laws increasingly (32:40) probably at a state level.

 

We’ll see if we get there federally where information is going to be (32:45) pulled back. I think data brokers don’t have a very long runway anymore. When you talked about the, (32:53) about those state laws just a second ago, you mentioned that the United States is really (32:59) the worst at data privacy.

 

Now, we do have things like HIPAA and some others, but we, (33:06) but we don’t have GDPR. Like, can you talk about that a little bit? I know you’ve worked internationally (33:11) and, you know, you’re versed on the subject. Yeah, absolutely.

 

So, so Europe, like (33:16) you said, has this GDPR privacy law, which I think is probably more reflective of European (33:22) culture, right, where, generally speaking, people don’t share a lot of information.

 

Right. They (33:27) don’t talk about their families as visibly. I mean, yes, of course, people are on social media (33:31) and so forth.

 

But there’s a culture of holding back information. You know, when you think about (33:35) Switzerland, right, it’s, you know, banking privacy and so forth, and hushed conversations, (33:40) to think about, right, sort of how privacy is looked at, generally speaking, in Europe.

 

(33:45) And so I think that bill reflected right. What was there? I’ll give you an example in Germany, (33:50) you know, Google Street Maps. Right.

 

You can see a house and type an address and see what’s there. (33:55) So many people removed it in Germany and took down their own property where they actually shut (33:59) down the service. Right.

 

OK. So so Europe gets privacy, maybe to the extreme, maybe more than (34:05) needed. But certainly I think it’s a step in the right direction.

 

The U.S., by comparison, (34:10) has state regulation so far. There’s talk about it at a federal level. California has led the way with (34:16) something called CCPA.

 

Right. That’s the California privacy statute. And it basically allows a (34:21) consumer to get their information taken down.

 

Right. In effect, there are now 20 U.S. states (34:26) that have some kind of privacy rule and reg in place. Michigan’s not one of those just yet.

 

(34:31) Hopefully it will be, but done in the right way. I’m the first to say there should not be an undue (34:35) burden on business. And so that’s what we’ve got to watch out for.

 

But it’s coming for every state (34:40) at this point. And then there are, thankfully, particular laws around anti-doxing. Right.

 

For (34:47) those who aren’t familiar, where an individual finds someone’s home address or their phone number, (34:51) elected official or otherwise, and puts it out on the Internet for all to see. (34:54) And then swatting, as you well know, has been a problem for years. This is when someone who has (35:00) a beef, a problem with another individual, basically calls law enforcement to that person’s (35:06) house, fakes or falsifies a report.

 

And then law enforcement comes charging in, (35:12) usually with very negative outcomes. Right. So most states have laws, I should say most, (35:16) a lot of states have laws for anti-doxing, anti-swatting.

 

And we’re getting to the (35:20) place where they have the privacy laws. Yeah. So I don’t want to geek out there too much.

 

(35:24) No, no, no. No, this is great. I mean, (35:26) but nobody’s going to want to leave their house after they watch this episode.

 

Or maybe they (35:33) will because they’re afraid they’re going to be swatted. Go back to in your career, (35:40) right, a little bit more of that WealthX and how that really informed what you’re doing today (35:48) and where the holes were, you know, that you were seeing in the market. And maybe feel free (35:53) to extrapolate on the stories that you told about yourself and kind of being victimized by that.

 

(35:58) Sure. Happy to. So at WealthX, again, there’s this Bloomberg for private bankers.

 

That was the (36:03) idea. And this was coming out of 2008, the financial crisis. All of the talk in financial (36:08) services was not about trading, not about investment banking.

 

It’s all about wealth (36:12) management. Right. We’re at the point where we’re in the largest wealth transfer in the (36:16) history of the world, right, from baby boomers to the next generation.

 

It’s a good time to be a (36:20) wealth manager. Right. And so the idea was, let’s give information to these people because this (36:25) will help them grow their wealth management business.

 

Right. And I don’t know about you, (36:28) I get calls all the time. So there’s a lot of activity in wealth management.

 

(36:32) As we started to do this and build what we call dossiers on individuals, all publicly available (36:38) information. So no dumpster diving, no talking to ex-wives or anything else, all publicly available (36:43) and putting that information together. It was shocking what you could find.

 

And we actually (36:47) left things out. Right. So we left out addresses, we left out phone numbers.

 

Meaning you almost felt (36:52) bad. There was too much detail. Entirely.

 

Right. The fact that I can go find a billionaire’s (36:57) kids, where they go to school, who the nanny is and where they are on a Friday afternoon is, (37:02) to me, insanity. Right.

 

And certainly of concern. Right. And we always said, gosh, (37:07) we’re doing this at arm’s length.

 

This is publicly available. But someone needs to let (37:11) them know. Right.

 

That this is out there. I remember, you know, once we found the phone (37:16) number for someone. We had a client of ours, you know, a very highly regarded wealth manager. (37:22) And so they asked us to find a way to get ahold of someone.

 

We rarely did that. In this case, (37:26) we did. And we found the number to this billionaire’s ranch that rang inside the horse (37:32) stable.

 

Right. So and they picked up, by the way, that’s that’s the funny story. Right.

 

So (37:37) this information is out there if you know where to look and it’s publicly available. But too many (37:41) times it was, again, things that would put naturally almost any family at risk. I’ll give (37:47) you another story.

 

It’s been a heated political season, shall we say, for the last several cycles. (37:53) Sure. Donation information is publicly available.

 

Right. Who gave to what campaign? (37:58) And there are a lot of people on the other side of the aisle who don’t like certain (38:01) individuals. Right.

 

And so there have been plenty of examples where someone (38:05) who doesn’t like a given candidate calls and harasses someone who made a gift (38:09) right to that candidate. And I don’t mean Elon Musk. I mean, two thousand dollars to a candidate.

 

(38:13) Sure. And they will say all kinds of horrible things. I’m thinking about one of our (38:17) relationships where they looked up information about that person’s adult daughter, (38:21) knew that she had gone through some pretty rough times on social media because it was there,(38:25) right, and unlocked.

 

And they basically called and harassed this this adult daughter, saying (38:30) your your father is this horrible person, done all these things, everything else. This is all (38:33) over a few thousand dollar political donation. Wow.

 

So this stuff happens and people just don’t (38:39) recognize it. So that’s the WealthX times where we know this stuff is out there. We’ve been (38:44) personally affected.

 

These people who are in our database don’t seem to sort of see and get it, (38:48) even though most very affluent people fancy themselves as very private. (38:53) The reality is they’re not. Right.

 

And so there’s got to be a better, smarter way. Right. We even (38:58) thought about having a business extension to help pull back some of that information that we left (39:03) out of the dossier just to get rid of it.

 

And so here we are. Right. Several years later.

 

(39:07) Great. So that company, though. So let’s, let’s talk just about kind of the entrepreneurial (39:12) spirit there, too: that company also went from three guys on a couch up to two hundred and fifty (39:18) employees, 13 countries, I think you were in.

 

That’s right. Is that correct? Yeah, spot on. (39:23) So we started, yeah, three guys on a couch in the Upper West Side of Manhattan.

 

I remember our first (39:27) office was, if you can call it that, was an interesting place. We were next door to bill (39:32) collectors, debt collectors. Right.

 

I remember hearing those phone calls in the background. (39:35) You know, we’re glad to move right after that. And then, yes, we grew pretty fast.

 

We moved the (39:41) business from New York to Singapore. So we headquartered out of Singapore, (39:45) believed that we authentically needed to be in the market where the biggest brands, global UBS (39:50) and Citibank and so forth, were focusing a lot of their efforts. So so we did the Asia realignment (39:56) at our small company and then we grew, like you said.

 

So we ended up getting to about 500 (40:01) clients in about four and a half years and sold that to private equity after having expanded like (40:06) we did. Amazing. And then and then you went to was that when you went to Equifax? I did.

 

(40:11) Seemed like the next logical place to go after a wealth data business, (40:15) WealthX. And so, yes, ran the consumer data division there. Was there before the breach (40:20) and then during it, of course, and then the aftermath thereof.

 

Thankfully, I wasn’t in (40:24) IT leadership during that time. I was a business owner. But look, it affected us all.

 

As rightly (40:30) it should have. You know, what is often said, and I’ll say here, is it’s unfortunate that it (40:35) happened. Shouldn’t have happened.

 

At the same time, I would love for people who are listening (40:39) in to know that all of our information pretty much is already on the dark web. Correct. So we (40:44) talked about this iceberg and there’s the above the water below the water in the bottom of the (40:48) iceberg.

 

The bottom of the iceberg is the dark web. Our phone numbers were already there, right? Our (40:52) socials were already there. And so it’s unfortunate, again, shouldn’t happen.

 

Of course. But (40:56) everything is there. It’s been stolen.

 

It’s sitting there, unfortunately, for threat actors to take (41:00) advantage of. That’s funny. I was just asked in an interview the other day about the more recent (41:08) Google and Facebook.

 

You know, there was a big, big breach there of I think it was 16 billion (41:13) records. Yes. Or net new records.

 

And they said, you know, why is this one any different? It’s like, (41:19) well, this one’s a little different because, you know, it’s new passwords. It’s new information. (41:24) It’s not the data that’s already been aggregated and sold a hundred different times.

 

Right. So (41:29) when we hear of, you know, health care company getting getting hacked, most of that information (41:34) is out there. Even the Equifax one that was already out there.

 

But this Google breach, and (41:39) it was, I believe, Google, Meta, and somebody else. Yes. But nonetheless, can you explain kind (41:47) of your view on that? Why, why that one maybe has got a little more impact? For sure.

 

So, so a few (41:53) things there. One, by the way, I should note, to my knowledge at least, the Equifax data has not (41:57) shown up on the dark web. So as much as I say it’s out there.

 

Oh, great. Right. The (42:01) data points are out there, but not necessarily from the, from that origination.

 

That’s right. (42:06) The Equifax breach. That’s right.

 

So a lot of these breaches, the bigger ones we hear about (42:09) end up where, yes, some of those end up in dark web. A lot of those end up just (42:12) sort of like, you know, high end art. It disappears.

 

Right. And no one knows where it (42:17) is. The supposition is those are nation state actors that would use that information.

 

Right. (42:21) For whatever purposes or whatever time. Right.

 

They’re out there. But with the most recent (42:26) large data breach and there’s there’s ones all the time, sadly. Right.

 

I think the issue (42:31) that continues is it adds to the file, so to speak, for targeting. So, you know, we all assume (42:37) that whatever we Google about ourselves is all that there is. The challenge is there’s always a (42:42) dark web file on all of us.

 

Right. That’s there. And it just gets more and more robust as time (42:47) goes on.

 

There are some people who don’t have one. It’s rare. But that that large breach fills (42:52) in a lot of blanks.

 

Right. For those files. And that’s the issue is, again, it’s sort of like (42:57) we talk about China.

 

Right. And their system of having a file on every citizen. And, you know, (43:02) maybe you can look at points in U.S. history where, yeah, there’s a version of a file, (43:06) but it means we all have, we all have a dark web file, too, we’ve got to worry about. For sure.

 

We’ve got to worry about. For sure. (43:10) Yeah, that’s interesting.

 

That’s making me think about my relationship with my (43:13) chat GPT right now. Yes. Right.

 

Which is, my son got into it the other day. He’s, he’s an early teen (43:22) and he said, roast me as me. He says, yeah, the things that it had from its memory (43:30) in its recollection and ability to use against me, you know, in a roast.

 

Yes. Was ridiculous. (43:37) It really made me think to myself, this thing knows this is remembered way more than I would (43:42) have even thought that I had told it.

 

Yes. Yes, very much. So this in many ways goes back to us (43:48) being more aware of our privacy settings, but also, of course, working with organizations and (43:53) brands that are trying to protect us as well. Right now, it’s in ChatGPT, OpenAI’s, interest (43:57) to get everything they can, learn as much as they can, improve their systems as much as they can.

 

Right now, it’s in chat GPT open eyes, interest (43:57) to get everything they can learn as much as they can improve their systems as much as they can. (44:01) But there are settings, right? You can adjust the settings at GPT and other tools. And most people (44:06) don’t know that.

 

Why? Because they’re not front and center. They don’t have time for it. They’re (44:09) just trying it out.

 

But that’s the thing we always say, whether it’s ChatGPT or any other, (44:14) right, LLM, even PayPal and Venmo, right? Just look for the privacy settings and turn off as (44:19) much as you can. Yeah. Yeah.

 

Great advice. You had told me that there was quite a boost (44:27) during the beginning, but twelve, fourteen, whatever, however many days ago it was, with the (44:34) Israeli strikes on Iran. That’s right.

 

Yes. You got an influx of calls or business or (44:41) however it’s transacted. And then, of course, right after that, with the (44:45) Minnesota lawmaker who was, who was attacked in their home.

 

That’s right. The Minnesota lawmaker (44:51) makes a lot of sense to me, right? Because it would be a high-profile person. You know, (44:56) Hush would be one of those perfect platforms to read the web of data about that person.

 

(45:02) But why? Why with the strikes? What happened there? I’ve been hearing a lot lately about the (45:07) knock when an employee knocks on your office door and says, got a minute, and you immediately (45:11) know it’s some sort of incident. But Auxiom IT can help whether you’re having a problem, (45:16) need consulting, an upgrade or a managed IT approach. They focus on preventing cyber attacks (45:22) and proactive solutions that deliver results.

 

My friend Matt Loria and everyone at Auxiom are (45:28) ready to help before or after you get the knock. Visit Auxiom.com and let Auxiom IT help you outsmart (45:36) chaos. Sure.

 

So with the strikes, you know, the news broke, and also at the same time the (45:45) Israeli government closed many of the embassies and consulates around the world. (45:47) Right. Anticipating, you know, blowback or attacks at those locations.

 

Sadly, there was (45:53) the shooting and the murder of two embassy employees in D.C. Right. That happened as well. (45:59) So putting this and stringing it all together.

 

Well, and if, you know, someone who’s intent on (46:04) doing harm to Jews or Israelis can’t get at someone at an embassy, well, then maybe it’s other (46:08) locations, maybe it’s other sites, maybe it’s someone’s home. Right. And so, you know, even if (46:14) we go back to some of the protests about what’s happening in Gaza and Israel and so forth right (46:20) now, I’m always careful to use the right language there.

 

There have been people whose homes have (46:26) been ransacked even here in Michigan, actually. I shouldn’t say ransacked, who have been vandalized (46:30) as well, right. There are a number of stories about protesters showing up at homes, you know, (46:35) putting bloody dolls on the lawn.

 

I mean, all kinds of things that have (46:40) happened there. And so for a lot of people in the Jewish community, there’s a concern that, as a (46:45) result of that turning up even more, right, that attacks may come to their house, to their place of (46:50) work and so forth.

 

So that was one of the reasons for the influx. At the same time, we’ve received (46:55) just those same calls from folks who are Muslim, from the Middle East, because, again, they are (46:59) concerned about the same types of issues. And again, this is where it’s not talked about at (47:03) cocktail parties, but, you know, I could probably spend a whole hour giving you stories of people (47:09) being threatened, harassed online, having packages delivered to their home and manure sent to their (47:14) house because of an issue that they supported or didn’t support, from one post, right? One video (47:20) online and so forth.

 

And, you know, as we say in our business, everyone reacts to online harassment (47:26) differently. So if we were to sort of, you know, look at what’s most likely to least likely, (47:31) right? Least likely someone graduated, right? Yes. Yeah.

 

Least likely is someone showing up to (47:35) someone’s house, right? Catastrophic outcome, as was seen, sadly, with the Minnesota lawmakers, (47:40) but pretty unlikely to occur. But you’re seeing the reaction to that, and now state legislators (47:46) amping up privacy protections.

 

Most likely, it’s threats, right? It’s death threats, (47:51) violent threats sent to someone by email or posted on a social media site. And so after (47:57) the UHC murder, the CEO, right? Of course it was very public. People are broadly (48:03) taking those threats a lot more seriously, right? And so knowing that’s the case, people react very (48:09) differently to threats.

 

And here’s what I mean by that. You might have a, you know, person who (48:13) works at a company, their company is taking a stand on an issue, whether it’s Gaza, sending (48:17) relief, whatever it may be. And they get blowback and they get threats.

 

And that leader, (48:22) that CEO or business owner, might get a threat. And that may be against their house. Maybe someone (48:26) throws in there, hey, here’s your address.

 

Speaking of another example, you know, the other (48:29) day, you know, we know a story where someone who was angry with this executive (48:35) sent a photograph of their house to the executive, right? Just to say, hey, I know where you are. Right. (48:41) I mean, it’s just like out of the movies, isn’t it? I mean, (48:43) but it happens.

 

It’s a very toxic environment these days. Right. And so again, you know, (48:50) the person who receives it might be okay.

 

Their spouse may say, I’m sorry, we’ve got to move. (48:55) Right. I’m not okay with that level of risk.

 

And so this is where the family dynamics really (48:59) kick in. You remember this with Dan Campbell, right, and the Lions here, where, oh yeah.

 

Right. (49:03) It was made public where he lived and they were gone. Right.

 

So, and again, different profile, (49:08) different reasons and so forth. Hopefully he wasn’t facing any, I don’t know, any direct threats, (49:13) but people react very differently to concerns about violence than they might think. (49:17) Yeah.

 

Yeah, that’s true. There’s a statement. So I’m going to kind of pull a couple (49:23) of things together here for you.

 

There’s a statement that always resonates with (49:27) me when I’m sitting down with someone, which is: you are uniquely qualified to be exactly where (49:32) you are right now. And I think that really stands true for you. And when I think about this (49:38) for you, I think there are a couple of different places of how that fits.

 

(49:45) And one is the timing, right? The timing in your career, because of the people that you’re (49:51) working with, you’ve had to establish yourself as someone with a track record of keeping his (49:57) mouth shut. Yes. You know, who’s keeping things private, all the while still (50:05) maintaining normal relationships, right? I mean, I’ve never found our conversations to be (50:10) too odd because you had to withhold too much information or anything like that.

 

But, (50:14) you know, obviously to get the funding for a company like this, to start a company like this, (50:19) and to also have the people trusting you to be the guy who’s keeping this stuff (50:26) quiet and squashing it and finding it, right. Cause you may dig up something else (50:30) that someone didn’t even know was out there. And you’re squashing that, or hushing that, (50:35) as I would say.

 

But then you also told me that you’re about one and a half years early (50:41) on kind of where you are in terms of people adopting what you’re doing. So can you (50:46) take what I just said, that big pile, and kind of digest it and (50:53) tell me what you think about it. So I think you’re spot on.

 

It’s been an interesting (50:57) journey, right? For sure. I guess I’ve done a lot. That’s why I’m part of the (51:00) no-hair club, right? We do this for fashion reasons.

 

I forgot to tell everyone that it’s (51:05) not genetics or anything like that. Yeah. Efficiency, right? Efficiencies.

 

Yeah. Yes. Um, (51:12) it’s come a long way.

 

I think there is absolutely. You know, I remember back in the WealthX days, (51:16) where early on we started and we felt kind of weird, right? We’re like the spies who are out (51:20) here looking at information about people. And yes, we’re holding things back, but eventually we (51:25) started to. I remember in our early days, we’d go into a private bank and we’d talk about the (51:30) solution. And then we’d figure out that the head of the private bank, who’s usually very well (51:34) connected,

 

and sometimes from a very substantial family by name and background, (51:37) is in our database. And so they’d say, well, show me my profile. Right.

 

And then we’d have to, (51:43) again, always ethically, the right way, show them that profile, walk them through it. (51:47) And then we started picking up business owners as clients as well. Right.

 

So, (51:52) you know, in the WealthX days it was, yeah, here’s a dossier, but actually I’m going to buy, (51:56) you know, WealthX, I want to see what you have about me and my family. And by the way, (52:00) I’ve got a son or a daughter getting married. I want to see what you have on the other family, (52:04) right.

 

As well. And again, who might be ultra high net worth in that case. And so we started (52:08) to build relationships with those families, and we’re in the middle of this, which was (52:12) incredibly insightful, I will say, but also a lot of fun.

 

So I think that’s where the journey (52:16) started, being the real sort of trusted advisor in the middle of all of this. At Hush, it’s only (52:21) been amplified, right? So the challenges are, I mean, we see everything, (52:25) from deepfake pornography to death threats to stalking. I mean, you name the issue.

 

And usually (52:31) when families are at some of their worst times, right, where there’s an active situation (52:36) and an active issue, and we’ve got to engage to get whatever it is, right, taken off the internet (52:40) as fast as possible.

 

So I, you know, I think that’s worked well for us. (52:47) The timing is such where, you know, I remember Apple putting on billboards two years ago, (52:52) you know, privacy by design. Right.

 

And I think they sort of started the consumer consciousness of (52:59) privacy, and what does that mean, and do I really care about that? Now they’ve made it an effective (53:04) marketing ploy. And I say a ploy, but I think the reality is they are more private, on average, (53:09) in their tool set and how they operate. And so, you know, I don’t know if you’ve ever used Apple (53:13) Pay, and you can use a virtual email as opposed to your regular email.

 

So it’s getting there, (53:17) but then these other events have picked up, right. And this is the sad part, but it’s also (53:21) been good for our business: the UHC CEO murder, right, (53:28) all of these things that have happened, including the Minnesota lawmakers, two of whom were killed, a lawmaker and a relative, as (53:33) well as the two who were injured.

 

It’s all in a place where we just see these (53:38) trend lines. And even if we take that and zoom in at a state level, like I said, (53:44) University of Michigan here, this has been going on for a year, right, where the Regents have been (53:48) having issues around their own safety and security.

 

You can go down a list of different orgs, different (53:52) people. And even when you go to a cocktail party, right. And this is what we do.

 

Oftentimes we’re (53:56) the catalyst around a conversation. So, you know, we try not to be doom and gloom, but you know, (54:01) we open the dialogue and then people start sharing, right. And they say, oh yeah, I, you know, (54:05) had a stalker, right.

 

So I’ll give you a crazy stat. One out of six Americans (54:09) is stalked in their lifetime, right? Wow. Half of that before the age of 25.

 

(54:13) So if you talk to a room of 15, 20 people, you’re going to have one person who’s gotten stalked, (54:18) right. And then one person who will know someone who has been stalked, and they’ll say, (54:21) oh yeah, this is absolutely crazy. Like, you know, here’s how we dealt with it.

 

And no one will know, (54:25) right. One last story and I’ll, you know, pause. The stories are what make the difference, (54:29) to be honest.

 

I mean, when you just talk in platitudes of, you know, (54:33) be more aware of your cybersecurity, or, you know, things like that, or, you know, (54:36) everyone’s at risk, or it’s not if, it’s when, those have all been muted. Nobody’s (54:44) listening to them. The only thing that works is the actual story, the story of somebody that (54:48) maybe they know. Yes.

 

Agreed. And look, I guess I consider myself and ourselves fortunate (54:54) to share these stories, because they’re also the kind of stories that people don’t (54:58) want to share themselves, right. They don’t want to be recorded on video, you know, (55:01) telling the story about some of their worst times, but it’s true.

 

So I’ll mention this one. (55:06) There’s an individual, you know, relating to sort of election season and the rest, (55:12) who has an opportunity to publicly say his piece, right. For no more than two minutes’ worth, (55:18) right.

 

States his opinion, open mic, that moment goes viral, right. Because, you know, things do (55:24) sometimes, it blows up. Next thing you know, he’s getting shipments to his house of manure, (55:30) of glitter bombs.

 

He’s getting death threats at the house. His spouse is getting called at her (55:35) office with death threats as well. And people are trying to harass her out of her job.

 

(55:39) And again, this is two minutes of, you know, public comments, right. Yeah. And so street type (55:44) of thing it is.

 

And so the reason I tell you the story is I was at an event (55:49) where this individual, you know, when I described what I do, was happy to share, right, what had (55:53) happened to him. And he’s surrounded by his close friends, and none of them knew that had happened. (55:56) Right.

 

And so it’s, again, it’s not the kind of thing people talk about at cocktail parties, (56:01) but it happens all the time. Wow. Wow.

 

Let’s also talk about the, so when you said the one (56:08) point, the one and a half years early, that was something you told me, right? Yes. Talk about what (56:13) that meant to you. So, you know, we like to think that we try to capture the zeitgeist (56:19) of the moment, right? Like, where are we and what’s going on? There’s so much going on these (56:23) days, right.

 

But relative to privacy and personal safety and security, right, where are we on that (56:29) journey? And yeah, we probably started a year and a half, even two years early. I’d like to think (56:33) we got the technology ready so that we’re, you know, now built for this moment, and we are.

 

(56:39) But, you know, again, timing’s hard, right, with startups and tech. (56:42) I was going to say, isn’t that just the technology life cycle, right? You see these (56:46) early adopters, right.

 

These are people who can afford this. These are people who it affects (56:50) most regularly. It’s really affecting everyone, but we’ve got to get it to where the (56:55) cost can be appropriate for everyone.

 

And then also just that the platform is scalable to where (57:00) it can serve everyone. And that’s what you guys are building, right? (57:03) That’s right. And that’s where we are, which is fantastic.

 

But again, it’s sort of, (57:06) when does it become a consumer trend or consumer fad, where people are just, (57:11) hey, I’ve got to figure something out about my privacy and security, right. You know, (57:15) I Googled myself, I did it last week or last month because someone said I should, and (57:18) I don’t like what I see. We’re getting to that place where more and more people are aware (57:23) because of these incidents that have gone national.

 

Well, I mean, I think it’s interesting (57:26) too. I mean, when I think about just myself, right, I mean, I’ve got a (57:30) pretty heavy online presence.

 

I’m pretty forward about, this is what I’m up to these days. And I (57:38) have to say, with every word that I’m saying to you, I’m almost pausing and wondering, you know, (57:42) what should I be doing on that? And I don’t think I’m the rarity, right. I think (57:46) there are many business leaders who are doing a podcast, doing other video work, doing a lot (57:52) of social media, doing a lot of outreach, which is part of the scalability of their own organization.

 

(57:57) Right. So we can’t stop. That’s right.

 

So, is this for someone like me as well? (58:02) It is. You know, we often say, again, platitudes, right. Don’t overshare, (58:07) right.

 

Holding information back. And that’s true, because it’s easy to connect the (58:10) dots. Right.

 

So yeah, I’ve done this. I’m very practiced at what I share and what I don’t share (58:14) and the rest, but I think we’re getting to a place where generally speaking, people will share (58:19) a lot less. You know, for example, I’m thinking about another story recently.

 

This was in the news (58:25) just in the last week, where someone showed up at an individual’s home trying to break in, (58:31) you know, at two in the morning. And it seems, by all (58:38) indications, to be a relative of a disgruntled employee. Oh, right.

 

Okay. And so, you know, we think about the person who owns, (58:43) you know, 25 humming trucks. Yeah.

 

You might want to be very well aware of, (58:47) at least, your home address, right. Because even in cases where, you know, you think, (58:51) for example, law enforcement and folks who are very well able to take care of themselves, (58:55) they’re not always there, right. They’re not always there in the middle of the night, or (58:58) they’re not always there at the, you know, afterschool pickup.

 

And so that’s where this (59:02) stuff matters. Yeah. Well, I mean, boy, that’s bringing to mind the whole, the Minnesota lawmaker.

 

(59:06) I mean, one of the articles kind of walked through the details of how that attacker (59:14) came at them. And it was essentially, I don’t care if she would have had a gun in her hand, (59:20) she had no chance. Yes.

 

Yes. So, you know, sometimes the biggest prevention could have been (59:26) just not having that address. That’s right.

 

Yes. Very much. Take yourself off the board.

 

(59:30) Yeah. I mean, I hate to say it, and this is the sad part, (59:33) but it is indicative: in that example, there was a list. Of course.

 

Right. And if one target was (59:39) too hard to deal with, he would have gone to the next one. Yeah.

 

That’s right. The thing I also (59:43) thought was interesting on that was, you know, they showed a picture of the person’s home and (59:49) I’ll use the word modest, not in any derogatory way. It was just a modest home.

 

It was just a (59:54) normal American home. And it really made me think, wow, sometimes we think when we hear these names (59:59) on the TV or radio or internet or whatever, just through the media, you know, you’re hearing about (1:00:05) this lawmaker, you think that they’re in this ivory tower type of setting. No, they’re just in the (1:00:11) average American neighborhood sometimes.

 

Very much. Right. Especially at state elected official (1:00:15) level.

 

Yes. Very much. And then even the highest profile folks, like the UHC CEO, he’s just walking (1:00:23) with a Starbucks in his hand outside of the hotel.

 

That’s right. I mean, he’s got to walk to the car. (1:00:27) Yes.

 

Yes. So there’s a softness to all of us as targets for sure. Very much.

 

(1:00:36) Something I noticed also, when I was doing some prep work for our conversation, was just going out and (1:00:45) Googling you. OK. And you have your own website.

 

Yes. OK. Is that on purpose, to kind of control (1:00:51) messaging and to control that information flow? So it’s less likely that someone (1:00:58) takes liberties to maybe add or delete from that.

 

So, yes, that’s part of it. Part of it was (1:01:03) to own the domains. I could, of course, not put a website out there, but actually own the names.

 

(1:01:08) It’s funny how, you know, again, not even sort of big-name CEO, but moderate, even small profile. (1:01:15) I consider myself small profile. People will take, you know, Mykolas Rambus sucks dot com.

 

(1:01:22) And try to use that or post whatever they’d like to. If I make one, it’ll be (1:01:26) Mykolas Rambus is decent. Yeah.

 

OK. Thank you. So I’ll go with something a little more.

 

Yes. (1:01:30) More friendly. Yes.

 

Thank you. But, you know, you’ve got to own those domains. You’ve got to see (1:01:35) where your digital footprint is.

 

And, you know, a lot of people talk about search engine suppression, (1:01:40) right, where, thankfully, to my knowledge at least, there are no negative articles out there, (1:01:43) but you don’t want the first things to be something that would index very high, (1:01:46) which is your own name with content that you didn’t control. Sure. So, yes, (1:01:50) that’s one of the reasons, that’s part of it being by design.

 

Absolutely, I like that. (1:01:54) Let’s get a little silly with that, though. Is there one embarrassing data artifact out there? (1:01:59) I hope not. Yeah.

 

You know, I had a Tumblr account once. Which, you with a Mohawk or something? (1:02:04) Like, oh, no, thankfully there’s a high top fade somewhere. Yeah, absolutely. But no, (1:02:10) I can’t think of one.

 

Social media has been shut down for a long time. Right. So.

 

So, (1:02:15) no, not much. Not much. Yeah.

 

I mean, there probably is, on some bulletin board system from (1:02:19) 1999. OK, good. But you’re willing to live with that.

 

It’s the sins of (1:02:23) the past that people will look past. That’s right. You also live in the area here, (1:02:34) but not exceptionally close to where we are here.

 

But you do love (1:02:38) a certain bakery. This is near my office. What other kind of Michigan or just home-based (1:02:47) sort of stuff do you really enjoy that you’re willing to share? Of course, I’m happy (1:02:52) to.

 

So I’m a big supporter of small business and local business. I’d be remiss if (1:02:56) I didn’t mention Buddy’s Pizza first. But of course, I imagine most listeners will know that (1:03:00) and have spent many a game night right at Buddy’s Pizza.

 

But that’s one. We’ve got some great ones (1:03:07) down the street. There’s the Cheese Lady, have you ever been?

 

Right. So, fantastic business as well. (1:03:12) There are so many, right, to point to.

 

Yeah. Yeah. Yeah.

 

But you’re fond of the (1:03:16) niche sort of thing. Very much, very much. Yeah. If you can find that right thing, (1:03:20) all the better.

 

Yes. Love it. Yeah.

 

Love it. What else? What do you think? (1:03:27) Just general advice, it doesn’t necessarily have to be security, but with our audience being (1:03:32) business leaders, what would you give us as a takeaway? You know, if you could get right in (1:03:38) their face and tell them, do this one thing based on everything I know, what would you tell (1:03:43) them? So I would tell them a few things. One, go back and run the playbook from 1999, 2000, (1:03:49) right, where at that point people were scrounging for somebody who knows something about this thing (1:03:53) called the Internet.

 

Right. Go hire your teenage kid. Go hire a college student.

 

Do the same thing (1:03:58) for AI. Right. AI is transforming every business.

 

It’s hard to keep up, right, even for those (1:04:03) who are in the industry.

 

And so get that person and figure out what it’s going to mean for your (1:04:07) business, so you’re not left behind. Right.

 

You know, we have such an opportunity with this technology. (1:04:14) This is a bit of me with my, you know, sort of, you know, U.S. cap on. Right.

 

As a nation, (1:04:18) we have a great opportunity to leverage this and make that many more productivity gains in our (1:04:23) economy and to reinvent the economy in other ways. Right. I’m very excited about Reindustrialize.

 

(1:04:28) That’s a big conference that’s coming to Detroit, coming up soon. So figure out AI, (1:04:34) figure out how to leverage it for your business, and begin to pull in someone who’s, (1:04:37) you know, new and doesn’t have the baggage of a certain way of doing (1:04:42) something, to look at your business in a different way, and bring them to the fore.

 

Right. So that’s (1:04:47) one. Right.

 

Two is, look, we live in a complicated, messy time. Right. So, you know, know what’s out (1:04:54) there about yourself and your family.

 

You know, we say often, look, even if you don’t use Hush or (1:04:59) whatever, at least Google yourself, just know what’s there, because whether you’re a business (1:05:03) owner and you one day might sell your company, you know, you’re worried about the competitors, (1:05:08) they’re doing the same thing. Right. They’re doing all the research, all the homework.

 

So just (1:05:11) awareness is the first step in that regard. And if they want to go further, then great. Right.

 

(1:05:15) Those are the two things I’d share with listeners, mostly. That’s great. Yeah.

 

You know, (1:05:19) and to buttress that a little bit, I mean, what I continue to tell anyone younger than me (1:05:24) is, good, clean living pays off long term. Oh, yeah.

 

Because the best thing is to not (1:05:30) have any of those negative remarks, that negative news, about you in, (1:05:34) you know, the ongoing media. Yes, very much agreed. And look, I would like to (1:05:40) think that, you know, COVID, which in some ways seems so long ago, taught us all hopefully some (1:05:46) good lessons about what’s most important.

 

Right. Family, friends and life is short. So hopefully (1:05:51) people remember that going forward.

 

And you’ll notice I didn’t ask you a whole lot of questions (1:05:54) about your family, because I know you’d be pretty hushed about that. You’d deflect those. Right.

 

Yes, (1:05:58) exactly. Well said. All right.

 

Well, thanks for being here, Mik. This was fantastic. Very timely.

 

(1:06:04) And you’re a gem. So I thank you for being here. Likewise.

 

I’m lucky to be here. Pleasure. (1:06:08) Thank you.

 

Thank you. Appreciate it.

Guest Bio

Mykolas Rambus


Mykolas Rambus is the CEO and Co-Founder of Hush, the elite privacy service protecting companies and high-risk individuals. Prior to founding Hush, he ran Equifax’s consumer data business; founded, built, and sold Wealth-X, the leading Singapore-based wealth data business; was an executive at Forbes Media; and was an award-winning public and private company Chief Information Officer. Mykolas began his career as Co-Founder and CEO of LOBBY7, the MIT mobile software spin-off backed by SoftBank. He also serves on several university and private company advisory boards, is a keynote speaker on both wealth and privacy, and has been a featured guest of numerous media outlets including Bloomberg, CNBC, the BBC, and CNN.

Listen anywhere:

Feedback?
We’d love to hear from you! podcasts@auxiom.com