
How gen AI is making real estate cybercrime easier than ever

As the FBI report suggested, artificial intelligence shares much of the blame for the rise in financial crime.

“It levels the playing field,” said Matt O’Neill, a retired Secret Service agent and founder of 5OH Consulting.

O’Neill said that cybercriminals previously specialized in certain areas of crime or in certain technologies. They would then work together, offering “cybercrime as a service” to scam their victims.

Now, however, O’Neill says AI has made it so that hackers don’t need any real level of technical expertise.

“Two years ago, the lowest-level players didn’t have much success — it was a pure volume game — but now with AI it’s much easier for them to create complex attacks,” O’Neill said.

While cybersecurity experts believe fraudsters are still in the early stages of using AI, they have already seen some striking applications.

Adams and his team recently came across a spoof website impersonating a real title company, something he found deeply concerning.

“It was an exact replica of the title company’s website. Everything was the same except for the phone numbers, which they had swapped out so they could pose as the title company,” Adams said. “Those situations are the ones that scare me the most, especially when it comes to AI development, because it’s no longer a bunch of people trying to figure out how to rebuild a website. With AI they can just scrape it and rebuild it, which makes it a lot easier.”

But sophisticated website spoofs aren’t the only way fraudsters use AI. Cybersecurity experts say they’re also seeing artificial intelligence applied to older vectors like phishing scams. According to industry leaders, fraudsters’ use of AI makes these scams far more believable — and, unfortunately for victims, it works.

According to a study conducted by Fredrik Heiding, Bruce Schneier and Arun Vishwanath at Harvard University, 60% of participants fell victim to AI-generated phishing. The researchers said this is comparable to the success rate of non-AI phishing messages created by human experts. What they found most worrying, however, is that the entire phishing process can be automated using large language models (LLMs), reducing the cost of phishing attacks by more than 95%.

“As a result, we expect phishing to increase significantly in quality and quantity in the coming years,” the researchers wrote in an article in the Harvard Business Review.

The increased sophistication of phishing scams has raised alarm bells for Andy White, CEO of ClosingLock, especially since most of the focus in cybersecurity has been on sophisticated attacks rather than on phishing scams, which have been around for decades.

“We don’t really think of phishing scams as a way for fraudsters to use AI to break into the real estate industry, but if you can use AI to make a fake link that’s very believable and a lot of people click on it, then you’re in. You can get into the title company’s systems and send emails from the title company itself — not from a fake account — or change the account numbers so the money goes into fake accounts,” White said.

While this is scary in itself, cybersecurity experts warn that even scarier scams are just around the corner as it becomes easier to create convincing deepfake videos.

“The technical bar and the level of sophistication needed to do this attack is still not very high, and the hardware costs to do it are coming down,” said John Heasman, chief security officer at the identity verification company Proof. “We expect to see more incidents of real-time face manipulation and the production of deepfake videos throughout the year.”

While Adams believes that deepfakes are a real threat to the real estate industry, he doesn’t believe we’ll see scams using this technology for a few more months.

“I think this year we’re going to start seeing some really convincing fake IDs used in notarizations and things like that, and that’s going to be the biggest risk of the year. But when it comes to full deepfakes — getting on a Zoom call and not knowing if you’re talking to a real person — I think we’re going to start seeing that later this year or in early 2026,” Adams said.

Given all this, cybersecurity experts agree that it’s easy for real estate professionals to feel overwhelmed by the threats posed by fraudsters and their newfound AI capabilities, but they believe it’s not all doom and gloom.

“Small and medium-sized businesses are increasingly ramping up their security, doing things like conditional access and dialing up their security intensity, which is promising,” said Kevin Nincehelser, CEO of the cybersecurity firm Premier One.

While fraudsters may have new tactics, Nincehelser said the “good guys” also have new tools at their disposal.

“Many security tools are also using AI now, and it has helped a lot in detecting and mitigating more attacks,” Nincehelser said.

Premier One’s cybersecurity partners have begun using AI-powered email filtering products, which Nincehelser said has been a game-changer in preventing both fraud and ransomware attacks.

“In the past, email filters just looked for patterns, but then the bad guys stopped using patterns and started using AI. The AI tools we have now can stop those attempts and attacks that come in through email because they look for behavior and intent,” Nincehelser said. “AI tools don’t just recognize a link in an email the way a human would — they see the next three steps beyond that link and what it will ask the user to do. From a security perspective, AI email security has been one of the most powerful new technologies to come out of this so far.”

While O’Neill acknowledges the need for improved fraud detection and prevention tools, he believes the real estate industry could also use a push from the government to improve its cybersecurity.

“I’m working with state legislators to create some kind of duty-of-care framework that says you have to have these basic safeguards — like multi-factor authentication and using secure communication platforms rather than web-based email — when dealing with client transactions over a certain dollar amount,” he said.

At the federal level, O’Neill said there is a push in the financial sector to use Section 314(b) of the Patriot Act, which allows financial institutions to share information with one another. He believes widespread adoption of this provision would go a long way toward preventing fraud.

According to O’Neill, part of the challenge is that, as of now, 314(b) participation is voluntary, so many banks have decided not to take part. On top of that, banks are not usually responsible for the loss, which is simply passed on to the buyer.

“If they can’t do that anymore, they’re going to have to start communicating with each other,” O’Neill said. “There could be meaningful changes if financial institutions do things like match account numbers to account holder names and things like that.”
