May 31, 2019

NY Investigates Exposure of 885 Million Mortgage Documents

New York regulators are investigating a weakness that exposed 885 million mortgage records at First American Financial Corp. [NYSE:FAF] as the first test of the state’s strict new cybersecurity regulation. That measure, which went into effect in March 2019 and is considered among the toughest in the nation, requires financial companies to regularly audit and report on how they protect sensitive data, and provides for fines in cases where violations were reckless or willful.

On May 24, KrebsOnSecurity broke the news that First American had just fixed a weakness in its Web site that exposed approximately 885 million documents — many of them with Social Security and bank account numbers — going back at least 16 years. No authentication was needed to access the digitized records.

On May 29, The New York Times reported that the inquiry by New York’s Department of Financial Services is likely to be followed by other investigations from regulators and law enforcement.

First American says it has hired a third-party security firm to investigate, and that it shut down external access to the records.

The Times says few people outside the real estate industry are familiar with First American, but millions have entrusted their data to the company when closing on the purchase or sale of a home.

“First American provides title insurance and settlement services for property sales, which typically require buyers to hand over extensive financial records to other parties in their transactions,” wrote Stacy Cowley. “The company is one of the largest insurers in the United States, handling around one in every four transactions, according to the American Land Title Association.”

News also emerged this week that First American is now the target of a class action lawsuit alleging the Fortune 500 mortgage industry giant “failed to implement even rudimentary security measures.”


18 comments

  1. It’s unreal how much this happens. Almost daily there are examples of Elasticsearch DBs leaking PII or other sensitive information. Amazon S3 buckets are the same, but from what I understand, these are getting more locked down–which is good. I know those used to be a gold mine for bug bounty hunters as you could find those all over the place.

    • The sad thing is that it’s all misconfiguration. AWS had to take drastic steps to “secure” S3. People thought * meant “in my account” even though it meant public (and is documented as such).

      All they did was add 4 buttons that basically “block all external access.” But you could have restricted permissions to not allow people to change the policies. In most cases, people literally checked “Public Access” in the ACL and then were surprised when their content was leaked.
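The `*` confusion described above can be made concrete. Below is a minimal sketch (the policy document and helper function are illustrative, not part of any AWS SDK) of the kind of bucket policy that makes data world-readable, plus a check that flags it: in S3 policy language, a `Principal` of `"*"` grants access to anyone on the internet, not just identities in your own account.

```python
import json

# Example bucket policy resembling the misconfiguration described above.
# Principal "*" grants s3:GetObject to EVERYONE, including anonymous users --
# not "everyone in my account" as many people assumed.
PUBLIC_POLICY = json.dumps({
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": "*",  # wildcard principal = the whole internet
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::example-bucket/*",
    }],
})

def has_public_statement(policy_json: str) -> bool:
    """Return True if any Allow statement grants access to the wildcard principal."""
    policy = json.loads(policy_json)
    for stmt in policy.get("Statement", []):
        principal = stmt.get("Principal")
        is_wildcard = principal == "*" or (
            isinstance(principal, dict) and principal.get("AWS") == "*"
        )
        if stmt.get("Effect") == "Allow" and is_wildcard:
            return True
    return False

print(has_public_statement(PUBLIC_POLICY))  # True: this policy is world-readable
```

This is essentially what automated scanners (and the Block Public Access feature mentioned above) look for before the data leaks.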

      The big problem with cloud (and I say this as a Cloud Security Engineer at a Fortune 500) is that you are giving people access to make their own security decisions. Developers are getting handed environments with unlimited access and then they are making dumb decisions, rather than being given a tailored environment with controls already in place like when they had to provision a server. In cases where we do build out a control and access scheme in advance, we get constantly berated for “slowing down” progress because they expect to have a perfect environment that doesn’t have any noticeable restrictions within an hour.

      Now why aren’t we seeing this as much with GCP and Azure? Because they started from the perspective of enterprise support. Azure can inherit your existing AD permissions (to an extent) and GCP has a top-down security stance. AWS has been slowly developing controls for organizations, but 90% of what we use was built in-house because no such solution existed 2 years ago. The default for AWS was an open account with full permissions. Anything else had to be designed and deployed by whatever enterprise was using it (basic permissions models exist, of course, but you had to give them permissions and then manually go in and fix it anytime they hit a roadblock – or build your own automation, like we did).
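The "tailored environment with controls already in place" idea above boils down to default-deny: a developer role can do only what an administrator explicitly allowed. A toy sketch (the role name and action list are hypothetical, and this only mimics IAM-style `service:Action` patterns, not real policy evaluation):

```python
import fnmatch

# Hypothetical pre-approved grants for a developer role.
# Anything not listed here is denied by default -- the opposite of
# handing developers an open account with full permissions.
ALLOWED_ACTIONS = {
    "dev-role": ["s3:GetObject", "s3:PutObject", "logs:*"],
}

def is_allowed(role: str, action: str) -> bool:
    """Default-deny: permit an action only if it matches an explicit allow pattern."""
    return any(fnmatch.fnmatch(action, pattern)
               for pattern in ALLOWED_ACTIONS.get(role, []))

print(is_allowed("dev-role", "s3:PutObject"))    # True: explicitly granted
print(is_allowed("dev-role", "s3:DeleteBucket")) # False: never granted, so denied
```

Real deployments get the same effect with scoped IAM roles or service control policies; the point is that the safe outcome happens without the developer having to make a security decision at all.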

      • “The big problem with cloud (and I say this as a Cloud Security Engineer at a Fortune 500) is that you are giving people access to make their own security decisions.”

        ^Exactly. Part of that problem is that people are always going to value convenience over security. They’re not even “security decisions” at that point as security is an afterthought. Unfortunately.

  2. The Sunshine State

    Short article and to the point! I love it!

  3. Look at you go Brian! Keep exposing organizations that fail to protect our PII – your voice echoes far and wide. And kudos to your source, who tried to do the right thing by contacting FAF directly before (begrudgingly) zero-daying them through you. As long as we (the community) are holding insecure companies accountable for their incompetent ways, the regulators will have to follow suit and burn them to the ground (here’s looking at you, “negative” Equifax). Bravo good sir! Bravo indeed.

  4. It will be easier to bill these fines to customers than to fix the system.
    It will just be part of doing business.
    The list of fines banks already get does not seem to change the way they behave.

  5. Change won’t happen until it affects the company bottom line and the people at the top who are in the end responsible for budgets, project timelines, hiring practices, and policy implementation. READ: CEO. (I know I’ll be waiting a long time for that, which unfortunately means this sort of thing will be happening for a long time.)

    In the meantime whoever is responsible for choosing a provider for the service needs to consider alternatives… assuming they are any better…

    • As an impact on the bottom line, there will also be the inevitable class-action lawsuits. As usual, the lawyers will get richer. The irony is that lawyers are the customers and sales agents for companies like FAF. I once worked IT for a title & abstract company that was an agent for FAF; they proudly had the FAF emblem embossed on their letterhead.

  6. I’ve seen SOHOs locked down better than that! Thanks Brian for all you do!!

  7. Fines are just sources of revenue for government. No reasonable legislator could possibly think that fines will change behavior, knowing fully that companies will pass along the costs to consumers. As government revenue, it’s a tax.

    If the NY authorities actually wanted to affect behavior, there would be criminal proceedings for theft by fiduciary negligence and theft of mortgage fees that were supposed to be used for properly maintaining banking records.

    If NY authorities wanted to affect data security practices, executives and their data handlers would be personally responsible for fines and consumer costs related to data loss due to carelessness.

    Instead, this is NY authorities producing headline porn to bolster bureaucratic and political careers.

    This is a tax on a business and its customers that will not produce any data security improvements.

    This is not good governance.

    • Proving carelessness won’t be trivial, so I think significant fines will continue to play an important role. Suggesting that fines do “not produce any data security improvements” is false. Many organizations took action to become GDPR compliant. Executives care about the bottom line, and the fines need to be prominent enough to affect it either directly or indirectly through reputation damage.

    • The threat of fines certainly produced a lot of data security improvements for health records. HIPAA had an effect.

      Sarbanes-Oxley data reporting improvements were also greatly driven by the threat of fines.

      Bad businesses will certainly try to pass along the cost of fines to consumers, but if they have competition, it’s not going to be easy to do that.

      • Matt and VB,

        Regulatory burdens hurt smaller businesses and decrease competition.

        Since you mention it, I think there’s a line that can be drawn between the S/OX act in 2002 and the economic meltdown that occurred in 2008.

        Regulation like that costs a lot of money in compliance documentation. This drives out smaller businesses, causes more mergers to save on costs, and concentrates the marketplace into a few businesses.

        Then, when they all seemed to collapse at once, the remaining big businesses were “too big to fail,” which led to bailouts, mass unemployment, human sacrifice, dogs and cats living together – mass hysteria.

        So, no, I respectfully disagree that regulation does any good.

        As for the question of whether the threat of penalties change behavior, I submit that it doesn’t. GSK, Wells Fargo, and ZTE, all thumbed their noses at the possibility of fines. No one risked jail, so they did not care.

        Two years in a row, Citigroup paid billions in fines. They didn’t learn their lesson in 2017, so they got fined again in 2018. Will they go for the hat trick this year?

        If fines deterred bad behavior, why did Volkswagen, Credit Suisse, Bank of America, Goldman-Sachs, Verizon, BP, and Anadarko Petroleum all do such monumentally stupid things that they now owe billions and billions of US dollars in fines?

        No, fines don’t deter bad behavior. It just gets passed along to consumers and hurts the economy.

        • The only line that can reasonably be drawn between 2002 (1999, actually) and the 2008 economic recession is with the repeal of Glass-Steagall. It was *less* regulation that led to the crash, not more.

          As someone in the financial cybersecurity world whose entire job revolves around GLBA/SOX compliance, let me be the first to really point out that without those requirements most major financial institutions would place significantly less effort on data security.

          Given how many exemptions already exist for small businesses, it’s wholly unreasonable to expect those companies be allowed to make money off of the data of others without taking common sense precautions to protect it. Care to guess how many sole proprietorships handle NPI on unencrypted workstations and store it in unsecured (or barely secured) cloud services? Why shouldn’t they be required to lock those down and report when they lose sensitive information to a breach?

          • No, please tell me. How many sole proprietorships handle NPI on unencrypted workstations and use unsecured cloud services? As I currently run one, I’d like to know what the competition is doing.

  8. disgruntledcustomer

    This is ridiculous and disgraceful for a company of First American’s size. I see the big wigs are just minting money without bothering about operations. Their CIO really seems to be a misfit for the job. I would imagine the same about the other executives if the CIO is of such low caliber and from a different educational background.

    I wonder how they reach such heights without any proper knowledge or experience. I pity the IT staff, and I am ready to be part of the class action, as I am one of their customers as well as a shareholder.

    I am upset with the CEO over data privacy, at which he miserably failed. They really need to clean out the top bosses and whoever else is responsible, including the security department, and get people who are experts in the industry.

    We have not received any follow-up on the data breach yet, or word on whether our data got compromised in the process. Have they really done damage control yet and come out with full disclosure on the actual issue and its remediation? They owe us an apology, an explanation, and drastic action against all the bosses responsible for such a silly security flaw.

    Looking forward to updates on this. Please keep us posted on the class action.

  9. Robert Carleton

    Maybe Paul Bandiera, their new VP IT Strategy can straighten things out. It looks like he has a lot of work ahead of him.
