Real Estate Tech: Disruption In Slow-Mo

About: CoreLogic, Inc. (CLGX)

Summary

Real estate is undergoing an epochal shift in technology.

Tech shifts in real estate are real and massive, but very gradual.

The tech adoption curve in real estate has shortened over the last few decades, but is still long.

Key Trend: The recent spread of API and Big Data technology will be felt in real estate over the next decade.

Some of the darlings of the last real estate tech wave are likely to be losers in the new wave - know who they are.

To speak of "disruption" or "revolution" in the real estate business would be silly, considering how far the industry has come technologically in the past half century.

The real issue in real estate technology isn't a lack of innovation but the slow pace at which most brokers and agents adapt whenever a major shift occurs in the industry.

It usually takes years for the real estate agent holdouts to go bankrupt, get bought out or begrudgingly accept the handwriting on the wall - and it was this way long before the Internet existed. See Chart 1 below and you'll see what I'm talking about.

So let's take a walk down memory lane and review the history behind the technology most real estate agents use today. You should see afterwards some very clear patterns that explain how we got to where we are today.

1907-1970s: The paper MLS era

Practically all North American realtors today are members of at least one multiple listing service (MLS). In the U.S., the concept of real estate agents sharing listings via a centralized source dates back to the 1880s. As part of pooling agreements, it became accepted practice for the listing agent to split the sales commission with the agent who brought in the buyer. This type of information clearing house was first formalized as an MLS in 1907.

MLS services were set up in different real estate markets across the country until the end of the 1920s, with most run by local real estate boards. Each week, agents would go to the office of the local real estate board or whichever organization ran the MLS to receive the latest listings in paper form, usually a book.

Before these MLS were created, the only way real estate agents could get listing information on properties they didn't personally list (i.e. pocket listings) was:

● By working as an agent at the same large area brokerage as the listing agent.

● By negotiating directly with other agents to pool leads.

MLS were designed to address a major pain point among real estate agents. During the first half of the 20th century, the most popular type of listing contract for homeowners was an "open listing." This allowed sellers to sign contracts with multiple brokers and whichever broker closed the sale received the commission. The other brokers got nothing. Sellers liked this arrangement because it helped their property receive as much exposure as possible, increasing the likelihood of a sale and competing buyer offers. Real estate agents hated this arrangement since it often led to agents working without pay.

MLS solved this problem by addressing both the needs of property owners and real estate agents. Under MLS rules, property owners have to sign exclusive right-to-sell contracts with listing agents for their properties to appear in the MLS listings book. Exclusive right-to-sell contracts mean that agents get paid if the property is sold during the period covered by their contract, regardless of who sells it. This was crucial for the MLS systems, since it meant competing brokers couldn't use the information provided by the listing broker to steal the sale.

Property owners gradually bought into the need for MLS listings as they saw that pooled listings ensured a larger number of buyer-side agents would bring potential buyers to the property. In fact, a Federal Trade Commission report on the U.S. real estate industry in the 1970s found that property listed for sale on MLS sold twice as fast on average as property that wasn't. The added value MLS listings provided was so clear that brokers in some markets actually added 1 percentage point to commissions for clients who wanted an MLS listing.

However, despite most MLS being formed around the 1920s, this innovative tool only became standard for most real estate agents over 30 years later, during the post-war period. Even as late as 1950, the majority of real estate listings in the U.S. weren't available via MLS. Peak adoption rates for paper-based MLS took even longer, somewhere between 40 and 50 years.

In 1971, the National Association of Realtors' MLS Policy Committee chairman told the association's members that over the past 25 years "MLS has in most areas of the country become a way of life for both the homeowner and the broker." By 1977, 93 percent of real estate brokers in the U.S. were MLS members and 92 percent of broker-listed homes sold in 1978 had an MLS listing, according to the FTC.

However, MLS didn't solve all problems. Several issues remained for real estate agents:

● Updated MLS listings were distributed no more than once a week, so realtors often had out of date information when talking to clients.

● MLS listing books had to be commercially printed and physically distributed to brokers and agents, which took significant time and money both for the real estate boards and the brokers.

● Leads still came in largely by word of mouth and traditional (e.g. newspaper) advertising. Most agents still had to be active, long-time members in their local communities to be effective in drumming up business.

It's no surprise then that large regional brokerages or real estate franchises were rare to non-existent during the era of paper listings. The large degree of local knowledge and connections needed to cultivate a steady stream of leads also meant that real estate agents were better off working for a small brokerage than working for a large company based outside the region.

1960s-1980s: Computerization

Until the late 1960s, nearly all MLS listings were distributed as a book on a weekly basis. The computerization of MLS listings began in 1966 with the founding of a company called Realtron Corporation, although it's quite difficult to pinpoint which MLS was the first to implement their solution. The National Association of Realtors (NAR) itself launched its first computerized system for MLS in 1975.

However, most of the more than 800 MLS in the U.S. and their broker members adapted very slowly to computer technology. By 1978, only 27 percent of MLS provided listings to brokers through computer terminals. Most (78 percent) still distributed listings through a weekly printed listing book, and paper-based MLS listings continued to exist well into the new millennium.

Computerized MLS improved the ability of skilled brokers to expand their business by saving a tremendous amount of time that was previously spent getting listings information. Computers increased the upfront capital needed to start and run a brokerage but also greatly increased efficiency. A broker had to invest some money into purchasing a computer terminal and hire or train staff members capable of using it to search through, enter and print out listings information. Computer use ensured real estate agents could keep a large, up to date portfolio of properties ready to show to prospective clients.

But computerization didn't just help brokers share listing information; it also helped them grow the quantity of leads they could bring in and track. Computers dramatically improved the targeting and tracking of direct mail campaigns while also cutting mail production costs. This led to a boom in direct mail campaigns among real estate brokers, as well as in other industries.

According to a study conducted by the U.S. Postal Service, the volume of direct mail advertising grew 9.8 percent per year between 1980 and 1988 - triple the rate of economic growth! Just like in other industries, direct mail campaigns enabled major brokers to develop a pipeline of leads sufficient to sustain a brand with offices full of agents.

It's no accident that many of the well-known realtor chains and franchises today began to expand outside their home markets only after computers were introduced to both sides of the real estate transaction.

However, computerization still didn't address several pain points for brokers and agents:

● No way to speed up or improve lead qualification.

● Lack of a nationwide database for property listings.

These pain points were partially addressed by the next major technological change in real estate: The rise of online real estate listings through third party portals and IDX feeds.

Online Real Estate Listings

Few statistics are available from the early days of the Internet, but by 2005 - slightly over a decade after the broader U.S. public became aware of the world wide web - half of real estate agents had their own website. Importantly, real estate agents' adoption of the Internet to buy and sell homes trailed that of the general public. In 2003, 71 percent of homebuyers already used the Internet to search for homes.

It was the thin years of the Great Recession that really culled the old-school holdouts and made a web presence a necessity for successful real estate agents. By 2013, 72 percent of real estate agents had their own website and 82 percent had a company website. Meanwhile, 92 percent of recent homebuyers in 2013 used the Internet to search for homes.

Third Party Portals

In 1994, the NAR created a subscription-based online listing service called Realtors Information Network (RIN). The original project floundered for a bit before becoming Realtor.com, which started out by featuring listings from the San Diego MLS, Sandicor. A variety of other sites attempted to bring listings online, like Microsoft's (NASDAQ:MSFT) Home Advisor, which launched in 1998. However, nearly all the real estate sites from the '90s besides Realtor.com failed to gain critical mass or implement a sustainable business model.

Aside from Realtor.com, the other major online listing portals today, Zillow (NASDAQ:Z) and Trulia, were actually founded in the Web 2.0 era. Trulia launched in 2004, while Zillow got off the ground in 2006. Moreover, the ZTR (Zillow, Trulia and Realtor.com) triumvirate that reigns in the industry today only solidified its dominance in the years following the 2008 financial crisis - more than 15 years after the initial innovation reached the market. As of 2013, 70 percent of realtors displayed their listings on major third party portals like Zillow and Realtor.com, according to NAR's latest Technology Survey.

ZTR and the entire concept of a third party portal helped the industry by enabling homebuyers everywhere in the U.S. to visit a single website to find and research homes. This addressed a major need of the Web 2.0 era, when more than 88 percent of homebuyers conducted their own online research before visiting homes. Some 43 percent of homebuyers searched for properties online as the first step in the buying process.

The key to the portals' success is that they understood something very important: because the portals bring the buyers, most brokers have to use these sites whether they like them or not. To keep this critical mass of buyers, the ZTR portals all host listings for free, and Zillow even pays some MLS to supply their listings directly to the portal.

Besides free property listings, the real estate portals also provide value-added services to agents:

● Advertising agents' profiles alongside listings.

● Supplying leads that showed interest in area listings.

These value-added services address agents' need to generate a steady volume of leads to keep closing, but they do so with poor-quality leads. A 2015 survey by Inman found that more than 50 percent of agents buy leads from third-party online listing portals, but 53 percent of those respondents said the quality of these paid leads was "poor." The reasons why a majority of the agents using the portals' lead gen services are dissatisfied with them are easy to find wherever real estate agents vent online or offline.

The leads offered by the portals usually give agents only whatever information was generated by the website's lead capture activities, and a lead's actual intent to buy is measured simply by someone clicking on a listing and/or filling out a very short lead form. Many of these leads haven't been qualified the way any good sales lead would be. They lack information on:

● The prospect's budget, i.e. ability to buy.

● Verification that the contact information the prospect provided is actually accurate.

● Multiple ways to contact the lead. For example, social media along with email.

This means agents end up paying top dollar for a large number of junk and unqualified leads that either cannot be sold or require additional pre-sale work.
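To make the gap concrete, here is a minimal Python sketch of the kind of check a properly qualified lead would pass - a check most portal leads fail. The field names and thresholds are purely illustrative assumptions, not any portal's or CRM's actual schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Lead:
    """Illustrative lead record - the field names are hypothetical."""
    name: str
    email: Optional[str] = None
    phone: Optional[str] = None
    social_profiles: List[str] = field(default_factory=list)
    budget: Optional[int] = None       # stated ability to buy, in dollars
    contact_verified: bool = False     # has the contact info been confirmed?

def is_qualified(lead: Lead) -> bool:
    """Treat a lead as qualified only if it has a budget, verified contact
    details, and more than one channel for reaching the prospect."""
    channels = sum(1 for c in (lead.email, lead.phone, lead.social_profiles) if c)
    return lead.budget is not None and lead.contact_verified and channels >= 2

# A typical portal lead: a name and an unverified email from a short web form.
portal_lead = Lead(name="Jane Doe", email="jane@example.com")
print(is_qualified(portal_lead))  # False - no budget, unverified, one channel
```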

Portals like Zillow and Trulia also created a new pain point for agents. These sites allow any agent willing to pay enough to have their photo and details placed at the top of every listing in a chosen zip code. This means agents who don't know a property or even the neighborhood well can take potential clients away from agents who are much more likely to serve them well and close the sale. It also leads to situations where agents with other listings in the target zip code hijack potential buyers for a specific listing and re-direct them to listings where their broker office will get a full commission.

This has led to a growing trend of established brokers abandoning the third party portals since at least 2012. With the post-recession recovery in many U.S. real estate markets, it has also led to a growing share of pocket listings. A pocket listing is one that is not entered into an MLS or a third party portal (which get most of their listings from MLS) but is instead sold by the broker via word of mouth and other channels that are expected to attract interested buyers without being accessible to the broader public. In some markets, pocket listings now approach 20 percent of homes sold, a point reiterated in a Wall Street Journal article last year headlined "Real Estate Pocket Listings Go Mainstream."

Internet Data Exchange (IDX)

Internet Data Exchange is another method of sharing real estate listings online, and one favored by most brokers who invest significantly in technology. On January 1, 2002, all MLS and real estate associations affiliated with NAR were required to enable their members to display aggregated MLS listings data on their public-facing websites. This decision formally made the use of IDX feeds technically feasible for realtors across the entire U.S.

IDX feeds help brokers increase their volume of inbound leads and provide interested homebuyers with a much wider stock of homes to choose from.
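For a sense of what putting an IDX feed to work involves, the sketch below shows the general shape of the integration in Python. The feed URL, parameters and response fields are hypothetical placeholders; real IDX feeds are delivered through each MLS's or vendor's own interface (historically the RETS standard), so the actual contract will differ.

```python
import requests  # widely used third-party HTTP client

# Hypothetical IDX feed endpoint and credentials - each MLS or feed vendor
# defines its own URL, authentication scheme and field names.
IDX_FEED_URL = "https://idx.example-mls.com/api/listings"
FEED_KEY = "YOUR_FEED_KEY"

def fetch_active_listings(city: str, max_price: int) -> list:
    """Pull active listings for one city, capped at a maximum price."""
    response = requests.get(
        IDX_FEED_URL,
        params={"status": "active", "city": city, "max_price": max_price},
        headers={"Authorization": f"Bearer {FEED_KEY}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["listings"]  # assumed response shape

def render_listing_cards(listings: list) -> list:
    """Turn listing records into simple HTML snippets a broker site could embed."""
    return [
        f"<div class='listing'><h3>{l['address']}</h3>"
        f"<p>${l['price']:,} - {l['beds']} bd / {l['baths']} ba</p></div>"
        for l in listings
    ]

if __name__ == "__main__":
    for card in render_listing_cards(fetch_active_listings("San Diego", 750_000)):
        print(card)
```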

However, it still took years for most real estate agents to incorporate listings from IDX feeds into their own websites. In 2001, the adoption rate for IDX among brokers was less than 10 percent, according to an NAR survey. By 2013, 56 percent of agents displayed their listings on IDX sites. Not coincidentally, this was nearly identical to the percentage (57 percent) that displayed listings on their own website.

That said, some pain points left unaddressed by IDX feeds include:

● Unlike the third party portals, most buyers probably won't automatically think to check a specific broker's website. IDX feeds alone don't guarantee a significant volume of leads.

● Brokers need to spend time and money finding ways to pull in web traffic to get the listing information in front of buyers (SEO, PPC advertising, etc.).

● Information provided by lead forms may contain incomplete or inaccurate information.

● Website leads may come without qualifying information, such as a potential buyer's budget.

Again, online listings may provide agents with a larger, steadier volume of leads but they also require agents to spend a significant amount of time pre-sale qualifying and sorting through junk leads.

Which brings us to the current technological moment.

Big Data: 2006 to Present

Since the web took off in the 1990s, most forms of communication have become electronic and online. Major hardware advances have also reduced the cost of computing power and memory storage and, along with the development of cloud computing, have enabled the accumulation and processing of massive amounts of data online. Advanced search and data mining techniques developed since the turn of the millennium have turned this data into new, useful Big Data products. One of these Big Data inventions was PIPL - where I work - founded in 2006, which developed what soon became the first people-oriented search engine.

Online people search took a massive leap forward at the start of this decade when the application programming interface (API) economy took off. APIs allow developers to quickly and easily connect different software or databases. This allowed companies to provide businesses with up-to-date contact information on people in real time. Soon enough, several companies formed to do just that.

PIPL created an API for this purpose as well, but drew on its search engine roots to build an algorithm that can search billions of records and identify those belonging to a single, unique person. This means that with PIPL's People Data API, users can automatically send partial information from new leads entered into their CRMs and receive the missing details within seconds.
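As an illustration of what that integration looks like from the developer's side, here is a minimal Python sketch of enriching a partial CRM lead through a people-data API. The endpoint, parameters and response fields below are assumptions made for the example, not PIPL's documented interface - the vendor's API documentation defines the real contract.

```python
import requests

# Hypothetical people-data API endpoint and key - real providers each define
# their own URL, parameters and response schema.
PEOPLE_API_URL = "https://api.example-peopledata.com/v1/search"
API_KEY = "YOUR_API_KEY"

def enrich_lead(partial_lead: dict) -> dict:
    """Send whatever fields the CRM captured and merge back what the API finds."""
    response = requests.get(
        PEOPLE_API_URL,
        params={"key": API_KEY, **partial_lead},  # e.g. nothing but an email address
        timeout=10,
    )
    response.raise_for_status()
    match = response.json().get("person", {})  # assumed response shape
    # Keep the CRM's original values and only fill in the gaps.
    return {**match, **{k: v for k, v in partial_lead.items() if v}}

# A web-form lead that arrived with nothing but an email address:
print(enrich_lead({"email": "jane.doe@example.com"}))
```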

For several years now, sales and marketing professionals have been using the handful of people-data APIs on the market to:

● Automatically verify the accuracy of lead information.

● Rapidly qualify leads based on job title and age.

● Automatically find missing and alternative lead contact information.

A number of industries outside of real estate have adopted this technology in a short time. For example, all major U.S. background report providers already integrate API-based data solutions from information providers to create their reports.

Real estate brokers have started to use these Big Data solutions as data providers have begun simplifying the technology further by launching purpose-built apps. Now, real estate agents - as well as all sorts of salespeople and marketers - can simply upload lead lists to websites instead of hiring developers to integrate an API into their database via code.

While real estate brokers have just started to adopt Big Data, the third party portals have largely ignored the lead-quality issues agents face. But if real estate history is any guide, that should change by the start of the next decade as tech-savvy brokers start outperforming their competitors in the market.

Where to invest?

For retail investors to profit from this technological change, they must find data-rich companies that are publicly traded and serve the real estate market. In real estate, there are really two groups of data companies that will benefit from the transition to API technology.

People data dead-end

The first group are companies that specialize in people data, meaning they provide contact and job information that helps real estate agents and investors qualify home-seller and home-buyer leads. There are a handful of companies that serve this niche. Unfortunately, most of the companies with the highest exposure like PIPL and FullContact are privately held and not open to investment from retail investors.

The major exception is LexisNexis, a large subsidiary of RELX Group (NYSE:RELX), formerly known as Reed Elsevier. However, LexisNexis' services are generally not priced competitively enough for most users in the real estate market, and the company's revenues are heavily oriented toward other markets and non-API users. Moreover, the company is just one subsidiary of the larger RELX Group, meaning RELX stock isn't even close to being a real estate data play.

Property data shows promise

That leaves the second data niche, which is property record data. Close to a half dozen companies gather this data from tax records, property title and court record databases. Here are two publicly-traded companies worth looking at: CoreLogic (NYSE:CLGX) and Black Knight Financial Services (NYSE:BKFS).

Both companies provide information and analytics to mortgage servicers, real estate companies, real estate lawyers and title insurance providers regarding properties, as well as lead lists based on property owner information. Both companies can and do transmit data via API.

CoreLogic: Diamond in the rough?

CoreLogic is the larger of the two, with its FY 2015 revenue roughly 65% larger than that of Black Knight, i.e. $1.5 billion vs. $931 million. In 2015, CoreLogic's revenue grew 9 percent year over year, as shown in Chart 2 above, fueled by 11% growth in its Property Intelligence division. A significant portion of this growth has been driven by CoreLogic's acquisition strategy. In the second half of 2015 alone, CoreLogic bought three companies - LandSafe Appraisal Services, RELS and Australia's Cordell Information - for a total of $236.1 million, as it seeks to firmly cement its dominance in this growing market.

As you can see in Chart 3 above, CoreLogic's Property Intelligence (PI) division, which is responsible for its property valuation and lead data products, has grown significantly over the past five quarters. While this growth has been acquisition heavy, Property Intelligence has still managed to grow its profits during this process. Importantly, this has changed the fundamental nature of CoreLogic's business from providing mortgage processing technology platforms to providing data. Year to date, the PI division is responsible for more than half of the company's revenue for the first time in its history.

More detailed analysis of CoreLogic's fundamentals and business plans is certainly warranted, as this company shows promise. It may very well be a winner in the real estate data space.

BKFS: Not ready for prime-time

The largest competitor to CoreLogic when it comes to property data is Black Knight Financial Services. Black Knight in its current form was created by the merger and reorganization of several independent business units. The core of the company was the Technology, Data and Analytics division of mortgage processor Lender Processing Services (NYSE:LPS), along with Fidelity National Commerce Velocity and Property Insight, two subsidiaries formerly owned by Black Knight's largest shareholder, Fidelity National Financial (NYSE:FNF). Thomas H. Lee Partners (THL) also holds a significant minority stake in the company.

Black Knight has been a publicly traded corporation for less than two years, having completed its IPO on the NYSE on May 26, 2015. The initial offering price was $24.50 per share. Since then, the share price has risen 55% to $38 per share as of Friday, December 16, 2016.

BKFS revenue, like CoreLogic's, grew 9% last year (see Chart 4 above). Also like CoreLogic, its property data division - Data & Analytics - saw revenue grow 11 percent in 2015, to $174 million.

There are two key differences, though, between BKFS and CoreLogic. For starters, data is still a secondary business line for BKFS, never exceeding 20 percent of total revenue over the past 10 quarters (see Chart 5 below). This means the company is still primarily dependent on providing technology services to mortgage processors and is not really a data play. Almost as important, growth in the company's Data & Analytics division has stalled over the past three quarters. This suggests BKFS is starting to experience problems with organic growth in its data products.

Both key differences mean that BKFS currently does not make sense as an investment for those looking to benefit from the shift to data-as-a-service via API technology in real estate. However, investors interested in the space should keep an eye on any announcements from BKFS management regarding potential acquisitions or changes in organic growth strategy for the D&A division. Such a move would signal that management recognizes the prevailing trends in the real estate market and could be a catalyst for significant medium-term growth in revenue and the stock price.

Next steps for serious investors

I will leave further analysis of the investment proposition posed by CoreLogic based on both company fundamentals and technical analysis to other writers or possibly a further article of my own.

But the macro trends in the real estate market suggest that long-term passive investors should be looking for companies that are leveraging the shift to API technology and Big Data. At a bare minimum, investors interested in the space should conduct further company-specific research into CoreLogic. This company may turn out to be an investment gem.

Disclosure: I/we have no positions in any stocks mentioned, and no plans to initiate any positions within the next 72 hours.

I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it. I have no business relationship with any company whose stock is mentioned in this article.

Additional disclosure: I am a product evangelist at PIPL, a privately owned technology company familiar with this market niche. However, I have no personal interest in recommending CLGX or any other publicly traded stock mentioned in this piece.
