Raging Regionals

Mar 14, 2007 Michael Wurzer

In my last post, I outlined three developing possibilities for toppling the current structure of MLSs in the United States: (1) emergence of one of the new listing aggregators as a national listing service; (2) consolidation of the current 700 or so MLSs into state-wide or other super-regional MLSs; and (3) transformation resulting from the antitrust litigation between the NAR and the US DOJ.

All three of these possibilities are related to each other. If super-regionals are better than the current market definitions, why not jump straight to a national MLS? Are the regionalization efforts going on right now simply a last-gasp effort by the current MLS leadership to ward off an inevitable take-over by one of the national portals? The rise of the new listing portals may be the latest incarnation of the complaints against the MLSs’ attempts to structure rules for distribution of the MLS data. As the new listing aggregators run into these rules, they complain and pursue alternative strategies. Meanwhile, even while being attacked by the DOJ through the NAR litigation, some of the biggest franchises have decided some of the listing portals are more friend than foe. Perhaps all three of these developments are creating a “perfect storm” of circumstances that will alter the MLS model forever.

While trying to keep the relationship among these issues in mind, I want to focus on each one to discuss the pain points at the heart of each issue and whether the solutions being advocated and pursued are good solutions to those pains. As the title of this post suggests, regionalization is first. I pick regionalization first for several reasons: (1) MLS regionalization is happening right now, it is not speculative in any sense; (2) pros and cons of regionalization likely also apply to a national MLS; and (3) the current formation of regionals provides a good background to the definition of the relevant markets and the influence of MLSs in those markets.

There are several root causes (or pain points) driving the regionalization efforts underway:

  • Brokerage Consolidation – Brokerages are getting bigger and bigger, which means they are covering more and more territory and having to work with many MLSs. In some markets, one brokerage might have to join a dozen or even two dozen MLSs. Brokerages spanning multiple MLSs have to enter each listing many times over to get full exposure, which takes time and money. Single entry of listings should be the goal. That brokers and agents are doing business across many MLSs has changed the game and, with the MLSs still localized, has led to frustration with the multitude of fees, rules, and data structures of the different MLS organizations and a call for their consolidation.
  • Who Gets the Leads? – Brokerages and franchises of every size are spending lots of money to develop web portals so they can be the first point of contact for the consumer. They want traffic to come to their sites and they need to aggregate listing data to do that. Aggregating listing data across a variety of MLSs is complicated today, because of the lack of standards across the many local MLSs.
  • Web 2.0 Portals – Google, Zillow, Trulia and other new entrants to the listing portal arena raise the possibility of a national system, causing progressive MLS leaders to try to get out in front of that momentum. There is real pain here as consumers can often see listings on these portals from a variety of MLSs when an agent who only belongs to one is limited. This puts the agent in the awkward position of having to use a public portal instead of the MLS to be on an even playing field with their customers.
There is little question that there is real pain, unnecessary expense, and inefficiency in the current structure of MLSs in some parts of the country. Larger urban areas are the most obvious examples. Where there are no physical boundaries or borders, such that cities run one into another, the boundaries of current MLSs become an irrational legacy. Listing aggregation should be easier and less expensive than it is. Data standards should be deeper than they are. MLS rules should be more consistent in substance and application. These appear to be givens.

What is the solution to these pains? What is the best size for a regional? Who runs it? Who sets the rules? And, important to us here at FBS, who provides the software? Many MLSs are grappling with these questions right now, using different approaches to consolidation:

  • Merger – In this model, MLSs merge together so that only one entity and one system remains. This approach has the advantage of creating a common data set, common rules, single entry, and single data feeds. Until, of course, the brokerages expand further and cross into areas that haven’t yet been merged. The “right size” question is never-ending. Is the merger activity ongoing right now simply headed toward a national MLS? There’s a lot of time and money being spent on mergers, but is that just forestalling the inevitable progress to a national system? Ultimately, market forces will determine the boundaries of the MLS, and the question is whether the market is moving faster than the MLS or MLS providers.
  • Overlay System – In this model, the localized MLSs remain intact and each provides a data feed to a separate, typically read-only, system for searching that overlays the existing systems. This approach erases some of the boundaries, but the overlay systems typically lack the depth of data, particularly for searching, that makes the local systems so powerful. Also, with the local systems persisting, the natural inclination for users is to stick with the local system and not use the overlay. This frustrates the goal of single entry, as agents soon learn they still have to enter the listing into the local system to get full exposure. The main advantage of this approach is as a band-aid or a foot in the door toward ultimately moving everyone to a single system. It’s a baby step toward merger or acquisition. The question is whether this baby step is worth the cost and energy.
  • Mutual Data Exchange – This approach involves each MLS exchanging data (hopefully using RETS, but not always) so that each local system has all the data from every MLS. It has the advantage of letting each MLS retain its independence while also providing the benefits of single entry, single data feeds, consistent (local) rules, and reduced fees. The disadvantage is that each vendor has to deal with a feed from each MLS in the exchange, which is complicated, time-consuming, and costly, given the lack of standards.
  • Data Repository/Independent Front-Ends – In this model, the various MLSs provide a data feed to a “repository” database, or the data is entered directly into the repository, which can then be used for common distribution of data feeds and searching, while the local MLSs retain choice in the “front end” (user interface) they want to offer their members for searching and client management. This is a more sophisticated version of the merger or acquisition approach: the data is still merged into a single database; the difference is that local MLSs retain choice in the software they use to access the data.
Of these approaches, which is best? The easy answer is, “it depends.” It depends on the local market and the market demands. In this fast-moving environment, however, “it depends” is not good enough. There may not be time to allow market forces to provide guidance. In Founders at Work, Charles Geschke of Adobe likened Adobe’s success to duck hunting, where you need to shoot slightly ahead of the bird (the market) if you want to hit it. Has the MLS bird already flown away? I don’t think so. I still believe we can solve the pains of the brokers, agents, and consumers, and I believe the answer is in standards.
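The repository model described above comes down to one idea: normalize each local feed into a shared schema before storage, so any front-end can query one database. Here is a minimal sketch of that idea; the MLS names, field names, and mappings are all invented for illustration, not taken from any real feed:

```python
# Hypothetical sketch of the repository model: each local MLS supplies
# a feed in its own vocabulary, a per-MLS mapping normalizes it into a
# shared schema, and any front-end can then query the one repository.

# Per-MLS field mappings (names invented for illustration).
FIELD_MAPS = {
    "lakes_mls": {"LP": "list_price", "BR": "bedrooms", "Addr": "address"},
    "metro_mls": {"ListPrice": "list_price", "Beds": "bedrooms", "StreetAddr": "address"},
}

repository = []  # the single shared database

def ingest(mls_id, raw_listing):
    """Normalize one listing from a local feed into the repository schema."""
    mapping = FIELD_MAPS[mls_id]
    record = {"source_mls": mls_id}
    for local_name, value in raw_listing.items():
        standard_name = mapping.get(local_name)
        if standard_name:                      # unmapped local fields are dropped here;
            record[standard_name] = value      # a real system would extend the schema
    repository.append(record)

def search(max_price):
    """Any front-end can run the same query against the one repository."""
    return [r for r in repository if r["list_price"] <= max_price]

ingest("lakes_mls", {"LP": 250000, "BR": 3, "Addr": "101 Elm St"})
ingest("metro_mls", {"ListPrice": 189900, "Beds": 2, "StreetAddr": "44 Oak Ave"})
print([r["address"] for r in search(200000)])  # → ['44 Oak Ave']
```

The mapping tables are the hard part, of course; the code is trivial once the standard names exist, which is exactly the argument for standards below.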

Our industry (the NAR, MLS organizations, MLS vendors, IDX vendors, and others) in fact anticipated these issues long ago and has worked very hard on a solution through RETS, the Real Estate Transaction Standard. The RETS working group is now working on version 2.0, which has the potential to revolutionize the way listing data is collected. The industry has often grappled with how to explain the value of RETS to the real estate community, and now I believe the value is clear: RETS is a big part of saving the MLS from otherwise certain death.

With broader and deeper standard listing definitions, single entry is far easier to accomplish. A listing can be entered in one system and moved easily to a national repository or to another system. Aggregating the data also becomes dramatically easier, whether for MLS use or for display on listing portals. One of the biggest weaknesses of the merger and overlay approaches discussed above is that you end up with a watered-down system that doesn’t capture the depth of data necessary to accurately reflect local market realities.

We’ve installed over one hundred MLS systems, and it never ceases to amaze me how different one MLS data set is from the next. Often, these differences are the result of opinion or local vernacular. In other words, the data isn’t actually different, but it is described and understood by the users differently. In these cases, the data cries out for standards. In other cases, the difference in the data structures is necessary to properly reflect the value drivers of the local real estate. The axiom that real estate is local has real meaning when it comes to data standards. Each community has unique data requirements that have been painstakingly created by the local MLSs, and those data structures have enormous value (due respect to the Bloodhound and others who’ve pointed out the value of the deep data in current MLS systems). The RETS standards, however, present an opportunity to maintain the robustness of this local data while also providing deep and broad listing definitions that ease the movement of data to wherever it needs to go (more on this in a few paragraphs).
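The vernacular problem can be made concrete: often the same fact is simply recorded under different labels, so what’s needed is a translation table, not new data. A small illustration (every local term below is invented as an example):

```python
# Illustration of the "local vernacular" problem: different MLSs record
# the same bathroom fact under different labels. A lookup table maps
# each local term to one standard value (all terms here are invented).
STANDARD_BATH_TYPES = {
    "3/4 bath": "three_quarter",
    "three-quarter bath": "three_quarter",
    "shower bath": "three_quarter",
    "1/2 bath": "half",
    "powder room": "half",
}

def normalize_bath(local_term):
    # Unknown terms are surfaced rather than guessed at, so the domain
    # experts can extend the table instead of silently losing data.
    return STANDARD_BATH_TYPES.get(local_term.lower(), "UNMAPPED: " + local_term)

print(normalize_bath("Powder Room"))   # → half
print(normalize_bath("3/4 bath"))      # → three_quarter
print(normalize_bath("garden bath"))   # → UNMAPPED: garden bath
```

Where the data really is different, as with locally unique value drivers, no table like this suffices, and that is where extensibility (below) matters.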

The work on the MLS listing payloads (how a listing is defined) is ongoing right now. MLS organizations, MLS vendors, at least one listing aggregator, and others will be working to finalize the standards over the next six months. The next meeting is in April, and work is under way right now to create what I like to call a RETSipedia, which will expose the current payloads on the web so that the domain experts (brokers, agents, and MLS executives) can provide comments and feedback. (For those brokers, agents, and MLS executives reading this: if you’re not familiar with RETS and believe in saving the MLS, keep an eye on RETS.org, because you’re going to be asked to participate and help define the standards.)

Whenever I talk with industry veterans about creating agreement on broad and deep data standards, a common response is, “do you remember RIN?” RIN (the REALTOR® Information Network) was an attempt in the mid ’90s to get ahead of the technology movement and create a national listing standard.

(Side note: I often hear people, including brokers and agents, talk about how slow-moving the real estate industry is with regard to technology. This is hogwash. The real estate industry leaders have, for the most part, been pushing technology harder and faster than the technology is growing. The leadership often doesn’t get this credit, but it is deserved. MLS systems were invented by the real estate leadership, and those systems have revolutionized the purchase and sale of real estate. Many REALTORS® are huge geeks and gadget freaks, and those who paint them as dinosaurs with a broad brush are mistaken.)

RIN failed (well, actually, it morphed into Realtor.com) because the process of defining the listings produced tens of thousands of data elements. Literally. As input was collected from all the various MLSs, the different data elements became a huge morass. The challenges of that dialogue could not be overcome at the time, and the memory of those painful discussions causes industry veterans to cringe at the thought of trying to reach broad agreement. The beauty of the RETS 2 standards, though, is that standardization and customization can be approached at the same time. The X in XML stands for eXtensible: the payloads will remain extensible and can be tailored to local needs. This is a critical requirement for maintaining the value of the local data sets.
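The extensibility point can be shown in miniature. None of the namespaces or element names below come from an actual RETS 2 payload; they are invented purely to illustrate how a standard core and a locally defined extension can coexist in one XML document via namespaces:

```python
import xml.etree.ElementTree as ET

# Invented namespaces and element names, for illustration only.
STD = "urn:example:listing-standard"    # hypothetical standard payload
LOCAL = "urn:example:lakes-mls-local"   # hypothetical local extension

listing_xml = f"""
<Listing xmlns="{STD}" xmlns:local="{LOCAL}">
  <ListPrice>250000</ListPrice>
  <Bedrooms>3</Bedrooms>
  <local:LakeFrontage units="feet">120</local:LakeFrontage>
</Listing>
"""

root = ET.fromstring(listing_xml)
# A national consumer reads only the standard elements...
price = root.findtext(f"{{{STD}}}ListPrice")
# ...while a local system still sees its own extension, untouched.
frontage = root.find(f"{{{LOCAL}}}LakeFrontage")
print(price, frontage.text, frontage.get("units"))  # → 250000 120 feet
```

The standard consumer never has to know the local extension exists, and the local MLS loses nothing by adopting the standard core, which is the whole escape route from the RIN morass.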

At the same time, a common language and understanding must be developed on every element possible. Disagreeing about the definition of a bathroom is no longer acceptable. If we are not the ones to define these standards, who is? Brokers, agents, NAR leadership, MLSs, and MLS vendors have led the real estate industry for many years, and we need to continue to do so and recognize that data standards are critical to our future success.

RETS 1.x has been a success, but that success has been limited by the lack of depth in the common names. Light IDX sites can operate okay with the common names but, more often than not, what’s necessary is retrieving what is known as the compact or compact-decoded format, which is really just a delimited custom data file that requires custom programming. In other words, RETS 1.x provided a common way to retrieve the data but not a standard for what the data would be once retrieved. RETS 2 has the potential to change that by providing broad and deep data definitions.
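To make the 1.x limitation concrete: a compact-decoded response names its own columns, so while the retrieval envelope is standard, every consumer still writes custom code around whatever fields a given MLS happens to return. A rough sketch of parsing one such response; the envelope shape (DELIMITER, COLUMNS, DATA) follows the 1.x pattern, but the column names and values are examples, and real responses vary by server:

```python
import xml.etree.ElementTree as ET

# A simplified RETS 1.x compact-decoded response. The envelope is
# standard; the column names are whatever this particular MLS defined --
# and that is exactly the problem the post describes.
response = """<RETS ReplyCode="0" ReplyText="Success">
<DELIMITER value="09"/>
<COLUMNS>\tListPrice\tBeds\tCity\t</COLUMNS>
<DATA>\t250000\t3\tFargo\t</DATA>
<DATA>\t189900\t2\tMoorhead\t</DATA>
</RETS>"""

root = ET.fromstring(response)
# DELIMITER carries the delimiter's character code in hex (09 = tab).
delim = chr(int(root.find("DELIMITER").get("value"), 16))
columns = root.find("COLUMNS").text.strip(delim).split(delim)
rows = [
    dict(zip(columns, data.text.strip(delim).split(delim)))
    for data in root.findall("DATA")
]
print(rows[0]["City"])  # → Fargo
```

The parsing is mechanical; the custom programming begins the moment you ask what “ListPrice” or “Beds” means on the next server over, which is the gap RETS 2’s payload definitions aim to close.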

Once the data is defined, the technology solutions will come much more easily. MLS vendors and other technology firms serving the real estate industry will create amazing things with standard data. Also, with nationwide data standardization driving the change, the need for painful and disruptive mergers of many MLSs may dissipate, or at least there will be an alternative strategy. Any MLS considering the painful process of jamming multiple, disparate data sets together into a single database outside the RETS process should think twice. Instead, participate in RETS and develop the standard with the community, so that when this wave of change passes, we have a true standard and not a continuation of the old disparities, just in larger, more inflexible packages. Big MLSs are a solution only if they come through a national standard.

Importantly, however, standard data definitions are not the only pain here. The disparate MLS rules, aggregation policies, IDX policies, and the like need to be standardized, too. NAR has been beaten back so many times by the local MLSs on these issues that it cringes at the thought of trying to “mandate” a policy. Now, with the DOJ jumping down its throat, the NAR is in no position to mandate anything. But standards are needed on MLS policies, too. Perhaps this is where a state-wide effort makes sense. On the other hand, for data distribution policies (IDX, ILD, etc.), national standards are critical. Slowly but surely, we’re now seeing sold information on broker portals. When will this be a national standard? With the NAR out of the game because of the DOJ litigation, the brokers need to step up to the plate and create agreement on these issues. The success of the MLS depends on it.

Of course, the issue of data distribution policies, while definitely related to regionalization, is a topic unto itself, involving both the Web 2.0 challengers and the DOJ litigation. So, until next time . . .