I’ve been at the Clareity MLS Workshop in Scottsdale for the last few days and yesterday was the first day of sessions. There were sessions on:
- MLS regionalization efforts in California and Minnesota;
- NAR’s nascent Real Estate Channel or TREC (formerly known as Gateway);
- Syndication of listings; and
- Public-facing MLS listing search sites.
There were other sessions on security and web site privacy policies as well, but I won’t be covering those here.
John Mosey from RMLS-MN and Art Carter from CARETS conducted a session on the data consolidation efforts they are pursuing in their areas (Minnesota and California, respectively). As Art described the CARETS approach, they are creating a new listing repository to which all of the participating MLSs will send their data. Users also will have the option to upload listings directly to the repository. Each MLS will then be able to download the entire dataset to present it back to their users. In fact, the only access to the data will be through one of the MLS systems; there will be no front end provided by CARETS, as it is only a repository of the data.
In creating the single data set, Art described how they had to “kill some hamsters.” If you think of the fields in the current disparate data sets from each MLS as a bunch of hamsters in a cage, you have two choices for combining those hamsters: you can either build a bigger cage or you can kill some of the hamsters and keep only the best ones. CARETS chose to kill some of the hamsters and created a data set that takes the best from all of the systems in their area.
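To make the hamster metaphor a bit more concrete, here is a minimal sketch of what that kind of field consolidation can look like in practice: each MLS maps its local field names onto one agreed-upon schema, and anything outside that schema gets dropped. The field names, MLS identifiers and mappings below are all invented for illustration; CARETS has not published its actual schema here.

```python
# Hypothetical sketch of the "kill some hamsters" approach: each MLS maps its own
# field names onto one consolidated schema, and unmapped fields are discarded.
# All field names and MLS identifiers here are made up for illustration only.

CONSOLIDATED_FIELDS = {"ListPrice", "Bedrooms", "Bathrooms", "SquareFeet"}

# Per-MLS mappings from local field names to the consolidated ones (invented).
FIELD_MAPS = {
    "mls_a": {"LP": "ListPrice", "BR": "Bedrooms", "BA": "Bathrooms", "SF": "SquareFeet"},
    "mls_b": {"ListingPrice": "ListPrice", "Beds": "Bedrooms", "Baths": "Bathrooms", "LivingArea": "SquareFeet"},
}

def consolidate(mls_id: str, listing: dict) -> dict:
    """Translate a local listing into the shared repository schema, dropping unmapped fields."""
    field_map = FIELD_MAPS[mls_id]
    out = {}
    for local_name, value in listing.items():
        target = field_map.get(local_name)
        if target in CONSOLIDATED_FIELDS:
            out[target] = value  # keep only the "surviving hamsters"
    return out

print(consolidate("mls_a", {"LP": 350000, "BR": 3, "BA": 2, "SF": 1800, "PoolType": "none"}))
# {'ListPrice': 350000, 'Bedrooms': 3, 'Bathrooms': 2, 'SquareFeet': 1800}
```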
John Mosey spoke next and described how he would rather be stripped naked, dipped in honey and covered with red fire ants than ever try to merge another MLS into RMLS-MN after his failed attempt with the Southeast Minnesota Association of REALTORS. Instead, John’s approach now is to follow a model similar to the CARETS one, but using his Northstar database. John repeatedly lamented the fact that RMLS-MN represents 80% of the agents (or something like that) in Minnesota and yet has all the headaches of dealing with the smaller MLSs outside his area.
I think Gregg Larson had the best advice of all regarding regionalization, data sharing, etc., when he said that every market area is different. Sometimes data sharing makes sense, sometimes a repository makes sense, and the key is to consider the objectives you are trying to achieve and weigh those against the costs. There is no one-size-fits-all solution. Out in the hallways between sessions, I had conversations with several attendees who were wondering exactly the same thing: what are the objectives being sought in these efforts? Clearly, eliminating duplicate entry and providing easier data access for overlapping markets are laudable goals, but where is the value for markets with less overlap? There is a real sense, expressed well over at agentgenius this last week, that one unstated objective of the rush to data consolidation in some markets may very well be to consolidate power.
John and Art were followed by Chris McKeever from the Center for REALTOR Technology (CRT), who described NAR’s plans to create what they’re currently calling The Real Estate Channel (TREC), formerly known as the Gateway. As Chris described it, TREC is going to be very much like CARETS, except that no listings will be entered directly into the system; they will only be received from local MLSs. NAR also intends to purchase or license public records to build a parcel-based system like Zillow’s. Maybe they should just buy Zillow instead? Nah, that wouldn’t work, because NAR intends for this to be open to agents only, not consumers. I’ve said it before and I’ll say it again: NAR could make something of this by simply focusing on creating a standard for a universal property ID to let the web organize the information. NAR could create a non-profit that could be the ICANN for property IDs, and that would be valuable for tying together the efforts of everyone publishing real estate information on the web. Trying to cram everything into one silo is never going to work.
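Purely as an illustration of what I mean by a universal property ID, here is a tiny sketch of one possible shape for such an identifier: a namespaced string built from jurisdiction and parcel number, so any site publishing listing data could tag the same property the same way. The format, prefix and components are entirely hypothetical; no such standard exists yet, which is exactly the point.

```python
# Hypothetical universal property ID: a namespaced identifier built from jurisdiction
# and parcel number. The "prop:" prefix and component choices are illustrative only.

def make_property_id(country: str, state: str, county_fips: str, parcel_number: str) -> str:
    """Build an illustrative, URL-friendly property identifier."""
    parts = [country.lower(), state.lower(), county_fips, parcel_number.replace(" ", "")]
    return "prop:" + "/".join(parts)

def parse_property_id(property_id: str) -> dict:
    """Split an identifier back into its components."""
    country, state, county_fips, parcel = property_id.removeprefix("prop:").split("/")
    return {"country": country, "state": state, "county_fips": county_fips, "parcel": parcel}

pid = make_property_id("US", "AZ", "013", "173-36-001")
print(pid)                     # prop:us/az/013/173-36-001
print(parse_property_id(pid))  # {'country': 'us', 'state': 'az', 'county_fips': '013', 'parcel': '173-36-001'}
```

The value isn’t in any particular format; it’s in everyone agreeing on one, the way domain names let the web organize itself without a central silo.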
Next up were Saul Klein from Point2, Mark Tepper from Homescape, and Marty Frame from Cyberhomes, all of whom were advocating syndication of listings to their own and other sites. Saul’s pitch was that the listing is a marketing asset and should be promoted all over the web, similar to how agents use yard signs to promote the listing and themselves. Marty Frame had a very interesting discussion about how Cyberhomes is working on tailoring its web site experience to the various stages homeowners, buyers and sellers go through, as opposed to treating all users the same. All of the speakers were very focused on the need to collect and analyze user activity data to improve the experience for users and to understand the return on investment (ROI) from syndication. In other words, focus on which sites provide the best leads from your ads.
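For anyone who hasn’t done this kind of analysis, the arithmetic behind it is simple. Here’s a minimal sketch of the kind of cost-per-lead comparison the panel was urging; the site names, costs and lead counts are all made up.

```python
# Minimal ROI sketch: track cost, views and leads per syndication site and compare
# cost per lead and conversion rate. All site names and numbers are invented.

campaigns = [
    {"site": "portal_a", "monthly_cost": 500.0, "views": 12000, "leads": 25},
    {"site": "portal_b", "monthly_cost": 300.0, "views": 4000,  "leads": 4},
    {"site": "portal_c", "monthly_cost": 0.0,   "views": 9000,  "leads": 12},
]

for c in campaigns:
    cost_per_lead = c["monthly_cost"] / c["leads"] if c["leads"] else float("inf")
    lead_rate = c["leads"] / c["views"]  # fraction of views that become leads
    print(f'{c["site"]}: ${cost_per_lead:.2f} per lead, {lead_rate:.2%} of views convert')
```

Even a spreadsheet-level calculation like this makes it obvious which sites are worth the syndication effort and which are just traffic without leads.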
The next panel included Bob Hale from HAR.com, Jim Harrison from MLSlistings.com, and Jenny Natale from MLSLI.com, all of whom operate public-facing MLS web sites. The message was clear: MLS web sites do not compete against broker sites for traffic. Instead, MLS web sites can drive enormous traffic to broker sites. Bob Hale presented a ton of stats from Media Metrix showing that in market after market across the country, broker web sites rarely rank highly in terms of traffic. In contrast, public-facing MLS sites almost always rank highly, often higher than popular sites like Realtor.com.
Those were the big highlights for me from the first day. Kudos to Gregg, Matt, Tina and the entire Clareity team for putting on a great conference.
Oh, wait, I almost forgot: Gregg gave the FBS Blog a Clareity award for being the best blog on MLS issues. Gregg also said he thought blogging was the most over-hyped technology of 2007.