
This RESO recap was organized by Matt Cohen of CoreLogic, a long-time RESO advocate and contributor, as well as the RESO Staff, Paul Stusiak, Rob Larson, Chris Lambrou, and Chris Haran. Thanks to Matt and others who contributed to this report.

 


 

The 2019 RESO Spring Technology Summit was held in Boise, Idaho – a beautiful city being transformed by new development, with a wide range of entertainment options for conference goers.

RESO TODAY

We kicked off with RESO Chair Art Carter thanking outgoing CEO Jeremy Crawford for his time with RESO. Jeremy received a well-deserved, extended standing ovation. The group also showed great appreciation to Intermountain MLS, our host city sponsor, who made the event so rewarding.

Sam DeBord was introduced as the new RESO CEO and delivered “RESO by the numbers,” a look at the state of the organization. RESO now has over 850 members, and over one million practitioners have access to standards-based tools. This is the result of 20 years of volunteer work on data standards. The latest Data Dictionary, 1.7, is a breakthrough for brokers, with over 4,000 fields and enumerations and more than 200 fields in a standardized IDX payload.

RESO has an aggressive strategic plan for 2019-2020, focusing on standards creation, adoption, utilization, and leadership. The focus is to “make standards easy for you to utilize”. Efforts include Data Dictionary expansion; Internet Tracking for ROI intelligence; additional data payloads; Universal Property ID growth; a Web API update, including Replication and Push; Common Schema; Event Cataloging and Distributed Ledgers; and Broker Advisory direct outreach.

Fall conference registration is open, with the event being held in St. Louis, MO, September 9-12, 2019.

Register for the Fall Conference here!

Si Hablamos Español

Amy Gorce (CoreLogic) moderated a panel about lessons learned implementing the Data Dictionary in Spanish. Panelists included Jonelle Simmons (MFRMLS), Greg Moore (RMLS), and Michael McKay (El Paso).

Greg told the story of how five or six members got together and organized the effort, having native-speaking practitioners do the translation. It was then brought back to various markets for review and revision.

The panel mentioned that the Spanish translation of the standard is VERY comprehensive. Michael is from a market where two-thirds of residents speak Spanish, and they are working on making every display in the MLS available in Spanish. Greg is working on client-facing reports first; the goal after that is to develop a RESO Spanish data feed. Jonelle implemented a language switch for all displays.

The panelists all noted a common challenge: having a single Spanish translation does not account for the various dialects among Spanish speakers. Amy asked about the local fields that are not in the Data Dictionary, and panelists explained how they intend to address them: bringing these fields to their Spanish-speaking members for translation as well as using a translation service. Amy mentioned how this may be of interest to Canadian customers who want to see RESO Data Dictionary fields in French. Greg replied that Spanish is just a start and that French should be relatively easy. Some languages, such as Chinese and Japanese, may be more challenging due to layout considerations, but are possible as well.

Michael talked about how great this will be for foreign investors, and how MFRMLS was able to bring Puerto Rico onto their Matrix system because of this effort. The cost for the initial translation was in the tens of thousands of dollars – not much for such a huge benefit to their Members and Consumers!

Auto-populating Green Data into Listings: The Oregon Trail

Greg Moore (RMLS) and David Heslam (Earth Advantage)

Greg and David have been working together for over ten years on research, IT development, policy development, and training surrounding Green Data. RMLS led the nation by formally adopting green fields in the spring of 2007. Subsequently they have worked to achieve RESO compliance as standards were introduced for Green Fields. They then engaged in an effort to auto-populate green data into listings.

The U.S. Department of Energy was creating an accelerator team, and they joined it. What really helped was the City of Portland creating a policy requiring sellers of single-family homes to take the following steps prior to listing a home for sale:

● Obtaining a home energy score from a licensed home energy assessor.
● Providing a copy of the home energy performance report to all licensed real estate agents working on the seller’s behalf.
● Including the home energy score and the attached home energy performance report in any real estate listings.

RMLS makes an API call with an address to David’s Green Building Registry and gets data back more than 90% of the time. Challenges include using multiple address-verification services and the timing of new-construction addresses.
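The flow, as described, is a simple address-in, green-data-out lookup. Here is a minimal sketch of what such a call might look like – the endpoint, key, and response fields are hypothetical placeholders, since the session didn’t cover the registry’s actual API details:

```python
import requests

# Hypothetical endpoint and key for illustration only -- the actual
# Green Building Registry API details were not covered in the session.
GBR_URL = "https://api.example-green-registry.org/v1/properties"
API_KEY = "your-api-key"

def lookup_green_data(address):
    """Send a verified address; get back green/energy data if a match exists."""
    resp = requests.get(
        GBR_URL,
        params={"address": address},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,
    )
    if resp.status_code == 200:
        return resp.json()  # e.g., home energy score, assessment date
    return None  # no match -- RMLS reports a hit rate above 90%

data = lookup_green_data("123 Main St, Portland, OR 97201")
```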

The Climate is Ripe for Adoption: EE & Renewable Data Fields in the MLS

Craig Foley (Sustainable Real Estate Consulting Services) moderated a panel featuring Meg Garabrant (meg@neren.com, NEREN MLS) and Madeline (Maddy) Salzman (madeline.salzman@ee.doe.gov, USDOE Building Technologies Office)

Craig opened by noting that green energy is not a fad and that NAR is engaged.

Maddy talked about the home energy score system. Energy affordability is a real problem in the U.S., and more energy efficient homes are both more affordable and more reliable. They also contribute to making the energy grid more reliable. Working with the real estate industry is a great way for the Department of Energy’s Building Technologies Office to showcase the benefits of energy efficiency.

The challenge is bridging the communication gap – their agency has its own data standards. Agents might have concerns about using energy scoring – for instance, what if the information looks bad for a house they are selling? But they’re finding that buyers care about access to these kinds of data, and if the agent can communicate energy recommendations, including financing for remediation, those fears are usually not warranted.

Meg talked about running the green implementation for NEREN MLS, something she’s been doing since 2008, when NAR first came out with its field recommendations.

They’ve done various implementations over the years and she is working on implementing new field requirements in the data dictionary.

There is some concern about both risk and reliability when providing the data, but having these data auto-populated with the source clearly attributed makes it easy and eases some of the concern. Meg is working on a publication to describe the process of moving to the new green fields.

Web API Finally in Production

Amy Gorce (CoreLogic), Jon Druse (W+R Studios), and Al McElmon (CoreLogic)

Amy introduced the panel. Jon talked about developing a Ruby library for the Web API over the better part of a year; it has since been open-sourced.

Since then, downloading and integrating the data has been easy for W+R. Al said that 80% of data consumers use standard libraries, such as Apache Olingo or the Microsoft OData clients. If you’re using Java, JavaScript, or .NET, you should consider using one of the off-the-shelf libraries before writing your own. If you’re using Ruby, you may want to consider contributing to the work that W+R has done.

Jon talked about differences between vendor implementations preventing “plug and play” at this point, but he anticipates it will take much less time to add additional vendors than it did previously. He described RETS as a “chatty” spec and noted that the Web API lets him get the listing, photo, office, and agent records all at once – a huge time saver. Parallelizing queries should make things even faster.

Al agreed that the $expand feature is great for Consumers when servers support it, allowing programmers to make a single request for something that used to take multiple requests. Jon says the Web API is definitely moving us in the right direction.
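To illustrate, here is a hedged sketch of a single Web API request that pulls a listing together with its related media in one round trip. The $filter/$expand syntax is standard OData, and ListingKey and Media come from the Data Dictionary, but the service root and token are placeholders – every vendor hosts its own endpoint:

```python
import requests

SERVICE_ROOT = "https://api.example-mls.com/RESO/OData"  # placeholder
TOKEN = "your-oauth2-bearer-token"

# One request fetches the Property record AND its related Media records --
# under RETS this took separate queries per resource.
resp = requests.get(
    f"{SERVICE_ROOT}/Property",
    params={
        "$filter": "ListingKey eq 'ABC123'",
        "$expand": "Media",
    },
    headers={"Authorization": f"Bearer {TOKEN}"},
)
listing = resp.json()["value"][0]
photos = listing.get("Media", [])
```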

Broker Tech Adoption of RESO Standards

David Gumpper (Gumpper Group, WAV Group) moderated a panel including Scott Petronis (Redefy), Warren Bowley (Michael Saunders), Bill Fowler (Compass), and Dan Troup (RE/MAX).

Each of the panelists has a different view of broker technology. Dan says it’s about creating a playground where developers can come and play, and to do otherwise stifles innovation. Warren sees it from a consumer perspective, as his five MLSs consolidated and adopted standards – so he knows the data is coming back in a common format and will always be the same to support his projects.

Scott says it doesn’t matter what sort of company you are – one way or another you must have a technology team: either an in-house or external team being leveraged, or both. Scott further noted that most brokers don’t need to understand standards because they don’t have developers in-house. Dan replied that this panel wasn’t a good cross-section of the brokerage, because most small brokers outsource their technology work to vendors.

Scott says there are many companies that say they are RESO certified – but few broker technology companies are, and that’s “on us.” Our next battleground must be to address idiosyncrasies in how the standards have been implemented, or as Paul Stusiak said in the Interoperability Workgroup’s session, “nonstandard standards.” Dan says that he doesn’t think certification matters for him – he consumes certified implementations.

In what might be seen as a contrasting viewpoint, Warren later noted how difficult it is having organizations downstream from him that aren’t RESO certified. Dan said that what blocks innovation is the 600 implementations. Bill says to be a national broker is to have 600 relationships. The next frontier for him is national policies to solve his challenges with MLSs. Having to tell agents that it will take three months to get a market up and running with local data isn’t acceptable.

However, the pain points aren’t purely technical. Scott agreed with Bill on policy and talked about the challenges of getting a broker back office feed – it’s not part of policy, and he can’t consistently get the feed he needs. Scott said that “We should all be participants in creating policy – we need to be advocates.” Dan says data policy with MLSs is based on use: if your use doesn’t fall into one of the uses described in policy, it can be an issue. Bill replied that he doesn’t want to give his uses and game plan to the MLS – which is full of his competitor brokers. Dan would like brokers to get all the data as a matter of policy, but to display it, a broker would need a license.

Bill said he had to prove inside his company that RESO matters, and it’s our responsibility to bring other brokers to the table. Warren talked about asking third parties whether they use the Data Dictionary – but it’s not always possible to get companies to adopt it. Brokers need to put more peer pressure on companies to adopt RESO standards.

Scott says his takeaway is that we need to have smarter discussion about policy and brokers need to be involved. His ideal is to get Data Dictionary compliant data from all his MLSs and have third parties get it that way from him. Dan said he’s preaching to the MLSs who get it – the ones that don’t are not here. He urges MLSs to only send out the RESO Data Dictionary feed rather than in a custom format, or at least to default to it.

From a roadmap perspective at RESO, many of the problems discussed here are currently being tackled in the Interoperability Workgroup, as well as in Transport through the Common Schema Subgroup, in which the goal is to converge on a common data shape to facilitate data exchange, ingest, and integration.

RESO Contributor Awards

Sam DeBord recognized Board of Directors officers Art Carter, Michael Wurzer, Richard Renton, and Tim Dain, as well as workgroup chairs Rob Larson (Data Dictionary), Chris Lambrou (Internet Tracking), Mark Bessett (UPI), Greg Moore (Research & Development), Paul Stusiak and Scott Petronis (Web API), Rick Trevino (Payloads), Mark Lesswing and Ash Antal (Distributed Ledger), David Gumpper (Broker Advisory), and Chris Haran (Cross-Platform Interoperability). Sam also recognized many other contributors and volunteers.


Individual Contributor Award Winners included: Brian Tepfer (Data Dictionary); Gayle Ludemann, Keith Schreifels, John Thummel, and Shawn York (Internet Tracking); L.D. Salmanson (UPI); Matt Cohen and John Breault (Research & Development); Doug Shamoo (Web API); Cass Herrin (Payloads) and Eric Stegemann (Broker Advisory).

Thanks to all the volunteers who generously contribute their time and expertise working on RESO standards!

Broker Advisory Group Meeting

David Gumpper (Gumpper Group, WAV Group), Amy Gorce (CoreLogic), Sam DeBord (RESO), Richard Bellamy (Terradatum), and Eric Stegemann (TRIBUS)

David gave an overview of the broker survey being developed to identify brokers’ direct needs for better data solutions in the marketplace, with a particular focus on brokers’ direct interactions with real estate consumers.

There was a discussion on the value of Teams data. In particular, the focus was on where it can be useful for practitioners to measure performance, issues with defining Teams, MLS variance, and geographic variance related to regulation.

Richard led a discussion on the inconsistency across the market in Web API availability, performance, and adherence to a common standard, some of which is being addressed by the newly formed Common Schema group within Transport. A broad discussion covered inconsistent usage of Data Dictionary fields, which can lead to inconsistent integrations and transformations of data across MLSs. Amy noted that some technology companies do the heavy lifting by digging through past data and transforming it to current standards for production availability.

Governance of MLSs was discussed as it relates to broker access to IDX, VOW, and Broker Back Office data feeds. Access to in-production Web API feeds also touched on governance, as well as the apparent unwillingness of vendors to report lack of compliance in markets where they need data: concerns about backlash keep would-be whistleblowers from using current compliance channels.

It was suggested that RESO could provide another communication channel to accelerate adoption in production. Eric noted that while changes to improve broker access to data can be made by putting the right decision makers on the board of directors for MLSs, technology vendors are rarely able to sit in those seats and contribute to the decisions.

The Broker Advisory Group, now a year old, then discussed direction and outreach. Governance, best practices, and standards policy will continue to bubble up through this group and can be directed by RESO to the appropriate bodies to fulfill brokers’ needs.

In-house technology staff from larger brokerage companies and brokers’ technology vendors are the primary players who can raise broker concerns and address the technological challenges that must be overcome to deliver solutions. Sam asked each member of the group to provide RESO with two names of brokerage technology staff or vendors whom RESO can reach out to for feedback.

Everything You Need to Know to Succeed with Web APIs

Turan Tekin, Jay Lee, Kevin Regensberg, and Colin Clay (Bridge Interactive/Zillow)

Turan opened the interactive live polling with some levity by letting attendees weigh in on the long-standing question of how to pronounce RESO. The results were clear: rē sō. The long “e” is the undisputed winner.

Turan, Jay, Kevin, and Colin provided a basic introduction to APIs as a method of interacting with a data provider. “Good APIs are well-documented, consistent, flexible, and performant,” said Turan.

Kevin and Colin clarified that the Web API and RETS are both APIs – but RETS is real estate specific while the Web API is built on open standards created by OASIS. Some differences between the two are the way metadata are represented, and that RETS uses DMQL while the Web API uses OData $filter syntax, which can be easier to understand and work with.
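To make the syntax difference concrete, here is a small illustration using Data Dictionary field names – the example queries are ours, not from the session:

```python
# The same search -- listings at or above $300,000 with 3+ bedrooms --
# expressed both ways.

# RETS DMQL: terse, real-estate-specific operators
# ("+" means "this value or greater"; "," joins criteria with AND).
dmql_query = "(ListPrice=300000+),(BedroomsTotal=3+)"

# Web API: standard OData $filter syntax, readable without special training.
odata_filter = "ListPrice ge 300000 and BedroomsTotal ge 3"
```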

Platform considerations include costs, support, payload setup and flexibility, usage reporting, security, broker opt-in, and data-sharing capabilities. Data consumer concerns, meanwhile, include documentation, support, certification, available tooling, additional data, data quality, performance, and MLS support.

When it comes to moving from RETS to the Web API, they suggest new users go straight to Web API, but others may want to do a parallel conversion and go through an incremental roll-out. Some MLSs are aggressive, with firm deadlines aimed at curtailing RETS usage. Others roll out new feeds only in the API with legacy RETS staying (at least for a while) – or some hybrid of the two.

Kevin and Colin then discussed Replication and On-Demand queries. For replication, the pro is full control of data and system performance while the cons may include increased development and ongoing costs, and possibly stale data (if not replicating regularly). On-demand pros can include lower costs and “data from the source”, while cons include query limitations and lack of control, among other issues.

Bridge took a live poll on how people want to work with data sources: Replication led on-demand queries by a small margin (23% vs. 20%), while 57% wanted some combination of both (n=66). They mentioned the efforts underway to improve the current state of Replication through RCP-027 and RCP-028.

The good news is that standardization has come a long way, adoption has been increasing, the industry is becoming more tech-savvy, real estate tech is starting to get big investments, and RESO continues to get greater participation.

Technology companies that are focused on analytics, machine learning, and artificial intelligence often favor replication. Data needs to be ingested and processed to create derivative data, something which can’t easily be accomplished using on-demand queries.

Universal Property Identifier (UPI) Workgroup

Chaired by Mark Bessett (CRMLS)

Current workgroup updates include: releasing open source UPI reference libraries, approving an update to add UPI Air Rights to the specification, discussing integrations and continuing work on the UPI Registry. The group is continuing to explore a “Canada strategy” and beginning work on sub-property standardization.

UPI libraries are on GitHub at https://github.com/RESOStandards

There’s an idea for a registry of UPI-based property data components. Providers would register UPIs for which they have data and govern access to their data. All participants can then obtain a UPI validity score. Consumers can reference (or cross-reference) data using the registry, with endpoint details subject to provider licensing. Governance is “to be determined” – potentially it could be managed by RESO, RESO members, or consumers.

There’s a UPI demo at http://resouniversalpropertyid.azurewebsites.net

Blockchain: Making Data Transparent, Provable, and Immutable

Peter Anewalt (ULedger)

Blockchain technology creates a history of data that is mathematically provable, immutable, distributed, and tamper-resistant. It does this using cryptographic hashes and proofs and a decentralized, distributed database. The database is stored via a network, rather than a central authority.

Peter pitched using a stateless hybrid blockchain for real estate business use, rather than a public or private blockchain. Technically, this would involve multiple chains and multiple nodes rather than a single global history of data. The data would be private, with public, permissionless, immutable proofs. Nodes would collaborate without needing incentives or incurring mining costs. A RESTful API layer would provide easy access to key functions.

Standardized Data: Nurturing Lifelong Customer Relationships

Marilyn Wilson (WAV Group) moderated a panel including Eric Stegemann (TRIBUS), Cass Herrin (MoxiWorks), Sam DeBord (RESO), and Bob Evans (Realtor.com)

In an industry survey focusing on software with CRM capabilities, only 23% of respondents were members of RESO and only 15% had attended a RESO conference. While half were familiar with the Data Dictionary and Web API, none of those surveyed were RESO certified. That said, Eric’s TRIBUS product is RESO certified, though most of the third parties his company integrates with are not. Cass said that his MoxiWorks product is RESO compliant – but given the conversation at this event, he will strive for certification.

His third parties also use a variety of APIs and formats for contacts, leads, etc. rather than a RESO standard. Bob’s products work off the Data Dictionary but are not yet certified.

Cass talked about data going in and out of his platform. Contacts come from agent entry or are imported from Gmail or Outlook. Once they are in a real estate CRM, they may be shared with other products. They are focused on giving brokers and agents control over their data and want it to flow easily to other products. Eric talked about other data related to Contacts – for example, Saved Searches and saved Listings. They also pass information to transaction management systems to pre-fill contracts, and the TM platform passes information back to the CRM. Similar TM use cases are currently being worked on in the Interoperability Workgroup, which has been developing a common data shape that can be used by RESO vendors as well as third-party vendors and transmitted easily between systems.

There was consensus that we need a critical mass of vendors that are RESO certified so there can be “flip a switch” fast integrations with a variety of other platforms. Bob says we need to build on success stories and show success and show the metrics – vendors have to “drag each other” toward adoption. Eric talked about the vendors who have a “walled garden approach” and that we need brokers to stand up and say that’s not acceptable anymore.

The Common Schema workgroup has begun tackling the issues of common data shapes in order to increase “plugability” and facilitate interoperability, as well as more consumer-friendly representations of APIs and their data.

Relevant Data Dictionary Resources: Contacts, Saved Search

RESO Toolkit Workshop

Rob Larson (CRMLS), Greg Lemon (RESO), Joshua Darnell (RESO), and Paul Stusiak (Falcon Technologies)

Rob Larson provided an overview of the workgroup collaboration process and Confluence system. Members can post new topics and contribute to discussions. Members can also see call agendas in advance. Rob also showed the wiki (https://ddwiki.reso.org) – a great place for understanding the data dictionary. You can also download the documentation from https://www.reso.org/data-dictionary/

Greg Lemon walked through how to use some certification resources and how to avoid some certification pitfalls. The documents referenced during Greg’s presentation may be found at the following Google Doc links: “RESO Certification: Editing and Submitting Data Dictionary Certification Testing Results” and “RESO Certification: Creating and Submitting Web API Server Credentials”. Certification is a free benefit for all RESO members. The compliance tool can be used on staging servers or even by uploading metadata files – but final certification must be on a production server.

Josh Darnell demonstrated the RESO Web API Adapter – a cross-platform desktop client written in Java. It provides saved searches, a metadata viewer, scheduled downloads, and export to Excel. It was built using available off-the-shelf OData libraries from Apache Olingo, which greatly simplified things like working with metadata. Prior to release of the Adapter, RESO will create a UI-based “filter builder” to help non-programmers create saved searches.

Replicating data from vendors is also an important use case. Currently, there is no standardized way to do this and the current methods are often difficult and error prone, requiring frequent reinitialization of local data. The tools RESO is currently building will handle replication in the environment we have now in a more robust way, but there’s also work being done to create a simple and reliable replication standard within the Replication Workgroup, whose proposal was adopted by the Transport Workgroup during their Thursday meeting.

Josh also showed us a tool called Commander, also written in Java using the Apache Olingo library; it shares code with the Web API Adapter and does all the “heavy lifting” behind the scenes.

The Commander supports parsing and validation of metadata and provides schema validation for data as it’s being transferred from the server (with the useEdmEnabledClient option). It supports saved searches, scheduling, parallel queries, local database initialization, replication, and export to Excel. Commander has various options such as getMetadata, validateMetadata, getEntities, and saveRawGetRequest. It can also convert EDMX to Swagger (OpenAPI) in order to provide a more consumer-friendly view of the queries and data that each vendor’s Web API offers.

The Web API Commander will be available in Q2 and the Web API Adapter (the UI-based tool) in Q3/Q4 2019. Josh is happy to provide early access and welcomes contributions from other developers. Anyone interested should email Josh at josh@reso.org.

To handle local replication using the Web API, the Commander and Adapter fetch all available IDs for each given Entity by key, split the IDs into pages, then pull data for each page (in parallel) and reconcile records that may have been changed or deleted since the transfer began. This matters because initialization can take a long time, and records often change after they are initially queued. The goal of these additional checks is to prevent cases where data that have been removed from view for a given Consumer are nevertheless transferred and displayed in their systems. Issues such as these are among those currently being addressed by the Replication Workgroup.
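A condensed sketch of that strategy is below. The fetch functions are injected placeholders, not the Adapter’s actual internals – this just shows the fetch-keys, page, pull-in-parallel, reconcile sequence described above:

```python
from concurrent.futures import ThreadPoolExecutor

PAGE_SIZE = 100

def replicate(resource, fetch_keys, fetch_records):
    """Key-based replication sketch: fetch all keys, page them,
    pull pages in parallel, then reconcile what changed meanwhile."""
    keys = fetch_keys(resource)                      # 1. all available IDs, by key
    pages = [keys[i:i + PAGE_SIZE]                   # 2. split IDs into pages
             for i in range(0, len(keys), PAGE_SIZE)]

    with ThreadPoolExecutor(max_workers=8) as pool:  # 3. pull pages in parallel
        results = list(pool.map(lambda p: fetch_records(resource, p), pages))
    records = {r["key"]: r for page in results for r in page}

    # 4. reconcile: a full initialization can take a long time, so re-check
    #    which keys still exist and drop anything removed from view.
    current_keys = set(fetch_keys(resource))
    for key in set(records) - current_keys:
        del records[key]
    return records
```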

Replication Preview:

There are two RCPs related to replication (RCP-027, RCP-028) being discussed and voted on at the conference. They’re aimed at solving some of the current issues with Replication:

● It’s difficult and error-prone, in part due to the use of physical timestamps and the fact that many separate resources have to be constantly polled and correlated to fetch new or updated records.
● Initialization is complicated and processor-intensive, and people frequently reinitialize their data to fix inconsistencies.
● There’s no clear way to tell users to remove records from their systems.
● And much, much more…

To address these issues, Paul and Josh have been working with a large group of data Producers and Consumers since the Milwaukee RESO conference within the Transport Replication Subgroup, which has proposed a Log interface to provide an immutable, unified record of all events that have happened to all Entities in a given system and removed the dependence on physical timestamps by using logical timestamps in the form of sequence numbers. It’s also technology independent! People can store their “Entity Events” behind the scenes however they want using PubSub, RDBMSs, Ledgers, NoSQL, or any other persistence mechanism of their choosing. As a bonus, performance and accuracy will also be greatly improved by switching to queries by key.

To quote from the RCP:

“This resource will contain EntityEvent items ordered by logical timestamps using the EntityEventSequence field. EntityEvent entries are generated each time resource data changes occur on the Producer, meaning when resources are created, updated, or deleted. However, this proposal does not attempt to interpret or classify raw source EntityEvents into their corresponding business EntityEvents such as “Status Changed”. For the purposes of this document, EntityEvents are meant to be indicators that something has happened within the Producer’s system that may be of interest to Consumers. By representing resource change EntityEvent items in a simple and atomic way, eventual consistency can be achieved so that Consumers can play the EntityEvents back on their respective systems. If the Consumer has been granted access to HistoryTransactional, a relationship exists between EntityEvents and HistoryTransactional so they may be correlated, and additional data describing specific details of the changes on the Producer’s system may be retrieved by the Consumer.

However, as History Transactional is not a required resource, there are no guarantees at this time that historical data will be made available to Consumers. Since the EntityEvents resource can stand on its own without History Transactional being present, this proposal supports either case.” Continue Reading…
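The Consumer-side pattern the RCP implies is a simple poll for anything newer than the last sequence number already processed. A minimal sketch, assuming a placeholder service root – EntityEventSequence comes from the RCP text, but the exact resource name and final query syntax may differ in the adopted standard:

```python
import requests

SERVICE_ROOT = "https://api.example-mls.com/RESO/OData"  # placeholder
TOKEN = "your-oauth2-bearer-token"

def apply_to_local_store(event):
    """Hypothetical Consumer-side handler: replay one event locally."""
    print(event["EntityEventSequence"], event)

def fetch_new_events(last_sequence):
    """Pull EntityEvent items past our logical-timestamp high-water mark."""
    resp = requests.get(
        f"{SERVICE_ROOT}/EntityEvent",
        params={
            "$filter": f"EntityEventSequence gt {last_sequence}",
            "$orderby": "EntityEventSequence",
        },
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    return resp.json()["value"]

# Replay events in order, then persist the new high-water mark.
for event in fetch_new_events(last_sequence=1041):
    apply_to_local_store(event)
```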

Building on RCP-027, RCP-028 provides a mechanism for Producers to push EntityEvents to Consumers using the EntityEvents Resource and Web Hooks as the Transport, with room to support additional transport mechanisms in the future. “Push” has been discussed in the workgroups for some time, and the goal seems so much closer now!
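On the Consumer side, push delivery would mean hosting an endpoint that Producers can post events to. Here is a minimal sketch of such a webhook receiver using Flask – the route, payload shape, and delivery semantics are our assumptions for illustration, not the RCP’s final wire format:

```python
from flask import Flask, request

app = Flask(__name__)
last_sequence = 0  # high-water mark of applied events

@app.route("/reso/entity-events", methods=["POST"])
def receive_entity_events():
    """Accept pushed EntityEvent items and apply them in sequence order."""
    global last_sequence
    events = request.get_json().get("value", [])
    for event in sorted(events, key=lambda e: e["EntityEventSequence"]):
        if event["EntityEventSequence"] > last_sequence:
            # apply the change locally (fetch the full record by key if needed)
            last_sequence = event["EntityEventSequence"]
    return "", 204

if __name__ == "__main__":
    app.run(port=8080)
```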

Data Dictionary Workgroup

Chaired by Rob Larson (CRMLS)

Rob Larson talked about the future of the Data Dictionary: where the workgroup stood with DD 1.7 and where it is heading with DD 1.8. Ongoing initiatives such as Days on Market, Association Management Systems, and Lockboxes were also discussed.

Unfinished business was discussed, such as Reverse Prospecting, Enumerations, and Smart Home Features. New Business included Egress Windows and Terminology Required by Local Law.

Rob then described the changes to the Dictionary that came from the Internet Tracking Workgroup and pointed out that the process used by the group was exactly what he was looking for: the Internet Tracking Workgroup vetted all proposed items for the Data Dictionary and brought forward a proposal, making it easy for the Data Dictionary Workgroup to move it forward. Fractions of stories (e.g., 1.5, 1.75) and Marina in Community were also discussed. The session wrapped up with a review and discussion of “DOM and its Multiple Personalities,” a paper written by Rob about the five dominant flavors of Days on Market used in the US and Canada.

Research and Development Workgroup Meeting

Chaired by Greg Moore (RMLS)

The purpose of the group is to solicit and review submitted business cases – and the underlying business needs, opportunities, and challenges – and to identify how RESO can directly benefit the industry with solutions developed through the creation and evolution of RESO standards.

The first subject was using the real estate business rules language (REBR) for display rules rather than just to support listing maintenance. In this way, a vendor could get one feed and read the rules about which data elements may be used for which purposes.

Suggestions to accomplish this include adding OriginatingSystemID and Type (e.g., IDX, VOW, Content Maintenance) to the REBR RuleBook syntax. Mark Lesswing suggested changing “type” to “context”. Then we could add two new rule types: MAY_DISPLAY and MUST_DISPLAY. For example, “the field MemberMlsID may not be displayed” would be expressed in REBR as MAY_DISPLAY NO MemberMlsID. The group discussed whether, when referring to a field, the resource should be specified.

Another subject was MLS data feed authorization information for data consumers.

The group also discussed new-construction Data Dictionary fields: high-rise building units, quick move-ins, and to-be-built properties.

The next topic was image standards: should we have standardized naming conventions for different sizes of an image? E.g., thumbnail (small images, 40px wide), small (fit into 300×225), list43grey (for some list views, 4:3 aspect, grey background), listquad (for some list views, 1:1 aspect ratio, crop-fit), detail (large image on detail pages, 800×400), and fullscreen (full-screenable gallery, 1600×1200). Matt Cohen planted the seed for adding media-manipulation enumerations to the information attached to media/images – a topic for future meetings.

The R&D Workgroup received an update from the Distributed Ledger Workgroup. The group started with an event model to transmit changes between systems – not the data itself (e.g., price decreased from 500k to 400k), just the fact that a change had occurred. This is not just about the MLS: it involves the MLS, broker, county, lender, escrow, vendor, owner, builder, city, and buyer. Fields may include transactionid, eventsubject, system, subjecttype, entity, event, state, recorder, timestamp, version, and application. The eventsubject would use the RESO Universal Property Identifier, and the recorder would be the RESO Organizational Identifier. Event types might be things like application, appraisal, assessment, construction, contract, deed, estimate, identity, improvement, lien, listing, offer, openhouse, permit, price, referral, and status.
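Put together, a single event record using the fields listed above might look roughly like the following – a hypothetical illustration, since the Event Catalog schema was still under discussion:

```python
# A hypothetical "price decreased" event using the fields discussed.
# Note: only the fact of the change is transmitted, not the new data.
entity_event = {
    "transactionid": "7f3c9e2a-0000-0000-0000-000000000000",
    "eventsubject": "<UPI of the property>",   # RESO Universal Property Identifier
    "system": "ExampleMLS",
    "subjecttype": "Property",
    "entity": "Listing",
    "event": "price",                          # one of the proposed event types
    "state": "decreased",
    "recorder": "<RESO Organizational Identifier>",
    "timestamp": "2019-04-18T14:05:00Z",
    "version": "1.0",
    "application": "ListingManagement",
}
```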

Internet Tracking Workgroup

Chaired by Chris Lambrou (MetroMLS)

The Internet Tracking Workgroup was formed to solve a problem: tracking data (e.g., how a consumer uses a website) is recorded in silos, and it’s difficult to commingle tracking assets. The solution has been to create a lightweight, object-driven, event-style data specification. The standard leverages events, actors, objects, and sources: an event might be a detailed listing view, an actor might be a consumer and their associated information, an object might be a particular listing, and the source would be a particular website.
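For example, a single tracked event under this model might be represented roughly like this – an illustrative shape built from the four components above, not the spec’s exact field names:

```python
# One "detailed listing view" event: who (actor) did what (event)
# to which thing (object), and where it happened (source).
tracking_event = {
    "event": "detailed_listing_view",
    "actor": {"type": "consumer", "session": "anon-4821"},  # anonymized
    "object": {"type": "listing", "listing_key": "ABC123"},
    "source": {"type": "website", "url": "https://www.example-idx-site.com"},
    "timestamp": "2019-04-17T20:31:00Z",
}
```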

At its last meeting, the group deprecated “non-detailed view” and added “impression”. The group also added “eventsource” and its enumerations (map, list, and voice assistant). The group has also been working on a standard “summary report”.

Whenever we talk about tracking, there is always a discussion of consumer privacy and personal data. Enterprise tracking solutions need to honor the “do not track” browser message, focus on aggregate rather than personal data, avoid cookie-based solutions, anonymize IP addresses, and optionally be hostable on your own servers.

At the Summit, the workgroup passed a motion to sunset the “Google+” EventTarget enumeration, reviewed the first draft of the RESO Standard Summary Report white paper, gave an overview of the valid reasons for collecting end-user activity under GDPR, looked at some enterprise-level, privacy-focused tracking solutions that are gaining popularity, discussed ways to improve adoption of the spec, and held a group discussion on what the Internet Tracking Workgroup can do to help brokers with lead generation via analytics.

 

Web API (Transport) Workgroup

Chaired by Paul Stusiak (Falcon Technologies) and Steve Ledwith (eXp Realty), with contributions from Joshua Darnell (RESO)

New Business: Steve Ledwith is taking over as co-chair, with Scott Petronis stepping down.

New Business: This meeting returned to the older block seating, where participants faced inward. Some challenges were noted with this format (screen location, separation between tables), but the general consensus was to try it again at the next meeting with some modifications.

A vote was taken to change the “Web API” Workgroup name back to “Transport,” which more accurately represents the purpose of the group. The vote passed unanimously, and the change will go to the board.

Paul Stusiak led the discussion of various RCPs:

RCP-023 – Test Versioning and Multi-Valued Lookup Clarification. Enumerations are structured differently across the RESO Web API standards, and these differences require the testing rules and procedures to differ for each standard. The change: replace “A field that contains Multi-Valued lookups must make use of an Enumeration that has IsFlags=false” with “A field that contains Multi-Valued lookups must not use an Enumeration that has IsFlags=true.” The testing/compliance team is already set to make the changes, and this was just a formal vote, which passed unanimously.

RCP-022 – Lightweight Autofill API. A lightweight API would be helpful for third parties who provide autofill or auto-population of data used for MLS listings and RESO Media. Many of the people involved (like photographers and energy score / green data providers) are not code-writers, but may have “fragments” they want to submit, possibly before a listing has been created. This would be a JSON endpoint. Details are all available in Confluence. The discussion generated a lot of comments, and one deserves additional mention: Bill asked, if there’s no MLS number, how would a fragment be associated with the listing? The answer was that the proposal creates a standard way to share these listing fragments; how it is implemented is not covered. Since the address is known, the Listing Agent would use it to incorporate the fragments into the listing when it is entered or updated. Those providing the information have some relationship with the Listing Agent (a photographer, room measurement service, or others), and endpoint sharing is not part of the proposal. There was no vote today; the work will continue in regular Transport meetings along with RCPs 024 and 025.

RCP-024 – Lightweight Autofill API DD Extensions. This describes the data that accompanies RCP-022.

RCP-025 – Lightweight Autofill Schema (data shape). This describes the structure of the data proposed in RCP-022 and will be discussed in the newly formed Common Schema subgroup.
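As a rough sketch of the autofill idea, a photographer’s tool might POST a fragment keyed by address to such a JSON endpoint. The endpoint path and fragment envelope below are hypothetical (RCP-024/025 define the actual data and shape); the Media field names come from the Data Dictionary:

```python
import requests

# Hypothetical endpoint -- RCP-022 standardizes the fragment concept,
# not where or how the endpoint is hosted or shared.
AUTOFILL_URL = "https://autofill.example-vendor.com/fragments"

fragment = {
    "address": "123 Main St, Boise, ID 83702",  # the join key: no MLS number yet
    "provider": "Example Photography Co.",
    "payload": {
        "Media": [
            {"MediaURL": "https://cdn.example.com/123-main/01.jpg",
             "MediaCategory": "Photo"},
        ]
    },
}

resp = requests.post(AUTOFILL_URL, json=fragment, timeout=10)
resp.raise_for_status()
# The Listing Agent later pulls matching fragments into the listing
# when it is entered or updated.
```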

Josh Darnell proposed adding a sub-workgroup within Transport to discuss adding more structure to our data – defining container names, relationships between items, a place for local fields, and converging on definitions that already exist in the Data Dictionary, where appropriate. The subgroup will have to discuss versioning as well.

Josh mentioned there were several use cases for taking this step now, including work being done in the Interoperability Workgroup, RCP-022, and Green Fields. The group voted to form the Common Schema Subgroup under Transport.

RCP-026 – Change Default and Add Mandatory Requirements for Authentication. The wording of section 2.1.1.2, OpenID Connect Standard (version 1.0.3 referenced herein), would be changed to: “NOTE SC0-1: The majority of applicants will be required to receive the ‘OAuth 2 Bearer Token’ security classification. Unless otherwise requested, applicants will be tested with the ‘OAuth 2 Bearer Token’ rules set. Applicants may request to receive multiple classifications.” Josh Darnell suggested tabling this for further discussion to make sure the change proposal covers the appropriate use cases. The default now falls to attended authentication, while most usage is server-to-server (unattended); changing the default alone isn’t enough to satisfy the server-to-server cases, which MUST support unattended authentication.

RCP-027 – EntityEvents Resource and Replication Model. See earlier session for more details. Paul Stusiak and Josh Darnell answered questions and, after some discussion, some minor changes were proposed for voting on the final day.

RCP-028 – Push Replication of EntityEvent items. See summary in “Tools” session or later Replication section for more details. Josh Darnell provided a summary of the change proposal and took questions on it.

Cross-Platform Interoperability Workgroup

Chaired by Chris Haran (MRED)

Contribution: Paul Stusiak (Falcon Technologies), Joshua Darnell (RESO)

The purpose of this workgroup is to identify, through relevant business changes and needs, solutions that will bring together disparate systems to better serve the industry, especially practitioners and brokerages.

MRED conducted an interoperability survey to find practitioner pain points, drawing 1,234 respondents over a four-day period. Forty percent of respondents said it was extremely important for transaction management systems to “talk to each other,” and another 35% said it was “very important” – including a standard not just for data but also for sharing documents back and forth. There’s also still a lot of duplicate entry of listing data: only 24% enter it once, 60% enter it 2-3 times, some enter it 4-5 times, and a few enter it more than five times. Respondents want transaction management (TM) systems to move data in and out of the MLS. Other desired connections include CRM to TM, CRM to MLS and vice versa, as well as social media pushes from various systems. Many of the needed resources already exist at RESO, and unifying those resources across workgroups will be a priority going forward.

Paul Stusiak and Josh Darnell discussed the challenge of “non-standard standards”. There was also a discussion on locally important data (producing systems MUST have locale-specific data, and consuming systems MAY need local data). A third problem is that relationships between entities are undefined – for example, the assumption that Property has Media. We can group fields that belong together (amenities, address fields, etc.) into logical entities, but strong definitions are needed. Where and how is this relationship expressed? Systems need to exchange information simply and reliably at the syntactic level (basic data) and the semantic level (adding model structure and permitting unaided, automated interpretation). Interoperability work will have to address some of these challenges with explicit, prescriptive base entities, data-share rules, and locale additions.

A common schema would expose systems in a predictable way, while allowing people to program their systems how they want. The schema would implement RESO standards and move toward a consistent structure for expressing standards and local data. It would accelerate development for data consumers, since they won’t have to program against many different structures. There would be a consistent data shape for entities, relationships, and local fields. Common Entities might be Office, Member, Property, Media, and OpenHouse. In terms of relationships: Property has Member, Member belongs to Office, Property has Media, Property has OpenHouse. OData has at least three ways of defining the relationships:

1. Using key relationships between separate Entities
2. Collections
3. Predefined Types

Local fields are challenging. It would be nice to have a common place for local fields. These can be scoped at the Entity level, but don’t have to be the same across Systems, as long as their location is consistent.

For example:

– Property has LocalFields
– Media has LocalFields
– Property has zero or more Media
– Property has Media.LocalFields via Property.Media

A starting point for the group could be to define Property; a rough sketch of such a shape follows.
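Here is what a consistently shaped record might look like under these rules. The standard field names come from the Data Dictionary, but the LocalFields container and this exact layout are illustrative assumptions, not an adopted standard:

```python
# Standard fields live at the top level; anything local sits in a
# predictable LocalFields container at each entity's scope.
property_record = {
    "ListingKey": "ABC123",
    "ListPrice": 425000,
    "LocalFields": {                      # Property-scoped local fields
        "NeighborhoodNickname": "North End",
    },
    "Media": [                            # Property has zero or more Media
        {
            "MediaURL": "https://cdn.example.com/abc123/01.jpg",
            "LocalFields": {              # Media-scoped local fields
                "PhotographerName": "Example Photography Co.",
            },
        }
    ],
}
```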

 

Distributed Ledger Workgroup

Chaired by Ashish Antal (MLSListings), Mark Lesswing (Lesswing.com)

The purpose of this workgroup is to identify and document property lifecycle events through the Event Catalog. These events could be recorded in a distributed ledger by the industry participants to support accountability, provide instant notifications and identify rules/patterns that are valuable to real estate professionals.

Rather than diving into the “how” of blockchain, the group had reConsortia show a practical application. This app tracks real estate referrals – a $20 billion market. Until this app, nothing tracked the referrals and back-end relationships. Once the sale closes, the agent making the referral is paid. They’ve leveraged some of the work done in this workgroup to get this done. As part of that effort, they’ve created a title token for every property in the United States.

Mithra Contract then demonstrated a next-gen legal contract platform on blockchain.

Replication

Chaired by Paul Stusiak (Falcon Technologies), Joshua Darnell (RESO)

The purpose of this workgroup, a subgroup under the Transport Workgroup, is to identify replication problems and to deliver changes to the Web API standard to improve replication.

Replication is used throughout the industry to build new data structures from many MLS providers or sources, delivering data to the services that replication consumers provide.

The session had two items of business before the group: RCP-027 and RCP-028.

The session discussed the material of RCP-027 and reviewed the changes recommended by the Transport Workgroup. Additional amendments were made to the proposal, and it was put to a vote to add the changes therein to the Web API standard document. The vote passed unanimously.

The session reviewed the material of RCP-028. After discussion, the proposal was put to a vote to add the changes therein to the Web API standard document. The vote passed unanimously.

Closing

Direct participation in RESO is growing quickly, and the practical benefits of standards coming through our workgroups are accelerating. RESO conferences always sell out and are limited in size to ensure attendees can have an interactive experience with the speakers. The workgroups are designed to get everyone involved and push progress forward in person.

Tickets for the Fall Conference in St. Louis are selling quickly, so sign up now if you want to get your company involved. Because we have a business track and a standards track in the fall, you’ll want to make sure your broker technology vendors participate.

The industry is highly focused on innovation in technology, and the progress of standards continues to let all of our participants deliver greater value to their customers – professionals and consumers. Thank you to all of our attendees and contributors. If you haven’t joined RESO yet, reach out to us – we’d love to have you involved!

Register for the Fall Conference here!

 

Check Out the Full PDF Version of the Spring Conference Recap Here!

 
