AI in Real Estate Is Getting Practical: Key Takeaways from RETCON 2026

6 min read • March 18, 2026

The Outcome Team

If you attended RETCON even two years ago, you probably remember a lot of sessions that felt like orientation week for AI. What is it? Should we be paying attention? Is it a threat or an opportunity? That era is over.

This year, the question of whether AI in real estate was viable was not on the agenda. The people on stage were wrestling with how to deploy it at scale without losing control of their data, their processes, or their competitive edge. That shift was not confined to one room or one track. It was the throughline of the conference. Whether the topic was deal-making, asset management, operations, or technology strategy, the same themes kept surfacing: workflow clarity, data security, governance, and the relentless pressure to show ROI faster.

We came away with a clear sense of where the industry’s head is at right now. Here is what stood out.


The “Pilot Purgatory” Problem Is Real, and People Are Done With It

One of the sharpest observations came from Tony McGibbon at First Line Software, who pointed out that the pilot lifecycle has compressed dramatically. A year ago, pilots took months to build and months to evaluate. Now they take days. The bar has moved: if a solution cannot demonstrate ROI during the pilot itself, the conversation is over.

That expectation was echoed across conversations throughout the conference. The industry spent a few years giving vendors the benefit of the doubt on long evaluation timelines that rarely delivered clear outcomes. The new standard is speed to value. Pick the right solution, define what success looks like upfront, and measure it fast. If it works, scale it. If it does not, move on.

The corollary to this is that organizations need to know what they are actually trying to solve before they start. Greg Carey at The RMR Group described their approach to AI in deal screening and underwriting: they started from what their asset management and acquisitions teams actually needed, then worked backward to figure out what data and tools could serve those needs. That kind of clarity — starting with the business outcome, not the technology — is what separates the firms making real progress from the ones still running pilots with no finish line.

You Cannot Automate What You Have Not Mapped

This was probably the single most consistent theme at RETCON this year, and it deserves more attention than it typically gets.

Dusti Wofford from Trammell Crow put it plainly: if you do not know your processes, you will not excel at AI. Full stop. Her team systematically mapped their organization, from property operations to corporate functions, to identify where work was redundant, where handoffs broke down, and where automation would actually move the needle. That exercise is not glamorous, but it was a message that resonated well beyond her session.

The same logic applies to data. RMR is building toward a world where AI handles preliminary and full underwriting before a deal ever reaches a human analyst. That is an ambitious goal, and the reason they can credibly pursue it is that they have spent years building a proprietary data foundation and understanding exactly how information flows across their investment and operations teams. When their deal makers push back on an AI recommendation, there is a documented reason for it. That feedback loop is only possible because the process underneath it is well-defined.

The point is not that every firm needs to do a multi-year data infrastructure project before they touch AI in commercial real estate. It is that the firms getting the most out of it right now are the ones who understood their workflows first. That came through loud and clear across the conference.

Your Data Is Your Differentiation — Protect It

This came up with real urgency across multiple conversations, and it was not abstract. Dusti Wofford shared a live example: her deal team was using personal Claude plugins to experiment with underwriting automation because they had seen tutorials circulating on LinkedIn. She had to have a conversation with them about what that actually means. Anything running through a consumer or beta AI product is potentially training that model. Your underwriting methodology, your deal logic, your operational processes — you are giving that away for free.

Lease abstraction, loan summarization, and OM analysis are genuinely good places to start. They are high-volume, time-consuming tasks with clear accuracy benchmarks. The use cases are real and the productivity gains are meaningful.

But the solution you use matters enormously. The question to ask any vendor is simple: does your model train on our data? If the answer is yes, or even unclear, that is a problem worth taking seriously. The firms moving fast without asking that question are quietly handing over their competitive edge.

The Data Quality Debate Is Closer to Resolution Than You Think

Real estate has always had a data problem. Fragmented systems, inconsistent formats, siloed ownership, and a heavy reliance on paper-based processes have made it genuinely difficult to use data at scale. For years, the assumption was that you needed clean, structured data before AI could do anything useful with it.

That assumption is being tested. Armel Traore dit Nigan at Principal Asset Management made a point that generated real debate on stage: models are now being trained on enough information that they may be able to do meaningful work on messy, unstructured real estate data within a matter of months. His view two days earlier had been the traditional “garbage in, garbage out” position. He updated it in real time.

Not everyone agreed. Wofford pushed back on the hallucination risk in complex development scenarios, and that skepticism was shared by others in the room. The emerging consensus, though, was that AI’s ability to handle noisy data is improving faster than most people expected. What felt like a 12-month problem a year ago may be closer to a 3-month problem now. That does not mean the work of cleaning and standardizing your data is irrelevant. It means the urgency to start has gone up, not down, because the window to build a proprietary data advantage is narrowing fast.

This is an area where purpose-built solutions have a meaningful edge over general-purpose AI. Outcome’s model is designed specifically for commercial real estate workflows — built to work with the kinds of data real estate firms actually have, not just the data they wish they had.

The Governance Gap in Real Estate AI

The most honest moment of an AI implementation panel came near the end, when a question surfaced that nobody had a clean answer to: once AI is generating insights and recommendations at scale, how do you govern it? How do you make sure the same question gets the same answer across the organization? How do you audit it? How do you maintain accountability when the output is AI-generated?

It was a question that could have been asked in almost any session at the conference. Multiple panelists shared live examples of vendors going outside their contracted scope, AI agents making unauthorized changes to live platforms, and employees using unsanctioned tools because a competitor posted a tutorial on LinkedIn. The governance structures are lagging behind the pace of adoption, and everyone at the conference seemed to know it.

The firms building answers now — monitoring programs, clear vendor contracts, documented processes, human review loops — will be in a fundamentally better position than the ones who figure it out after something goes wrong. That transition from “move fast” to “move fast with guardrails” was arguably the defining tension of RETCON 2026.

What It Means for Where We Go Next

RETCON 2026 confirmed something the industry has been circling for a while: the question is no longer whether AI belongs in commercial real estate. It is how to make it operational at scale. The technical capability largely exists. The gap is in the workflow foundations, the data governance, and the organizational alignment to put it to work in a way that actually sticks.

The firms leading the way are not necessarily the ones with the biggest budgets or the most sophisticated tools. They are the ones who got serious about their processes first, protected their data, and demanded results fast enough to justify the investment. That is the standard the rest of the industry is catching up to.

Outcome attended RETCON 2026. These takeaways reflect the sessions and conversations we were part of throughout the conference.
