The CRM Problem Isn't the Software
We had a CRM. We paid for it monthly. It was frequently out of date.
This isn’t a knock on the tool — we’ve used several over the years, and the problem is consistent across all of them. CRMs require input. Someone has to log the call, update the stage, note that a capital partner mentioned they’re not deploying outside the Mountain West this cycle, or that their fund is in a quiet period, or that they’ve been watching a specific market for three years and would move quickly on the right deal. That input requires discipline, and discipline erodes under friction: at the end of a busy day, after a call that ran long, before the next meeting. Six months later, the database that was supposed to give you a clear picture of your investor relationships is a patchwork of stale entries and missing conversations. And once the data is stale, you stop trusting anything it tells you.
The problem isn’t the software. It’s the input model.
What we built
We use Granola to automatically capture meeting notes. We built a layer on top of it — using Claude Code — that runs after every investor meeting without anyone doing anything.
It reads the notes, identifies which capital partners were present, extracts the relevant intelligence — investment preferences, check size ranges, geographies of interest, fund timing, anything substantive that came up — and updates a structured record automatically. By the time Preston gets to his desk the next morning, the log is current. No manual entry. No discipline required. The database stays current because the input problem is eliminated, not managed.
What a CRM field can’t hold
Our prior CRM did store information — contact records, deal history, and basic preferences. What it couldn’t store was the nuance of a conversation. And even when someone took the time to write a detailed note, you couldn’t query it. You could retrieve it if you already knew to look. You couldn’t ask: which of our investors has mentioned an openness to development plays in markets they know well, and whose fund timing suggests they’re ready to deploy? That kind of question requires language understanding across unstructured data — which is exactly what large language models are built for.
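In practice, that kind of query can be as simple as assembling every partner's narrative notes and the question into a single prompt and handing it to a model. A sketch of the assembly step, with the model call left as a comment; the function and record shape are hypothetical:

```python
def build_query_prompt(question: str, records: dict[str, list[str]]) -> str:
    """Assemble one prompt from every partner's narrative notes.
    `records` maps partner name -> list of dated note strings."""
    lines = ["You are reviewing investor meeting notes.", ""]
    for partner, notes in sorted(records.items()):
        lines.append(f"## {partner}")
        lines.extend(f"- {note}" for note in notes)
        lines.append("")
    lines.append(f"Question: {question}")
    lines.append("Answer with partner names and the evidence for each.")
    return "\n".join(lines)

# The answer itself comes from a model call, e.g. (hypothetically):
#   answer = llm.complete(build_query_prompt(question, records))
```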
The agent captures conversational texture in narrative form. It’s searchable, it doesn’t deteriorate, and it doesn’t leave.
Where it connects to deal execution
When our underwriting system analyzes a new deal, it queries the investor database and surfaces the most likely capital partners for that specific opportunity — by equity check size, geography, strategy, and the contextual signals captured from prior conversations.
The result is a prioritized shortlist based on everything we know about each relationship.
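A sketch of what that ranking might look like: hard criteria (geography, check size, strategy) carry fixed weights, and contextual signals from prior conversations add a soft bump. All field names and weights below are invented for illustration, not the actual scoring model.

```python
def shortlist(deal: dict, partners: list[dict], top_n: int = 5) -> list[str]:
    """Rank capital partners for a specific deal (illustrative weights)."""
    def score(p: dict) -> int:
        s = 0
        if deal["market"] in p.get("geographies", []):
            s += 3  # geography fit
        lo, hi = p.get("check_range", (0, 0))
        if lo <= deal["equity_check"] <= hi:
            s += 3  # equity check size fit
        if deal["strategy"] in p.get("strategies", []):
            s += 2  # strategy fit
        # soft signal: prior conversations that mention this market
        s += sum(1 for note in p.get("notes", []) if deal["market"] in note)
        return s

    scored = [(score(p), p["name"]) for p in partners]
    scored.sort(key=lambda t: t[0], reverse=True)
    return [name for s, name in scored[:top_n] if s > 0]
```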
The same infrastructure, a different problem
The same tools that enabled the underwriting agent enabled this one. Granola provides summarized meeting notes. The Model Context Protocol connects the agent to our contact database. Claude Code handled the build. Training this agent went faster: we already had a working model for what knowledge matters and how to apply it.
What’s different about this agent is that the coaching is continuous. Investor preferences change. Fund cycles turn. The agent learns from corrections as relationships evolve, which is what keeps the database genuinely current rather than accurate at launch and stale six months later.
What doesn’t change
The agent only surfaces who to call.
We are the trusted partner in these relationships — and trusted partners are built through years of communication, transparency about outcomes, and ultimately delivery on execution. The intelligence the system provides is preparation. The relationship, and the obligation to perform, is ours. What changes is how prepared we are when we pick up the phone.
Next: what the process of actually building all of this looked like — and what we’d tell a firm evaluating whether to start.

