I’ve always been drawn to technology that forces us to rethink how we build. In 2016, I joined Uber at a time when mobile was reshaping how the world worked.
Suddenly, everyone carried a phone—and that changed everything about how products had to be designed. At Uber, we reimagined how people move and how they interact with services in real time. It was a rare chance to build a new platform from the ground up.
Today, AI is creating that same kind of shift, but even faster. In just a few short years, we’ve gone from the first release of ChatGPT to agentic systems that can reason and act with increasing sophistication.
It’s a thrilling engineering challenge—one that is already transforming how millions of people work, especially in industries built on knowledge. That’s what drew me to Hebbia. After my first conversation with the team, it was clear they were tackling this problem in a way no one else was.
Many industries are built on time-intensive knowledge work. Financial analysts comb through filings—S-1s, 10-Ks, 10-Qs—across portfolios of companies. These tasks, while foundational, are inefficient bottlenecks to higher-value work.
With Hebbia, you can instead ask things like: “Review the countries where my portfolio companies operate and highlight potential tariff risks.” Hebbia will review your entire portfolio and summarize these risks by company, not through simple document extraction but through an exhaustive review that cross-references every file and can benchmark the findings against industry sources.
The same is true in legal workflows. In March 2020, M&A lawyers pored over thousands upon thousands of agreements to assess how the pandemic might implicate "Material Adverse Effect" definitions.
But in the past few months, as lawyers had to assess how tariff policies might implicate key provisions of M&A and Credit Agreements, the sleepless nights of rote review were replaced by a few clicks in Hebbia.
Hebbia is redefining how people interact with information. By introducing structure into unstructured workflows, it unlocks entirely new ways to analyze complex document sets. What once took days or weeks—triaging documents, comparing clauses, validating assumptions—now takes minutes.
With Matrix, users can investigate hundreds of documents at once and ask questions that span them. Hebbia’s Chat interface brings the same power into a more conversational, long-form setting.
Solving this isn’t just about building fast retrieval or slick UIs. It’s about architecting systems that can reason over ambiguous inputs, surface reliable answers, and handle complex multi-source workflows at scale. Hebbia leads the field here—which is exactly why I wanted to be part of the team.
What truly impressed me is the way Hebbia approaches this challenge. The team pioneered retrieval-augmented generation (RAG), learned its limitations firsthand, and then advanced to a full attention-based strategy. Hebbia doesn’t just retrieve text — it reasons over it like a human would, and it shows its work. That transparency builds trust in critical workflows.
Beyond technology, Hebbia’s greatest strength is its people. Many team members were once customers, inspired by what the product could do.
The team here reminds me of what I valued most in past roles: at Uber NYC, the deep belief in the product and the drive to make it the best it could be; at CloudKitchens, the strong sense of teamwork and shared ownership across teams.
At Hebbia, that same spirit runs through the engineering culture. Our engineers work closely with customers and each other, always focused on how the product is used in the real world. That customer-first mindset makes the work here especially energizing.
The team holds a high bar for quality, but also knows how to move fast on what matters. And George, our CEO, brings clarity to how we’ll keep building the best product for knowledge work. I’m thrilled to be helping build that future.
We’re hiring across the board — if this mission resonates with you, check out our careers page. And feel free to reach out to me. I’d love to tell you more!