Customer Intelligence

4 stars and frustrated | time to move beyond surveys and sentiment

By Joel Passen
December 28, 2022
5 min read

Whether it’s a positive review or a scathing complaint, customer feedback is critical to the success of every business. It’s a window into the experiences buyers seek and a way for B2B software companies to improve their products, processes, and relationships.

Customer feedback is information given by your customers about the quality of your products and services. Are you meeting customer requirements and delivering value? Whether good or bad, there is no better and more reliable data source about your company than customer feedback.

With B2B buyers demanding more B2C-style experiences, it’s never been more critical to keep up with the changing needs of buyers and users. Unfortunately, many teams still rely on yesterday’s tools to solve today’s challenges. 

To date, most companies have relied heavily on surveys to gather feedback. Others have coupled surveys with analytics tools that analyze customer sentiment. Unfortunately, both surveys and sentiment analysis fail to provide the necessary depth of qualitative data to build deeper customer relationships. Simply put, surveys and sentiment are often subject to broad interpretation. 

Today’s most competitive B2B SaaS companies are putting deeper contextual insights about their customers to work. They are doing this by layering them into operations, processes, metrics, information flows, etc., to enable every team to make decisions based on specific, actionable signals. We’ll explore this more later.

Surveys are still the status quo

Let’s face it, surveys are a relatively simple and inexpensive way to collect customer feedback. However, Forrester reports that surveys capture between 2% and 7.5% of customer interactions.


Given the importance of understanding our customers, SaaS businesses must expand their approach to collecting and curating customer feedback. This starts with expanding the data sources teams use to operationalize insights across the business.   

Easier said than done. To date, B2B SaaS businesses haven’t invested heavily enough in tools and technologies to help them better understand their customers. Today, leaders still struggle to create a complete picture of customer needs, frustrations, and intent. To a large extent, this is due to a reliance on surveys.

While many of us can’t rid ourselves entirely of surveys, they continue to fall short for these reasons:

  1. Surveys are a backward-looking tool in an era where customers expect near real-time remedies.
  2. Survey results are often ambiguous, failing to reveal the cause of customer frustration.
  3. Survey data is often seen as unreliable and not contextually substantive enough to drive real business impact.
  4. Surveys are often answered by users with exceptionally positive or negative experiences.
  5. Survey responses are limited to structured questions, so respondents cannot provide feedback about topics that are not covered. 
  6. Surveys require significant customer time and effort and can be considered annoying.

Customer surveys are just one tool in the burgeoning field of customer intelligence. Sturdy defines it as the process of collecting and analyzing customer data from internal and external sources to unlock customer insights. Recently, many have turned to sentiment analysis to gain a deeper understanding of the consumer mindset. Sentiment analysis insights gathered from different sources lead to improved product features, pricing, customer experience, and overall customer satisfaction. 

Sentiment alone is… OK

Many companies are running sentiment analysis on their product or customer service feedback. But as with surveys, this isn’t enough. Sentiment analysis gives you a binary answer (good/bad) or, at best, a slightly extended range (terrible/bad/OK/good/great).

Sentiment analysis requires machines to be trained to analyze and understand emotions as people do. But human language, in all its intricacy and complexity, cannot be sorted into just three buckets (positive, negative, and neutral). For example, let’s say we determine that 68% of customers have a negative impression of our product. That still leaves us with many unanswered questions: Do we change the pricing? Do we make UX adjustments? Without more specific insights, we’re left, once again, to go with our guts. Think survey results.

Let’s put it differently: if 68% of your customers are expressing negative sentiment, you need to understand why the customer feedback is so negative. Your team will need contextual clues to solve this level of dissatisfaction. The answers are probably right there; you just need the qualitative layer below the actual sentiment. 

Once you understand the qualitative data, you can design better products, adjust processes, and build better relationships based on specific data points that need less interpretation. To do this, companies are leveraging next-generation AI, NLP, and ML technologies that provide deeper, actionable insights about their customers. 

Tapping a new source of customer feedback

Customer insights programs are more successful when customer data and feedback are gathered from multiple sources, giving a more complete, diverse look into customer needs and impressions. Companies realize that customers constantly send signals that help us predict churn, capture references, get in front of renewals, prioritize features, and run our businesses better. Customers give us this information every day, in Slack, email, Salesforce, webinars, training sessions, quarterly business reviews, Zoom calls, and more.

Customer Signal
(noun) A gesture, action, or transmission delivered intentionally or unintentionally by a customer that conveys information, instructions, or insights. 

For B2B SaaS businesses, these signals are immensely valuable. For example, reducing churn from 10% to 9% in a $10 million ARR business means that every customer is worth $17k more in lifetime value (500 customers, $20k annual contract value). And reducing churn in this example is saving just 5 customers a year. 
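As a sanity check on the arithmetic, here is a minimal sketch using the common simplification LTV ≈ ACV / annual churn rate. Under that simplification the uplift works out to roughly $22k per customer; the $17k figure above presumably folds in additional assumptions (such as gross margin or discounting), so treat these numbers as illustrative, not as the article's exact calculation.

```python
# Illustrative LTV math under the simplest model: LTV = ACV / annual churn rate.
# The article's $17k figure likely layers in extra assumptions (e.g., gross margin).

customers = 500
acv = 20_000                       # annual contract value
arr = customers * acv              # $10M ARR

ltv_at_10 = acv / 0.10             # $200,000 lifetime value per customer
ltv_at_9 = acv / 0.09              # roughly $222,222

uplift_per_customer = ltv_at_9 - ltv_at_10    # roughly $22,222
customers_saved = round(customers * (0.10 - 0.09))  # 5 customers a year

print(arr, round(uplift_per_customer), customers_saved)
```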

Examples of Customer Signals

Identifying, classifying, and escalating customer signals to the right people at the right time empowers companies with information and insights to preempt issues before they spiral and seize revenue opportunities in time to improve the bottom line. 

For example, when a customer asks, “Can I have a copy of our contract?” in a support ticket, a signal is being sent. In a SaaS environment, the customer is likely signaling risk. Maybe they are evaluating a competitor. Perhaps there has been an executive change or a shift in priorities. Regardless, every SaaS leader will agree that this signal needs to be escalated so action can be taken. 
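To make the idea concrete, here is a deliberately naive, hypothetical sketch of rule-based signal detection. The signal names and trigger phrases are invented for illustration; a production system (including Sturdy's) would use NLP models rather than keyword matching, but the core idea is the same: map customer language to named, routable signals.

```python
# Hypothetical rule-based signal detection. Signal names and phrases are
# illustrative only; real systems rely on trained models, not keyword lists.

SIGNAL_RULES = {
    "churn_risk": ["copy of our contract", "cancel", "terminate", "switching to"],
    "expansion": ["add more seats", "roll this out to another team", "upgrade"],
    "feature_request": ["would be great if", "any plans to support"],
}

def detect_signals(message: str) -> list[str]:
    """Return the names of any signals detected in a customer message."""
    text = message.lower()
    return [name for name, phrases in SIGNAL_RULES.items()
            if any(p in text for p in phrases)]

print(detect_signals("Can I have a copy of our contract?"))  # ['churn_risk']
```

Once a message is tagged with a signal like this, routing it (escalate churn risk to the account owner, file expansion with sales) becomes a workflow problem rather than a discovery problem.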

Below are a few other examples of customer signals. This is not an exhaustive list; what counts as essential will vary from company to company. An interesting exercise is to sit down and list the signals your teams should be watching for. The output of this exercise can be used to improve operations, user experience, training workflows, and more.

Feature requests

Customer signals help us understand our customers better than surveys and sentiment alone. By defining and leveraging signals at scale, we can clearly understand if our products are delivering the value promised at the time of the sale. We can also better understand if our customers are willing to grow with us or are growing away from us. 

“B2B companies historically lag behind their B2C counterparts in adopting and deploying commercial analytics, but the ones who engage with the tools already outperform their peers; their return on sales is up to five percentage points higher than that of their counterparts.” (McKinsey)

New analytics tools like Customer Intelligence platforms reveal opportunities for cross-functional collaborations. And the insights often have significant implications for non-sales teams. Rapid advancements in technology, especially AI, are making it easier to help brands quickly and responsibly use data to understand customer behaviors and predict customer needs. We can better anticipate future decisions when we discover new patterns and insights in our data. Ultimately, going beyond surveys and sentiment by leveraging customer signals presents opportunities and incentives to deliver better service and find new ways to grow.

Similar articles

AI & ML

Your AI isn’t the problem. Your data is.

Joel Passen
May 6, 2026
5 min read

IT leaders may have resisted AI early, but that phase passed quickly. The real concern wasn’t whether to use it. It was how to control it. Governance, security, visibility. In the end, it came down to preventing sensitive work from being done in personal accounts. Reasonable.

So they got comfortable, signed off, and rolled it out. ChatGPT, Copilot, Claude, company-wide, with guardrails.

People are using it. That part worked.

The disappointment

The problem is what revenue leaders are finding now that it’s live.

The data they actually want to use isn’t accessible in any meaningful way. And that matters more than most people realize, because LLMs are only as useful as what you put in front of them. They’re exceptional at reasoning over structured, coherent information. They’re not designed to reconcile fragmented, inconsistent data spread across a dozen systems.

Nobody’s model is.

So instead, people compensate.

They cut and paste. Drop in exports. Upload a batch of emails and call transcripts, and hope coherence comes out the other side.

It doesn’t. They get fragments. Plausible-sounding ones, but fragments.

The diagnosis

What commercial leaders are running into isn’t a model problem. It’s a data problem.

The data they actually care about isn’t unified. It lives across email, Slack, Zoom, support tickets, calls, and CRM notes. Different systems. Different formats. No shared identity. No relationship context.

Even with connectors. Even with MCPs.

Because underneath it all, the data isn’t organized in a way a model can reason on. There’s no canonical view of the world.

The model doesn’t know that the same person shows up in Zoom, Slack, Zendesk, and Salesforce. It doesn’t understand that those interactions belong to the same thread, the same account, the same moment in a relationship.

So it fills in the gaps.

Not because it’s weak. Because it has to keep trying.
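A minimal sketch of the missing identity-resolution step, assuming (for illustration only) that a normalized email address is enough to link records across systems. Real resolution needs far richer matching, but even this toy version shows the shape of the problem: four records, one person.

```python
# Toy identity resolution: link the "same person" across Zoom, Slack, Zendesk,
# and Salesforce by normalized email. Names, records, and fields are invented.
from collections import defaultdict

records = [
    {"system": "salesforce", "name": "Dana K.",  "email": "Dana.K@acme.com"},
    {"system": "zendesk",    "name": "Dana Kim", "email": "dana.k@acme.com"},
    {"system": "slack",      "name": "dkim",     "email": "dana.k@acme.com "},
    {"system": "zoom",       "name": "D. Kim",   "email": "dana.k@ACME.com"},
]

def canonical_key(record: dict) -> str:
    """Normalize the identifier the systems disagree on."""
    return record["email"].strip().lower()

people = defaultdict(list)
for r in records:
    people[canonical_key(r)].append(r["system"])

print(dict(people))  # one person, four systems
```

Without this upstream step, each of those four records looks like a different person to the model, and it starts guessing.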

The gap

Meanwhile, the models themselves have gotten amazingly powerful. Reasoning is sharper than it’s ever been and getting better daily.

But the data layer most companies are feeding them? Still immature.

According to MIT’s 2025 State of AI in Business, over 80% of companies have explored or deployed LLMs, but only around 5% are seeing meaningful business impact.

High adoption. Low transformation.

That’s not a model problem.

What’s possible

What it looks like when this actually works is different.

Not dashboards. Not reports. Not exports.

A conversation. Like having the best revenue ops analyst you’ve ever worked with on call, one who has read every email, sat in on every call, and never forgets anything.

You ask: “Which accounts have shown signs of churn risk in the last 90 days?”

And instead of a guess, you get a ranked list. Accounts. ARR. The exact messages where the signal showed up. What changed. What triggered it. What to do next.

So you ask a follow-up: “Which of these are new customers?”

Now you’re looking at onboarding breakdowns. Common threads. Where the process is failing.

So you keep going: “Where are we missing expansion opportunities?”

And it surfaces accounts where someone said, “We’re thinking about rolling this out to another team.” But nothing was logged. No opportunity created. No follow-up.

That’s the shift.

You’re no longer stitching together context. You’re interrogating it.

What changes

What changes when you fix the data layer, when your commercial data is normalized, deduplicated, and accessible, isn’t just speed.

It’s the level of questions you can ask.

These aren’t dashboard queries. They’re judgment calls. The kind that used to require a senior operator spending a weekend in spreadsheets and Salesforce. When your data layer is clean and the model has real context to work with, they become a 90-second conversation.

That’s the difference. Not a better model. Better fuel.

The data infrastructure reality

Most teams won’t get there by accident. The infrastructure problem is real: identity resolution across systems, conversation reconstruction across channels, deduplication, and signal enrichment. It’s six to twelve months of plumbing if you build it yourself.

The companies that crack it first won’t just be more efficient. They’ll be operating with a fundamentally different information advantage. They’ll see churn coming, spot expansion signals, catch friction early, before any of it shows up in the numbers.

At that point, the question changes.

It’s not whether AI works.

It’s whether your data is ready for it.

And whether you’re going to build that layer, or keep working around the absence of it.

This is what we're building at Sturdy.ai. The data layer your LLM actually needs.

Insight Updates

The Moment B2B Sales Teams Forget Everything They Learned During the Deal

Joel Passen
May 6, 2026
5 min read

It’s not the close. It’s not the kickoff call. It’s the 48 hours in between — when the contract gets signed, the champagne (metaphorically) gets popped, and everything the sales team learned over months of conversations, negotiations, and relationship-building quietly disappears.

The delivery team inherits a contract and a few CRM notes. Not the story behind the deal.

This is the handoff problem. And it’s costing companies more than they realize.

Why the Knowledge Dies at the Signature Line

Think about what actually happens during a complex B2B sale.

Over weeks or months, a sales team accumulates an extraordinary amount of institutional knowledge. They learn why the buyer is actually moving now — not the official reason, but the real one. The compliance incident that became a board-level conversation. The internal champion who’s been pushing for change for two years and finally got budget. The exec who’s skeptical and needs to see a specific proof point before they’ll get on board.

They learn who matters and how decisions actually get made, which is almost never what the org chart suggests. They learn what got promised in the final stretch: the SLA clause that got added at the last minute, the integration that’s now contractually locked, the go-live date that the CFO has already presented to her board.

None of that lives in the CRM. It lives in emails, call recordings, Slack threads, and people’s heads.

And the moment the deal closes, the sales team moves on to the next one. That’s their job. That’s how they get paid. But the institutional knowledge they spent months building, the context that would let an implementation team start informed instead of starting over, largely evaporates.

Onto the next pipeline review.

The Cost Nobody Is Measuring

Companies measure churn. They measure NPS. They measure time-to-value.

Most don’t measure the cost of the knowledge gap at handoff — because it doesn’t show up as a line item. It shows up as implementation delays. Escalations. Customers who feel like they have to repeat themselves six months into a relationship that should already be mature.

It shows up as promises made during the sale that nobody on the delivery side knew about. Commitments that surface in month three as a nasty surprise. Expectations that were set in a negotiation conversation that never made it into a system anyone on the CS team can see.

The SaaS industry has spent a decade optimizing the top of the funnel. Sophisticated systems for capturing and qualifying demand. Playbooks for every stage of the sales motion. Entire conferences dedicated to pipeline hygiene.

And then we hand a contract and a prayer to the team responsible for actually delivering the value we sold.

What Good Looks Like

I’ll make this concrete.

We recently ran Sturdy against a real deal, a $190K ACV implementation that had just closed. Board-level compliance incident drove the urgency. CFO was the economic decision-maker: analytical, direct, not interested in being charmed. An integration was contractually locked in Exhibit A. Timeline slippage wasn’t just an ops problem; it would retrigger board scrutiny because of the prior incident.

The implementation team knew all of that before the first kickoff call.

Not because someone wrote a perfect handoff email at 11 pm the night before go-live. Because Sturdy read across the entire deal — emails, calls, negotiations — and surfaced the context that actually matters: why they bought, who really matters internally, what was promised, and where the risk lives.

That’s the brief I show in the video. Notice how specific it is. Notice that it doesn’t just describe what happened, it tells the delivery team what to do with it.

That’s what institutional knowledge looks like when it doesn’t get lost.

The Broader Shift

The handoff problem is really a symptom of something larger.

B2B revenue has always been a team sport — sales, CS, implementation, product, and finance all own a piece of the outcome. But the systems we’ve built treat each function as a silo. Data gets entered into the CRM by whoever remembered to do it. Calls get recorded and filed somewhere nobody looks. Emails pile up in inboxes that get searched only when something’s already on fire.

The signals are there. The context exists. It’s just buried, and it disappears at exactly the moments in the customer lifecycle when it’s most needed.

The companies that figure this out and build systems to capture, preserve, and operationalize institutional knowledge across the revenue lifecycle will have an operational advantage over those still relying on heroic individual effort and the hope that someone wrote a good handoff doc.

This isn’t an incremental improvement. It’s a different way of operating.

The moment a deal closes should be the moment an organization puts everything it learned to work.

Right now, for most companies, it’s the moment they forget it.

That’s the problem Sturdy was built to solve. If this resonates, start at sturdy.ai.

Insight Updates

Sturdy's MCP Server: One Call. Every Source. Already Resolved.

Joel Passen
May 4, 2026
5 min read

Another Step to Unlocking AI Outcomes: Resolve the Data First

The bottleneck is not your AI model. It’s the data it has access to. Sturdy’s MCP server delivers pre‑resolved, canonically organized context so your LLM can reason over it instead of guessing around it.


For years, the problem was that data lived in silos. Different systems for sales, support, and calls. But the worst offenders were email and Slack. Email isn’t one silo; it’s as many silos as there are people on your team. Every rep, every CSM, every exec running their own inbox, none of it visible to anyone else. Slack is no different. Conversations buried in channels and DMs that nobody ever sees again.

What Changes

“Your LLM now has a single, usable data layer any user can query to inspect the full context of every prospect and customer.”

“Every team now works from a single view of the relationship, not fragments of it. Sturdy gets everyone on the same page, no matter what screen they use.”

MCPs were a material step forward. They give LLMs a standardized way to reach outside their context window and pull live data from external systems without a human copying it in manually. An account record, an open ticket, a call summary, all accessible at query time without a custom integration.

Today, teams are dealing with a different version of the same problem. Every MCP server exposes a slice of the picture. The LLM can pull structured records, read a ticket, or fetch a call summary. What it cannot do is answer a question that requires all of them at once, because the data across those systems was never resolved against each other.

The entities don’t match. The timeline is fragmented. The thread that started the conversation often isn’t there at all.

The question every revenue team actually needs answered isn’t “what does this system say about the account?” It’s the question that requires the full picture: what has every person at our company said to every person at this company, across every channel, and what does that tell us about where this relationship actually stands right now?

No single MCP server can answer that. Most LLMs, handed raw data, will approximate an answer and present it with false confidence. That’s not intelligence. It’s a good guess.

That answer doesn’t live in any single system. It lives in the relationship between all of them. And if the LLM has to call multiple MCP servers to piece it together, resolve duplicate records, and reassemble a coherent account state on every query, the fragmentation problem hasn’t been solved. It’s just been moved into the inference layer.

What Sturdy’s MCP Does

Sturdy ingests from all of it. Email, call transcripts, support tickets, Slack, CRM, and meeting tools. Every channel where communication happens.

Before any of that reaches an LLM, Sturdy does the work that makes it usable. Entities are deduplicated and matched to canonical records. Interactions are classified. Signals are enriched, permission‑scoped, and source‑referenced. The relationship between interactions across systems is established once upstream.

Not inferred at query time. Resolved in advance, maintained continuously, and auditable.

That last part matters more than it sounds. LLMs are getting better at fuzzy matching, but revenue decisions cannot rely on it. “Probably the same account” is not good enough when you’re making retention calls, forecast commits, or expansion bets.

Then Sturdy exposes all of it through a single MCP server. One call. Pre‑resolved context with citations. The LLM starts from the signal, not the raw material.
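A toy sketch of what “resolved in advance” means in practice: interactions from several systems are grouped under one canonical account, each keeping its source reference so the answer is citable. All names, fields, aliases, and URLs here are hypothetical illustrations, not Sturdy's actual schema.

```python
# Hypothetical upstream resolution: many system-specific account aliases
# collapse to one canonical account, and every interaction keeps a citation.
from dataclasses import dataclass

@dataclass
class Interaction:
    source: str          # e.g. "email", "slack", "zendesk"
    account_alias: str   # how the source system names the account
    text: str
    source_url: str      # citation back to the raw interaction

# Alias map built and maintained upstream, not guessed at query time.
ALIAS_TO_CANONICAL = {"Acme Inc": "acme", "acme.com": "acme", "ACME": "acme"}

def resolve(interactions: list[Interaction]) -> dict[str, list[Interaction]]:
    """Group interactions under canonical account IDs, preserving citations."""
    resolved: dict[str, list[Interaction]] = {}
    for i in interactions:
        canonical = ALIAS_TO_CANONICAL.get(i.account_alias, i.account_alias)
        resolved.setdefault(canonical, []).append(i)
    return resolved

raw = [
    Interaction("email",   "Acme Inc", "Renewal question", "mail://123"),
    Interaction("zendesk", "acme.com", "Ticket #88",       "zd://88"),
    Interaction("slack",   "ACME",     "Rollout thread",   "slack://c/42"),
]
timeline = resolve(raw)
print(len(timeline["acme"]))  # all three interactions land under one account
```

The LLM then queries the resolved view, rather than three systems that each insist the account is someone different.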

The Token Cost Nobody Is Talking About

There’s a practical consequence to raw MCP that most teams haven’t priced in yet. When an LLM has to reconstruct account context from scratch on every query, it burns tokens doing work that shouldn’t need to happen at query time.

Pulling from multiple sources. Resolving conflicts. Traversing relationships. Figuring out what it’s looking at.

At low volumes, this is invisible. At scale, it isn’t. The rediscovery tax on a raw MCP call runs roughly 60 to 80 percent of total token consumption per query. That’s the LLM figuring out context, not reasoning over it.

Sturdy removes most of that overhead. The context arrives already structured. The LLM starts from a position of knowing. The inference budget goes toward answering the question, not reconstructing the data.
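A back-of-envelope illustration of that overhead, applying the 60 to 80 percent range above to assumed (not measured) per-query token counts and daily query volumes:

```python
# Rough math on the "rediscovery tax". Per-query tokens and query volume
# are illustrative assumptions, not measurements.
tokens_per_query = 10_000
queries_per_day = 2_000

for tax in (0.60, 0.80):
    wasted = tokens_per_query * queries_per_day * tax
    print(f"{tax:.0%} tax: {wasted:,.0f} tokens/day spent reconstructing context")
```

At these assumed volumes, 12 to 16 million tokens a day go to figuring out what the data is rather than reasoning over it.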

What This Means for Teams Building on It

Sturdy’s MCP is designed for teams that have already provisioned an LLM and are now trying to make it useful: CTOs deploying models across their organizations, heads of data and AI trying to get real answers out of them, and operations teams building agents that need reliable account intelligence.

The properties that matter:

Canonically resolved
Entity deduplication and matching happen upstream. The same account appears as one account regardless of how many systems it lives in.

Permission‑aware
Access controls are baked into the data layer. What a user can see reflects what they’re authorized to see in the source systems.

Source‑referenceable
Every signal comes with a citation. When something surfaces, the underlying interaction is linked.

Model‑agnostic
The data layer doesn’t change based on which model you use.

Nobody wants to spend 12 to 18 months normalizing data before they can build something useful. Resolving that data upstream changes what your LLM can do on day one.

Talk to us about connecting Sturdy to your existing AI deployment.
