Kesavan Nampoothiry
Senior Content Designer at Adobe
July 29, 2025
8 mins

Measuring UX Content: Methods and Tools to Measure Content at Scale

This is the first blog post in the “Measuring UX Content” series, where we explore how content design teams use data and user insights to track, prove, and improve the impact of their work.

Editor’s note from Frontitude:

This article is part of our blog series, “Measuring UX Content,” where we explore how content design teams track, prove, and improve the impact of their work. We’re speaking with UX writers, content designers, and product leaders who are building the case for content with data, and doing it in thoughtful, user-centered ways.

In this guest post, we’re happy to feature Kesavan Nampoothiry, a senior content designer who helped shape the QuickBooks migration experience at Intuit and is now a senior UX content strategist at Adobe. Kesavan shares his practical, honest approach to measuring UX copy, from conversion metrics to language studies, and how he turned those insights into scalable systems.

For a long time, I believed that great UX content spoke for itself. Clear copy, thoughtful microinteractions, a smooth onboarding experience. These were all signs that the writing was doing its job. But over the last few years, particularly during my time at Intuit, I learned a more difficult truth: no matter how good your writing is, it won’t have the impact it deserves unless you can measure it.

In this post, I want to share how I came to that realization, and how I began to build measurement into my day-to-day practice as a content designer.

Why Measuring UX Content Matters

At Intuit, I worked on the QuickBooks migration team. Our job was to help users transition from QuickBooks Desktop, a product with over 30 years of history, to QuickBooks Online. It’s a big leap for many users, and content played a crucial role in reducing anxiety, increasing clarity, and guiding people through a multi-step process.

The catch? You can’t fix what you can’t see. And if you don’t track how content performs, it’s impossible to know whether it’s actually helping users succeed.

That’s why measurement became so important. It gave us insight into how users were interacting with our copy. Were they getting stuck? Were they misunderstanding a step? Were they dropping off at a critical point? These weren’t theoretical questions—they were trackable, observable behaviors.

How We Measured Content Impact

We approached measurement from two angles: quantitative data and qualitative feedback.

On the quantitative side, we looked at:

  • Conversion rates: Were users completing the migration?

  • Click-through rates (CTRs): Were users engaging with help links or in-product nudges?

  • Drop-off rates: At which step did users exit or stop progressing?

Our data science team helped us implement dashboards (using Splunk) to monitor user behavior across different screens. We could see, for instance, that only 70% of users made it past a specific screen; the other 30% dropped off right there. That was a red flag. Something about that screen wasn’t working.
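To make that concrete, here’s a minimal sketch (plain Python, not our actual Splunk setup) of how a step funnel like this can be computed from raw screen-view events. The user IDs, screen names, and funnel order are hypothetical:

```python
from collections import defaultdict

# Hypothetical event log: (user_id, screen) pairs, roughly what a
# product analytics pipeline emits for each screen view.
events = [
    ("u1", "start"), ("u1", "connect_data"), ("u1", "review"), ("u1", "done"),
    ("u2", "start"), ("u2", "connect_data"),
    ("u3", "start"), ("u3", "connect_data"), ("u3", "review"),
]

FUNNEL = ["start", "connect_data", "review", "done"]  # ordered migration steps

# Which screens did each user reach?
reached = defaultdict(set)
for user_id, screen in events:
    reached[user_id].add(screen)

# Count users at each step and report the drop-off from the previous step.
total = len(reached)
prev = total
for step in FUNNEL:
    count = sum(1 for screens in reached.values() if step in screens)
    drop = (prev - count) / prev * 100 if prev else 0.0
    print(f"{step:<13} {count}/{total} users (drop-off: {drop:.0f}%)")
    prev = count
```

A step where the drop-off suddenly spikes, like the 30% screen above, is where we’d go digging into the content.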

But numbers only tell part of the story. That’s where qualitative methods came in.

Qualitative Methods That Made a Difference

To supplement our metrics, we did:

  • Customer interviews

  • Prototype testing

  • Language studies

  • Hierarchy and card sorting tests

There’s a great example I like to share:

Early on, we hesitated to use the word “migrate.” It felt too technical. We explored simpler alternatives like “move your data” or “bring in your data,” thinking they would sound more approachable. But when we tested those phrases with real users, we saw a clear and consistent preference for “migrate.”

One customer summed it up perfectly: “I don’t even need to read the whole sentence. If I see the word ‘migrate,’ I know what this screen is about.”

That insight did far more than just clarify a single message. It triggered a ripple effect across the entire product. We updated multiple onboarding flows, refined labels and CTAs, and aligned our messaging across interfaces. Eventually, we documented the decision in our style guide, solidifying “migrate” as the canonical term across product and documentation.

This kind of decision would not have been possible without qualitative measurement. Talking directly to users, observing their reactions, and listening to how they describe their experience helped us uncover an emotional and cognitive resonance that quantitative data alone could never reveal. It reminded us that sometimes, a single word—if it’s the right word—can dramatically improve comprehension, trust, and user confidence.

Ultimately, it wasn’t just a word swap. It was a turning point in how we built and evolved our language system—grounded in how real people think and communicate, and continuously informed by the voices of the people we’re designing for.

Working Cross-Functionally to Make Measurement Stick

It’s important to note: I didn’t do this work alone.

Measuring content success was a cross-functional effort. I partnered with product managers to define success metrics. I worked with data scientists to build dashboards. I collaborated with UX researchers to run interviews and tests. And yes, I even teamed up with our customer support team to analyze patterns in support tickets, which is another underrated source of qualitative insight.

At Intuit, we also had a support team that helped us find users who had gone through specific flows, like customers who had migrated only payroll data or stopped at a certain step. These were the people we needed to talk to in order to understand where content was falling short.
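Support tickets deserve a quick illustration. Even a rough keyword count over ticket text can reveal which flows and terms dominate what users write in about. This sketch uses invented ticket subjects and a hypothetical stopword list:

```python
from collections import Counter
import re

# Invented ticket subjects standing in for a support-ticket export.
tickets = [
    "Stuck on the migrate screen after connecting payroll",
    "Migration stopped at the review step",
    "Can't find where to bring in my data",
    "Payroll data missing after migrate",
]

STOPWORDS = {"the", "on", "at", "after", "to", "in", "my", "where", "can't"}

# Count non-stopword terms across all tickets.
words = Counter(
    w
    for t in tickets
    for w in re.findall(r"[a-z']+", t.lower())
    if w not in STOPWORDS
)
print(words.most_common(5))  # which steps and terms dominate ticket language?
```

In practice you’d want something more robust (stemming, phrase detection, topic clustering), but even this level of analysis can point you at the screen, or the word, that’s generating tickets.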

Pre-Launch Testing: Measuring Before It Ships

One of the most effective strategies we used was testing content before launch. We didn’t wait until users complained. We built validation and measurement into the design process.

We ran:

  • A/B tests to compare different copy versions in gradual deployments

  • Soft launches to observe live behavior in a small user group

  • Usability testing with tools like UserTesting

  • Hierarchy testing for menu and navigation structures

This allowed us to catch issues early and make confident content decisions, backed by real user feedback.
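To give one concrete example of the A/B testing step: deciding whether one copy variant genuinely outperformed another is a standard two-proportion z-test. Here’s a minimal version in plain Python (textbook statistics, not Intuit’s internal tooling), with made-up completion counts:

```python
from math import sqrt, erf

def two_proportion_ztest(x_a, n_a, x_b, n_b):
    """Two-sided z-test: is variant B's completion rate different from A's?"""
    p_a, p_b = x_a / n_a, x_b / n_b
    p_pool = (x_a + x_b) / (n_a + n_b)                     # pooled rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
    return z, p_value

# Made-up numbers: 480 of 1,000 users completed with copy A,
# 530 of 1,000 completed with copy B.
z, p = two_proportion_ztest(480, 1000, 530, 1000)
print(f"z = {z:.2f}, p = {p:.3f}")  # p < 0.05 suggests a real difference
```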

Turning Insights Into Scalable Guidelines

A big part of making measurement useful was operationalizing what we learned.

We didn’t just use findings to patch specific issues; we integrated them into our content guidelines. For example, although we generally avoided words like “export” or “import” (because they’re technical), we made exceptions in migration and other flows where users expected that language.

Our style guide became a living system, one that evolved with each new test, interview, and insight. And because Intuit is a large company with multiple products (each with its own tone and voice), having those documented decisions helped us maintain consistency and scale quality across teams.

The Role of AI in the Future of Content Measurement

Looking ahead, I believe AI tools will transform how we measure and optimize UX writing.

Right now, it takes multiple tools and a lot of human effort to synthesize quantitative and qualitative data. But what if an AI writing assistant could:

  • Pull sentiment analysis from user research transcripts?

  • Track user confusion in real time?

  • Fetch content-related information from support tickets?

  • Pull content-related insights from A/B tests?

  • Recommend copy changes based on behavior patterns?

The potential here is immense. AI-powered tools won’t just save time; they’ll uncover patterns and surface opportunities that humans often miss. As content designers, we’ll be able to work more strategically, confidently tying words to outcomes. With the right AI integrations, UX content won’t just be written and shipped; it will be continuously measured, refined, and proven to drive product success.
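None of this exists today as a single integrated assistant, but the building blocks are already available. As a rough sketch of the first item on that list, here’s what an off-the-shelf sentiment model (NLTK’s VADER) produces on a few invented research-transcript lines:

```python
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
sia = SentimentIntensityAnalyzer()

# Invented snippets standing in for user research transcript lines.
transcript_lines = [
    "I don't even need to read the whole sentence, 'migrate' tells me everything.",
    "I had no idea what 'bring in your data' was supposed to mean.",
    "This step was confusing and I almost gave up.",
]

# 'compound' ranges from -1 (most negative) to +1 (most positive).
for line in transcript_lines:
    score = sia.polarity_scores(line)["compound"]
    label = "positive" if score > 0.05 else "negative" if score < -0.05 else "neutral"
    print(f"[{label:>8}] {score:+.2f}  {line}")
```

Imagine that kind of scoring running automatically across every interview, ticket, and test session, feeding back into the copy decisions described above.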

Writing UX Content for AI Products

When users interact with chat-based systems, they don’t just click; they speak their minds. Unlike classic UI, which relies on predefined buttons and structured inputs, chat interfaces invite open-ended, natural-language responses. That kind of input is rich with emotion, intent, and unmet needs: exactly the signals that rarely surface through traditional analytics.

For content designers, this shift is transformative. In classic UI, understanding what users struggle with often requires combing through drop-off rates or scheduling moderated usability tests. But in a chat interface, users tell you. Every interaction is a potential insight.

With the right AI systems in place, we can automatically extract those insights and turn them into action. We can detect where language fails, what users expect, and how intent aligns (or misaligns) with design.
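As a sketch of what that extraction could look like, here’s a zero-shot classifier (via Hugging Face’s transformers library) tagging open-ended chat messages with content-relevant labels. The messages, labels, and model choice are illustrative assumptions, not a production setup:

```python
from transformers import pipeline

# Zero-shot classification tags open-ended text without any training data;
# the label set below is a hypothetical starting point.
classifier = pipeline(
    "zero-shot-classification",
    model="facebook/bart-large-mnli",
)

labels = ["confused by wording", "feature request", "task succeeded", "error report"]

chat_messages = [
    "What does 'migrate' actually move? Just my invoices?",
    "It worked! All my payroll data came over.",
]

for msg in chat_messages:
    result = classifier(msg, candidate_labels=labels)
    top_label, top_score = result["labels"][0], result["scores"][0]
    print(f"{top_label:<22} ({top_score:.2f})  {msg}")
```

Aggregate those labels over thousands of conversations, and “where does our language fail?” stops being a guess.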

Chat-based UX doesn’t just change how users interact; it changes how we understand them, and how quickly we can adapt. For content designers, it means being closer to the user than ever before.

Final Thoughts

If there’s one takeaway from my experience, it’s this:

Measurement is not a bonus. It’s a foundation. You don’t need to measure everything. You just need to start somewhere. Pick a flow. Define success. Track it. Talk to users. Iterate. And then document what you learn so others can build on it.

Because the better we measure our UX content, the better we can serve our users, and the stronger our discipline becomes.
