Yukti Tuli
Content Designer at Atlassian
August 7, 2025
6 mins

Measuring UX Content: How to Prove Your Words Are Working

This is the second blog post in the “Measuring UX Content” series, where we explore how content design teams use data and user insights to track, prove, and improve the impact of their work.

Editor’s note from Frontitude:

This article is part of our blog series, “Measuring UX Content,” where we explore how content design teams track, prove, and improve the impact of their work. We’re speaking with UX writers, content designers, and product leaders who are building the case for content with data, and doing it in thoughtful, user-centered ways.

In this guest post, we feature Yukti Tuli, a content designer at Atlassian with a background at Amazon and Oracle. Yukti shares her journey into content design and why measurement became essential to her work. Through projects like AI feature launches and high-stakes migrations, she highlights practical ways to track content impact.

When I started my career in content, measurement wasn’t something we talked about. I began in copywriting, moved into technical writing at companies like Amazon and Oracle, and eventually landed in content design. What drew me to content design was the opportunity to speak directly with customers, iterate based on feedback, and shape experiences from scratch. But what kept me here was the realization that UX content isn’t just about clarity and tone. It’s about impact. And to prove that impact, you need to measure it.

At Atlassian, I’ve worked as a content designer on Jira Service Management, contributing to several product flows, including recent AI-powered features. Across projects, I’ve constantly asked myself: How do I know this UX copy is working? This post shares what I’ve learned: challenges, successes, and strategies for measuring UX content in ways that matter.

Why Measuring UX Copy Is Still So Hard

Let’s be honest: UX content is still rarely measured on its own. Most of the metrics we see, such as click-through rates, conversion funnels, and drop-offs, are tied to the entire UI or feature, not to specific words.

As a content designer, this can be frustrating. You know the copy made a difference. You know the CTA wording helped drive the user to act. But when you try to isolate the performance of content alone, the data often isn’t there.

This is especially true in cross-functional teams where engineering and design might not prioritize content as a distinct measurable element. It usually falls to us, content designers, to raise the flag and ask: Can we track this click? Can we test this copy variant?

Measuring Content in High-Stakes Projects

One of my most insightful measurement experiences came during a critical migration project. We had to move 4,000 customers from one authentication method to another before the old system was deprecated. There was no major UI design, just a banner component and a CTA. The copy had to do all the work.

So I reached out to our engineers and asked them to track:

  • Click-through rate on the banner CTA

  • Completion rate of the onboarding task

These two simple metrics gave me a clear view of whether the copy was effective. The design didn’t change. The layout didn’t change. Just the words. It was a powerful way to isolate content performance.
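
Neither metric required custom tooling; the instrumentation can be as small as two analytics events. Here’s a minimal sketch in TypeScript, assuming a generic track(eventName, properties) helper like the ones most product analytics SDKs expose (the event names and properties are made up for illustration):

```typescript
// Hypothetical analytics helper; most product analytics SDKs expose
// something equivalent to track(eventName, properties).
declare function track(eventName: string, properties: Record<string, string>): void;

// Fired when a user clicks the banner CTA. Recording the exact copy
// lets you attribute clicks to the words, not just the component.
function onBannerCtaClick(ctaCopy: string): void {
  track("auth_migration_banner_cta_clicked", { ctaCopy });
}

// Fired when a user finishes the new authentication setup.
// Click-through rate = CTA clicks / banner impressions;
// completion rate = completions / CTA clicks.
function onMigrationCompleted(ctaCopy: string): void {
  track("auth_migration_completed", { ctaCopy });
}
```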

A/B Testing Copy for Feature Adoption

Another area where I saw measurable impact was in A/B testing copy variants to drive feature adoption. We had launched a new capability, but adoption was low, even though the feature was free.

Initially, our CTA was “Get started,” with a dense paragraph of explanation. We revised it:

  • Simplified the body copy into three bullet points

  • Changed the CTA from “Get started” to “Try now”

The result? A 16% increase in conversions. That small change signaled to users that there was no commitment required. Just a simple, exploratory click.

We realized that “Get started” felt like a heavy lift. Users thought they were entering a setup process. But “Try now” implied low effort and low risk. That’s the subtle psychology of words, and measurement helped prove its power.
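
If you want to sanity-check an uplift like this rather than eyeball it, a two-proportion z-test is a common way to ask whether the difference is likely real. Here’s a minimal sketch, with purely illustrative numbers rather than the actual experiment data:

```typescript
// Two-proportion z-test for an A/B copy experiment.
// a = control ("Get started"), b = treatment ("Try now").
interface VariantResult {
  users: number;       // users who saw the variant
  conversions: number; // users who clicked through / adopted
}

function conversionUplift(a: VariantResult, b: VariantResult) {
  const pA = a.conversions / a.users;
  const pB = b.conversions / b.users;
  // Pooled proportion under the null hypothesis that both variants convert equally.
  const pooled = (a.conversions + b.conversions) / (a.users + b.users);
  const standardError = Math.sqrt(pooled * (1 - pooled) * (1 / a.users + 1 / b.users));
  const zScore = (pB - pA) / standardError;
  // |zScore| above ~1.96 corresponds to roughly 95% confidence.
  return { relativeUplift: (pB - pA) / pA, zScore };
}

// Illustrative numbers only, not the real experiment data:
console.log(
  conversionUplift(
    { users: 5000, conversions: 500 }, // "Get started"
    { users: 5000, conversions: 580 }, // "Try now"
  ),
);
```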

Using Comprehension Testing to Align with Mental Models

With our recent AI features, I ran comprehension testing to ensure that the copy matched user expectations.

Instead of coining new feature names, we kept things intuitive, naming one feature simply “Draft a reply.” I recruited participants who worked in the same industry but didn’t use Atlassian products. I showed them screens and asked:

What do you expect will happen when you click this button?

If their answer matched the actual functionality, it was a sign that the CTA and microcopy were clear. The results reinforced a core belief: good UX writing should align with users’ mental models—not with internal jargon or branding ambitions.

Support Tickets as a Measurement Signal

Sometimes, the absence of content is what leads to measurable pain.

At a previous company, we tracked customer support tickets tied to specific product flows. One error message, which was triggered when uploading large files, just said “You’ve reached your limit.” But it didn’t tell users what the limit was.

We discovered this missing context was generating a high number of tickets. How did we fix this?

We added the actual file size limit to the error message, so users could see exactly what they were hitting. Tickets dropped right away.
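
In code terms, the fix was tiny: interpolate a value the upload validation already knows. A sketch with illustrative wording and limit, not the actual product copy:

```typescript
// Before: the limit exists in the validation logic, but users can't see it.
const uploadErrorBefore = "You've reached your limit.";

// After: surface the limit so users can self-serve instead of filing a ticket.
// maxFileSizeMb is assumed to come from the same config the validation uses.
function uploadErrorAfter(maxFileSizeMb: number): string {
  return `This file exceeds the ${maxFileSizeMb} MB upload limit. ` +
    `Try compressing it or splitting it into smaller files.`;
}
```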

This type of measurement may not be exciting, but it works. It shows you exactly where the content is missing the mark and points you toward practical ways to fix it.

Knowing When Not to Invest

There’s another side to measurement: knowing when content investment isn’t worth it.

In one case, I argued for a more detailed error message for an edge case. The engineers pushed back, saying the use case was rare and the effort was high. So we created a dashboard to track how often the error appeared.
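
The dashboard behind a question like that can be very simple. A rough sketch of the aggregation, assuming the error events are already logged with a user ID and timestamp (the event shape here is hypothetical):

```typescript
// Count distinct users who hit the edge-case error, bucketed by week.
interface EdgeCaseErrorEvent {
  userId: string;
  occurredAt: Date;
}

const MS_PER_WEEK = 7 * 24 * 60 * 60 * 1000;

function weeklyAffectedUsers(events: EdgeCaseErrorEvent[]): Map<number, number> {
  const usersByWeek = new Map<number, Set<string>>();
  for (const event of events) {
    const weekIndex = Math.floor(event.occurredAt.getTime() / MS_PER_WEEK);
    const users = usersByWeek.get(weekIndex) ?? new Set<string>();
    users.add(event.userId);
    usersByWeek.set(weekIndex, users);
  }
  // Collapse to a simple count per week for charting.
  const countsByWeek = new Map<number, number>();
  usersByWeek.forEach((users, week) => countsByWeek.set(week, users.size));
  return countsByWeek;
}
```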

Turns out, only five users per week encountered it.

That insight helped us make a balanced decision. In an ideal world, we’d polish every edge case. But in real-world environments, data helps us prioritize where content matters most, and where “good enough” is okay.

What’s Still a Struggle

Even with all the progress, measuring UX content is still a real challenge. The reality is, content designers are often the only ones actively thinking about it. While most teams agree that copy matters, few have built-in processes to track or evaluate its impact. And when performance reviews or strategic discussions come around, we’re often left scrambling for proof.

But that’s exactly where the opportunity lies. We’re in a unique position to lead this change. By pushing for content-specific metrics, partnering with engineers to track key actions, and bringing data into everyday conversations, we can help shift how teams value and measure content. 

With better tools, more advocates, and a commitment to making measurement a standard, not a side note, we can move from being reactive to driving the strategy forward.

How AI Is Changing the Game

With the rise of AI features, I’ve found myself learning prompt engineering, helping product teams design conversational interactions that feel intuitive and on-brand.

What’s interesting is that we’re now writing for systems that generate language, not just users who consume it. That shifts how we think about content quality and measurement. You can’t rely on front-end error messages anymore. You have to guide the model through prompts toward the right tone, structure, and fallback behavior.
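
For instance, a first pass at a system prompt for something like the “Draft a reply” feature mentioned earlier might spell out tone, structure, and fallback behavior explicitly. The wording below is illustrative only, not Atlassian’s actual prompt:

```typescript
// Illustrative system prompt, not production copy. The tone, structure, and
// fallback rules now do the job that static UI strings used to do.
const draftReplySystemPrompt = `
You are drafting a reply to a customer support ticket on behalf of an agent.

Tone: calm, plain language, no jargon, never assign blame.
Structure: acknowledge the issue, state the next step, give a realistic timeframe.
Length: under 120 words.

Fallback: if the ticket does not contain enough information to propose a next step,
do not guess. Draft one clarifying question for the customer instead.
`;
```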

As AI becomes more integral to products, I believe content designers will become prompt designers, and measurement will have to evolve too. We’ll need ways to assess not just copy clicks or comprehension, but LLM behavior and performance as well.

Final Thoughts

If you’re a UX writer or content designer trying to prove the value of your work, I see you. It’s not easy to isolate content’s impact in a sea of product changes. But with a little creativity, and a lot of persistence, you can make your work visible.

Here’s what I’ve learned:

  • Ask engineers to track specific clicks and completions.

  • Run A/B tests for key flows.

  • Use comprehension testing to validate clarity.

  • Track support tickets as signals for missing or broken content.

  • Use measurement not just to improve copy, but to prioritize your time.

  • And most importantly, don’t wait for permission. Start the conversation and be proactive.

Because once your team sees the numbers, content stops being invisible. It becomes an asset, one that’s worth investing in.
