Benchmarking the Digital Experience

I worked with organizations across the company to introduce a set of lean user research techniques for benchmarking the UX effectiveness of Pega's Digital experiences.

What

Working primarily with the Enablement team, we created and iterated on a suite of research tools that helped us benchmark our Digital experiences and better understand the Pega professionals who visited them. This toolkit, which included usability testing, site intercept surveys, and foundational research, gave us a solid grounding to inform prioritization discussions. It also allowed us to incorporate research into our process without creating perceived roadblocks to development velocity.

Why

Creating an effective user experience depends on understanding how the current experience is performing, how it can be improved, and what impact your team's changes are having. By triangulating across these methods, we gathered a broad range of insights to inform our design decisions.

How

Although Pega leadership had an appetite for user feedback, there was a constant concern that process would get in the way of progress, i.e., that taking the time to do research would slow the delivery of functionality. To counter that perception, I started with a few structured usability tests, using a lean usability test planning/reporting format that I had developed at HBR.

Planning and reporting usability tests in the same convenient memo format made it easier for velocity-focused executives to buy into the research process.
Using a rainbow notes grid made it even faster to report findings, and gave a quick visual guide to the most salient insights.

When the Digital group added Hotjar to our websites in 2017, I used it to conduct heatmap studies of key pages to inform content prioritization discussions, and I created a quick intercept poll to help us benchmark the experience of Pega Academy and the Pega Discovery Network. The results of those polls, which drew around 650 responses per month, were synthesized and reported out to key leadership within Enablement as well as to the overall Digital team. This let us see, month over month, how our work was affecting the experience of our users.

Our primary question let us track, month over month, how search improvements were performing across our websites.
As time went on, we added two questions that allowed users to rate both the quality of content and the experience of finding that content on our sites.
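For illustration, here is a minimal sketch of how a month-over-month roll-up of poll ratings like these could be computed in Python with pandas; the file name and the submitted_at, question, and rating columns are hypothetical placeholders, not Hotjar's actual export format.

    import pandas as pd

    # Load a hypothetical export of poll responses; column names are assumptions.
    responses = pd.read_csv("poll_responses.csv", parse_dates=["submitted_at"])

    # Bucket responses by calendar month, then average each question's rating
    # so the trend can be compared month over month.
    monthly = (
        responses.assign(month=responses["submitted_at"].dt.to_period("M"))
        .groupby(["month", "question"])["rating"]
        .agg(avg_rating="mean", responses="count")
    )
    print(monthly)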

As a way of incorporating foundational research into our toolkit, I started by conducting formative interviews at our company's annual sales conference. I also worked with my former professors at Bentley to sponsor coursework for the Field Methods class. I synthesized these insights for leadership and used them to spur conversations about how we could enhance our Digital experiences.

One of my first research projects, conducted at our annual sales conference, helped inform the feature set of an experience that had just launched.
In partnership with Bentley's Field Methods class, we uncovered a set of principles to guide our work improving Pega's developer community website.

Key Contributions