Benchmarking the Digital Experience
Working primarily with the Enablement team, we created and iterated on a suite of research tools that helped us benchmark our Digital experiences and better understand the Pega professionals visiting them. This toolkit, which included usability testing, site intercept surveys, and foundational research, gave us a solid grounding to inform prioritization discussions. It also allowed us to incorporate research into our process without creating perceived roadblocks to development velocity.
Creating an effective user experience depends on understanding how the current experience is performing, how it can be improved, and the impact of the changes your team is making. By triangulating across these methods, we were able to gather a broad range of insights to inform our design decisions.
Although Pega leadership had an appetite for user feedback, there was a constant concern that process would get in the way of progress; that is, that taking the time to do research would be an impediment to delivering functionality. To counter that perception, I started with a few structured usability tests, using a lean usability test planning and reporting format that I had developed at HBR.
When the Digital group added Hotjar to our websites in 2017, I used the tool to conduct heatmap studies of key pages to inform content prioritization discussions, and I created a quick intercept poll to help us benchmark the experience of Pega Academy and the Pega Discovery Network. The results of those polls, which received roughly 650 responses per month, were synthesized and reported out to key leadership within Enablement, as well as to the overall Digital team. This gave us the ability to see, month over month, how the work we were doing impacted the experience of our users.
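The month-over-month synthesis described above can be sketched in a few lines. This is a hypothetical illustration only: the field names (`date`, `score`) and the 1-to-5 rating scale are assumptions for the example, not the actual poll export format.

```python
# Illustrative sketch of grouping intercept-poll responses by month and
# averaging their scores; the data shape is an assumption, not Hotjar's format.
from collections import defaultdict
from datetime import date
from statistics import mean

def summarize_by_month(responses):
    """Group (date, score) pairs by (year, month) and average the scores."""
    by_month = defaultdict(list)
    for when, score in responses:
        by_month[(when.year, when.month)].append(score)
    return {month: round(mean(scores), 2)
            for month, scores in sorted(by_month.items())}

sample = [
    (date(2017, 5, 3), 4), (date(2017, 5, 20), 3),
    (date(2017, 6, 1), 5), (date(2017, 6, 15), 4),
]
print(summarize_by_month(sample))  # {(2017, 5): 3.5, (2017, 6): 4.5}
```

A rollup like this is what makes the "month over month" comparison possible: each reporting cycle appends a new key, so trends are visible at a glance.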
As a way of incorporating foundational research into our toolkit, I started by conducting formative interviews at our company's annual sales conference. I also worked with my former professors at Bentley to sponsor coursework for the Field Methods class. I synthesized these insights for leadership and used them to spur conversations about how we could enhance our Digital experiences.
- Coordinated and performed a regular cadence of usability testing, both internally and with external vendors.
- Developed partnerships with Bentley's HFID program to sponsor student coursework.
- Created a series of quick intercept surveys across our learning and developer communities to gather feedback on the effectiveness of our content.
- Created a participant mailing list to aid in research recruiting.
- Performed monthly analysis of user feedback and reported findings to Digital stakeholders.