I recently wrote a blog post called “11 Efficiency-Improving Features of the Kingland Platform” for the Kingland blog. It showcases the recent UX improvements to the Kingland Platform – check it out!
Category: UX
Case Study: Inline Validation
During a redesign of our product, we discovered that error messages for certain fields were a significant pain point, so we decided to redesign the experience.
Duration: 2 weeks
Methods: Discovery, analysis, prototyping, moderated usability testing
Tools: Figma, Webex, Trello
Context
Several data points in our application have strict formatting or uniqueness requirements. Messages about formatting and duplicate checks appeared in a modal, disconnected from the fields they referred to. The text was confusing, and related parts of each message were scattered, making it hard to read.
Duplication messages didn’t include a way to navigate to duplicate records. This meant that the user would have to leave their work-in-progress to find out if the record already existed in the system.
Our users are data management experts who need to be both efficient and accurate as they enter and check data. The current design wasn’t helping them reach those goals.
Process
First, I researched design patterns and identified three common patterns for validating data: asynchronous validation, inline validation, and masking. In asynchronous validation, error messages are triggered by an action, such as clicking ‘Save’ or ‘Submit’. Inline validation is displayed next to each field as the user types. Masking formats the data as it is entered, preventing bad data from being entered in the first place. Each type of validation suits different contexts, although research shows that inline validation is easier, less frustrating, and more efficient to use than asynchronous validation, and that masking is more effective than instructional text at ensuring correct data entry.
Asynchronous validation is best reserved for checks that take too long or are too resource-heavy to run inline. Drawing on industry best practices, I set a cutoff of 5 seconds: any check that completed faster could be displayed inline. I then worked with the development team to confirm that all of our duplication and formatting checks ran faster than the cutoff.
Next, I identified the fields that had strict, easily defined formats and wrote masking requirements for them. Our UI library, PrimeNG, supports masking out of the box. A few remaining fields had formats too complex for masking, so I defined inline validation messages for those.
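To illustrate the split between masked and inline-validated fields, here's a minimal TypeScript sketch. The mask string, the field format, and the validator rule below are hypothetical examples, not the platform's actual requirements.

```typescript
// Hypothetical sketch -- the formats below are illustrative, not the
// platform's real rules.
//
// Strictly positional formats suit a mask. With PrimeNG's InputMask,
// 9 = digit, a = letter, * = alphanumeric, so a field declared as
//   <p-inputMask mask="99-9999999">
// accepts exactly "two digits, a dash, seven digits" and nothing else.

// Fields too complex for a mask fall back to an inline validator.
// Returns an error message to show next to the field, or null when valid.
function validateIdFormat(value: string): string | null {
  // e.g. eight uppercase letters/digits followed by one check digit
  return /^[A-Z0-9]{8}\d$/.test(value)
    ? null
    : "Enter 8 letters or digits followed by a check digit.";
}
```

The mask prevents bad input before it happens; the validator catches what a positional mask can't express.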
The last validation to tackle was the duplicate checking. Some fields needed strict duplicate matching, while others generated a warning, still allowing the user to save. I designed inline messages for both scenarios, including links to the duplicate records and careful microcopy. The messages appeared after the user exited the field, and disappeared when they began typing a new value. I also designed a success message to reassure the user that the value they entered was unique.
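The two duplicate-check severities can be modelled roughly as follows. This is a sketch only; the names and the record lookup are hypothetical, not the actual implementation.

```typescript
// Sketch only: names and the record lookup are hypothetical.
// A duplicate check produces one of three outcomes: unique (show the
// success message), a warning (a duplicate exists, but saving is still
// allowed), or an error (strict field -- saving is blocked). Warnings
// and errors carry a link to the duplicate record so users can inspect
// it without abandoning their work in progress.
type DuplicateCheck =
  | { status: "unique" }
  | { status: "warning" | "error"; message: string; duplicateUrl: string };

function checkDuplicate(
  value: string,
  existingRecords: Map<string, string>, // value -> record URL
  strict: boolean                       // true = duplicates block saving
): DuplicateCheck {
  const url = existingRecords.get(value);
  if (url === undefined) return { status: "unique" };
  return {
    status: strict ? "error" : "warning",
    message: strict
      ? "This value already exists and must be unique."
      : "A record with this value already exists. You can still save.",
    duplicateUrl: url,
  };
}
```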
Lessons Learned
After implementation, we ran into an edge case that we hadn’t considered. Showing validation when the user clicked away from the field worked fine with data entry, but the message didn’t appear if the user was editing an existing record with faulty or missing data. We modified the requirement to also run validation when the user started editing and display any pre-existing errors up front.
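The revised timing rules amount to three event hooks, sketched below in framework-agnostic TypeScript. The class and hook names are my own shorthand, not the product's code.

```typescript
// Sketch of the validation timing described above; names are illustrative.
type Validator = (value: string) => string | null; // null = valid

class FieldValidationState {
  message: string | null = null;

  constructor(private readonly validate: Validator) {}

  // Run when an existing record is opened for editing, so pre-existing
  // bad or missing data surfaces immediately (the edge case we missed).
  onEditStart(value: string): void {
    this.message = this.validate(value);
  }

  // Run when the user leaves the field after entering a value.
  onBlur(value: string): void {
    this.message = this.validate(value);
  }

  // Hide the message as soon as the user starts typing a new value.
  onInput(): void {
    this.message = null;
  }
}
```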
Another hiccup was that one of the fields I identified for masking actually had another variation I didn’t know about. The faulty format made it through requirement reviews and QC before our domain expert spotted it. It turned out that my source for the field’s format was outdated – oops! Regardless of how much you’ve tested a design, always run it past an expert who can verify the details.
Impact
With this approach, we were able to significantly improve error-checking UX. Usability testing the new design showed that users easily understood both the inline messages and masking patterns. One user even expressed delight: “I don’t expect [to have a duplicate value], but I’m glad the system is checking it. And I like that the message shows up right away! Usually it would pop up when I submit.”
The feature hasn’t yet been rolled out to end users, but it became a talking point in sales discussions. And the pattern can be extended in the future to handle any formatting or duplicate checking a client needs. Taking the initiative to solve this issue translated into enhanced business value for both Kingland and our clients.
Case Study: Baselining User Research
When I started at Kingland, their core data management and compliance platform was slated for a redesign. The front-end software could no longer be upgraded and was difficult to customize. Before starting on new designs, I knew we needed to evaluate the application’s usability and pain points. I conducted a quick, cheap usability baseline that helped direct our redesign efforts.
Duration: 2 weeks
Methods: In-person moderated usability testing, thematic analysis
Tools: Webex, Trello
Context
Our compliance platform is customized for different clients’ needs, but some core functionality is kept the same, including hierarchy data management, search, and workflow. Before I started, the platform had never been tested for usability, and I didn’t have access to any end users.
I was the only UX designer on a waterfall team made up of several overseas developers, an architect, a project manager, and a business analyst. I was responsible for spearheading UX research, strategy, and ideation, as well as delivering mockups, prototypes, and high-fidelity requirements. I worked closely with the business analyst to conduct testing, then shared results with the rest of the team.
Process
Time was limited, so I opted for low-cost, in-person testing with company interns who used a similar but distinct application. Research shows that testing an application with ~5 people provides a good balance between cost and insight, so we conducted 4 usability tests covering the high-touch parts of the application. (We had planned for 6 but had two no-shows.)
I wrote a series of tasks for the interns to complete within the application. The tasks were open-ended and didn’t reference any specific feature, to avoid biasing results.
“You are a data administrator and you are responsible for compiling a list of banks. Find all banks headquartered in Boston and determine how many there are.”
During testing, I facilitated while a teammate observed and took notes. We also recorded the sessions for later review.
Later, we compiled our notes. I used Trello to create an affinity diagram and coded the data by area of the application. This gave us a wealth of insights about participants’ behavior and attitudes during the test.
Findings
Our testing revealed some serious weak points in the application’s design. Participants had trouble understanding what the app was for because the main dashboard was blank by default. They also had difficulty completing key tasks like searching, updating data, saving, and submitting their work for review. Work frequently disappeared, frustrating the participants.
We also found that other areas of the application worked fine as-is, including the layout of the record page and the log-in process. And some features, like drag-and-drop, were hard to find but extremely helpful once discovered.
Impact
After analyzing the data, I presented my findings to the team. I showed clips of the participants struggling with difficult features, along with charts showing success rates for each task. Sharing this data helped build buy-in for changing the design of key areas. Our architect in particular empathized with the video clips, and advocated for usability throughout the redesign.
The findings from our baseline test informed the rest of the redesign, and our team was able to improve usability in every area we tested. A few examples:
- Submitting work for review, the highest-priority task, went from 0/4 completions in the original application to 3/3 in the redesign.
- Average time spent on a key editing task decreased from >5 minutes to <2 minutes.
- Negative feedback decreased from 74% of feedback in the original application to 22% in the redesign.
Baselining our application’s usability was a great first step in the redesign project. It was a very low-cost investment that helped us target areas to improve, while keeping designs that already worked well.
Stories & Sharpies Lightning Talk
In October 2019, I gave a 15-minute Lightning Talk to Workiva’s UX team during our week-long Team Jam in Scottsdale, Arizona. We spent three days bonding as a team, participating in workshops, tackling design challenges, and learning from each other.
My talk, Stories & Sharpies, came together over the couple of weeks before the jam. It dove into detail on a basic yet easily forgotten design tool, the marker, then offered sketching challenges and practical ways to use sketching to enhance design practice. At the end of the talk, I handed out markers and challenged the team to go draw!
Case Study: Live Chat Application
Role
UX Designer, Project Manager, Developer Advocate, User Trainer
Project
Partway through my internship at GuideOne Insurance, the department director called me in to work on a technology project. He told me, “We want to implement a live chat service with our new web portal, GO BOP. You’re in charge – figure out what we need.”
In two weeks, I came back with a plan for integrating the chat with our site and setting up an internal support team. Senior leadership approved the project, and I worked with information architects and developers to set up the backend and test the service. I collaborated with the business team to organize a support team and designed the chat interface to match company branding.
Partway through the project, we discovered that our chosen service didn’t work with Internet Explorer. My team and I moved quickly to pivot to another service and even found a lower price, saving money for the company.
The project stayed on-schedule, and I presented it to the CIO a few weeks later for approval. I also trained the internal support team on the interface, and the chat service went live on the website soon after.
…In further pursuit of ease-of-doing-business, GuideOne has implemented and is currently piloting live chat, Beving relates. “This allows our agents to receive live assistance for their questions about the product, underwriting, and the GO BOP portal navigation,” he explains. “Once the pilot concludes, we plan to release this functionality to all portal users. In addition, we are actively reviewing all questions and answers so that we can build chat bot functionality to answer commonly asked questions instantaneously.”
– Insurance Innovation Reporter
The best part of this project was creating a great user experience! The chat app launched with a redesigned agent portal, and the support team received over 100 messages from users on the first day alone. It allowed users to directly reach out with subject-matter questions, website support issues, and feedback on the new portal. This feedback channel was valuable both for external and internal users, and good UX helped everyone transition easily to the new portal.
Content Mode Awareness
Role
Interaction designer, liaison to UI team, stakeholder manager, dev team advocate
Project
One of my first big projects at Workiva was Content Mode Awareness. Users had trouble distinguishing between two content modes, editing and metadata management. The metadata mode was also difficult to discover.
First, I evaluated qualitative research data for themes. One major problem with the existing interface was that the UI for both modes looked almost identical. Data showed that users were effectively blind to the feature intended to distinguish the two. Also, the single entry point for the metadata mode was hidden on a little-used tab. I used these themes to guide my concept explorations.
A competitive review of mode awareness in G Suite, Adobe XD, and other apps provided patterns to explore. I sketched 15-20 ideas for differentiating modes and another 10 for entry-point discoverability, then narrowed them down based on technical constraints and collaborative feedback. To share the project with my development team, I created a mid-fi click-through prototype in Balsamiq and presented research data supporting my design choices.
One option for entry-point discoverability involved changes to ecosystem patterns, the application’s building blocks of code and UI. I worked with the UI team and developers to estimate the feasibility of this change, and we determined that it was out of scope. Instead, I altered the design to deliver a Minimum Viable Product that required less development effort while still meeting user needs. Further refinements are slated for a second development pass.
Both the discoverability and mode differentiation designs were approved, and are currently in development. Once the project is live, I’ll collect and analyze additional data via analytics and user testing to further validate my assumptions.
Paper Airplane App
Role
UX Designer, UI Designer
Problem
Making paper airplanes is fun and educational, but it can be difficult to find or follow instructions for advanced designs. What if there were an easy way to find new airplane designs, follow along with instructions, and create your own? I decided to mock up an app called Airplane Builder.
This was a personal project with limited scope, so I created three personas to define my audience and their needs.
Gabby, age 10
Gabby is a budding engineer and loves making paper airplanes. She wants to be just like her older brother, Sam! Gabby gets distracted easily, but she can spend hours with digital devices, especially her tablet. She wants to experiment with building new plane designs – all by herself. She also wants to build her favorite designs over and over and show them to her friends.
Sam, age 20
Sam is Gabby’s older brother. He’s in college, studying to be an engineer, and he misses his sister. Sam is a creative guy and likes to build and invent. Before he went to college, he spent a lot of time building things with Gabby. He’d like to see what she’s working on and send her blueprints for new airplanes he creates.
Danielle, age 42
Danielle is Sam and Gabby’s mother. She’s a tutor and works from home most of the time. When Gabby’s not at school, Danielle is constantly looking for things to keep her busy. She doesn’t mind Gabby playing with her tablet as long as she’s actively engaging with her environment too. Paper airplanes are a great way to keep Gabby busy! But Danielle does worry about privacy and keeping Gabby safe on the internet.
Acceptance Criteria
With these personas in mind, I created four acceptance criteria for Airplane Builder:
- The app is usable and attractive for both children and adults.
- The app can share designs between users, but there is no chat feature.
- The app saves the user’s favorite designs to build again.
- The app allows users to upload their own designs.
Ideation
The plane designs appear in a rotatable 3D space, which makes the folding instructions easier to understand. I sketched ideas for screens and devices in marker; since Gabby would most likely use a tablet, I focused on a mobile-oriented design. Interactions with a design occur through swipes, taps, and pinches. I also considered empty states and error screens.
Errors and empty states
Mockup
I used Adobe XD for the mockups. I carefully mapped a user flow, then filled in each stage with UI elements and instructional text, fleshing out screens, error states, and menus.
I kept the app design simple so children and adults could use it. Instead of in-app sharing, it integrates with existing social media like Facebook so users can safely share content. Each plane design has a difficulty level and flight characteristics, so users can discover new types of plane. In the instruction screens, the paper is shown before and after folding.
The app includes a library that saves designs and shows users their stats. It also gives instructions for users to photograph and upload designs.
Evaluation
This was my first UX project, so it never went through a development cycle or user testing. Looking back, I would like to clean up the UI design, especially the size of icons and buttons. I would also redesign the Settings page for better figure/ground contrast on the text. But this project helped me enormously in understanding the UX design process, and it was a lot of fun to make!