Operators consume a lot of data daily, but how do they approach the data available to them? Do they look for data based on the task at hand, or by data type? That was the next question we had to answer. My colleague and I had some guesses based on what we heard during interviews, and we decided to test them with operators using mockups.
Figure 6 Three approaches to information architecture tested in first round of usability testing.
From Round 1, we learned that rather than looking for data by task or type, operators expected to look for information by pad, because all their tasks are pad-centric. We created a second set of information architectures that group data by pad and put them to the test. This set consisted of two new approaches: one required operators to first select their phase of the day, the other to first select their entity (pad, well, etc.).
Figure 7 Two approaches to information architecture tested in second round of usability testing.
From this exercise, we learned a lot about how to better present the various operational data to operators. A couple of highlights:
- The ability to see critical data together in context outweighs issues with discoverability.
- Operators also wanted to see data at the run level (a run is a collection of pads), since each operator is assigned a specific run.
When we started creating mockups and laying out the end-to-end workflows, we explored various ways to integrate operational data and features from different source tools into a single application, from embedding with an iFrame, to opening the source tool in a new tab, to replacing the old tools entirely.
Figure 8 Four different ways to reach source tool data through the Tool.
When we shared these approaches with the client, we were informed that the Tool would be read-only, with no writing back to source tools. With this constraint in mind, we moved forward with opening the source tool in a virtual window, so operators could still take actions such as creating a task. Though not ideal, this approach offered a more immersive experience compared to opening a new tab, and more flexibility compared to an iFrame embed.
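The trade-off between these launch approaches can be sketched as a small decision helper. This is purely illustrative, not the Tool's actual code: the function, URL, and field names are hypothetical, and the logic simply encodes the reasoning described above (read-only constraint pushes actions out to the source tool, and a virtual window keeps the operator in context).

```typescript
// How a source tool gets opened from the unified Tool (hypothetical sketch).
type IntegrationMode = "iframe-embed" | "new-tab" | "virtual-window";

interface LaunchPlan {
  mode: IntegrationMode;
  url: string;
  keepsToolContext: boolean; // does the operator stay inside the Tool?
  allowsActions: boolean;    // can the operator act, e.g. create a task?
}

// Decide how to open a source tool, given whether the Tool is read-only.
function planLaunch(url: string, toolIsReadOnly: boolean): LaunchPlan {
  if (!toolIsReadOnly) {
    // With write-back enabled, a plain embed keeps everything in one place.
    return { mode: "iframe-embed", url, keepsToolContext: true, allowsActions: true };
  }
  // Read-only Tool: actions must happen in the source tool itself, so open
  // it in a virtual window rather than a new tab to preserve context.
  return { mode: "virtual-window", url, keepsToolContext: true, allowsActions: true };
}
```

The helper makes the constraint explicit: once the client later allowed write-back, only this one decision point had to change.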
To assess how acceptable a read-only "single pane of glass" would be, I created a prototype and tested its usability with operators.
Figure 9 Screenshots of prototype used for testing task creation in virtual window.
During testing, operators expressed a strong desire to complete work, such as creating a new job request and responding to tasks, directly in the Tool. This is because:
- Operators expected to do everything in a single application. Having to launch a source tool was unexpected.
- Having to jump between tabs to complete a single task was exactly what they were already doing; it wouldn't solve the problem.
Backed by these responses, we explained to the client how splitting users' flow across several tools would lead to inconsistent interaction and branding experiences and increase the risk of low adoption. We convinced them to enable data write-back to source tools wherever APIs were available.
With data read and write as the new approach for the Tool, I went back to ideation to incorporate more action features, such as the ability to respond to tasks in the Tool. I also saw this as an opportunity to redefine some terminology.
For example, historically a "task" was something a specialist verbally asked an operator to do. Aside from "tasks", many other action items get assigned to operators through various tools, each under a different name. I wanted to challenge whether we could combine them all under the "task" umbrella.
I went on to explore different ways of allowing operators to respond to tasks in the Tool.
Figure 10 Three approaches to allow task response in the Tool.
When consulting with operators and business stakeholders on these ideas, their responses were:
- They preferred a slide-out panel for task responses, to maximize the real estate available for listing tasks.
- They preferred distinguishing tasks by type rather than by source, because different types require different responses and generally vary in urgency.
Figure 11 Final design for task page.
Through rounds of iteration and testing, my colleague and I were able to determine the fundamental data architecture for the Tool. We used the usability testing results to show the project team the pros and cons of continuing with a collection of separate apps versus unifying them into one, more workflow-driven interface. By making these decisions early in product planning, before development, we were able to:
- Save development time
- Provide users with a more consistent experience
- Reduce maintenance costs by replacing four existing applications and tools with the Tool
The final design proved to match operators' mental models well. Training time on this application was significantly lower than for the historical applications: most operators reported needing weeks to get used to previous applications, but with the Tool only a half-hour walkthrough was needed to learn how to operate it. Operators reported it being "very easy to use".