Mission Control was a project that three classmates and I worked on during my final two semesters of the Human Computer Interaction master's program at IUPUI. For this project, we were paired with a local Indianapolis company (referred to as DoT from here on) to design a system and apply our educational training in a "real-world" setting. We met with one of the business owners, discussed their needs, completed preliminary research, conducted interviews with their clients and potential users of the product, and then designed and tested a system that would later be implemented into their current business practice.
Executive Summary
DoT had noticed various gaps in how they were keeping up communication with clients and wanted a way to make that communication more efficient and active. Our team worked closely with one of DoT's founders to assess the needs the product would have to address.
To understand target users and their activities, our team interviewed current and potential DoT clients and reviewed comparable existing products. After analyzing the data gathered from this research, we decided to design an enhanced web application that allows file sharing, supports various means of communication between DoT and clients, and provides a visual presentation of the Buyer Insights process.
We designed a version of a Mission Control portal that allows DoT clients to track upcoming events, organize and share files, comment on files, and visualize the Buyer Insights process through a timeline.
DoT is a company based in downtown Indianapolis that specializes in providing customer insights to clients looking to improve the way products are experienced by their customers. The current process that DoT utilizes is called Buyer Insights. This data-driven process involves continuous communication between various members of the DoT team and the client.
The Process

Analysis of Research

The team created an empathy diagram to better understand DoT's current clients. An empathy diagram focuses on an individual's goals, pains, and gains around a product while keeping their daily routines in mind. We completed the diagram using the data collected from our interview with a current client as well as our personas.
To help us envision the different types of target users for our application design, we created two personas. The first persona depicts a manager at a small, local, non-technical, busy company who is looking for a small Buyer Insights service package. The second is a manager at a large, non-local, technical, busy company who is looking for the full Buyer Insights package plus the website redesign option. The key insights from these personas were that some target users are not tech savvy, that they want to help their own clients above all, and that they have hit an obstacle in their current business or marketing strategy.


Summary of Research
A common theme in our research and analysis was a lack of communication and transparency between DoT and their clients during the Buyer Insights process. The process involves many stages, and DoT does not communicate with the client during all of them. A client may start emailing or calling DoT to ask what is taking so long, and DoT may not always respond in a timely manner. We also found that the number of meetings clients were expected to attend could be overwhelming and hard to fit into their schedules. For example, DoT holds weekly phone calls with the client to talk about updates, and some of these calls last 45 to 90 minutes.
Communication, in almost anything, is the key to success, and our goal was to support quicker, more transparent communication in our design. Uniformity and organization are also important aspects of DoT's service: it would benefit clients to have all of their information, files, timelines, calendar, and so on available in one place, organized and easy to find.
Low-Fidelity (Lo-Fi) Prototype
High-Fidelity (Hi-Fi) Prototype
User Testing Results
The purpose of the user testing was to collect feedback from representative users about the utility, usability, and user experience of the design exhibited in our high-fidelity prototype.
For our usability test, we had a total of eight participants: one was a current DoT client, one was a graduate student at IUPUI, two were undergraduate students at IUPUI, and the remaining four were professionals in the Indianapolis area.
Remote Usability Test
We used Zoom to conduct a remote test with the current DoT client. First, we introduced ourselves to the participant. Then, the team member serving as facilitator shared their screen so the participant could view and read the consent form. Next, the facilitator asked the participant background questions. Before starting the test, the facilitator showed the participant the tasks page in Google Docs and the prototype webpage in another tab, then gave the participant control of the mouse. After each task was complete, the facilitator asked the participant to respond aloud to a post-task questionnaire. Finally, the facilitator asked the participant to complete a post-test satisfaction questionnaire on Google Forms.
In-Person Usability Tests
Before participants started the test, we gave them the tasks page and opened the prototype on our laptops. After each task, the facilitator administered the post-task questionnaire, and at the end the facilitator asked participants to fill out the post-test satisfaction questionnaire on Google Forms.
The tasks we asked users to complete were: 1) finding a report and adding a comment to it, 2) finding a future event, 3) locating details about the kick-off meeting, and 4) changing notification settings.
After conducting the tests to discover the prototype's strengths and pain points, our next step was to analyze the results. We ran four analyses, described below.
Test 1: Scenario Completion, Critical Errors, and Non-Critical Errors Analysis
After analyzing our results, we found a total of six non-critical errors. Four of these came from Task 1A (logging into the system): because of the wording in our script, some participants asked what their login credentials would be. The prototype accepts any input and still logs the user in, but we did not account for this when writing the script, so these moments of confusion were recorded as non-critical errors. The other two non-critical errors were made on Task 1B and Task 4 by two different participants. Tasks 2 and 3 had a 100% completion rate.
Two critical errors took place, one on Task 1B and one on Task 4, and in both cases the participant was unable to complete the task. The critical error on Task 1B occurred because the comment button could not be clicked, and the participant could not scroll the page to bring the button into a clickable position, leaving them unable to add a comment.
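As a minimal sketch of how completion rates and error counts like these can be tallied from per-session records (the rows below are hypothetical placeholders, not our actual test logs):

```python
from collections import defaultdict

# Hypothetical per-participant task records: (participant, task, completed,
# error). These rows are illustrative placeholders, not our actual logs.
# error is None, "non-critical", or "critical".
records = [
    ("P1", "Task 1A", True,  "non-critical"),
    ("P1", "Task 1B", True,  None),
    ("P5", "Task 4",  False, "critical"),
    ("P6", "Task 1B", False, "critical"),
    ("P6", "Task 2",  True,  None),
]

stats = defaultdict(lambda: {"attempts": 0, "completed": 0,
                             "critical": 0, "non_critical": 0})
for participant, task, completed, error in records:
    s = stats[task]
    s["attempts"] += 1
    s["completed"] += int(completed)
    if error == "critical":
        s["critical"] += 1
    elif error == "non-critical":
        s["non_critical"] += 1

for task, s in sorted(stats.items()):
    rate = 100 * s["completed"] / s["attempts"]
    print(f"{task}: {rate:.0f}% completion, "
          f"{s['critical']} critical, {s['non_critical']} non-critical")
```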
Test 2: Post-Task Questionnaire Data Analysis
The mean post-task ratings (5-point scale, 1 = Hard, 5 = Easy) were 4.1 for Task 1, 4.8 for Task 2, 5.0 for Task 3, and 3.9 for Task 4. For Task 1, the majority of participants (62.5%, 5 of 8) gave a rating of 4. We then asked participants why they chose their ratings; P6, for example, gave a 4 because P6 "was unable to click on the comment button, was confused over whether the report was just a picture or not." For Task 2, the majority (75%, 6 of 8) gave a rating of 5; for Task 3, every participant (100%, 8 of 8) gave a rating of 5; and for Task 4, half of the participants (50%, 4 of 8) gave a rating of 5.
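As an illustration of how these summary statistics are derived, here is a short sketch; the rating arrays are hypothetical stand-ins chosen only to reproduce the reported means and percentages, not our actual raw responses:

```python
from collections import Counter
from statistics import mean

# Hypothetical post-task ratings (1 = Hard, 5 = Easy) for 8 participants.
# Stand-in values only; they are not the study's raw data.
ratings = {
    "Task 1": [4, 4, 4, 4, 4, 5, 5, 3],
    "Task 2": [5, 5, 5, 5, 5, 5, 4, 4],
    "Task 3": [5, 5, 5, 5, 5, 5, 5, 5],
    "Task 4": [5, 5, 5, 5, 4, 4, 2, 1],
}

for task, scores in ratings.items():
    top, count = Counter(scores).most_common(1)[0]   # most frequent rating
    share = 100 * count / len(scores)                # share of participants
    print(f"{task}: mean {mean(scores):.1f}; "
          f"most common rating {top} ({share:.1f}%, {count} of {len(scores)})")
```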
We received useful feedback from the post-task questionnaire, and some comments overlapped. For Task 1B, for example, both P1 and P5 suggested that when adding a comment, each icon should show a label on hover.
Another issue participants raised was the settings icon. For example, P5 stated, "No meaning of what the gear icon meant. I didn't know that the gear icon meant settings. Should have a hover to see labels."
Test 3: Post-Test Satisfaction Questionnaire Data Analysis
From the post-test satisfaction questionnaire (the Google Forms survey), 75% of participants strongly agreed with the statement "this system was easy to use," and the remaining 25% agreed. For the statement "the overall look and feel of the site was appealing to me," 25% strongly agreed, 62.5% agreed, and 12.5% were neutral.
The common response to "Was there anything about the prototype that you especially liked?" was that the website is simple and easy to use. For example, P1 said, "I liked the simplicity of it," P2 said, "I liked how easy it was to navigate," P3 said, "it's clear enough," and P5 said, "it was easy to navigate through."
Two suggestions for improvement came up repeatedly: the look of the website and the settings. For example, P3 and P8 said, "just need more color on the webpage for the icon" and "aesthetics could be improved. Very basic." Lastly, we received some unique suggestions; P1, for example, said, "I think there could be a news feed that shows all the recent activity within the project or all of what would be notifications."
Test 4: Time-on-Task Analysis
Based on our time-on-task results, the tests went well. Most participants were able to complete each task in a timely manner, and when a participant did not finish a task, we still recorded the time spent. The average time was about 76 seconds for Task 1A, 51 seconds for Task 1B, 40 seconds for Task 2, 21.5 seconds for Task 3, and 81 seconds for Task 4. P5 and P6 were the only participants who did not complete a task: P5 spent 363 seconds (about 6 minutes) before giving up on Task 4, and P6 spent 90 seconds before giving up on Task 1B. For those two tasks, the plan going forward is to fix the issues participants identified and then retest to confirm no further problems remain.
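As a brief sketch of how the averages can be computed when failed attempts still count toward the total (the timings below are placeholders, not our recorded values):

```python
from statistics import mean

# Hypothetical time-on-task measurements in seconds, as (seconds, gave_up)
# pairs. Placeholder values only; they do not reproduce our recordings.
times = {
    "Task 1B": [(45, False), (50, False), (90, True)],   # one give-up at 90 s
    "Task 4":  [(60, False), (70, False), (363, True)],  # one give-up at 363 s
}

for task, attempts in times.items():
    # Give-ups still count toward the average, matching our approach.
    avg = mean(seconds for seconds, _ in attempts)
    give_ups = sum(1 for _, gave_up in attempts if gave_up)
    print(f"{task}: average {avg:.1f} s over {len(attempts)} attempts "
          f"({give_ups} gave up)")
```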
Overall, the results of our usability testing were mostly positive. When analyzing ease of use, the majority of users rated each task as easy. Two tasks had results worth further discussion: Task 3 was rated easy by all users, while Task 4 drew the widest range of responses, from 1 to 5. For Task 4, we asked users to go to the settings icon and change the notification settings to match their preferences.
Reviewing our session notes, we found that a common complaint about Task 4 concerned the icons: some users were confused about what the gear icon represented. One user understood that the gear icon meant settings but wanted to be able to let important emails still come through. The takeaways from this task are to eliminate the ambiguity of the icon and to clarify the notifications by allowing more in-depth selections. These takeaways will be addressed in future design requirements: the settings icon will be replaced with either a different icon or the word "settings," and the notifications will be more customizable and will include a description of their purpose.
Conclusion
Our main finding from the research phase of the project was a communication gap between DoT and their clients. As a team, we wanted to tackle this issue head on and improve DoT's means of communication. Clients of DoT mentioned that they felt lost throughout the process, that they would email but not receive quick responses, and that documents were hard to find.
Our team came up with a new concept for Mission Control: a website designed to address these client needs. We created an interactive timeline for clients to follow along with, so they can see which phase of the project they are in without needing to receive an email or reach out to someone.
We added a Files tab to the navigation bar, which lets clients search for any files they may need, organized by project phase, making documents easier and faster to find.
Lastly, we added a notification center so clients can be updated in real time whenever changes are made: when one phase is ending and a new one is beginning, when documents have been added, or simply as meeting reminders.
Our team addressed every communication concern DoT clients raised. The clients we interviewed loved the concept of Mission Control and felt it closed the communication gap; they hope to see DoT implement it in the near future.
We have handed off our designs, research, and other materials to DoT to review and implement as they see fit. Mission Control is now in DoT's hands.
***If you would like to read more about this project, please contact me and I'd be happy to share more.***