FOCUS BRANDS: INTELLIGENT ANALYST
Ask anything and everything.
With the vision of making data-driven business decisions and planning, FOCUS Brands recently established its Business Intelligence (BI) department with Microsoft Power BI. Power BI allows data to be presented visually as “dashboards.”
As part of reporting, these dashboards help employees view and dig into data relevant to their areas of work for insights and trends. But, as with any new implementation, FOCUS Brands is currently challenged with increasing the adoption of Power BI at the corporate level. How can FOCUS Brands make these dashboards more effective for their employees?
Overview of the Problem
FOCUS Brands is a global franchisor headquartered in Atlanta, Georgia that owns several familiar food-service brands – from Moe's Southwest Grill to Auntie Anne's, to name a few. As one can imagine, they deal with a lot of data. Essentially, numbers are pooled in from different sources that tell how "well" the stores are doing both domestically and internationally. Employees of FOCUS Brands then turn them into internal reports for executives and directors to identify areas of business intervention and opportunity.
With the vision of making data-driven business decisions and planning, FOCUS Brands recently established its Business Intelligence (BI) department. Business intelligence is much like business analytics, except that it doesn’t tell users what to do - instead, as mentioned above, it is used to derive insights and trends, with the data presented in the most suitable and understandable way possible. FOCUS Brands selected Microsoft Power BI, which leverages software and services to visualize past and present situations based on the data gathered by different organizations within the company. Power BI is one of the most popular tools on the market, competing with offerings from SAP, Tableau, Oracle, SAS, and IBM.
However, as with many new corporate-level technology implementations, FOCUS Brands was faced with a low adoption rate. In other words, people weren’t using Power BI as much as they should be. (Ironically, even Microsoft – the developer of the tool – had faced the same issue when rolling out Power BI to its own employees.)
In August 2018, FOCUS Brands came to us in hopes of finding where users were having problems and what could be done to increase the adoption rate. They encouraged us to think outside the box in terms of solutions.
Design Result: Intelligent Analyst
In addressing the user experience issues we found with Power BI, we created an Intelligent Analyst inspired by Microsoft's Cortana. The Intelligent Analyst works as a “wizard” within Power BI that assists the user at all times, ready to provide instant information. The user can ask it to navigate somewhere specific within the dashboards, show specific data/insights, and answer questions encountered while browsing through Power BI. All in all, the point is that the system can take in any contextual information and quickly return relevant information that helps users with their daily jobs.
This was a semester-long project (August – December) that went through a full design process.
My Contribution: UX Researcher + Designer
Research. Explored Power BI to understand the current product. Collaborated with teammates to plan on-site visits. Conducted a contextual inquiry with an employee that informed the task analysis. Directly interviewed a number of stakeholders. Facilitated the affinity mapping sessions to synthesize the research results.
Iteration. Actively facilitated the brainstorming session based on the synthesized data and contributed to conceptualization. My Cortana-based concept became the basis for further iterations and development.
Prototyping. Provided rough sketches of wireframes for a teammate to mock up digitally. For the final prototype, I worked with the teammate building the hi-fi version to finalize the features – ensuring that the design decisions were sound and evidence-based.
Testing. Drafted the protocols (instructions, what to do, and questions) for the teammates to execute on-site. Helped with cleaning and analyzing the qualitative data post-testing.
Final Wrap-up. Presented a summary of the process and results to FOCUS Brands.
Overall. Actively communicated with the FOCUS Brands’ liaison/stakeholders to schedule meetings and interviews as well as provide weekly updates.
As Microsoft itself recognizes that “consumption requires consumable products,” increasing the adoption rate of Power BI at FOCUS Brands is ultimately about making Power BI consumable. Data analytics and business intelligence are paradigm shifters within an organization - and when the employees (synonymous with the “users” in this case) already practice widely different ways of doing things, many problems can be solved and avoided if the users are better understood.
Therefore, a series of research methods was deployed to gain more insight into the contexts and needs of the employees at FOCUS Brands. Our overall progress was as follows:
It is important to note the distinctions between the stakeholders as the problem affected multiple personnel across FOCUS Brands and beyond. The basics were inferred from the initial information given by our FOCUS Brands liaison; later, they were confirmed as we investigated more during the research phase.
Primary. Corporate employees are the primary users accessing the dashboards for various needs: executing reports, referring to the data to keep themselves up-to-date, and using the visualizations to help themselves understand the trends better.
Secondary. Secondary stakeholders consist of our client, the BI Team at FOCUS Brands. They are closely related with the problem, but are not direct end users of the product in question. They want to find ways to make BI more useful for the corporate employees.
Tertiary. Tertiary stakeholders are non-employees like franchisees and managers (store owners) who are exposed to and impacted by the work produced and decisions made by FOCUS Brands employees through Power BI.
Consequently, in understanding the users within this problem space, these research questions were used as guidelines for probing:
How can we improve the current dashboard system to streamline the usage process?
In what ways can the BI dashboards serve needs across all levels of the end user groups?
Furthermore, the overall research goals were to:
Find out what obstacles are hindering the users from adopting the BI dashboards.
Find how to make the dashboards more effective by allowing users to get the data and insights that are catered to their needs.
Initial Product Exploration: What even is Power BI (AKA, “dashboards”)?
FOCUS Brands granted us access to a discounted Power BI account that enabled us to explore the product in action on our own. Additionally, our FOCUS Brands liaison offered keynotes and conversations that gave us a general background on how Power BI is currently utilized:
Contextual Inquiry: Stop hypothesizing!
Information Goals & Data Desired to Collect
To see the dashboards in action and in “context”
To know how a typical user carries out their work practice, step-by-step
To find the relationship between the user and the dashboards
To understand how dashboards help their job
To discover any problems during the process
This method was conducted in our first on-site meeting at FOCUS Brands; consequently, it was our first time observing the real users of the dashboards. We had an inkling of how these dashboards were used from our liaison's accounts and through initial background research, but we had never seen how the users actually interacted with them.
In this meeting, we gained concrete information on what was happening from the user’s perspective – instead of hypothesizing from second-hand data. Putting the user in the expert role helped us learn how a typical user went about using the dashboards; as a result, we were able to gain data on the step-by-step procedures that painted the picture for the later task analysis.
This method was constrained to what the user chose to show in the moment and therefore could only capture what was happening at each step.
Additionally, it was a less intimate process: the user was focused on “showing” the procedures, and the interviewer was more concerned with absorbing the knowledge than with connecting personally with the user.
Data regarding experiences spanning multiple scenarios and histories, along with more personal opinions about the dashboards, were gained through semi-structured interviews with other employees in the next step of the research process.
Semi-structured Interviews: Investigating the users’ opinions + feelings.
Information Goals & Data Desired to Collect
To obtain detailed insights and attitudes on Power BI
To further widen our knowledge of the users
To seek areas of opportunities
We decided that something more free-form would be more appropriate as we were still in the beginning phase of research, and there were still many unknowns. The casual and conversational nature of this method allowed us to gain more in-depth responses that explained certain behaviors and patterns that we were starting to see at the on-site visit.
Consequently, we were aware that parsing the data from the interviews would take some time since we could not predict the exact answers. But we knew that the richer information would allow us to diverge in many different directions for design implications, and we would avoid sorting ourselves into preconceived notions. The interview was still guided with “core” questions to probe for answers aligned with our initial objectives, but we kept ourselves open and let the interviewees paint the picture for us.
Of course, the data represented by the interviewees are not fully representative of the dashboard users at FOCUS Brands. It is very possible that there are users completely different from the ones we interviewed, and that we did not capture these people.
Information Goals & Data Desired to Collect
To generate an organized hierarchy of the process
To identify pain points that translate into design implications
This method was for understanding the overall procedures in using the dashboards and identifying where exactly the pain points were happening. The task analysis was a necessary tool in helping us imagine the process from the user’s perspectives as best as we could.
The task analysis was triangulated from the information that we obtained through contextual inquiries and interviews. Therefore, it may not be a full representation of those outside of our knowledge - it may be excluding their tasks.
All qualitative results - from the contextual inquiry to the interviews - were reviewed, and the common usage steps and pain points regarding the dashboards were pulled out. Some steps were differentiated based on different cases. This information was synthesized into a generalized task analysis so it could apply to almost all interviewees across all organizations.
Information Goals & Data Desired to Collect
To find patterns to the issues, requirements
To get to the mental model of the FOCUS Brands employees
As a team, we analyzed our notes from the semi-structured interviews and other resources to come up with the interpretations that seemed most plausible and coherent. We used a different color for users from each department (e.g., red for Finance), and different users within the same department were indicated with the labels U1, U2, etc.
Out of the sorting process, the most notable categories we interpreted involved information architecture, data, and training (level of user knowledge):
Unfriendly Information Architecture. Across the collected data, users of various types encountered issues presented by Power BI. The highest number of issues fell under Power BI's information architecture, which added to the users’ cognitive load.
For example, there were many different ways of using the filters that were supposed to help the user slice and dice the data. One dashboard would have certain filters that were unavailable in other dashboards. Some dashboards would be cluttered with filters that the user found irrelevant. These filters could neither be saved to the user’s preferences nor be automatically presented.
Since the organization of the dashboards was not standardized, users had difficulty navigating through the layers of the dashboards to find what they wanted.
Some users could not understand the meanings of the words/data presented on the dashboards.
Outdated UI. Some interaction and visual design issues were present that fragmented the user experience in Power BI.
For instance, the discrepancy between the typefaces used and the overall feeling of Power BI (e.g. Times New Roman for a system meant to be “friendly”) seemed jarring.
The product is meant to be technologically advanced, but the general interactions and presentation of the interface were archaic rather than modern, and static rather than fluid.
Unreliable System. The data being used in Power BI instilled some level of distrust among the users, who found it unreliable at times.
Some of the mentioned problems were that data were not pushed real-time, data were not 100% clean, and sources of the data seemed doubtful.
The data infrastructure caused the system to be slow at executing its jobs (e.g. loading the visualizations) and at reflecting changed parameters.
Expertise Gap. There was a huge expertise gap among the employees.
Our interviewees were considered the few “star” users of Power BI because they knew their way around the tool. However, the rest of the employees weren’t quite as technically savvy, and they came to our interviewees when they got stuck with the dashboards.
Interviewees expressed that more active training was necessary for others to recognize the dashboards’ full potential and become more self-sufficient in their jobs.
Personas: Focusing design directions – who are important?
Building on our findings and observations, we generated 3 target personas with varying needs and levels of exposure and technical skill related to Power BI, as expertise had a major relationship with adoption across users. These personas and their characteristics are:
Frequent + Tech-savvy. Users who were part of the BI implementation in its “startup” phase; they have their dashboards customized accordingly to their needs. However, they’re still affected by the issues within Power BI and would like to see improvements.
Infrequent + Varied Technical Skills. Users who do not frequent Power BI and can also rely on other resources for their data-driven work if Power BI doesn’t deliver their needs.
Left in the Dark. Users who are completely unaware of Power BI and/or are very recently acquired.
Moving forward, we decided to focus on Persona Group #2: those who aren’t entirely satisfied with Power BI. This is because solving their issues would be the most impactful in directly raising the adoption rate. Consequently, Persona Groups #1 and #3 are considered “secondary stakeholders” – the benefits from addressing Persona Group #2 will trickle down to them.
Design Implications: So what do these all mean?
To recap, here is what’s been gathered so far – based on the synthesized results, the findings were further abstracted into five major high-level categories of issues:
Inefficient “exploring periods.” From the initial research phase, one of the biggest problems we saw with the current Power BI system was that many users wasted time “exploring,” since the organization within the dashboards was labyrinthine rather than upfront. Consequently, users had to wade through layers of screens to get anywhere within the dashboards and find what they needed.
Inconsistencies within the system. Another issue was that the current system was riddled with organizational inconsistencies. For instance, one dashboard would have certain filters unavailable in other dashboards; some dashboards would be cluttered with irrelevant filters. Moreover, these myriad combinations of filters could neither be saved to the user’s preferences nor be automatically presented.
Learning curve. For users who were not particularly savvy with technology - the majority of the employees at FOCUS Brands - these issues weighed on their cognitive workload and made them apprehensive of Power BI. There needed to be an effective way of reducing that workload so these users would feel less reluctant about utilizing Power BI.
Need for more Context behind the data. Users wanted to be able to pinpoint the problems with the data and know more about the trends and the reasons behind the numbers. For example, users wanted to have more resources from previous years and view insights beyond the numbers provided on the dashboards.
Unavailability of dashboards. Some expressed that sometimes the dashboards didn’t even have what they were looking for and/or they were unable to find the necessary feature; when this occurred, users exited Power BI and found different outlets. Some interviewees revealed that those who weren’t proficient enough to find different outlets would ask someone else to find the information for them.
Then how can we solve these issues?
As such, the overarching design implications that would effectively address the above issues are as follows:
Ideation + Iteration
We translated the implications into several design ideas in the form of rudimentary digital sketches (or “mockups”). Among the concepts, one of my ideas was based on Microsoft Cortana, intended to spare users from wading through layers of screens to get somewhere within the dashboards and reports, all the while integrating naturally with the current Power BI system.
Feedback Session #1: Concept convergence.
The goals were:
To receive general impressions on each concept’s functionalities from the users
To narrow down the concepts by taking the users’ suggestions and combining strengths
To create a more defined set of wireframes for the next feedback session
After the feedback session, the team sat together to go through the feedback notes and analyze the findings. We wrote the findings on a spreadsheet and discussed the implications of the responses that we received in the feedback.
All but one of our users preferred an intelligent assistant over the other solutions.
Our users really liked the request-sending feature from the FAQ and News Feed concepts.
Additionally, the team chose my Cortana concept for its creativity (newness factor) as well as for its familiarity in product integration.
Consequently, we picked the top-rated features from the concepts and merged them into the Intelligent Assistant. As such, my Cortana concept became the framework for further design iterations.
As the provider of the original idea, I sketched the basics of the Intelligent Assistant’s features, which were mocked up into digital wireframes and presented in the next feedback session.
Feedback Session #2: Making improvements.
Moving forward, we organized a second feedback session at FOCUS Brands’ premises to gather data and reactions in person.
There was a positive consensus on the Intelligent BI Assistant concept – all our users were very positive about the capabilities and usefulness of the solution.
However, some concerns and apprehensions regarding feasibility and logistics also came to light, which helped us make improvements to our existing design.
After summing up the issues in our system, we did another online brainstorming session to think about how we could leverage their feedback and suggestions to further improve our features and design. We grouped the issues that we saw and came up with solutions that could touch on all of them.
Designing the Prototype: The Drill Down
In finalizing the design prototype for further reviews and evaluations, more design improvements were made in this phase in an effort to increase fidelity.
My major contribution in particular was hammering out the features included in the design based on the latest feedback we received in the previous session.
Scenario 1: The user knows exactly what they want. This scenario appeals to the tech-savvy, frequent users of Power BI, for whom the Intelligent Analyst can find information quickly. Based on their past activities, the Intelligent Analyst will pull in the report with the appropriate filters.
Scenario 2: The user has ambiguous questions. They don’t know exactly what they want from Power BI. The Intelligent Analyst can then help them make a business decision; it can also collate existing information and prepare it for the user’s reference.
This scenario is where the user is not sure of exactly what they want, but can still reach out to the Intelligent Analyst for help. Here we have replicated historical analysis in making a business decision. The system will listen to the user, ask for more information if necessary, and collate multiple reports in case the information sought doesn’t exist exactly within the dashboards. The user can then go through the collated reports to synthesize the information they are looking for and make their business decision.
This also addresses the issue that some users were confused about how to describe the report/dashboard they needed. The AI is meant to accommodate this situation, as it will be smart enough to work through the ambiguity.
Scenario 3: Power BI does not have what the user wants and prompts them to raise a request.
The last scenario is where the requested information doesn’t exist in Power BI. When the user requests the information, the Intelligent Analyst conducts a thorough search to make sure the information truly isn’t there. Once the system realizes that the requested data doesn’t exist, it provides a request form to the user. With this form, the user can supply all the information and context they want in a potential new dashboard. After completion, the form is sent to the BI Team for review, allowing them to be more aware of Power BI users’ needs.
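The three scenarios amount to a simple routing decision: exact match, ambiguous match, or no match. Below is a minimal sketch of that logic, assuming a toy in-memory catalog – every name here (`CATALOG`, `handle_query`, `RequestForm`, the report names and filters) is invented for illustration and is not part of any real Power BI API:

```python
from dataclasses import dataclass

# Illustrative in-memory "catalog" standing in for Power BI's dashboards;
# report names and their available filters are invented for this sketch.
CATALOG = {
    "monthly sales": ["region", "brand"],
    "store performance": ["store", "quarter"],
}

@dataclass
class RequestForm:
    """Form forwarded to the BI Team when the data doesn't exist (Scenario 3)."""
    prefill: str

def handle_query(query, user_history):
    q = query.lower()
    # Scenario 1: exact match -- pull the report directly, pre-applying
    # the filters the user has favored in past activity.
    if q in CATALOG:
        preferred = [f for f in CATALOG[q] if f in user_history]
        return ("report", q, preferred)
    # Scenario 2: ambiguous question -- collate related reports for the
    # user to synthesize an answer from.
    candidates = [name for name in CATALOG if any(w in name for w in q.split())]
    if candidates:
        return ("collation", candidates)
    # Scenario 3: nothing matches -- prompt a request form for the BI Team.
    return RequestForm(prefill=query)
```

A real implementation would replace the exact-match and word-overlap steps with the natural-language understanding described above; the routing order, however, follows the three scenarios as presented.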
Lastly, when the user wants to refresh their memory on Power BI, they can do so through the Intelligent Analyst as well:
When it comes to helping users learn Power BI, the current system makes them watch tutorial videos and read FAQs, which are considered time-consuming. Instead, we decided to help them by letting them go through the tutorial interactively with real tasks. Users can reach out to the Intelligent Analyst if they face any issue while using Power BI or have difficulties understanding certain features and completing tasks. The Intelligent Analyst will then take control of their screen and ask them to click on highlighted parts, showing the process step-by-step.
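The interactive walkthrough above can be modeled as a sequence of (element, instruction) pairs that the assistant steps through, advancing only when the user clicks the highlighted element. A hypothetical sketch – the step names and the `get_click` callback are assumptions for illustration, not the actual implementation:

```python
# Each step pairs the UI element to highlight with the instruction shown.
# Element names are invented for this sketch.
TUTORIAL_STEPS = [
    ("filter-panel", "Click the highlighted filter panel to open it."),
    ("brand-filter", "Select a brand to slice the data."),
    ("export-button", "Export the filtered view as a report."),
]

def run_tutorial(steps, get_click):
    """Walk the user through each step, advancing only on the correct click.

    `get_click(element)` stands in for waiting on the user's next click
    while `element` is highlighted on screen.
    """
    completed = []
    for element, instruction in steps:
        clicked = get_click(element)
        # Keep the highlight in place until the right element is clicked.
        while clicked != element:
            clicked = get_click(element)
        completed.append(element)
    return completed
```

Simulating a user who always clicks the highlighted element (`run_tutorial(TUTORIAL_STEPS, lambda e: e)`) walks through all three steps in order.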
User Testing & Takeaways
To show our design solutions to the real end users and see their reactions
To seek validation of the final design and its assumptions
In order to observe the users’ reactions, the group created 5 benchmark tasks reflecting the most important functions/features of the design. The tasks were framed as short scenarios to set the stage for the action and provide context. Below is a table of the tasks used in the actual testing and their rationales.
We would first give the users a brief background introduction to the project and our prototype. We highlighted that they would be given 5 tasks, and that for each task they would have to explore on their own. We also encouraged the participants to think aloud during the testing. The moderator would then give one task at a time to the user, while the notetaker recorded the user’s actions and comments. At the end of each session, the moderator would ask the post-questionnaire questions and let the user rate their answers. Each session lasted 30-40 minutes.
Throughout the analysis process, we saw a couple of patterns within their responses:
Users were confused about certain interactions, and they had difficulties imagining the scenarios/cases with the pre-programmed instances that we gave them. This may be because the participants were not our target users at all and because they were completely unfamiliar with Power BI.
Users suggested adding more features to help with the querying process. For example, when the user asks questions, the Intelligent Analyst should be able to auto-fill the sentences based on its existing database.
Although most of their feedback was positive, users were still skeptical about the technology and had trouble imagining the capabilities we were trying to convey with the concept.
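The auto-fill suggestion from the testing feedback – completing a partially typed question from a database of past queries – could be as simple as a prefix match. A toy sketch, where the sample queries are invented for illustration:

```python
# Invented examples of previously asked questions; in practice these would
# come from the Intelligent Analyst's query history database.
PAST_QUERIES = [
    "show me monthly sales for Auntie Anne's",
    "show me store performance by region",
    "compare franchise revenue year over year",
]

def autofill(prefix, history=PAST_QUERIES, limit=3):
    """Return up to `limit` past queries that start with `prefix`."""
    p = prefix.lower()
    return [q for q in history if q.lower().startswith(p)][:limit]
```

A production version would rank by recency or frequency and match mid-sentence, but prefix matching captures the feature users asked for.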
Reflection + Personal Challenges
Project management. Throughout this project, I took on a few roles that pushed me out of my comfort zone, like being the liaison between the team and the client, and facilitator for team meetings.
Communication challenges. Our designed system was a very ambitious effort, with AI integration and all. As many of the FOCUS Brands’ employees were not technical people, they had difficulties understanding the high-level concept.
Having humility. While we were asked to give our reasoning behind our decisions, we were also actively trained to think about the limitations and shortcomings of our justifications.
HCI Research Methods
Fall 2018 | Georgia Institute of Technology