When jumping into a fun project like this, it can be agonizing not to open your preferred design tool and start throwing concepts together. I’ve certainly been guilty of getting into the details too early. Over time I’ve learned to restrain myself, especially while ideas are still conceptual, unformalized, and pliable. Once you have established designs (especially high-fidelity ones), it’s hard to backtrack and shift away from what you already have. And when you show other team members early ideas in a very polished format, there’s a chance they’ll be perceived as more refined and finalized than they actually are, which skews the feedback you receive.
So, before we got ahead of ourselves, we needed to first ask ourselves some questions to better understand the problem we were trying to solve:
- What is the root of the problem for our users?
- Who will be using it (which of our personas)?
- What are their specific pain points?
- How do they attempt to solve the problem today?
Eventually we get to a point where we feel good about our initial, instinctual understanding of the problem we’re solving. At this point, however, it’s all just a hodgepodge of gut instinct and knowledge from our personal experiences. The next step is to validate our assumptions, or prove them wrong, with research. This involves a number of different ways of gathering information including:
- Client interviews
- Interviews with internal team members closer to the problem
- Competitive analysis (how are others solving this business problem?)
Hearing first-hand the desires of our clients and internal team members was vital in making sure we were solving the right problem. These discussions typically encourage us to adjust our initial assumptions based on the findings. This process is also key to us empathizing with our users on a deeper, more human level. A few of our big takeaways from these early conversations were:
- The CX Suite was extremely lacking in the ability to customize (and it seemed every client was looking for a different way of displaying their data)
- Clients needed multiple, unique views of the data for various roles and user groups within their organizations.
- Viewing the same segment of data in different visual formats was limited and not possible without cumbersome navigation and/or repetitive filtering.
There were many other ideas and suggestions brought up that we kept in mind, but these were the common themes we kept hearing. After the sessions, we were feeling well informed, focused, and excited to get rolling. It was finally time to start putting pen to paper.
Ideate, Iterate, and Validate
These early design stages are my personal favorite. It’s when you really get to flex those creativity and problem-solving muscles. I just love it. Based on our early discussions and research, we knew this project needed to focus specifically on flexibility and customization that would serve a wide range of needs. My personal mission was to conceptualize the user’s journey through “I know what I need” → “Show it to me in the way I want” → “Let me get it in front of the right people”. My ideation process varies from project to project, but typically involves taking notes, mind maps, storyboards, and rough sketching. I’ll also try to get some rough ideas into a clickable InVision prototype early, as it helps me and whomever I share it with get a better feel for the navigation and flow through the idea.
In just about every step along the way through my design process, I am constantly looking for inspiration and digging into how others may have solved similar problems (no sense in reinventing the wheel if there is an existing paradigm). Ideally I’m getting hands-on with these products, which can be tough when you’re talking about enterprise software. Google Image and Dribbble searches are always handy and can suffice when you’re unable to use the real thing. Fortunately, I was able to get my hands on numerous tools. This went a long way toward helping me understand how they were handling custom data visualization and dashboards, and thus how users may already expect to perform some of the tasks they’ll be doing with our software.
Time to Test
One valuable piece of ForeSee’s professional services offering is access to our extremely knowledgeable and talented usability team. Their main focus is to investigate the websites and products of our clients, helping them with any usability issues they discover. In addition, though, they take time out of their busy day and play a huge role in assisting us by facilitating our testing sessions, pinpointing issues, and recommending adjustments.
When discussing the projected timing for testing, we decided the earlier the better. Creating a completely customizable dashboard and data visualization system was a pretty massive undertaking for us. The sheer scale and complexity of this project pushed us to verify as early as possible. We didn’t want to go too far down the path we were headed if it wasn’t the right course. In hindsight, testing early with our clients like we did was absolutely the right decision. In fact, the design evolved so much after the first round that we held a second round of testing after the updates.
Here are some of the most prevalent takeaways from our testing sessions:
- We needed to clarify what a “card” was. The concept was new and we hadn’t done a great job explaining the new system.
- We had numerous terminology issues to address.
- There was confusion around “Done Editing” vs “Saving” a card (we ended up combining the concepts).
- Nearly all users tried to interact with the charts in some way (drill-in functionality is key).
- It took time for users to find the menu of dashboard options initially, but all eventually found it (the usability team and I deemed this OK, as it seemed to be a justifiable learned behavior).
If you’ve ever been a designer witnessing testers work their way through some of your testing flows, then you know how agonizing it can be to see them inevitably struggle. Biting your tongue when you just want to shout “Click the big blue button!!” In all seriousness, while watching these can be painful at times, user tests are the single most powerful tool you have as a designer. I’ve never gone through testing where I didn’t learn something, and these tests held true to that. After some design refinement and polish, we were now ready to hand off the finalized work to the developers.
One might think that the moment designers hand their work off to the developers, their job is done. However, this is far from the reality of the situation. As the developers piece together your design, back-and-forth discussions are inevitable. And that’s a good thing! A solid line of communication with the Dev Team is vital, because things can easily get lost in translation between a static image and code. There are, however, numerous ways to set your developers up for success and reduce the amount of uncertainty, guesswork, and questions they have. In this case, I knew the complexity and sheer number of possibilities that came with this project could lead to a lot of questions. I set out to answer those questions before they were asked and tried to prepare and guide my team as best as possible. Here are some assets I created to help with just that:
- Guidelines for the card system
- Card creation flows for each card type
- Edge cases, empty states, error states, and loading states
- Matrix of all our different survey types
InVision was perfect for piecing all of these things together. I used the popular prototyping tool in a somewhat unusual way: hosting this abundance of information in a single, easily accessible location, allowing the team to consume it all in a manageable way. Hearing from the Development and QA teams how valuable all of this was made me incredibly happy. It’s a standard of detail and thoughtfulness that I’ll continue to hold myself to.
There were still plenty of questions and discussions happening during the development and QA phases, but before long we had a real-life functioning product. Getting your hands on the real thing that you worked so hard to help create is what product designers live for. The custom dashboards and card building features have only been live in production for a few weeks, so I’m sure we’ll be getting plenty of feedback and more insight on how we can improve it further. For now, let me share with you the things I’ve learned and the major takeaways from the invigorating process of building this thing.
In the early stages of a project, we need to do a better job of documenting and organizing our knowledge and findings: our understanding of the problem, early research and discussions, competitive analysis, success metrics, testing results, and so on. None of this could be found easily in a single, handy location. I ended up establishing a new process for the product team, and we now require a Product Brief (within Confluence) for each project that serves as the single reference point for all of this information.
Better utilizing our clients’ time
We have great clients who love being involved in our product development process, which often means being asked to test our early designs. Our pool of client resources to pull in for testing isn’t abundant, however, so every face-to-face moment with one of our users is precious. I’ve found that it’s during our early interviews, when we’re discussing pain points, functionality needs, and our early ideas for a solution, that we get the most insightful, empathy-building nuggets from our clients. So, instead of asking these folks to perform click tests that check the overall usability of the designs, let’s utilize their time in the most impactful way possible. This doesn’t mean forgoing usability testing by any means; we have a large internal analyst team we can leverage for those sorts of usability tests.
A formalized “Design QA” process
When pieces of the product are handed off to QA by the front-end team, designers aren’t always brought in to assist with the testing. This can result in us finding design problems later, specifically some of the more aesthetic issues that other team members may not notice. While the back-and-forth we have with developers often helps refine these details, we had no established process, and things were slipping through the cracks. Moving forward, we plan to have a specific Design QA step in the process where designers sign off on the dev work.
When releasing software, the goal is not to ship a “complete” product. Waiting until every desired feature is packed in before releasing is a surefire way to mess things up. Like all projects, this one was not immune to the constraints of time and personnel. Thus, this initial release was missing some big ideas that we plan to add:
- More powerful sharing options
- Scheduled reporting of these dashboards
- Presentation mode
- Collaboration (commenting, @ mentioning, event tags, annotations)