
The Decision Dashboard: Bridging the Gap Between Raw Data and Organizational Strategy

My Role

Lead Product Designer

Platform

Enterprise SaaS Product

Duration

2 months

Team

UX Researcher and 2 fellow designers

As Lead UX Designer for the DevOps Intelligence product at Kyndryl, I led the end-to-end design strategy for a solution addressing the DevOps lifecycle of large enterprises that own cloud infrastructure management. My core responsibilities included:

- Defining design vision: Led and shaped a holistic, user-centric solution that lets Development Managers track their teams' performance through different lenses, across the multiple tools large organizations adopt for the DevOps lifecycle and security.

- Delivering intuitive workflows: Designed and streamlined solutions for the next release, acting as both Lead UX Designer and UX Researcher.


- Championing data-driven decisions: Created experiences that surface critical DevOps lifecycle metrics, grounded in qualitative research.

Development Managers are responsible for optimizing the DevOps lifecycle, yet they currently lack actionable intelligence regarding team performance. The existing metrics are superficial and disconnected from operational reality, leaving managers unable to identify bottlenecks or make data-driven decisions to improve delivery velocity.

At present, Development Managers struggle with an operational gap: crucial delivery data is trapped in isolated silos, namely Jira (planning), GitHub (code), and Azure DevOps (deployment). Because these platforms do not communicate effectively, there is no single source of truth for the end-to-end development cycle.
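To make the "single source of truth" problem concrete, here is a minimal sketch of what unifying delivery events from the three tools could look like. The record fields, stand-in data, and `unify` function are illustrative assumptions, not part of any real Jira, GitHub, or Azure DevOps API; real integrations would fetch events through each tool's own API.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical unified record; field names are illustrative, not a real API.
@dataclass
class DeliveryEvent:
    source: str          # "jira", "github", or "azure_devops"
    stage: str           # "planning", "code", or "deployment"
    item_id: str         # ticket key, PR number, or release ID
    timestamp: datetime

def unify(*event_streams):
    """Merge per-tool event lists into one chronological timeline,
    giving managers a single end-to-end view of the delivery cycle."""
    merged = [event for stream in event_streams for event in stream]
    return sorted(merged, key=lambda event: event.timestamp)

# Stand-in data; real integrations would call each platform's API.
jira = [DeliveryEvent("jira", "planning", "PROJ-101", datetime(2024, 5, 1))]
github = [DeliveryEvent("github", "code", "PR-42", datetime(2024, 5, 3))]
azure = [DeliveryEvent("azure_devops", "deployment", "REL-7", datetime(2024, 5, 6))]

timeline = unify(jira, github, azure)
```

Once events sit on one timeline, the dashboard can trace a work item from planning through deployment instead of forcing managers to stitch three tools together by hand.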


Initial Design

  • Improve user satisfaction and task completion speed with the dashboard.
     

  • Increase user confidence in the data presented.
     

  • Enhance the overall usability and visual hierarchy of the interface.

Discovery Methods
  • Stakeholder interviews

  • Competitor analysis

  • Rapid Mock-ups for testing during user interviews

User Research

The research focused on the struggles users face in their day-to-day work across the DevOps cycle, and on each user's current individual process.

  • 7 Development Managers

  • Managing deployments and directly overseeing their teams' development performance
  • Minimum 2 years of experience
  • With/Without prior access to DevOps Intelligence
Team
  • Interviewer - Myself

  • 2 Observers / Note takers

Objective
  • Identify & Prioritize Top Tasks: Discover the user's most frequent and critical workflows (Top Tasks) to ensure we are building for the "80%" of high-value use cases, rather than edge cases.
     

  • Uncover Operational Patterns: Observe how users attempt to navigate the mockup to reveal their mental models and preferred process flows, identifying where the current business process causes friction.
     

  • Define Evidence-Based Requirements: Move beyond assumptions by deriving the requirements list directly from observed user needs and gaps identified during the session.
     

  • Anticipate Stakeholder Concerns: Proactively gather qualitative data to answer potential business objections regarding feasibility and feature necessity.

Findings
  • Non-Standardized Data Inputs: Tracking methods vary wildly across teams, ranging from Excel sheets to verbal Scrum updates, making data aggregation and comparison impossible.
     

  • The "Benchmark Vacuum": Without integrated industry standards (like DORA), managers lack a baseline to objectively assess whether their delivery velocity is competitive or lagging.
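As an illustration of the kind of baseline the "Benchmark Vacuum" finding calls for, here is a minimal sketch computing two DORA metrics (lead time for changes and deployment frequency) from deployment records. The data and function names are stand-ins I introduce for illustration; they are not from the product.

```python
from datetime import datetime

# Stand-in records: (commit_time, deploy_time) pairs for one team.
deploys = [
    (datetime(2024, 5, 1, 9), datetime(2024, 5, 2, 9)),
    (datetime(2024, 5, 3, 9), datetime(2024, 5, 6, 9)),
    (datetime(2024, 5, 8, 9), datetime(2024, 5, 9, 9)),
]

def lead_time_hours(records):
    """Mean lead time for changes: hours from commit to deployment."""
    deltas = [(deploy - commit).total_seconds() / 3600
              for commit, deploy in records]
    return sum(deltas) / len(deltas)

def deployment_frequency_per_week(records, window_days=30):
    """Deployments per week over a fixed observation window."""
    return len(records) / (window_days / 7)
```

With these two numbers in hand, a manager can compare a team's velocity against published DORA performance bands instead of guessing whether delivery is competitive or lagging.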

Design Strategy & Leadership

Prioritization & User Flows:
I facilitated workshops with stakeholders and developers to prioritize features based on user needs and technical feasibility. We also mapped user flows to identify key interaction points and optimize the overall user journey.

Information Architecture & Visual Design:
Leading the design team, we restructured the information architecture for better organization and clarity. We implemented a clean and consistent visual design with clear data visualizations and interactive elements.

Collaboration & Iteration:
Throughout the process, I ensured clear communication and collaboration between the design team, developers, and stakeholders. We conducted iterative testing with high-fidelity prototypes, incorporating user feedback to refine the design.

Results & Impact

  • User satisfaction ratings for the dashboard increased by 60%.
     

  • Task completion times for key actions decreased by 25%.
     

  • Users reported increased confidence in interpreting and leveraging the data.
     

  • The redesigned dashboard received positive feedback across teams, promoting wider adoption and improved decision-making.
     

  • This also increased the credibility of the product, and development effort was reduced by 50%.

Conclusion

Through this investigation we uncovered more information about our users: their preferences, the metrics they rely on, and the explanations they struggled to understand.

This case study demonstrates the effectiveness of user research and a well-defined design strategy in improving user experience. The success of this project highlights my leadership skills in facilitating collaboration, ensuring user focus, and delivering impactful design solutions.

© 2025 by Usha Sham
