Web3 Dev Platform - MVP Usability: A Case Study

Enhancing the usability of a web3 development platform.

➞ Product Lead (ongoing):

I am responsible for overseeing the product strategy, coordinating with engineering and design teams, and ensuring the platform meets the needs of our target users.

As an expert UX practitioner, I also lead user research and usability testing.

➞ Overview:

Leveraging detailed feedback from technical users to refine and improve the platform's functionality and interface.

➞ Process:

  • Initial Discovery and Usability Testing

  • Analysis and Implementation of Changes

  • Second Round of Usability Testing

➞ Results:

  • A notable increase in user satisfaction, with improved ratings on ease of use and intuitiveness (SUS score).

  • Enhanced user efficiency, with reduced task completion times and error rates.

  • Positive feedback on the platform’s improved navigation and streamlined workflows.

Background

The platform was designed and developed to streamline various workflows, integrating multiple tools into a cohesive environment.

Despite its innovative approach, early feedback highlighted usability issues that affected user satisfaction and efficiency.

Recognising the critical role of UX in the platform's success, we initiated an iterative process of testing, feedback collection, and implementation of changes.

Phase 1: User Interviews and Usability Test Sessions

→ Objective:

Identify usability challenges and areas for improvement.


→ Methodology:

We conducted the first round of usability tests with 12 developers and other technical users, focusing on core tasks that users would perform on the platform:

  • Initial onboarding

  • Navigating through the platform

  • Creating a product environment

  • Testing APIs

  • Leveraging Documentation


→ Findings:

Users reported difficulties with navigation, found specific functionalities cumbersome, and described the overall experience as less intuitive than expected.

  • Expected a more streamlined process

  • Rated the API documentation as below average

  • Wanted code snippets in the docs, as well as in the platform UI (see the illustrative snippet after this list)

  • Missing context on certain aspects

  • Encountered several bugs

  • Confusion in certain flows

  • And many more…
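
To illustrate the kind of copy-pasteable example participants asked for, below is a minimal sketch of the sort of snippet that could sit next to each endpoint in the docs and in the platform UI. The endpoint path, payload shape, and environment variable names are hypothetical placeholders, not the platform's actual API:

```typescript
// Hypothetical example: submitting a signing event via the platform's REST API.
// The endpoint, payload fields, and env var names below are placeholders for illustration.
async function submitEvent(payload: { type: string; data: Record<string, unknown> }) {
  const response = await fetch(`${process.env.PLATFORM_API_URL}/v1/events`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.PLATFORM_API_KEY}`, // key issued in the dashboard
    },
    body: JSON.stringify(payload),
  });

  if (!response.ok) {
    throw new Error(`Request failed: ${response.status} ${response.statusText}`);
  }
  return response.json();
}

// Usage
submitEvent({ type: "document.signed", data: { documentId: "doc_123" } })
  .then(console.log)
  .catch(console.error);
```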


Aggregated findings: time on task, task completion rate, and related metrics.
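
As a rough sketch of how metrics like these can be aggregated from raw session notes (the record shape and field names below are assumptions for illustration):

```typescript
// Hypothetical per-task record captured during a moderated session.
interface TaskAttempt {
  participantId: string;
  task: string;            // e.g. "Initial onboarding", "Testing APIs"
  durationSeconds: number; // time on task
  completed: boolean;      // task completion
  errors: number;          // observed errors/slips
}

// Average time on task, completion rate, and error count per task across participants.
function aggregate(attempts: TaskAttempt[]) {
  const byTask = new Map<string, TaskAttempt[]>();
  for (const attempt of attempts) {
    const group = byTask.get(attempt.task) ?? [];
    group.push(attempt);
    byTask.set(attempt.task, group);
  }

  return [...byTask.entries()].map(([task, group]) => ({
    task,
    avgTimeOnTaskSec: group.reduce((sum, a) => sum + a.durationSeconds, 0) / group.length,
    completionRate: group.filter((a) => a.completed).length / group.length,
    avgErrors: group.reduce((sum, a) => sum + a.errors, 0) / group.length,
  }));
}
```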

Phase 2: Synthesis and Implementation of Changes

→ SWOT Analysis

Strengths: In what areas do we excel?

  • Comfort and satisfaction with using the platform

  • Ease of navigating through familiar tools

Weaknesses: Where are the pain points?

  • Lack of visual cues, e.g. more detailed loading indicators

  • Confusion during environment creation

  • Uncertainty about API usage guidelines

Opportunities: Where do we see room for growth?

  • Provide more context where necessary

  • Enhance transparency and communication, e.g. status updates

Threats: What are the areas of concern?

  • Potential loss of user confidence due to the lack of clarity and transparency during certain tasks

  • Risk of user errors without adequate guidance and clear communication


Recommendations

We categorised proposed next steps into:

  • User Interface (UI)

  • Product:

    • Development (Dev)

    • User Experience (UX)

  • Individual product teams (Event, Sign)

  • Marketing team

  • User research topics


Here’s a brief overview of how the recommendations were categorised:


→ For Product Teams

Development (Dev) Recommendations:

Technical issues or feature requests that required backend development, coding, or system architecture changes.

User Experience/User Interface (UX/UI) Recommendations:

Feedback that pertained to the usability, design, accessibility, and overall user interaction with the platform was categorised under UX and UI recommendations.

This included changes to improve navigation, simplify complex workflows, or make the interface more intuitive.


→ Marketing Recommendations

These recommendations focused on how to communicate the platform's features and benefits to potential users, based on insights about user perceptions and expectations gathered during usability testing.

We identified the need to highlight the platform's unique selling points and to address user concerns through targeted marketing messages.


→ User Research Needs

We identified user research needs as a separate category to emphasise the importance of ongoing research in guiding future enhancements.

This category included recommendations for further studies to understand user behaviour and the effectiveness of the implemented changes.


Phase 3: Second Round of Usability Testing

→ Objective

Evaluate the effectiveness of changes and gather further insights.


→ Methodology

Employing a testing setup similar to the initial phase, we observed how the implemented changes impacted user interaction.


→ Outcomes

The modifications led to significant improvements in task completion times and user satisfaction.

However, new insights emerged, suggesting areas for further refinement, such as enhancing the visibility of certain secondary features.


Results

The iterative usability testing and enhancement process led to significant improvements in user satisfaction and efficiency. By the end of the testing cycles, we observed:

  • A notable increase in user satisfaction, with improved ratings on ease of use and intuitiveness, as measured by the SUS score (see the sketch after this list).


  • Enhanced user efficiency, with reduced task completion times and error rates.


  • Positive feedback on the platform’s improved navigation and streamlined workflows.
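
The satisfaction ratings above reference the System Usability Scale (SUS) mentioned in the overview. Below is a minimal sketch of how an individual SUS score is derived from the standard 10-item questionnaire answered on a 1-5 agreement scale (the input format is an assumption for illustration):

```typescript
// Compute a SUS score (0-100) from ten questionnaire responses on a 1-5 scale.
// Odd-numbered items contribute (response - 1); even-numbered items contribute (5 - response);
// the summed contributions are multiplied by 2.5.
function susScore(responses: number[]): number {
  if (responses.length !== 10) {
    throw new Error("SUS requires exactly 10 responses");
  }
  const sum = responses.reduce((acc, response, index) => {
    const itemNumber = index + 1;
    const contribution = itemNumber % 2 === 1 ? response - 1 : 5 - response;
    return acc + contribution;
  }, 0);
  return sum * 2.5;
}

// Example: a fairly positive respondent scores 80.
console.log(susScore([4, 2, 4, 1, 5, 2, 4, 2, 4, 2]));
```

Per-participant scores from each round can then be averaged and compared to track the improvement reported here.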



Key Insights

  • The Importance of Iterative Testing: Stakeholders recognised the value of continuous testing


  • User Feedback as a Guiding Tool: Direct user feedback informed the prioritisation of development efforts


  • Adapting to User Expectations: The testing highlighted the need to balance innovative features with user expectations


Process Overview

Phase 1: Initial Discovery and Usability Testing

  • Method: 12 developers; 4 main tasks with subtasks

  • Initial findings: UX not seamless enough, not as intuitive as expected, confusion at certain points

Phase 2: Synthesis and Implementation of Changes

  • Recommendations were categorised into UI, Product, Marketing, and User Research:

    • Product (Development): tech issues, bugs, backend, system architecture

    • Product (UX): flows, context, new and improved features

    • Marketing: messaging, positioning

    • User Research: specific feature research (functionality, workflows)

Phase 3: Second Round of Usability Testing

Want to work with me?

Yes, let's connect

©️ 2023 Sinteza. All rights reserved.