COLAB29 - Chrome Extension


Meet Flo, the Chrome extension that supercharges your focus! Boost concentration, enhance learning, and bring calm to your workday.

Problem Background 

As we navigate an increasingly digital world, we find ourselves consistently distracted by technology, life events, and competing priorities, all contributing to an environment where our ability to focus is significantly reduced. Digital distractions can consume up to 2.5 hours of productivity each day.

Alarmingly, technology vendors are fully aware of this and exploit attentional processes to engage people with websites and applications, keeping their attention glued to the screen.

Research suggests that attention is goal-directed. When completing a task, people benefit from being able to assign priorities and track progress. However, in the face of constant distractions, maintaining focus on our goals becomes an uphill challenge. This issue largely stems from an inability to establish a conducive learning environment: one with routine and structure, free from both internal and external distractions.

To address this challenge, we need a solution that helps individuals regain control over their attention and productivity. By providing a structured environment that minimizes distractions and fosters goal-oriented focus, individuals can reclaim their ability to plan, prioritize, and achieve their objectives effectively.

We hypothesize that:

Users, particularly those who struggle with maintaining focus during both casual and formal goal setting, will gain greater control over their distractions through engagement in a reward-driven environment designed to improve attention spans.

This led us to the following problem statement:

How might we make goal setting engaging for users with varying attention spans so that they embrace long-term, consistent habits?

Research Insights

We conducted extensive user research using generative and evaluative methodologies to confirm our hypotheses and understand our users' pain points and needs. 

Phase 1: Generative Research


  1.  User Interviews (n=10)
    • Career changers
    • Retirees
    • Academia
    • Casual learners
    • Professionals
  2. Survey (n=30)
    • 8 close-ended questions on learning goals
    • 3 open-ended questions on learning behaviours


At a Glance

  • 60% of users reported experiencing both internal and external distractions
  • The average attention span among respondents was found to be less than 1 hour
  • 67% find bite-sized lessons to be the most engaging format
  • 70% of respondents perform best in a minimal, quiet space


  • Challenges
    • Time-management
      • Procrastination
      • Competing priorities
    • Motivation & discipline
      • Difficulty in maintaining focus
      • Feelings of failure
    • Distractions
      • Technology
      • Major life events
    • Attention span
      • Struggle with longer tasks
      • Alignment
  • Preferences
    • Reward-based learning
      • Small wins
      •  Routine
    • Environment
      • Quiet
      • Accountability
    • Tools
      • Timers
      • Organization

*A more in-depth overview of our preliminary user interviews can be found in the Product Spec.

High-Level User Flows

Phase 2: Evaluative Research

Usability Tests (UT)

  • Round One: 3 moderated tests
  • Round Two: 3 moderated tests


We kept returning to our preliminary user research and supplemented it with usability testing at each stage of the development process to ensure we were addressing user pain points and iterating on our product. Our core refinements have been highlighted in the table:

*Usability testing feedback from both rounds has been compiled here

MVP User Flow
Pivot 1
Pivot 2
Pivot 3

Solution Explanation

Based on the pain points identified during user interviews, we tailored our solution to align with our initial goals. Our market research revealed that no existing product combined focus and productivity in an extension capable of promoting consistency and routine. As a result, we decided to build our MVP around the following core features: 

  1. Goal setting: To help users visualize a north star and simultaneously allow them to set actionable tasks effectively
  2. Focus session: To help users thrive in a structured learning environment, combat procrastination, and gain momentum
  3. Reflection tool: To help users be more accountable and provide them with a sense of accomplishment

Phase 1 streamlined our vast problem space, fueling our motivation for Phase 2. Here, we embraced agility, iterating our product weekly to incorporate feedback and refine our MVP. As we progressed, we descoped, trimming unnecessary complexities. Consistently reassessing priorities ensured developers remained focused on core functionalities, avoiding roadblocks.

Our pivots and adaptations, detailed below, were instrumental in shaping Flo's success. Embracing change allowed us to navigate the evolving landscape, ensuring our product remained relevant and aligned with user needs.

Pivot 1 - Adding Progress Bar

The first round of usability testing revealed that goal setting can vary based on user preferences. Users expressed a lack of visual continuity, which prevented them from staying fully engaged. We addressed this by introducing a progress bar and reducing the number of steps it took to complete onboarding. In the second round of testing, users were impressed by how tracking and visualizing progress consistently kept them motivated while completing tasks.

Pivot 2 - Simplify Goal Setting

Preliminary user research highlighted that users became distracted when they lacked a structured plan. We initially devised a 3-step approach to goal setting, which testing revealed to be time-consuming. As a result, we pivoted to a 2-step process in which users first identify a north star and then define actionable subtasks. In doing so, we achieved our goal of giving users maximum control. 

Pivot 3 - Gamifying Reflection Tool

Lack of accountability impacted users’ ability to stick to a routine, which made it difficult to set and achieve goals. To tackle this, we introduced a reflection tool where users can rate sessions and jot down notes. Our moderated tests revealed that this feature lacked “gamification” and needed to be incentivized. We addressed this by implementing a calendar that allows users to track and review their progress. We also replaced the numerical rating scale with a 5-point emoji scale to make it more engaging.

Implementation Details 

Technical Implementation

Our Chrome extension is hosted on the Chrome Web Store, where it is publicly available for download.

Tech Stack
  • React, TypeScript, CSS, and HTML
  • User data is stored in the user's Google account via Chrome's storage API and is accessible wherever the user is logged into Google in the browser
  • Data is accessed in the extension via requests to the background.js file
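
For orientation, these pieces are wired together in the extension's manifest. The sketch below is a hypothetical minimal Manifest V3 configuration, not our actual file; the file names and description are assumptions, but it shows how a background.js script and the storage permission would be declared.

```json
{
  "manifest_version": 3,
  "name": "Flo",
  "version": "1.0",
  "description": "Focus and goal-setting companion",
  "action": { "default_popup": "index.html" },
  "background": { "service_worker": "background.js" },
  "permissions": ["storage"]
}
```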
High-level Journey of a Request
  • We chose to use React for building the UI because it was ideal for the user interaction and responsiveness our extension would require. We also chose to use TypeScript because type safety would make our code less prone to bugs.
  • When the user creates a goal and subtasks, or saves a rating and reflection, the request is handled via chrome.runtime messaging in background.js, where the handler saves the data in our focusData object.
  • This data is then available when the user is signed in to their Google account on any chrome browser.
  • chrome.runtime handles various other requests from the app components, including “clear”, “fetch”, “set”, etc., executes them, and sends a response back to the frontend with any data needed
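
The request journey above can be sketched as follows. This is an illustrative reconstruction, not our exact code: the `handleAction` helper and the response shape are assumptions, while the `focusData` key and the “set”/“fetch”/“clear” actions mirror the ones described above. The pure logic is factored out of the Chrome wiring so it can be exercised on its own.

```javascript
// background.js -- illustrative sketch of the request handler described above.
// handleAction() holds the pure logic; `store` is a plain object whose
// focusData property would, in the extension, be backed by chrome.storage.sync
// (which syncs data to the user's signed-in Google account).
function handleAction(action, payload, store) {
  switch (action) {
    case "set":
      // Merge new goals, subtasks, or reflections into focusData.
      store.focusData = { ...store.focusData, ...payload };
      return { ok: true, data: store.focusData };
    case "fetch":
      return { ok: true, data: store.focusData || {} };
    case "clear":
      store.focusData = {};
      return { ok: true, data: {} };
    default:
      return { ok: false, error: "unknown action: " + action };
  }
}

// Wire the handler to chrome.runtime messaging when running as an extension.
if (typeof chrome !== "undefined" && chrome.runtime) {
  chrome.runtime.onMessage.addListener((msg, _sender, sendResponse) => {
    chrome.storage.sync.get("focusData", (store) => {
      const response = handleAction(msg.action, msg.payload, store);
      chrome.storage.sync.set({ focusData: store.focusData }, () => {
        sendResponse(response); // reply to the React frontend
      });
    });
    return true; // keep the message channel open for the async response
  });
}
```

A frontend component would then send requests with something like `chrome.runtime.sendMessage({ action: "set", payload: { goal: "..." } }, callback)`.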
Technical Challenges

We struggled the most with the API. It is not difficult to use, but sending information back and forth between parts of an application requires care, print statements, and type handling. With time and patience, we learned the syntax requirements and it became much simpler and more intuitive. We have put safeguards in place to limit the amount of data that can be stored at any given time for a user. The Chrome Web Store already handles distribution, so the extension can support as many downloads as there are users.

Chrome extensions can be as tricky to implement as any full-stack application. There are additional considerations to take into account, such as styling effects and other components and scripts from the parent URL that could interfere. At its core, though, it is the same HTML, CSS, and JavaScript as any web application, so we were able to work through it!
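
One common way to keep the parent page's styles and scripts from interfering with injected UI is to mount the extension's root inside a shadow DOM. The sketch below is an assumed technique for the interference problem described above, not necessarily what we shipped; `createShadowHost` and the element names are hypothetical. The document is passed in as a parameter so the helper stays easy to exercise outside a browser.

```javascript
// Content-script sketch: isolate extension UI from the host page's CSS.
// createShadowHost() is a hypothetical helper name, not from our codebase.
function createShadowHost(doc) {
  const host = doc.createElement("div");
  host.id = "flo-extension-root";
  // Styles inside the shadow tree do not leak in or out of the host page.
  const shadow = host.attachShadow({ mode: "open" });
  const style = doc.createElement("style");
  // `all: initial` resets any inherited styles from the parent page.
  style.textContent = ":host { all: initial; } .panel { font-family: sans-serif; }";
  shadow.appendChild(style);
  const mount = doc.createElement("div");
  mount.className = "panel";
  shadow.appendChild(mount);
  doc.body.appendChild(host);
  return mount; // React could render the extension UI into this node
}
```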

Future Steps

We are excited to continue working on this project and have several new features planned for the future.

Our product was positively received by beta testers who appreciated its unique product-market fit. Users were especially impressed by how they could define actionable tasks, track progress, and move at their own pace. Additionally, there was a lot of encouraging feedback about our minimal UI, which helped users stay focused on their goals.

Based on this feedback, we've prioritized the following features for our upcoming sprints:

  1. Generative Goal Setting
  2. Music Integration
  3. Light/Dark Mode
  4. Study Trends

We look forward to implementing these features and enhancing the user experience even further.

Generative Goal Setting


Product Manager Learnings:

Priyaan Lall

Delivering an MVP in just 5 weeks is no easy feat. I felt most challenged when we had to descope our product to ensure development was feasible within this tight timeframe. A solid research foundation helped me strike a balance between being adaptable and staying firm on key aspects. Understanding user needs deeply allowed me to show my team how specific features created value and were essential to the product. At the same time, it made it easier to trim unnecessary features that distracted us from addressing the core needs of our product. This project was a masterclass in prioritization and negotiation, as we evaluated opportunity costs, anticipated requirements, and worked tirelessly to deliver something desirable, feasible, and usable.

Designer Learnings:

Steve Sheng

CoLab has been a fantastic simulated learning environment for our cross-collaborative team. One key takeaway for me was improving asynchronous communication through better documentation and front-loaded messaging, which helped us iterate effectively, clearly articulate design decisions, and stay focused on our end users. A big thank you to everyone involved; it was a wonderful journey together!

Developer Learnings:

Sarah Johnson

One thing I learned as a developer is how important time estimation is to ensuring the success of a sprint and ultimately the development of a product. Looking forward to becoming increasingly accurate in gauging my own capacity in order to create effective sprints and targeted progress toward product shipment. I learned how much I love working on a team and how much I’m looking forward to coding in a professional setting. I hope we can continue developing our product and it can gain some downloads in the meantime!

Developer Learnings:

Eric Jacobowitz


One of the most important lessons I’ve learned during my time at CoLab29 as a Software Engineer on Team 6 is the importance of always working with the most up-to-date code. It's crucial to ensure that both my co-developer and I are synchronised with the latest changes. This involves regularly using Git commands such as pull and merge to keep our codebases aligned.

Full Team Learning

Our project was both intense and rewarding, teaching us crucial lessons in prioritization, negotiation, and balancing flexibility with key priorities. Solid research and a deep understanding of user needs helped us pinpoint essential features while trimming away distractions. The experience highlighted the importance of clear, asynchronous communication, precise time estimation, and staying synchronised with the latest code updates. Overall, working in a cross-collaborative team at CoLab enhanced our skills and passion for developing effective, user-centric products, making this journey an invaluable learning experience.