OhmConnect optimizes residential energy efficiency through smart home devices such as WiFi thermostats, smart plugs, and smart appliances. This allows customers to save money while also reducing carbon emissions by maximizing the use of renewable energy sources like solar and wind.
As the Lead Product Designer for end-to-end product development, I was responsible for user research, concept testing, the UX/UI design process and its decision making, and ownership of our product design roadmap for near- and long-term objectives. In addition to my day-to-day responsibilities, I oversaw a comprehensive redesign of our product suite (mobile + desktop), scaled an in-house design system with engineering, and led an overhaul of our technical stack to enable our aggressive market expansion.
2021 - Lead product designer, partnered with engineering and product management
Energy Events are the core of OhmConnect’s UX and business. They provide the first and continuous "ah-ha" moments in the user journey, when energy reductions are measured and rewarded. Collectively, they make up the value that OhmConnect provides to the energy grid: reducing energy consumption when generation cannot efficiently meet demand.
This document summarizes the research process, problems identified, and solutions tested that helped us maximize the effectiveness of our core product experience. First, let's take a brief look at the experience that helped us increase core engagement metrics by over 40% by focusing on user comprehension and progressive disclosure.
Event performance is unique to each user and determined by three numerical measures:
- Expected energy usage (calculated from historical averages).
- Actual energy usage (how much energy is used during the event).
- Saved energy usage (the difference between expected and actual usage).
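The relationship between the three measures can be sketched in a few lines. This is a minimal illustration of the arithmetic described above; the function and field names are hypothetical, not OhmConnect's actual API.

```python
# Hypothetical sketch of how the three event measures relate.
# "Expected" is the baseline from historical averages; "actual" is
# metered usage during the event; "saved" is their difference.

def event_performance(expected_kwh: float, actual_kwh: float) -> dict:
    """Derive saved energy and percent reduction for a single event."""
    saved_kwh = expected_kwh - actual_kwh            # saved = expected - actual
    pct_reduction = saved_kwh / expected_kwh * 100   # share of the baseline saved
    return {"saved_kwh": saved_kwh, "pct_reduction": pct_reduction}

# e.g. a user expected to use 2.0 kWh who uses 1.5 kWh saves 0.5 kWh (25%)
print(event_performance(2.0, 1.5))
```

Framing performance as a difference against a personal baseline (rather than a raw usage number) is what makes the comparison meaningful to each individual user.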
Through a combination of qualitative and quantitative research, we learned the following:
- Users are confused about the terminology and values used to communicate how much energy can be saved in an event and how much energy was saved after an event.
- Users don’t understand the values used to calculate their performance in an event, which compares their actual consumption in the event to their average consumption from past days without events.
- Users don’t feel motivated by the values presented before an event (their maximum reduction) because they never actually reduce the maximum amount of energy.
- Improving user comprehension of how OhmConnect calculates energy savings will improve the trust and motivation required to participate.
- Users will better understand energy savings if we explicitly surface their baseline (average usage) to compare with their actual usage.
- Users will feel more motivated by targets and goals that focus on realistic savings rather than the total amount of possible savings.
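The baseline users struggled to understand is, at its core, an average of usage on comparable past days without events. A minimal sketch of that concept, assuming a simple mean; the day-selection rules here are illustrative, not OhmConnect's actual methodology:

```python
# Illustrative sketch of the "expected usage" baseline: the average of
# usage across recent comparable days with no events. The input values
# and selection of days are assumptions for illustration only.

def baseline_kwh(non_event_day_usage: list[float]) -> float:
    """Average usage (kWh) across recent days without events."""
    return sum(non_event_day_usage) / len(non_event_day_usage)

# e.g. three recent non-event days averaging out to a 2.0 kWh baseline
print(baseline_kwh([1.5, 2.5, 2.0]))
```

Surfacing these underlying day-by-day values, rather than only the averaged result, is what the hypotheses above predicted would build trust.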
A quick look at the product’s state before any changes were applied highlights the areas contributing to the issues above:
- Over-simplification of values obscures the basis for ‘how it works’ and erodes trust in the service.
- Internal jargon used for ‘cute’ branding complicates an already esoteric topic, instead of using intuitively descriptive terminology.
- Numeric values offer no visual affordances for perceptual comprehension.
- Prevalence of info buttons indicates an experience that is not achieving clarity.
We knew we wanted to take a visual approach wherever possible to communicate information and context clearly. Our explorations for viable solutions spanned several areas:
- Charting variants — what were the available data visualizations for expressing these concepts?
- Energy analogues — how have other services conveyed similar data? What are customers already familiar with?
- Strategic vision — how might we effectively solve this problem while staying true to our larger vision of 24-hr energy efficiency?
- Conceptual models — how might we introduce consistent conceptual models to aid in user comprehension?
Before returning to user studies with product-like artifacts, we wanted to validate the concepts we might employ in product experiences to ensure we were on the right track. The following artifacts evolved over time to validate a conceptual model based on storytelling.
Design sprint testing
Once we identified our conceptual anchor, we set out to quickly design and iterate our interface and flow, starting with low fidelity mockups and evolving to higher-fidelity states. Talking with customers, we learned that accessing the data used to compute their average was important for establishing trust, but only if they hadn’t seen it before. Once they’d seen it and knew how to find it, they were able to trust the service without the need to see it every time they had an event.
With these fresh findings, we set out to make the data feel accessible without being an obstacle. If we could teach users where to find it, they could trust us to have it ready for them whenever they needed it, even if they didn’t.
Utilizing our ‘Sheet’ UI pattern, we introduced a tap target to the top-level average so that users could invoke the data when desired. In this way, users can review the data when needed, but won’t be obstructed from the primary flow once it’s no longer necessary for building trust. Knowing that it’s always just a tap away, our test customers felt much more comfortable with the data we presented.
We’d heard from customers that it was discouraging to see their total possible reduction before an event, only to always come up short. The fact was that no users were ever going to reduce the maximum possible amount of energy (short of flipping off their circuit breaker), so we wanted to introduce more realistic and encouraging targets, while also helping users stretch to reach a difficult goal. After multiple rounds of experimentation, we found that reduction targets of 40% and 80% (accompanied by the possible savings one might achieve at those targets) helped to increase user motivation through the behavioral tactic of goal visualization.
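The tiered targets above translate into simple arithmetic: each tier is a percentage of the user's maximum possible reduction, shown alongside the kWh savings it represents. A hedged sketch, where the 40% and 80% tiers come from the case study but the function name and rounding are illustrative assumptions:

```python
# Sketch of tiered reduction targets: realistic goals expressed as
# percentages of a user's maximum possible reduction. The 40%/80%
# tiers are from the case study; everything else is illustrative.

def reduction_targets(max_possible_kwh: float) -> dict:
    """Map target percentages to the kWh savings they represent."""
    return {pct: round(max_possible_kwh * pct / 100, 2) for pct in (40, 80)}

# e.g. a user whose maximum possible reduction is 2.0 kWh
print(reduction_targets(2.0))  # → {40: 0.8, 80: 1.6}
```

Showing a reachable 40% goal and a stretch 80% goal, rather than the unreachable 100%, is what reframed the experience from guaranteed failure to achievable progress.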
Time to ship
One of the toughest decisions in product is knowing when to ship. Thanks to a rigorous design sprint structure, we were right on target for our timeline. Although we could have iterated indefinitely in search of marginal improvements, we believed we had achieved the greatest improvement for the least cost, so it was time to ship. We increased exposure from 10% to 100% of users over the course of just a few weeks and immediately saw marked improvements in both qualitative and quantitative feedback. While great to see, downstream results like this shouldn’t be a surprise when you’re continuously validating your iterative design process as we did here.
2020 - Making mundane objects like smart plugs more approachable and fun.
3D animated smart plug
super-embed: <script type="module" src="https://unpkg.com/@email@example.com/build/spline-viewer.js"></script> <spline-viewer url="https://prod.spline.design/uefiPyvuenXRzapx/scene.splinecode"></spline-viewer>
Product system overview
2019 - Information architecture and feature roadmapping at various levels of abstraction.