Leading Lenovo's First Commercial SaaS Offering: ThinkSmart Hub 700

2016 - 2019  |  Design Lead

As the world's largest PC company, Lenovo is creating adjacent solutions for the enterprise space, starting with in-room collaboration devices.

The Smart Collaboration team designed a hardware & software solution that was canceled in 2019 after 2+ years of development. I led the UX+UI design, strategy, and research for this solution.

Business Objective

Lead the launch of Lenovo's next Billion-Dollar business in Smart Office solutions including hardware, software, and services.

Design Objective

Provide design leadership across all products and experiences to get this new business off the ground and achieve revenue milestones.


Smart Office was what brought me to Lenovo.



I joined Lenovo in the fall of 2016 to take on the challenge of turning an idea into an opportunity. Working closely with our sister UX team, we brainstormed, workshopped, reviewed competitor products, watched market trends, talked and listened to customers, and finally, defined the solution we knew would change the industry.

We started with this question:

What do our customers really need?



We visited several customers to record their pain-points, view their workspaces and technologies, observe their ways of working, and most importantly, understand their needs.

We can always solve issues, but our goal was to identify the correct ones.

We compared these learnings with our marketing data. We noticed that companies were looking to not only improve their conference room experience but also invest heavily in it.

We were now ready for the ideation phase.

Our focus was to provide the best holistic user experience to our customers. This meant that hardware and software had to seamlessly work together. This also meant they had to be designed together.

This was the first time Lenovo would design the hardware and software together from scratch. Our next step was to understand how we would actually get it done.

My role was to lead the software strategy and design from idea to production.

My challenge was that this was new territory for Lenovo and I was on my own.



In the beginning, we explored hundreds of ideas using all sorts of mediums: whiteboards, sketching, prints, Post-its, videos, word play, charades, anything that could get ideas flowing and discussions moving.

Then, we set out to establish our Minimum Marketable Features (MMF) for Gen 1.

What was critical to the success of this project and how would we measure it?

What features could we roll out as OTA updates in lieu of getting it into Release 1?

What would make the industry nervous?


From wireframing to mock-ups, I generated hundreds of layouts, grids, and frameworks focused on minimizing user friction and designing for the most common use cases. I knew that if we could predict what users would do most of the time, it would boost their positive experience with the product.

I studied several interaction models and created more than 10 different working prototypes for both internal review and external user studies.



Next, I took my prototypes on a roadshow and ran demos for internal stakeholders, large enterprise customers, and formal user research participants.

We gathered feedback on our product definition and features, and recorded any specific use cases or environmental constraints.

In April 2017, we ran an in-depth 15-person user research study to get initial feedback on our product, the user experience, workflows, features, value proposition, and NPS.

We achieved an 88 SUS score in our test, one of the highest scores we had ever recorded. We were now ready to pitch our idea to internal stakeholders for investment.



Over the next few months, I pitched the software prototype with the product manager to several key stakeholders across Hardware and Software Development, Strategy, Research, and Customer Experience.


In September 2017, we received official approval for the project and off we went. The following week, I jumped on a plane to Yokohama, Japan to formally start the project. Over the next several months, I designed the entire software platform, focusing on core design principles and keeping the customer at the center of my design process.

It would be an understatement to say that

designing the software was challenging.


I not only had to address numerous use cases, but also had to optimize the user interaction around the simplified hardware controls. This meant integrating all the user options and features found in typical meeting software like Skype for Business and allowing any user to control them all with just 2 buttons.
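To make that constraint concrete, here is a minimal sketch of how a two-button model can drive a full menu: one button moves focus, the other activates. The button roles and menu items are illustrative assumptions, not the actual ThinkSmart Hub 700 control scheme.

```python
# A minimal two-button interaction model (rotate + select).
# Button roles and menu items are illustrative assumptions,
# not the shipped ThinkSmart Hub 700 control scheme.

class TwoButtonMenu:
    def __init__(self, items):
        self.items = items
        self.focus = 0          # index of the currently focused item

    def rotate(self):
        """First button: move focus to the next item, wrapping around."""
        self.focus = (self.focus + 1) % len(self.items)
        return self.items[self.focus]

    def select(self):
        """Second button: activate the focused item."""
        return self.items[self.focus]

menu = TwoButtonMenu(["Join Meeting", "Share Screen", "Mute", "End Meeting"])
menu.rotate()                   # focus moves to "Share Screen"
assert menu.select() == "Share Screen"
```

With wrap-around rotation, every option stays reachable no matter how long the menu grows, which is what makes two physical buttons sufficient.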

The hardware was designed with a single focus: "making meetings simple."

With a powerful Intel i5 processor, an array of mics and sensors, high-quality LEDs, and premium Dolby Voice Audio, the ThinkSmart Hub 700 is one of the most powerful and premium conference room devices on the market.


On the software side, there were four main components:

1. Room User Interface

(application on the hardware)

2. Companion App

(application on Windows client devices such as PCs)

3. Companion App

(application on Android client devices such as mobile phones)

4. Administrative Management Console

(cloud-based management platform)

I led the strategy, design, and implementation of all these products.


Device deployment, management, and maintenance would be driven by an administrative management console built on a web-based platform.

Additional customization parameters and telemetry, as well as account information, would be housed on the portal.

I led the initial discovery, strategy, and framework, then assigned the product to one of my designers. We agreed to use simple Bootstrap libraries and a prefab template to meet the aggressive timeline and keep the design simple for Release 1.



I first started by designing a few main screens based on:

1. My preliminary prototypes

2. Corporate branding guidelines

3. Market and customer data

4. UX research study results

5. UX Principles

I spent the very first day doing math.


A lot of math.

I started with a 4K UHD canvas and created my column grid to fit within the pixel constraints. I used a 4-column grid with 2 half columns, one on each side, nestled within equal outer padding.

I ran multiple calculations to ensure all the content would fit within the grid and confirmed the app would scale and adapt correctly across various display resolutions.
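The arithmetic can be sketched like this, assuming the two half columns sit one on each side of the four full columns. The padding and gutter values are illustrative assumptions, not the shipped spec.

```python
# Illustrative grid math for a 3840 px-wide (4K UHD) canvas: a 4-column
# grid flanked by a half column on each side, inside equal outer padding.
# PADDING and GUTTER are assumed values, not the actual spec.

CANVAS_W = 3840
PADDING = 120          # equal outer padding, left and right (assumed)
GUTTER = 40            # space between column slots (assumed)

# Content width = 4 full columns + 2 half columns (= 5 column widths)
# plus the 5 gutters separating those 6 column slots.
content_w = CANVAS_W - 2 * PADDING
column_w = (content_w - 5 * GUTTER) / 5

print(f"column width: {column_w:.0f} px")   # → column width: 680 px
```

Running the same numbers against 1080p (scaled by 0.5) is a quick sanity check that the grid divides cleanly at other resolutions.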

Then, I designed several more frameworks to see if the grid would hold up.

The most important factor to the grid was ensuring proper legibility at various sizes and distances from the display.

This was quite simple to do by projecting the UI on a room display and letting users read the content at different locations in the room. I used anthropometric data to measure the distance of readability and compared the results to published research.
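As a rough sketch of that kind of check, the visual-angle formula gives the minimum character height for a given viewing distance. The panel size, pixel density, and 21-arcminute comfort target here are assumed figures for illustration, not the values from our study.

```python
import math

# Back-of-the-envelope legibility check: how tall must a character be,
# in pixels, to subtend a given visual angle at a given viewing distance?
# The 21-arcminute target is a common comfort guideline; the 55" 4K panel
# (~1.23 m wide) is an assumed example, not the tested hardware.

def min_char_height_px(distance_m, arc_minutes=21, ppm=3840 / 1.23):
    """ppm = pixels per metre of display width."""
    angle_rad = math.radians(arc_minutes / 60)
    height_m = 2 * distance_m * math.tan(angle_rad / 2)
    return height_m * ppm

# A viewer 4 m from the display needs roughly this character height:
print(round(min_char_height_px(4.0)), "px")
```

Doubling the viewing distance doubles the required character height, which is why the back row of a large conference room drives the type scale.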

Here is a short teaser video I created of designing the calendar view for ThinkSmart Hub 700.



I sketched out hundreds of wireframes to ensure that I incorporated all the features and content correctly then used them across workflow maps and user flows to map out various use cases. We spent weeks discussing the use cases in detail then used that list as our SW feature backlog.

I ran through the use cases numerous times to test the logic and map decisions, clicks, navigation, and overall ease of use. We referred to these workflows throughout the development process to identify if there were any missing interactions.

Endless design studies, design critiques, and user studies helped provide confidence in design decisions. Where data couldn't be collected, we built simple PoCs to evaluate the proper user interaction.

Thousands of artboards were created to detail the interactions, workflows, and designs of each screen across hundreds of use cases and behaviors.

Design studies focused on not only various display resolutions but also various types of information that might be displayed at any given point.


For example, considerations were made for long names, maximum use cases (max number of devices and users sharing), various time formats, legibility, and language translations.

I next put together a simple site map outlining the overall UX architecture. This was extremely important in getting everyone on the same page, especially the developers.

After seeing the site map, people realized how simple the interface was and found ways to optimize the code based on this simple UX architecture.


The next few months were dedicated to creating detailed design specifications that outlined fonts, colors, behaviors, interactions, assets, and even audio.

Here is the transformation from prototype to final design.

Room UI Prototype

Companion App Prototype

Room UI Final

Companion App Final

One of the most important parts of the project was translating my designs into detailed specifications.

By far, one of the most challenging tasks of my career was to create a single document that would be robust enough to handle millions of data points and could be updated frequently as the design changed throughout the development process.

The final draft of my Master UX Specification ran to more than 1,800 pages of detailed user behaviors, spacing, features, assets, fonts, colors, interactions, audio, and animations.

Example of the Overlay Menu layer design


The ThinkSmart Hub 700 needed a logo. Traditionally, this responsibility falls onto the branding team, but I went ahead and delivered one.


Below is the product icon I designed for the Hub app.

I used what appears to be a simple logo to create two separate loading screens:


Starting Your Meeting

Ending Your Meeting

These screens are important as many background tasks must be executed before the meeting can be properly joined or ended.

I also conducted design studies to ensure that different customer brands would work well within our app. We had a custom wallpaper feature in our Admin Console that allowed users to upload a background to the room UI. When the room was empty, the wallpaper would be used as marketing material. This exercise also helped define the final colors we used for our button states in order to:

1. Ensure high enough contrast for legibility and accessibility

2. Produce a distinctive hierarchy that highlights the focused button


It is also important to design for accessibility. Ensuring the app has high contrast options for visually impaired users is an important part of my design philosophy.

I also checked my designs across various types of color blindness tests, which is an easy way to identify issues across the color palette.
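For contrast specifically, the check can be automated with the WCAG 2.x contrast-ratio formula. The sample colors below are illustrative, not our actual palette.

```python
# WCAG 2.x contrast-ratio check, useful when validating button colors
# against custom wallpapers. Sample colors are illustrative only.

def _channel(c):
    c /= 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb):
    """Relative luminance of an sRGB color, per WCAG 2.x."""
    r, g, b = (_channel(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    l1, l2 = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# White text on black: the maximum possible ratio of 21:1.
ratio = contrast_ratio((255, 255, 255), (0, 0, 0))
print(f"{ratio:.1f}:1")  # → 21.0:1
assert ratio >= 4.5      # WCAG AA threshold for normal text
```

Running every button state against every allowed wallpaper through a check like this catches low-contrast combinations before they reach a room display.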

Finally, I always design with language translations in mind. I use a set of languages as my baseline and adapt the English version as needed. As you can see, some translations stretch the design considerably, so ensuring the design accommodates them at the beginning of the project saves a significant amount of time when the product rolls out globally.
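A simple way to catch these overflows early is to budget for expansion up front. The 40% budget and sample strings below are illustrative assumptions, not our actual localization rules.

```python
# Flag any translation that exceeds the width budget reserved for its
# English source string. The 40% expansion budget and the sample strings
# are illustrative assumptions, not actual product data.

def over_budget(english, translation, budget=1.40):
    """True if the translation exceeds the width budget for its source."""
    return len(translation) > len(english) * budget

# German commonly runs 30-40% longer than English:
print(over_budget("Join meeting", "An Besprechung teilnehmen"))  # → True
```

A check like this, run across the whole string table for each target language, turns "does it fit?" from a per-screen eyeball test into a batch report.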


Halfway through the project, we were notified that our original plan for a hardware dialpad device was canceled. Since the product needed to support dialing at launch, we had to meet that need with a fully digital experience.

The problem was that we had only TWO buttons to dial numbers and search for contacts.

As the lead designer, I was tasked to solve this problem and present solutions to the Executive Team within a month to stay on schedule.

Below are some initial dialing design mockups and prototypes I created.

After a couple of quick prototypes with real code and some quick user tests,

we landed on this design:

The beauty of this design was that it was reflective of the hardware and provided a nostalgic reference to old telephones.

and people just got it.

After I received Executive approval, I was off trying to make the design work. I carefully crafted the UX architecture and wrote out code logic for the developers to follow.

The only gray area was what the SDK could provide in terms of dialing features which we would only know once we jumped into the code.

I had also recommended we add a dialing feature into our companion app, which meant I had to design the new feature into the app and map out the workflow on how the app and room would integrate and work together.


This task was extremely challenging because there were several unknowns.


Throughout the project, I worked closely with developers to not only help them design the app, but also to help them code it correctly. We added a lot of "smarts" into our solution from meeting management to when the help animation would appear. All these little behaviors required lines and lines of code, which I helped write.

Below is one example where the team asked me to help them with the logic behind our meeting countdown timer across the various use cases we allow.
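The actual timer logic isn't reproduced here, but a hypothetical sketch shows the shape of the problem: deciding what the room UI presents as a meeting nears its scheduled end. All thresholds and state names are illustrative assumptions.

```python
# Hypothetical sketch of meeting-countdown logic. The thresholds, state
# names, and "extend" rule are illustrative, not the shipped behavior.

def countdown_state(seconds_left, extended=False):
    """Decide what the room UI shows as a meeting nears its end."""
    if extended:
        return "hidden"            # user extended the meeting: no warning
    if seconds_left > 10 * 60:
        return "hidden"            # plenty of time left
    if seconds_left > 60:
        return "subtle"            # low-key reminder in the corner
    if seconds_left > 0:
        return "prominent"         # full countdown with end-meeting prompt
    return "overtime"              # meeting ran past its scheduled end

assert countdown_state(15 * 60) == "hidden"
assert countdown_state(5 * 60) == "subtle"
assert countdown_state(30) == "prominent"
assert countdown_state(0) == "overtime"
assert countdown_state(30, extended=True) == "hidden"
```

Writing the rules as a single pure function like this makes every use case testable in isolation, which is exactly what the developers needed when the edge cases multiplied.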



When you develop a brand new technology, you also create a brand new experience.


Our biggest question was whether this new experience was intuitive for users and how adoption would affect the overall experience.

We knew there was a learning curve to our solution; how steep, we didn't really know. Throughout the project, we conducted several usability tests, from preference, exploration, and concept testing to in-depth task analysis using standards like SUS and NASA-TLX. We tried several research methods to uncover usability issues and identify UX gaps.


For all our studies, we recruited specialized users such as IT Decision Makers as well as general users with experience across various Unified Communications platforms. We would slice the data accordingly when analyzing the results.

In this study example, we asked IT Decision Makers to conduct a typical meeting using our solution. They had little to no instructions on how to get started and even tried new ways to run the meeting as they discovered new features.


This picture was taken from behind a one-way mirror positioned in our usability lab.

The users were given all the devices they typically use in a meeting.


Our hypothesis about a learning curve proved correct; however, it wasn't as steep as we had predicted. For the most part, once users learned the hardware controls, they were able to navigate the system with ease. And because we designed the UX to be "forgiving," mistakes had little to no consequence, which made users comfortable clicking around and trying new features.

So how do we lower this learning curve? Our first thought was to provide marketing materials, but we also knew that many users would not read the instructions prior to using the solution. Knowing this, I designed a Quick Start Guide that would ship with the device so that companies could place it near or around the device in the room.


It worked. And it worked very well.

We decided to take it a step further and invest in professional tutorial videos. We hired an agency to film 30-second videos for the most common user tasks and use cases.

My colleagues and I volunteered to be the actors in the tutorial videos. Later that day, I added hand modeling to my resume.

Videos speak louder than words.


After integrating the tutorial videos into the app, we ran several more studies focused strictly on intuitiveness, aimed at addressing the learning curve issue identified in the prior research study.

Our results significantly improved with the added HELP content. This was reflected in an increase in the SUS score (per task and product) as well as the NPS.





It was now time to see how we stacked up to the competition. I submitted our work to the top 3 internationally recognized design competitions and a few months later, we learned we had swept all the awards!

ThinkSmart Software was recognized and honored with:




I had the honor to receive the Red Dot Award in person at the Red Dot Ceremony in Singapore in October 2019.