
The Lean Product Playbook

 
UX Principles
Components
Utility - Can customers use your product?
Usability testing: ask users to complete key tasks and observe what they do
Efficiency: number of clicks, taps, keystrokes, or other user actions required to complete a task in a given UX
Delight - Do customers enjoy using your product?
Seem to read the user’s mind
Aesthetics
Dynamic response by the product based on user action
Surprise
UX Design Iceberg
Conceptual
Uber - map based app design
Experiencing user research firsthand is much more impactful than just reading a research report
Team debriefs, where individuals share and discuss their observations, help maximize learning and should be held promptly after the research occurs
Documenting the summary of results and key takeaways is also important to solidify the learning and capture it for others
Personas
Target user’s goals along with any relevant psychographic, behavioral, and demographic attributes
How tech-savvy the user is: less comfortable users need very simple interfaces that focus on the most important tasks
Technically advanced users care less about simplicity and instead prefer greater flexibility and productivity
User is rushed for time/lean in experience
Naming the persona is critical
 
Info Architecture
Design discipline responsible for defining how the information and functionality of a software product should be structured
Card sorting
Research technique used to learn how customers think about the different parts of the product and how they are related in order to identify their preferred organization scheme
Good IA
Structures the product in a manner that users find intuitive, with labels that are easy to understand, resulting in good usability and findability
Findability:
ask a group of test users to try to find a certain page or screen in your product and see what percentage are successful
navigation patterns from an analytics tool: shortest path to each page?
Sitemap
Used to specify structure for any software product, including mobile apps
All the pages or screens, how they are organized into sections, and the high-level navigation patterns provided.
Page titles and the words used to label sections of the product
Global navigation shows the major sections of the product, which correspond to the main links you would see at the top of a website
Interaction
What actions can the user take at each step, and how will the product respond?
Any user interface control or link with which the user can interact (click, hover, drag, type, tap, swipe, etc.)
Includes the feedback the product gives the user
Response time (confirmation that the system is receiving their actions)
Flowchart
specify the possible flows for key tasks in your user experience (actions that can be taken and the decisions that can be made by both the user and your product)
Wireframes
Look across a product and identify groups of pages or screens that should be similar. Each group will share a distinct template that defines its layout
Visual
Color
To ensure readability of text, there must be a strong contrast between the colors used for text and for the background.
Blue color conveys trustworthiness and calm.
Green is associated with nature, growth, and money.
Purple suggests luxury and creativity.
Red is associated with aggression, passion, power, and danger.
Orange is energetic and vibrant.
Yellow conveys happiness and sunshine.
Brown is associated with warmth and the earth.
Black can suggest sophistication, elegance, and mystery.
White is associated with purity, cleanliness, and simplicity
Typography
Serif fonts work better for print materials, which have a very high resolution (dots per inch), whereas sans serif fonts work better for the web, which has lower resolution
Serif fonts can be hard to read on a screen; on the web they are best reserved for headings and other large text elements
As with color and typefaces, you want to avoid using too many different text sizes and be consistent throughout your product.
 
Graphics
Hero images—where a large, prominent photo shows your product, a typical customer, or some other artistic or inspirational object or scene
Illustrations are often used to explain how your product works
Icons are small symbols used to represent objects or concepts
 
Style Guides
design deliverable that is used to achieve a consistent look and feel
specifies the visual design details—such as color, size measurements, fonts, and graphics—for commonly used elements
Layout Grids
12 columns, which is evenly divisible by 2, 3, 4, and 6, and allows a wide range of possible element widths
Grids used in print design often specify vertical divisions (rows)
Grid lines for vertical positioning have been less useful on the web due to the large variation in screen heights
Mockups
higher-fidelity design deliverables that capture your visual design
Design Principles
Gestalt Principle
According to the Gestalt principle of proximity, the brain perceives objects that are closer together as more related than objects that are farther apart
Visual hierarchy
size and color of elements
larger objects are more important and smaller objects are less important
elements with high contrast more important
location of elements
squint your eyes to determine
Principles of Composition
Unity: Does the page or screen feel like a unified whole or a bunch of disparate elements?
Contrast: Is there enough variation in color, size, arrangement, and so forth to create visual interest?
Balance: Have you equally distributed the visual weight (position, size, color, etc.) of elements in your design?
Use of space: How cluttered or sparse does your design feel? Ensuring your design has enough white space—the space you don’t use on the page or screen—is important to avoid designs that feel crowded to the user.
Responsive Design
Determine the screen width “breakpoints” you want to use and then apply the desired differences in styling to each width
With responsive design, as the screen width shrinks from wide to narrow, some page elements start “wrapping”—that is, getting pushed to the next line.
Designing for Multiple Screen Sizes
Harder to design for a smaller screen due to the space constraints, which require more tradeoffs
Many teams embrace a “mobile first” approach—designing for the smallest screen first since this forces them to prioritize what is most important
Copy
Labels, instructions, descriptions, error messages affect usability
Error messages should be helpful and explanatory instead of cryptic
When conducting usability tests, if users encounter difficulty with a particular word or phrase, ask them what they would call it
Design Constraints
Common for a designer to be stronger in either visual design (how the product looks) or interaction design
Crucial for these team members to work together effectively in order to deliver a great UX
4 Horsemen of an A-Team in UX: PM, interaction design, visual design, front-end development
What stage to focus on UX
Innovators may be willing to tolerate a substandard UX for a breakthrough product that provides cutting edge benefits
UX becomes more important to product-market fit as you add different segments
 
Qualitative MVP Testing
Qualitative user tests require that you show customers your product or design deliverables—wireframes, mockups, or prototypes—to solicit their feedback
# of customers
Can speak with >1 customer at a time, but suboptimal results due to group dynamics - fear of being judged or criticized
Groupthink - all or most of the group artificially converges on the same opinions
1/time - more likely to share true feelings
If the user test is remote, then the observers can join the screen sharing session
2-3 customers - good for speed and printed mockups
Optimal #: waves of five to eight customers at a time strike a good balance to uncover major issues and identify patterns across users
No-show rate ~10%
Types of testing
Moderated - researcher is present and conducting the test with the customer
Unmoderated - no moderator is present; recorded for the product team to watch later (Screen, audio, video)
In-person = richer data
Can see the user’s screen and face when you’re in his or her presence
Can pick up little things like sighs, facial expressions, and other subtle cues.
Can see where the user’s eyes are looking
Better rapport = better data
Remote
Prepare to encounter technical difficulties
Common to find that customers have not installed the software required to share their screen or need help getting it running properly
Lag between the customer’s actions and when you see them on your screen; firewalls can also cause problems
Unmoderated
Mostly screen+audio+video
Some calculate clickthrough percentages - useful at scale
No risk of the moderator influencing the results
 
When to use what
When you are early in defining and validating your MVP, moderated testing is the way to go to ensure you can ask questions and get rich customer feedback
When more confident about your MVP, unmoderated testing can be a useful tool to complement moderated testing
Recruiting
Surveys → Segment Target Market by demo/psychographic attributes
Craigslist, TaskRabbit, Amazon’s Mechanical Turk (MTurk)
Harder for B2B - conferences, meetups, or other events where they congregate and conduct some guerrilla on-the-ground testing
Schedule users on a recurring basis in advance - teams can then count on users being available at the designated recurring time
Starbucks guerrilla test
Compensation based on worth of their time/cool swag/gift card
One-way mirror rooms
Ramen user testing
only essential parts of user testing (conference room used)
Max of 3/person
Make users bring laptop
Customers don’t like being recorded
Structuring User Test
Test script - lists what you plan to show and ask the user
which design artifacts or parts of the product you plan to show the user
what tasks you plan to ask the user to attempt to accomplish
what questions you plan to ask the user
Conduct a pilot test with a team member first
Rules
User tests: typically about an hour. First 10-15 min for warming up, needs, and current solution; ~45 min getting feedback; 5-10 min of wrap-up
Honest, negative feedback desirable
Think aloud protocol
Structure of Session
Discovery Phase
Discovery q good for exploring the problem space and value proposition
Understand the customers’ needs, their current solution, what they like and don’t like about it, how satisfied they are
Product feedback
Don’t ask rhetorical or leading questions like “Isn’t this good?” - they confirm that the product is good instead of getting actual, authentic feedback
If a user takes an action on a prototype but doesn’t verbalize what they did or why, ask why they took that particular action
Go deeper with why’s
Substitute with:
Could you please tell me more about that?
Could you please help me understand?
What thoughts were going through your head when you did that?
If the user asks a question, don’t answer it; ask what they would expect
Open v Closed Follow up
In normal conversation, when you’re not moderating a user test, closed questions are perfectly fine
Focus on mainly open q
Don’t embed answer in q
Don’t help users, answer questions with questions to get feedback (Pretend you’re not there)
Wrap
Ratings of features/product
Answer questions that user asked/had problems with
Would you be willing to participate in future research?
Would you like to be notified when this product is available?
Synthesis
Users will give you feedback on: functionality, UX, and messaging
Functionality
Whether your MVP addresses the right benefits or not
Feature missing/extra
UX
Connect feedback to benefits and value proposition, else bad UX may be ruining feature set
Messaging
The way you talk about your features, benefits, and differentiators—your messaging—may not resonate with customers
Map feedback to the 3 categories in a single document
Can also compare ratings
PMF vs. Usability
Feedback on usability - how easy it is for customers to understand
PMF - how valuable they find your product
“Would you buy this?”
 
Iterate and Pivot to Improve PMF
Build Measure Learn
Build - Design something to test
Measure - testing hypotheses with customers
Learn - and form the next hypotheses
Hypothesize Design Test Learn
Hypothesize - formulate your problem space hypotheses
Design - takes us from the problem space to the solution space
Test - Expose product or artifact to customers and make observations
Learn - validated learning
Iterative User Testing
Look at % of users to see how many gave the same feedback, either positive or negative
Proceed to the next step in our product process (lofi to hifi, hifi to build)
Persevere or Pivot
Map each problem back to the corresponding layer of PMF Pyramid
May be iterating at a higher level than where the true problem lies
Eg if hypothesis about target customer is wrong, iterating UX design won’t make much difference
 
Pivot - if one of hypotheses changed
Deciding to change the differentiators in your value proposition
Switching to a completely different target customer
Game Neverending → Flickr
Burbn → Instagram
Persevere/Pivot/Stop
Stop
Resource constraints limit time
If no PMF or significant progress, challenging to raise the next round of investment
Pivot
Not for Shiny object syndrome
If after best efforts, target customers are lukewarm on MVP
If you haven’t yet identified a customer archetype that is very excited about your MVP
If a less central part of your value proposition is what resonates most with customers
Marketing Report Inc
Target Customers: Mainstream Consumers
Underserved Needs
  1. Discover money-saving offers of interest to me
  2. Reduce the amount of irrelevant junk mail I receive
  3. Gain insights into my spending behavior
  4. Meet and interact with other people with similar shopping preferences
  5. Earn money by giving permission to sell my marketing-related data
Value Proposition
MVP Feature Set
MVP Prototype
defining the product’s structure (information architecture) and the flow of the customer experience (interaction design)
User Testing
In-person moderated testing: customers were employed, net-savvy, tested for whether they used coupons/caller id blocking (survey)
User testing script
  1. Introductions and warm-up (5 minutes)
  2. General discovery questions (15 minutes) a. Direct marketing mail b. The data about you that companies have c. Comparing yourself to others financially
  3. Concept-specific questions (45 minutes total) a. Discovery questions related to concept’s main theme (10 minutes) b. Feedback on concept mockups (35 minutes)
  4. Review: What did you most like/dislike about what you saw? (5 minutes)
  5. Brainstorm: What would make the product more useful/valuable? (10 minutes)
  6. Feedback on possible product names (10 minutes)
  7. Thanks and goodbye
Learnings
Neither the Saver nor the Shield concept was appealing enough to customers
Marketing Profile somewhat interesting, but not compelling
Marketing Score confusing and it had low appeal
Had to pick a direction to pivot to
Pivot Point
Saver: many coupon sites, not clear to customers how our offering was differentiated
Customers would pay only a price that was less than the actual savings it achieved for them
Shield: more willing to pay for a service that reduced their junk mail
Willing to pay a small amount: if the service really worked as expected during an initial free trial period, they would be willing to pay afterwards
Iteration
JunkMailFreeze Wave 1
Learning: almost all customers hated certain types of mail: pre-approved credit card offers and cash advance checks which can be stolen
All knew someone who had been a victim of identity theft
Customers considered catalogs to be a pain because they are so big and bulky
Privacy-conscious customers spend a lot of time shredding their junk mail
Customers consider local advertising a nuisance
Expected that the service would take a while to “kick in.”
May inadvertently not receive some type of mailing that they would want to receive
 
Wave 2
everyone really liked product
category-level controls for blocking weren’t adequate for junk mail related to credit cards and catalogs
to specify their preferences at the individual company level
received feedback on our messaging and UX
 
Agile
Teams using Agile methodologies break the product down into smaller pieces that undergo shorter cycles of requirements definition, design, and coding
Waterfall
team first defines all of the requirements, and then designs the product. They then implement the product, followed by testing to verify it works as intended
No progress to the next step until the previous step is 100 percent complete
Scrum
Sprint - time-boxed increments—that is, limited to a specific timeframe
All work that the team completes comes from the product backlog of user stories
Backlog - rank-ordered to-do list
Roles
Product Owner - writes user stories and places them on the product backlog
groom the backlog to make sure that stories being considered for the next sprint are well written and understood by the team
Development team member - estimate the size of stories and build the product
UX/visual Designers bring the user stories to life by designing the user experience, which they convey through design deliverables
QA testers help check to see if acceptance criteria are met and ensure the quality of the product
Size = the two pizza rule
Scrum Master - job is to help the team with the Scrum process and improve its productivity over time
Process
Sprint planning meeting - which stories to accomplish in sprint and move those stories from the product backlog to the sprint backlog
Estimate the scope of each story using story points (Fibonacci numbers, powers of 2, or T-shirt sizes, often via Planning Poker); tracking how many points the team completes each iteration (its velocity) determines the team's capacity for work
Teams will often break each story down into the set of coding tasks required to implement it
When developers have varying skills, assign each story to a specific developer to ensure the team is properly load-balanced for the sprint
Agile tools: JIRA Agile, Rally, VersionOne, and Pivotal Tracker
Burndown chart - shows how much work remains to be completed for the iteration
QA testing during sprint: for higher velocity, team members should test stories as developers complete them. If the story meets its acceptance criteria, then it is accepted; otherwise, it is rejected and kicked back to development
Goal for sprint - complete an “increment” of work that adds functionality to the product
Sprint review meeting - team members show what they have built
Retrospectives - reflect on how the last sprint went (what worked well, what didn’t, and improvements)
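The planning mechanics above (story points, velocity, burndown) can be sketched in a few lines of Python; the numbers and the greedy top-of-backlog rule are simplifications of real sprint planning:

```python
def velocity(points_completed_per_sprint):
    """A team's velocity: average story points completed per sprint."""
    return sum(points_completed_per_sprint) / len(points_completed_per_sprint)

def plan_sprint(backlog_points, capacity):
    """Pull rank-ordered stories from the top of the backlog until the
    next one would exceed the sprint's capacity in story points."""
    sprint, total = [], 0
    for points in backlog_points:
        if total + points > capacity:
            break
        sprint.append(points)
        total += points
    return sprint

def burndown(total_points, points_completed_per_day):
    """Remaining story points at the end of each day of the sprint --
    the data series behind a burndown chart."""
    remaining, series = total_points, [total_points]
    for done in points_completed_per_day:
        remaining -= done
        series.append(remaining)
    return series
```

For example, velocity([20, 24, 22]) is 22.0, so a backlog starting [8, 5, 13, 3] yields a sprint of [8, 5], and burndown(13, [5, 0, 8]) gives [13, 8, 8, 0].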
Kanban
Each card is a user story or a development task that supports a user story
Cards arranged on a kanban board, which consists of a set of columns, one for each different state of work
Stages of Kanban
Backlog: Items to be potentially worked on, sorted in priority order.
Ready: Items that have been selected from the backlog and are ready for development.
In development: Items that a developer has started working on.
Development done: Items that the developer has finished working on but which have not been tested yet.
In testing: Items in the process of being tested.
Testing done: Items that have successfully passed testing but have not yet been deployed.
Deployed: Items that have been launched.
WIP limit - maximum number of cards each column can contain
Single WIP limit to constrain the total number of cards across the two related “in progress” and “done” states
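A minimal sketch of WIP-limit enforcement in Python; the column names and limits are illustrative:

```python
class KanbanBoard:
    """Toy kanban board that refuses moves into a column at its WIP limit."""

    def __init__(self, wip_limits):
        # wip_limits: {column_name: max_cards}; use float("inf") for no limit
        self.wip_limits = wip_limits
        self.columns = {name: [] for name in wip_limits}

    def add(self, card, column):
        if len(self.columns[column]) >= self.wip_limits[column]:
            return False
        self.columns[column].append(card)
        return True

    def move(self, card, src, dst):
        if len(self.columns[dst]) >= self.wip_limits[dst]:
            return False  # pull blocked: finish work in dst first
        self.columns[src].remove(card)
        self.columns[dst].append(card)
        return True

board = KanbanBoard({"Ready": float("inf"), "In development": 2, "In testing": 1})
for card in ("A", "B", "C"):
    board.add(card, "Ready")
board.move("A", "Ready", "In development")
board.move("B", "Ready", "In development")
print(board.move("C", "Ready", "In development"))  # False: WIP limit of 2 reached
```

Blocking the pull of card C is the point of the WIP limit: it forces the team to finish in-progress work before starting more.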
Use Whiteboard or digital tools like JIRA Agile, SwiftKanban and LeanKit
Scrum v Kanban
Kanban tends to work best with smaller development teams
Lower process overhead and the lack of a predetermined iteration length can enable faster delivery of product
Scrum works better with hard deadlines
Can make high-level estimates for how many sprints a feature should take
Success Req Checklist
Cross-Functional Collaboration - free + frequent comms (chat, development-tracking tool (e.g., JIRA Agile), and knowledge collaboration tools (e.g., a wiki or Google Docs)
Ruthless Prioritization - clear on rank order priorities but quickly incorporate new or changing requirements
Adequately Define Your Product for Developers
Wireframes if style guide in place and no new major UX components
Mockups if visual design details need to be conveyed
User stories alone suffice for purely back-end stories with no UX component
Stay Ahead of Developers
Scrum - designers 1-2 sprints ahead of the current sprint
Kanban - PM should ensure there are enough cards in the “ready for design” queue, Designers should ensure there are enough cards in the “ready for development” queue
Break Down Stories into smallest possible size to avoid estimation errors
QA
Pre QA
Code review - one developer examines another’s code—and can catch mistakes that the original developer missed
Pair programming - two developers work on creating the code together at the same time
Driver and observer - good for learning
 
QA
Types
Manual/Black box - one or more people interact with the product to verify it works as expected
Automated - software is used to run test cases on the product and compare the actual results with the predicted results
Aspects to test
Validation testing - new or improved functionality consistent with user stories and design artifacts
Regression testing - no existing functionality was inadvertently broken during build
Test driven development
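Test-driven development means writing the test cases first, then writing code until they pass; a sketch using Python's built-in unittest (the function under test is hypothetical):

```python
import unittest

def apply_discount(price, percent):
    """Hypothetical function under test: apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class TestApplyDiscount(unittest.TestCase):
    # In TDD these cases exist (and fail) before apply_discount is written.
    def test_basic_discount(self):
        self.assertEqual(apply_discount(100.0, 25), 75.0)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

# Run the red/green cycle explicitly
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestApplyDiscount)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The same test cases later double as the regression suite: any change that breaks existing behavior fails them immediately.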
Cont. integration v deployment
Integration - version control system to keep track of every single revision made to the code
Integration testing is performed at this point to ensure that the new product works as intended
Each developer performs unit testing of his or her code by writing the relevant test cases and ensuring they all pass
Deployment - code that successfully passes all tests is automatically deployed
staging environment (an internal environment that customers can’t access) or directly to production
DevOps - which focuses on building and operating rapidly changing, resilient systems at scale
Metrics that track the health of the product are used to trigger an automated rollback
 
 
 
Measuring Metrics
Research Method F/W
Behavioral information - what customers actually do
Attitudinal - what customers say about their attitudes and opinions
PMF for v1 - qualitative “Oprah” methods; Optimize - quantitative “Spock” methods
User Interviews - understand a user’s needs and preferences
Usability Test - behavioral learning by observing the customer use your prototype
Survey
Provide attitudinal data - opinions often don’t end up matching behavior
Landing page smoke test > survey
Sensitive to the specific wording of the question
NPS = Net Promoter Score = %Promoters - %Detractors (attitudinal)
Sean Ellis’ PMF: “How would you feel if you could no longer use X?”
If >40% very disappointed, then PMF
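Both survey metrics above reduce to simple arithmetic; a minimal sketch in Python (the score thresholds are the standard NPS buckets, and the response labels are illustrative):

```python
def nps(scores):
    """Net Promoter Score from 0-10 'how likely are you to recommend us?'
    answers: promoters score 9-10, detractors 0-6, passives 7-8."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

def sean_ellis_test(responses):
    """% answering 'very disappointed' to 'How would you feel if you
    could no longer use X?'; above 40% suggests product-market fit."""
    pct = 100 * responses.count("very disappointed") / len(responses)
    return pct, pct > 40

print(nps([10, 9, 9, 8, 7, 6, 3, 10, 9, 2]))  # 20.0 (5 promoters, 3 detractors)
```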
A/B Testing
Know the difference in conversion rate for different landing pages
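To judge whether an observed difference in conversion rate is real rather than noise, a two-proportion z-test is a common choice; a sketch using only the standard library (the traffic numbers are made up):

```python
from math import sqrt, erfc

def ab_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Compare conversion rates of landing pages A and B with a
    two-proportion z-test; returns (rate_a, rate_b, two-sided p-value)."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    return p_a, p_b, erfc(abs(z) / sqrt(2))  # erfc gives the two-sided tail

rate_a, rate_b, p = ab_test(120, 1000, 150, 1000)
```

With these made-up numbers (12% vs 15% conversion on 1,000 visitors each), the p-value lands near 0.05, right at the conventional significance threshold.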
Peter Drucker - you can’t manage what you don’t measure
Viral Loop
Detailed steps by which an existing customer generates a new customer
Startup metrics for pirates - acquisition, activation, retention, revenue, referral
Identify MTMM - metric that matters most - highest ROI opportunity for improving your business right now
Retention → Conversion → Acq → Rev → Paid Acq
Retention curve
Cohort analysis
Analysis of metrics for different cohorts over time
Group of users that share a common characteristic—such as the month that they signed up
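A cohort analysis like the one above can be computed from signup and activity data; a sketch with hypothetical inputs:

```python
from collections import defaultdict

def cohort_retention(signups, activity, n_months):
    """signups: {user: signup_month_index}; activity: set of
    (user, month_index) pairs. Returns {cohort_month: [% of the cohort
    active 0, 1, ... n_months-1 months after signup]}."""
    cohorts = defaultdict(list)
    for user, month in signups.items():
        cohorts[month].append(user)
    table = {}
    for month, users in sorted(cohorts.items()):
        table[month] = [
            round(100 * sum((u, month + offset) in activity for u in users)
                  / len(users), 1)
            for offset in range(n_months)
        ]
    return table
```

Each row of the result is one cohort's retention curve; comparing rows shows whether newer cohorts retain better than older ones.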
AARRR → Profitability f/w
Profit = Revenue − Cost
Revenue = Users × Average Revenue per User
Ad model
Revenue = Visitors × Average Revenue per Visitor
Visitors = New Visitors + Returning Visitors
Subscription model
Revenue = Paying Users × Average Revenue per Paying User
Paying Users = New Paying Users + Repeat Paying Users
New Paying Users = Free Trial Users × Trial Conversion Rate + Direct Paid Signups
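The subscription-model equations above chain together; a minimal sketch with illustrative numbers:

```python
def subscription_revenue(free_trial_users, trial_conversion_rate,
                         direct_paid_signups, repeat_paying_users,
                         avg_revenue_per_paying_user):
    """Revenue = Paying Users x Avg Revenue per Paying User, where
    Paying Users = New Paying Users + Repeat Paying Users and
    New Paying Users = Free Trial Users x Trial Conversion Rate
                       + Direct Paid Signups."""
    new_paying = free_trial_users * trial_conversion_rate + direct_paid_signups
    paying_users = new_paying + repeat_paying_users
    return paying_users * avg_revenue_per_paying_user

# Illustrative: 1,000 trials at a 10% conversion rate, 50 direct signups,
# 400 repeat payers, $20 per paying user -> 550 paying users x $20
print(subscription_revenue(1000, 0.10, 50, 400, 20))  # 11000.0
```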
Profitability Path
LTV - profit a customer generates for you, not counting CAC
LTV = ARPU × Average Customer Lifetime × Gross Margin
Average Customer Lifetime = 1/Churn Rate
LTV = ARPU × Gross Margin / Churn Rate
CAC = Sales and Marketing Costs / New Customers Added
CAC = Cost per Prospect / Prospect Conversion Rate
For profit, LTV > CAC
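The unit-economics formulas above in code form; a sketch with made-up numbers:

```python
def ltv(arpu, gross_margin, churn_rate):
    """LTV = ARPU x Gross Margin / Churn Rate, using
    Average Customer Lifetime = 1 / Churn Rate (same time unit as ARPU)."""
    return arpu * gross_margin / churn_rate

def cac(sales_and_marketing_costs, new_customers_added):
    """CAC = Sales and Marketing Costs / New Customers Added."""
    return sales_and_marketing_costs / new_customers_added

# Illustrative: $20/month ARPU, 80% gross margin, 5% monthly churn
# gives LTV = 20 * 0.8 / 0.05 = $320; spending $50,000 to add 500
# customers gives CAC = $100, so LTV > CAC and the model can be profitable.
```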
 
Analytics