USWDS Design Principles support and reflect the important guidance codified in the 21st Century Integrated Digital Experience Act. These design principles are intended to help teams across government align on important common goals and make better use of the design system, serving as an evaluative lens for design and implementation decisions. Regardless of how you build, any USWDS project should support these principles.
Start with real user needs
Real user needs should inform product decisions. Whether our audience includes members of the public or government employees, decision-makers must include real people in the design process from the beginning. Then we need to test the assumptions we make, and the products and services we build, with real people to stay focused on what is most useful and important.
- Does your product or service have access to the resources necessary to perform research?
- Who is your primary audience?
- What user needs will this product or service address?
- Do you use personas or other audience segment analysis techniques to connect your solutions to different segments of your audience?
- How often are you testing with real people?
- Which people will have the most difficulty with the product or service?
- Which research methods were used?
- What were the key findings?
- How and where were the findings documented?
- Usability testing with Steve Krug [digital.gov]
- 18F’s human-centered design methods [methods.18f.gov]
- 18F’s research guidelines [handbook.18f.gov]
- Interview checklist [methods.18f.gov]
- Tips for capturing the best data from user interviews [18f.gsa.gov]
- How to address barriers to user-centered design [18f.gsa.gov]
- Three ways to manage research projects remotely [18f.gsa.gov]
- Driving innovation with design education by the Lab@OPM [youtube.com]
- The Foundations of Agile, Part I [digital.gov]
- The Foundations of Agile, Part II [digital.gov]
- The laws of UX [lawsofux.com]
- A List Apart: Articles, books, and events for people who make websites [alistapart.com]
- Start early. Early in the project, spend time with current and prospective users to better understand their perspective and the context of the problem.
- Use a range of methods. Use a range of qualitative and quantitative research methods (such as 18F Methods) to determine people’s goals, needs, and behaviors.
- Use prototypes. Use prototypes to test your assumptions and solutions with real people, in the field if possible.
- Share your findings. Document and share your research findings with team members, managers, and the public, when practical.
- Test regularly. As the product is being built, regularly test it with potential users to ensure it meets people’s needs.
Earn trust
Trust has to be earned every time. Federal websites and digital services can’t assume it. Trust is about understanding and meeting or exceeding expectations, a process that can be established quickly and maintained over continued interactions, but is easily damaged. Be reliable, consistent, and honest. Reduce the impact of failure with solid design and engineering. Be a good steward of your audience’s data, resources, and time.
- Do users understand that this is a government site or service?
- What are the public’s expectations of your product?
- What private or sensitive data do you ask your users to provide?
- What are you doing to keep that data private?
- Does your product use redundancy to minimize the effect of server failures or traffic spikes?
- Does your product use continuous integration testing to prevent unintended regressions?
- Can users undo actions or edit data they’ve added to the system?
- How often do you check that your service works as intended?
- What components are made available to the public as open source?
- How quickly do you respond to bug reports?
- Is your content written in clear, easy-to-follow plain language?
- Do you provide meaningful access to people with limited English proficiency?
- Digital Services Playbook [playbook.cio.gov]
- U.S. Web Design System performance guidelines [designsystem.digital.gov]
- The HTTPS-only standard [https.cio.gov]
- .gov domain registration [dotgov.gov]
- Federal plain language guidelines [plainlanguage.gov]
- How to design effective communications [PDF; oes.gsa.gov]
- Best practices for multilingual websites [digital.gov]
- Ten usability heuristics [nngroup.com]
- A link is a promise [nngroup.com]
- Preventing user errors [nngroup.com]
- Designing for trust [TED.com]
- Humane by design [humanebydesign.com]
- Identify yourself. Clearly identify your site as a federal government site.
- Build with modern best practices. See the Digital Services Playbook.
- Review your content. Review your content at least twice per year to ensure information is correct and not redundant.
- Use the proper government domain. Use a .gov top-level domain and https with up-to-date certificates.
- Add the USWDS banner component. The banner shows your site is an official government website and explains the benefits of secure connections.
- Identify link rot. Find and fix broken links on your website.
- Keep communications simple. Ensure content is easy, personal, and timely.
- Write for the web. Expect users to skim and scan.
- Properly manage data and records. Reach out to your agency’s records officer and privacy official. Consult with them to ensure you are properly managing data and records (see play #11, Manage security and privacy through reusable processes, in the Digital Services Playbook).
- Understand expectations. Understand what your audience expects of your service, and validate the success of your service with real users.
- Publish open code and data. When appropriate, publish source code and datasets of projects or components online.
- Work in the open. When appropriate, share your development process and progress publicly.
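The banner component mentioned above can be added with markup along these lines. This is a simplified sketch based on USWDS conventions; class names and structure may change between releases, so check the current component documentation on designsystem.digital.gov:

```html
<!-- Simplified USWDS banner: identifies the site as an official government website -->
<section class="usa-banner" aria-label="Official website of the United States government">
  <div class="usa-accordion">
    <header class="usa-banner__header">
      <div class="usa-banner__inner">
        <p class="usa-banner__header-text">
          An official website of the United States government
        </p>
        <button type="button" class="usa-accordion__button usa-banner__button"
                aria-expanded="false" aria-controls="gov-banner">
          <span class="usa-banner__button-text">Here’s how you know</span>
        </button>
      </div>
    </header>
    <!-- Expanded content explains .gov domains and secure HTTPS connections -->
    <div class="usa-banner__content usa-accordion__content" id="gov-banner">
      <p>Official websites use .gov. Secure .gov websites use HTTPS.</p>
    </div>
  </div>
</section>
```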
Embrace accessibility
Accessibility affects everybody; build it into every decision. Legal requirements are a critical, necessary starting point, but this is only the beginning. Accessibility is about real people who use our services — it’s usability for people who interact with products differently. Everyone who works on government websites has a role to play in making federal resources accessible and inclusive. Design generously and celebrate accessibility requirements as a set of design constraints that help us create a better product for all users.
- Can users navigate your site using only the keyboard?
- Can users use a screen reader to access the page content?
- Can users quickly understand the main points of your content?
- Can users easily interpret content associated with graphic elements?
- Can users easily understand and complete key tasks?
- Are you testing your service with a broad range of users?
- Do you know your agency accessibility team?
- Is your site organized such that everyone can navigate it easily?
- Are you using accessibility testing tools?
- Did your accessibility testing tools provide accurate results?
- Are you providing content in languages other than English, as appropriate for the audience?
- The Section 508 accessibility program [section508.gov]
- Accessibility for Teams guide [accessibility.digital.gov]
- Find your Section 508 Coordinator [section508.gov]
- Section 508 ICT Testing Baseline [section508coordinators.github.io]
- Learn about writing in plain language [plainlanguage.gov]
- Plain language resources [plainlanguage.gov]
- 18F’s accessibility guide [accessibility.18f.gov]
- Accessibility Requirements Tool (ART) for contracting [section508.gov]
- Understanding universal design [section508.gov]
- Web Accessibility Perspectives Videos [w3.org]
- Inclusive design principles [inclusivedesignprinciples.org]
- Accessibility fundamentals with Rob Dodson [youtube.com]
- Microsoft’s inclusive design manual [microsoft.com]
- Inclusive components by Heydon Pickering [inclusive-components.design]
- Humane by design [humanebydesign.com]
- Humanize accessibility. Seek out examples of the real life impact of accessible products and services. Try to make accessibility less abstract and more personal.
- Use agency resources. Reach out to your agency’s accessibility team and build a relationship with them.
- Learn about assistive technology. Get familiar with the basic ways people use assistive technology and how people with disabilities use the web.
- Follow existing standards. Conform to the Revised 508 Standards and WCAG 2.0.
- Work from existing resources. Consult Section508.gov, Accessibility for Teams, and the 18F Accessibility Guide.
- Design generously. Adopt an inclusive design mentality, as described on the Inclusive Design Principles website.
- Develop accessible code. Ensure front-end code is written accessibly, and conduct both manual and automated accessibility testing.
- Write accessible content. Ensure content is written in plain language and that headings, images, and links are accurately labeled.
- Build accessible designs. Ensure that designs are accessible, pages are laid out in a logical order, and content meets color contrast requirements.
- Test broadly. Test with a broad range of users and abilities throughout the design and development process, including manual accessibility testing against the Trusted Tester and ICT Testing Baseline.
- Be responsive. Remediate accessibility issues when you discover them.
- Contract for accessibility. Use the Accessibility Requirements Tool (ART) to incorporate accessibility requirements into your contracts.
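Accessible code, content, and design often come down to small details in markup. This hypothetical page fragment illustrates a few of the practices above; the element names, text, and file names are invented for illustration:

```html
<!-- Hypothetical fragment: headings, images, links, and inputs all carry accurate labels -->
<h1>Apply for a permit</h1>

<!-- Images get meaningful alternative text, not file names -->
<img src="permit-steps.png"
     alt="Three steps: create an account, fill out the form, submit">

<form>
  <!-- Every input is associated with a visible label via for/id -->
  <label for="full-name">Full name</label>
  <input id="full-name" name="full-name" type="text" autocomplete="name">

  <!-- Links describe their destination, rather than saying "click here" -->
  <p>Read the <a href="/permit-requirements">permit requirements</a> before you apply.</p>

  <button type="submit">Submit application</button>
</form>
```

Because the labels, headings, and link text are in the markup itself, the same fragment works for keyboard users, screen reader users, and people skimming the page visually.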
Promote continuity
Minimize disruption and provide a consistent experience: throughout services, over time, and across agencies, platforms, and devices. Consistency is not necessarily conformity. Agencies, sites, and services may have different audiences, missions, and goals — and the way we implement our solutions may differ — but we promote continuity by starting from shared solutions and values. These design principles are one set of shared values, and the design language of the U.S. Web Design System is another. Strive to build user-centered solutions that address the whole experience: not just a user’s specific task, but the context of their journey.
- Do you know if your audience understands that your product is a government site or service?
- Do you know if your audience understands the purpose of each page or section?
- Is it always clear what users are expected to do next?
- Does your agency have established style guidance?
- Have you tried and tested shared solutions before developing your own?
- Have you considered your service in the context of customer or user journeys?
- Have you identified your highest-impact customer or user journeys? Within these journeys, have you identified specific opportunities at which to collect feedback?
- Have you considered your service in the broader context of a service ecosystem?
- Can you reach across agencies and silos to collaborate and share solutions?
- Does your site or service have a consistent experience on any device or browser?
- Do users have equivalent access to your information and services on any device?
- What factors outside the scope of your product or service affect its success?
- What other government products or services are related to the success of your product or service?
- Are you able to coordinate solutions with other projects that share a similar audience?
- How design systems help us work together [digital.gov]
- The U.S. Web Design System [designsystem.digital.gov]
- The Journey Mapping method [methods.18f.gov]
- Embracing responsive design [digital.gov]
- Example: USDA’s design and brand guidelines [usda.gov]
- Example: Centers for Medicare and Medicaid Services design system [design.cms.gov]
- Federal Front Door research findings [labs.usa.gov]
- Sharing Quality Services Cross-Agency Performance goal [performance.gov]
- Eight Principles of Mobile-Friendliness [digital.gov]
- Test on real devices with the Federal Crowdsource Mobile Testing Program [digital.gov]
- Journey mapping 101 [nngroup.com]
- Service design 101 [nngroup.com]
- Service blueprints [nngroup.com]
- Using a service ecosystem map [service-design-network.org]
- Ten usability heuristics [nngroup.com]
- Atomic Design [atomicdesign.bradfrost.com]
- Identify as a government site. Clearly and consistently identify as a government site on every page.
- Use a style guide. Use a simple and flexible style guide for content and style throughout a product or service. Check whether guides already exist in your agency before developing something new.
- Connect related services with a similar style. Use the style guide consistently for related digital services.
- Support a wide range of devices and platforms. Ensure a mobile-friendly experience across a wide range of devices and platforms.
- Test on real devices. Test your site on actual mobile devices as often as possible.
- Move or remove content with care. Provide proper notice and forwarding when content is moved or removed.
- Clarify multi-step processes. Give users clear information about where they are in each step of a process.
- Support multi-session processes. Provide users with a way to exit and return later to complete a process.
- Support re-use of saved data. Ensure that repeat visitors who have logged in can auto-populate forms with saved information.
- Find a community. Participate in cross-government communities of practice.
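The device-support actions above start from a responsive baseline. A minimal sketch of that baseline follows; the class name and breakpoint are illustrative, not USWDS design tokens:

```html
<!-- Responsive baseline: let the page scale to the device width -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  /* Hypothetical layout: single column by default (small screens first) */
  .sidebar { display: none; }

  /* Reveal the second column only when the screen is wide enough */
  @media (min-width: 640px) {
    .sidebar { display: block; }
  }
</style>
```

Testing this on actual devices, rather than only in a desktop browser’s emulator, catches differences in touch targets, fonts, and rendering that emulation misses.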
Listen
Evaluate and improve your product by listening to your audience and learning from what you hear. Continuous feedback drives continuous improvement. Measure customer experience — how well what we’ve built is working for our audience — at every stage of a project, and as projects grow and mature. Listen to what people say and observe how they interact with our products or services, whether through direct observation or through analytics data. If we’re not listening, we’re not learning.
- Does your product or service have access to people with design, development, and research skills?
- What are the key metrics your service uses to measure success?
- How are your success metrics tied to positive customer or user outcomes?
- How have these metrics performed over the life of the service?
- Do you have system monitoring tools and processes in place to identify and respond to issues?
- Which tools are in place to measure user behavior, and how do you use them?
- Do you measure customer satisfaction and take steps to improve satisfaction?
- Do you assess your customer experience maturity and develop action plans to identify focus areas for improvement?
- How are you collecting user feedback for bugs and other product issues?
- Do all members of the project team participate in user interviews and research activities?
- Do you cultivate direct community participation in your project with activities like hackathons?
- How often are you reviewing and addressing feedback and analytics?
- Do you contribute feedback to services your project uses?
- Guide to the Digital Analytics Program (DAP) [digital.gov]
- Getting started with Search.gov [search.gov]
- Office of Evaluation Sciences [oes.gsa.gov]
- Improving Customer Experience: Cross-Agency Priority Goal [performance.gov]
- Tips for Starting Your Customer Experience Journey [performance.gov]
- OMB Circular A-11 Section 280: Managing Customer Experience and Improving Service Delivery [performance.gov]
- USDA’s guidance on conducting a regular customer experience (or A-11) survey [usda.gov]
- Customer Experience Maturity Self-Assessment [performance.gov]
- Customer Experience Action Plan Template [performance.gov]
- Customer Experience Toolkit [digital.gov]
- Top tasks: prioritizing what is truly important [digital.gov]
- Google Design: Stop Talking, Start Listening [medium.com]
- Design is Listening: listening as design strategy [youtube.com]
- Actively collect issues. Offer users a mechanism to report bugs and issues, and be responsive to these reports.
- Collect direct feedback. Actively collect, review, and address feedback about your product or service (such as through surveys or customer email).
- Analyze analytics data. Implement both the governmentwide Digital Analytics Program (DAP) and agency-specific analytics services and analyze the data.
- Analyze search results. Include a search function on your site (through Search.gov or another tool) and analyze the search data.
- Analyze social media data. If you use social media platforms, analyze the data from these platforms.
- Publish metrics. Publish metrics internally and externally.
- Coordinate large projects with service design. Conduct a service design analysis when designing, coordinating, or consolidating large sites or services.
- Involve the team in research. Involve all members of a project team in user interviews and research activities to hear directly from real users.
- Use direct observation. Use direct observation in your research whenever possible to understand the context of a user’s actions.
- Keep testing. Test and re-test with real users.
- Share back. Contribute feedback and share solutions back to the internal and open source projects you use.
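Implementing the Digital Analytics Program mentioned above is typically a single script include in the site’s page template. The snippet below is a sketch: `AGENCY` is a placeholder, and the exact tag for your agency comes from the DAP guide on digital.gov:

```html
<!-- Governmentwide Digital Analytics Program (DAP) participation tag.
     Replace AGENCY with your agency's abbreviation per the digital.gov DAP guide. -->
<script async type="text/javascript"
        id="_fed_an_ua_tag"
        src="https://dap.digitalgov.gov/Universal-Federated-Analytics-Min.js?agency=AGENCY">
</script>
```

Once the tag is in place, governmentwide traffic data becomes available alongside any agency-specific analytics you run, so both can inform the metrics and customer-experience reviews described above.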