Design principles

USWDS design principles support and reflect the important guidance codified in the 21st Century Integrated Digital Experience Act. These design principles are intended to help teams across government align on common goals and make better use of the Design System, serving as an evaluative lens for design and implementation decisions. Regardless of how you build, these principles should support your project.

Start with real user needs

Real user needs should inform product decisions. Whether your audience includes members of the public or government employees, decision-makers must include real people from the beginning of the design process. Test your team’s assumptions and the products and services you build with real people to keep focused on what is most useful and important.

Key considerations

The following are useful questions to ask your team as you assess users’ needs:

  • Does your product or service have access to the resources necessary to perform research?
  • Who is your primary audience?
  • What user needs will this product or service address?
  • Do you use personas or other audience-segment analysis techniques to connect your solutions to different segments of your audience?
  • How often are you testing with real people?
  • Which people will have the most difficulty with the product or service?
  • Which research methods have been used, and which methods do you plan to use?
  • What are the key findings?
  • How and where are your findings documented?

Practical actions

The following are practical actions you can take:

  • Start early. Early in the project, spend time with current and prospective users to better understand their perspective and the context of the problem.
  • Use a range of methods. Use a range of qualitative and quantitative research methods (such as 18F Methods) to determine people’s goals, needs, and behaviors.
  • Use prototypes. Use prototypes to test your assumptions and solutions with real people, in the field if possible.
  • Share your findings. Document and share your research findings with team members, managers, and the public whenever practical.
  • Test regularly. As the product is being built, regularly test it with potential users to ensure it meets people’s needs.

Earn trust

Trust has to be earned every time. Federal websites and digital services can’t assume it. Trust is about understanding and meeting or exceeding expectations; it can be established quickly and maintained over continued interactions, but it is easily damaged. Be reliable, consistent, and honest. Reduce the impact of failure with solid design and engineering. Be a good steward of your audience’s data, resources, and time.

Key considerations

The following are useful questions to ask your team as you strategize to build trust with users:

  • Do users understand that this is a government site or service?
  • What are the public’s expectations of your product?
  • What private or sensitive data do you ask your users to provide?
  • What are you doing to keep that data private?
  • Does your product use redundancies to minimize the effect of server failure or traffic spikes?
  • Does your product use continuous integration testing to prevent unintended regressions?
  • Can users undo actions or edit data they’ve added to the system?
  • How often do you check that your service works as intended?
  • What components are made available to the public as open source?
  • How quickly do you respond to bug reports?
  • Is your content written in clear, easy-to-follow plain language?
  • Do you provide meaningful access to people with limited English proficiency?

Practical actions

The following are practical actions you can take:

  • Identify yourself. Clearly identify your site as a federal government site.
  • Build with modern best practices. Review the guidance outlined in the Digital Services Playbook.
  • Review your content. Review your content at least twice per year to ensure information is accurate and not redundant.
  • Use the proper government domain. Use a .gov top-level domain and HTTPS with up-to-date certificates (an HTTPS redirect sketch follows this list).
  • Add the USWDS banner component. This component shows your site is an official government website and explains the benefits of secure connections.
  • Identify link rot. Find and fix broken links on your website (a link-check sketch follows this list).
  • Keep communications simple. Ensure content is easy to understand, personal, and timely.
  • Write for the web. Expect users to skim and scan.
  • Properly manage data and records. Reach out to your agency’s records officer and privacy official. Consult with them to ensure you are properly managing data and records (review play #11, Manage security and privacy through reusable processes, in the Digital Services Playbook).
  • Understand expectations. Understand what your audience expects of your service, and validate the success of your service with real users.
  • Publish open code and data. When appropriate, publish source code and datasets of projects or components online.
  • Work in the open. When appropriate, share your development process and progress publicly.
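
The HTTPS guidance above is usually enforced at the platform level, but the underlying pattern is simple: redirect insecure requests and tell browsers to stay on HTTPS. A minimal sketch, assuming a Node service behind a proxy and using Express; the port, hostname handling, and header values are illustrative rather than official requirements:

```ts
import express from "express";

const app = express();

// Behind a load balancer or CDN, the original protocol usually arrives in the
// X-Forwarded-Proto header; trusting the proxy lets req.secure reflect it.
app.set("trust proxy", true);

// Redirect any insecure request to its HTTPS equivalent.
app.use((req, res, next) => {
  if (!req.secure) {
    return res.redirect(301, `https://${req.hostname}${req.originalUrl}`);
  }
  next();
});

// Ask browsers to use HTTPS for future visits (HSTS).
app.use((_req, res, next) => {
  res.setHeader(
    "Strict-Transport-Security",
    "max-age=31536000; includeSubDomains; preload"
  );
  next();
});

app.get("/", (_req, res) => res.send("Hello from an HTTPS-only service."));

app.listen(3000);
```

Many teams configure this at the load balancer, CDN, or web server instead of in application code; the effect for users is the same.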
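
For the link-rot check, a purpose-built crawler is more thorough, but even a short script can flag obviously broken links. A rough sketch using the fetch API built into recent Node versions; the URL list stands in for links harvested from your own pages:

```ts
// Check a list of URLs and report the ones that fail or return errors.
const urls: string[] = [
  "https://example.gov/about",      // placeholder URLs; substitute links
  "https://example.gov/old-report", // extracted from your own pages
];

async function checkLink(url: string): Promise<string | null> {
  try {
    // HEAD keeps the check lightweight; some servers only accept GET,
    // so fall back to GET when HEAD is rejected.
    let res = await fetch(url, { method: "HEAD", redirect: "follow" });
    if (res.status === 405) {
      res = await fetch(url, { method: "GET", redirect: "follow" });
    }
    return res.ok ? null : `${url} -> HTTP ${res.status}`;
  } catch (err) {
    return `${url} -> ${(err as Error).message}`;
  }
}

async function main() {
  const results = await Promise.all(urls.map(checkLink));
  const broken = results.filter((r): r is string => r !== null);
  console.log(broken.length ? broken.join("\n") : "No broken links found.");
}

main();
```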

Embrace accessibility

Accessibility affects everybody; build it into every decision. Legal requirements are a critical starting point for factoring accessibility into your decision-making, but these requirements are only the beginning. Accessibility is about real people who use your services — it’s usability for the broadest possible audience regardless of how they engage with your content. Everyone who works on government websites has a role to play in making federal resources accessible and inclusive. Design generously and celebrate accessibility requirements as a set of design constraints that help us create a better product for all users.

Key considerations

The following are useful questions to ask your team as you account for accessibility requirements and features:

  • Can users navigate your site using only a keyboard?
  • Can users use a screen reader to access all page content?
  • Can users quickly understand the main points of your content?
  • Can users easily interpret content associated with graphic elements?
  • Can users easily understand and complete key tasks?
  • Are you testing your service with a broad range of users?
  • Do you know your agency accessibility team?
  • Is your site organized such that everyone can navigate it easily?
  • Are you using accessibility-testing tools?
  • Are your accessibility-testing tools providing accurate results?
  • Are you providing content in languages other than English, as appropriate for the audience?

Practical actions

The following are practical actions you can take:

  • Humanize accessibility. Seek out examples of the real-life impact of accessible products and services. Try to make accessibility less abstract and more personal.
  • Use agency resources. Reach out to your agency’s accessibility team, and build a relationship with them.
  • Learn about assistive technology. Visit the Web Accessibility Initiative website to get familiar with the basic ways people use assistive technology and how people with disabilities use the web.
  • Follow existing standards. Conform to the Revised 508 Standards and W3C WCAG 2.0.
  • Work from existing resources. Consult Section508.gov, Accessibility for Teams, and the 18F Accessibility Guide.
  • Design generously. Adopt an inclusive-design mentality as described on the Inclusive Design Principles website.
  • Develop accessible code. Ensure front-end code is written accessibly, and conduct both manual and automated testing (an automated-check sketch follows this list).
  • Write accessible content. Ensure content is written in plain language and headings, images, and links are accurately labeled.
  • Build accessible designs. Ensure designs are accessible, pages are laid out in a logical order, and content meets color-contrast requirements (a contrast-ratio sketch follows this list).
  • Test broadly. Test with a broad range of users and abilities throughout the design and development process, including manual accessibility testing against the Trusted Tester and ICT Testing Baseline.
  • Be responsive. Remediate accessibility issues when you discover them.
  • Contract for accessibility. Use the Accessibility Requirements Tool (ART) to incorporate accessibility requirements into your contracts.
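
Automated scans catch only a subset of accessibility problems, so they complement rather than replace the manual testing described above. One common approach is to run the axe engine against rendered markup in unit tests; a sketch assuming a Jest environment with the jest-axe package installed, with placeholder markup standing in for your own component output:

```ts
import { axe, toHaveNoViolations } from "jest-axe";

expect.extend(toHaveNoViolations);

test("signup form has no detectable accessibility violations", async () => {
  // Placeholder markup; in a real test, render your own component here.
  document.body.innerHTML = `
    <main>
      <h1>Sign up for updates</h1>
      <form>
        <label for="email">Email address</label>
        <input id="email" name="email" type="email" />
        <button type="submit">Subscribe</button>
      </form>
    </main>
  `;

  const results = await axe(document.body);
  expect(results).toHaveNoViolations();
});
```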
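
Color contrast is one requirement that can be checked numerically. WCAG 2.0 defines the contrast ratio as (L1 + 0.05) / (L2 + 0.05), where L1 and L2 are the relative luminances of the lighter and darker colors. A small sketch of that calculation; the example color pair is purely illustrative:

```ts
// Convert an sRGB channel (0-255) to its linearized value per WCAG 2.0.
function linearize(channel: number): number {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance of an RGB color.
function luminance([r, g, b]: [number, number, number]): number {
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
}

// Contrast ratio between two colors, from 1:1 up to 21:1.
function contrastRatio(a: [number, number, number], b: [number, number, number]): number {
  const [hi, lo] = [luminance(a), luminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

// Example: dark gray text (#333333) on a white background.
const ratio = contrastRatio([51, 51, 51], [255, 255, 255]);
console.log(ratio.toFixed(2)); // ~12.6; WCAG AA requires at least 4.5:1 for body text
```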

Promote continuity

Minimize disruption, and provide a consistent experience throughout services; over time; and across agencies, platforms, and devices. Consistency is not necessarily conformity. Agencies, sites, and services may have different audiences, missions, and goals — and the way we implement our solutions may differ — but we promote continuity by starting from shared solutions and values. These design principles are one set of shared values, and the design language of the U.S. Web Design System is another. Strive to build user-centered solutions that address the whole experience: not only a user’s specific task, but the context of their journey.

Key considerations

The following are useful questions to ask your team as you work toward a consistent user experience:

  • Do you know if your audience understands that your product is a government site or service?
  • Do you know if your audience understands the purpose of each page or section?
  • Is it always clear what users are expected to do next?
  • Does your agency have established style guidance?
  • Have you tried and tested shared solutions before developing your own?
  • Have you considered your service in the context of customer or user journeys?
  • Have you identified your highest-impact customer or user journeys? Within these journeys, have you identified specific opportunities to collect feedback?
  • Have you considered your service in the broader context of a service ecosystem?
  • Can you reach across agencies and silos to collaborate and share solutions?
  • Does your site or service have a consistent experience on any device or browser?
  • Do users have equivalent access to your information and services on any device?
  • What factors outside the scope of your product or service affect its success?
  • What other government products or services are related to the success of your product or service?
  • Are you able to coordinate solutions with other projects that share a similar audience?

Practical actions

The following are practical actions you can take:

  • Identify as a government site. Clearly and consistently identify as a government site on every page.
  • Use a style guide. Use a simple and flexible style guide for content and style throughout a product or service. Check whether guides already exist in your agency before developing something new.
  • Connect related services with a similar style. Use the style guide consistently for related digital services.
  • Support a wide range of devices and platforms. Support a wide range of devices for a mobile-friendly experience.
  • Test on real devices. Test your site on actual mobile devices as often as possible.
  • Move or remove content with care. Provide proper notice and forwarding when content is moved or removed (a redirect sketch follows this list).
  • Clarify multi-step processes. Give users clear information about where they are in each step of a process.
  • Support multi-session processes. Provide users with a way to exit a process and return later to complete it (a draft-saving sketch follows this list).
  • Support re-use of saved data. Ensure that repeat visitors who have logged in can auto-populate forms with saved information.
  • Find a community. Participate in cross-government communities of practice.
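
When content moves, the standard web mechanism for forwarding is a permanent (301) redirect from the old URL to the new one, backed by a helpful 404 page for content that is removed outright. A sketch assuming an Express-based site; the paths and messages are illustrative:

```ts
import express from "express";

const app = express();

// Map retired URLs to their replacements (illustrative paths).
const movedContent: Record<string, string> = {
  "/forms/old-application": "/apply",
  "/press/2019-archive": "/newsroom/archive/2019",
};

app.use((req, res, next) => {
  const destination = movedContent[req.path];
  if (destination) {
    // 301 tells browsers and search engines the move is permanent.
    return res.redirect(301, destination);
  }
  next();
});

// A clear 404 page helps users recover when content is removed outright.
app.use((_req, res) => {
  res.status(404).send("This page has been removed. Visit the sitemap to find current content.");
});

app.listen(3000);
```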
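
For exit-and-return support, one lightweight option in the browser is to save in-progress form data locally so a visitor can pick up where they left off. A sketch using the standard Web Storage API; the storage key and field handling are placeholders, and anything sensitive belongs in server-side storage tied to the user’s account instead:

```ts
// Key under which the draft is stored; a placeholder name.
const DRAFT_KEY = "benefits-application-draft";

interface Draft {
  savedAt: string;
  fields: Record<string, string>;
}

// Save the current values of a form so the user can return later.
function saveDraft(form: HTMLFormElement): void {
  const fields: Record<string, string> = {};
  new FormData(form).forEach((value, name) => {
    if (typeof value === "string") fields[name] = value;
  });
  const draft: Draft = { savedAt: new Date().toISOString(), fields };
  localStorage.setItem(DRAFT_KEY, JSON.stringify(draft));
}

// Restore a previously saved draft into the form, if one exists.
function restoreDraft(form: HTMLFormElement): void {
  const raw = localStorage.getItem(DRAFT_KEY);
  if (!raw) return;
  const draft: Draft = JSON.parse(raw);
  for (const [name, value] of Object.entries(draft.fields)) {
    const field = form.elements.namedItem(name);
    if (field instanceof HTMLInputElement || field instanceof HTMLTextAreaElement) {
      field.value = value;
    }
  }
}
```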

Listen

Evaluate and improve your product by listening to your audience and learning from what you hear. Continuous feedback drives continuous improvement. Measure customer experience — how well what you’ve built is working for your audience — at every stage of a project and as projects grow and mature. Listen to what people say and observe how they interact with your products or services through direct observation or through analytics data.

Key considerations

The following are useful questions to ask your team to ensure you’re listening to your users:

  • Does your product or service have access to people with design, development, and research skills?
  • What are the key metrics your service uses to measure success?
  • How are your success metrics tied to positive customer or user outcomes?
  • How have these metrics performed over the life of the service?
  • Do you have system monitoring tools and processes in place to identify and respond to issues?
  • Which tools are in place to measure user behavior, and how do you use them?
  • Do you measure customer satisfaction and take steps to improve satisfaction?
  • Do you assess your customer experience maturity and develop action plans to identify focus areas for improvement?
  • How are you collecting user feedback for bugs and other product issues?
  • Do all members of the project team participate in user interviews and research activities?
  • Do you cultivate direct community participation in your project with activities like hackathons?
  • How often are you reviewing and addressing feedback and analytics?
  • Do you contribute feedback to services your project uses?

Practical actions

The following are practical actions you can take:

  • Actively collect issues. Offer users a mechanism to report bugs and issues, and be responsive to these reports.
  • Collect direct feedback. Actively collect, review, and address feedback about your product or service (such as through surveys or customer email).
  • Analyze analytics data. Implement both the governmentwide Digital Analytics Program (DAP) and agency-specific analytics services and analyze the data.
  • Analyze search results. Include a search function on your site (through Search.gov or another tool), and analyze the search data (a query-analysis sketch follows this list).
  • Analyze social media data. If you use social media platforms, analyze the data from these platforms.
  • Publish metrics. Publish metrics internally and externally.
  • Coordinate large projects with service design. Conduct a service design analysis when designing, coordinating, or consolidating large sites or services.
  • Involve the team in research. Involve all members of a project team in user interviews and research activities to hear directly from real users.
  • Use direct observation. Use direct observation in your research whenever possible to understand the context of a user’s actions.
  • Keep testing. Test and re-test with real users.
  • Share back. Contribute feedback and share solutions back to the internal and open source projects you use.
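
Search terms are one of the most direct signals of what people expect to find. A small sketch that tallies common queries and flags ones that returned no results; the record shape is hypothetical and would come from whatever export your search or analytics tool provides:

```ts
// Hypothetical shape of an exported search log record.
interface SearchRecord {
  query: string;
  resultCount: number;
}

function summarizeSearches(records: SearchRecord[]) {
  const counts = new Map<string, number>();
  const noResults = new Map<string, number>();

  for (const { query, resultCount } of records) {
    const term = query.trim().toLowerCase();
    counts.set(term, (counts.get(term) ?? 0) + 1);
    if (resultCount === 0) {
      noResults.set(term, (noResults.get(term) ?? 0) + 1);
    }
  }

  const topTerms = [...counts.entries()].sort((a, b) => b[1] - a[1]).slice(0, 10);
  return { topTerms, noResults: [...noResults.entries()] };
}

// Example with made-up data: frequent zero-result queries often point to
// missing content or to agency terminology that users don't share.
const summary = summarizeSearches([
  { query: "passport renewal", resultCount: 42 },
  { query: "renew passport", resultCount: 42 },
  { query: "form ds-82", resultCount: 0 },
]);
console.log(summary.topTerms, summary.noResults);
```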
