Testing for accessibility
Testing basics
Accessibility testing is the process of evaluating digital content, design, and functionality. This testing helps ensure people living with disabilities can use the digital products.
Hennepin County follows web accessibility laws and best practices including:
- Web Content Accessibility Guidelines (WCAG) version 2.1, by meeting success criteria for levels A and AA, and, where possible and relevant, level AAA
- Section 508 of the Rehabilitation Act
- The county has committed to digital accessibility in its digital accessibility policy (DOCX, 1MB).
Types of accessibility testing
There are 3 main testing methods:
- Automated testing: quick scan of a digital product for accessibility issues
- Manual testing: thorough review of a digital product using specific tools and methods
- Testing with users: authentic experience and perspective by users with disabilities
You should do both automated and manual testing. Whenever possible, include user testing.
Do testing in a certain order
- Automated testing: This helps find common accessibility issues. You can fix these issues first. That makes your manual testing more efficient. But automated testing has limitations. It can give you incorrect answers such as false assurances and false errors. Automated testing also doesn’t test the quality of the text or how understandable it is.
- Manual testing: This catches issues automated testing cannot or did not find. This is a more thorough test because you use the same tools that people with disabilities use.
- Testing with users: This identifies accessibility issues in a more authentic way. It will catch issues that neither automated nor manual testing can find. Nothing can replace the input of a person who uses assistive technology in their daily life.
Testing tools
These tools help ensure your digital product meets accessibility guidelines. Many of these tools are also used by people who live with disabilities in their daily life.
Automated testing
For checking code on a web page
Testing code for accessibility is an important first step to ensure your digital product can be used by everyone. This kind of testing analyzes code on a web page and generates a report of existing issues and how to fix them.
How to do automated testing
These tools are endorsed for county employees to use. They check for the same issues, but the level of development experience needed varies. If you need help using these tools, please refer to their user guides, instructions, and FAQs.
These tools are ordered by least to most development knowledge required:
- WAVE: quickly evaluate a single page on the WAVE website or use a WAVE browser extension
- Browserstack: a web-based application to test web pages in multiple browsers including on mobile devices
- ANDI (Accessible Name and Description Inspector): a federal government tool to test single pages against Section 508 standards
- Lighthouse Chrome DevTool: a built-in testing tool for Google Chrome
- axe Accessibility Linter: an extension for the Visual Studio Code editor that checks code for accessibility issues as it's being written
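To get a feel for what these scanners report, here is a made-up snippet with several machine-detectable issues: a missing page language, an image without alt text, an unlabeled form field, and an empty link. Any of the tools above should flag these; the file name and text are for illustration only.

```html
<!-- issues-demo.html: a deliberately flawed page for automated scanning (illustration only) -->
<!DOCTYPE html>
<html> <!-- missing lang attribute: flagged as a page language error -->
  <head>
    <title>Park permits | Hennepin County</title>
  </head>
  <body>
    <main>
      <h1>Park permits</h1>
      <img src="lake.jpg"> <!-- missing alt attribute: flagged as an image error -->
      <form>
        <input type="text" id="first-name"> <!-- no associated <label>: flagged as a form label error -->
      </form>
      <a href="permits.html"></a> <!-- empty link: flagged as missing descriptive text -->
    </main>
  </body>
</html>
```

Note that a scanner can confirm an alt attribute exists, but not whether its wording accurately describes the image. That check is manual.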
For Word, PowerPoint, and PDF files
Microsoft Office and Adobe have accessibility checker tools to evaluate files for accessibility. Use the checkers to identify issues and for tips on how to fix them.
How to use Microsoft accessibility checker (HC Connect)
How to use Adobe Acrobat accessibility checker (Adobe)
Keyboard testing
For users who navigate with a keyboard
Using a mouse can be hard or impossible for people living with mobility disabilities. People who use screen readers also rely on keyboard-only navigation. A digital product only navigable with a mouse excludes people who only use a keyboard.
How to test keyboard navigation
Use keyboard controls:
- Tab moves the keyboard focus forward.
- Shift + Tab moves the keyboard focus backwards.
- Enter activates buttons and selects an option in a menu. It also opens links and expands accordions.
- Spacebar activates buttons, checks or unchecks checkboxes, and scrolls down a page. You can also use it to expand an accordion.
For more detailed keyboard controls, reference WebAIM's keyboard testing guide.
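To see why some page elements fail keyboard testing, compare a native button with a div styled to look like one. This is a sketch for illustration; the class name and click handler are placeholders.

```html
<!-- Keyboard-friendly by default: reachable with Tab, activated with Enter or Spacebar -->
<button type="button">Apply for a permit</button>

<!-- Fails keyboard testing: a plain div is skipped by Tab and ignores Enter and Spacebar.
     It would need tabindex="0", a button role, and key handling in script (not shown) -->
<div class="fake-button" onclick="applyForPermit()">Apply for a permit</div>
```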
Color contrast checker
For users who are color blind or have low vision
Poor color contrast makes it hard or impossible to read or perceive the text or image.
How to test color contrast
Color contrast is the difference in color brightness between two visual elements. An example is the difference between text and its background.
Test all visual elements for color contrast using a color contrast checker. For tool options, visit the Color contrast tools section under the Basics tab.
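Color contrast checkers compute the WCAG contrast ratio from the relative luminance (L) of the two colors being compared:

```latex
% WCAG contrast ratio, where L_1 is the relative luminance of the lighter
% color and L_2 is the relative luminance of the darker color
\[
\text{contrast ratio} = \frac{L_1 + 0.05}{L_2 + 0.05}
\]
% Example: white text (L = 1.0) on a black background (L = 0.0) gives
% (1.0 + 0.05) / (0.0 + 0.05) = 21, written as 21:1 -- the highest possible ratio
```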
Color blindness checker
For users who are color blind
People with color blindness can have difficulty perceiving some color combinations. For example, some people can’t tell the difference between red and green. Some may not see color at all.
People may not be able to understand information conveyed only through color. For example, a link that has color but no underline can cause some users to miss the link.
How to test color perception
Use color blindness checkers to simulate how colors look to people who are color blind.
Use free tools such as the Chrome extension Colorblindly or Who Can Use.
Use a color contrast checker to correct text or images that look hard to perceive.
Readability checker
For users with reading or cognitive disabilities
People with reading or cognitive disabilities can find reading a major challenge. Use language everyone can understand and that avoids confusion.
After you write your content, you can put your text into a readability checker. The readability checker will give your text a grade. It will also flag issues such as hard-to-understand sentences and word choice.
How to test the readability of your content
Hemingway Editor is a useful readability checker that's free. You don't need an account or to download it.
Enter your text into the Hemingway Editor. You will see color-coded readability issues you can address within the website. Afterward, you can copy and paste the corrected text into your original document.
Screen readers
For people with visual impairment, or mobility or cognitive disabilities
A screen reader is software that reads the text of a screen out loud. Testing with a screen reader identifies accessibility issues. Fixing these issues will help ensure everyone can access your digital content.
How to test with screen readers
Three common screen readers are NVDA, VoiceOver, and TalkBack. Each screen reader has different controls. Review their user guides to understand the controls.
Most screen reader users navigate content using heading levels and landmarks. So include heading levels and landmarks in your testing.
NVDA screen reader
NVDA is a free, open source screen reader that’s becoming more popular. It performs best in Windows and the Firefox browser.
NVDA overview and support (must have network access)
VoiceOver and TalkBack
VoiceOver is available on Mac and iOS devices. TalkBack is available for Android. They are both free.
Screen magnifier software
For people living with sight disabilities
People use screen magnifiers to make content display larger so they can read it. The screen magnifier enlarges small text and images.
How to test with screen magnifiers
For PCs: use Microsoft’s built-in Magnifier
For Mac: use Apple’s built-in Zoom
Speech recognition software
For people with mobility disabilities
People use speech recognition software instead of, or alongside, a keyboard and mouse. The user speaks commands into their device. This allows them to navigate and to input text with verbal commands instead of typing.
How to test with speech recognition software
For PCs: Dragon NaturallySpeaking overview and support
For iOS: Voice Control for iOS
For Mac: Voice Control for Mac
Automated testing
When to use automated testing tools
Automated testing tools scan a single web page or check code as it's being written to find accessibility issues. The scan provides an overview of accessibility issues to be fixed. Scans often include strategies on how to fix issues.
Do automated testing before manual testing to get an overview of issues or missing features. Always follow automated testing with manual testing.
Inaccurate test results
Automated testing has its issues. It can sometimes incorrectly label an element as either passing or failing the scan.
Examples of incorrect scan results
- Bad alt text: Alt text exists but the quality and accuracy of the alt text is not checked. For example, a picture of an apple with the alt text "an orange" would pass an automated test.
- Unclear form labels and instructions: Form labels exist but do not clearly explain the purpose of the field. For example, a form field asks for a person's first name, but the screen reader reads the field as "text input field."
- No test for keyboard navigation: Automated tools don't test keyboard navigation, so issues such as incorrect reading order, missing landmarks, and skipped content go undetected. For example, incorrect reading order could be the focus indicator moving from the header to the main content to the footer while skipping over any additional content or navigation landmarks.
- Wrong language attribute used: Passes because a language exists but it’s the wrong language. For example, the test sees that the Spanish language is being used when it should be English.
Examples of consistent and accurate results
There are specific items that will get correct test results:
- Images and graphics: Identifies missing image alt text
- Alternative text: identifies repetitive or redundant alt text and anchors
- Form labels: identifies missing or repeated labelling on form elements
- Empty links: identifies links missing descriptive information
- List structure: identifies list items (<li>) not contained within unordered list (<ul>) or ordered list (<ol>) parent elements
- Page language: identifies the default language of the page
- HTML code error: identifies any errors in HTML parsing
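For example, the list-structure check compares markup like this (illustration only):

```html
<!-- Passes: list items are wrapped in a <ul> parent -->
<ul>
  <li>Automated testing</li>
  <li>Manual testing</li>
</ul>

<!-- Fails: orphaned <li> elements with no <ul> or <ol> parent -->
<li>Automated testing</li>
<li>Manual testing</li>
```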
Benefits and limitations of automated testing
Benefits
- Faster than other methods of testing
- Inexpensive or free
- Can be done with fewer people
- Can be used without a lot of accessibility or development knowledge
Limitations
- Misses crucial accessibility errors, such as alt text tags that aren’t accurate or descriptive
- May find nonexistent errors or create a false sense of security and confidence in a digital product
- Excludes the perspectives and lived experiences of people with disabilities
Accessibility overlays and artificial intelligence
- Hennepin County websites do not use accessibility overlays. These are automated tools that modify a website’s code in an attempt to make the website more accessible.
- Overlays don’t allow people with disabilities to use their assistive technology. This makes it difficult or impossible for users who are disabled to use the site.
- Don’t rely upon artificial intelligence (AI) for accessibility testing. For example, AI-generated alt text often inaccurately describes the image.
Manual testing
This checklist will help ensure you meet the Hennepin County digital accessibility policy. These standards are from WCAG levels A and AA, and are combined with the county web standards. The checklist organizes each standard according to the type of testing needed.
- Read each accordion title to identify the standard.
- Use the information in the accordion to learn about each standard and how to test it.
- Use the checkboxes to check off each standard once tested.
Test keyboard navigation
How to test the focus indicator
Press the "Tab" key on your keyboard to navigate to each element on the page.
Look for the focus indicator on each element of the page that you can also click with a mouse. An element has keyboard focus when it has a unique visual display like a border or a highlight.
Why we test with a keyboard
Many people living with mobility disabilities use the "Tab" key to navigate a page. Without a focus indicator, keyboard users won’t know where they are on a page. It would be like trying to navigate with a mouse but the mouse pointer is invisible.
Example of a visible focus indicator
This image shows a list of links. The "Residents" link has a thick orange border, bold text, and darker background. These three features put the focus on the "Residents" link. When the user presses the "Tab" key, the focus indicator will move to the "Business" link.
How to test for keyboard traps
- Press "Tab" to navigate and interact with a page. Use the spacebar and enter keys to select and activate page elements.
- Look for a seamless experience with keyboard navigation. Users must be able to move forward and backward without any issue. If you can’t move forward or backward, you’ve found a keyboard trap. A keyboard trap prevents a user from using a page.
- Be sure to test third-party widgets, pop-ups (also called modals), videos, and calendars. They often have keyboard traps.
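One way to avoid a common modal trap is the native <dialog> element: opened with showModal(), it closes with the Escape key, and a close button can be provided with a dialog form. A minimal sketch; the ids and text are made up for illustration.

```html
<button type="button" onclick="document.getElementById('help-dialog').showModal()">
  Open help
</button>

<dialog id="help-dialog">
  <p>Need help with this form? Contact us before you submit.</p>
  <!-- A form with method="dialog" gives keyboard users a button that closes the
       dialog. Escape also closes a dialog opened with showModal() -->
  <form method="dialog">
    <button>Close</button>
  </form>
</dialog>
```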
Why we test for keyboard traps
Keyboard traps force people who rely on keyboard navigation to close or refresh the page. It causes them to lose any progress. People with sight and mobility disabilities often rely on keyboard navigation.
Example of a keyboard trap
A user triggers a modal to open while filling out a form. The user can’t exit or close the modal using either the escape key or a "close" button. The user has become trapped in the modal. They now must refresh the page to exit and will lose all their progress filling out the form.
Test for color contrast and color blindness
How to test for color contrast
Test colors with a color contrast checker. It will tell you if the color of your text contrasts enough with the background color.
Requirements for color contrast ratios:
- Normal text must be at least 4.5:1. An ideal value is 7:1.
- Large text must be at least 3:1. An ideal value is 4.5:1.
Visit the Color contrast tools section under the Basics tab.
Why we test color contrast
Text and informative graphics may be hard to read without enough color contrast. Color contrast especially affects people with sight disabilities.
Example of a color contrast test
The button on the left has text color that stands out against the dark blue background.
It has the ideal contrast ratio of above 7:1. The higher the contrast ratio, the easier it will be for users to read.
The button on the right fails color contrast. It does not meet the minimum contrast ratio of 4.5:1. The text color is hard to see against the dark blue background.
How to check content for enough visual differentiation
You can simulate the experience of a user who is color blind. Use the Chrome extension Colorblindly.
Why we test using color blindness simulators
Testing with color blindness simulators shows the colors you need to change. It also shows the identifiers you need to add. People who are color blind or have low vision can then view and differentiate your content.
Example of a color blindness test
This example shows how users would perceive an error message in a form field.
The image on the left is how a user who is not color blind would see it.
The image on the right is how a user with red-green color blindness would see the same error message. A person with red-green color blindness may not be able to perceive the color red. They would miss the meaning of the alert if the system only uses a red color to convey the alert.
The "X" icon alerts color blind users of an error.
This example uses design along with color to convey meaning. The combination of color and design helps all users perceive the error message.
Test readability
How to test for 8th grade reading level
- Run your text through a readability checker, like the Hemingway Editor.
- Follow the tips provided by the readability checker when possible. Use simple alternatives for complex words, limit adverbs, and shorten lengthy sentences.
Why we test for an 8th grade reading level
Complex text can be hard to read and understand. This is especially true for people with cognitive disabilities and English language learners. Writing at this reading level helps make the content accessible for all audiences. This is a county standard and WCAG AAA standard.
Example of a readability test
The Hemingway Editor color codes readability issues. It also gives the text a grade level. You can fix readability issues in the editor.
If possible, avoid using acronyms and jargon. Most of the county's audience doesn't know what they stand for. Spell out the full name and include a word like "program" or "county" for context. For example, write out Community Development Block Grant program instead of CDBG.
For more information about acronyms and jargon, visit the writing guide.
How to test for link quality
- Read all link text.
- If the link is vague or unclear, rewrite the link text.
- For more information about writing links, visit the Digital Writing section.
Why we test for link quality
People who use screen readers can view a list of links on the page. Without descriptive link text, it will be hard for them to know where the links lead to.
Example of link text
Bad example: Click here to learn more.
Good example: For more information about writing links, visit the Digital Writing section.
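In markup, the same examples look like this (the URL is a placeholder):

```html
<!-- Bad: the link text gives no destination on its own -->
<p><a href="/writing-links">Click here</a> to learn more.</p>

<!-- Good: the link text describes where the link leads -->
<p>For more information about writing links, visit the
  <a href="/writing-links">Digital Writing section</a>.</p>
```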
How to test for button label quality
- Read all button text.
- If the button text is unclear or long, rewrite the text to describe the action of the button when used. For example, a button to submit a form should include the word "submit."
Why we test for button label quality
Vague or unclear button text will lead people to confuse the purpose of the button. This could cause an unintended action.
Example of button labels
Label a button that submits a form as "Submit" or "Submit form."
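In code, the difference is only the button's visible text (illustration only):

```html
<!-- Unclear: the purpose of the button is ambiguous -->
<button type="submit">Go</button>

<!-- Clear: the label describes the action -->
<button type="submit">Submit form</button>
```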
Test screen reader experience
Screen readers differ slightly in how they function. Read the user guide for the screen reader you are using.
How to test heading levels
- Navigate to each heading on the web page. Listen for every heading level included in the content. In general, screen reader controls use a hot key. For example, NVDA uses the 'H' key.
- Listen for the correct heading level hierarchy. For example, you should hear heading levels increase in order, such as from heading level 1 to heading level 4. Hennepin County's best practice is to have only one heading level 1 per page. Make sure the headings that follow are in logical order.
Why we test heading levels
Screen readers communicate heading levels to users to help them navigate content. This helps users who are visually impaired understand the structure of the content.
Example of a screen reader navigating headings
Video showing how a screen reader user navigates headings (YouTube)
How to test alt text
- Navigate to every image on the page you’re testing.
- Use the screen reader controls to listen to the alt text. Check that alt text is present and gives an accurate description of the image and its context.
- Check that the screen reader skips decorative images (images with empty alt text).
- For more information about alt text, visit the Writing for accessibility section.
Why we test alt text
People who can’t see images rely on alt text to understand and learn from the image. If the screen reader doesn’t detect alt text, these users will miss important information.
Example of a screen reader announcing alt text
Video showing how a screen reader announces alt text on a website (YouTube)
How to test landmarks
On NVDA, use the "D" hot key to move between landmarks.
Examples of expected landmarks
- Header or banner: the first section of the page with the website title
- Navigation: list of links for navigating the website
- Main: the primary content of the page
- Footer: the last section of the page with information and links about the website
Think of landmark regions like parts of the human body: the head, or header in this case, should be first. The feet, or footer, should be last. The main landmark is the torso, the main body in between. The landmarks need to be in logical order. For example, think of the song "Head, Shoulders, Knees, and Toes."
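As an illustration, a page built with HTML landmark elements in that head-to-toe order might look like this; the link text and URLs are placeholders based on the examples above.

```html
<header>Hennepin County</header>          <!-- banner: website title -->
<nav>                                      <!-- navigation: list of site links -->
  <a href="/residents">Residents</a>
  <a href="/business">Business</a>
</nav>
<main>                                     <!-- main: primary content of the page -->
  <h1>Page topic</h1>
  <p>Primary content of the page.</p>
</main>
<footer>                                   <!-- footer: closing information and links -->
  <a href="/social-media">Social media</a>
</footer>
```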
For more information about landmarks, visit the Developing for accessibility guide.
Why we test landmarks
Landmarks break down a page into expected sections. This helps users create a mental map of that page to navigate it successfully.
Example of landmarks
A person wants to find the social media link that usually exists at the end of a page. They would use the "D" hot key to navigate quickly to the footer landmark where they can get to the social media link.
This helps them skip over content and landmarks they don’t need.
Test screen magnification
How to test screen magnification
Use a screen magnifier tool to check that text isn't blurry when zoomed to 200%.
If images become blurry when resized, use images in SVG format. If the layout or spacing cuts off content when resized, use responsive design. Responsive design allows content to reflow when spacing changes without cutting off text or losing content.
Another way to check responsive design is by using Browserstack or the "Toggle device toolbar" in DevTools. Both tools mimic the perspective of various devices.
Why we test screen magnification
People with certain sight disabilities may need to adjust the content on a page to make it easier to see. If the content and layout is not responsive, then content may not be readable and may be lost. Navigation will also become more difficult for those users.
Example of screen magnification
The following two images show Hennepin County's navigation bar. The first image is what the text looks like with default text spacing and size, and the second image has increased spacing between characters. Both images include the same text, so the navigation bar is compatible with text spacing editors.
Test speech recognition
How to test speech recognition
- Don’t pause while saying a command. Pause after each command is spoken.
- Navigate through your page. For example, you may say the command "Tab" to move from a text input field to the submit button.
- Test all buttons and links to make sure they can be selected and activated using only the speech recognition tool. For example, saying the command "Click submit button."
- Test mouse control using spoken commands. For example, saying "Mouse grid" and "Move mouse up."
Why we test speech recognition
Being able to navigate and use a page using only speech commands is critical for people with mobility disabilities. They rely on speech commands to successfully use digital products.
Example of speech recognition
A person with carpal tunnel syndrome navigates to the Hennepin County homepage using spoken commands. They say the command "Mouse grid" to put a grid overlay on the page with 9 numbered squares. The user can then say a number from 1 through 9 to create a deeper grid of 9 squares in that space. This helps the user get to a specific link they want to visit.
Video of a user activating the mouse grid commands with a speech recognition tool (YouTube)
Test videos, images, and animations
How to test controls
- When you open your website or digital product, look and listen for videos or animations that automatically move for more than 5 seconds.
- Check that there is a way to pause, stop, or hide these videos or animations.
Why we test controls
Moving animations and videos can be distracting, especially for people living with cognitive disabilities like ADHD. It can draw their attention away from everything else on the page.
Example of controls
If your website has an image carousel that continuously rotates through photos, include a pause button and arrows to move through the photos manually.
How to test accurate transcripts or captions
- Review all pre-recorded content for available transcripts or captions. Video-only and audio-only content requires a text transcript. Video with audio requires captions.
- Captions and transcripts must include all dialogue and important sounds.
- Make sure to check for accuracy. Automated captioning can have many errors.
- Check the captions and transcripts for spelling or grammar mistakes.
- Transcripts are either beside the pre-recorded content or are included through a link.
- To increase accessibility for videos with audio, you may include a transcript.
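If a video is published with your own player instead of YouTube, captions can be attached as a caption track and the transcript linked nearby. A minimal sketch with placeholder file names:

```html
<video controls>
  <source src="water-safety.mp4" type="video/mp4">
  <!-- WebVTT caption file containing all dialogue and important sounds -->
  <track kind="captions" src="water-safety-captions.vtt" srclang="en" label="English">
</video>
<p><a href="water-safety-transcript.html">Read the transcript for this video</a></p>
```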
Why we test accurate transcripts or captions
Captions and transcripts provide a visual alternative to auditory information. This is especially important for people with hearing disabilities.
Example of accurate transcripts or captions
This is an example of a properly captioned video. Select the "CC" button on the video to view captions. Select the "Show transcript" button in the video description to view the transcript.
"We Are DHHSD" from the Minnesota Department of Human Services (YouTube)How to test synchronized captions
- Watch the video and listen to the audio.
- Closed captions must match all spoken words and important sounds heard in the audio.
- Each caption frame must align with the audio track.
- Captions are presented at a readable speed of 3 to 7 seconds on the screen. While time duration isn’t a WCAG standard, it is an FCC requirement for closed captions.
- Videos must include captions from start to end.
- Captions don’t block other important visual content.
Why we test synchronized captions
Synchronized captions help people who are deaf or hard of hearing understand the audio content of a video at a reasonable pace.
Example of synchronized captions
This is an example of properly synchronized video captions. Select the "CC" button on the video to view captions. Select the "Show transcript" button in the video description to view the transcript.
"We Are DHHSD" from the Minnesota Department of Human Services (YouTube)How to test alt text
- Informative images must include meaningful alt text. For more information about crafting meaningful alt text, visit the Writing for accessibility guide.
- Use an empty alt tag (alt="") for decorative images. For example, decorative lines that divide content.
- Complex images, such as charts or maps, must have two types of alt text: a short description and a long description. The short description describes the image of the chart or graph. The long description explains or summarizes the data and findings.
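In markup, those three cases might look like this. The file names and descriptions are placeholders; the informative alt text reuses the example that follows this section.

```html
<!-- Informative image: meaningful alt text -->
<img src="street.jpg"
     alt="A person wearing a white shirt and blue blazer stands on a busy city street">

<!-- Decorative image: empty alt so screen readers skip it -->
<img src="divider-line.png" alt="">

<!-- Complex image: short description in the alt text, long description written nearby -->
<img src="budget-chart.png" alt="Bar chart of county spending by department">
<p>Summary: a long description explaining the data and findings goes here.</p>
```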
Why we test alt text
People with sight disabilities rely on alt text to understand visual information. Bad or missing alt text prevents people with sight disabilities from understanding content.
Example of alt text
The following image might include the alt text: "A person wearing a white shirt and blue blazer stands on a busy city street."
Test code accessibility
How to test page titles
- Page titles are displayed on browser tabs. Make sure a page title exists and is specific to that page.
- To find the <title> element, go to the <head> section of the HTML file.
- An example format for writing page titles at Hennepin County is "Residents | Hennepin County."
Why we test page titles
Page titles serve as a navigation point. Screen readers use page titles to let users know what page on the website they are accessing.
Example of a page title
On the Hennepin County Contact page, the page title on the browser tab is "Contact | Hennepin County." The corresponding code is <title>Contact | Hennepin County</title> in the <head> section. Visit the Hennepin County Contact page.
How to test heading levels
- Headings are used to organize and structure the content. For more information about headings, visit the Navigation section in Developing for accessibility.
- Each web page only needs one <h1>. The <h1> should be the main topic of the page.
- The next heading must use an <h2> tag.
- The content hierarchy for the rest of the page should use heading levels <h2> through <h6>.
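For instance, the heading structure of this guide could be coded like this (indentation added only to show the nesting):

```html
<h1>Testing for accessibility</h1>        <!-- one h1: the main topic of the page -->
  <h2>Types of accessibility testing</h2>
    <h3>Automated testing</h3>
    <h3>Manual testing</h3>
    <h3>Testing with users</h3>
  <h2>Testing tools</h2>
```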
Why we test heading levels
People using assistive technology, such as screen reader and speech recognition tools, rely on properly coded headings to navigate and comprehend content.
Example of heading levels
For an example of heading level page hierarchy, visit W3C’s Headings that reflect the page organization.
How to test keyboard shortcuts
If your digital product uses keyboard shortcuts, follow these 3 requirements:
- Make sure it can be turned off.
- Users can change the shortcut to include other keyboard keys. For example, the user can choose to also use the ctrl key.
- The keyboard shortcut is only activated when its component has focus.
Why we test keyboard shortcuts
Speech input users and keyboard users need to be able to customize keyboard shortcuts. This helps prevent accidental command actions. A speech input user may try to spell out a word but accidentally activate a letter "m" keyboard shortcut. A user with a mobility disability may hit a wrong key and trigger a keyboard shortcut. The ability to customize keyboard shortcuts gives users greater control over their navigation experience.
Example of keyboard shortcut
To see an example of a customized keyboard shortcut, visit the Deque University character keyboard shortcuts.
How to test input assistance
- Instructions are clear and understandable.
- Instructions are easily found by assistive devices.
- When an error occurs, all users know where it is.
- The user knows what the error is and how it can be fixed.
- The error message may include additional indicators but must have text.
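A minimal sketch of one form field with accessible instructions and a text error message; the ids and wording are made up for illustration.

```html
<label for="first-name">First name</label>
<!-- aria-describedby ties the instruction and error text to the field,
     so screen readers announce them along with the label -->
<input type="text" id="first-name"
       aria-describedby="first-name-hint first-name-error" aria-invalid="true">
<p id="first-name-hint">Enter your legal first name.</p>
<p id="first-name-error">Error: First name is required. Enter your first name to continue.</p>
```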
Why we test input assistance
Input assistance tells users how to successfully complete a form.
People who are visually impaired are most impacted when instructions and errors are unclear, buried in content, or indicated only by color.
Example of input assistance
Video showing how a screen reader user interacts with form errors (YouTube).
How to test component predictability
- Look for components that trigger changes upon interaction. This includes form fields, selects, checkboxes, radio buttons, and any custom widgets.
- Interact with these components to see if any content changes occur.
- If changes occur, verify that there are instructions to inform people of the changes in advance. The changes should also be minor and not disrupt the user's workflow. For example, let people know ahead of time that choosing a specific option within a select will change the follow-up question.
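One way to meet this is to place the notice in text before the control, as in this sketch; the question and options are made up for illustration.

```html
<p id="topic-hint">Choosing a topic below will display follow-up questions specific to that topic.</p>
<label for="topic">What do you need help with?</label>
<select id="topic" aria-describedby="topic-hint">
  <option>Property taxes</option>
  <option>Park permits</option>
</select>
<!-- Follow-up questions appear farther down the page without moving focus or
     reloading the page (the script that reveals them is omitted here) -->
```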
Why we test component predictability
Predictability is important for people to successfully navigate and interact with a website or form. The disorientation caused by unexpected changes can make pages or forms unusable, especially for people with cognitive disabilities. Changes that do not require a page reload but update content should be expected or communicated to people beforehand. For example, let people know ahead of time that sections of a page update dynamically.
Example of component predictability
A form selection triggers additional fields. Next to a select component is text informing users that choosing certain options will display additional questions specific to that option. Because there are clear instructions informing people of the change in advance, the select component passes this test.
How to test time limits
Check if at least one of the following is true:
- Turn off: can remove time limit
- Adjust: can change time limit
- Extend: can extend time
- Exception: time limit is required and cannot be extended; user gets a warning message before session closes
Why we test time limits
Having more time to navigate, understand, and use a site benefits various accessibility needs. For example, people using screen readers require more time to understand screen layout, locate controls, and operate them. People with cognitive disabilities require more time to read, understand, and respond to content. People with physical disabilities require more time to complete tasks accurately.
Example of time limits
An online registration form gives people a chance to extend their current work session by another 15 minutes. They get 10 opportunities to extend their work session. After 10 extensions, there is a warning that the extension limit has been reached and the session will officially end after 15 minutes.
How to test pause, stop, or hide controls
- Find any content that moves, blinks, scrolls, or auto-updates. For example, look for carousels, animations, or auto-updating feeds.
- Check that there are control mechanisms for people to stop or hide the moving content. For auto-updating content, people can also control the frequency of updates unless the auto-updates are essential.
- Test the controls to ensure they are operable and meet screen reader and keyboard-only navigation requirements.
- Verify that people have enough time to read and understand content before it automatically updates.
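A sketch of two passing patterns; the file name and update text are placeholders, and the script that pauses or applies updates is omitted.

```html
<!-- Video does not autoplay and exposes the player's own play, pause, and stop controls -->
<video src="water-safety.mp4" controls preload="metadata"></video>

<!-- Auto-updating feed: announced politely by screen readers and pausable by the user -->
<section aria-live="polite" aria-label="Service updates">
  <button type="button">Pause updates</button>
  <ul>
    <li>Example update text</li>
  </ul>
</section>
```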
Why we test pause, stop, or hide controls
- Moving or auto-updating content is an issue for people with cognitive disabilities, motion sensitivity, or visual impairments.
- People who have difficulty focusing or are easily distracted may find moving and auto-updating content disruptive.
- People with reading disorders may find scrolling or blinking content difficult to read.
- Moving or flashing content can cause physical discomfort or nausea for people with motion sensitivity.
- Screen readers have difficulty reading auto-updating content.
Example of pause, stop, or hide controls
A video about water safety is embedded on a web page. The video doesn't automatically start. It has clear and easily navigable control buttons to play, pause, stop, or hide the video.
Usability testing
Why we test with users
Testing with users living with disabilities provides a perspective that cannot be replicated with automated tools, or even by manually self-testing. People living with disabilities are experts at living with their disabilities, so it’s important to have them test your digital product if possible.
Request for testing with users
Hennepin County contracts out to have testers who live with disabilities check our digital tools and experiences for accessibility issues. For details, contact digital accessibility coordinator Lisa Yang at 612-543-1954 or lisa.yang@hennepin.us.