Given the rapid adoption of AI and the uncertainty surrounding it, it’s become increasingly necessary for employers to take control of their tech stack and monitor their use of automated employment decision tools (AEDTs). For many, this will take the form of a recruitment technology audit. But undergoing an audit can be a daunting experience for any organization, especially for an HR team. Don’t fear: when approached correctly, it can be an invaluable learning experience that helps identify areas for improvement and ensures fairness in your organization’s practices. In this blog post, we’ll share some best practices for those wondering where to start with an AI audit.
Best Practices
Legislation such as NYC’s Local Law 144 requires employers to audit their AEDT recruitment technology at least once a year to ensure these tools are not introducing bias into hiring decisions. This is a new legal requirement for many employers, and there is little clear guidance on how to begin or which best practices to follow when working to comply with the law. To audit their recruitment technology vendors effectively, employers need to be able to identify what bias might look like within the technology and understand the right questions to ask.
We have identified best practices to help guide employers as they navigate this uncharted territory:
- First, map your talent acquisition journey and workflows: Create a visual representation of your talent acquisition process to identify where automated employment decision-making tools may be present. This will help pinpoint areas where unwanted bias could emerge, allow employers to see the data flow at each step, and enable employers to take proactive measures to address issues that may arise. Additionally, employers should identify the vendors and systems used at each stage to better assess the technologies involved.
- Next, reach out to your vendors: Once your processes are mapped, it’s time to evaluate your existing or potential recruitment vendors. Questions to consider when evaluating your tech partners:
  - Have you conducted an audit to assess and mitigate biases in your technology? If so, can you provide documentation of the audit findings?
  - How has your organization taken steps to ensure bias reduction in your technology?
  - Are there any transparency statements or public announcements available to show your commitment to mitigating bias in technology?

  If a vendor cannot provide answers to these questions, it could be a warning sign that their technology may not align with your organization’s commitment to reducing bias and maintaining compliance.
- Then, give candidates a voice: Start building trust with your applicants by offering them a way to voice concerns during the recruitment process. This can be done through the application itself or an automated survey, ensuring you stay informed about potential issues as soon as they are flagged.
- Always keep an expert in the loop: Maintain a human presence at all times to address concerns quickly and efficiently. AI is meant to enhance the human work experience, not replace it. Collaborate with your vendor to ensure a productive and compliant process.
- Lastly, conduct a demographics survey: Gathering self-reported data on your organization’s demographics and on the demographics of applicants to your open jobs can help you accurately assess the impact of technology on your workforce (see the illustrative sketch below).
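To make that last step more concrete, here is a minimal, illustrative sketch (in Python, using pandas) of how self-reported demographic data could be used to calculate selection rates and impact ratios per group, which is the kind of metric bias audits under Local Law 144 typically look at. The data, column names, and groups below are hypothetical placeholders chosen purely for illustration; they are not PandoLogic’s audit methodology or a prescribed calculation.

```python
# Illustrative sketch only: computing selection rates and impact ratios
# from hypothetical self-reported demographic survey data.
import pandas as pd

# Hypothetical applicant data: one row per applicant, with a self-reported
# demographic category and whether the tool advanced them to the next stage.
applicants = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B", "B", "B"],
    "selected": [1,   0,   1,   1,   1,   0,   0,   1,   0,   1],
})

# Selection rate per group = candidates advanced / total candidates in group.
selection_rates = applicants.groupby("group")["selected"].mean()

# Impact ratio = each group's selection rate divided by the highest group's
# selection rate (1.0 means parity; noticeably lower values warrant review).
impact_ratios = selection_rates / selection_rates.max()

print(pd.DataFrame({"selection_rate": selection_rates,
                    "impact_ratio": impact_ratios}))
```

In practice, the categories you report on and the exact calculations your audit requires will depend on the applicable law and on your auditor, but even a simple summary like this can help you spot stages of the funnel where outcomes differ sharply between groups.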
Summary
Ultimately, fairness and transparency should always be the top priority. While efficiency is essential in the workplace, it should not come at the expense of fairness, and an AI audit can be an excellent opportunity for organizations to ensure that their practices are fair, transparent, and compliant.
With that in mind, PandoLogic has proactively audited its AI technology and reviewed its AI models’ current and future-state applications to ensure, among other things, that we are helping to mitigate bias at the top of the recruitment funnel. We believe we have a responsibility to ensure that our use of AI is transparent, compliant, and consent-based. Download our explainability statement here.
Disclaimer: Information contained on PandoLogic's website and set forth in this document is for general guidance only and does not constitute legal advice or a guarantee of compliance with federal, state, or local law. The data and information contained within is for illustrative purposes only and is not intended to replace or serve as an employer’s audit of its own use of PandoLogic’s automated employment-decision tools. Please consult your organization's legal counsel to ensure compliance with relevant laws and regulations.