The Use of AI by Job Candidates: Employment Law Considerations

Part 1 of 2

As employers explore ways to use Artificial Intelligence (AI) within the bounds of existing and emerging legislation and guidance, and as government agencies, states, and municipalities seek to regulate AI in employment and other areas, the use of AI by job applicants has proceeded largely under the radar.

But an increasing number of job candidates are using AI, including to draft and review resumes, cover letters and writing samples, to complete job applications, and even to help them prepare for and participate in job interviews. This use is often undetected by employers, which can lead to problematic results.

It is critical that employers understand the ways in which candidates can use AI in the hiring process and learn how to effectively navigate potential issues that might arise without negatively impacting the pool of talented applicants or running afoul of employment laws.

How Are Job Candidates Using AI?

Job candidates are using AI in a variety of ways. A 2023 survey from a provider of online and app-based resources for job seekers found that almost half of the job seekers surveyed were already using ChatGPT to generate resumes and cover letters, and 70% of applicants reported a higher response rate from employers when using ChatGPT to create or revise application materials.

Indeed, a 2023 Harvard Business Review article theorized that “[u]sing tools like ChatGPT to help craft [a] resume may very well be the new norm in a few years’ time.”

In addition to using AI to generate and revise applications, resumes, cover letters, and other written materials, candidates are also using AI in connection with interviews.

For example, in 2023, multiple news agencies reported on a TikTok video (with over two million views) that showed how to use AI to prepare for interviews by using the tool to generate possible interview questions based on the job description. Indeed, a recent survey found that 41% of college students believe that using AI to prepare for interviews is acceptable.

More troubling, some applicants may use AI to respond to text, pre-recorded, or video interview questions. This controversial use of AI was highlighted in a 2023 TikTok video that showed a woman using an app on her phone to generate answers to questions while they were being asked during a video interview.

While some believe that the post was an advertisement for the app the woman was using and not a real-life scenario, the video shows yet another way that AI can creep into the hiring process.

Employment Law Considerations

While Congress, agencies, and state and local governments have not addressed the use of AI by applicants, existing federal employment guidance and laws on employers’ use of AI provide insights into how employers might regulate applicants’ AI use.

For example, in May 2022, the US Equal Employment Opportunity Commission (EEOC) issued “The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees” (the EEOC ADA Guidance), which contains guidance on how the Americans with Disabilities Act (ADA) could limit employers’ use of AI to screen job applicants.

Although focused on employers’ use of AI, the guidance is helpful in understanding how employers might limit an applicant’s use of AI. Among other things, the EEOC ADA Guidance notes that one of the most common ways that employers can violate the ADA is by failing to provide a “reasonable accommodation” that allows job applicants to be fairly considered.

Relatedly, the EEOC ADA Guidance explains that employers cannot use AI to “screen out” job applicants with disabilities under the ADA.

With these concerns in mind, if an employer has a general policy prohibiting AI based on a legitimate, nondiscriminatory business reason (e.g., combatting plagiarism and misrepresentations about skills and experience), then it may need to make an exception for job applicants with disabilities if the applicant can articulate why they need the AI to assist them in the application process.

If the underlying purpose of an employer’s workplace policy banning job applicants from using AI could be accomplished with alternative means (e.g., using AI tools and human screeners to detect potential AI plagiarism in applications), then employers may need to adjust the policy for applicants with disabilities.

Further, if a job applicant with disabilities uses AI to, for example, draft their resume, but is still able to perform the offered position’s essential functions, employers should be careful not to automatically screen them out because of a blanket ban on AI use.

On the whole, while employers should take some comfort from the EEOC ADA Guidance’s instruction that they never need to “lower production or performance standards or eliminate an essential job function as a reasonable accommodation” under the ADA, employers should proceed cautiously and purposefully when regulating job applicants’ use of AI, including by following the recommended practices set forth below.

In May 2023, the EEOC issued “Select Issues: Assessing Adverse Impact in Software, Algorithms, and Artificial Intelligence Used in Employment Selection Procedures Under Title VII of the Civil Rights Act of 1964,” which contained guidance on Title VII of the Civil Rights Act of 1964. Although the Guidance is focused on how employers can use AI within the bounds of Title VII, it also provides helpful insights for employers as they develop policies governing job applicants’ use of AI.

At the forefront of this guidance is the EEOC’s clear message that employers may be liable for any discriminatory use of AI-influenced “selection procedures” (i.e., any measure used to make an employment decision) in a manner that has a disparate impact on job applicants based on their protected characteristics (e.g., race, sex, religion, etc.).

In light of this, if publicly available data or studies show that job applicants with protected characteristics disproportionately use AI to apply to certain industries or occupations, then affected employers should proceed with caution when deciding how to restrict candidates’ use of AI and should not automatically reject candidates because of their use of any form of AI in the application process.

Existing and proposed legislation is also informative. For example, Illinois’ Artificial Intelligence Video Interview Act, which took effect in 2020, requires employers to disclose their use of AI in the hiring process, which suggests that employers could require candidates to do the same.

Finally, employers restricting a job applicant’s use of AI should draft their policies with the Biden administration’s “Blueprint for an AI Bill of Rights” (the AI Blueprint) in mind to mitigate potential legal exposure and reputational harm.

Because the AI Blueprint focuses on how AI can exacerbate existing biases in employment, employers seeking to restrict or prohibit the use of AI by job applicants should ensure that their rules do not inadvertently or disproportionately impact certain groups of applicants.

Next Week:
We look at recommendations for employers in addressing job applicants’ use of AI.

Have questions regarding how your organization might regulate applicants’ AI use? Contact Synergy HR for insight tailored to your hiring protocols.

Is Your Employee Handbook 2025 Compliant?

Like it or not, recent federal and state statutory changes, regulatory updates, and precedent-setting federal case law have made it necessary to update your policies, procedures, and forms.

And these required updates apply to employers of all sizes.

These revisions should have been in place by January 1, 2025.

Synergy Human Resources is available to help ensure that your policies, procedures and forms are updated and compliant for the new year.