Retailers emerging from the global pandemic face continuing supply chain disruptions, an ongoing battle for talent, and labor shortages. Meanwhile, online retail continues to expand and delivery services are growing rapidly: in March, global market researcher Technavio predicted that the global online grocery delivery services market will increase by $800 billion between 2020 and 2025.
Without a doubt, agile retailers are poised for transformational changes to their workplaces as consumer expectations and behavior shift. These changes must be navigated against an evolving regulatory backdrop.
To address the battle for talent, employers, including retailers, are turning to recruiting tools powered by artificial intelligence (AI) to help identify and recruit candidates. When developing, purchasing, or using such tools, firms need to be aware of mounting compliance risks: regulators have taken note of the proliferation of these tools and are evaluating whether their use complies with existing laws.
As an example, the EEOC recently published a “Technical Assistance” (TA) document aimed at addressing compliance with the Americans with Disabilities Act (ADA) and current agency policy.
While not binding, the TA document provides insight into how the EEOC views employers’ obligations under the ADA when using AI-powered tools. In particular, it focuses on three themes:
1. reasonable accommodation;
2. circumstances where AI decision-making tools “screen out” individuals with disabilities; and
3. circumstances where AI-powered tools may make impermissible disability-related inquiries.
In addition, just last year, the EEOC announced the launch of an initiative to address the use of AI in hiring and other employment decisions, with EEOC chair Charlotte Burrows noting, “the EEOC is keenly aware that these tools may mask and perpetuate bias or create new discriminatory barriers to jobs. We must work to ensure that these new technologies do not become a high-tech pathway to discrimination.”
It is also worth noting that interest in regulating the use of AI in employment is not unique to the EEOC. In 2021, the EU issued a proposed regulation and New York City enacted a law regulating “automated employment decision tools,” and in 2022 the California Fair Employment & Housing Council continued to evaluate proposed regulations of its own.
Given the constantly evolving landscape, it is critical that firms engage their legal departments when evaluating AI-powered tools designed to assist in making employment decisions.
Tools Addressing Customer and Employee Experiences
Retailers are increasingly turning to AI-powered tools to improve customer and employee experiences. For example, users are becoming more comfortable with voice-activated technology, and the use of voice-activated shopping apps is expected to increase. Such technology promises an interactive, tailored customer experience but comes with increased risk.
When adopting this type of technology, firms must be aware of biometric privacy laws that exist or are emerging in several U.S. jurisdictions. Illinois adopted the Biometric Information Privacy Act in 2008, and since then, businesses have faced continual waves of class-action lawsuits involving a variety of different technologies. As biometric privacy laws expand, it will remain important to understand the legal compliance environment when adopting voice-activated technologies.
AI tools also empower firms to reimagine their training regimens to provide consistent yet individualized training. This approach improves the effectiveness of training by ensuring that employees receive the right training at the right time, and by reinforcing key lessons based on individual needs. And in certain areas, such as workplace safety, it is important to be able to document attendance and demonstrate a training program’s efficacy, for example in the event of a safety citation.
Wearable technology and mobile apps are also becoming prevalent tools to assist employees in performing their jobs. Mobile apps can give employees immediate access to a product’s location or availability, and wearables can deliver information to employees more immediately. Apps and wearables are also capable of capturing data and measuring aspects of an employee’s performance.
When deploying these types of applications, however, it is critical to evaluate the type of data the business intends to collect, how it intends to use that data, and any compliance risks, such as those posed by biometric privacy laws. It is also important to consider the risks of unintended uses of the data collected, along with data retention and data security.
To address recent changes to the workplace, nimble retailers continue to leverage technology to provide enhanced employee and customer experiences. The transformation of the workplace will undoubtedly continue, and as retailers adopt more technology to help manage it, it will be increasingly important to consider the complex and evolving regulatory environment and to take steps to address and mitigate compliance risk.