The development of artificial intelligence poses a real threat to women's position in the labor market, particularly in administrative and support sectors. Analyses from Italy and Spain indicate that the automation of routine tasks is hitting female-dominated professions hardest. At the same time, the underrepresentation of women in AI system design, combined with algorithmic errors, may perpetuate historical biases in recruitment processes, necessitating urgent interventions in public policy and training.

Threat to administrative roles

Women are overrepresented in support positions, which are the easiest to automate using AI systems.

Risk of algorithmic bias

Systems trained on incomplete data may replicate discrimination against women during recruitment and performance evaluation.

Need for reskilling

Experts point to the necessity of retraining programs to prevent the digital exclusion of female workers.

The development of artificial intelligence could significantly widen inequalities between women and men in the labor market, especially in sectors dominated by repetitive and easily automatable tasks. According to analyses conducted in Italy, the problem primarily concerns administrative and support positions, where women are statistically overrepresented. Consequently, the risk of technological unemployment is currently higher for female workers than for male workers. At the same time, the automation process is not viewed solely as a negative phenomenon. If new technology implementations are properly designed, artificial intelligence could take over some of the most burdensome, routine tasks, thereby paving the way for new professional specializations and the development of soft skills.

Reports from Spain confirm these concerns and add that the problem does not end with machines displacing some jobs. If AI systems are trained on incomplete or unequal historical data, a phenomenon known as algorithmic bias can occur. This mechanism risks cementing prior prejudices, especially in key recruitment and candidate selection processes. A further challenge is that women hold a smaller share of positions in AI system design and development departments, which means they have less influence over how these tools are built, tested, and deployed in the workplace. This increases the likelihood that performance evaluation algorithms will replicate old patterns of discrimination instead of actively correcting them.
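To make the bias mechanism concrete, here is a minimal, purely illustrative sketch. The data, feature names, and numbers are all invented for this example; it simply shows that a model "trained" on past hiring decisions that penalized a feature correlated with gender (here, a career gap) will encode and reproduce exactly that disparity.

```python
# Hypothetical sketch of algorithmic bias: a naive model fitted to biased
# historical hiring decisions reproduces the historical disparity.
# All records and numbers below are invented for illustration.
from collections import defaultdict

# Historical records: (had_career_gap, was_hired). In this toy sample,
# past decisions penalized career gaps, which correlate with gender.
history = [
    (False, True), (False, True), (False, True), (False, False),
    (True, False), (True, False), (True, False), (True, True),
]

# "Training": estimate hire probability for each feature value
# directly from past outcomes.
counts = defaultdict(lambda: [0, 0])  # feature value -> [hired, total]
for gap, hired in history:
    counts[gap][1] += 1
    counts[gap][0] += int(hired)

def predicted_hire_rate(gap: bool) -> float:
    hired, total = counts[gap]
    return hired / total

# The model encodes the historical disparity rather than correcting it:
print(predicted_hire_rate(False))  # 0.75 for candidates without a gap
print(predicted_hire_rate(True))   # 0.25 for candidates with a gap
```

Real recruitment models are far more complex, but the failure mode is the same: without auditing, the system treats a historically disadvantaged group's lower past hire rate as a predictive signal.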

Routine tasks: mainly performed by humans → partially automated
Recruitment: human decisions with biases → biases encoded in the system
New roles: low female participation in design → growth opportunity after training

The common conclusion from reports from Southern Europe emphasizes that public policy and the way technological implementations are organized within companies are crucial for the future of the labor market. If countries do not launch large-scale training and reskilling programs, the digital gap between genders will widen. Customer service and broadly defined office administration remain particularly vulnerable areas. In these sectors, automation could eliminate some tasks much faster than in professions requiring high creative, technical, or decision-making competencies.

2 — the number of countries from which consistent warnings about AI discrimination risk originate

The debate about automation's impact on employment has been ongoing for decades, but the current wave of generative tools is shifting the dispute from factory floors to offices and recruitment processes. This brings the question of gender equality back with renewed force, as the stakes involve not only the number of jobs but also the ethics of systems that will decide the careers of millions of people. Artificial intelligence itself does not necessarily have to lead to increased discrimination; however, without the introduction of clear ethical rules, regular model audits, and targeted support programs for women, this technology could become a tool that cements a harmful status quo.