Let me begin with two numbers that should not coexist in the same industry and yet, like a bad marriage, somehow persist.
Number one: 93% of recruiters plan to increase their use of AI in 2026. Number two: 8% of job seekers believe AI makes hiring fair.
Ninety-three percent on one side. Eight percent on the other. That is not a disagreement. That is two groups staring at each other across a burning bridge, each convinced the other one lit the match.
Der eine sagt hü, der andere sagt hott — one says left, the other says right — except in this case, the horse is an algorithm, the cart is your career, and nobody is entirely sure who is holding the reins. Or whether there are reins.
And the pharmaceutical industry — an industry that would not approve a cough drop without a double-blind, placebo-controlled trial — has adopted AI in its hiring processes with approximately the same scientific rigour as a horoscope. No control group. No peer review. No adverse event reporting. Just a vendor demo, a budget approval, and a quiet prayer to the gods of efficiency.
Vertrauen ist gut, Kontrolle ist besser. (Trust is good, control is better.) Lenin said that. He was talking about communism, not applicant tracking systems, but the principle transfers with uncomfortable precision.
The Numbers That Nobody in HR Wants to Put on a Slide
70% of hiring managers say AI helps them make faster and better decisions. They are pleased. They are confident. They sleep well at night. On the other side of the screen, only 26% of candidates trust AI to evaluate them fairly — despite the fact that more than half believe AI is already screening their applications. Among American Gen-Z entry-level workers, 62% have lost trust in the hiring process entirely. And 46% of all job seekers say their trust in hiring has decreased over the past year, with 42% blaming AI directly.
Das Vertrauen ist nicht im Keller. Es liegt darunter. Im Grab. (Trust is not in the basement. It is below that. In the grave.)
And it shows. Job seekers are accepting far fewer offers than before the AI boom — acceptance rates dropped 23 percentage points, from 74% in 2023 to 51% in 2025. Half of all candidates no longer believe the jobs they applied for were even real. Read that again if you need to. Every second applicant suspects the position does not exist.
And here is the part that should make every head of talent acquisition reach for something stronger than coffee. The SHRM Benchmarking Survey found that both cost-per-hire and time-to-hire have increased over the past three years — precisely the period of accelerating AI adoption. The tool that was sold as the cure for hiring inefficiency has, in aggregate, made hiring slower, more expensive, and less trusted.
Operation gelungen, Patient tot. (Operation successful, patient dead.) The medical profession has a dark joke for exactly this outcome.
But we are not done with the uncomfortable numbers. Two thirds of companies acknowledge that their own AI hiring tools could introduce bias. Age bias is the most commonly identified type, followed by socioeconomic and gender bias. Only 29% of companies maintain full human oversight on all AI rejection decisions. Half use AI exclusively for initial screening rejections. And 21% — one in five — allow AI to reject candidates at all stages without any human review whatsoever.
For pharmaceutical professionals over 50, that final number deserves a moment of your full attention. The most commonly identified bias in AI hiring tools is age bias. One in five companies lets the biased tool reject candidates with no human check. If you have been wondering why your applications disappear into a black hole despite three decades of regulatory affairs experience — jetzt wissen Sie es. (Now you know.) It was not personal. The machine does not do personal. That is, in fact, the problem.
The AI Arms Race: When Both Sides Bring Robots, the Humans Lose
SHRM calls it the AI arms race, and it is currently turning pharmaceutical recruitment into a performance art piece about mutual distrust. It goes like this. Candidates, exhausted by the opacity of AI screening, start using AI to write their CVs, generate cover letters, and rehearse interview answers. Somewhere between 40% and 80% of applicants now use AI in their applications. LinkedIn reports that 81% of job seekers either already use or plan to use AI in their search.
So what do recruiters do? They deploy more AI to screen the AI-generated applications. Naturally. 91% of US recruiters report spotting candidate deception. 34% spend up to half their week filtering spam and junk applications. 65% of hiring managers have caught applicants reading from AI-generated scripts during interviews, hiding prompt injections in CVs, or showing up as deepfakes.
And then nobody knows what is real any more. Machines write the applications. Machines screen the applications. Humans enter the process only after both sides have already made their most consequential decisions.
What This Means Specifically for Pharma and CRO Hiring in DACH
The pharmaceutical industry is not a normal hiring market. This is a market where a regulatory affairs specialist with EMA submission experience and German language skills is not interchangeable with one who has FDA experience and speaks Mandarin. The specificity of skills makes pharma hiring particularly vulnerable to AI screening errors.
A Pharmacovigilance Senior Manager with 12 years of experience across BfArM submissions, ICSR case processing, and aggregate report authorship is a very specific kind of candidate. An AI tool trained on generalised hiring data — which is most of them — may not understand what makes that profile genuinely exceptional. It will see a pattern match. It will miss the nuance.
The Law Has Arrived: The EU AI Act and What It Means for Your Next Application
Under the EU AI Act, any AI system used for recruitment is classified as "high-risk." The core obligations become enforceable on 2 August 2026. Requirements include mandatory human oversight, bias audits, detailed documentation, and transparency obligations. Fines can reach €35 million or 7% of global annual turnover.
For candidates, this means:
- You will have a right to know when AI was used to evaluate your application.
- You will have a right to a meaningful explanation of the decision.
- You will have a right to request human review.
- Emotion recognition in recruitment is already banned as of February 2025.
Five Things to Do Now
1. Make your CV machine-readable before you make it human-impressive.
Over 87% of companies use AI in recruitment. Use standard section headings. Avoid tables and graphics. Mirror terminology from job descriptions exactly — not creatively. If the job description says "CAPA management," do not write "corrective action processes."
2. Use AI yourself — but like a tool, not a ghostwriter.
Job seekers using algorithmic CV assistance were hired 8% more often. Use AI to structure and refine your genuine experience, not to fabricate expertise. The interview will reveal the difference. Always the interview.
3. Ask whether AI was used — and what it decided.
From August 2026, the EU AI Act requires employers to inform candidates when AI is used and provide explanations on request. Start asking now. The question itself signals that you understand the landscape — and that you know your rights.
4. Build a digital footprint that the algorithm can find.
59% of recruiters say AI helps them discover candidates. Your LinkedIn profile, publications, and professional activity are all searchable and algorithmically ranked. With a well-optimised LinkedIn profile carrying genuine pharma expertise signals, the AI surfaces you proactively — before you even apply.
5. Evaluate employers by how they handle AI.
During interviews, ask: "Can you tell me how AI is used in your recruitment process, and what human oversight is in place?" The quality of the answer tells you more about the organisation than any Glassdoor review.
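The advice in point 1 — mirror the job description's terminology exactly — follows from how the simplest screening filters are believed to work. Here is a minimal, purely illustrative sketch of naive keyword matching; the term list, function name, and scoring are hypothetical and do not describe any specific vendor's system:

```python
# Hypothetical sketch of naive keyword screening, the simplest form of
# ATS-style filtering. All terms and thresholds here are illustrative.

REQUIRED_TERMS = {"capa management", "ema submission", "pharmacovigilance"}

def keyword_score(cv_text: str) -> float:
    """Fraction of required job-description terms found verbatim in the CV."""
    text = cv_text.lower()
    hits = sum(1 for term in REQUIRED_TERMS if term in text)
    return hits / len(REQUIRED_TERMS)

# A CV that paraphrases ("corrective action processes") scores lower than
# one that mirrors the job description ("CAPA management"), even when both
# candidates have identical experience.
cv_exact = "Led CAPA management and EMA submission work; pharmacovigilance lead."
cv_paraphrase = "Led corrective action processes and European filings; drug safety lead."

print(keyword_score(cv_exact))       # 1.0
print(keyword_score(cv_paraphrase))  # 0.0
```

A literal substring match like this has no concept of synonyms, which is exactly why "corrective action processes" can vanish into the black hole while "CAPA management" sails through. Real systems are more sophisticated, but the incentive to mirror terminology remains.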
The Bottom Line
AI in pharmaceutical hiring is not your friend. It does not know you. It does not understand the context of your career. It rejected your application in six seconds and has no memory of having done so.
AI in pharmaceutical hiring is also not your foe. Used properly — audited, governed, combined with human oversight — it surfaces candidates who would otherwise be overlooked. The technology, when implemented with discipline, works. The implementation, in most organisations, has the discipline of a Fasching parade.
Friend or foe? Neither. AI in hiring is a power tool. In the hands of a skilled operator with proper safety equipment, it builds something useful. In the hands of someone who skipped the manual, it removes fingers.
For you, as a pharmaceutical professional navigating this: be the informed candidate. Understand the tools being applied to your application. Exercise the rights the law is about to hand you. Build the visibility that makes the algorithm work for you rather than against you.
Sources: LinkedIn / HR Dive — 93% of recruiters plan to increase AI use in 2026 · Greenhouse 2025 AI in Hiring Report · Gartner / UNLEASH — Only 26% of candidates trust AI · SHRM Recruitment Benchmarking Survey · CoverSentry AI in Hiring Statistics 2026 · Novoresume AI in Recruitment Statistics · EU AI Act Annex III · CNBC / LinkedIn — AI will dominate hiring in 2026 · LinkedIn Future of Recruiting 2025 · NBER algorithmic CV assistance study
Is the algorithm working against you?
Book a free getting-to-know-you call. I will tell you honestly how to position yourself in a market where AI is deciding who gets seen — and I will tell you what the algorithm is actually looking for in your function.
Book a Free Getting-to-Know-You Call