The Trillion-Dollar AI Correction
How a viral post on women's health bias spotlighted an overlooked challenge in healthcare data infrastructure

This newsletter is 100% free, but it takes hours each week to research, write, and produce at this level. Here are 5 ways to support my work: 1. click “❤️” to amplify, 2. subscribe, 3. share this publication, 4. buy me a coffee, 5. become a partner.
📲 connect and collaborate with me here! email | LinkedIn | Instagram
This week I posted a short note on AI bias in women’s health. It struck a nerve: over 65k views, 3300+ likes, nearly 500 new subscribers, and a cascade of responses from doctors, nurses, patients, and technologists.
What the data has long shown, the comments confirmed. We may be hard-coding historical medical bias into tomorrow’s health systems.
The implications stretch from care delivery to digital infrastructure, and they raise essential questions for anyone tracking the future of healthcare technology.
The Note That Sparked It
“Only 12% of AI researchers are women.
We are training algorithms to diagnose female bodies using data created by people who have never had a period, never been pregnant, never been told their pain was ‘just stress.’
Meanwhile, 60% of women say doctors don’t take their health seriously. And now we are automating that bias.”
The response was swift. Healthcare professionals shared personal experiences that echoed and expanded on the theme:
“This has ALWAYS been the practice. ALWAYS.” — Francine Shannon, RN
“I waited 30 years for a Cystic Fibrosis diagnosis. My pulmonologist laughed when I first raised it.” — Patricia Staes, RN
“My wife was told her pain was from ‘too much sex.’ Emergency surgery revealed an ectopic pregnancy.” — HP
These are not isolated cases. They reflect systemic gaps in how health data is collected, interpreted, and scaled.
When the Data Is Built on Male Defaults
Until 1993, women of childbearing age were routinely excluded from clinical trials. Much of today’s medical literature and the AI trained on it still reflects those decades of imbalance:
Less than 2% of global healthcare R&D goes to female-specific conditions outside cancer
Erectile dysfunction receives significantly more research funding than PMS, endometriosis, and menopause combined
Women influence over $7 trillion in global spending and make 85% of household healthcare decisions
Yet influence does not equal access. Controlling healthcare spending doesn’t mean women receive care designed around their needs. Many still face misdiagnoses and delayed treatment, not due to patient behavior but because the system wasn’t built with them in mind.
When the Tools Themselves Are Biased
As Dr. Alvarez, who has worked on AI training projects, explained:
“If you point out a bias they’re not tracking, you get removed from the project. Not because it’s false but because that bias wasn’t part of their quality control framework.”
The issue starts with the tools used to collect and interpret health data:
Pulse oximeters sized for larger fingers
Face masks that seal more reliably on male facial structures
Blood pressure cuffs calibrated for average male arms
IV needles standardized on male vein dimensions
Diagnostic criteria based on male symptom presentation
When baseline equipment is mismatched to the population it serves, bias becomes embedded long before data reaches an algorithm.
What Corporate “Solutions” Often Miss
Despite billions flowing into healthcare AI, much of today’s focus skews toward surface-level interventions.
Femtech often centers on tracking cycles or fertility, a scope one physician described as covering “the function, disease, and pain of pelvic organs and breasts.”
Meanwhile, serious conditions like cardiac disease in women continue to be underdiagnosed, in part because diagnostic models rely on male-centric symptom data.
Corporate perks like egg freezing or fertility stipends can be well-meaning, but they don’t replace equitable clinical care, nor do they address how bias flows through data systems. The next generation of infrastructure needs to begin with inclusion.
Where the Sector May Be Heading
By 2030, more than 1 billion women will be post-menopausal, the largest aging demographic in human history. Yet menopause research still receives a fraction of the funding that flows to men’s health issues.
This is not a niche trend. It’s a foundational shift in the demographics healthcare systems are being built to serve.
Emerging areas to watch include:
Diagnostic algorithms trained on inclusive datasets
AI tools that audit healthcare models for gender bias (a minimal sketch of one such audit follows this list)
Monitoring devices designed for female physiology from the outset
Platforms that integrate hormonal complexity across life stages
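What would such an audit look like in practice? Below is a minimal sketch of a disaggregated performance check: comparing a diagnostic model’s sensitivity across sex groups instead of relying on a single aggregate score. Everything here, the synthetic data, the ~85% accuracy figure, and the group labels, is an illustrative assumption, not a real clinical model or a standard auditing framework.

```python
# Minimal sketch of a gender-bias audit: compare a diagnostic model's
# sensitivity (true-positive rate) across sex groups. All data and the
# ~85% accuracy figure are synthetic, illustrative assumptions.
import numpy as np
from sklearn.metrics import recall_score

rng = np.random.default_rng(seed=0)

# Stand-ins for real data: ground-truth diagnoses, model predictions,
# and a recorded sex attribute for each of 1,000 patients.
y_true = rng.integers(0, 2, size=1000)
y_pred = np.where(rng.random(1000) < 0.85, y_true, 1 - y_true)
sex = rng.choice(["female", "male"], size=1000)

# Report sensitivity separately for each group rather than in aggregate,
# since an overall metric can hide a large gap between subpopulations.
for group in ("female", "male"):
    mask = sex == group
    sensitivity = recall_score(y_true[mask], y_pred[mask])
    print(f"{group}: sensitivity = {sensitivity:.2f} (n = {mask.sum()})")
```

The point of disaggregating is simple: a model can post a strong overall score while systematically missing one group, which is exactly the failure mode the stories above describe.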
Trends Gaining Momentum
The viral response to this topic surfaced multiple patterns that may shape the healthcare AI space:
Growing regulatory interest in AI auditability
Employer and hospital legal teams asking hard questions about model transparency
Patients demanding alternatives and sharing their own diagnostic gaps
Founders building for overlooked physiology, not just underserved markets
Considerations for Stakeholders
For those observing the evolution of this field, key questions might include:
How transparent and validated is the data behind clinical algorithms?
Are healthcare systems or regulators signaling demand for more equitable models?
Do emerging tools have the capacity to integrate with existing healthcare platforms?
What safeguards exist to ensure models perform consistently across diverse patient groups?
Can these solutions be extended beyond women’s health to address broader demographic gaps?
Building the Next Layer of Infrastructure
Bias correction is not a product feature. It’s a structural necessity.
As AI becomes embedded in healthcare delivery, the systems that fail to account for gender-based discrepancies may face operational and reputational challenges. The work underway today will define whether future tools are truly equitable or merely automated extensions of legacy models.
If You Are Tracking This Shift
The Investor Readiness Masterclass is a six-week program designed for individuals exploring private market opportunities in women’s health innovation.
It includes:
✅ Real-world case studies on diagnostics, data bias, and structural exclusion
✅ A due diligence framework tailored to early-stage health ventures
✅ Sample deal rooms and investor interviews
✅ A peer cohort working through investment strategy, risk tolerance, and thesis-building
The work of rebuilding healthcare infrastructure is already underway. For those following this space, it’s a critical time to understand the shape of what’s coming next.
👇 Join Our Network
Are you building or backing credible, under-the-radar solutions in women’s health?
We want to hear from you. Reach out privately or reply to this post. FHV curates brands and breakthroughs that deserve broader attention in the women’s health ecosystem.
📣 New: The Billion Dollar Blind Spot
I’m writing the strategic investing guide I wish I’d had at the start.
It’s for anyone who wants to learn how to spot real alpha in women’s health: men, women, allocators, skeptics.
👉 Join the waitlist here for early access
❤️ Enjoying this? If this post sparked something for you, click the ❤️ at the bottom. It helps more than you know and tells me you're reading.
Coming in July 2025:
🎧 Blindspot Capital: The Podcast
Formerly FemmeHealth Founders, our podcast relaunches this summer under a sharper lens and a bolder name. Blindspot Capital explores the undercurrents shaping health innovation, from the deals that stall to the systems that silence. This season, we speak to the people shifting what gets seen, funded, and scaled.
Confirmed guests include:
💥 Ida Tin (Clue, FemTech Assembly) on founding the femtech category
🧠 Lisa Suennen (venture partner & healthtech strategist) on how institutional capital moves
We are also in conversation with voices from maternal science to global policy. Stay tuned as the full season drops.
👉 Subscribe to Blindspot Capital wherever you get your podcasts, or listen directly on Substack.
I write weekly at FemmeHealth Ventures Alliance about capital, care, and the future of overlooked markets. If you are building, backing, or allocating in this space, I’d love to connect.
Disclaimer & Disclosure
This content is for informational and educational purposes only. It does not constitute financial, investment, legal, or medical advice, or an offer to buy or sell any securities. Opinions expressed are those of the author and may not reflect the views of affiliated organisations. Readers should seek professional advice tailored to their individual circumstances before making investment decisions. Investing involves risk, including potential loss of principal. Past performance does not guarantee future results.