Yudara Kularathne: Dick Pics for Science, Child Emergency Doctor to AI Founder & Model Training vs. Regulatory Affairs - E523

· Healthcare, Southeast Asia, BRAVE Podcast

"How come we have such powerful technology in hospitals, but people at home don’t have access? We have the best technologies and care in hospitals, but not everyone wants to go there, or sometimes they don’t even realize they need to. I kept thinking about this, especially about my friend, during the COVID lockdown when I had time to reflect. I used some web images to create a tool that could identify what’s normal and abnormal in male anatomy. I shared it with my friend, who then shared it with others. Within two weeks, I started receiving messages from strangers saying, ‘I love your app. It’s amazing. I had a pimple in my private parts, didn’t know what it was, and now I finally understand. Thank you!’ I was stunned by the response—thousands of downloads in just two weeks. It made me realize the huge gap in accessible healthcare solutions." - Yudara Kularathne CEO & Cofounder of HeHealth

"I was immediately worried this guy might be in a crisis, so I rushed to his house. When I got there, he was on the balcony, about to jump. We pulled him back, and I asked, 'What’s going on?' It was Christmas 2019, and he said, 'I’m dying.' As a doctor, I had managed his medical records before, so I knew his history. I told him, 'You had the tests, and they came back fine.' Of course, he could have contracted a new infection, so I asked, 'How do you know you’re infected?' He said, 'I’ve embarrassed everyone. I can’t even tell my wife.' The social pressure was overwhelming. He’d had unprotected sex outside his marriage and noticed a rash on his private parts. After Googling his symptoms, he convinced himself it was HIV. The stigma and lack of access to accurate information in sensitive areas pushed him to suffer in silence, unable to share his fear with anyone." - Yudara Kularathne CEO & Cofounder of HeHealth

"Technology moves so fast that by the time you publish a paper, your technology is already twice as advanced as what you published. This is a limitation of the FDA as well—there’s little point in getting approval for technology when six months later you have something significantly better. You have to reapply and modify submissions constantly. So, we decided to be upfront, communicate honestly, and work collaboratively with all the relevant organizations. For example, I’ve worked with the FDA and spoken at their conferences as a content expert on data. One of the key steps we took was collecting initial data directly from the community. We sent out brochures, inviting participation, and people responded. 'Pick for science!' Yes, believe me, people truly responded." - Yudara Kularathne CEO & Cofounder of HeHealth

Yudara Kularathne, CEO & Cofounder of HeHealth, and Jeremy Au discussed:

1. D*ck Pics for Science: Yudara described the progress of HeHealth, an AI-powered men’s health app launched in 2022. The app helps users detect sexually transmitted infections (STIs) by analyzing images. At its peak, the app saw around 20,000 downloads in a single week from around the world, including countries like Uzbekistan and Kazakhstan. Yudara recounted a case where a user with monkeypox contributed 150 detailed images documenting his condition over two weeks to help improve the app’s diagnostic accuracy. This approach leveraged community contributions to create robust, annotated datasets while adhering to ethical guidelines.

2. Child Emergency Doctor to AI Founder: Yudara reflected on his journey from managing over 10,000 cases as a pediatric emergency doctor to founding HeHealth. During his time at Sengkang General Hospital, he built a dedicated pediatric emergency department with support from KK Women’s and Children’s Hospital. The COVID-19 pandemic disrupted hospital workflows and inspired him to address healthcare gaps. He cited a personal encounter in 2019, when a friend mistakenly believed he had HIV and considered suicide, as the moment that fueled his commitment to bridging stigma-driven healthcare disparities through technology.

3. Model Training vs. Regulatory Affairs: Yudara emphasized the balance between building AI models and navigating regulations. HeHealth used incremental data collection and synthetic data to improve accuracy and reduce biases, particularly for underrepresented groups. In the U.S., HeHealth faced an FTC investigation over privacy concerns about sensitive user data, with a potential fine of up to $50 million. Using AI tools for legal strategy and transparent communication, Yudara resolved the case within two months without a finding of wrongdoing. He highlighted how innovation often outpaces regulation and the importance of proactive collaboration with regulators to mitigate risks.

Jeremy and Yudara also talked about the role of synthetic data in addressing racial biases in AI healthcare tools and the challenges of scaling health tech in emerging markets.

Introduction and Guest Welcome

(00:00) Jeremy Au: Hey, Yudara, really excited to have you on the show.

(00:03) Yudara Kularathne: Good morning, Jeremy.

(00:03) Jeremy Au: I've known you for a couple of years now ever since building out the startup. It's been fascinating to see you transform both as a person, as a founder, but also bringing the practice of medicine and startups together. Please introduce yourself.

(00:16) Yudara Kularathne: Thanks, Jeremy. It's a pleasure. I'm an engineer, doctor, and entrepreneur, in that chronological order. I'm a self-taught engineer, learning from books, and I used to code quite a lot. Then I started my medical education and graduated from NUS as a doctor. Of course, after 15 years of being a doctor, I moved into entrepreneurship. Happy to walk through the timeline in more detail in the next segment.

Early Life and Education in Sri Lanka
(00:39) Jeremy Au: What was it like growing up in Sri Lanka and moving to Singapore?
(00:43) Yudara Kularathne: Yeah, Jeremy, I was born in Sri Lanka and grew up in the southern part of the country, and then moved to Colombo, which is the capital. I did my primary and secondary education in Colombo. I was lucky enough, they have something like the PSLE, so I got to move from a village (01:00) to the capital to study at the number one school in the country.

I was very passionate about pushing myself, so it was a good chance. Through that, I did well in my O levels and got into the best medical school. But focusing a little bit on growing up in Sri Lanka: now that I'm living in Singapore, I see the big contrast.

When we were small, my primary school was literally on a beach, so a big part of my primary school was playing cricket on the beach. Only when the teachers came did we run to class and start studying. Those are very fond memories, and I have a lot of happy feelings about them.

So I think that was a massive deal. We came to understand society, the connections among people, and the value of life and money, because Sri Lanka is a developing country and the southern parts are even poorer, even though I was born into a relatively well-positioned family.

I grew up with those kids and used (02:00) to see these things firsthand. And of course, moving to the capital, it's a big city, relatively close to Singapore.

Cultural Transition to Singapore
(02:07) Yudara Kularathne: From medical school, I got to work with two professors in Singapore at NUS, and they asked me to move over. There was a pathway from my medical school to the NUS medical school that I could switch over to at any time. Sadly, it's not there anymore.

With that, switching over to Singapore to complete my medical school was another big cultural change. And yeah, it was all a good memory. Honestly, with my sons growing up in Singapore, I wish I could send them to either Sri Lanka or Taiwan, which is where my wife is from, to grow up in a real village.

(02:42) Jeremy Au: Like you said, it was like three differences, right? It was a village to a city. It's from Sri Lanka to Singapore. And obviously, you're changing your medical school as well. So what was the culture shock, arriving in Singapore?
(02:53) Yudara Kularathne: It was a massive cultural shock. In Sri Lanka, in my college, I used to speak British English, and then (03:00) of course, we have Singlish here. That was a big change. You might think it's small, but I had a lot of confusion and misunderstanding.

In Sri Lankan and Indian culture, when a senior says something, we don't question it at all. You keep your head down and say, "Yes, sir." That was very different from Singapore. There were a few times I thought I understood the instructions but never questioned or clarified them. What I thought I understood was wrong, so I executed the instructions, and it came out wrong.

But that lasted only about three months. I adjusted quite fast, thanks to a lot of friends, even before I came to Singapore. There were quite a few friends I knew from Singapore who had been in Sri Lanka, and others through my family. So it was only a bit of a bumpy road.

Journey into Medicine
(03:42) Jeremy Au: And what's interesting is that you chose to not only study medicine, but also continue in it. What drew you to medicine in the first place and why did you continue? Because a lot of people start and they don't complete, for example, right? So why start and why continue?
(03:57) Yudara Kularathne: Even though I started (04:00) with engineering first, when I was about 10, I began coding in languages a lot of people now don't know, like QBasic under DOS, and carried my programs around on 3.5-inch diskettes. But as I grew up, I always had this passion to help people.

And of course, being born to South Asian parents, they always say being a doctor is the biggest thing. On top of that, my family didn't have any doctors. We had engineers and lawyers, but not doctors. So they were always saying, "Why don't you just be one?" They didn't push me per se, but because I was doing very well in academics, they said, "Why don't you do medicine?"

And I thought about it. I also felt it was something through which I could help a lot of people. I loved it. I enjoyed my medical school time. After that, I actually wanted to pursue a surgical specialty, but when I started doing it, I realized it was not for me.

Then, on the first day itself, I was accidentally pushed into the pediatric emergency department at KKH, and I loved it. That was the (05:00) turning point, and I never looked back. And of course, I trained as an emergency physician and specialist at NUS, and I took the UK and US exams as well.

And yeah, the biggest thing is when people come to you at the lowest point of their life. For most of them, an emergency is the lowest point of their life. You can help them come through it, and seeing that smile, especially in pediatric emergencies, when you fix the problem and they are smiling and running around, that kind of feeling is not something you can buy with money. So I never felt it as a burden. I was always looking forward to it.

Pediatric Emergency Medicine
(05:35) Jeremy Au: So what's it like to lead a pediatric emergency team? Because you mentioned that you've handled and managed over 10,000 emergencies. First of all, these are kids, right? Pediatric kids. So obviously, they're cute, but you're talking about emergencies. So what kind of emergencies are you talking about?
(05:51) Yudara Kularathne: I'll start with a bit of a joke because, as I said, I was pushed into pediatric emergency by accident. Nobody in the doctor community (06:00) wants to do pediatric emergency because it's very tough, and parents are very tough. You don't get praise at all. I was there by accident, and I told my supervisor at that time, "Look, this is like a transition. Just give me three months. You won't see me anymore."

He said, "Yeah, sure. We need people because there's no one to work." And after that, I stayed in pediatric emergency for four years, and he still laughs at me and says, "Look, you asked me for only three months, and then you never left the department."

I truly enjoyed it. Of course, after becoming a specialist, there was this interesting situation. The government was building a new general hospital in a growing estate with a lot of young families. So the Ministry of Health wanted to have a pediatric emergency in Sengkang, and they were looking for suitable people.

My original plan was to complete my training and go back to KK. But at that time, the KK head asked me, "Do you want to take charge of starting a small department there? We will support you." My other bosses felt I was suitable to run this small team from the beginning.

So I picked people, I wrote (07:00) the necessary pathways with KK Hospital, and I started small. This was pre-COVID, and we did a really good job. We built a team to the point that we were actually seeing roughly about 10%.

I had a very small team. It was under the bigger emergency department, and we ran as a small team. We had separate nurses, separate doctors, and a separate small space we called our pediatric corner. It was a beautiful experience. But of course, with COVID, things changed, and hospitals completely changed their physical structure and so on. And at the same time, that's where my transition to entrepreneurship came in.

(07:36) Jeremy Au: What does a pediatric emergency look like? What does it mean? You mean like babies crying or they're going to die? Could you share what a case of a pediatric emergency could look like?
(07:47) Yudara Kularathne: Yeah, Jeremy, you have described the two ends of the spectrum. Sometimes babies cry, and first-time mothers don't know what to do. They bring them to the emergency, and sometimes it's just that maybe they're hungry. Mothers are not feeding enough. Or maybe they are feeding too much. Then (08:00) maybe the baby is too full, feels bloated, and cries.

That's like the simplest thing you can see. Of course, we help them with that. And sadly, sometimes we see life-threatening conditions leading to death or even serious accidents that can lead to death, bringing them to pediatric emergencies.

So we have to be prepared for the whole spectrum of things. Again, it's not a one-man show; it's a whole team. As a department, we train the doctors. We have a position in Sengkang for them to get used to it before they transition out.

Usually, it can be simple infections, falls, small fractures, sprains, and bumps, as well as serious things like kidney problems or heart problems. Even though they are rare, they can happen to babies because their organs are growing very fast. So we do see the more serious ones. And of course, rarely, we had cases of serious pediatric injuries. Even now, every month, you do see one or two, sadly.

Entrepreneurial Beginnings
(08:57) Jeremy Au: And you mentioned that you started exploring (09:00) entrepreneurship. So there you are, you're enjoying pediatric emergency and for the first time in your career exploring entrepreneurship, which is a very different role, different functions. So how did that interest come about?
(09:12) Yudara Kularathne: Actually, it's not like a switch. Of course, there was a time when I switched over, but it's a continuous process. As I said, I started my life coding mostly.

When I went to medical school, I never stopped coding, and I was always in tech. My best friends were always proper engineers; I was the tech bro in medical school. We wrote small apps and played around together, and even during my medical career, in the background, I was hanging out with my engineering friends. Then two events happened that truly forced me to do something by myself.

One is, of course, when one of my friends sent me a message one day saying, "Thanks for being a good friend." As an emergency doctor, I had seen this kind of thing in suicide notes, so I was immediately worried that this guy was going through a crisis. I quickly rushed to his house. This (10:00) guy was on the balcony, about to jump. We grabbed him, and I asked him what was going on.

This was Christmas time 2019. He was saying, "I'm dying." As a doctor, I actually used to manage his medical records, so I knew his history. I told him, "No, you had the tests. You were fine." Of course, he could have picked up a new infection, so I asked, "How do you know you have the infection?"

He said, "I have embarrassed everybody. I don't even want to tell my wife." The social pressure was that strong. He had had unprotected sex with someone outside his marriage and, because of that, didn't want to tell anybody. He saw a rash on his private parts and, after Googling his symptoms, thought it was HIV.

It shows how much stigma and the lack of access to accurate information in sensitive areas cause these things. He was not ready to share it with anybody and was suffering internally. At some point, he thought, "Look, I'm going to die from HIV, so why don't I make it faster so that everybody's happy?" That shocked me. I kept thinking about it.

I was in charge of the east part of the COVID operations with a few of (11:00) my colleagues. There were dormitories, S22, and so forth. Two young boys were brought to the hospital dead from COVID. It broke my heart because we didn't even know they had COVID. They were starved of oxygen and died.

That made me think: how come we have such powerful technology in the hospital, but at home, people don't get access? At the same time, it rang the same bell as my friend's incident. We have the best technologies and care at hospitals, but not every circumstance makes people want to come to the hospital, and sometimes they don't know they need to come.

So I kept thinking about my friend while working during COVID. It was lockdown, so I had a lot of time. I used some web images to develop a tool that could tell what normal male anatomy looks like and what's abnormal. Then I passed it to my friend, and he passed it to his friends.

After about two weeks, a lot of random strangers messaged me, saying, "I love your app. It's such an amazing thing. I had this (12:00) pimple on my private parts and had no way to understand it. Now I finally understand, thank you." I was like, "Wow, I didn't realize there was such a big gap."

Thousands of downloads happened within two weeks, and then I thought, why not put some effort into improving it? As a second step, I got it to cover three diseases properly. I talked to a few of my friends, and they all loved it. Traction grew.

I have to mention one guy, one of my doctor friends, Daniel He. He said, "Dude, I think this is a good idea. I'll throw in some money. Just do it. But you have to quit your job." I was like, "Are you sure? I have a very comfortable life, and my family is happy." He said, "Yeah, it's going to be tough, but just do it."

Of course, I talked to my wife. She was an entrepreneur before. That was one of the biggest things. She said, "Yeah, let's do it together."

And yeah, then in 2022, I jumped out and started. Things grew well. We got funding from the U.S. and moved to U.S. operations. I'll pause there. Of course, the real journey had its ups and downs. So here we are, still growing in the right (13:00) direction.

Founding HeHealth
(13:00) Jeremy Au: Can you talk about what HeHealth is first? And then after that, we'll talk about the ups and downs.
(13:05) Yudara Kularathne: HeHealth is a men's sexual health and wellness app helping men detect STIs. The idea is clear: build decentralized infectious disease detection models and diagnostic models in the digital realm.

That means a few things. It should be outside the hospital, easily accessible to people through their phone. It has to be digital. That means there's no physical aspect per se. It has to be data—images, sounds, whatever digital data we capture. And it has to be at a population level. That means screening diseases in a matter of minutes for millions of people.

So that was the idea, and of course, we developed some technologies through HeHealth. We slowly moved to things like monkeypox, where we are one of the leading research labs in the world. Then, of course, a few areas like synthetic data for COVID, where we showed some innovative ways. Hopefully, there won't be new pandemics, but if there is (14:00) any situation, we are ready to roll out some very innovative technologies. That's the key for HeHealth and the bigger vision.


Challenges and Controversies
(14:08) Jeremy Au: What were some of the ups and downs?
(14:10) Yudara Kularathne: I'm very happy to share about the ups and downs. As a startup, we had very good traction. We didn't have to do marketing per se. People were using it. There were times when we had 20,000 downloads in one week, and it was all around the world.

We were sometimes surprised by places like Uzbekistan and Kazakhstan, which I hardly knew where they were. During the early stages of the Russia-Ukraine war, we had Russians and Ukrainians using it quite a lot. We had a really good run. And of course, because of the traction, we had significant exposure to the global audience through media, innovative ways, and so on.

Of course, like any technology, you have hype, and then the fear factor comes in. When I started operations in the U.S. in 2023, a lot of people appreciated it, but a few felt negative about some aspects of it (15:00), especially the data privacy part.

As per HIPAA in the U.S., we had a few extra layers of protection. But the idea of us collecting private pictures of men around the U.S. for helping them still disturbed quite a large part of society. Some people misunderstood it as promoting sex, which is clearly not the case. Sex and sexual health are very different.

We were focusing on people who had no information and no access to care in a very stigmatized section of healthcare. The concerns fell into three areas, especially data privacy. A few people felt there was no prior precedent for me to point to and say, "Look, they have done this, and this is what they are doing." So it was unclear whether what we did was standard or not.

A few people started writing negative and misinformed pieces. The fear on social media moved much faster than the initial hype. There was a massive hoo-ha in the U.S. about data privacy, data storage, consent, and so on (16:00).

I believe innovation is two steps ahead of regulation. So as a doctor, I took the necessary steps. For me, there was one simple principle: as a physician, do no harm. Protect the patient's autonomy. You always tell the patient what it is and let them decide what’s good for them.

We followed those principles, but again, as I said, because of the misinformed writing about different aspects, we took it as constructive criticism. But it also drew unnecessary attention from politics. I was literally caught up in a massive ideological discussion in the U.S. about wokeism and…

Facing Legal Battles and Financial Struggles
(16:36) Yudara Kularathne: Sexuality and gender and politics clearly affected some people, and they took it as their life's mission to destroy us. They complained to governments, institutes, and so on, escalating into some government investigations.

We had multiple legal cases in 2024. We ran out of money because of the operations and legal costs (17:00), but we had a clear plan. Luckily, I was surrounded by good professors from Harvard University. Long story short, we are growing significantly in 2025.


Big News and Overcoming FTC Scrutiny
(17:09) Yudara Kularathne: I have some big news.
(17:10) Jeremy Au: Happy for you to share one of them during the podcast for the first time. Yeah, so let's talk about that, because it's always tough to be part of legal scrutiny. In this case, the FTC was doing the investigation and having a lot of conversations with you. It must have been very stressful. So how did you manage your own emotions and thinking? What was that experience like?
(17:31) Yudara Kularathne: I think, Jeremy, the most important thing was to negotiate and get out of it without being fined or found guilty of wrongdoing. That’s very important.

A lot of people put money into misinformation campaigns and digital campaigns against us.


Navigating Legal Threats and Using AI in Court
(17:45) Yudara Kularathne: I got death threats in California when I was living in San Francisco. Anyway, coming back to the FTC—they sent me a charge sheet, and when I looked at it, they were suggesting I might face up to a $50 million fine and a possible jail (18:00) term.

So I talked to my lawyers, and they said, "That looks possible. We can get you down from the $50 million fine to maybe $1 or $2 million. We can guarantee that you won't go to jail, but you have to pay us $2 million upfront, and most likely $5 million overall."

They said the deal was clear: $50 million versus $5 million. That sounds like an obvious choice for anybody, on top of not going to jail. But I had different plans, because I knew morally I didn't do anything wrong. I was very clear that I had acted on the right principles: do no harm, and let people decide for themselves.

Of course, we didn’t have money at that time. Clearly, I didn’t have $2 million in my pocket to pay the legal team. So I said, "I’m going to take it on by myself to tell the truth."

And of course, I used AI in a very innovative way. We built a complete agentic system using FTC data, which is publicly available. I trained a few agents, and then we literally had court (19:00) discussions through agents.

Of course, I disclosed this to them in an email. They said, "You cannot video record the sessions, which are court sessions." I said, "Sure, I can take notes." They said, "Yeah, sure." So my agents took notes, and we worked.

Clearly, what I did was focus on telling the truth. I was very honest from end to end. I think nobody has concluded an FTC investigation in two months, but we did.

They were very happy. We gave them everything. Of course, they did point out some minor limitations we had with our university collaborations and so forth. I'm a very small startup. We had raised roughly $2 million as a pre-seed startup.

Our revenue was definitely less than $1 million at that time. The FTC hardly goes after small companies like ours, but they decided to. Then we showed that we had done our part. They pointed out small things we could have improved. I accepted those as small things.

Look, they were on the roadmap, but clearly, we couldn’t do them without a budget. And we agreed on a settlement.


Settlement and Mental Health Challenges
(19:59) Yudara Kularathne: (20:00) Without any wrongdoing on my side. Coming back to the mental health part, my co-founder and I were very stressed. At the same time, I had really brave souls who backed me up, especially a few professors at universities in the U.S.

When their universities pressured them to detach from my project, they said, "No, we believe this is the future of sexual health. This is how you get the technology to 300 million people overnight, literally, if you need it. Everybody carries a phone, and he developed a technology and patented it. He's the only guy who can do it."

So they backed me, and with that strength, I managed to keep my head down, focus on telling the truth, and complete our journey. Of course, now looking back, all the legal battles are over, and everybody agrees that there were some minor points, but we are on the right path with the right intentions.

Most importantly, here we are now working with governments around the world to bring the technology to people's hands. (21:00)

(21:00) Jeremy Au: Yeah.


Broader AI Issues in Medicine
(21:01) Jeremy Au: So I think it's interesting because the issues that you ran across in this case apply to broader AI tools in medicine, which is about guidance, diagnosis, and training data. Let’s talk about those issues.

For me, when I looked at the FTC letter, I think there were two components, right? One was the training data: the approach and the level of quality. That's one side. The other side, of course, is your marketing: whether you're supporting a diagnosis alongside a doctor versus making the diagnosis yourself.

I think there are a lot of variations of that, right? And I think that’s the case for almost every AI app. They are trying to automate diagnosis at home, and they have these two issues. So could you talk a little bit more, because I know you’re building a GP AI as well? How do you see those two aspects? What’s the data side, and what’s the diagnosis side? How do you see those things playing out?

(21:50) Yudara Kularathne: Sounds good, Jeremy.


Innovative Data Collection and Synthetic Data
(21:52) Yudara Kularathne: Data is something I'm very passionate about. When we started the project, we published a peer-reviewed paper in the (22:00) Mayo Clinic digital journal, which is considered one of the top five global digital publications.

I explained every step, from step zero to step five, every single data point, how I collected it, how I got consent, and so on. I want to stress a few things.

Of course, we decided that we were going to make incremental improvements rather than collect data for five years, validate it after five years, and then move forward, because technology moves too fast. Technology moves so fast that even now, when we publish a paper, our technology is twice as good as what we publish.

This is a limitation of the FDA as well. There's no point in getting FDA approval for a technology when six months later you have something twice as good. You have to reapply and modify your submissions. So we thought, okay, we're going to be very honest, communicate these things upfront, and work with all the organizations.

I have worked with the FDA, for example, and spoken at FDA conferences as a content expert on data. One of the biggest things is that we collected our (23:00) initial data directly from the community.

We sent out brochures saying, "Contribute to science."

(23:05) Jeremy Au: Dick pics for science.

(23:06) Yudara Kularathne: Yes, we did.

(23:07) Jeremy Au: That’s…

(23:08) Yudara Kularathne: Believe me, people responded. One guy who had monkeypox documented his case from day zero to day 14, taking multiple pictures every day. He sent 150 images in one shot. He said, "Use it. Help other guys who might have the infection." That was one of the best-documented datasets I’ve ever seen in my life.

We took a different approach. Instead of going to the hospital, getting hospital approval, and collecting the data from there, I said, "I’m going to go to the patients directly. I’m going to tell them my objectives, and I’m going to collect data from them with their consent."

We documented everything and had IRB (Institutional Review Board) approvals, which means ethical approvals, for these things. We are a small startup, yet we were being compared with relatively established areas like neurology and cardiology.

If you look at it comparatively, Harvard has a sexual health group working on similar projects. They released a monkeypox model at that time. They had (24:00) 1,000 annotated data points. It was published in the paper.

I published at the same time, and my dataset was 10x better. They collected from hospitals, and I collected from the people—with very ethical methods, of course.

If you look at it from a bigger view and say 10,000 is more, I agree.


Building Trust and Overcoming Bias
(24:19) Yudara Kularathne: The second part of it is that I have to tell this story, because when we ran the first model, the data collected was mostly from the U.S., which is mostly white skin. I ran the model and then realized brown-skinned users had lower accuracy.

I was like, wait, I have brown skin. What you're saying is I built a model that is biased against myself. I said, no, I cannot accept that. So we talked about it publicly. As a small team, I put a lot of resources into building an answer to this rather than complaining about it for the rest of my life.

So we built a synthetic data pipeline, used it for the first time, and showed that it works. And (25:00) being industry leaders at that time, we were the first to publish a Nature publication on synthetic data, and it worked well, reducing bias in the data.

Sadly, the paper came out after the FTC investigation. So I said, I have a preprint; we had already put it on a preprint archive, which is accepted nowadays. They didn't say it was wrong, but they felt there were some missing points they wanted to verify.

I said, look, time will verify these things. You just have to wait. But they agreed that there was nothing they could find wrong, and then they closed the case.

So that's the data part. Data is moving so fast now. If you look at OpenAI's o3 model, internally, from what we hear in San Francisco, it's mostly trained on synthetic data. That's the future, because these are clean, solid, small datasets rather than a lot of rubbish, which has no value.

When it comes to penis pictures on the internet, there is a massive oversupply, and they have no clinical value. So that's (26:00) the problem. We created proper, annotated, medically useful datasets for the first time in history for this sensitive area.

So that's the data. I'm not going to go into details.
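
To make the bias issue described above concrete, here is a minimal sketch of per-group accuracy evaluation followed by synthetic oversampling of an underrepresented group. The data, features, and modelling choices are invented for illustration (dummy tabular features and scikit-learn stand in for a real image pipeline); nothing here is HeHealth's actual system.

```python
# Minimal sketch: measure per-group accuracy, then oversample the
# underrepresented group with synthetic samples before retraining.
# Illustrative only: dummy tabular features stand in for image embeddings,
# and a Gaussian-jitter generator stands in for a real synthetic-data model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def make_group(n, shift, label_noise=0.1):
    """Create dummy 'embeddings' and labels for one skin-tone group."""
    X = rng.normal(loc=shift, scale=1.0, size=(n, 8))
    y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)
    flip = rng.random(n) < label_noise
    y[flip] = 1 - y[flip]
    return X, y

# Imbalanced training data: group A is well represented, group B is sparse.
Xa, ya = make_group(2000, shift=0.0)
Xb, yb = make_group(100, shift=0.6)
X = np.vstack([Xa, Xb])
y = np.concatenate([ya, yb])
groups = np.array(["A"] * len(ya) + ["B"] * len(yb))

X_tr, X_te, y_tr, y_te, g_tr, g_te = train_test_split(
    X, y, groups, test_size=0.3, random_state=0, stratify=groups)

def per_group_accuracy(model, X_te, y_te, g_te):
    """Report accuracy separately for each group in the test set."""
    return {g: accuracy_score(y_te[g_te == g], model.predict(X_te[g_te == g]))
            for g in np.unique(g_te)}

baseline = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("before augmentation:", per_group_accuracy(baseline, X_te, y_te, g_te))

# "Synthetic" samples for group B: jittered copies of real group-B training
# points. A real pipeline would use a generative model with clinician review.
Xb_tr, yb_tr = X_tr[g_tr == "B"], y_tr[g_tr == "B"]
idx = rng.integers(0, len(Xb_tr), size=1500)
X_syn = Xb_tr[idx] + rng.normal(0.0, 0.2, size=(1500, 8))
y_syn = yb_tr[idx]

augmented = LogisticRegression(max_iter=1000).fit(
    np.vstack([X_tr, X_syn]), np.concatenate([y_tr, y_syn]))
print("after augmentation:", per_group_accuracy(augmented, X_te, y_te, g_te))
```

The point of the sketch is the workflow rather than the numbers: measure accuracy per group first, add reviewed synthetic samples only where a group is underrepresented, then re-measure.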

(26:11) Jeremy Au: Maybe before we go to the diagnostic side, the traditional approach has been that you need a very large set of human data, raw data, annotated, diagnosed, et cetera. But of course, the problem with sexual health is that it's hard to get that data because it's a stigmatized area.

So you're using a combination of crowdsourcing, citizen science, AI to find the patterns, and synthetic data on top of that. It's a very different approach. Do you feel like that's going to change things?

Because for me, when I hear synthetic data, I'll be like, doesn't synthetic data just create its own junk? If an AI trains itself, does it go crazy?

(26:52) Yudara Kularathne: So that nicely connects to the next part. I think you need to have a human in the loop.

I was talking to another doctor who's against synthetic data. (27:00) He said, "If you use AI-generated data multiple times, the model will collapse." That's what happens when the human is not in the loop. AI is not what we call AGI, Artificial General Intelligence, that can go by itself. No, you have to have guidance.

But when you use guidance to create synthetic data the right way, it clearly increases the diversity of the dataset. It clearly increases the quality of the dataset. You don't need trillions of GB of datasets to create it.

You can have a very solid, very well-represented dataset that is small. 10,000 samples is way more than enough. We ran a model recently and published that a very well-documented synthetic dataset of 1,000 samples is good enough to represent the whole variation.

The proportion of people who agree with me is growing. With time, that will answer your question. At the moment, the general public doesn't agree because this is a very technical thing.

You need to understand the distribution and variation algorithms in AI. What I'm using as an engineer now is very tough. It's very (28:00) new, and very few people have published work on this. So clearly, if you ask around, they might say, "Oh yeah, there are only two people in the whole world doing this."

I'm not going to buy it at this time. So again, it follows the pattern.

On the general side, if you look at autonomous driving, 90% of the data is synthetic now. People will accept it, but it will take some time. But the key factor is the human in the loop.

You cannot just let the AI run by itself. It needs qualified doctors guiding the AI in the right direction.
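
A minimal sketch of the human-in-the-loop idea described here: synthetic candidates only enter the training pool after an explicit reviewer sign-off. Every class, field, and label name below is a hypothetical placeholder, not anything from HeHealth's pipeline.

```python
# Minimal sketch of a human-in-the-loop gate for synthetic training data:
# generated candidates only enter the training pool once a qualified
# reviewer signs off. All names here are hypothetical placeholders.
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

@dataclass
class SyntheticSample:
    image_path: str          # path to a generated image
    proposed_label: str      # label suggested by the generator
    approved: bool = False
    reviewer_note: str = ""

@dataclass
class TrainingPool:
    samples: List[SyntheticSample] = field(default_factory=list)

    def add_with_review(self, candidate: SyntheticSample,
                        review: Callable[[SyntheticSample], Tuple[bool, str]]) -> bool:
        """Admit a candidate only if the human reviewer signs off."""
        ok, note = review(candidate)
        candidate.approved, candidate.reviewer_note = ok, note
        if ok:
            self.samples.append(candidate)
        return ok

def clinician_review(sample: SyntheticSample) -> Tuple[bool, str]:
    # Placeholder: in practice this would surface the image to a doctor in a
    # labelling tool and record their decision and comments.
    plausible = sample.proposed_label in {"normal", "hpv_wart", "herpes_lesion"}
    return plausible, "label in allowed vocabulary" if plausible else "unknown label"

pool = TrainingPool()
pool.add_with_review(SyntheticSample("synth/0001.png", "hpv_wart"), clinician_review)
pool.add_with_review(SyntheticSample("synth/0002.png", "not_a_real_label"), clinician_review)
print(f"approved samples in pool: {len(pool.samples)}")  # only the reviewed-and-approved sample remains
```

Recording the reviewer's note alongside the approval is a design choice that keeps the provenance of every synthetic sample auditable later.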

(28:29) Jeremy Au: I think that's probably one of the key learnings from the FTC and HeHealth experience. But it's also why you're building the GP AI.

(28:37) Yudara Kularathne: Yep.

Launching New AI Healthcare Initiatives
(28:37) Yudara Kularathne: My last six months have been about a very new startup, because we decided HeHealth is going to continue our work, but we are not going to fight with the media. We are not going to fight fire with fire.

This is something taught to me by one of my professors, Dr. Klausner, a very well-known professor of infectious diseases in the United States. He told me one day, "Udara, in (29:00) the 1980s, when I introduced at-home HIV testing in San Francisco, people literally protested in front of my home. But I said, let it die off by itself. After two weeks, people go back to their work and move on."

And he continued to push at-home HIV testing. At that time, it wasn't really at home, but there were bathhouses for MSM (men who have sex with men), so he was very focused on getting people tested at those high-risk places. Now, it's a standard of care.

After 20 years, nobody even bothers asking if at-home testing is needed or not. It’s a standard of care in sexual health. And I think that helped me at that time, very clearly.

We laid low, we continued our HeHealth work, and we published quite a lot of papers, including in Nature. But, again, there was no business arm to it. We were not selling anything aggressively.

Singaporeans cannot access most of the U.S.-based service, but there are parts that can be accessed. So we focused on those things and worked within the regulations.

The core technologies we built for (30:00) HeHealth were reviewed by two of my Singaporean doctor friends. They felt they could actually be developed further into a Singaporean primary care model. At the end of the day, infectious diseases are primary care problems.

We agreed, and we put some money into it and ran some model training. It gave good results. We showed it to some people, and they loved it.

Moving forward, the localization of AI for the Asia-Pacific is going to be a big area of growth. So what we did was use the government's openly available regulations as they specifically apply to Singaporean healthcare.

Using technologies like RAG (retrieval-augmented generation) and agentic systems, we built a model purely focused on Singaporean primary care, based on Singaporean data and MOH guidelines. It showed some amazing results from the beginning.
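
As an illustration of the retrieval step in a RAG setup like the one described, the sketch below pulls the guideline passages most relevant to a patient query and packs them into a grounded prompt. TF-IDF similarity stands in for an embedding store, and the guideline snippets are invented placeholders, not actual MOH guidance.

```python
# Simplified sketch of the retrieval step in a RAG pipeline over clinical
# guidelines: retrieve the guideline passages most relevant to a patient query
# and pack them into a grounded prompt for the language-model step.
# TF-IDF stands in for an embedding store; the snippets below are invented.
from typing import List
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

guideline_snippets = [
    "Fever in an infant under three months: refer urgently to a paediatric "
    "emergency department for assessment.",
    "Suspected genital herpes lesions: offer a PCR swab of the lesion and "
    "discuss suppressive therapy for frequent recurrences.",
    "Uncomplicated urinary tract infection in women: consider a short course "
    "of oral antibiotics and advise review if symptoms persist.",
]

vectorizer = TfidfVectorizer().fit(guideline_snippets)
snippet_vectors = vectorizer.transform(guideline_snippets)

def retrieve(query: str, k: int = 2) -> List[str]:
    """Return the k guideline snippets most similar to the query."""
    scores = cosine_similarity(vectorizer.transform([query]), snippet_vectors)[0]
    top = scores.argsort()[::-1][:k]
    return [guideline_snippets[i] for i in top]

def build_prompt(query: str) -> str:
    """Pack the retrieved guidance into a grounded prompt for the LLM."""
    context = "\n".join(f"- {s}" for s in retrieve(query))
    return ("Answer using only the guidance below.\n"
            f"Guidance:\n{context}\n\n"
            f"Patient question: {query}")

print(build_prompt("My baby has had a fever since last night, what should I do?"))
```

A production system would swap the TF-IDF index for an embedding store over the real guideline corpus and hand the grounded prompt to the model driving the agents.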

I didn’t want to raise money, but a few people were very insistent on putting their money in, and they had hospital connections. So I said, sure. And we set up a separate team, and that team is growing very (31:00) well.

We are officially coming out of stealth this year. We have a cool product launching that automates, through agents, almost 50 to 60 percent of the tasks in a primary care clinic: taking the history, helping patients gather information, preparing the notes for the physician to review, and then follow-up care, completely automated and based on Singapore guidelines.

So I’m very excited for it.

(31:23) Jeremy Au: Could you share a story about a time you’ve been brave?


Personal Reflections and Future Vision
(31:26) Yudara Kularathne: I think the biggest part is having the FTC, one of the world's biggest legal bodies, come after you with a potential $50 million fine and a jail term. I decided to represent myself without a lawyer. A lot of people felt it was crazy, but I believe in the truth.

I thought telling the truth myself, which is that I am passionate about saving the world and giving something back, was the right thing to do. Some people said I was stupid and crazy, but I was not afraid to tell the truth to the world, including the biggest legal body.

AI was there to guide (32:00) me on legal matters: how I was supposed to answer, how I was supposed to submit the legal documents in writing, and so on. After the whole investigation, on the last day, the head of the legal team involved in my inquiry asked, "Are you sure you never went to law school?" I said, "No."

They were so impressed with how I submitted the legal documents in writing and how fast I did it. We were literally turning things around daily, whereas usually they'd say, "Okay, we'll come back in two weeks." We were like, "How about we meet up tomorrow?"

One thing I did was use AI. I believe AI can do magic in both medicine and legal work. We trusted that process. Most importantly, as a Buddhist, I knew karma would always come back in the right way, and I did it.

It was brave by any measure, to me as well. Now, a lot of investors keep asking, "Do you want me to put more money in?" This is the first time I'm saying, "No, maybe not now," because we'd be positive (33:00) soon, and that gives them confidence as well.

There was a very touching moment yesterday. I was at JLABS when three people just walked up to me and said, "Look," and these are very well-known personalities in Singapore, one big investor and so on.

One of them said, "I don’t hang out on LinkedIn much because it’s all about people boasting about themselves. But you are very different. You are very candid about sharing everything very openly, whether it looks good or bad, very clearly along the journey. And you’re one of my favorite LinkedIn personalities."

I was like, wow, I didn’t realize it. It literally brought a tear to my eye. I didn’t realize people were actually reading. There were three people who walked up to me, and one person even wanted to take a picture. He said, "I tell my students, ‘This is the guy you need to follow.’"

I didn’t realize the world was listening. That was a very humbling feeling to have.

(33:48) Jeremy Au: On that note, I’d love to wrap things up. Thank you so much for sharing about your experience being a doctor who grew up in Sri Lanka and Singapore and being a pediatric (34:00) emergency specialist. I think it’s tremendous what you shared about the ups and downs, including the legal battles. Lastly, thanks for sharing your vision for the future of AI and healthcare across raw data, synthetic data, and diagnosis.

All right. Thank you so much, Yudara.

(34:16) Yudara Kularathne: Thanks, Jeremy. It’s the first time I’m sharing more details about what happened on the legal side, and I’ll share a very quick one.

We are building a system covering a whole population with government support, and it will be the first in the world: a documented population-level intervention using AI in sexual health.

We are doing it with Harvard, and I cannot tell the country at the moment, but it’s a big project. So we’ll be sharing more over time. Thanks, Jeremy.

Again, I really appreciate you. I always used to listen to Brave, and that helped me understand, look, I’m not the only guy going through this.

Now I’m returning the favor to anyone who thinks they are drowning in the startup world.

Thank you so much.