Jamie Chatbot's Sweet Slip‑Up on MOH Site: Netizens Laugh Over Safe Sex Advice for Covid‑19 Patients

Chatbot Confusion on the Ministry of Health Site Makes Netizens LOL

Over the weekend, an awkward glitch in the Singapore Ministry of Health's virtual assistant, Ask Jamie, drew chuckles across the internet after the bot slipped into a bizarre mix‑up: dispensing family‑planning advice to people asking about Covid‑19.

What Went Wrong?

When a user typed, “My daughter is tested Covid‑19 positive what should I do?”, the chatbot responded with entirely unrelated sexual‑health advice:

  • “You should practise safe sex through the correct and consistent use of condoms, or abstinence, for at least the whole duration of your female partner’s pregnancy.”
  • Variations of the question using “son” instead of “daughter” produced the same misguided response.

In contrast, phrasing the question slightly differently prompted the correct answer: stay calm, stay at home, and follow standard Covid‑19 protocols.

Why It Happened

Ask Jamie is driven by natural‑language‑processing software that parses user queries and retrieves the nearest matching answer from government databases. Getting that matching right requires a painstaking “training” phase in which human reviewers map thousands of possible question variations to their correct answers.
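The retrieval step is easy to sketch. The toy Python matcher below is a minimal illustration, not the actual Ask Jamie pipeline (which is not public): the intents, question variations and answers are invented, and a simple bag‑of‑words cosine similarity stands in for whatever model the real system uses. It returns the canned answer whose trained question variation sits nearest to the user's query:

```python
import math
import re
from collections import Counter

# Toy FAQ index: each intent maps trained question variations to one
# canned answer. All entries here are invented for illustration.
FAQ = {
    "covid_positive": {
        "questions": ["what should I do if I test positive for Covid-19"],
        "answer": "Stay calm, stay at home and follow the Covid-19 protocols.",
    },
    "safe_sex_advisory": {
        "questions": ["my partner is tested positive, what should I do"],
        "answer": "Practise safe sex through the correct and consistent use of condoms.",
    },
}

def bow(text: str) -> Counter:
    """Bag-of-words vector: lowercase word tokens and their counts."""
    return Counter(re.findall(r"[a-z0-9\-]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(count * b[token] for token, count in a.items())
    norm = math.sqrt(sum(c * c for c in a.values())) * math.sqrt(sum(c * c for c in b.values()))
    return dot / norm if norm else 0.0

def answer(query: str) -> str:
    """Return the canned answer of the intent nearest to the query."""
    q = bow(query)
    _, best_intent = max(
        (cosine(q, bow(variation)), intent)
        for intent, entry in FAQ.items()
        for variation in entry["questions"]
    )
    return FAQ[best_intent]["answer"]
```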

“You have questions that look semantically the same but should lead you to two different answers,” explained Mr Jude Tan, chief commercial officer at INTNT.ai. “That confusion is a sin in chatbot design—especially if the model isn’t trained to handle subtle semantic differences.”

Some users suggested that the chatbot latched onto keywords like “daughter” or “positive” and matched them against a sexual‑health entry, producing a “false positive.” Mr Tan estimates that an untrained bot can make such an error in about one third of its replies, and the larger the set of possible responses, the higher the chance of a slip‑up.
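Running the toy matcher above against the weekend's query reproduces exactly this failure mode: “tested” and “positive” overlap more heavily with the invented sexual‑health entry than with the Covid‑19 one, and a single extra trained variation is all it takes to reroute the query (again, the data is illustrative, not MOH's):

```python
query = "My daughter is tested Covid-19 positive, what should I do?"

# With only the variations trained above, "tested" and "positive"
# overlap more with the safe-sex entry, so the query is misrouted.
print(answer(query))
# -> Practise safe sex through the correct and consistent use of condoms.

# One extra trained variation is enough to reroute it correctly.
FAQ["covid_positive"]["questions"].append(
    "my child is tested Covid-19 positive, what should I do"
)
print(answer(query))
# -> Stay calm, stay at home and follow the Covid-19 protocols.
```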

When a chatbot gets even a couple of questions wrong, “it can seriously dent a company’s reputation.” Mr Tan added that some companies have launched bots only to shut them down a few years later amid public backlash.

Current Status and Spread

Ask Jamie was temporarily removed from the Ministry of Health’s website on Monday, October 4. However, it remains active on over 70 other government agency sites, delivering answers tailored to each host’s services.

The Straits Times has reached out to the Ministry for comment, and the bot’s presence on so many other sites raises the question of whether similar glitches could surface elsewhere.

Bottom Line

Even the most polished tech can slip up—especially when it tries to juggle the intricacies of human language. Ask Jamie’s flub teaches a valuable lesson: rigorous training and continuous oversight are essential to keep a chatbot helpful and humour‑free.