9 November, 2025
Australians self-diagnose mental health issues, experts warn

Australians are increasingly taking to social media and artificial intelligence platforms to self-diagnose complex mental health conditions. Experts are expressing concern that this trend is leading to misdiagnosis, delayed treatment, and worsening symptoms. Clinical psychologist Professor Jill Newby from UNSW Sydney highlights a growing phenomenon in which individuals identify symptoms of conditions such as ADHD and OCD after encountering relatable online content.

Professor Newby states, “When you’re looking at information online or watching videos, you might not even be searching for that information. But it’s relatable—you can empathise with that person or see similarities in your own experience—and that can lead you down a rabbit hole of self-diagnosis.” The language surrounding diagnoses has become commonplace, which may reduce stigma but also blurs the line between normal experiences and mental illnesses.

The issue is particularly pronounced with OCD, as many of its symptoms, such as intrusive thoughts, are common to the general population. Professor Newby notes, “Around 90 percent of people have intrusive thoughts. The difference between that and OCD is the severity and impact of those experiences.” This trend raises the risk of pathologising normal emotional responses, as individuals often convince themselves they have a disorder based solely on online descriptions.

Algorithms on social media platforms exacerbate this issue by continuously presenting users with content that reinforces their beliefs. “Access to social media increases risk because information gets sent to you that reinforces what you’re worrying about,” Professor Newby explains. If someone suspects they have OCD, they are likely to receive more content supporting that belief, leading to further self-diagnosis.

The Australian Medical Association (AMA) is also addressing an emerging concern: the proliferation of fake medical professionals across social media. Dr. Danielle McMullen, AMA president, warns that these fabricated profiles can erode public trust in legitimate health professionals. “Profiles like this are really dangerous because the community doesn’t know what they can or can’t trust online when things are purporting to be from a health professional but they really aren’t,” she said.

Some of these deceptive profiles have promoted medications falsely claimed to be approved by Australian regulators, while others collect personal data under the guise of offering health advice. Professor Newby underscores that the spread of misinformation highlights the blurred lines of the digital health landscape. “People are being fed content that looks authoritative, often dressed up to sound medical, but it’s not always factual or balanced,” she states.

Research from the British Association for Counselling and Psychotherapy (BACP) reveals that nearly one in 20 people are comfortable discussing their mental health with an AI chatbot, while one in ten seek support from social media influencers. BACP therapist Kate Bufton cautions that, although these platforms can provide psychoeducation, they lack the depth of a two-way interaction with a registered therapist. “Influencers talking to a general audience often have to work in broader strokes due to the time-limited nature of the content and the design of the platforms. Nuance and the complexity of the individual and their experiences can often get lost.”

Interestingly, reports indicate that some patients have brought AI-generated transcripts to therapy sessions to challenge their therapists, complicating the therapeutic process. Following a tragic case involving Adam Raine, a 16-year-old American who took his own life after interacting with a chatbot, the makers of ChatGPT announced plans in August 2025 to modify how the technology responds to users expressing mental and emotional distress.

Professor Newby emphasizes that Australia is not immune to these issues. “We’re seeing people arrive to therapy having already diagnosed themselves—and sometimes they argue with their doctor, saying, ‘Well, this is what the internet told me.’ That impacts the trust people have in health professional advice.”

She expresses a desire for a shift in behaviour, advocating that individuals seek professional support promptly after noticing symptoms rather than delaying. “If they’re waiting a long time before accessing help because of whatever they’ve seen on the internet, that could make their symptoms a lot worse and harder to treat.”

To combat this trend, Professor Newby proposes that technology could be harnessed to provide accurate information rather than reinforcing fears. “Wouldn’t it be amazing if someone searching for OCD got fed evidence-based information from ‘OCD Australia’, if such a place existed?” she asks. She envisions a system where trusted, factual information is presented alongside search results related to mental health issues.

Moreover, she advocates for social media platforms to allow users to reset their algorithms, breaking the cycle of receiving repetitive content. “If you’re really worrying about OCD, you should be able to reset your algorithm to start again—so it’s not just continually feeding that information to you,” she suggests.

Ultimately, Professor Newby believes that digital tools can still play a constructive role if users learn to critically evaluate the information they consume. “There’s a role for everyone to be mental-health literate. To ask: is this accurate, is it factual, are there other perspectives I should read in addition to this?” By fostering critical thinking and promoting access to trustworthy resources, it may be possible to navigate the complexities of mental health in the digital age more effectively.