When ChatGPT launched in November 2022, its ability to mimic humans felt astonishing — and, frankly, a little scary.
Nearly two years later, both the technology and our relationship with it have progressed rapidly.
Several recent studies show just how much humans already trust chatbots:
- One in six adults (17%) turn to chatbots for medical advice at least once a month, according to a survey from KFF, a health policy nonprofit. For adults under 30, that rose to 25%.
- Another study found that patients actually preferred answers written by chatbots over those from physicians.
- Health care professionals aren't out of jobs yet, though: most adults (56%) said they aren't confident they can differentiate between true and false claims from AI chatbots.
- DebunkBot, a chatbot designed by researchers, was able to talk users out of conspiracy theories they believed in.
And people have increasingly been turning to ChatGPT for mental health support and therapy.
This makes sense, given that the average cost of psychotherapy in the US is $100-$200 per session, and in 2022, the average wait time for a new-patient doctor's appointment was 26 days.
If these use cases seem risky, you'd better hold on. AI is gearing up to take over some even higher-stakes matters:
- It’s cracking into aviation, with several companies planning to use AI for air traffic control systems.
- AI is helping to diagnose and predict disease, and could one day assist surgeons in the operating room.
Already, we’re letting it drive our cars, do our work, and play with our kids.
But where do Americans seem to draw the line? Politics. Two-thirds say they are not confident in AI-powered election information.