ChatGPT will keep 'hallucinating' wrong answers for years to come and won't take off until it's on your cellphone, Morgan Stanley says

The AI language bot sometimes "hallucinates," meaning it generates responses that seem convincing but are actually wrong, Morgan Stanley said.