
    Microsoft Copilot lead says AI chatbots must 'learn' to ask for help to reduce hallucination episodes: "The thing that's really missing today is that a model doesn't raise its hands and say 'Hey, I'm not sure, I need help'"

    By Kevin Okemwa

    6 days ago


    What you need to know

    • Microsoft's VP and lead of its generative AI division says AI chatbots need to learn how to communicate when they're unsure of the responses they generate.
    • The executive says this would save money and significantly reduce instances of hallucinated information being mistaken for accurate answers.
    • Copilot helps boost productivity in the workplace by taking over mundane tasks while professionals focus on learning new skills and honing their craft.

    The main challenges holding back advances in the generative AI landscape include privacy, security, and accuracy. AI-powered tools like ChatGPT and Copilot have been spotted generating wrong responses and spreading misinformation about elections. Experts and researchers indicate the issue is systemic.

    The issue doesn't seem to be going away any time soon. While speaking to AFP, Microsoft VP Vik Singh indicated:

    "Just to be really frank, the thing that's really missing today is that a model doesn't raise its hands and say 'Hey, I'm not sure, I need help.'"

    Instances of AI tools hallucinating are nowhere near disappearing over the horizon, and meanwhile Microsoft and OpenAI seem constantly trapped in the corridors of justice, fighting multiple copyright infringement lawsuits.

    OpenAI CEO Sam Altman admits it's impossible to develop AI tools like ChatGPT without copyrighted content. Altman says copyright law doesn't bar training AI models on copyrighted content, branding the practice as "fair use."

    Despite broad access to virtually unlimited content, AI tools occasionally go off the rails and generate inaccurate responses to queries. According to Singh, “real smart people” are looking for a way to let a chatbot admit “when it doesn’t know the right answer and to ask for help.”

    Singh says a chatbot that asks humans for assistance 50% of the time is still much more useful because of the money it saves. For context, "every time a new request comes in, they spend $8 to have a customer service rep answer it, so there are real savings to be had, and it's also a better experience for the customer because they get a faster response."
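    To put that figure in perspective, here is a rough back-of-the-envelope sketch. The $8 cost per human-handled request and the 50% escalation rate come from the article; the monthly volume of 10,000 requests is a purely hypothetical assumption for illustration:

        savings ≈ requests × share handled without a human × cost per human-handled request
                ≈ 10,000 × 0.5 × $8
                ≈ $40,000 per month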

    Copilot helps save money and boost creativity


    Dedicated Copilot key on the ASUS Zenbook S 16 (Image credit: Ben Wilson | Windows Central)

    Investors in the AI landscape recently flagged concerns over Microsoft's exorbitant spending on AI projects, citing the difficulty of establishing a clear path to profitability.

    Generative AI tools like Copilot are designed to boost employees' productivity and give them time to focus on tasks and learn new skills that would ordinarily be neglected. They are also expected to generate revenue and profit.

    For instance, Lumen, a telecom company, saves up to $50 million annually by integrating Copilot into its workflow. The tool takes over mundane tasks such as research, giving employees ample time to call and interact with clients.

    The tech giant plans to integrate Copilot across its tech stack, making it more autonomous. But what does this mean for job security in the targeted fields? Singh says the tech isn't designed to replace professionals in the workplace. Instead, it will free up time for them to be more creative and even open up new opportunities.
