
    Internal Amazon sales guidelines spread doubt about OpenAI capabilities while bad-mouthing Microsoft and Google

    By Eugene Kim

    • Guidelines show that AWS wants to address customer questions about OpenAI, Microsoft, and Google.
    • AWS salespeople are instructed to question OpenAI's security and customer support in pitches.
    • AWS highlights its AI infrastructure, enterprise security, and cost efficiency over rivals.

    OpenAI lacks advanced security and customer support. It's just a research company, not an established cloud provider. The ChatGPT maker isn't focused enough on corporate customers.

    These are just some of the talking points Amazon Web Services' salespeople are told to follow when dealing with customers using or close to buying OpenAI's products, according to internal sales guidelines obtained by Business Insider.

    Other talking points from the documents include OpenAI's lack of access to third-party AI models and its weak enterprise-level contracts. AWS salespeople should dispel the hype around AI chatbots like ChatGPT and steer the conversation toward AWS's strength in running the cloud infrastructure behind popular AI services, the guidelines added.

    "For generative AI workloads, AWS will compete most often w/ Microsoft's Azure OpenAI Service, OpenAI (directly), and Google Cloud's Generative AI on Vertex AI," one of the documents said. "Move beyond the hype with AI chatbots, and focus on the [foundation models] that power them and the cloud infrastructure needed to help enterprise customers safely create, integrate, deploy, and manage their own generative AI applications using their own data."

    The guideline documents, which date from late 2023 to spring 2024, reflect Amazon's urgency to aggressively counter the growth of AI rivals, especially OpenAI. The viral success of ChatGPT put OpenAI at the forefront of the AI pack, even though Amazon has been working on the technology for years.

    The effort to criticize OpenAI is also unusual for Amazon, which often says it's so customer-obsessed that it pays little attention to competitors.

    This is the latest sign that Amazon knows it has work to do to catch up in the AI race. OpenAI, Microsoft, and Google have taken an early lead and could become the main platforms where developers build new AI products and tools.

    Though Amazon created an AGI team last year, the company's existing AI models are considered less powerful than those made by its biggest competitors. Instead, Amazon has prioritized selling AI tools like Bedrock, which gives customers access to third-party AI models. AWS also offers cloud access to in-house AI chips that compete with Nvidia graphics-processing units, with mixed results.

    AI growth

    An Amazon spokesperson told BI that AWS was the "leader in cloud," with projected revenue of more than $100 billion this year. Much of the growth has come from its new AI services, which are on pace to generate "multibillion dollars" in revenue this year, the spokesperson added. The spokesperson said AWS had announced more than twice as many AI features as its next three closest competitors combined since 2023.

    "It's still early days for generative AI, and with so many companies offering varied services, we work to equip our sales teammates with the information they need to help customers understand why AWS is the best, easiest, most performant place to build generative AI applications," the spokesperson wrote in an email. "To parse the language as anything more than that or mischaracterize our leadership position is misguided speculation."

    OpenAI's spokesperson declined to comment.

    OpenAI CEO Sam Altman.

    'Important moment'

    The documents appear to acknowledge that Amazon is playing catch-up to OpenAI. Many of AWS's customers got started on AI projects with OpenAI technology, like ChatGPT and its GPT models, because of the startup's "timing in the market, ease of use, and overall model intelligence capabilities," Amazon said in one of the guidelines.

    But now is a good time to go after those customers to convert them to AWS services, particularly Bedrock, a tool that has partnerships with AI-model providers including Anthropic, Meta, and Cohere, the document said. It also claimed that Anthropic's Claude model in particular had surpassed OpenAI's GPT models in terms of "intelligence, accuracy, speed, and cost."

    The customers most likely to migrate to AWS are the ones who are already "All In" on AWS for the majority of their cloud-computing needs but "who chose to evaluate OpenAI for their first generative AI workloads," it added.

    "This is an important moment for the field to take action on," one of the documents said. "Amazon, in partnership with various foundation model providers, has now created a stronger value proposition for customers that should not only inspire them to migrate their generative AI workloads onto AWS, but also, choose AWS for their next GenAI projects."

    Switching to AWS

    Amazon's spokesperson said some of those efforts were starting to pay off. They described four AWS customers — HUDstats, Arcanum AI, Forcura, and Experian — that initially used OpenAI's products but switched to AWS's AI services after facing "limitations with flexibility and scalability."

    "In Q2 2024, AWS had its biggest quarter-over-quarter increase in revenue since Q2 2022, and much of this growth is being fueled by customer adoption of generative AI," the spokesperson said. "Ultimately, customers are choosing AWS because we continue to be the significant leader in operational excellence, security, reliability, and the overall breadth and depth of our services."

    Microsoft and Google

    It's not just OpenAI that AWS is going after. The sales guidelines also outline how AWS sales reps should respond to customers' questions about Microsoft and Google.

    Satya Nadella, Microsoft's CEO.

    If a customer talks about Microsoft's and Google's AI infrastructure and chips, AWS salespeople should say Amazon has more than five years of experience investing in its own silicon processors, including its AI chips, Trainium and Inferentia, the documents advised.

    The guidelines also highlight AWS's cost and energy efficiency compared with competing products and note the limited availability of Microsoft's Maia AI chip. One of the documents also points to the limited number of foundation models Google offers.

    "We're flattered they're worried about us, but fiction doesn't become fact just because it's in talking points," a Google spokesperson, Atle Erlingsson, told BI. "Not only do we offer more than 150 first, third and open-source models via Vertex AI, our AI infrastructure offers best overall performance, best cost performance, as well as uptime and security."

    Microsoft's spokesperson declined to comment.

    'Cut through the hype'

    For customers who say Microsoft and OpenAI are at the "cutting edge" of generative AI, AWS wants its salespeople to "cut through the hype" and ensure customers understand that AWS has solutions "across the entire stack" of generative-AI technology, from the underlying infrastructure to the AI applications used by end customers, the documents said.

    In situations where Microsoft pitches its AI-powered analytics software, Fabric, to customers, AWS salespeople are instructed to say, "Microsoft Fabric is a new (unproven) offering." The guidelines say Fabric doesn't offer many integration points with Azure's generative-AI services and that AWS's analytics services "offer superior functionality" across diverse workloads.

    Microsoft has said 67% of Fortune 500 companies use Fabric.

    'Misleading FUD'

    The documents also share AWS's "value propositions" that salespeople should emphasize during pitches. These include AWS's ease of use, its "enterprise-grade security and privacy," and the ability to customize AI models using the customer's own data. The documents also stress AWS's price efficiency and the broad set of AI chips it offers, as well as its own AI-powered applications, like Amazon Q.

    One of the documents said customers typically consider nine criteria before choosing an AI model and service provider: customization, personalization, accuracy, security, monitoring, cost, ease of use, responsible AI, and innovation.

    Despite the competitive tone of the guidelines, AWS also tells salespeople to use caution and clarity when discussing the data its rivals use for model training. OpenAI, for example, has said it may use customer data to train the consumer version of ChatGPT but not the business data shared through its enterprise product.

    "The APIs and the Enterprise chatbots from Microsoft, Google, and OpenAI all declare product terms specifying that customer data is not used for model training," one of the documents said. "Be careful to not use misleading FUD (Fear, Uncertainty, Doubt) by conflating competitors' enterprise solutions with consumer services."

    Do you work at Amazon? Got a tip?

    Contact the reporter, Eugene Kim, via the encrypted-messaging apps Signal or Telegram ( +1-650-942-3061 ) or email ( ekim@businessinsider.com ). Reach out using a nonwork device. Check out Business Insider's source guide for other tips on sharing information securely.
