Amazon is designing its own microchips for generative AI workloads, Trainium for training and Inferentia for inference, hoping to offer an alternative to increasingly expensive and backlogged Nvidia GPUs. Rivals such as Microsoft and Google had already made advances in generative AI before Amazon's initiative. It's a gamble: custom silicon could give Amazon an edge, though Nvidia's GPUs remain the preferred choice for training models today.
Amazon's dominance in commercial cloud and its existing customer base could draw users to its generative AI offerings. AWS also has experience in the field, having developed two other AI services, HealthScribe and CodeWhisperer, and Amazon is focused on providing solutions rather than competing directly with ChatGPT. Security concerns over proprietary information have led some companies to ban ChatGPT; Amazon addresses data privacy through its Bedrock service, which lets organizations choose among foundation models from Cohere, Anthropic, and Stability AI, all of which, like OpenAI, train their models on "black-box" data. Over 100,000 customers currently use machine learning on AWS.
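To make the Bedrock model-choice point concrete, here is a minimal sketch of how an application might call a Bedrock-hosted Anthropic model. The payload-building function runs as written; the actual network call (which requires boto3, AWS credentials, and model access in your account) is shown in comments, and the specific model ID and request schema are assumptions that vary by model and version.

```python
import json

def build_claude_request(prompt, max_tokens=256):
    """Build a JSON request body in the Anthropic messages format
    used by Claude models on Bedrock (schema is an assumption here;
    each model family on Bedrock defines its own body format)."""
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })

body = build_claude_request("Summarize our internal Q3 notes.")

# With credentials configured, the invocation would look roughly like:
#   import boto3
#   client = boto3.client("bedrock-runtime", region_name="us-east-1")
#   resp = client.invoke_model(
#       modelId="anthropic.claude-3-haiku-20240307-v1:0",  # hypothetical choice
#       body=body,
#   )
#   print(json.loads(resp["body"].read())["content"][0]["text"])
```

Because the prompt and response stay inside the customer's AWS account rather than a third-party consumer service, this request path is the basis of the data-privacy argument the article mentions.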
The whytry.ai article you just read is a brief synopsis; the original article can be found here: Read the Full Article…