Analysts will be watching for the Santa Clara, California-based
company to give more details about how it plans to widen access
to processing power similar to that used to develop fast-rising
technologies such as the chatbot ChatGPT.
Last month, Huang told investors the company would launch its
own cloud computing service to offer more readily available
access to large systems built with its chips.
At the Tuesday conference, he will discuss "what's coming next"
in AI, the company said on its website.
Nvidia has come to dominate the market for chips used to develop
generative AI technologies, which can answer questions with
human-like text or generate fresh images based on a text prompt.
Those new technologies rely on the use of thousands of Nvidia
chips at once to train the AI systems on huge troves of data.
Microsoft Corp, for example, built a system with more than
10,000 Nvidia chips for startup company OpenAI to use in
developing the technologies that underpin its wildly popular
ChatGPT.
While Nvidia faces competition in the AI chip market from
Advanced Micro Devices and several startup companies, the
company has more than 80% of the market for chips used in
training AI systems.
The boom in AI has helped drive Nvidia shares up 77% this year,
compared with a rise of 11.5% in the Nasdaq Composite Index.
With a market capitalization of $640 billion, Nvidia has grown
to become about five times more valuable than longtime rival
Intel Corp.
(Reporting by Stephen Nellis in San Francisco; Editing by
Bradley Perrett)
[© 2023 Thomson Reuters. All rights
reserved.]
This material may not be published,
broadcast, rewritten or redistributed.