Warren Buffett once said, “Never invest in a business you cannot understand.” However, as the “Oracle of Omaha” era is about to come to a close, Buffett has made a decision that goes against the “family rule”: to buy Google stock at a high premium of about 40 times its free cash flow.
Yes, for the first time Buffett has bought an “AI stock,” and it is neither OpenAI nor Nvidia. Investors are all asking the same question: why Google?
Rewind to the end of 2022. ChatGPT had just burst onto the scene, and Google's top executives sounded a “code red.” They held meeting after meeting and even urgently called back the two co-founders. But at the time, Google looked like a sluggish, bureaucratic dinosaur.
It hurriedly launched the chatbot Bard, which made a factual error during its demo, sending the company's stock plummeting and wiping roughly $100 billion off its market value in a single day. Google subsequently consolidated its AI teams and launched the multimodal Gemini 1.5.
Yet this supposed trump card sparked only a few hours of buzz in tech circles before being eclipsed by OpenAI's video generation model Sora, released soon after, and was quickly forgotten.
Somewhat awkwardly, it was Google's own researchers who, back in 2017, published the groundbreaking paper that laid the theoretical foundation for this wave of the AI revolution.
The paper “Attention Is All You Need,” which proposed the Transformer model
Rivals mocked Google. OpenAI CEO Sam Altman sneered at Google's taste: “I can't help but think about the aesthetic differences between OpenAI and Google.”
Former Google CEO Eric Schmidt likewise criticized the company's complacency: “Google has long believed that work-life balance… is more important than winning the competition.”
This series of setbacks fueled doubts about whether Google had fallen behind in the AI race.
But change has finally come. In November, Google launched Gemini 3, which surpassed competitors including OpenAI on most benchmark metrics. More importantly, Gemini 3 was trained entirely on Google's self-developed TPU chips, which Google now positions as a low-cost alternative to Nvidia GPUs and is officially selling to external customers.
Google is now flexing its muscles on two fronts: in software, the Gemini 3 series answers OpenAI directly; in hardware, the TPU challenges Nvidia's long-standing dominance.
Kick OpenAI, punch NVIDIA.
Altman felt the pressure as early as last month. In an internal memo, he wrote that Google “might bring some temporary economic headwinds to our company.” This week, on reports that a major company was buying TPU chips, Nvidia's stock fell as much as 7% intraday, and the company had to publish a statement of its own to calm the market.
Google CEO Sundar Pichai said in a recent podcast that Google employees could finally catch up on sleep: “From the outside we may have seemed quiet, or behind, during that period, but in reality we were solidifying all the foundational components and pushing forward hard on top of them.”
The situation has now reversed. Pichai said, “We have now reached a turning point.”
Meanwhile, ChatGPT is celebrating its third anniversary. In those three years, AI has fueled a feast of Silicon Valley capital and alliances; beneath the feast, though, worries about a bubble have surfaced. Has the industry reached a turning point?
Overtaking
On November 19, Google released its latest artificial intelligence model, Gemini 3.
Benchmark results show that across most tests covering expert knowledge, logical reasoning, mathematics, and image recognition, Gemini 3 significantly outperformed the latest models from rivals, including OpenAI. It trailed only slightly in the single programming benchmark, ranking second.
The Wall Street Journal dubbed it “America's Next Top Model.” Bloomberg wrote that Google had finally woken up. Musk and Altman both praised it. Some netizens joked that this is the GPT-5 Altman had dreamed of.
Aaron Levie, CEO of the cloud content management platform Box, said that after previewing Gemini 3, the performance jump was so large that his team initially doubted their own evaluation methods. Repeated tests confirmed that the model beat every internal benchmark by double-digit margins.
Salesforce CEO Marc Benioff said he had used ChatGPT for three years, but Gemini 3 upended his assumptions in just two hours: “Holy shit… there's no going back. This is simply a qualitative leap: reasoning, speed, image and video processing… everything is sharper and faster. It feels like the world has turned upside down once again.”
Gemini 3
Why does Gemini 3 perform so well, and what did Google do right?
The Gemini project lead posted, “In short: improved pre-training and post-training.” Some analysts suggest the model's pre-training still follows the logic of scaling laws: by scaling up pre-training (larger datasets, more efficient training methods, more parameters, and so on), a model's capabilities keep improving.
The person most eager to learn Gemini 3's secrets may be Altman himself.
Last month, ahead of Gemini 3's release, he warned OpenAI employees in an internal memo that “Google's recent work is outstanding from any perspective,” particularly in pre-training; that Google's progress might create “some temporary economic headwinds” for OpenAI; and that “the atmosphere from the outside will be relatively severe for a period of time.”
Although ChatGPT still has a significant advantage over Gemini in terms of user volume, the gap is narrowing.
In the past three years, ChatGPT's user base has grown rapidly: its weekly active users reached 400 million in February of this year and surged to 800 million by this month. Gemini, for its part, reports monthly active users: 450 million in July, rising to 650 million this month.
With roughly 90% of the global search market, Google naturally controls the core distribution channels for its AI models, giving it direct access to a vast user base.
OpenAI is currently valued at $500 billion, making it the highest-valued startup in the world. It is also one of the fastest-growing companies in history, with revenue skyrocketing from nearly $0 in 2022 to an estimated $13 billion this year. However, it also expects to burn over $100 billion in the coming years in pursuit of artificial general intelligence, while needing hundreds of billions more for server rentals. In other words, it still needs to raise money.
Google has an undeniable advantage: a thicker wallet.
Google's latest quarterly financial report shows that its revenue has surpassed $100 billion for the first time, reaching $102.3 billion, a year-on-year increase of 16%, with a profit of $35 billion, up 33% year-on-year. The company's free cash flow is $73 billion, and capital expenditures related to AI are expected to reach $90 billion this year.
It doesn't have to worry about its search business being eroded by AI for the time being, as its search and advertising still show double-digit growth. Its cloud business is booming, and even OpenAI rents its servers.
Beyond self-sustaining cash flow, Google has resources OpenAI cannot match: vast stores of ready-made data for training and optimizing models, and its own computing infrastructure.
On November 14, Google announced a $40 billion investment to build new data centers in Texas.
OpenAI, adept at deal-making, has signed computing-power agreements worth more than $1 trillion with various partners. So as Google closes in with Gemini, investors' doubts intensify: can the growth story OpenAI has been selling actually cover those commitments?
The Crack
A month ago, Nvidia's market value surpassed $5 trillion as the market's passion for artificial intelligence pushed this “AI arms dealer” to new heights. But Gemini 3 runs on Google's TPU chips, and that has opened a crack in Nvidia's fortress.
The Economist, citing data from the investment research firm Bernstein, notes that Nvidia's GPUs account for more than two-thirds of the total cost of a typical AI server rack. Google's TPUs, by contrast, are priced at only 10% to 50% of comparable Nvidia chips. Those savings add up. The investment bank Jefferies estimates that Google will produce about 3 million of these chips next year, nearly half of Nvidia's output.
Last month, the prominent AI startup Anthropic agreed to adopt Google's TPU chips at scale, in a deal reportedly worth tens of billions of dollars. A November 25 report said tech giant Meta is also in talks to deploy TPUs in its data centers in 2027, in a deal likewise worth billions.
Google CEO Sundar Pichai introduces TPU chips
Silicon Valley's other internet giants are also betting on chips, whether through in-house development or partnerships with chipmakers, but none has progressed as far as Google.
The TPU's history goes back more than a decade, when Google began developing a dedicated accelerator for internal use to speed up Search, Maps, and Translate. In 2018, it began selling TPU capacity to cloud computing customers.
Since then, TPU has also been used to support Google's internal AI development. During the development of models like Gemini, the AI team interacted with the chip team: the former provided actual needs and feedback, while the latter customized and optimized the TPU accordingly, which in turn improved AI development efficiency.
Nvidia currently holds over 90% of the AI chip market. Its GPUs were originally designed to render realistic game graphics, using thousands of cores to process tasks in parallel, and that same parallel architecture put them far ahead for AI workloads.
Google's TPU is an application-specific integrated circuit (ASIC): a “specialist” designed for particular computing tasks. It trades away some flexibility and generality for higher energy efficiency. Nvidia's GPUs, by contrast, are “generalists”: flexible and highly programmable, but at a higher cost.
At this stage, however, no company, Google included, can fully replace Nvidia. Although the TPU is now in its seventh generation, Google itself remains a major Nvidia customer. One obvious reason: Google's cloud business serves thousands of customers worldwide, and offering GPU computing power keeps it attractive to those clients.
Even companies that purchase TPUs have to embrace Nvidia. Shortly after Anthropic announced its collaboration with Google TPU, it also announced a significant deal with Nvidia.
The Wall Street Journal reported that “investors, analysts, and data center operators say Google's TPU is one of the biggest threats to NVIDIA's dominance in the AI computing market, but to challenge NVIDIA, Google must begin selling these chips more broadly to external customers.”
Google's AI chip has become one of the few credible alternatives to Nvidia's, which directly pressured Nvidia's share price. Nvidia later posted a statement to calm the panic the TPU had caused: it said it was pleased by “Google's success,” but stressed that Nvidia remains a generation ahead of the industry and that its hardware is more versatile than TPUs and other task-specific chips.
Nvidia is also under pressure from bubble concerns, as investors fear the massive capital spending does not match the profit outlook. Sentiment is volatile: the market worries both that Nvidia's business will be taken away and that AI chips may stop selling.
Michael Burry, the famed American short seller, says he has bet more than $1 billion against tech companies including Nvidia. He made his name shorting the U.S. housing market ahead of the 2008 crisis, a story later adapted into the acclaimed film “The Big Short.” He argues that today's AI frenzy resembles the dot-com bubble of the early 2000s.
Michael Burry
Nvidia distributed a seven-page document to analysts, rebutting the criticisms from Burry and others. However, the document did not quell the controversy.
The Model
Google is enjoying a sweet spot, its stock rising against the grain of AI-bubble jitters. Buffett's company bought shares in the third quarter, Gemini 3 won positive reviews, and the TPU has investors excited, all propelling Google to new heights.
In the past month, AI concept stocks such as Nvidia and Microsoft have fallen by more than 10%, while Google's stock price has risen by about 16%. Currently, it has a market capitalization of $3.86 trillion, ranking third in the world, behind only Nvidia and Apple.
Analysts describe Google's AI model as vertical integration.
As a rare “full-stack, self-built” player in tech, Google controls the entire chain: self-designed TPU chips deployed on Google Cloud, its own large AI models trained on them, and seamless integration into core businesses like Search and YouTube. The advantages are obvious: no dependence on Nvidia, and efficient, low-cost computing sovereignty.
Another model is the more common loose alliance model. The giants each play their part: Nvidia is responsible for GPUs, while companies like OpenAI and Anthropic focus on developing AI models. Cloud giants like Microsoft purchase GPUs from chip manufacturers to host the models of these AI laboratories. In this network, there are no absolute allies or opponents: when collaboration is possible, they work together for mutual benefit, and when it's time to compete, they do not hold back.
Players have formed a “circular structure” where funds circulate in a closed loop among a few tech giants.
Broadly, the circular financing works like this: Company A first pays Company B a sum of money (as investment, loan, or lease), and Company B then uses that money to buy products or services from Company A. Without this “seed money,” B might not be able to afford to buy at all.
One example: OpenAI committed $300 billion to buy computing power from Oracle; Oracle spent billions buying Nvidia chips to build data centers; and Nvidia in turn pledged to invest up to $100 billion back into OpenAI, on the condition that OpenAI keeps using its chips. (OpenAI pays Oracle $300 billion → Oracle uses the money to buy Nvidia chips → Nvidia invests the proceeds back into OpenAI.)
Such deals have produced a maze-like map of capital flows. In an October 8 report, Morgan Stanley analysts charted the capital flows of the Silicon Valley AI ecosystem in a single diagram, warning that the opacity makes it hard for investors to untangle the real risks and returns.
The Wall Street Journal remarked of the chart, “The arrows connecting them are as intricate as a plate of spaghetti.”
Propelled by capital, the outline of something enormous is taking shape, though no one yet knows its true form. Some are panicking; others are pleasantly surprised.
No wonder Buffett finally placed a bet on Google.
Author: Ma Leilei
Source: Wu Xiaobo Channel CHANNELWU