Welcome to FIA News & Insights, a one-stop resource featuring insights from senior investment professionals on timely market events, along with their views on the economy and their respective markets. Find the latest media coverage of Frost Investment Advisors, LLC, the most recent reprints, as well as archival information for your reference.
The emergence of consumer and commercial applications of artificial intelligence has been decades in the making. As early as the 1940s, Alan Turing investigated the theoretical possibility of creating machines able to exhibit intelligence. In 1997, IBM’s Deep Blue supercomputer beat world champion Garry Kasparov at chess, marking an important milestone in the early history of AI. The explosion of consumer and investor interest in the technology was triggered by the release of a then little-known generative AI application, ChatGPT, in late 2022. ChatGPT, developed by OpenAI in collaboration with Microsoft, provides conversational, human-like responses to user questions. Not only does ChatGPT formulate text responses to people’s queries (something called generative writing), it also creates images, videos, music, computer code and other human-like output that exhibits predictable patterns. These generative AI models are trained to understand human language, making it easy for people to interact with them: users simply prompt ChatGPT with a few words and get back near-human responses. These new generative capabilities, along with a user-friendly interface, contributed to ChatGPT’s rapid early adoption. With widespread news coverage, it took only five days for ChatGPT to surpass one million users, and within two months it became the fastest consumer application to reach the 100 million user mark. TikTok and Instagram took nine months and 30 months, respectively, to reach 100 million users.
Source: Goldman Sachs GIS
Generative AI models require three elements: data, compute and a model. A very large, structured dataset is critical to train the model. The basis can be a spoken language, a computer programming language, protein sequences, or even images; anything that reflects predictable patterns will work as a data source. Advanced algorithms, like those written by the data scientists at OpenAI, serve as the foundation for the creation of a large language model (LLM). These algorithms are then trained on the desired dataset, a process that can require tens of thousands of advanced semiconductors called Graphics Processing Units (GPUs). GPUs can cost as much as $30,000 per processor, requiring massive capital investment and limiting participation to the largest players in the market. Once trained, the LLM has learned the patterns of almost anything with structure and, when prompted, can respond to user queries.
Source: Center for Research on Foundation Models (CRFM), Stanford University Institute for Human-Centered Artificial Intelligence
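For readers curious how the data-model-prompt loop fits together, here is a deliberately tiny sketch. It is not a real LLM (there are no GPUs or neural networks here, just a word-frequency table), but it illustrates the same three elements the paragraph above describes: a structured dataset, a training step that learns its patterns, and an inference step that answers a prompt.

```python
from collections import defaultdict

# Toy illustration (NOT a real LLM): a bigram "language model" trained by
# counting which word follows which in a tiny dataset. Real LLMs learn the
# same kind of next-token statistics, but with billions of parameters on
# GPU clusters rather than a simple frequency table.

def train(corpus):
    """The 'training' step: count next-word frequencies in the data."""
    counts = defaultdict(lambda: defaultdict(int))
    for sentence in corpus:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            counts[prev][nxt] += 1
    return counts

def predict(model, prompt_word):
    """The 'inference' step: given a prompt, return the likeliest next word."""
    followers = model.get(prompt_word.lower())
    if not followers:
        return None
    return max(followers, key=followers.get)

# 1) Data: a (tiny) structured dataset with predictable patterns.
data = [
    "generative models learn patterns",
    "generative models create text",
    "models learn patterns in data",
]

# 2) + 3) Model and compute: train on the data, then prompt the result.
model = train(data)
print(predict(model, "models"))  # "learn" follows "models" most often
```

Scaling this idea from a frequency table to a transformer network trained on much of the internet is, in essence, what makes modern generative AI so compute-hungry.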
Exponential advances in generative AI models are improving their sophistication, and they now often outperform humans on a wide variety of cognitive tasks. The extraordinary advancements in language understanding and reading comprehension were enabled in part by a new technology called transformers, developed by Google in 2017. Previously, text was processed sequentially, generally in the order the words appeared, but important relationships often exist between words even when they don’t appear next to each other in a sentence. Transformers enabled all the words in a body of text to be analyzed at the same time rather than in sequence. This major breakthrough allowed LLMs to understand the complex interplay between words (or between pixels in an image, or the parts of a protein) regardless of where they appear in a sequence. The result was a much-improved understanding of the text these models read and write, which in turn allowed them to be trained on much larger datasets. Currently, the most popular large language models are transformer-based.
Source: Douwe Kiela, Max Bartolo, Yixin Nie et al., “Dynabench: Rethinking Benchmarking in NLP,”; J.P. Morgan Asset Mgmt
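The core attention idea behind transformers can be sketched in a few lines. This is an illustrative toy, not Google's actual architecture (real transformers add learned projections, multiple heads and many layers), but it shows the key property described above: every word is compared against every other word at once, so relationships are captured no matter how far apart the words sit.

```python
import math

# Minimal sketch of the attention mechanism at the heart of transformers.
# Each word is represented by a small vector of numbers ("embedding").

def softmax(scores):
    """Turn raw similarity scores into weights that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention(vectors):
    """For each word, blend ALL word vectors, weighted by similarity."""
    output = []
    for query in vectors:
        # Compare this word with every word in the sentence simultaneously,
        # not just its neighbors — distance in the sentence is irrelevant.
        scores = [sum(q * k for q, k in zip(query, key)) for key in vectors]
        weights = softmax(scores)
        # Weighted mix of all word vectors; far-apart words contribute too.
        blended = [sum(w * v[i] for w, v in zip(weights, vectors))
                   for i in range(len(query))]
        output.append(blended)
    return output

# Toy 2-D "embeddings" for a 3-word sentence.
sentence = [[1.0, 0.0], [0.0, 1.0], [1.0, 0.1]]
mixed = attention(sentence)
print(len(mixed), len(mixed[0]))  # 3 words in, 3 context-blended vectors out
```

Because every word attends to every other word in one pass, the computation parallelizes well, which is precisely why this design allowed training on much larger datasets.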
Generative AI produces responses that are articulate and often indistinguishable from what a well-informed person would provide, as illustrated in the comparison below between ChatGPT and earlier tools, represented by Siri and Google. In the example, the voice assistants and ChatGPT were prompted with the same question. Siri and Google (on the left below) respond to a complicated question by listing relevant websites where you can research the answer on your own. While helpful with easy questions, they fail to provide direct answers to a complex prompt. On the right is ChatGPT, built on generative AI with transformer technology. This LLM better interprets the prompt, determines the relevant content needed and produces a more complete response, condensing the time needed to research and answer to a few seconds.
Source: Siri, Google, ChatGPT, Morgan Stanley Research
Generative AI is already being implemented across several different industries, with most LLMs currently targeting text-based applications, but the range of potential applications continues to broaden. The following are some of the early use cases.
Already, knowledge workers and students are using generative AI copilots to generate documents, complete term papers and summarize meetings or e-mails. In the legal field, generative AI is being used to help analyze contracts, as well as draft and summarize documents. Content creators are generating images and videos with incredible detail. Software developers are using assistants to help write new code and to debug and fix existing code. Customer service agents are being supported by AI assistants that help improve resolution rates, increasing customer satisfaction.
Source: NVIDIA internal analysis, Goldman Sachs, Cowen, Statista, Capital One, Wall Street Journal, Resource Watch
Prompt: Several giant wooly mammoths approach treading through a snowy meadow, their long, wooly fur lightly blows in the wind as they walk, snow-covered trees and dramatic snowcapped mountains in the distance. midafternoon light with wispy clouds and a sun high in the distance creates a warm glow, the low camera view is stunning, capturing the large furry mammal with beautiful photography and depth of field.
Source: OpenAI
The most important contribution generative AI offers may be improved quality of life and longevity as it is deployed in the healthcare industry. New processes have the potential to change the development of new drugs, where the current discovery process is inefficient, with excessive costs and a high failure rate. AlphaFold, an algorithm developed by scientists at DeepMind, has already predicted the structure of all proteins in the human body based on their amino acid sequences. Generative AI models can use this data to create entirely new proteins, opening many possibilities in biology. For example, Amgen is partnering with Nvidia by combining the Nvidia DGX system with Amgen’s deCODE human data to build a “diversity atlas for drug target and disease-specific biomarker discovery.” While it represents almost 20% of U.S. GDP, the healthcare system is wildly inefficient and at times ineffective, making it a promising target for AI. We can hope for better outcomes and cost savings with the introduction of new predictive tools for early diagnosis and detection of disease, more personalized care and innovative new diagnostic tools. The improvement in finding and targeting biomarkers could make drug development more cost effective, timelier and better targeted. AI could also reduce wasted spending through more efficient workflows, improved analysis and more individually tailored care.
Source: DeepMind, Google Research, 2022 & beyond: Health (Feb 23, 2023), GAO, Morgan Stanley Research
The United States is in the lead in AI development, and American businesses will be the first to benefit from the rapid deployment of AI-related tools, which should increase productivity and reduce input costs. This shift will pose risks for firms that are slow to adapt and raises the specter of lost employment for workers whose tasks are most affected. A recent study from Goldman Sachs, “The Shifting Talent Landscape – Impact of Gen AI and Other Megatrends on Staffers & Recruiting Marketplaces”, written by George K. Tong, CFA®, suggests that the administrative and legal professions have the greatest exposure to work tasks that could be automated by generative AI, while jobs that require human interaction, physical labor or fine motor skills face a low risk of elimination. Most occupations are exposed to at least some degree of automation, but we think generative AI may serve as a complement rather than a substitute, making workers more productive.
Source: Goldman Sachs Global Investment Research
Technological innovation typically leads to increased economic growth, resulting in more job creation, not less. The invention of the personal computer and the introduction of the Internet did not reduce employment for information analysts, even though their introduction helped automate many tasks for those professions. Improved productivity reduces the costs of goods and services, thus stimulating economic growth, providing opportunities for businesses that embrace the change. The Internet led to new jobs for positions like web design, software development, and digital marketing, which did not exist previously.
Source: IPUMS, McKinsey Global Institute Analysis
We think the large technology incumbents hold several key advantages in this next technology phase. Generative AI is expensive to operate and scale, requiring massive amounts of compute, access to large pools of data and technical expertise. The large technology incumbents generally have the most proprietary data to train these models on, the most compute infrastructure and the most data scientists that specialize in the field of AI.
The key players can be categorized as the enablers, the empowered and the applications providers. The enablers are the companies developing the technology and building the infrastructure necessary to run these programs. Semiconductor manufacturers are the first group to benefit, providing the “picks and shovels.” Perhaps no company is more synonymous with the build phase for generative AI systems than Nvidia. It pioneered the Graphics Processing Unit (GPU) more than 25 years ago, initially for the purpose of rendering graphics in video games. Fortuitously, GPUs are also highly effective in the development of AI models due to their ability to perform thousands of computations simultaneously, a method called parallel processing.
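The parallel-processing advantage mentioned above comes down to independence: in the matrix multiplications at the heart of AI models, every output cell can be computed without waiting on any other. The sketch below makes that independence explicit in plain Python; on a GPU, thousands of cores would each run the per-cell function simultaneously instead of looping one cell at a time.

```python
# Conceptual sketch of why GPUs suit AI workloads: neural networks are
# dominated by matrix multiplication, and every output cell of a matrix
# product is independent of the others. A GPU assigns thousands of cells
# to thousands of cores at once; this plain-Python version just computes
# them one by one to show that each cell stands alone.

def matmul_cell(A, B, row, col):
    """One output cell — on a GPU, one thread would run exactly this."""
    return sum(A[row][k] * B[k][col] for k in range(len(B)))

def matmul(A, B):
    rows, cols = len(A), len(B[0])
    # Each (row, col) call below is independent: a CPU works through them
    # in sequence, while a GPU evaluates thousands of them simultaneously.
    return [[matmul_cell(A, B, r, c) for c in range(cols)]
            for r in range(rows)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```

Training a large model multiplies matrices with billions of entries, trillions of times over, which is why this per-cell parallelism translates directly into demand for advanced GPUs.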
A second group of enablers are the major cloud providers that own the infrastructure, which must expand exponentially to support the development of these models, then scale to answer end-users’ complex queries. Massive capital investment will be required to build out the more complex infrastructure necessary for AI computing, which is much more expensive than existing traditional computing platforms. Over the next several years there will be a continuous cycle of growth as users embrace more and more of the applications that become available. The mega-cap global technology companies also often enjoy preferential access to the most advanced GPUs.
Source: Dell’Oro, Company data, Morgan Stanley Research
Later-stage enablers include the model providers. These are the companies writing the algorithms that comprise the large language models. OpenAI, the developer of ChatGPT, is the best known, but other large language models are reporting similar results. These include Google’s Gemini, Meta’s open-source Llama, and models from privately held companies such as Anthropic and Mistral. As of early 2024, many of these models have largely closed the gap with OpenAI’s GPT-4.
The empowered are companies that are leveraging generative AI to make their products and client experiences better. Social media companies are one example in this category, as many use generative AI for their recommender engines, which are the algorithms used to curate the content that you see in your feed. Recommending content from across the entire network or internet that is contextually relevant, as is the case with something like Instagram Reels or TikTok, is significantly more difficult and compute intensive than only showing you content from your friends and family within your own network. The improved engagement provides meaningfully higher monetization potential.
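The recommender idea described above can be sketched very simply. This toy uses a hypothetical relevance score (tag overlap with the user's interests) and is not any platform's actual algorithm, but it shows the structural shift: candidates are drawn and ranked from the whole network, not just from accounts the user follows, which is what makes the problem so compute-intensive at real-world scale.

```python
# Toy sketch of a recommender engine (hypothetical scoring, not any
# platform's real system): instead of showing only posts from followed
# accounts, score EVERY candidate post by predicted relevance to the
# user's interests and surface the highest-scoring ones.

def relevance(post, interests):
    """Toy relevance: overlap between a post's tags and user interests."""
    return len(set(post["tags"]) & set(interests))

def recommend(posts, interests, top_n=2):
    # Candidates come from across the entire network — a vastly larger
    # pool than the user's own social graph.
    ranked = sorted(posts, key=lambda p: relevance(p, interests),
                    reverse=True)
    return [p["id"] for p in ranked[:top_n]]

posts = [
    {"id": "a", "author": "friend1", "tags": ["cats"]},
    {"id": "b", "author": "stranger", "tags": ["ai", "chips"]},
    {"id": "c", "author": "stranger", "tags": ["ai", "investing", "chips"]},
]
user_interests = ["ai", "investing", "chips"]
print(recommend(posts, user_interests))  # ['c', 'b'] — strangers' posts win
```

Production systems replace the tag-overlap score with learned models predicting engagement, which is where the generative AI and compute investment comes in.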
As developers and entrepreneurs visualize new uses, companies new and old will be busy creating applications on top of this generative AI infrastructure. For example, Microsoft is building a series of copilots on top of ChatGPT. There is an Office365 Copilot for knowledge workers, a GitHub Copilot for software developers and a security Copilot to help detect cyber threats.
This is the very early stage of a technology that may radically affect much of our economy and our lives, and it is difficult to imagine its full potential. Some limitations, however, could lead to a more measured pace of adoption. There could be intellectual property issues, depending on what data these large language models are trained on. If they are trained on the open internet, as was the case with ChatGPT, then there is the risk of inadvertently violating intellectual property rights. These AI systems can also make things up at times, something called hallucinations. AI models are probabilistic models designed to recognize patterns and make predictions; if the algorithms are trained on insufficient data, or if there are biases in how the algorithms are written or in the data used to train them, they can generate misleading results. There are also governance and compliance issues to navigate, and a greater burden on an already stressed power grid, given the amount of electricity these data centers can require. These could all be hurdles to rapid development and mass adoption.
While a lot of uncertainty remains around both the capabilities and adoption timelines of generative AI, we still view generative AI as a major technology breakthrough that holds tremendous economic and investment promise. According to a recent study, the United States is one of the top countries for AI research today and remains home to more than half of the top AI institutions in the world. Whether you think AI is mostly hype or truly transformative, the United States and U.S.-based technology companies should be well positioned to lead in this new AI era.
Source: Macro Polo
Amazon (6.86%), Apple (7.68%), Google (6.74%), Meta (3.72%), Microsoft (12.82%), Nvidia (11.42%), Spotify (1.05%), and Uber (1.37%) are publicly traded companies which are held in the Frost Growth Equity Fund. *Weights as of June 18, 2024
Frost Investment Advisors, LLC, a wholly owned subsidiary of Frost Bank, one of the oldest and largest Texas-based banking organizations, offers a family of mutual funds to institutional and retail investors. The firm has offered institutional and retail shares since 2008.
Frost Investment Advisors' (FIA) family of funds provides clients with diversification by offering separate funds for equity and fixed income strategies. Registered with the SEC in January 2008, FIA manages more than $3.7 billion in mutual fund assets and provides investment advisory services to institutional and high-net-worth clients, Frost Bank, and Frost Investment Advisors’ affiliates. As of April 30, 2024, the firm has $4.3 billion in assets under management, including the mutual fund assets referenced above. Mutual fund investing involves risk, including possible loss of principal. Current and future portfolio holdings are subject to risks as well.
To determine if a fund is an appropriate investment for you, carefully consider the fund’s investment objectives, risk, charges, and expenses. There can be no assurance that the fund will achieve its stated objectives. This and other information can be found in the Class A-Shares Prospectus, Investor Shares Prospectus or Class I-Shares Prospectus, or by calling 1-877-71-FROST. Please read the prospectus carefully before investing.
Frost Investment Advisors, LLC (the "Adviser") serves as the investment adviser to the Frost mutual funds. The Frost mutual funds are distributed by SEI Investments Distribution Co. (SIDCO) which is not affiliated with Frost Investment Advisors, LLC or its affiliates. Check the background of SIDCO on FINRA's http://brokercheck.finra.org/.
Frost Investment Advisors, LLC provides services to its affiliates, Frost Wealth Advisors, Frost Brokerage Services, Inc. and Frost Investment Services, LLC. Services include market and economic commentary, recommendations for asset allocation targets and selection of securities; however, its affiliates retain the discretion to accept, modify or reject the recommendations.
Frost Wealth Advisors (FWA) is a division of Frost Bank [a bank subsidiary of Cullen/Frost Bankers Inc. (NYSE: CFR)]. Brokerage services are offered through Frost Brokerage Services, Inc., Member FINRA/SIPC, and investment advisory services are offered through Frost Investment Services, LLC, a registered investment adviser. Both companies are subsidiaries of Frost Bank.
This commentary is as of May 24, 2024, for informational purposes only and is not investment advice, a solicitation, an offer to buy or sell, or a recommendation of any security to any person. Managers’ opinions, beliefs and/or thoughts are as of the date given and are subject to change without notice. The information presented in this commentary was obtained from sources and data considered to be reliable, but its accuracy and completeness are not guaranteed. It should not be used as a primary basis for making investment decisions. Consider your own financial circumstances and goals carefully before investing. Certain sections of this commentary contain forward-looking statements that are based on our reasonable expectations, estimates, projections, and assumptions. Forward-looking statements are not indicators or guarantees of future performance and involve certain risks and uncertainties, which are difficult to predict. Past performance is not indicative of future results. Diversification strategies do not ensure a profit and cannot protect against losses in a declining market. All indices are unmanaged, and investors cannot invest directly into an index. You should not assume that an investment in the securities or investment strategies identified was or will be profitable.
NOT FDIC Insured • NO Bank Guarantee • MAY Lose Value