Have you ever heard the term ‘Luddite’? There were actually groups of people in England who destroyed machinery, particularly in cotton and woollen mills, from roughly 1811 to 1816[1].
Why would they do this? They, like many of us, were afraid of change. Cotton and woollen mills had provided steady employment for a long time, and these workers feared that, if machines took over their jobs, they would be out of work, abandoned by society and unable to provide for their families.
The same mindset may explain the way many people are looking at AI today.
Did the ATM replace or augment the bank teller?
A factory-oriented job is one where a machine may very well be able to complete tasks in full, like processing cotton and wool. These types of jobs have less ‘give and take’ human feedback inherent in them: the cotton is not expressing an opinion about how things are going.
A bank teller’s job is different in that every interpersonal interaction could be an opportunity for both sides to learn something and seek a better outcome. Even if, before the ATM’s introduction, most customers were simply withdrawing and depositing funds, one would never say that 100% of them were doing only this and never anything else.
Once people grew comfortable using the ATM, a customer parking the car and coming into the branch was almost a signal that they needed something more, something that interacting with the ATM alone could not support. Many of these services were worth more to customers, and the bank could earn incremental revenue by providing these higher value-added services[2].
So, while machines obviated the need for as many workers on cotton factory floors, ATMs actually created a need for more bank tellers providing higher value-added services beyond withdrawing and depositing money.
Is AI more like a factory machine or an ATM?
This is the quintessential question: does AI mean full human replacement in certain roles, or augmentation, with a human pilot working alongside an AI copilot? Microsoft (NASDAQ:MSFT) was very purposeful with its Copilot branding of certain AI assistants, seeking to emphasize the added value that comes from a person working with AI rather than AI doing everything without supervision.
The answer will most likely be a spectrum, with certain jobs looking closer to being ‘replaced’ by AI and others looking more likely to be ‘highly augmented’ by it.
The academic framework
When trying to predict what the future may look like on these types of ‘social science’ questions, it can be difficult to structure the analysis in a way that allows for valuable insights.
The O*NET database covers 1,016 occupations and their detailed work activities and tasks. The key ingredient is that jobs are broken down into tasks, and each task can be evaluated on whether access to an LLM could reduce the time needed to complete it by roughly 50%. Some of this is clearly subject to estimates and judgement, but it takes the rather complex universes of ‘US occupations’ and ‘what LLMs can do’ and marries them in a way that allows at least some insights to be gleaned[3].
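To make this framework concrete, below is a minimal, hypothetical sketch of how an occupation-level exposure figure could be computed once each task has been rated against the ‘roughly 50% time savings’ rule of thumb. The occupations, tasks and ratings are invented for illustration only; they are not taken from O*NET or from the study itself.

```python
# Illustrative sketch: occupation-level LLM "exposure" from task-level ratings.
# The occupations and ratings below are hypothetical placeholders, not O*NET data.

occupations = {
    "Copywriter": [
        {"task": "Draft marketing email", "llm_halves_time": True},
        {"task": "Interview client about brand voice", "llm_halves_time": False},
        {"task": "Polish website copy", "llm_halves_time": True},
    ],
    "Bank teller": [
        {"task": "Process routine deposit", "llm_halves_time": False},
        {"task": "Explain account options to a customer", "llm_halves_time": True},
    ],
}


def exposure_share(tasks):
    """Share of tasks where an LLM is judged to cut completion time by ~50%."""
    exposed = sum(1 for t in tasks if t["llm_halves_time"])
    return exposed / len(tasks)


for name, tasks in occupations.items():
    print(f"{name}: {exposure_share(tasks):.0%} of tasks exposed")
```

The real exercise is, of course, far larger and relies on human and model judgement about each task, but the basic arithmetic of ‘share of tasks exposed per occupation’ is the backbone of the headline figures that follow.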
Some of the published work indicates[4]:
- 1.8% of jobs could have over half their tasks affected by LLMs with simple interfaces and general training. With current and future developments, this share could jump to more than 46% of jobs with more than 50% of their tasks affected by LLMs.
- On average, LLMs are relevant to approximately 14% of tasks per occupation.
- Using National Employment Matrix data from the Bureau of Labor Statistics, it is estimated that about 80% of workers are in occupations with at least 10% of tasks exposed to LLMs, assuming partial implementation of complementary software (a stylised version of this employment weighting is sketched after this list).
- Only about 1.86% of tasks can be fully automated by LLMs plus additional software integrations, without human oversight.
- The two job clusters that appear most exposed to LLMs are ‘Scientists & Researchers’ and ‘Technologists’, which could include software engineers and data scientists.
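The employment weighting behind the ‘80% of workers’ figure can be illustrated in the same spirit: combine each occupation’s exposure share with how many people work in it, then ask what fraction of all workers sits above a threshold. The occupations, employment counts and exposure shares below are hypothetical placeholders, not National Employment Matrix data.

```python
# Illustrative sketch: employment-weighted share of workers whose occupation has
# at least 10% of tasks exposed to LLMs. All figures below are hypothetical
# placeholders, not Bureau of Labor Statistics data.

occupation_stats = [
    # (occupation, employment, share of tasks exposed to LLMs)
    ("Software developer", 1_500_000, 0.45),
    ("Bank teller", 350_000, 0.20),
    ("Textile machine operator", 100_000, 0.05),
]

THRESHOLD = 0.10  # "at least 10% of tasks exposed"

total_workers = sum(emp for _, emp, _ in occupation_stats)
workers_above = sum(emp for _, emp, share in occupation_stats if share >= THRESHOLD)

print(
    f"Share of workers in occupations with >= {THRESHOLD:.0%} of tasks exposed: "
    f"{workers_above / total_workers:.0%}"
)
```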
So far, the academic work appears to support ‘complementarity’ and ‘augmentation’ rather than ‘replacement.’
Examining the anecdotal evidence
Politics and policy can be influenced by human stories as well as academic studies. There are some notable quotations discussing the potential for AI to increase productivity, particularly in the area of software development[5].
- “We’re now seeing that the developers using GitHub Copilot are 55% more productive on tasks.” – Scott Guthrie, Microsoft EVP of Cloud & AI
- On Amazon’s CodeWhisperer: “Internal tests showed 57% faster task completion and 27% higher likelihood of success.” – Adam Selipsky, AWS CEO
Coding is an interesting use case in that the various AI-assisted approaches may directly help with a process that used to involve a lot of internet searching and trial and error. We don’t yet hear of AI assistants getting things exactly right 100% of the time, so those expecting perfection or close to it at this point would be disappointed, but these tools do allow developers to hone the process and work through the various roadblocks and challenges that arise more efficiently.
What about freelancers?
Copywriting has been a notable and flexible freelance position that people take on, with practitioners typically building a client base within a particular industry. For instance, financial advisors frequently need a variety of marketing copy on their websites, brochures and emails. It doesn’t make sense for them to write all of it themselves—their expertise is in working with clients on financial planning and related matters. Working with copywriters has been a longstanding solution, but it’s easy to surmise that this type of job could be at a huge risk of being fully replaced by LLMs.
In 2024, it may be that the copy is not yet viewed as ‘good enough’ for the full replacement of human copywriters—with some stories of copywriters being asked to ‘polish up’ the work of LLMs—but it’s important to keep in mind that models can iterate and improve quite quickly.
It also opens up an important part of the augmentation discussion. If a copywriter, coder or translator is only doing ‘basic’ work, LLMs may be able to replace this work fully. The person would need to move up the curve, applying themselves to more advanced activities that the LLMs cannot complete as easily. However, there could be some learning or training required to get there[6].
Conclusion: Change will be the sole constant
The spectrum will be an important mental model. If people are waiting for LLMs to complete tasks in full with zero mistakes or hallucinations, they could be waiting a long time, but if people are looking to improve their efficiency in certain daily tasks, AI could be ready to help with that immediately.
We would do well to remember past innovations like the calculator or the computer. We used to need people to be very good at doing math accurately, but in 2024 we have so many tools for that that the skill is far less important; understanding the concepts embedded in the math matters more than doing the arithmetic. AI will likely be similar: it will become less important for people to write the words themselves and more important for them to understand the elements of a great story and how to edit AI outputs or prompt AI further in that direction.
_______
[1] Source: https://en.wikipedia.org/wiki/Luddite
[2] Source: Weiss et al. “AI Index: Mapping the $4 Trillion Enterprise Impact.” Morgan Stanley (NYSE:MS) Research. October 1, 2023.
[3] Source: Eloundou et al. “GPTs are GPTs: Labor market impact potential of LLMs.” Science. Vol 384, Issue 6702. June 21, 2024.
[4] Source for bullets: Eloundou et al. June 21, 2024.
[5] Source for quotes: Parker et al. “Leveraging AI to Drive Efficiency.” Morgan Stanley Research. February 27, 2024.
[6] Source: Mims, Christopher. “AI Doesn’t Kill Jobs? Tell that to Freelancers.” Wall Street Journal. June 21, 2024.
________________________________
DISCLAIMER
This material is prepared by WisdomTree and its affiliates and is not intended to be relied upon as a forecast, research or investment advice, and is not a recommendation, offer or solicitation to buy or sell any securities or to adopt any investment strategy. The opinions expressed are as of the date of production and may change as subsequent conditions vary. The information and opinions contained in this material are derived from proprietary and non-proprietary sources. As such, no warranty of accuracy or reliability is given and no responsibility arising in any other way for errors and omissions (including responsibility to any person by reason of negligence) is accepted by WisdomTree, nor any affiliate, nor any of their officers, employees or agents. Reliance upon information in this material is at the sole discretion of the reader. Past performance is not a reliable indicator of future performance.