Debates

Keep Calm and Adapt

By News Room | 2 hours ago | 12 min read
Software can now write software better than even the most advanced human experts. And where software has gone, the other learned professions will follow. That is the gist of a viral essay titled “Something Big Is Happening” posted to X on 9 February by Matt Shumer, who runs an AI start-up. The time for “cocktail party polite” answers to the question “what’s the deal with AI?” is over, he warns, and it is time to confront people with the scary truth: mass technological unemployment is upon us, and devastation of most other kinds of “cognitive work” will soon follow. His argument focusses on software engineering, and on Claude Code Opus 4.6 and OpenAI Codex 5.3 in particular. Both of these models were released on 6 February, and they are now so good that, he tells us, “I am no longer needed for the actual technical work of my job.”

Shumer is right about the new AI coding tools: they are great. I now spend less time coding because I can tell Claude and Codex what to code, how to code it, and how I want that code tested. I have not yet been able to replicate Shumer's one-shot experience, that is, to give an instruction, walk away, and come back to a perfect application four hours later. In my experience, AI still makes mistakes and does things I do not want it to do. Nevertheless, the transformative capacities Shumer describes are real.
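
To make that workflow concrete, here is the kind of executable specification I might write before asking Claude or Codex to produce any implementation. It is a minimal, hypothetical sketch: the module invoicing, the function parse_invoice_total, and its rules are illustrative assumptions of mine, not anything drawn from Shumer's essay. The point is that the human still decides what "correct" means; the model's job is to make the tests pass.

# Hypothetical test-first brief for an AI coding assistant.
# I write the tests; the assistant is asked to write the invoicing module
# that makes them pass. All names here are illustrative assumptions.

import pytest

from invoicing import parse_invoice_total  # module the assistant is asked to produce


def test_plain_dollar_amount():
    assert parse_invoice_total("Total: $1,234.50") == 1234.50


def test_amount_without_currency_symbol():
    assert parse_invoice_total("Total 99") == 99.0


def test_missing_total_raises():
    with pytest.raises(ValueError):
        parse_invoice_total("No total on this line")

Running a suite like this against whatever the model produces is also how I catch the mistakes mentioned above before they matter.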

But while I accept Shumer's observations, I dispute his inferences. For a start, there is a lot more to software engineering than coding. Someone still has to ask the models to produce the app in the first place, which requires a degree of background knowledge if you are attempting anything tricky or complex, as is often the case in a large enterprise. More pertinently, Shumer's broader argument assumes that if machines can perform many tasks, human work will disappear. This treats jobs as mere collections of tasks, but jobs are also systems of responsibility that involve risk and return. Used well, AI is likely to reduce risks and increase returns. However, history and experience suggest that when the cost of doing things falls, societies do not run out of work. They attempt new things and do new work. What we are likely to see is job transformation, not job elimination.

Shumer argues that software was the first learned profession to be solved by AI because AI development itself requires a great deal of code, and because AI models are now being used to improve AI models (per the release notes of OpenAI Codex 5.3). He argues that the technical superfluity that has hit software engineers will soon also hit lawyers, accountants, writers, journalists, customer service workers, and doctors, and that the coming “intelligence explosion” will devastate these professions and many more. But while the virality of Shumer's post and its breathless urgency suggest that some new and apocalyptic threshold has been crossed, doomy arguments like these have been made for a long time.

Robots will put humans out of work. Cover of Der Spiegel in 1964, 1978 and 2017. ht @gduval_altereco pic.twitter.com/QjkGDohNEq — Lionel Page (@page_eco), November 20, 2017

In 2013, Carl Benedikt Frey and Michael Osborne released a report titled “The Future of Employment: How Susceptible Are Jobs to Computerisation?” Their headline claim was that 47 percent of US jobs were “at high risk” of automation. Less widely known was a 2016 OECD paper by Melanie Arntz, Terry Gregory, and Ulrich Zierahn contesting Frey and Osborne's conclusions. Jobs, they wrote, should be thought of as occupations made up of a variety of tasks, many of which are not automatable. Approaching the problem this way, the OECD paper reduced the share of jobs at high risk to nine percent.

There is a long history of fear produced by technological change, dating back at least as far as the early 19th century, when textile workers known as Luddites smashed mechanised knitting frames with hammers. But while technology can be a threat to particular occupations, in the long run, it creates more jobs than it destroys. The British government treated Luddite vandalism as insurrection and made it a capital offence. Luddite leaders were hanged and many others were transported to the penal colonies of New South Wales and Van Diemen’s Land (Tasmania). With the sabotage suppressed, mechanisation drove down the cost of clothing production, and the textile industry exploded in scale.

Indeed, a surge in demand for wool led to the rapid expansion of the colonies. In his history of Australia, Tony Abbott speaks of the “Wool Rush” that followed the Napoleonic Wars and preceded the Gold Rushes of the 1850s. Clothing became cheap and abundant. While some Luddite tasks, such as “cropping” cloth smooth by hand with shears, vanished, textile employment overall grew because more clothes were sold. When textile frames were small and artisanal, clothes were expensive and people could only afford a few. When the frames entered factories and became faster and bigger, cloth and clothes got cheaper and people could afford more, which meant that employment increased along with demand.

It is, in other words, a mistake to confuse “my skill is no longer needed” with “human labour is no longer needed”, because new occupations are also created as a result of new technology. In software, what was expensive, difficult, and risky last year is now cheaper, easier, and less risky. The availability of Photoshop, Premiere, and cheap camcorders did not end media; it caused an explosion of media because the cost of production fell. There is more content than ever in the world because you no longer need a studio to produce it, and software will be the same. A game project that would once have required an artist, coders, testers, UI designers, writers, game designers, and producers to coordinate them all can now become a person-plus-AI project.

And many other worthy projects that would have required teams can now become person-plus-AI projects. Two years ago, a serious software project would need a business analyst, a solution designer, a developer or two, a project manager, subject matter experts, a test analyst, and a system administrator. Putting all those people in a room creates an expensive meeting and a coordination overhead. Large software projects could be spectacular failures. As the risk, difficulty, and expense of software projects crash, adoption will surge. Paul Jarvis’s 2019 book Company of One advocated staying small as a deliberate business strategy. AI makes such endeavours easier.

Let me return to coding, which is my line of work and a core part of Shumer’s argument. While it is true that you can get an AI to write you an app, the AI cannot decide whether the app should exist, whether the business should take the risk, whether the legal exposure is acceptable, or whether the user requirement is understood or misunderstood. These are not technical tasks; they are judgments linked to institutional responsibilities and realities. Shumer himself notes that AI adoption “will be slowed by compliance, liability and institutional inertia” in roles that require relationships and trust, physical presence, licences, and permits, particularly in heavily regulated industries.

Software engineering fell first to AI because its inputs and outputs are textual, and because it relies upon immediate feedback loops in simulated environments. As a result, AI is well suited to computing, which is a highly symbolic problem domain. This cannot be said of law, medicine, and management, or of civil, mechanical, and aeronautical engineering. These disciplines are embedded in physical, social, and legal systems that existed long before computing machinery. They have external real-world constraints, not just digital and informational ones.

What I am driving at here is that software engineering is “computationally pure” in a way that the grease and oil of mechanical engineering is not, let alone the blood and guts of medicine or the enraged conflicts of legal disputes. Most economic activity involves a lot more than the symbolic processing that goes on in computers; it still exists in the physical world of atoms, not the digital world of bytes. Even software development is as much organisational negotiation as programming. The hard part is discovering what the system actually needs to do, not writing the functions that implement it. Lack of clarity and incorrect assumptions in requirement specifications have led to software disasters.

This matters because Shumer is assuming that if AI can do code, then it can do everything. It would be more accurate to say that if a task is already reducible to symbolic processing by a computer, then AI will be able to do it well. But there is a moral and legal point too: jobs are not just collections of tasks, they come with responsibilities. Someone has to stand behind the decisions taken, and if necessary, stand in court. Professions such as accounting, law, and medicine have duties of care, duties to the client, duties to the state, and requirements to observe professional ethics. A lawyer is not just a drafter of contracts. A doctor is not just a diagnosis engine. A programmer is not just a typist of code. Professionals are actors in normative systems that can be held responsible. Shumer is arguing AI will reduce labour demand for software engineers. I am arguing AI will reduce software costs, difficulty and risk, and that huge opportunities are about to be created as a result.

In 2010, Queensland Health introduced a new automated payroll system for around 80,000 hospital workers. The original budget was AU$6 million, but the system ended up costing more than AU$1.2 billion, and thousands of hospital staff were overpaid, underpaid, or not paid at all. The problem was not that the computer could not calculate wages. It was that the complex rules of the relevant awards, such as penalty rates for shifts, overtime for extra hours, local rostering practices, and negotiated exceptions, were not properly understood before they were encoded. Code was produced on a compressed timeline. Lack of testing resulted in chaos. The system went live without a parallel pay test, in which the old and new systems calculate pay for the same period, side by side. Shumer’s argument implicitly assumes that coding is the difficult part of software engineering, but the Queensland Health payroll case demonstrates that understanding what the system has to do and testing it are just as critical.
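
For readers unfamiliar with the term, a parallel pay run can be sketched in a few lines of code. The sketch below is illustrative only: the functions legacy_pay and new_pay stand in for the two payroll engines, and the data and tolerance are assumptions of mine rather than details of the actual Queensland system.

# Illustrative sketch of a parallel pay test: run the old and new payroll
# engines over the same pay period and report any employees whose results
# differ. The engine functions are hypothetical placeholders.

def parallel_pay_check(employees, period, legacy_pay, new_pay, tolerance=0.01):
    """Return (employee, old_amount, new_amount) for every mismatch."""
    mismatches = []
    for employee in employees:
        old_amount = legacy_pay(employee, period)
        new_amount = new_pay(employee, period)
        if abs(old_amount - new_amount) > tolerance:
            mismatches.append((employee, old_amount, new_amount))
    return mismatches

A rollout run this way would only cut over to the new system once checks like this came back clean, or with explainable differences, across several consecutive pay cycles.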

AI coding tools pose some risks (e.g., privacy and security breaches), but they also provide ways to mitigate risks. Most importantly, they fundamentally change the economics of writing software. The cost, time, and risk of writing software are about to collapse. The cost of big meetings to coordinate complex projects will disappear. And yet panic about the obliteration of software jobs strikes me as baseless. There have always been more worthy software projects proposed than could be developed. Most were rejected because they were too hard, too expensive, and too risky, requiring an expensive team and a long timeline. Unable to get software written, people hacked out manual fixes with spreadsheets, emails, and shared drives instead. I have never heard of a worthy software project that failed to get a green light because it was too easy, too cheap, and too low-risk.

Shumer is right to argue that now is the time to attempt an ambitious project. He is right to point out that humans using AI will outcompete humans not using AI. It is also true that people and companies will need to adapt as AI shifts work from execution to specification and supervision. And Shumer is right to say that people should make more serious use of AI at work. Get used to supervising your AIs. Try giving them substantial tasks, not just simple questions. Now is the time for software engineers to get into the code models, though most are already doing so. But I see no compelling reason for chefs to be supervising AIs anytime soon. 

Yes, there will be disruption and some jobs will be lost. But Shumer is almost certainly wrong about extinction. As yet unknown and unimagined new jobs and new industries will be created. History shows that adjustment to economic shocks takes time and can be painful. I am not going to dispute Shumer’s view that you should be financially prepared for a sudden redundancy. Regardless of AI, it is prudent to have a nest egg in case your income ceases at short notice. But overall, AI expands the set of solvable problems and will reduce the cost of access to timely and relevant knowledge.

AI is not about to eliminate work. If you are a knowledge worker, it is going to change your job by lowering the threshold at which problems become economic to solve. It will also lower the cost and risk of software projects. But this is not going to make software engineers extinct. The place where I work has a mountain of software problems and a limited budget, and with AI I can now get a lot more done. What we will see is a shift from expensive, inflexible “Software as a Service” (SaaS) and “Commercial off the Shelf” (COTS) products to custom-written applications that do exactly what one company needs. Mass production made clothes cheaper. Not only will AI make software cheaper, it will also enable software engineers to act more like tailors, fitting the application to the exact needs of a client at much lower cost, difficulty, and risk than last year. Instead of “off the shelf” systems that are a rough fit for their job or business, people will be able to get bespoke systems that are an exact fit. When the cost of something falls, societies often want more of it. This was the case with clothes. It will be the case with code.


