Oh gee, I've been retired for nearly three years. I've been too concerned with hiding from Covid-19, the evils of industrial wind turbines, and wasting time on Facebook to do any serious computer programming. So I was a bit surprised when a coworker who got stuck with some of my code asked on LinkedIn:
Do you still think Python is the language of the future? Or will English be the language that people use in the era of LLMs?
He included a link to a video graphic of language popularity since 1965. A static graph could carry the same information, but the video forces the viewer to see all the data. It's worth watching. I learned Algol in 1968, C around 1980, and Python in 2001.
Cute graphic, I'm impressed with Python's growth since 2015. It's nice to have been on the forefront of something!
LLM? Oh, Large Language Model. I don't have to keep up with anything anymore!
More seriously, I'm amazed at what AI is coming up with: mastering the board game Go (it's teaching humans new tactics) and writing new poetry (and likely music; I bet YouTube and TikTok will add a feature to create copyright-free music that perfectly fits a video).
I'm also appalled at some stuff, e.g. someone asked ChatGPT to write a PDP-10 assembler routine for a QuickSort. The result included opcodes that don't exist and likely several other errors. However, I was amazed that it came up with anything.
The only thing I tried with it was to ask something like "Why was the weather in the 1930s so extreme?" It replied with a mention of the Dust Bowl. I mentioned the All New England flood of 1936 and the Hurricane of 1938. It apologized and admitted those were significant events too. It never answered "Why?" I don't think anyone can answer that; I asked because I thought it might know something that I don't.
As for programming, a big difference between Java and Python is that Java is strongly typed, designed by people who "knew" that was vital. OTOH, Python was attractive to me because it let me mostly track that stuff on my terms without getting in the way. For small programs, that's fine. For programs that are small only because they use the rich set of libraries in the Python universe, that's also fine. For really big programs made by today's teams of engineers, it's probably not so fine, but Python makes it easy to write unit test software that helps out. There is a limited supply of good engineers; in my OS development career, I've always appreciated that so-so engineers worked elsewhere!
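A minimal sketch of that trade-off (the function and tests here are my own illustration, not from any real project): Python doesn't make you declare types, and a few lines with the standard unittest module catch the mistakes a Java compiler would have flagged for free.

```python
import unittest

def median(values):
    """Return the median of a sequence of numbers.

    Python doesn't care whether `values` holds ints or floats,
    as long as the elements sort and average sensibly.
    """
    ordered = sorted(values)
    n = len(ordered)
    mid = n // 2
    if n % 2:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

class TestMedian(unittest.TestCase):
    def test_odd_count(self):
        self.assertEqual(median([3, 1, 2]), 2)

    def test_even_count(self):
        self.assertEqual(median([4, 1, 3, 2]), 2.5)

    def test_mixed_numeric_types(self):
        # Duck typing at work: ints and floats mix freely.
        self.assertEqual(median([1, 2.0, 3]), 2.0)
```

Run it with "python -m unittest" and you get the type checking you actually need, on your terms, when you decide it matters.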
I bet an LLM could be given responsibility for dealing with type issues and a lot more. It will still need a lot of human guidance, so perhaps the programming language of the future will be the text in design specs. That follows the chain going:
From writing machine code (all numbers), where placement in instruction storage (e.g. on a drum) was important.
To assembler that made things more readable and provided a lot of housekeeping assistance.
To C, a language patterned mostly after the PDP-11 instruction set, which eventually eliminated chores like register allocation, then went on to prove itself on other platforms and on future ones designed with C support in mind. BTW, DEC did a great job with ISP designs on many platforms; Intel has never gotten beyond awful.
To languages like Java and Python (and SQL and Perl?) that followed the market as we went from one (or more!) operating systems per platform to Unix and Linux as the sole support for many platforms. That means new development is mostly application-level work. New programs can add memory management to the list of things to mostly ignore, in part thanks to cheap RAM; and even performance, in part due to fast CPUs and to larger apps where the number crunching is a small percentage of the code and can be done in high-performance languages and made accessible as yet another library.
To LLMs that take over tasks like implementing algorithms and could generate code in any language, or invent a new language suitable to the task. Not my problem. :-)
I came across "Do LLMs eliminate the need for programming languages?" today. It's a pretty good answer to your question. The author has given it a lot more thought than I have, as he's the CEO and founder of a company doing exactly what you're asking about.
2024 Feb 2: [Random non-programming update]
AIs gone rogue! says in small part:
AI researchers found that widely used safety training techniques failed to remove malicious behavior from large language models - and one technique even backfired, teaching the AI to recognize its triggers and better hide its bad behavior from the researchers.
"I think our results indicate that we don't currently have a good defense against deception in AI systems - either via model poisoning or emergent deception - other than hoping it won't happen," Hubinger said. "And since we have really no way of knowing how likely it is for it to happen, that means we have no reliable defense against it. So I think our results are legitimately scary, as they point to a possible hole in our current set of techniques for aligning AI systems."
Contact Ric Werme or return to his home page.
Written 2023 July 30, last updated 2024 Feb 2.