An LLM will never be able to do this. Unfortunately, the word AI has been hijacked by companies and marketers; AI now means just about anything, really.
They’re actually coming up with new words to describe what AI used to mean, such as AGI, which stands for artificial general intelligence.
To elaborate on the premise of this post: the boost we’re going to get from an actual artificial intelligence, one that is perhaps sentient, will be so great that the tasks that were once performed will become so mundane and menial that it won’t make any difference who performs them, or whether anyone is even paid to do so.
In the same sense that the printing press removed the need for scribes, at least for the majority, or the firearm displaced the bow and arrow as the dominant weapon.
Eventually, what artificial general intelligence will give us is a world free of the faith-based monetary system that currently dominates the world.
In essence, we shouldn’t need money after artificial general intelligence is implemented.
The term AGI has been in use for more than two decades, and AI never specifically implied something with human-level intelligence (maybe in the 40s-50s when the field was just being invented, but not after that). “AI” has always referred to things like Siri, the YouTube algorithm, pathfinding AIs, trackers for anti-air systems, and whatever else.
I remember that before I started programming, I’d get annoyed at machinery like 3D printers when the “stupid AI” wasn’t working. Then I’d probably bang it or something to try to get it to work lol
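To make the “pathfinding AI” point concrete, here is a minimal sketch (my own illustrative Python, not code from any system mentioned above; the function name is hypothetical) of breadth-first grid pathfinding, the kind of routine that games have shipped under the “AI” label for decades:

```python
from collections import deque

def find_path(grid, start, goal):
    """Breadth-first search over a 2D grid: a classic 'game AI' pathfinder.

    grid  -- list of strings, '#' marks a wall, anything else is walkable
    start -- (row, col) tuple
    goal  -- (row, col) tuple
    Returns a list of (row, col) steps from start to goal, or None.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}          # remembers how each cell was reached

    while frontier:
        current = frontier.popleft()
        if current == goal:
            # Walk the chain of predecessors back to the start.
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        r, c = current
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] != '#' and (nr, nc) not in came_from:
                came_from[(nr, nc)] = current
                frontier.append((nr, nc))
    return None                        # no route exists

# Example: route an 'enemy' around a wall.
maze = [
    "....",
    ".##.",
    "....",
]
print(find_path(maze, (0, 0), (2, 3)))
```

Nothing in that snippet learns or reasons; it just searches a graph, yet this is exactly the sort of thing that has long been marketed and discussed as “AI.”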
The meaning of the term “Artificial General Intelligence” (AGI) has indeed evolved in recent years. Initially, AGI was conceptualized as a form of intelligence that could understand, learn, and apply knowledge across a wide range of tasks, much like a human. This notion dates back to the mid-20th century, rooted in foundational neural network algorithms and deliberative reasoning hypotheses from the 1950s and 1960s.
https://www.justthink.ai/artificial-general-intelligence/history-and-evolution-of-agi-tracing-its-development-from-theoretical-concept-to-current-state
https://luceit.com/blog/artificial-intelligence/evolution-of-artificial-intelligence-ai-generative-ai-and-agi/
In recent times, the definition and understanding of AGI have been influenced by advancements in specialized AI technologies. Modern discussions often revolve around the practicalities and challenges of achieving AGI, with a focus on the limitations of current AI systems, which excel in narrow tasks but struggle with generalizing across different domains. For example, while models like GPT-3 have shown some cross-contextual learning abilities, they still lack the comprehensive reasoning, emotional intelligence, and transparency required for true AGI.
https://en.m.wikipedia.org/wiki/Artificial_general_intelligence
https://www.justthink.ai/artificial-general-intelligence/history-and-evolution-of-agi-tracing-its-development-from-theoretical-concept-to-current-state
AI always meant human-level intelligence.
What you fail to understand is that, with the recent understanding of such concepts, AI will far, far surpass human-level everything.
(The above statement was generated by GPT-4; sources have been provided. The prompt came from the poster of this reply.)
Well, it’s hard to make societal predictions with zero basis in reality, so you’ll have to forgive me for grounding the premise in current phenomena.