Functionality of Biology and Robotics -- How to Understand the Mechanisms of Reality
Updated: Jun 26
Introduction: Are We Organic Machines?
Let's be honest, some of you might not like my latest pondering: what if biology is simply "organic machinery," and insects, animals, and even ourselves are just "organic automatons"?
Sure, this analogy might cause some discomfort, especially in our times of the AI revolution, where metallic machines threaten our jobs and our love lives. And indeed, being compared to machines might feel unsettling, even dehumanizing. However, what else are we, truly? Complex machines capable of independent thought and moral judgement, perhaps? Machinery advanced enough to replicate and self-regenerate? What if a machine is simply an automated system?
An apparatus consisting of interrelated parts with separate functions, used in the performance of some kind of work.
You might argue, on the contrary, that we're far more intricate than machines. But why must a machine have a specific degree of complexity to be considered one? What happens when a machine exceeds the complexity of a human being? Regardless of an object's complexity, be it a simple watch or an insect, the definition of "machine" applies to both:
A watch is designed to tell the time. Humans are designed for adaptability and long-term survival. (Apparatus).
A machine, like a car, consists of different parts that work together to achieve a degree of performance. The human organs, likewise, operate in synergy, allowing you to function in the world around you. (Inter-relations of parts).
Remember: different people are good at different things. Because our strengths differ, each person performs best in the roles they're suited for. You wouldn't use a calculator to assess mental health; you'd use a therapist. (Designation).
Why, then, would humans not be considered a type of machinery?
Beyond Human Limits
The future might bring about machines with capabilities equal to, or even surpassing, those of humans. In fact, some such feats have already been achieved:
The IBM-made device dubbed 'Deep Blue' defeated the reigning champion, Garry Kasparov, in an unusually swift chess game. (Cybernews.com).
It should take far less time to build a robot than to raise a human, depending on the former's complexity. Apply this at scale, and robots could be mass-produced in the time it takes to graduate a single class of students.
A basic calculator might not have emotions, but imagine a future populated by robots with a genuine capacity for emotional intelligence.
The potential for machines to excel in any field currently dominated by humans is vast. Right now, they might be restricted to specific tasks, but advancements in robotics hold the key to a future where robots surpass even our definition of "human."
The Ever-Evolving Relationship Between Man and Machine
The fact that we are far more complex than a robot right now doesn't mean they won't reach our level, or even surpass it. What if biological reproduction becomes obsolete, replaced by factories mass-producing the next generation of humans, not in flesh and blood, but in some advanced, non/semi-biological form?
This future might not be as outlandish as it seems. Emotions, after all, are functions triggered by events or thoughts. We can already code complex AI to react in ways that mimic human emotions, even though an artificial machine can't necessarily experience them. Could a future script one day create true feeling in robots, or at least a parallel experience indistinguishable from our own?
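The idea of "emotion as a function" can be sketched in a few lines. This is a deliberately crude illustration, not a real affective-computing system: an event triggers a scripted reaction, with no inner experience behind it, and all the names here are hypothetical.

```python
# A minimal sketch of "emotion as a function": an external event triggers
# a canned response. The machine reacts, but experiences nothing.
def mimic_emotion(event: str) -> str:
    """Map an external event to a scripted 'emotional' reaction."""
    responses = {
        "compliment": "joy",
        "insult": "anger",
        "loss": "sadness",
        "threat": "fear",
    }
    # Unrecognized events get a neutral reaction.
    return responses.get(event, "neutral")

print(mimic_emotion("compliment"))  # joy
print(mimic_emotion("eclipse"))     # neutral
```

The outward behavior can be indistinguishable from a felt emotion, which is exactly the philosophical puzzle the paragraph above raises.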
The key lies in the brain, the very essence of who we are. Despite its mysteries, like consciousness and dreaming, the engineering of a similar organ in non-organic form might be achievable. The question is this: Why limit a brain to biological material? Just as prosthetic legs defy our expectation of flesh and bone, why can't the brain be made of metal or another advanced material, as long as it performs the same function?
Artificial neural networks are directly modeled on the human brain. Much of technology is based on biology:
The British Royal Navy may be developing submarines based on fish.
Video game enemies, like Dr. Eggman's Badnik robots, are based on real-life animals (insects, fish, etc).
The line between human and machine may one day become startlingly unclear, as the innate biological element will depend more and more on artificial, technological constructs and extensions.
The scientific community is still grappling with a definitive explanation of sentience. However, there are some generally accepted characteristics:
Subjectivity: The ability to experience feelings and emotions like joy, sadness, fear, etc. These experiences are internal and unique to the individual.
Consciousness: Awareness of oneself and one's surroundings. This includes the ability to perceive, process, and respond to stimuli.
Qualia: This refers to the "what it is like" aspect of experience. For instance, the redness you see when you look at a rose is a quale.
Self-preservation: The instinct to avoid harm and survive.
Here are some additional points to consider:
Complexity: Sentience is likely linked to a certain level of brain complexity. "To-date, it appears that sentience arises naturally as a byproduct of increasing the number of neurons in a network. Humans are sentient; flies and mosquitoes are not." -- Nick Saraev.
Diversity: Sentience might manifest differently in various species. "Animals have their own preferences, desires, and needs; we humans may not always know what they are. But if we can use our knowledge of animal sentience to monitor and measure their emotional states, then we can seek to ensure that we avoid causing them pain and distress." -- World Animal Protection Blog.
The Hard Problem: How physical processes in the brain give rise to subjective experiences remains a major philosophical question.
Overall, the question of sentience is multifaceted and constantly evolving as our understanding of consciousness grows.
Androids and the Reshaping of Humanity
The creation of sophisticated robots, or "androids," capable of mirroring human capabilities, presents a future both exhilarating and unsettling. Imagine a world where our robotic counterparts, not limited by biology, surpass us in every field. These "clones" could render countless jobs obsolete, not just menial labor, but even complex professions like doctors, judges, psychologists and philosophers.
The ramifications extend beyond the workplace. If androids could perfectly mimic human behavior and interaction, the very need for companionship, friendship, or even romantic relationships could become a question.
This potential future is a double-edged sword. On one hand, it promises a near limitless pool of labor and resources. On the other, it raises the chilling prospect of being surpassed by machines that are far more intelligent and powerful.
This is a dilemma that humanity will soon face, as advancements in robotics accelerate. I believe we have only two options:
Ruthlessly compete by improving the human element through training and discipline.
Cooperate with technological advancement by recognizing its revolutionary legitimacy (Ori Sindel).
The ability to replicate humans (not necessarily as individual clones, but as beings with similar or even superior capabilities) could drastically alter our need for social interaction. Furthermore, social interaction has already been compromised by search engines.
Another question is whether these advanced androids, assuming they achieve sentience and functionality on par with humans, would deserve the same rights. According to Law and Senses Blog:
In recent years, the question of whether AI may be added to the list of potential rights holders has grown increasingly pressing. Many researchers believe the exponential development of AI will culminate in the achievement of a capacity for consciousness, and this inevitability is not far off. Equally inevitable, in that case, would be the question of whether AI should be afforded rights. As science journalist John Markoff has aptly put it, we must determine whether AI is to become “our masters, slaves, or partners”.
Would this usher in a new "race" of beings - androids, pseudo-humans, or perhaps even "metal people" - that demand recognition as equals?
The ultimate challenge, I argue, lies in the concept of a "soul." If we see ourselves as complex organic machinery, then what purpose does this intangible, often religious notion serve?
Perhaps, even within a theistic framework, one could argue that our creators, be they deities or advanced beings themselves, simply chose to build us with organic materials rather than synthetic ones.
Does Consciousness Matter in a World of Code?
The concept of consciousness has long held a central place in our understanding of ourselves. But in the age of advanced AI, the question arises: does it even matter?
Every action, after all, can be seen as a function, a set of instructions leading to a specific outcome. Perhaps our universe itself is merely a complex script, governed by scientific, mathematical, and biological rules. This reasoning lays the groundwork for the simulation hypothesis.
Consider walking. Does my subjective experience of walking matter if a machine can replicate it flawlessly? This raises the same-result problem: must we focus so much on the means if the result is already indistinguishable?
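The same-result argument can be put in code: two entirely different mechanisms producing outputs that cannot be told apart from the outside. The functions below are illustrative abstractions, not a robotics API.

```python
# Two different "means" producing the same "result". From the outside,
# nothing distinguishes the biological walker from the mechanical one.
def biological_walk(steps: int) -> list:
    # Muscles, nerves, and balance, abstracted into a single expression.
    return ["step"] * steps

def mechanical_walk(steps: int) -> list:
    # Motors and actuators, abstracted into an explicit loop.
    gait = []
    for _ in range(steps):
        gait.append("step")
    return gait

print(biological_walk(3) == mechanical_walk(3))  # True
```

If only the output is observable, the question of which implementation "really" walked becomes undecidable from the result alone.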
This line of thinking leads to a potentially controversial conclusion: consciousness, and the soul, might be irrelevant in an automated universe.
Imagine creating a "metal person" with a brain functionally identical to our own. Could this being achieve a consciousness akin to ours, despite its non-biological composition? Does it have to matter, if it already fulfills its designation?
Music provides a simple analogy. Must a song be "organic" to be beautiful? Can electronic music not resonate as deeply as a melody played on a traditional instrument?
The answer is subjective, but it highlights that the source material doesn't diminish the quality of the creation. A melody doesn't require a "soul" or a consciousness to be beautiful and complex. Music is deeply mathematical: every sound reduces to a note, a frequency. You cannot extend the human element with technology without math.
Why wouldn't the same hold true for consciousness?
Mr. Nathan Lasher's Feedback
The human brain is far more complex and complicated than any robot or machine out there. It is the most advanced computer in the world. It has video and auditory equipment and can store information for us. Tell me how it isn’t like a computer.
Aren’t all machines automated systems? We design them that way. And the same thing happened during the industrial revolution. A wave of technology was introduced which people were sure would replace humans. They did not, and AI will not be the end of us this time. All it did was create more jobs as they needed people to operate the machines. Nothing can replace human intelligence.
Is our subconscious mind not what makes humans automated systems? Does your body not automatically digest food or pump blood through your system? These are things which you don’t need to tell your body to do. Last time I checked breathing was pretty automated. Does the brain not automatically do those things for us, so we don’t have to think to do them for ourselves?
The functionality of biology and robotics converges in groundbreaking ways, blending the precision of mechanical systems with the adaptability and complexity found in living organisms.