I'm against trans-humanism, btw. According to my understanding, trans-humanism (THM) basically says technology will cure everything and we could live as long as we want with no cares. That scares me to no end, and it irks me that anybody would want that.

Then let's argue the scientific aspects and stop giving any backing to those who are just cranky about not getting their own way.
What exactly are the advances being made that will further this? Will it solve just injury, or disease as well? Will it extend life or just allow a person to live a natural one?
These questions, to me at least, keep the topic within guidelines and along a scientific bent.
Thanks Kelly! Always knew you were cool and that we'd probably be on the same wavelength since I saw you rocking the Dalek avatar a while back. I mean, who's more awesome than a pair of Doctor Who fans!

I agree with Koltsix's original post (which was damn near poetic, BTW, well put!). Leaning toward a Design-oriented view of science, I wonder how far we will be allowed to go before we're stopped.
A couple of years ago, I saw a show on the Science Channel about the superhuman stuff that Stan Lee is so fascinated with, and it inspired me to write an episodic story about a future based on uncontrolled technological advancement. It appears it's happening faster than I can write it.
Speaking of literary analogies, I wonder if Aldous Huxley will have the right of it. Something, really, that only time can tell. It's fun to hear what they can do, though. Fun and a little scary at the same time.
If machines developed an intelligence capable of true sentience and were capable of adapting, they'd probably rise against us. Scientists see adaptability as a positive ability to produce in machines, never realizing that adaptability may one day turn into a machine overriding its own programming for what it considers the logical course of action when considering all possibilities. Its actions would also be subject to emotion, depending on whether emotion is a by-product of intelligence and sentience. If it's not, then they'd follow an extremely logical course of thinking and rid the earth of us, saving a few brains for abstract thought if they weren't capable of it themselves, since in my opinion humanity's greatest boon is our imagination, and that is perhaps a machine's greatest deficit. Humanity, if it stays the course it's on, is self-destructive and essentially, right now, as put by Agent Smith in The Matrix, a virus of the earth: taking more from the planet than we give, slowly depleting our own resources.

I've always wondered what would happen if artificial intelligence surpassed humans... Would the machines rise against humans? Or would we make mistakes like in the movies, where we create robots to take care of us so well that they end up ruling over us for our own safety...