Transforming you into a more innovative and logical machine
"I visualize a time when we will be to robots what dogs are to humans, and I'm rooting for the machines."
 CLAUDE SHANNON



Welcome, Geeks!
Gone are the days when kids spent their time glued to their PCs playing games. The next generation of computing is ARTIFICIAL INTELLIGENCE and ROBOTICS. So get ready to build your own robot and outshine everyone else's innovation.
As robots have become more advanced and sophisticated, experts and academics have increasingly explored what ethics might govern robots' behavior, and whether robots might be able to claim any kind of social, cultural, ethical, or legal rights. One scientific team has said that a robot brain may exist by 2019; others predict robot-intelligence breakthroughs by 2050. Recent advances have already made robotic behavior markedly more sophisticated.

Vernor Vinge has suggested that a moment may come when computers and robots are smarter than humans. He calls this "the Singularity" and suggests that it may be somewhat, or possibly very, dangerous for humans. This idea is explored by a philosophy called Singularitarianism.

In 2009, experts attended a conference to discuss whether computers and robots might be able to acquire any autonomy, and how much such abilities might pose a threat or hazard. They noted that some robots have acquired various forms of semi-autonomy, including the ability to find power sources on their own and to independently choose targets to attack with weapons. They also noted that some computer viruses can evade elimination and have achieved "cockroach intelligence." They concluded that self-awareness as depicted in science fiction is unlikely, but that there are other potential hazards and pitfalls. Various media sources and scientific groups have noted separate trends in differing areas which might together result in greater robotic functionality and autonomy, and which pose some inherent concerns.

Some experts and academics have questioned the use of robots in military combat, especially when such robots are given some degree of autonomous function. There are also concerns about technology that might allow some armed robots to be controlled mainly by other robots. The US Navy has funded a report indicating that, as military robots become more complex, greater attention should be paid to the implications of their ability to make autonomous decisions.

Some public concerns about autonomous robots have received media attention, particularly one robot, EATR, which can continually refuel itself using biomass and organic substances that it finds on battlefields or in other local environments. Another significant military robot is the SWORDS robot, which is currently used in ground-based combat. It can carry a variety of weapons, and there is some discussion of giving it a degree of autonomy in battleground situations.
Let’s wind up with the three fundamental Rules of Robotics: "We have: one, a robot may not injure a human being, or, through inaction, allow a human being to come to harm. Two, a robot must obey the orders given it by human beings except where such orders would conflict with the First Law. And three, a robot must protect its own existence as long as such protection does not conflict with the First or Second Laws."

ISAAC ASIMOV, Astounding Science Fiction, Mar. 1942
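The three Laws above amount to a strict priority ordering over constraints: each Law applies only insofar as it does not conflict with the Laws before it. A minimal, purely illustrative Python sketch of that ordering follows; the names (`Action`, `permitted`) and boolean flags are hypothetical, and real robotic decision-making is of course nothing like this simple check.

```python
from dataclasses import dataclass

@dataclass
class Action:
    # Hypothetical flags describing an action's predicted consequences.
    harms_human: bool = False
    inaction_lets_human_come_to_harm: bool = False
    ordered_by_human: bool = False
    endangers_robot: bool = False

def permitted(action: Action) -> bool:
    """Check an action against the Three Laws in strict priority order."""
    # First Law: never harm a human, nor allow harm through inaction.
    if action.harms_human or action.inaction_lets_human_come_to_harm:
        return False
    # Second Law: obey human orders (conflicts with the First Law were
    # already rejected above).
    if action.ordered_by_human:
        return True
    # Third Law: self-preservation, subordinate to the first two Laws.
    return not action.endangers_robot
```

For example, an action that is ordered by a human but endangers the robot is still permitted, because the Second Law outranks the Third, while any action that harms a human is rejected outright.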
