Thursday, July 26, 2007

SystemTap



SystemTap
SystemTap provides free software (GPL) infrastructure to simplify the gathering of information about the running Linux system. This assists diagnosis of a performance or functional problem. SystemTap eliminates the need for the developer to go through the tedious and disruptive instrument, recompile, install, and reboot sequence that may be otherwise required to collect data.
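As a rough sketch of what that looks like in practice (assuming stap and the matching kernel debuginfo packages are installed; the script name opens.stp is just an example), a one-file SystemTap script can trace every open() system call on the live system with no recompiling, reinstalling, or rebooting:

  # opens.stp -- print every open() syscall on the running system
  probe syscall.open {
    printf("%s (pid %d): open %s\n", execname(), pid(), argstr)
  }
  # stop the trace automatically after ten seconds
  probe timer.s(10) { exit() }

Running it with "stap -v opens.stp" compiles the script into a temporary kernel module, loads it, prints the trace as it happens, and removes the module again when the script exits.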


Powered by ScribeFire.

Monday, May 14, 2007

Shaping the future





Charlie's Diary: Shaping the future

I understand that you're expecting a talk about where the next 20 years
are taking us, how far technology will go, how people will use the net,
and whether big shoulder pads and food pills will be fashionable.
Personally, I'm still waiting for my personal jet car — I've been
waiting about fifty years now — and I mention this as a note of
caution: while personal jet cars aren't obviously impossible, their
non-appearance should give us some insights into how attempts to
predict the future go wrong.




Powered by ScribeFire.

Wednesday, May 09, 2007

In the Field of Battle (Or Even Above It), Robots Are a Soldier's Best Friend





Bots on The Ground - washingtonpost.com



By Joel Garreau

Washington Post Staff Writer

Sunday, May 6, 2007; D01



The most effective way to find and destroy a land mine is to step on it.



This has bad results, of course, if you're a human. But not so much if you're a robot and have as many legs as a centipede sticking out from your body. That's why Mark Tilden, a robotics physicist at the Los Alamos National Laboratory, built something like that. At the Yuma Test Grounds in Arizona, the autonomous robot, 5 feet long and modeled on a stick-insect, strutted out for a live-fire test and worked beautifully, he says. Every time it found a mine, blew it up and lost a limb, it picked itself up and readjusted to move forward on its remaining legs, continuing to clear a path through the minefield.



Finally it was down to one leg. Still, it pulled itself forward. Tilden was ecstatic. The machine was working splendidly.



The human in command of the exercise, however -- an Army colonel -- blew a fuse.



The colonel ordered the test stopped.



Why? asked Tilden. What's wrong?



The colonel just could not stand the pathos of watching the burned, scarred and crippled machine drag itself forward on its last leg.



This test, he charged, was inhumane.



* * *



The wars in Afghanistan and Iraq have become an unprecedented field study in human relationships with intelligent machines. These conflicts are the first in history to see widespread deployment of thousands of battle bots. Flying bots range in size from Learjets to eagles. Some ground bots are like small tanks. Others are the size of two-pound dumbbells, designed to be thrown through a window to scope out the inside of a room. Bots search caves for bad guys, clear roads of improvised explosive devices, scoot under cars to look for bombs, spy on the enemy and, sometimes, kill humans.



Even more startling than these machines' capabilities, however, are the effects they have on their friendly keepers who, for example, award their bots "battlefield promotions" and "purple hearts." "Ours was called Sgt. Talon," says Sgt. Michael Maxson of the 737th Ordnance Company (EOD). "We always wanted him as our main robot. Every time he was working, nothing bad ever happened. He always got the job done. He took a couple of detonations in front of his face and didn't stop working. One time, he actually did break down in a mission, and we sent another robot in and it got blown to pieces. It's like he shut down because he knew something bad would happen." The troops promoted the robot to staff sergeant -- a high honor, since that usually means a squad leader. They also awarded it three "purple hearts."



Humans have long displayed an uncanny ability to make emotional connections with their manufactured helpmates. Car owners for generations have named their vehicles. In "Cast Away," Tom Hanks risks his life to save a volleyball named Wilson, who has become his best friend and confidant. Now that our creations display elements of intelligence, however, the bonds humans forge with their machines are even more impressive. Especially when humans credit their bots with saving their lives.



Ted Bogosh recalls one day in Camp Victory, near Baghdad, when he was a Marine master sergeant running the robot repair shop.



That day, an explosive ordnance disposal technician walked through his door. The EODs, as they are known, are the people who -- with their robots -- are charged with disabling Iraq's most virulent scourge, the roadside improvised explosive device. In this fellow's hands was a small box. It contained the remains of his robot. He had named it Scooby-Doo.



"There wasn't a whole lot left of Scooby," Bogosh says. The biggest piece was its 3-by-3-by-4-inch head, containing its video camera. On the side had been painted "its battle list, its track record. This had been a really great robot."



The veteran explosives technician looming over Bogosh was visibly upset. He insisted he did not want a new robot. He wanted Scooby-Doo back.



"Sometimes they get a little emotional over it," Bogosh says. "Like having a pet dog. It attacks the IEDs, comes back, and attacks again. It becomes part of the team, gets a name. They get upset when anything happens to one of the team. They identify with the little robot quickly. They count on it a lot in a mission."



The bots even show elements of "personality," Bogosh says. "Every robot has its own little quirks. You sort of get used to them. Sometimes you get a robot that comes in and it does a little dance, or a karate chop, instead of doing what it's supposed to do." The operators "talk about them a lot, about the robot doing its mission and getting everything accomplished." He remembers the time "one of the robots happened to get its tracks destroyed while doing a mission." The operators "duct-taped them back on, finished the mission and then brought the robot back" to a hero's welcome.



Near the Tigris River, operators even have been known to take their bot fishing. They put a fishing rod in its claw and retire back to the shade, leaving the robot in the sun.



Of the fish, Bogosh says, "Not sure if we ever caught one or not."

'Sort of Alive'



What do you mean "robot"?



Does a machine have to declare independence from its humans to qualify? In 2005, four bots competing in the Defense Advanced Research Projects Agency (DARPA) Grand Challenge successfully traversed 132 miles of the Mojave Desert all by themselves.



Most, however, are more tightly connected to their humans.



The American military and paramilitary intelligence forces are legendarily skittish about fielding an intelligent weapon that's entirely autonomous. They like having a human in the decision-making loop. If anything goes wrong -- if a Boy Scout troop were to be mistaken for an al-Qaeda cell -- they want to have a person to blame. They hate explaining that the robot had a glitch in its algorithm.



So where does the air vehicle called the Predator fit? It is unmanned, and impressive. In 2002, in Yemen, one run by the CIA came up behind an SUV full of al-Qaeda leaders and successfully fired a Hellfire missile, leaving a large smoking crater where the vehicle used to be. Was this the first bot to incinerate Homo sapiens? It is an artificially intelligent machine. But a remote human told it to fire the missile. So can it be said that we now actually have murderous robots? Reasonable people differ. The fellows in the SUV, of course, might find these distinctions overly fine.



More significant than autonomy, thinks Rodney Brooks, may be the way humans have evolved to recognize instantly when an entity behaves like it's alive -- "animate" is the word he uses. Brooks is director of the MIT Computer Science and Artificial Intelligence Laboratory, co-founder and chief technology officer of the pioneering firm iRobot and author of "Flesh and Machines: How Robots Will Change Us."



What the battle bots are teaching us is how easily we identify our own creations as animate.



Digital pets like the Tamagotchi or the Furby, designed to be cute, have long caused children to make spooky levels of connection. Sherry Turkle, founder of the MIT Initiative on Technology and Self, quotes kids describing intelligent machines as "sort of alive."



Robots at MIT with fanciful names like Cog and Kismet are intentionally built to display what look like emotions. Kismet can listen, and speak with expression. Its cartoonish eyes, ears, eyebrows, eyelids and lips move to create facial expressions that make it appear to be happy, sad, disgusted, calm, interested, angry or surprised.



Humans respond so readily to Kismet, created by Cynthia Breazeal, that graduate students working in the lab at night have been known to put up a curtain between themselves and the bot, Brooks reports. They couldn't stand the way it seemed to gaze around and stare at them. It broke their concentration. These humans are as sophisticated about robots as anyone on Earth. Yet even they are freaked by Kismet's lifelike behavior. "We're programmed biologically to respond to certain sorts of things," Brooks explains.



It's not about how the machine works. It's about how humans are wired.



What's remarkable about the battle bots is that humans bond with them even though their designers have made no attempt to load them with emotional cues. Troops use cheap little scouts like the MARCbot as a point man on reconnaissance. It is basically a remote-controlled toy truck made military tough and embellished with a long neck to which has been added a camera. Its little head peers around doors and into windows, or it cranes up to examine suspicious packages. Soldiers refer to such bots as "like a 10th man," and "like an extra set of eyes," Brooks reports. They also refer to them as "Johnny Five," the intelligent robot of our celluloid dreams in the "Short Circuit" films.



It's common for a soldier to cut out a magazine picture of a woman, tape it to the antenna and name the bot something like "Cheryl," says Paul Varian, a former Army chief warrant officer who has served three tours in Iraq with the Robotic Systems Joint Project Office. "There's an awful lot of picture-taking," he says. One guy who married just before deployment wanted his wife to see the gal who was his constant companion. It was a PackBot. "One Guard unit got so attached to a development model that we gave it to them. It was pretty beat up. They put it in a place of honor in their museum."



"When we first got there, our robot, his name was Frankenstein" says Sgt. Orlando Nieves, an EOD from Brooklyn. "He'd been in a couple of explosions and he was made of pieces and parts from other robots." Not only did the troops promote him to private first class, they awarded him an EOD badge -- a coveted honor. "It was a big deal. He was part of our team, one of us. He did feel like family."



"I've been a proponent for a long time of painting a mouth and eyes on the Global Hawk," the Learjet-size surveillance bot, says retired Col. Tom Ehrhard, a former chief of the Air Force's "Skunk Works" -- its strategy, concepts and doctrine division.



"It looks like a blind mole. Give it some character. Make it easier for humans to deal with -- more animate. Humans are social animals. Make that other thing part of your family, your social structure. Try to animate and make either fearsome or lovable your implements of war."



Brooks wonders if engineers should actively try to create battle bots that adapt to particular individuals, like a dog. "I'm from Australia," he says, "and Australian sheepdogs -- you and the dog build up a strong working relationship. One could imagine a tactical situation where you're a troop of guys who train together, know each other quite intimately, recognize little hand gestures. One can imagine a deeper sense of communication with a particular bot that is working with those people for a long time, seeing their image, the way their hats are cocked, identifying them based on their gait. One could imagine doing the research to get to that. It's not out of this world."

Medals for Machines



If a robot performs magnificently in battle, should its operator get a medal? Even if that human is hundreds, if not thousands, of miles from the action? The question goes to the heart of a novel issue: Where does the bot end and the human begin?



Last month, the Army announced that it would allow unmanned air vehicle (UAV) operators to earn the Distinguished Flying Cross, the high military honor awarded for heroism or extraordinary achievement while participating in flight.



When Greg Harbin was piloting one of the first Predators over Bosnia from his desk in Hungary, he wasn't actively trying to become the first UAV operator in history to be awarded a medal. He was trying to avoid becoming the first UAV pilot to incinerate a schoolyard full of children.



The then-32-year-old Air Force captain's bot was "dead square right over the city" of Mostar, looking for snipers, when its engine conked out. As it spiraled down, still carrying 2,000 gallons of fuel, Harbin could see on his screen, through the cross hairs of the nose camera, that "it was dead-centered on that school. If I'd allowed that thing to hit into that school, could you imagine?" he recalls. "I would be the first guy in the history of the world to kill somebody with an unmanned aircraft."



Using the kind of fancy flying he'd learned as a fighter pilot, and only the 15 minutes' worth of battery power he had left, he miraculously pulled the bot out of its spiral and found an airstrip run by the French. To make sure the barely-under-control bot didn't hit all their transport planes, he intentionally crash-landed it on their runway. "The French were pretty funny on the radio. They didn't know what it was. They videotaped the whole thing."



But that's not the significant part.



As he was struggling to bring the bot down without an engine, he could see "the ground coming real fast." He dropped the landing gear, flared the wings, pushed the stick forward and then started fumbling around at the bottom of his desk chair.



He had bonded so tightly with the machine hundreds of miles away that he was searching for the lever that would allow him to eject.

It's About the Humans



"EODs are very brave people," Varian says. "They don't put on any sense of weakness." That's why when they get "back from a really rough mission -- the bot got blown up and it could have been them -- we have an area for them to come in, get out of the heat. The refrigerator is stocked with cold drinks and snacks. We've got a satellite phone in the shop. If they haven't talked to the wife, mom and dad, the girlfriend, we give them a phone call back to the States.



"When we talk a system, we're talking the operator as well. We take these guys out, show them where to get a good shower, take them over to the PX, the chow hall. Wind them down while the techs go ahead and try to make a repair on that particular robot. We preach to everybody, you have to look at the system. Not just the machine, but the human."



Nonetheless, Ehrhard worries that a bot "completely changes the human role" in warfare. "That is not lost on these people," he says. "Part of what a society has to do to sacrifice lives is make it heroic. Do you see the dilemma? It's a fundamental dilemma. Look at what we do for the military. Give them uniforms, great burials, medals. I'm retired and people still call me colonel. It's all out of respect for what you have done. How do robots change that?"

Glimpsing the Future



It's not just muddy-boot soldiers facing these questions. It's all of us.



No less an authority than Bill Gates, in a recent Scientific American article titled "A Robot in Every Home," announces that the next big technological wave sweeping the world is robots. He compares their rise to the PC revolution he helped lead.



The bot world "reminds me so much of that time when Paul Allen and I looked at the convergence of new technologies and dreamed of the day when a computer would be on every desk and in every home," he writes. "Robotic devices will become a nearly ubiquitous part of our day-to-day lives." He describes a world "when the PC will get up off the desktop and allow us to see, hear, touch and manipulate objects in places where we are not physically present" -- from planting crops, to allowing doctors to treat distant patients, to providing care for children and the elderly.



That world is arriving fast. The 2 million personal bots in use around the world in 2004 are expected to grow to 7 million next year. The South Korean Ministry of Information and Communication hopes to put a bot in every home there within six years.



Joseph W. Dyer, the retired three-star admiral who heads iRobot's government and industrial division, says, "The androids in the movie 'I, Robot,' who are projected to exist in 2030? We think that timeline is about right."



Most bots won't look semi-human, like C-3PO. Nonetheless, Gates says, "they could have just as profound an impact on the way we work, communicate, learn and entertain ourselves as the PC has had over the past 30 years."



Perhaps those days have already arrived.



Right now Avis is airing a 30-second spot that features a young man in a necktie having a conversation with the navigation bot in his rental car.



"Traffic ahead," the female voice says to him.



"Incredible!" he replies. "You found a golf course near the conference -- awesome Chinese. Now you find me a way around traffic."



He shakes his head and lifts his thumbs off the wheel in a gesture of emotional helplessness.



"I love you," he says with feeling.



The music swells:



Turn around / Every now and then I get a little bit lonely / And you're never coming 'round.



Fall hard for a bot at Avis.








Powered by ScribeFire.

Thursday, May 03, 2007

How can I optimize the Windows 2000/XP/2003 virtual memory (Pagefile)?





Pagefile Optimization





Powered by ScribeFire.


Tuesday, April 24, 2007

Seven Post-Install Tips for Ubuntu 7.04





PC World - Seven Post-Install Tips for Ubuntu 7.04

Fixes, extras, and eye candy: Here are seven steps to take just after installing Feisty Fawn.




Powered by ScribeFire.