
PETMAN robot to closely simulate soldiers


  A freely walking biped robot the size and shape of a human being is being developed to realistically simulate a soldier wearing protective clothing. 

The robot, PETMAN (short for Protection Ensemble Test Mannequin), is being developed by Boston Dynamics for the US Army for testing chemical protective clothing. The anthropomorphic robot will be able to balance itself while walking, crawling, and doing calisthenics, generally moving freely like a human while being exposed to chemical warfare agents.
At its current stage the robot resembles a box on legs, but in its final form it will closely resemble a person, having “the shape and size of a standard human,” according to Robert Playter, VP of Engineering at Boston Dynamics. When completed, PETMAN will be the first anthropomorphic robot to move dynamically like a real person, Playter said. The Army also wants the robot to reproduce physiological responses inside the suit, such as sweating, temperature and humidity control, and even breathing, so that it mimics a soldier wearing a protective suit even more realistically.

The prototype robot walks heel-to-toe just like a human and remains balanced even when pushed. In tests it has achieved a fast walking speed of 4.4 mph on a moving treadmill. PETMAN’s walking algorithm and mechanical design are based on a previous Boston Dynamics robot known as BigDog, which is designed to carry supplies over almost any terrain. Like its predecessor, the PETMAN robot has a hydraulic actuation system and articulated legs with shock-absorbing elements. The robot is controlled by an on-board computer drawing on an array of sensors and internal monitoring systems.
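Push recovery of the kind described above is often analyzed with a simple balance model. As a minimal sketch, assuming the textbook linear inverted pendulum model rather than Boston Dynamics' actual (unpublished) controller, the classic "capture point" formula picks where to step so the robot comes to rest after a shove:

```python
import math

# Capture-point sketch for bipedal push recovery, assuming a linear
# inverted pendulum model (LIPM). A textbook illustration only, not
# Boston Dynamics' actual PETMAN controller.

G = 9.81  # gravitational acceleration, m/s^2

def capture_point(com_pos, com_vel, com_height):
    """Ground point where the foot should land so the center of mass
    (CoM) comes to rest over it after a disturbance."""
    omega = math.sqrt(G / com_height)   # LIPM natural frequency
    return com_pos + com_vel / omega    # classic capture-point formula

# Example: a push gives the CoM a 0.5 m/s forward velocity.
step_target = capture_point(com_pos=0.0, com_vel=0.5, com_height=0.9)
print(f"step {step_target:.2f} m ahead to recover balance")
```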
The 13-month developmental period will be followed by 17 months of building, installation, and validation. Delivery of the completed and tested robot is expected some time in 2011. Boston Dynamics is a small company that spun off MIT in 1992. It specializes in robotics, with many of its projects being for military applications. Measurement Technology Northwest, Midwest Research Institute (MRI), Smith Carter CUH2A (SCC) and HHI Corporation are all partners in the PETMAN project.


Apple patent application for 3D viewing glasses


Apple has filed a patent application for electronic video spectacles that would allow wearers to watch films in 3D on the inside of the glasses. Fans have already nicknamed the gadget iSpecs.

Users would attach their iPhone, iPod, or other device to the spectacles, which have a special lens that can split the image into two frames — one for each eye — and then project the image onto the spectacles. The two images would create a stereoscopic effect since they would appear to have been taken from slightly different angles, and this would simulate 3D.
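As a rough illustration of the stereoscopic principle described above (my own sketch of the general idea, not Apple's method from the patent), a left/right pair can be faked from a single frame by shifting pixels horizontally to mimic two viewpoints:

```python
import numpy as np

# Toy stereoscopic pair: two horizontally offset copies of one frame give
# each eye a slightly different image, which the brain fuses as depth.
# A crude illustration of the principle, not Apple's patented method.

def stereo_pair(frame: np.ndarray, disparity: int = 4):
    """Return (left, right) views shifted by `disparity` pixels.
    Wrap-around at the image edges is ignored for simplicity."""
    left = np.roll(frame, shift=disparity, axis=1)
    right = np.roll(frame, shift=-disparity, axis=1)
    return left, right

# Example with a dummy 480x640 grayscale frame.
frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
left_eye, right_eye = stereo_pair(frame)
```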
According to the patent application (number 20100079356) the images would be equivalent to high definition in quality, and sensors inside the spectacles would detect the precise location of the wearer’s eyes to ensure the image is projected at exactly the right place and is comfortable to watch. The device could be controlled by the wearer’s head movements, such as nodding or head shaking, or by voice control. Sound would be provided by earphones fitted into the device. To enhance the viewing experience, the spectacles could even vibrate in response to content such as explosions.
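Head-gesture control of the sort mentioned is commonly implemented by watching an inertial sensor's pitch-rate signal for a down-then-up swing. The sketch below is a guess at that generic pattern, with invented thresholds; the patent's actual detection scheme is not described here:

```python
# Toy nod detector: flags a "nod" when the head's pitch rate swings down
# and then up within a short window. Thresholds are invented for
# illustration; a real device would tune them against sensor data.

NOD_RATE = 0.8   # rad/s, minimum pitch rate counted as a swing (assumed)
WINDOW = 15      # samples (~0.5 s at 30 Hz) for the up-swing to follow

def detect_nod(pitch_rates):
    """True if the pitch-rate samples contain a down-then-up swing."""
    down_at = None
    for i, rate in enumerate(pitch_rates):
        if rate < -NOD_RATE:
            down_at = i                    # head swung down
        elif rate > NOD_RATE and down_at is not None:
            if i - down_at <= WINDOW:      # up-swing soon after: a nod
                return True
    return False

# Example: a synthetic down-swing followed by an up-swing.
samples = [0.0] * 5 + [-1.2, -0.9] + [0.0] * 4 + [1.1, 0.0]
assert detect_nod(samples)
```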
The spectacles would also incorporate a small camera and infrared sensors embedded in the frames, which would stream video of the surroundings to a smaller screen inside the glasses if anyone approached or tried to get the wearer’s attention during a film. The aim of this system is to make people feel more comfortable about wearing the glasses in public, such as during plane, train, or bus trips.
There are already a few video glasses that can be plugged into an iPod to allow viewers to watch films, but none offer 3D viewing or high-quality images. Another drawback of the existing devices is that wearers are unable to see what is happening around them, which makes them reluctant to wear the glasses in public. Some have suspected the application is a hoax, since news of it was published on the Web on April 1st, but the US Patent and Trademark Office is unlikely to be involved in a hoax, and the application was filed in late 2008. Apple never comments on patent applications.

Engineering Students Showcase Next-Gen Robots During Research Expo


UCSD engineering students will unveil the new iFling, a remote-controlled robot that can pick up and throw ping pong balls, during Research Expo on April 15.
As featured in the film The Hurt Locker, small robotic vehicles already play a key role in the safe disposal of improvised explosive devices in modern urban warfare. As the military and industry work together to improve the performance of such existing robots, engineers at UC San Diego are exploring new roles for small robotic systems in combat. The families of agile, autonomous robotic systems they are developing are also expected to have significant roles in homeland security, border patrol, search and rescue, and planetary exploration.
In the UCSD Coordinated Robotics Lab, mechanical engineering professor Tom Bewley and his students have just released the latest generation of their Switchblade family of agile treaded vehicles. Switchblade can pop wheelies, climb stairs and rubble, and carry substantial payloads such as real-time video; Light Detection and Ranging (LIDAR), an optical remote sensing technology; chemical, radiation, and biological sensors; and GPS. The robot can also, literally, run circles around other treaded vehicles in its class, and can be produced for a fraction of the cost, according to the engineers.
“The focus of our lab is on an application of robotics that is today much less developed — the deployment of multiple inexpensive robots for the exploration of dangerous and confined environments, such as buildings, caves, mines, and tunnels,” Bewley said.
The mechanical engineering students will showcase Switchblade, along with four other student-designed robots, during the Jacobs School’s annual Research Expo on April 15. One of the robots making its debut at Research Expo is the new and improved iFling, a fun, remote-controlled vehicle that can, among other tasks, pick up and throw ping pong balls. iFling, which was designed using a new 3D printer, has potential commercial use as a toy, Bewley said.
The mechanical engineering students will be among more than 250 UCSD graduate students presenting posters at Research Expo.
Provided by Jacobs School of Engineering


Warwick students take rescue robot to RoboCup Rescue Championship


(PhysOrg.com) -- University of Warwick (UK) students are poised to take their "rescue robot" to the RoboCup Rescue Championship in Germany next week. The students developed their robot in a team project bringing together Engineering and Computer Science students, and they will soon be on their way to the 9th RoboCup German Open in Magdeburg, April 15-18.


The robots are designed to crawl over and through difficult terrain, such as destroyed buildings, in search of trapped survivors. The German competition will put the Warwick robot through its paces in a simulated disaster environment which requires robots to demonstrate their capabilities in mobility, sensory perception, planning, mapping, and operator interfaces, while searching for simulated victims in difficult environments.
The Warwick team think they have a competition-winning trick with their rescue robot, as they have fitted it with an arm that has “4 degrees of movement”, giving the arm more turning flexibility than even the head of an owl. This allows the arm to turn and weave in tight situations and change its orientation without having to move the whole robot. A second rescue robot is also under development, which will have its own mapping capabilities using LiDAR (Light Detection and Ranging) technology.
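LiDAR-based mapping on small robots is very often built around an occupancy grid. The sketch below shows only that general technique, under assumptions of my own; the Warwick team's actual mapping pipeline is not described in the article:

```python
import math
import numpy as np

# Minimal occupancy-grid sketch for LiDAR mapping: mark the endpoint of
# each range ray as an occupied cell. Illustrates the general technique
# only; the Warwick team's actual pipeline is not described here.

CELL = 0.1                                   # grid resolution, metres
grid = np.zeros((200, 200), dtype=np.uint8)  # 20 m x 20 m map, 0 = free

def mark_scan(grid, robot_xy, angles, ranges):
    """Mark each LiDAR ray's endpoint in the grid as an obstacle."""
    rx, ry = robot_xy
    for theta, r in zip(angles, ranges):
        x = rx + r * math.cos(theta)   # ray endpoint, world coordinates
        y = ry + r * math.sin(theta)
        i, j = int(y / CELL), int(x / CELL)
        if 0 <= i < grid.shape[0] and 0 <= j < grid.shape[1]:
            grid[i, j] = 1             # 1 = obstacle detected here

# Example: one scan from the map centre with three rays.
mark_scan(grid, robot_xy=(10.0, 10.0),
          angles=[0.0, math.pi / 2, math.pi],
          ranges=[2.5, 1.0, 3.2])
```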
The team’s academic project director is Professor Ken Young from WMG (Warwick Manufacturing Group) at the University of Warwick. He says: “This project gives our students hands-on experience of solving a real-world engineering problem. Not only will they learn practical lessons, such as how to integrate leading technology to create a practical working solution; they may also come up with their own truly innovative ideas that could be taken up by technology companies and make them even more sought after as employees of high-tech engineering firms.”

Provided by University of Warwick

Japanese robo-suit promises superpowers for greying farmers


A Tokyo University of Agriculture and Technology postgraduate student demonstrates the new power-assist suit for elderly agricultural workers, developed by professor Shigeki Toyama. The suits are said to reduce the user's physical effort by about 62 percent.
While Robocop and Iron Man can dodge bullets and crush villains, a new powered suit from Japan promises its elderly users more modest powers, such as pulling up radishes without getting a backache.

Unlike its heavily-armed Hollywood counterparts, the Power Assist Suit aims to make life easier for Japan's army of greying farmers.
The metal-and-plastic exoskeleton boasts eight electric motors that amplify the strength of the wearer's arms and legs, as well as sensors that can detect movements and respond to commands through a voice-recognition system.
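Suits of this general kind typically run a feedback loop that measures the wearer's effort at a joint and commands motor torque to carry part of the load. The sketch below is a generic illustration with invented numbers; the article does not detail Toyama's actual control scheme:

```python
# Generic power-assist loop: measure the torque the wearer applies at a
# joint and command the motor to carry a proportional share of it. The
# gain echoes the ~62% effort reduction quoted in the article; the
# actuator limit is invented for the sketch.

ASSIST_GAIN = 0.62   # fraction of the load the suit carries (assumed)
MAX_TORQUE = 40.0    # N*m, invented actuator limit

def assist_torque(measured_user_torque: float) -> float:
    """Motor torque command that amplifies the wearer's own effort."""
    command = ASSIST_GAIN * measured_user_torque
    return max(-MAX_TORQUE, min(MAX_TORQUE, command))  # clamp to limits

# Example: the wearer applies 30 N*m at the hip while lifting.
print(assist_torque(30.0))  # suit contributes 18.6 N*m
```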
Professor Shigeki Toyama and his team developed the power-enhancing suit at the Tokyo University of Agriculture and Technology, and Toyama plans to set up a company to start producing the futuristic outfit by the end of the year.
"If the farmer bends over to grasp a radish, his back will be firmly supported," said Gohei Yamamoto, one of the students working on the team, as he recently demonstrated the suit on his university campus.
"A brief vocal instruction will instantly straighten the rods along his legs, giving him the power he needs to pull the vegetable without effort."
Fifteen years in the making, the robosuit will soon hit the market in Japan to help ageing farmers harvest their fruit and vegetables while avoiding backaches and nasty cramps, its developers say.
Japan, with a low birthrate and a high life expectancy, is facing a demographic crisis as its population rapidly ages and shrinks.
Robots have long been common in Japan, and robo-suits are making inroads in hospitals and retirement homes, where they can help carers lift patients or aid in physical rehabilitation exercises.
But with two thirds of the country's farm-workers already over 65 years old, the agriculture sector is a potentially lucrative untapped market.
The suit should hit the Japanese market in 2012, when it will initially retail for about one million yen (11,000 dollars), a price tag its makers hope to halve if the device is mass-produced, the team said.
There are however no plans so far to sell the suits overseas.
"I doubt that the suit would sell in Europe and in America, where foreign migrants workers often perform farm-related tasks," Toyama said.
The team has developed a heavy-duty 30 kilogram (66 pound) model, for lifting big loads and pulling vegetables out of the ground, and a 23 kilogram version designed for lighter tasks such as picking grapes.
The robo-suits can reduce the user's physical effort by 62 percent on average, the inventors say. When the wearer bends at the knees, muscular activity is reduced by half, and the suit can also take most of the strain out of crouching.
"We conducted a survey of 102 people for the latest model, asking what part of the body hurt when they picked grapes," Yamamoto said. "Most farmers complained about aches in their arms, necks and lower backs."
The suits are already tough, but soon they will also become smarter.
By the end of the year Toyama plans to start working on augmented reality goggles on which useful information could be displayed for the farmer, in much the same way as data is projected onto the inside of a fighter jet's cockpit.
Useful information might include how ripe the grapes are, or the user's heart rate and calorie consumption, said Toyama. "The goggles would tell you for instance how long you've been working and when you should rest."
Source from PhysOrg
 


Online e-expo features more than 100 university robotics labs


One of the robots at EXPO21xx: ECCEROBOT (Embodied Cognition in a Compliantly Engineered Robot), designed by researchers at the AI Lab at the University of Zurich. Credit: University of Zurich.
(PhysOrg.com) -- In an effort to bring together the top academic robotics labs under one roof, a project called EXPO21XX has created an online exhibition to showcase the diversity in today's robotics research. At one website, robotics researchers and enthusiasts can view the projects underway in more than 100 university robotics labs from around the world. 
EXPO21XX, which has been providing electronic exhibitions (or e-expos) since 2001 for a variety of industries, launched the virtual University Robotics Platform in 2008. The e-expos offer a convenient and free alternative to the conventional expo by taking advantage of the internet and the latest multimedia technology.
Similar to a real-life expo, EXPO21XX’s e-expos are organized into halls, corridors, and individual stands. The University Robotics Platform, for example, is located in e-Hall05 of the Automation e-expo (where there are currently 37 e-Halls).
On the University Robotics Platform, you can learn about many different robotics projects that don’t always make it to the news headlines (and some that do). For example, visitors can watch videos of the following robots:
*A robotic jaw that simulates human chewing behavior, which was designed by researchers at the Mechatronics and Robotics Research Group at Massey University in New Zealand.
*A robot that uses Facebook, which was designed by researchers at the Interactive Robots and Media Laboratory at the United Arab Emirates University.
*A robot that performs stereotactic neurosurgery, implanting an electrode in the brain for a therapy called deep brain stimulation that could help treat Parkinson’s disease. The robot was designed by researchers at the Institute for Robotics and Cognitive Systems at the University of Luebeck in Germany.
*A giant six-legged ant-like robot that skitters across the ground and flips itself over to walk “upside down.” The robot was developed by researchers at the Active Structures Laboratory at the Université Libre de Bruxelles in Brussels, Belgium, for gait studies.
*A brain-actuated robotic wheelchair constructed by researchers at the Neurotechnology Lab at the University of Zaragoza in Spain. The wheelchair is being designed to provide people with severe neuromuscular disabilities with a certain degree of mobility.
Prominent among the presenters is Prof. Peter Bock of George Washington University (Computer Science) who was invited to publish a complete presentation of the functionality of ALISA (Adaptive Learning Image and Signal Analysis) and the results from his recent funded projects with Bosch and then DTRA (Defense Threat Reduction Agency of the DOD).
The University Robotics Platform continues to grow, with a few stands currently reserved for additional universities. If you want to get a glimpse of the future, this is definitely worth checking out.
Visit the site at www.expo21xx.com/automation21xx/university.htm




Researchers develop a robot that folds towels

A team from Berkeley's Electrical Engineering and Computer Sciences department has figured out how to get a robot to fold previously unseen towels of different sizes. Their approach solves a key problem in robotics -- how to deal with flexible, or "deformable," objects.

Who wouldn't want a robot that could make your bed or do the laundry? Well, a team of Berkeley researchers has brought us one important step closer by, for the first time, enabling a robot to reliably fold piles of previously unseen towels.
Robots that can do things like assembling cars have been around for decades. The towel-folding robot, though, is doing something very new, according to the leaders of the Berkeley team, doctoral student Jeremy Maitin-Shepard and Assistant Professor Pieter Abbeel of Berkeley's Department of Electrical Engineering and Computer Sciences.

Robots like the car-assembly ones are designed to work in highly structured settings, which allows them to perform a wide variety of tasks with mind-boggling precision and repeatability — but only in carefully controlled environments, Maitin-Shepard and Abbeel explain. Outside of such settings, their capabilities are much more limited. Automation of household tasks like laundry folding is compelling in itself. But more significantly, according to Maitin-Shepard, the task involves a capability that has proved a challenge for robots: perceiving and manipulating "deformable objects" - things that are flexible, not rigid, so their shape isn't predictable. A towel is deformable; a mug or a computer isn't.
A video — posted on this page — tells the story best. It shows a robot built by the Menlo Park robotics company Willow Garage, running an algorithm developed by the Berkeley team, faced with a heap of towels it's never "seen" before. The towels are of different sizes, colors and materials.
The robot picks one up and turns it slowly, first with one arm and then with the other. It uses a pair of high-resolution cameras to scan the towel to estimate its shape. Once it finds two adjacent corners, it can start folding. On a flat surface, it completes the folds — smoothing the towel after each fold, and making a neat stack. "Existing work on robotic laundry and towel folding has shown that starting from a known configuration, the actual folding can be performed using standard techniques in robotic manufacturing," says Maitin-Shepard.
But there's been a bottleneck: getting a towel picked up from a pile where its configuration is unknown and arbitrary, and turning it into a known, predictable shape. That's because existing computer-vision techniques, which were primarily developed for rigid objects, aren't robust enough to handle possible variations in three-dimensional shape, appearance and texture that can occur with deformable objects, the researchers say.
Solving that problem helps a robot fold towels. But more significantly, it addresses a key issue in the development of robotics.
"Many important problems in robotics and computer vision involve deformable objects," says Abbeel, "and the challenges posed by robotic towel-folding reflect important challenges inherent in robotic perception and manipulation for deformable objects."
The team's technical innovation is a new computer vision-based approach for detecting the key points on the cloth for the robot to grasp, an approach that is highly effective because it depends only on geometric cues that can be identified reliably even in the presence of changes in appearance and texture.
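As a rough stand-in for what "geometric cues" can look like in code (this is a generic contour-based corner finder, not the Berkeley team's published algorithm), OpenCV can approximate a cloth silhouette with a polygon whose vertices serve as grasp-point candidates:

```python
import cv2
import numpy as np

# Generic corner finding from geometric cues: extract the outline of a
# binary cloth mask and approximate it with a polygon whose vertices are
# corner candidates. A stand-in illustration, not the Berkeley method.

def cloth_corner_candidates(mask: np.ndarray) -> np.ndarray:
    """Return polygon vertices of the largest contour in a binary mask."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    outline = max(contours, key=cv2.contourArea)   # the cloth silhouette
    epsilon = 0.02 * cv2.arcLength(outline, True)  # tolerance ~2% of perimeter
    poly = cv2.approxPolyDP(outline, epsilon, True)
    return poly.reshape(-1, 2)                     # (N, 2) corner points

# Example: a synthetic rectangular "towel" mask.
mask = np.zeros((200, 300), dtype=np.uint8)
cv2.rectangle(mask, (50, 40), (250, 160), 255, thickness=-1)
print(cloth_corner_candidates(mask))  # the four corners of the rectangle
```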
The approach has proven highly reliable. The robot succeeded in all 50 trials that were attempted on previously unseen towels with wide variations in appearance, material and size, according to the team's report on its research, which is being presented in May at the International Conference on Robotics and Automation 2010 in Anchorage. Their paper is posted online (PDF).
The system was implemented on a prototype version of the PR2, a mobile robotic platform developed by Willow Garage, using the open-source Robot Operating System (ROS) software framework.
Two undergraduates, Marco Cusumano-Towner, a junior in EECS, and Jinna Lei, a senior math major, assisted on the project.
Provided by University of California - Berkeley


Japan unveils humanoid robot that laughs and smiles


A model (R) touches the face of a humanoid robot called "Geminoid-F" (L) at a press conference in Osaka.
Japanese researchers said Saturday they have developed a humanoid robot that can laugh and smile as it mimics a person's facial expressions.


The robot, Geminoid-F, can move its rubber facial skin to imitate a smile, a laugh showing teeth, and a grim look with furrowed brows, by receiving electric signals from the person it is modelled on.
The researchers demonstrated with a robot made to look exactly like an attractive woman in her 20s with long dark hair. The woman and the robot were dressed in the same clothes - a black skirt and black leather jacket.
The robot smiled and furrowed its brow in almost simultaneous imitation of the woman, whose face was filmed with a video camera that then relayed information about her expressions to the robot as electric signals.
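The camera-to-robot pipeline described above amounts to mapping measurements of the person's face onto commands for the robot's facial actuators. The sketch below illustrates only that idea; the landmark names, actuator channels, and ranges are all invented, and the actual Geminoid-F interface is not described in the article:

```python
# Toy mapping from tracked face measurements to facial-actuator commands.
# All names and ranges here are invented for illustration; the actual
# Geminoid-F interface is not described in the article.

# Neutral-face baseline measurements (normalized units, assumed).
NEUTRAL = {"mouth_open": 0.10, "mouth_width": 0.50, "brow_height": 0.30}

def expression_commands(landmarks: dict) -> dict:
    """Convert face measurements into 0..1 commands for facial actuators."""
    def norm(key, scale):
        delta = landmarks[key] - NEUTRAL[key]
        return max(0.0, min(1.0, 0.5 + delta * scale))
    return {
        "jaw_servo":   norm("mouth_open", 2.0),   # open mouth -> drop jaw
        "smile_servo": norm("mouth_width", 2.0),  # wider mouth -> smile
        "brow_servo":  norm("brow_height", 2.0),  # raised brows -> lift skin
    }

# Example: a broad smile with slightly raised brows.
print(expression_commands(
    {"mouth_open": 0.15, "mouth_width": 0.70, "brow_height": 0.35}))
```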
"I felt like I had a twin sister," the woman told reporters afterwards.
The developers said they expected the robot to be eventually used in real-life situations, for example in hospitals.
"We've already got some data showing that the robot gave patients psychological security by nodding and smiling at them, when patients were checked on by doctors," said Satoko Inoue, spokeswoman for Kokoro, one of the two companies involved in the development.
"A new technology always creates some fears and negative opinions," but the researchers wanted to make robots that could express something similar to human emotions, said Hiroshi Ishiguro, a professor at Osaka University who led the research.
